views: 349

answers: 4

My software went into production a few days ago, and now I would like to discuss the database structure a bit.

The software collects data about ships, currently 174 details for each ship. Each detail can be a text value, a long text value, a number (of a specified length, with or without a specified number of decimals), a date, a date with time, a boolean field, a menu with many values, a list of data, and more.

I solved the problem with the following tables:

Ship:
- ID - smallint, Autoincrement identity
- IMO - int, A number that does not change for the life of the ship

ShipDetailType: 
- ID - smallint, Autoincrement identity
- Description - nvarchar(200), The description of the value the field contains
- Position - smallint, The position of the field in the data input form
- ShipDetailGroup_ID - smallint, A key to the group the field belongs to in the data input form
- Type - varchar(4), The type of the field as mentioned above

ShipDetailGroup
- ID - smallint, Autoincrement identity
(snip...)

ShipMenuPresetValue
- ID - smallint, Autoincrement identity
- ShipDetailType_ID - smallint, A key to the detail type the values belong to
- Value - nvarchar(100), A preset value for the menu-type detail

ShipTextDetail
- ID - smallint, Autoincrement identity
- Ship_ID - smallint, A key to the ship the detail belongs to
- ShipDetailType_ID - smallint, A key to the detail type of the value
- Text - nvarchar(500), the field containing the detail's value
- ModifiedDate - smalldatetime
- User_ID - smallint, A key to the user table

ShipTextDetailHistory
(snip...) 
This table has the same structure as ShipTextDetail and contains every change made to the details.

There are other tables for the list detail type, each with the specific fields required for that list, ...
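
For reference, here is a minimal T-SQL sketch of the ShipTextDetail table as described above (SQL Server syntax; the constraints are just illustrative, the real table may differ):

    CREATE TABLE ShipTextDetail (
        ID                SMALLINT IDENTITY(1,1) PRIMARY KEY,
        Ship_ID           SMALLINT NOT NULL REFERENCES Ship(ID),
        ShipDetailType_ID SMALLINT NOT NULL REFERENCES ShipDetailType(ID),
        [Text]            NVARCHAR(500) NULL,   -- the detail's value, stored as text
        ModifiedDate      SMALLDATETIME NOT NULL,
        User_ID           SMALLINT NOT NULL     -- key to the user table
    );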

I just read these articles: http://thedailywtf.com/Articles/The_Inner-Platform_Effect.aspx and http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:10678084117056

The articles say that this is not the right way to handle the problem.

My customer has a management GUI for the details and groups, since he changes the detail descriptions and adds more details.

The data input form is dynamically built by reading the structure from the DetailGroups and DetailTypes; each detail type generates a specific input control.
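
For example, the form builder reads the structure with a query along these lines (just a sketch of the idea):

    SELECT g.ID AS GroupID, t.ID AS DetailTypeID, t.Description, t.Type, t.Position
    FROM   ShipDetailType t
    JOIN   ShipDetailGroup g ON g.ID = t.ShipDetailGroup_ID
    ORDER BY g.ID, t.Position;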

The comments suggest that another way of solving this is to dynamically create and remove columns on the table.
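
If I understand that suggestion correctly, it would mean running something like this every time the customer adds or removes a detail (sketch only; 'CallSign' is just an invented example):

    -- customer adds a new detail
    ALTER TABLE Ship ADD CallSign NVARCHAR(500) NULL;

    -- customer removes it again
    ALTER TABLE Ship DROP COLUMN CallSign;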

What do you think?

Diagram Screenshot: http://img24.imageshack.us/my.php?image=66604496uk3.png

+4  A: 
Joe Soul-bringer
Yes, it does work well, but I like to question myself, develop new abilities, and expand my knowledge.
Marco Bettiolo
A: 

I've done similar things, but there are a couple of problems with this specific implementation:

  1. You are storing numbers, booleans, dates, etc. as strings, which might be less than ideal. An alternative is to implement separate classes (inheriting from a base) for the different data types and then store them in tables made for their data type (see the sketch after this list).
  2. Do the properties that you track change very frequently? Are they a different set per tanker? If not, it might be better to make objects rather than property bags to store all the data. Those objects can then be persisted to the database.
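
For point 1, a rough sketch of what per-type detail tables could look like (table and column names are only illustrative, not from the original schema):

    CREATE TABLE ShipNumberDetail (
        ID                INT IDENTITY(1,1) PRIMARY KEY,
        Ship_ID           SMALLINT NOT NULL REFERENCES Ship(ID),
        ShipDetailType_ID SMALLINT NOT NULL REFERENCES ShipDetailType(ID),
        Value             DECIMAL(18,6) NULL   -- stored as a real number, not text
    );

    CREATE TABLE ShipDateDetail (
        ID                INT IDENTITY(1,1) PRIMARY KEY,
        Ship_ID           SMALLINT NOT NULL REFERENCES Ship(ID),
        ShipDetailType_ID SMALLINT NOT NULL REFERENCES ShipDetailType(ID),
        Value             SMALLDATETIME NULL   -- stored as a real date, not text
    );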
DavGarcia
I decided to store them the simple way, since the data stored in the text column does not need to be computed by procedures; it is only displayed and edited. Right now the customer makes daily changes, and he wants to manage the properties himself. Do you think a table with 180 columns is OK?
Marco Bettiolo
Each tanker has the same set of properties, but not all properties are set.
Marco Bettiolo
+3  A: 

I've seen this approach before, and it presented loads of performance issues once the data volume grew. The problems appear when you need to return multiple items and use multiple criteria in your WHERE clause. You join back and forth between Ship and ShipTextDetail to get all your select columns; maybe you have to do that 10 or 20 times. You then do the same for your criteria, maybe 2 or 3 times. Now you have a query with so many joins that it runs really slowly. Next you 'pre-cook' some of the data to improve performance, i.e. you pull the common data out into a fixed table structure, and you've returned to a semi-normalised model.
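
For example, pulling back just three text details with one criterion already looks something like this (a sketch; the ShipDetailType_ID values and the flag value are made up):

    SELECT s.IMO,
           d1.[Text] AS ShipName,
           d2.[Text] AS Owner,
           d3.[Text] AS Flag
    FROM   Ship s
    JOIN   ShipTextDetail d1 ON d1.Ship_ID = s.ID AND d1.ShipDetailType_ID = 1
    JOIN   ShipTextDetail d2 ON d2.Ship_ID = s.ID AND d2.ShipDetailType_ID = 2
    JOIN   ShipTextDetail d3 ON d3.Ship_ID = s.ID AND d3.ShipDetailType_ID = 3
    WHERE  d3.[Text] = 'Panama';

Now imagine 10 or 20 of those joins plus several criteria, and you can see where the performance goes.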

My recommendation would be this: you already know the 174 fields, so those are your core attributes. Your customer may add to that list and may change the descriptions of the fields, but it's still a really good starting point. Create a proper data model based around those, and then build in an extensibility mechanism, as you have already done, but only for the new fields. The metadata, i.e. the descriptions of the fields, can reside in another table or potentially in a resource file (useful for internationalisation?), and that gives some flexibility for the existing fields.
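
Something along these lines (a rough sketch; the column names are only assumed):

    -- the known core attributes become real columns
    CREATE TABLE Ship (
        ID     SMALLINT IDENTITY(1,1) PRIMARY KEY,
        IMO    INT NOT NULL,
        Name   NVARCHAR(200) NULL,
        Owner  NVARCHAR(200) NULL,
        Weight DECIMAL(18,2) NULL
        -- ... and so on for the rest of the core attributes
    );

    -- the extensibility mechanism is kept only for fields added later
    CREATE TABLE ShipExtraDetail (
        ID                INT IDENTITY(1,1) PRIMARY KEY,
        Ship_ID           SMALLINT NOT NULL REFERENCES Ship(ID),
        ShipDetailType_ID SMALLINT NOT NULL REFERENCES ShipDetailType(ID),
        Value             NVARCHAR(500) NULL
    );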

I agree with Joe: you may not have problems if your DB is small, i.e. fewer than 1000 ships, and your selects are simple, although with 174 attributes to choose from that doesn't seem likely. I think you should convert some of the 'obvious' fields into real columns first; I'd assume you have a Ship.Name, Ship.Owner, Ship.Weight, Ship.Registration ...

Good Luck.

MrTelly
A: 

From a performance standpoint, either approach will be fine. How many ships could there possibly be? All the data is going to fit into RAM on any server.

Seun Osewa
I think there will be fewer than 200 ships.
Marco Bettiolo