Hi,

We have a data layer that contains classes generated from database outputs (tables/views/procs/functions). The tables in the database are normalized and designed in an OOP-like way (the table for "invoice" has a 1:1 relation to the table for "document", the table for "invoice-item" has a 1:1 relation to the table for "document-item", etc.). All access to/from the database goes through stored procedures (even for simple tables).

A typical class looks like this (shortened):

public class DocumentItem {
    public Guid? ItemID { get; set; }
    public Guid? IDDocument { get; set; }
    public DateTime? LastChange { get; set; }
}

public class InvoiceItem : DocumentItem {
    public Guid? IDProduct { get; set; }
    public decimal? Price { get; set; }
}

The problem is that the database tables have relations resembling multiple inheritance in OOP. Currently we create a new class for every database output, but every output is just a combination of "pure" database tables.

The ideal solution would be (IMHO) to transform the classes into interfaces, use multiple interface implementation, and then implement the members automatically (these "table classes" contain only properties, and the property bodies are always the same).

For example:

public interface IItem {
    Guid? ItemID { get; set; }
    DateTime? LastChange { get; set; }
}

public interface IDocumentItem : IItem {
    Guid? IDDocument { get; set; }
}

public interface IItemWithProduct : IItem {
    Guid? IDProduct { get; set; }
}

public interface IItemWithRank : IItem {
    string Rank { get; set; }
}

public interface IItemWithPrice : IItem {
    decimal? Price { get; set; }
}

// example of "final" item interface
public interface IStorageItem : IDocumentItem, IItemWithProduct, IItemWithRank { }

// example of "final" item interface
public interface IInvoiceItem : IDocumentItem, IItemWithProduct, IItemWithPrice { }

// the result should be a object of class which implements "IInvoiceItem"
object myInvoiceItem = SomeMagicClass.CreateClassFromInterface( typeof( IInvoiceItem ) );
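Such a "magic class" can be sketched with Reflection.Emit. This is a minimal, hedged sketch (the class and member names are mine, not an existing library): it builds a type at run time that implements the leaf interface and all inherited interfaces, backing every property with a private field. It assumes the interfaces are public, non-nested, and that no two of them declare a property with the same name (true of the design above, where each property lives in exactly one interface).

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;
using System.Reflection.Emit;

public static class InterfaceImplementor
{
    // Builds a class at run time that implements 'interfaceType'
    // (and everything it inherits) with field-backed properties.
    public static object CreateClassFromInterface(Type interfaceType)
    {
        var asmBuilder = AssemblyBuilder.DefineDynamicAssembly(
            new AssemblyName("DynamicEntities"), AssemblyBuilderAccess.Run);
        var module = asmBuilder.DefineDynamicModule("Main");
        var typeBuilder = module.DefineType(
            "Impl_" + interfaceType.Name,
            TypeAttributes.Public | TypeAttributes.Class);
        typeBuilder.AddInterfaceImplementation(interfaceType);

        // Collect properties from the leaf interface and all its bases.
        var interfaces = new List<Type> { interfaceType };
        interfaces.AddRange(interfaceType.GetInterfaces());
        foreach (var itf in interfaces)
        foreach (var prop in itf.GetProperties())
        {
            var field = typeBuilder.DefineField(
                "_" + prop.Name, prop.PropertyType, FieldAttributes.Private);
            var propBuilder = typeBuilder.DefineProperty(
                prop.Name, PropertyAttributes.None, prop.PropertyType, null);

            const MethodAttributes attrs =
                MethodAttributes.Public | MethodAttributes.Virtual |
                MethodAttributes.HideBySig | MethodAttributes.SpecialName;

            // get_X: return this._X;
            var getter = typeBuilder.DefineMethod(
                "get_" + prop.Name, attrs, prop.PropertyType, Type.EmptyTypes);
            var il = getter.GetILGenerator();
            il.Emit(OpCodes.Ldarg_0);
            il.Emit(OpCodes.Ldfld, field);
            il.Emit(OpCodes.Ret);

            // set_X: this._X = value;
            var setter = typeBuilder.DefineMethod(
                "set_" + prop.Name, attrs, null, new[] { prop.PropertyType });
            il = setter.GetILGenerator();
            il.Emit(OpCodes.Ldarg_0);
            il.Emit(OpCodes.Ldarg_1);
            il.Emit(OpCodes.Stfld, field);
            il.Emit(OpCodes.Ret);

            propBuilder.SetGetMethod(getter);
            propBuilder.SetSetMethod(setter);
        }
        return Activator.CreateInstance(typeBuilder.CreateType());
    }
}
```

Usage would then be `var item = (IInvoiceItem)InterfaceImplementor.CreateClassFromInterface(typeof(IInvoiceItem));`. Note that on older .NET Framework versions the dynamic assembly is created via `AppDomain.CurrentDomain.DefineDynamicAssembly` instead of the static `AssemblyBuilder` method.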

The database contains hundreds of tables, and the whole solution is composed of dynamically loaded modules (100+ modules).

What do you think is the best way to deal with this?

EDIT:

Using partial classes is a good tip, but it cannot be used in our solution, because "IDocumentItem" and "IItemWithPrice" (for example) live in different assemblies.

Currently, if we change the "DocumentItem" table, we must regenerate the source code in all dependent assemblies. There is almost no reuse (because we cannot use multiple inheritance), and it is quite time-consuming when there are dozens of dependent assemblies.

+1  A: 

So, you're really looking for some kind of mix-in technology. Of course, I have to ask why you aren't using LINQ to Entity Framework or NHibernate. O/RMs handle these problems by mapping the relational model into usable data structures that have APIs to support all of the transactions that you'll need to manipulate data in the database. But I digress.

If you are really looking for a mix-in technology to do dynamic code generation, check out Cecil at the Mono Project. It's a way better place to start than trying to use Reflection.Emit to build dynamic classes. There are other dynamic code generators out there but you may want to start with Cecil since the documentation is pretty good.

W. Kevin Hazzard
Is Cecil better in some way than compiling code from a string at run time?
TcKs
+2  A: 

I think it is a bad idea to automatically generate your domain model from your database schema.

nightcoder
Why? All the metadata is known; I don't see any reason why a developer should do this instead of a computer. Automatic code generation is much cheaper, and developers then have more time for complex problems. That's my motivation for doing it automatically.
TcKs
TcKs, Because OOP should NOT be coopted to follow the conventions of RDBMS. If you want everything to be data-centric, put your business logic in stored procs.
Jon Limjap
But the OOP model is not co-opted by the DB model; rather, the DB model is co-opted by the OOP model. Moving business logic into stored procedures is not possible, because the business logic is highly variable and depends heavily on users' ad-hoc requirements.
TcKs
For example, aggregation and inheritance are implemented the same way in databases. When generating, how will you decide whether it is aggregation or inheritance? Also, what if you have a bit mask stored as an int in the database and you want it to be an enum in the application?
nightcoder
This doesn't need to be a totally universal solution. We have some vetted design rules and conventions that make resolving "inheritance vs. aggregation" very easy. The resolution is not the problem.
TcKs
A: 

The scenario is unclear to me.

If the code is generated, you don't need any magic: add some metadata to your database objects (e.g. Extended Properties in SQL Server) that flags the "basic" interfaces, and modify your generating template/tool to consider the flags.

If the question is about multiple inheritance, you are out of luck with .Net.

If the code is generated, you may also take advantage of partial classes and methods (are you using .Net 3.5?) to produce code in different source files.
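As a sketch of that partial-class route (the file names and the `HasPrice` helper are hypothetical), the generator owns one part and developers own the other:

```csharp
using System;

// InvoiceItem.Generated.cs -- regenerated by the tool; do not edit by hand.
public partial class InvoiceItem
{
    public Guid? ItemID { get; set; }
    public Guid? IDDocument { get; set; }
    public Guid? IDProduct { get; set; }
    public decimal? Price { get; set; }
}

// InvoiceItem.cs -- hand-written part; survives regeneration.
public partial class InvoiceItem
{
    public bool HasPrice { get { return Price.HasValue; } }
}
```

The catch, as the asker's EDIT notes, is that all parts of a partial class must live in the same assembly, so this does not help when the pieces are split across modules.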

If you need to generate code at run-time there are many techniques, not least ORM tools.

Now, could you be a bit more explicit about your design context?

Giulio Vian
Can you explain "If you need to generate code at run-time there are many techniques" in more detail, please? It looks interesting. I've added an "EDIT" to the question; is it clearer now?
TcKs
About run-time code gen: as someone else said, there is Reflection.Emit, but it's really complex. A simpler approach is a combination of Reflection, CodeDom, and some template technology: you apply a template to generate the class's source code based on reflection data, then compile this code into an assembly, and finally you load the assembly and instantiate the class. What I don't understand is why you need run-time code gen when the leaf interface (IInvoiceItem) is known at build time.
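The reflect-generate-compile approach described in that comment can be sketched like this (a hedged sketch; the class name is mine, and it assumes a public, non-nested interface and the classic .NET Framework CodeDom compiler, which is not supported on .NET Core):

```csharp
using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using System.Linq;
using System.Text;
using Microsoft.CSharp;

public static class RuntimeCompiler
{
    // Generates C# source implementing 'interfaceType' with auto-properties,
    // compiles it in memory, and returns an instance of the new class.
    public static object CreateInstance(Type interfaceType)
    {
        var provider = new CSharpCodeProvider();
        string className = "Impl_" + interfaceType.Name;

        // 1. Reflect over the interface and emit source via a simple template.
        var source = new StringBuilder();
        source.AppendFormat("public class {0} : {1} {{\n",
            className, interfaceType.FullName);
        var allInterfaces = new[] { interfaceType }
            .Concat(interfaceType.GetInterfaces());
        foreach (var prop in allInterfaces.SelectMany(i => i.GetProperties()))
        {
            // GetTypeOutput turns e.g. Nullable<Guid> into valid C# ("System.Guid?").
            string typeName = provider.GetTypeOutput(
                new CodeTypeReference(prop.PropertyType));
            source.AppendFormat("    public {0} {1} {{ get; set; }}\n",
                typeName, prop.Name);
        }
        source.Append("}\n");

        // 2. Compile in memory, referencing every assembly that declares
        //    one of the interfaces (they may live in different modules).
        var options = new CompilerParameters { GenerateInMemory = true };
        foreach (var itf in allInterfaces)
            options.ReferencedAssemblies.Add(itf.Assembly.Location);
        CompilerResults results =
            provider.CompileAssemblyFromSource(options, source.ToString());
        if (results.Errors.HasErrors)
            throw new InvalidOperationException("Compilation failed.");

        // 3. Load the compiled assembly and instantiate the class.
        return Activator.CreateInstance(
            results.CompiledAssembly.GetType(className));
    }
}
```

This is slower than Reflection.Emit at start-up (it shells out to the real C# compiler) but far easier to debug, since you can dump the generated source when compilation fails.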
Giulio Vian
A: 

If you wish to continue auto-generating from the database and want to model multiple inheritance, then I think you have the right idea: alter the tool to spit out interfaces with multiple inheritance, plus however many concrete implementations you need.

You indicated elsewhere that a convention for inheritance vs. aggregation is enforced, and (as I understand) you know exactly how the resulting interfaces and classes should look. I understand that business rules are implemented elsewhere (maybe in a business rules engine?), so regenerating the classes should not require changes to dependent code, unless you want to take advantage of those changes, or existing properties have been altered or removed.

But you won't be done. Your classes will still hold the IDs of related entities. If you want to make things easier for client code, you should expose references to related entities (so callers need not care about the related entity's ID), like this:

public class Person{
    public Guid? PersonID { get; set; }
    public Person Parent { get; set; }
}

That would make things easier on the client. When you think about it, going from IDs to references is work you have to do anyway; it's better to do it once in the middle tier than to let the client do it N times. Plus, this makes your code less database-dependent.

So above all else, I recommend writing an OO wrapper for the auto-generated classes. You would program against this OO wrapper for almost everything; let only the data access layer interact with the auto-generated classes. Sure, you can't reuse inheritance metadata in the database (specified via conventions, I assume?), but at least you won't be carving a new path.
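A minimal sketch of such a wrapper (every name here is hypothetical, and the generated classes are simplified to stand alone): the wrapper exposes object references while only the data access layer ever sees the raw IDs.

```csharp
using System;

// Simplified stand-ins for the generated data classes.
public class InvoiceItem
{
    public Guid? IDProduct { get; set; }
    public decimal? Price { get; set; }
}

public class Product
{
    public Guid? IDProduct { get; set; }
    public string Name { get; set; }
}

// Hypothetical lookup service owned by the data access layer.
public interface IRepository
{
    Product GetProduct(Guid? id);
}

// Hand-written OO wrapper: clients see a Product reference, not a Guid.
public class InvoiceItemModel
{
    private readonly InvoiceItem _data;
    private readonly IRepository _repo;

    public InvoiceItemModel(InvoiceItem data, IRepository repo)
    {
        _data = data;
        _repo = repo;
    }

    public decimal? Price
    {
        get { return _data.Price; }
        set { _data.Price = value; }
    }

    // The foreign key is resolved to a real object on demand.
    public Product Product
    {
        get { return _repo.GetProduct(_data.IDProduct); }
    }
}
```

Regenerating the data classes then touches only the thin wrapper layer, not every client of the model.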

By contrast, what you have now looks like an anemic data model or worse.

apollodude217