We've got a database with over 1,000 tables and would like to consider using EF4 for our data access layer, but I'm concerned about the practical realities of using it with such a large data model. I've seen this question and read about the suggested solutions here and here. These may work, but appear to refer to the first version of the Entity Framework (and are more complex than I'd like). Does anyone know if these solutions have been improved upon in EF4? Or have other suggestions altogether? Thanks.

UPDATE: After a number of attempts at making EF work, I've decided to abandon it altogether for this project. Large data model support just isn't there, and while there may be workarounds (e.g. editing and maintaining the XML independently of the designer), they just don't feel ready for prime time. Most problematic for me is that EF doesn't work well with the domain model spread across multiple XML files without a lot of redundancy and duplication of code. I'm still open to suggestions (I know I haven't peeled back all layers of the EF onion), but for now I am moving on without EF.

UPDATE #2: It looks like the pending Code First support (currently in EF4 CTP4) is likely to end up being the solution we want, as it takes the designer and large-XML-file maintenance out of play.
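For reference, the Code First approach in the CTP lets the model be defined entirely in code, with no EDMX/XML to maintain. A minimal sketch, assuming the DbContext/DbSet API from the CTP (the entity and context names here are purely illustrative, not from our actual schema):

```csharp
using System.Data.Entity; // EF Code First CTP assembly

// Illustrative entity — a plain POCO, no designer-generated code
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The context replaces the EDMX: the model is inferred from these sets
public class SalesContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
}
```

Because the model lives in code, splitting it across files, assemblies, or contexts is ordinary refactoring rather than XML surgery.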

+4  A: 

You should definitely take a look at LLBLGen Pro v3. LLBLGen is another O/RM tool, but the latest version contains a designer that allows you to generate models for LINQ to SQL, NHibernate, AND Entity Framework (both 1.0 and 4.0). Its designer is pretty solid and has better support for large domain models.

Steven
This sounded like a pretty good solution, but LLBLGenPro became unresponsive as soon as I tried to open a connection to my database. Evidently, the application runs a rather inefficient query against the INFORMATION_SCHEMA views to create the object catalog. I could probably get around this by setting the timeout to greater than 10 minutes for the database connection, but waiting for 10 minutes every time I want to modify the model doesn't feel like a very workable solution. Thanks for the suggestion, though!
David Kreps
I advise you to contact the creators through mail or their forum. They have very responsive support. I bet they can tell you very quickly what you are doing wrong. And when it is a bug, they normally have a fix available in no time. Good luck.
Steven
Are you sure you don't have a really slow connection? 1000 tables isn't all that large - we have around that many and LLBL pulls the object data AND generates all the ORM code in under 5 minutes. We also are just using straight LLBL, rather than using it to generate EF4 or other ORM code.
Jess
I wrote LLBLGen Pro and have tested it on a DB with over 1,000 tables without problems. The query over INFORMATION_SCHEMA — is that the FK query? If possible, could you give more information by posting on our forums or emailing us at support AT llblgen DOT com? Thanks!
Frans Bouma
Yes, it was most definitely the FK query. I've sent an email with more detail per your request. Thanks!
David Kreps
Thanks to David Kreps we found the issue and have written a new FK meta-data retrieval query which is super fast on SQL Server 2005/2008, and which will be available in the next build (v3.0 builds after September 17th). :) So this slowness on very dense models with a lot of tables (1,000+) is now solved :)
Frans Bouma
+3  A: 

The number I heard in a Microsoft screencast is a maximum of roughly 250 tables per EF model. That doesn't mean EF can't handle more - it might just be sensible to break up your 1000+ tables into several logical groups of tables, and use one EF model per such logical group (with up to 250 tables in it).

I highly doubt you'll have queries that will need to use all 1000 tables at once - most likely not even 10 at once. So you should definitely be able to split up your pretty large model into smaller clusters and turn each into a separate EF model.
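As a sketch of that clustering idea in Code First terms (all names here are purely illustrative; the same split works equally well with one EDMX per area in the designer):

```csharp
using System.Data.Entity;

// Illustrative entities only — stand-ins for two "areas" of a big schema
public class Order     { public int Id { get; set; } }
public class OrderLine { public int Id { get; set; } }
public class Product   { public int Id { get; set; } }
public class Warehouse { public int Id { get; set; } }

// One small context per logical area, instead of one 1,000-table model.
// Each context maps only the tables that area actually queries together.
public class OrderingContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<OrderLine> OrderLines { get; set; }
}

public class InventoryContext : DbContext
{
    public DbSet<Product> Products { get; set; }
    public DbSet<Warehouse> Warehouses { get; set; }
}
```

Tables shared between areas (e.g. a common lookup table) can simply be mapped in more than one context; that duplication is generated mapping, not hand-written logic.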

marc_s
That's a bummer
Jess
I don't see why people have a problem with this. I'm doing it at the moment and it just makes sense. I have a DB with about 300 tables but I'm splitting it into 12 models since there are 12 "areas" of the system. It's maintainable and simple. So what if code is duplicated...I didn't write the code!
Stimul8d