Hello,

I have a thought-provoking question about using Entity Framework (EF) to persist data. My (intended) application is not a typical LOB scenario where standard CRUD operations are performed on individual records.

Instead, I would like to use the data stored in my entities to create some matrices by combining the data from several entities, and then do some intensive numerical math. Throughout this intensive process, properties on the entities will be continually accessed and updated.

My concern is that, compared with standard C# objects, accessing and updating properties on EF entities will severely reduce the speed of the entire operation, due to lazy loading, the PropertyChanged/PropertyChanging change-notification calls, and the calls to SaveChanges on the EF context object.

Any thoughts on how to mitigate the speed issues, at the expense of some of the niceties that EF offers?

Regards, LiamV

+1  A: 

Hm, first of all, you need to avoid references between entities, and do not use the generated entities. Consider using POCOs (plain old C# objects) and the Entity Framework code-first approach. This allows you to control all the code Entity Framework will call through your entities. Maybe this link helps: http://blogs.msdn.com/b/adonet/archive/2009/05/21/poco-in-the-entity-framework-part-1-the-experience.aspx
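
As a rough illustration of that idea (the types here, Measurement and AnalysisContext, are hypothetical and not from the question): a POCO entity is just a plain class, so reading and writing its properties in a tight numerical loop is an ordinary property access, with no generated-code or change-notification machinery in the way.

    using System.Data.Entity;  // DbContext API (EF 4.1+)

    // A plain C# class: no base class, no generated code, no events.
    // Non-virtual properties also mean EF creates no lazy-loading proxy.
    public class Measurement
    {
        public int Id { get; set; }
        public double Value { get; set; }
    }

    // Code-first context that maps the POCO to the database.
    public class AnalysisContext : DbContext
    {
        public DbSet<Measurement> Measurements { get; set; }
    }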

STO
+1  A: 

Don't prematurely optimize. Test it and see. Lazy loading can be turned off, and change tracking isn't a huge overhead. Yes, you can use POCOs if need be, but it would be a huge mistake to make such a decision on the basis of imagined performance problems.
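
For reference, here is a minimal sketch of those switches, using the later DbContext API and reusing the hypothetical AnalysisContext from the previous answer. Both lazy loading and change tracking can be turned off, per context or per query:

    using System.Data.Entity;
    using System.Linq;

    public static class FastRead
    {
        public static void LoadForMath()
        {
            using (var db = new AnalysisContext())
            {
                // Disable lazy loading and automatic change detection
                // for this context instance.
                db.Configuration.LazyLoadingEnabled = false;
                db.Configuration.AutoDetectChangesEnabled = false;

                // AsNoTracking() returns entities the context never tracks,
                // so property access in the math loop has no EF overhead.
                var data = db.Measurements.AsNoTracking().ToList();
            }
        }
    }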

That said, I think it's a good decision from a dependency-management point of view to not make business logic dependent on persistent storage. You don't need to use POCO entities to do this, though; you can project onto business types with any kind of entity.
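
One way to read that suggestion (again a sketch, with the same hypothetical types as above): project the query results onto plain objects, run the heavy numerical work against those, and only go back through tracked entities when persisting the result.

    using System.Data.Entity;
    using System.Linq;

    public static class ProjectionExample
    {
        public static void Run()
        {
            using (var db = new AnalysisContext())
            {
                // Project onto a plain shape; the math never touches an EF entity.
                var points = db.Measurements
                               .Select(m => new { m.Id, m.Value })
                               .ToList();

                double sum = points.Sum(p => p.Value);  // intensive work here

                // Go back through a tracked entity only to persist the result.
                var target = db.Measurements.Find(points[0].Id);
                target.Value = sum;
                db.SaveChanges();
            }
        }
    }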

Craig Stuntz