views:

75

answers:

3

Hi everyone.

In C# .NET (VS 2008) I'm developing an N-tier CRM framework solution, and when it's done I want to share it.

The architecture is based on:

Data Access Layer, Entity Framework, Business Logic Layer, WCF, and finally the presentation layer (WinForms).

Somewhere I read that more than two tiers are problematic because of optimistic concurrency updates (multiple client transactions working with the same data).

In solutions with at most two tiers this should not be a problem, because controls (like DataGridView) solve it by themselves, so I'm asking myself whether it wouldn't be better to work with two tiers and thus avoid the optimistic concurrency problem.

Actually I want to build an N-tier solution for large projects, not a 2-tier one. I don't know how to solve concurrency problems like this and hope to get help here.

Certainly there should be some good mechanisms to solve this... any suggestions, examples, etc.?

Thanks in advance.

Best regards, Jooj

A: 

We use a combination of manual merging (determining change sets and collisions) and last-writer-wins, depending on the data requirements. If the data changes collide (the same field changed from a common original value), a merge-type exception is thrown and the clients handle it.
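The change-set/collision idea above can be sketched language-neutrally (the answer doesn't show its actual code, so `merge_records` is a hypothetical helper): given the common original record and two edited copies, a field changed by both sides to different values is a collision; everything else merges cleanly.

```python
# Hypothetical sketch of change-set merging with collision detection.
# original: the record both users started from; mine/theirs: their edits.
def merge_records(original, mine, theirs):
    merged = dict(original)
    collisions = {}
    for field in original:
        mine_changed = mine[field] != original[field]
        theirs_changed = theirs[field] != original[field]
        if mine_changed and theirs_changed and mine[field] != theirs[field]:
            # Both users changed the same field from the common original
            # value: raise/record a merge conflict for the client to handle.
            collisions[field] = (mine[field], theirs[field])
        elif mine_changed:
            merged[field] = mine[field]
        elif theirs_changed:
            merged[field] = theirs[field]
    return merged, collisions

original = {"name": "Acme", "phone": "111", "city": "Graz"}
mine     = {"name": "Acme", "phone": "222", "city": "Linz"}
theirs   = {"name": "Acme", "phone": "333", "city": "Graz"}

merged, collisions = merge_records(original, mine, theirs)
# "city" merges cleanly; "phone" collides and the clients must resolve it
# (or a last-writer-wins policy simply picks one side).
```

A real implementation would raise the collision as an exception across the service boundary, as the answer describes.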

Preet Sangha
A: 

A few things come to my mind:

1) If you are using EF, surely you don't have a separate Data Access Layer?! Do you mean the database itself?

2) The question of tiers is both a physical and a logical one. So do you mean physical or logical tiers?

3) In any n-tiered application there is this issue with concurrency. Even in client-server, people could open a form, go somewhere, come back and then save, while the data has been changed by someone else. You can use a timestamp check while saving to make sure the row is still as it was when you read it.

4) Do not think too much about fewer or more tiers. Just implement the functionality as simply as possible and with the minimum number of layers.

Aliostad
+1  A: 

It's not really a question of the number of tiers. The question is how your data access logic deals with concurrency. Dealing with concurrency should happen in whichever tier handles your data access, regardless of how many tiers you have. But I understand where you're coming from, as the .NET controls and components can hide this functionality and reduce the number of tiers needed.

There are two common methods of optimistic concurrency resolution.

The first is using a timestamp on rows to determine if the version the user was looking at when they started their edit has been modified by the time they commit their edit. Keep in mind that this is not necessarily a proper Timestamp database data type. Different systems will use different data types each with their own benefits and drawbacks. This is the simpler approach and works fine with most well designed databases.
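A minimal sketch of this first approach, using an in-memory row with an integer version counter standing in for the database timestamp/rowversion column (the class and names are illustrative, not EF's API):

```python
# Sketch of optimistic concurrency via a row-version check.
class ConcurrencyError(Exception):
    pass

class Row:
    def __init__(self, data):
        self.data = dict(data)
        self.version = 1  # stands in for a rowversion/timestamp column

    def update(self, changes, expected_version):
        # Commit only if nobody changed the row since we read it.
        if self.version != expected_version:
            raise ConcurrencyError("row was modified by another user")
        self.data.update(changes)
        self.version += 1

row = Row({"name": "Acme", "phone": "111"})
v = row.version                      # client A reads the row (version 1)
row.update({"phone": "222"}, v)      # client B saves first -> version 2
conflict = False
try:
    row.update({"phone": "333"}, v)  # client A saves with a stale version
except ConcurrencyError:
    conflict = True                  # client A must re-read and retry
```

In a real database the version comparison happens in the UPDATE statement's WHERE clause, and an affected-row count of zero signals the conflict.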

The second common approach is, when committing changes, to identify the row in question not only by id but by the original values of all the fields the user changed. If the id and those original values don't match the record being edited, you know that at least one of those fields has been changed by another user. This option has the benefit that even if two users edit the same record, as long as they don't change the same fields, the commit works. The downside is that there may be extra work involved to guarantee that the resulting database record is still consistent as far as business rules are concerned.
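This second approach can be demonstrated with plain SQL (using Python's built-in sqlite3 here; the table and values are made up for illustration): the UPDATE's WHERE clause matches the id plus the original values of the changed fields, so an affected-row count of zero means another user modified one of those fields in the meantime.

```python
# Sketch of optimistic concurrency by matching original field values.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, phone TEXT)")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', '111')")

# User A reads the record; then user B changes the phone number first.
conn.execute("UPDATE customer SET phone = '222' WHERE id = 1")

# User A tries to change phone, matching its original value '111': no rows match.
cur = conn.execute(
    "UPDATE customer SET phone = '333' WHERE id = 1 AND phone = '111'"
)
conflict = cur.rowcount == 0  # True: someone else changed that field

# User A editing only 'name' still succeeds, because 'phone' is not among
# the fields they changed and so is not part of their WHERE clause.
cur = conn.execute(
    "UPDATE customer SET name = 'Acme Ltd' WHERE id = 1 AND name = 'Acme'"
)
name_ok = cur.rowcount == 1
```

This is essentially what EF generates when entity properties are marked for concurrency checking, though the exact SQL differs.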

Here's a decent explanation of how to implement simple optimistic concurrency in EF.

Chuck