views: 318

answers: 4

In my current project, the business logic is implemented in stored procedures (1,000+ of them), and now they want to scale it up as the business is growing. The architects have decided to move the business logic to the application layer (.NET) to boost performance and scalability. But they are not redesigning or rewriting anything: in short, the same SQL queries that are fired from an SP today will be fired from a .NET function using ADO.NET. How can this yield any performance gain?
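To make this concrete (the procedure, table, and column names below are made up), the change essentially amounts to the following: the same query, just issued from C# instead of being wrapped in a stored procedure.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class CustomerBalance
{
    // Today: the logic is wrapped in a stored procedure.
    public static decimal ViaStoredProc(string connStr, int customerId)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("usp_GetCustomerBalance", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", customerId);
            conn.Open();
            var result = cmd.ExecuteScalar();
            return (result == null || result == DBNull.Value) ? 0m : (decimal)result;
        }
    }

    // Proposed: the same SQL text, now issued from the .NET layer via ADO.NET.
    public static decimal ViaAdoNet(string connStr, int customerId)
    {
        const string sql =
            "SELECT SUM(Amount) FROM dbo.Transactions WHERE CustomerId = @CustomerId";
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@CustomerId", customerId);
            conn.Open();
            var result = cmd.ExecuteScalar();
            return (result == null || result == DBNull.Value) ? 0m : (decimal)result;
        }
    }
}
```

Either way, the database executes the same query; only the place where the SQL text lives has changed.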

To the best of my understanding, we need to move business logic to the application layer when we need DB independence, or when there is some business logic that is better implemented in an OOP language than in an RDBMS engine (like traversing a hierarchy, image processing, etc.). In the remaining cases, if there is no complicated business logic to implement, I believe it is better to keep the business logic in the DB itself; at the very least, the network round trips between the application layer and the DB can be avoided that way.

Please let me know your views. I am a developer looking at some architecture decisions with a little hesitation; pardon my ignorance on the subject.

+1  A: 

Architectural arguments such as these often need to weigh many trade-offs. Considering performance in isolation, or indeed considering only one aspect of performance such as response time, tends to miss the larger picture.

There is clearly some trade-off between executing logic in the database layer and shipping the data back to the application layer to process it there: data-shipping costs versus processing costs. As you indicate, the cost and complexity of the business logic will be a significant factor; the size of the data to be shipped would be another.

It is conceivable, if the DB layer is getting busy, that offloading processing to another layer may allow greater overall throughput even if individual response times increase. We could then scale the app tier in order to deal with the extra load. Would you now say that performance has been improved (greater overall throughput) or worsened (some increase in response time)?

Now consider whether the app tier might implement interesting caching strategies. Perhaps we get a very large performance win - no load on the DB at all for some requests!
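As a sketch of the idea (using .NET's System.Runtime.Caching here purely as one possible mechanism; the key and the five-minute expiry are arbitrary):

```csharp
using System;
using System.Runtime.Caching;

public static class AppTierCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns a cached copy if one exists; otherwise loads the data once
    // and keeps it for five minutes, so repeated requests within that
    // window put no load on the database at all.
    public static T GetOrLoad<T>(string key, Func<T> loadFromDb) where T : class
    {
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        var fresh = loadFromDb();
        Cache.Set(key, fresh, DateTimeOffset.UtcNow.AddMinutes(5));
        return fresh;
    }
}
```

Whether this is acceptable depends entirely on how stale the data is allowed to be.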

djna
+1 for wide view
borisCallens
One question about caching: can't the RDBMS implement it better than we can in the application layer?
Faiz
Again, in my particular case, the same SQL queries (the ones used in the SPs) are fired from the .NET application layer. So there is an increase in response time, but how can this reduce the load on the DB when it receives the same number of queries, just from the application layer instead of from SPs?
Faiz
An RDBMS would typically not cache derived data. Further, if the cache is at the RDBMS layer then you still need a network hop to go get it. Having it in the app tier can actually perform very well. It very much depends on the exact use case, the stability of the data, and so on. Again, no "one-size-fits-all" answer.
djna
+2  A: 

If your business logic is still in SQL statements, the database will be doing as much work as before, and you will not get better performance. (It may even be doing more work, if it cannot cache query plans as effectively as it did when stored procedures were used.)

To get better performance you need to move some work to the application layer. Can you, for example, cache data on the application server and do a lookup or a validation check without hitting the database?
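For example (a minimal sketch; the "country code" data is just a stand-in for any small, slow-changing reference set), a validation check that never touches the database once the set has been loaded:

```csharp
using System;
using System.Collections.Generic;

// Holds a small reference set in memory so validation does not require
// a SELECT per check. Deciding when to refresh the set is the usual
// cache-synchronization question.
public class CountryCodeValidator
{
    private readonly HashSet<string> _validCodes;

    public CountryCodeValidator(IEnumerable<string> codesLoadedFromDb)
    {
        _validCodes = new HashSet<string>(codesLoadedFromDb, StringComparer.OrdinalIgnoreCase);
    }

    // Pure in-memory check: no database round trip.
    public bool IsValid(string code)
    {
        return !string.IsNullOrEmpty(code) && _validCodes.Contains(code);
    }
}
```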

Shiraz Bhaiji
I am not sure about cache management and synchronization. I am reading about it these days.
Faiz
Check out the Enterprise Library Caching Application Block.
Shiraz Bhaiji
+1  A: 

I think those decisions should not be justified using architectural dogma. Measurements and data would make a great deal more sense.

Statements like "All business logic belongs in stored procedures" or "Everything should be on the middle tier" tend to be made by people whose knowledge is restricted to databases or objects, respectively. Better to combine both when you judge, and do it on the basis of measurements.

For example, if one of your procedures is crunching a lot of data and returning a handful of results, there's an argument that says it should remain on the database. There's little sense in bringing millions of rows into memory on the middle tier, crunching them, and then updating the database with another round trip.
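A rough sketch of that contrast (table and column names invented for illustration):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class OrderTotals
{
    // Crunch in the database: the server aggregates all the rows and
    // returns only one row per customer across the network.
    public static Dictionary<int, decimal> InDatabase(string connStr)
    {
        var totals = new Dictionary<int, decimal>();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT CustomerId, SUM(Amount) FROM dbo.Orders GROUP BY CustomerId", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    totals[reader.GetInt32(0)] = reader.GetDecimal(1);
        }
        return totals;
    }

    // Crunch in the middle tier: every order row is shipped over the
    // network before being grouped in memory, which is far more data
    // transferred for exactly the same answer.
    public static Dictionary<int, decimal> InMiddleTier(string connStr)
    {
        var totals = new Dictionary<int, decimal>();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(
            "SELECT CustomerId, Amount FROM dbo.Orders", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);
                    decimal amount = reader.GetDecimal(1);
                    decimal existing;
                    totals[id] = totals.TryGetValue(id, out existing) ? existing + amount : amount;
                }
        }
        return totals;
    }
}
```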

Another consideration is whether or not the database is shared between apps. If so, the logic should stay in the database so all can use it.

Middle tiers tend to come and go, but data remains forever.

I'm an object guy myself, but I would tread lightly.

It's a complicated problem. I don't think that black and white statements will work in every case.

duffymo
+1  A: 

Well, as others have already said, it depends on many factors. But from your question it seems the architects are proposing moving the stored procedures from inside the DB to dynamic SQL inside the application. That sounds very dubious to me.

SQL is a set-oriented language, and business logic that requires massaging large numbers of data records is better left in SQL; think of complicated search and reporting-type functions. On the other hand, line-item edits with corresponding business-rule validation are much better done in a programming language. Caching slow-changing data in the app tier is another advantage, and it works even better if you have a dedicated middle-tier service that acts as a gateway to all the data. If the data is shared directly among disparate applications, then stored procedures may be a good idea.

You also have to factor in the availability and experience of SQL talent versus programming talent in the organisation. There is really no general answer to this question.
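To illustrate the line-item side of that split (the rules here are purely hypothetical), this kind of per-record validation reads naturally in an OO language, whereas expressing the same checks in T-SQL tends to be clumsier:

```csharp
using System;

public class OrderLine
{
    public int Quantity { get; set; }
    public decimal UnitPrice { get; set; }
    public DateTime RequestedDelivery { get; set; }
}

public static class OrderLineRules
{
    // Hypothetical business rules, purely for illustration.
    public static void Validate(OrderLine line)
    {
        if (line.Quantity <= 0)
            throw new ArgumentException("Quantity must be positive.");
        if (line.UnitPrice < 0m)
            throw new ArgumentException("Unit price cannot be negative.");
        if (line.RequestedDelivery < DateTime.Today.AddDays(1))
            throw new ArgumentException("Requested delivery must be at least one day out.");
    }
}
```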

Pratik
I think the term "dynamic SQL" should not be used to refer to the SQL embedded in objects in an OOP language.
Faiz