I have a CRUD-heavy ASP.NET application with all the business logic in Stored Procedures.
As an example, there is an UPDATE stored procedure that's ~500 lines long and contains a large amount of conditional logic, referencing multiple tables and UDFs. The proc takes the name of the field being updated and its new value, sets a batch of declared variables, performs a series of validations, and builds a dynamic SQL statement to do the update. One size fits all. It's big and confusing.
I would like to move the business logic over to the .NET side to make it easier to manage/update, test and put under source control.
My question is this: where should this business logic go?
Say I have a PurchaseOrder object with a property called 'Factory'. If the Factory gets changed, I need to make sure the newly assigned factory makes the product that's on the PurchaseOrder, that it has pricing for it, that the requested quantity meets that factory's minimum, etc. All of these validations require queries against the database.
Should I have the PurchaseOrder object's Factory setter be responsible for the data validation, via an 'isFactoryValid' method/property that makes the multiple calls to a generic data access object, and then perform the update if the factory is valid?
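Roughly what I'm imagining for this first option — all of the names here (IDataAccess, the table/method names, etc.) are illustrative, not existing code:

```csharp
using System;

// Hypothetical generic data access object; each method is one database call.
public interface IDataAccess
{
    bool FactoryMakesProduct(int factoryId, int productId);
    bool FactoryHasPricing(int factoryId, int productId);
    int GetFactoryMinimumQuantity(int factoryId, int productId);
    void UpdatePurchaseOrderFactory(int orderId, int factoryId);
}

public class PurchaseOrder
{
    private readonly IDataAccess _db;
    private int _factoryId;

    public int Id { get; set; }
    public int ProductId { get; set; }
    public int Quantity { get; set; }

    public PurchaseOrder(IDataAccess db)
    {
        _db = db;
    }

    public int Factory
    {
        get { return _factoryId; }
        set
        {
            // Validation happens right here in the setter. Note this costs
            // three separate database round trips before the update runs.
            if (!IsFactoryValid(value))
                throw new InvalidOperationException(
                    "Factory cannot fulfill this purchase order.");

            _factoryId = value;
            _db.UpdatePurchaseOrderFactory(Id, value);
        }
    }

    private bool IsFactoryValid(int factoryId)
    {
        return _db.FactoryMakesProduct(factoryId, ProductId)
            && _db.FactoryHasPricing(factoryId, ProductId)
            && Quantity >= _db.GetFactoryMinimumQuantity(factoryId, ProductId);
    }
}
```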
Or should I create a PurchaseOrder/database 'proxy' object that's responsible for handling just PurchaseOrder-related data access? In that case, would I put an 'isFactoryValid' method in the proxy that's called by the PurchaseOrder's setter, followed by a call to the proxy's update method?
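A sketch of this second option, again with made-up table and member names — one thing I notice is that because the proxy knows it's validating a factory change, it could collapse the three checks into a single query:

```csharp
using System;
using System.Data.SqlClient;

// Hypothetical PurchaseOrder-specific proxy: owns both the validation
// queries and the update. Table and column names are assumptions.
public class PurchaseOrderProxy
{
    private readonly string _connectionString;

    public PurchaseOrderProxy(string connectionString)
    {
        _connectionString = connectionString;
    }

    // All three validations in one round trip instead of three.
    public bool IsFactoryValid(int factoryId, int productId, int quantity)
    {
        const string sql =
            @"SELECT COUNT(*)
              FROM FactoryProducts fp          -- assumed table name
              WHERE fp.FactoryId = @factoryId
                AND fp.ProductId = @productId
                AND fp.Price IS NOT NULL
                AND fp.MinimumQuantity <= @quantity";

        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@factoryId", factoryId);
            cmd.Parameters.AddWithValue("@productId", productId);
            cmd.Parameters.AddWithValue("@quantity", quantity);
            conn.Open();
            return (int)cmd.ExecuteScalar() > 0;
        }
    }

    public void UpdateFactory(int orderId, int factoryId)
    {
        const string sql =
            "UPDATE PurchaseOrders SET FactoryId = @factoryId WHERE Id = @orderId";

        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@factoryId", factoryId);
            cmd.Parameters.AddWithValue("@orderId", orderId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The PurchaseOrder's Factory setter would then just call the proxy's IsFactoryValid followed by UpdateFactory.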
How do I determine whether I need to worry about the increased database traffic from all these extra calls?