views: 54

answers: 1

My company has recently put up a demo database on its remote servers so that some of our partners can test the beta version of our software.

As expected, we noticed bottlenecks in some parts of the program, in particular in the places where many queries are executed.

For example, I have to load a customer from the database, along with all the products associated with it.

Normally this operation would be done as follows:

  • select somefields from customers where id = someId
  • Store the result into a business class
  • select someotherfields from products where customerid = previousid
  • Store the results into a collection and show/use them.

Imagine much more complex business logic that gathers data from 8 to 12 tables in the same way. With a local database this would be instant, but connecting to a remotely hosted SQL Server makes it extremely slow.

We found out that making a single query, like

SELECT Customer.Field1, Customer.Field2, Product.Field1, Product.Field2,
       --//LIST THE OTHER 200 FIELDS FROM 10 DIFFERENT TABLES THE SAME WAY
FROM Customer
LEFT JOIN Product ON Product.IdCustomer = Customer.IdCustomer
--//LEFT JOIN THE OTHER 10 TABLES THE SAME WAY
WHERE Customer.IdCustomer = 10

amazingly improved the speed of the whole execution.
The fact that this query is more complex than the individual selects is nothing compared to the fact that it needs only one access to the server.
I am talking about going from ~2000-3000 ms down to 80-120 ms.

Here comes the real problem (sorry for the long preface): how can I save data to a remote SQL Server efficiently?

Imagine I have a window/form/dialog/frame/whatever from which I have to save to several tables after a given operation.

For example:

  • INSERT INTO Customer (Field) VALUES ('hello world') ("IdCustomer" is an identity column)
  • Fetch the new id from the database
  • INSERT INTO SomeLinkedTable (IdCustomer, Field) VALUES (x, 'new value')
  • Fetch the new "IdSomeLinkedTable" and use it to link some other table, etc.

This is a common pattern for saving to multiple tables in our program, and each step is a separate round trip to the server, as the sketch below shows. Even though we don't leave the user without a message that the operation is going to take a while, waiting ~10 s for such a frequent multi-insert operation is way too much.
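
For illustration, here is roughly what that sequence looks like on the wire. Every statement below is sent as its own round trip; @IdCustomer is a placeholder for the value the client reads back in the previous step, and the table names are the hypothetical ones from the list above:

-- Round trip 1: insert the master row
INSERT INTO Customer (Field) VALUES ('hello world');
-- Round trip 2: read back the generated IdCustomer
SELECT SCOPE_IDENTITY();
-- Round trip 3: insert a detail row using the id fetched above
INSERT INTO SomeLinkedTable (IdCustomer, Field) VALUES (@IdCustomer, 'new value');
-- Round trip 4: read back IdSomeLinkedTable, and so on for every linked table
SELECT SCOPE_IDENTITY();

At, say, 50-100 ms of network latency per round trip, a dozen such statements add up to several seconds on their own.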

Maybe it's a simple server configuration problem, but I swear the firewalls on both sides (ours and the server's) are properly configured to allow SQL Server access.

Has anyone encountered the same problem?

A: 

If you pulled all your business logic into the app layer, then your app layer must live close to the database.

The solution for making a fast and snappy remote application is to use fiefdoms and emissaries, so that you remove the tight coupling to the database.

As an interim solution, you can speed up any operation by removing round trips to the database. If you have to save a master-detail form, you send the entire operation as one single T-SQL batch (INSERT the header, SELECT @@identity, INSERT all the details). Of course, in practice this means you have moved your business logic into the T-SQL batch, and once the logic lives in the database it is better to keep it in stored procedures anyway.
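
A minimal sketch of such a batch, reusing the hypothetical table names from the question (and using SCOPE_IDENTITY() rather than @@identity; see the comment below):

BEGIN TRANSACTION;
DECLARE @IdCustomer INT;

-- master row and its identity, captured inside the same batch
INSERT INTO Customer (Field) VALUES ('hello world');
SET @IdCustomer = SCOPE_IDENTITY();

-- detail rows reuse the id without it ever leaving the server
INSERT INTO SomeLinkedTable (IdCustomer, Field) VALUES (@IdCustomer, 'new value');
-- ...repeat for the other linked tables, still in this one batch
COMMIT TRANSACTION;

-- hand the new id back to the client in the same round trip
SELECT @IdCustomer AS IdCustomer;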

At the end of the day, you can't have your cake and eat it too. You can either make an autonomous, disconnected application that supports a remote, occasionally connected database, and demo that, or make a tightly coupled application that must live close to the database, and demo that.

Remus Rusanu
In that case the stored procedure would require a pretty long list of arguments. Or is there a better way to simplify it? Thank you for the link, I'll have a nice time reading it, I guess.
Alex Bagnolini
You can use table-valued parameters, for example, or wrap multiple parameters into one single XML parameter. But then you'd be moving more and more toward a multi-layered app (i.e. a complicated thing) built just for the purposes of the demo.
Remus Rusanu
TVP: http://msdn.microsoft.com/en-us/library/bb510489.aspx
Remus Rusanu
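
A rough sketch of the table-valued parameter approach (SQL Server 2008 and later); the type and procedure names here are made up for illustration:

-- a table type describing the detail rows the client passes in
CREATE TYPE dbo.SomeLinkedTableType AS TABLE (Field NVARCHAR(100));
GO

-- one call carries the header fields plus the whole detail set
CREATE PROCEDURE dbo.SaveCustomer
    @CustomerField NVARCHAR(100),
    @Details dbo.SomeLinkedTableType READONLY
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @IdCustomer INT;

    INSERT INTO Customer (Field) VALUES (@CustomerField);
    SET @IdCustomer = SCOPE_IDENTITY();

    -- set-based insert: no per-row round trips
    INSERT INTO SomeLinkedTable (IdCustomer, Field)
    SELECT @IdCustomer, Field FROM @Details;

    SELECT @IdCustomer AS IdCustomer;
END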
Never use SELECT @@identity; use SELECT SCOPE_IDENTITY() instead.
HLGEM
@hlgem: you're right. Gratz on the 10k rep btw
Remus Rusanu