views: 31
answers: 2

Hello.

I have very limited experience with database programming, and the applications I've written that access databases are simple ones :). Until now :(. I need to create a medium-size desktop application (is that called a rich client?) that will use a database on the network to share data between multiple users. Most probably I will use C# and MSSQL/MySQL/SQLite.

I have performed a few test runs and discovered that on low-quality networks database access is not so smooth. On one company's LAN a lot of data is transferred over the network and the servers are under constant load, so it's a common situation that a simple INSERT or SELECT SQL query will take 1-2 minutes or even fail with a timeout / network error.

Are there any best practices for handling such situations? Of course I can split my app into a GUI thread and a DB thread so that network problems don't lead to a frozen GUI. But what to do about the flood of network errors? Displaying them to the user too often will not be very good :(. I'm thinking about automatically creating a local copy of the database on each computer my app runs on: first updating the local database, then synchronizing it in the background, with simple retrying on network errors. This would allow the app to function even if the network has big lags / problems.
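For the retrying part, something like this sketch is what I have in mind (the attempt count, the delay, and the Orders table in the usage example are just placeholders):

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public static class Db
    {
        // Runs a database action, retrying on SqlException with a growing delay.
        public static T WithRetry<T>(Func<T> action, int attempts = 5)
        {
            for (int i = 1; ; i++)
            {
                try
                {
                    return action();
                }
                catch (SqlException)
                {
                    if (i >= attempts) throw; // give up; show the user one error
                    Thread.Sleep(2000 * i);   // back off a little more each time
                }
            }
        }
    }

    public static class Demo
    {
        // Called from the DB thread, never from the GUI thread.
        public static int CountOrders(string connectionString)
        {
            return Db.WithRetry(() =>
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
                {
                    conn.Open();
                    return (int)cmd.ExecuteScalar();
                }
            });
        }
    }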

Any hints or buzzwords on what I can look into? Maybe there are some best practices already available that I don't know about :)

+2  A: 

Sorry, this is probably not the answer you are looking for, but you mention that a simple INSERT / UPDATE could take 1-2 minutes or even fail with a timeout / network error.

This to me sounds like there may be another problem rather than the network itself. If you're working on a corporate network, there would have to be insane levels of traffic for this sort of behavior. I would do everything in your power to improve the network before proceeding. Can you post the result of a ping to the DB box?

If you're going to architect your application around this type of network, it will significantly alter the end product and could even result in a poor-quality product for other clients.

Depending upon the nature of the application, maybe look at implementing an async persistence queue and caching data on startup, or even embedding a copy of the DB in your application.
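As a very rough sketch of the queue idea (assuming SQL Server via System.Data.SqlClient; a real implementation would cap the retries and queue parameterized commands instead of raw SQL strings):

    using System;
    using System.Collections.Concurrent;
    using System.Data.SqlClient;
    using System.Threading;

    // In-memory async persistence queue: the GUI thread enqueues writes and
    // returns immediately; a background thread flushes them to the server,
    // retrying on timeouts / network errors.
    public class PersistenceQueue : IDisposable
    {
        private readonly BlockingCollection<string> _pending = new BlockingCollection<string>();
        private readonly string _connectionString;
        private readonly Thread _worker;

        public PersistenceQueue(string connectionString)
        {
            _connectionString = connectionString;
            _worker = new Thread(ProcessQueue) { IsBackground = true };
            _worker.Start();
        }

        // Called from the GUI thread: never blocks on the network.
        public void Enqueue(string sql)
        {
            _pending.Add(sql);
        }

        private void ProcessQueue()
        {
            foreach (var sql in _pending.GetConsumingEnumerable())
            {
                while (true) // naive retry loop; real code should cap attempts
                {
                    try
                    {
                        using (var conn = new SqlConnection(_connectionString))
                        using (var cmd = new SqlCommand(sql, conn))
                        {
                            cmd.CommandTimeout = 30;
                            conn.Open();
                            cmd.ExecuteNonQuery();
                        }
                        break; // success, take the next queued statement
                    }
                    catch (SqlException)
                    {
                        Thread.Sleep(5000); // wait out the network hiccup, retry
                    }
                }
            }
        }

        public void Dispose()
        {
            _pending.CompleteAdding(); // drain the queue, then let the worker exit
            _worker.Join();
        }
    }

The GUI thread only ever touches Enqueue, so a dead network just grows the queue instead of freezing the UI.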

Karl
It's a normal situation for small and medium-sized companies, where networks are badly designed, wrongly assembled and largely unmaintained :). Of course, a MySQL server is more network-tolerant than SQLite, but 2-3 second pings and Windows shares that can't open due to lag can kill any network conversation between the app and the database :)
Eye of Hell
+1  A: 

Even though async behaviour/queues/caching/copying the database to each local instance etc. will help treat the symptoms, the problem will still remain. If the network really is that bad, I'd address it with their I.T. department or the project manager, and build some performance requirements from their side of things into the contract.

sjhuk
+1 for adding a clause to the contract
Karl
It's a good idea. But sometimes it's better to add network tolerance to the software than to fight with every tenth customer over their network quality :)
Eye of Hell