views:

63

answers:

4

Hi,

We are developing a client-server desktop application (WinForms with SQL Server 2008, using LINQ to SQL). We are now finding many performance issues. These relate to querying too much data with LINQ, bad database design, too little caching, etc. What do you suggest we do? How should we go about solving these performance issues? One thing I am already doing is SQL profiling and trying to fix some queries. As far as caching is concerned, we have static lists. But how do we keep them updated? We don't have any server-side implementation, so these lists can become stale if someone changes the data.

regards

A: 

I found Jeff Atwood's articles on this quite interesting:

Compiled or Bust?

All Abstractions Are Failed Abstractions

Russ C
A: 

For updating, you can create a table; I call it ListVersions.

Just store list id, name and version.

When you make changes to a list, increment its version. In your application, you then just compare versions and update only the lists whose version has changed, not all of them.
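
A rough sketch of that client-side check, assuming a LINQ to SQL entity mapped to the table (the ListVersions mapping, MyDataContext, and ReloadList below are hypothetical names, not from a real schema):

    // Illustrative table behind the entity:
    //   CREATE TABLE ListVersions (ListId INT PRIMARY KEY,
    //                              Name NVARCHAR(50), Version INT)

    private readonly Dictionary<int, int> _loadedVersions = new Dictionary<int, int>();

    public void RefreshStaleLists(MyDataContext db)
    {
        // One cheap query fetches every list's current version.
        foreach (var v in db.ListVersions)
        {
            int known;
            if (!_loadedVersions.TryGetValue(v.ListId, out known) || known < v.Version)
            {
                ReloadList(v.ListId);              // hypothetical: re-query just this list
                _loadedVersions[v.ListId] = v.Version;
            }
        }
    }

Calling something like RefreshStaleLists periodically (or on form activation) keeps the static lists from going stale without needing any push from the server.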

I've described this in my answer to this question:

http://stackoverflow.com/questions/2808209/what-is-the-preferred-method-of-refreshing-a-combo-box-when-the-data-changes/2808367#2808367

Good Luck!

hgulyan
A: 

A general recipe for performance issues:

  1. Measure (wall clock time, CPU time, memory consumption, etc.; a timing sketch follows this list).
  2. Design and implement an algorithm that you think could be faster than the current code.
  3. Measure again to assess the impact of your fix.
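
For example, a small wall-clock measurement wrapper in C# might look like this (the names here are my own, just to illustrate steps 1 and 3):

    using System;
    using System.Diagnostics;

    static class Measure
    {
        // Run a piece of work and report its wall-clock time.
        public static T Timed<T>(string label, Func<T> work)
        {
            var sw = Stopwatch.StartNew();
            T result = work();
            sw.Stop();
            Console.WriteLine("{0}: {1} ms", label, sw.ElapsedMilliseconds);
            return result;
        }
    }

    // Usage, before and after a change, with the same input:
    //   var invoice = Measure.Timed("CreateInvoice", () => CreateInvoice(orderId));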

Often the biggest bottlenecks aren't exactly where you thought they were, so base your actions on measured data.

Try to keep the number of SQL queries small. You're more likely to get performance improvements by lowering the number of queries than by restructuring the SQL syntax of an individual query.
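
In LINQ to SQL, a common source of extra queries is lazy loading of related rows (the classic N+1 pattern). DataLoadOptions.LoadWith can collapse those into a single round trip; the Customer/Orders entities and MyDataContext below are assumptions for illustration:

    using System;
    using System.Data.Linq;

    var db = new MyDataContext();                 // hypothetical DataContext

    var options = new DataLoadOptions();
    options.LoadWith<Customer>(c => c.Orders);    // fetch orders together with customers
    db.LoadOptions = options;                     // must be set before the first query

    // Without LoadWith, touching c.Orders inside the loop would issue
    // one extra query per customer; with it, the data arrives up front.
    foreach (var c in db.Customers)
        Console.WriteLine("{0}: {1} orders", c.Name, c.Orders.Count);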

I recommend adding some server-side logic instead of firing the SQL queries directly from the client. You could then implement caching shared by all clients on the server side.
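
As a rough illustration of what that server-side component could look like, here is a minimal shared cache, one entry per list, refreshed after a timeout (all names and the expiry policy are hypothetical):

    using System;
    using System.Collections.Generic;

    // A very small shared cache: all clients get the same copy per key.
    public static class SharedListCache
    {
        private class Entry { public DateTime LoadedAt; public object Value; }

        private static readonly object Gate = new object();
        private static readonly Dictionary<string, Entry> Entries =
            new Dictionary<string, Entry>();

        public static object Get(string key, TimeSpan maxAge, Func<object> load)
        {
            lock (Gate)
            {
                Entry hit;
                if (Entries.TryGetValue(key, out hit) &&
                    DateTime.UtcNow - hit.LoadedAt < maxAge)
                    return hit.Value;              // still fresh: serve the shared copy

                // Stale or missing: reload from the database while holding the
                // lock (simple, but it serializes reloads; fine as a sketch).
                object value = load();
                Entries[key] = new Entry { LoadedAt = DateTime.UtcNow, Value = value };
                return value;
            }
        }
    }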

Taneli Waltari
A: 

Performance analysis without tools is fruitless; with the wrong tools, it is frustrating. SQL Profiler is the wrong tool to rely on for what you are looking at; at best it gives you a hint of what is wrong.

You need to use a code profiler to determine why and when these queries are being executed. You should be able to find one by Googling and run it on an x-day trial.

The key questions are:

  1. Are queries being run multiple times when there is no reason to at all? Is the data already in memory (even if not stored statically)? This happens a lot where the data has already been retrieved, but some action in the code loads it again. Class properties are a big culprit here (see the sketch after this list).

  2. Should certain data be stored statically across the application? How volatile is that data? Can you afford to show stale data?
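
The property culprit in point 1 often looks like the first version below; the second avoids re-querying by caching the result (the Order type and the db/Id members are illustrative):

    // Culprit: every read of this property fires a fresh query.
    //   public List<Order> Orders
    //   {
    //       get { return db.Orders.Where(o => o.CustomerId == Id).ToList(); }
    //   }

    // Fix: run the query once and reuse the result until invalidated.
    private List<Order> _orders;
    public List<Order> Orders
    {
        get
        {
            if (_orders == null)
                _orders = db.Orders.Where(o => o.CustomerId == Id).ToList();
            return _orders;
        }
    }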

The only way to decide on #2 is to have hard data on the cost of a particular transaction. For example, if I know it takes 1983 ms to create a new invoice, what will it take after I start caching data, and is that savings significant? But recognize that you can't answer those questions until you know it takes 1983 ms to create an invoice.

When I profile an application transaction, I focus on the biggest contributor and try to determine why it is so big. I look for individual methods that are slow and for any code that is executed frequently. It is often the latter, the death of a thousand cuts, that gets you.

One last thing: it is also very important to know when to stop working on a performance issue.

Flory