My question is: what is the best practice for optimizing performance when using LINQ to SQL? By performance I mean response time in the user interface.

Right now I have some sales data in a SQL Server 2008 database, and I display this data (MAT, yearly, in different segments, growth per segment, percent of market growth, ...) in charts in an ASP.NET application, using LINQ to SQL to construct IQueryable expressions that are then executed.

The challenge I see is that I use LINQ to construct all my queries, so I have no control over what SQL is generated (I can trace it, but that is about it), and since I don't use stored procedures, how my data is fetched is something of a black box.
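For example, this is roughly how I trace the generated SQL today (a sketch; SalesDataContext and its Sales table/columns are stand-in names for my real DataContext and entities):

    using System;
    using System.Linq;

    // SalesDataContext / Sales / SaleDate / Amount are stand-in names.
    using (var db = new SalesDataContext())
    {
        // Echo every T-SQL statement LINQ to SQL sends to the server.
        db.Log = Console.Out;

        var totals =
            from s in db.Sales
            where s.SaleDate >= new DateTime(2009, 1, 1)
            group s by s.SaleDate.Month into g
            select new { Month = g.Key, Total = g.Sum(x => x.Amount) };

        foreach (var row in totals)   // the query executes here; its SQL is logged
            Console.WriteLine("{0}: {1}", row.Month, row.Total);
    }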

Right now I run some unit tests and manually test the application, and I use the Database Engine Tuning Advisor to find out what indexes etc. to create...

+1  A: 

In addition to that, I'll usually use both SQL profiler and CLR profiler with some simulated users on a large-ish data set, and watch for long-running queries and/or long-running calls through the datacontext (which may signify multiple round-trips happening under the covers). My personal preference is also to disable deferred loading and object tracking on all my datacontexts by default, so I have to opt-IN to multiple round-trips in most cases. While you can't directly affect the SQL that's generated, you can be careful with LoadWith/AssociateWith and make sure that you're not fetching horribly large/inefficient result sets, and break up queries that have lots of expensive joins (sometimes multiple round-trips are cheaper than mondo joins on big tables).
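For example, here is roughly what my default setup looks like (the context and entity names are illustrative; DeferredLoadingEnabled, ObjectTrackingEnabled and DataLoadOptions.LoadWith are the actual L2S knobs):

    using System.Data.Linq;
    using System.Linq;

    // OrdersDataContext / Order / OrderLines / Total are illustrative names.
    using (var db = new OrdersDataContext())
    {
        // Opt out of lazy loading and change tracking for read-only queries;
        // related data must now be requested explicitly.
        db.DeferredLoadingEnabled = false;
        db.ObjectTrackingEnabled = false;

        // Opt back IN to eager loading where it is actually wanted, instead
        // of paying a hidden round-trip per parent row.
        var opts = new DataLoadOptions();
        opts.LoadWith<Order>(o => o.OrderLines);
        db.LoadOptions = opts;   // must be set before the first query runs

        var bigOrders = db.Orders.Where(o => o.Total > 1000m).ToList();
    }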

It's all about measurement: use whatever tools you can get your hands on.

nitzmahone
Thanks, good points
salgo60
+1  A: 

Profiling, profiling, profiling. :)

Measure not only timings, but pay attention to I/O as well. A frequently executed query that is I/O intensive can execute fast due to caching, but it can in turn have a negative effect on overall db-server performance, since there will be fewer resources available for other queries.

As you say, L2S can be a bit of a black box, so you need to try to replicate all scenarios and/or profile while the app is in use by real users. Then use that to 1) tweak queries 2) add indexes 3) make any other changes needed to get the performance you need.

I have a profiling tool made specifically for Linq-to-SQL to make it a little bit 'less black box' - it allows you to do runtime profiling while tying the generated queries to the code (call stack) that resulted in a specific query being executed. You can download it and get a free trial license at http://www.huagati.com/L2SProfiler/

The background reason for my profiler is outlined in a bit more detail here: http://huagati.blogspot.com/2009/06/profiling-linq-to-sql-applications.html

...and some advanced profiling options are covered here: http://huagati.blogspot.com/2009/08/walkthrough-of-newest-filters-and.html


Another thing that may help if you have a lot of tables with a lot of columns is to get index info into the code editor. This is done by adding xml doc-comments with that info to the entity classes and member properties; that info is then displayed in the VS code editor's tooltips:

[Screenshot: code editor tooltips showing XML doc-comments for L2S entity classes, member properties etc.]

...that way you can see, already while typing queries, whether there is an index covering the column(s) used in where clauses etc. To avoid having to type all of that in, I have created a tool for that too. See the 'update documentation' feature in http://www.huagati.com/dbmltools/
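For example (typed by hand here for illustration; the tool generates the equivalent automatically, and the table/index names below are made up):

    using System;
    using System.Data.Linq.Mapping;

    /// <summary>
    /// Maps to table dbo.Sale.
    /// Indexes: PK_Sale (clustered: SaleId); IX_Sale_SaleDate (SaleDate, includes Amount).
    /// </summary>
    [Table(Name = "dbo.Sale")]
    public partial class Sale
    {
        /// <summary>Column SaleDate; covered by index IX_Sale_SaleDate.</summary>
        [Column]
        public DateTime SaleDate { get; set; }

        /// <summary>Column Amount; included in IX_Sale_SaleDate.</summary>
        [Column]
        public decimal Amount { get; set; }
    }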

KristoferA - Huagati.com
Thanks, I will have a look at your tool. My spontaneous feeling about LINQ to SQL is that the code you can write, and the possibility to refactor a query into nice smaller parts, is excellent compared to the old days, but the part where you have problems understanding what/how to tune can, I assume, cause problems. What will happen if the LINQ to SQL implementation changes the generated SQL in the future? Then you will have another access pattern in the database...
salgo60
Although the possibility exists that some future change will affect the generated SQL, I don't think it is something to worry about too much. The changes to L2S in .NET 4.0, at least, don't have any big impact on the generated SQL. Changes in your db (the amount of data in the various tables involved), changes in SQL Server itself, etc. are more likely to have an impact, and both of those will affect your app regardless of how the SQL is generated.
KristoferA - Huagati.com
As for the what/how to tune, I guess that is a bit of a 'getting used to' thing just as it is with raw SQL. The 'tweakable' areas are more or less the same in both L2S and raw SQL, so you just need to watch out for what could cause unnecessary I/O and try to eliminate those things.
KristoferA - Huagati.com
Any good book/web recommendations?!?! I have "Inside Microsoft SQL Server 2005: Query Tuning and Optimization". When I look at the generated SQL it is crazy: 440 lines...
salgo60
Most of the 440 lines are probably things you can ignore. Look at the where clause predicates and join criteria. You want to make sure there are indexes covering whatever you use to eliminate rows, and that there is nothing that prevents those indexes from being used. If you can share the query (anonymized if necessary), it will be easier to suggest what might be wrong; it really depends on the nature of the query. Unfortunately I haven't learnt from books, so I have no good book recommendations - I have just picked up query optimization the 'hard way' over the years... :)
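For example, a predicate that wraps an indexed column in a computation prevents an index seek, while an equivalent range predicate allows one (a sketch, continuing the made-up SalesDataContext/SaleDate names from above and assuming an index on SaleDate):

    // Translates to roughly DATEPART(Year, [SaleDate]) = 2009,
    // which cannot seek on an index over SaleDate:
    var slow = db.Sales.Where(s => s.SaleDate.Year == 2009);

    // Equivalent range predicate; translates to roughly
    // [SaleDate] >= @p0 AND [SaleDate] < @p1, which can use an index seek:
    var fast = db.Sales.Where(s => s.SaleDate >= new DateTime(2009, 1, 1)
                                && s.SaleDate < new DateTime(2010, 1, 1));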
KristoferA - Huagati.com
My execution plan: http://farm3.static.flickr.com/2804/4227477361_4d7d5f23e0_o.jpg
salgo60
Cool idea to add the index info to the XML docs... Right now I am running a minor project, but I assume that in a bigger project, where you have a SQL database administrator who is supposed to take over the database and wants to understand how it is tuned, he will get upset that he gets less control now with LINQ...
salgo60
If you can post the query and I/O stats (run 'set statistics io on;' before running the query) and/or the execution plan in XML format, it will give a better picture of what might be costly. The graphical representation of the plan is only the beginning; each node has a pile more data behind it...
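To get the exact statement and parameter values out of L2S so it can be replayed in SSMS with those options on, you can use DataContext.GetCommand (a sketch, again with made-up context/entity names):

    using System;
    using System.Data.Common;
    using System.Linq;

    using (var db = new SalesDataContext())
    {
        var query = db.Sales.Where(s => s.Amount > 1000m);

        // GetCommand returns the DbCommand L2S would execute, without running it.
        DbCommand cmd = db.GetCommand(query);
        Console.WriteLine(cmd.CommandText);
        foreach (DbParameter p in cmd.Parameters)
            Console.WriteLine("{0} = {1}", p.ParameterName, p.Value);
    }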
KristoferA - Huagati.com
...and yep, all O/R mappers are from time to time slammed by DBAs who want 'full control' etc. Some will even go as far as denying direct table access and doing all database operations through static stored procedures; an outdated approach in my opinion, and a costly one for the companies that adopt it due to reduced flexibility for developers, but it is still fairly common. A much better approach for those 'anti-db-access' DBAs would be to assist their co-worker devs with profiling and optimization.
KristoferA - Huagati.com