I'm fairly new to .NET development in general. I would like to do some instrumentation on my web application to tune performance, especially in relation to caching. I've written a lot of custom dynamically generated UserControls which I'd like to try caching in different ways--possibly by ASPX Page declaration or programmatically.

I also have a lot of Oracle queries that depend on each other and I'd like to look into caching results of those to see what will offer the best performance gains.

What would be the best way to go about this? Somehow I don't think using a stopwatch to time how long IE takes to load the page is the best idea. Aside from perceived delay, I'll have no idea whether my caching is being hit or missed. Does VS2008 have tools built in to assist?

+1  A: 

The way I usually approach this is by enabling ASP.NET tracing. There is a pretty good guide to enabling this here. It's quite detailed and best of all it's free.
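If it helps, tracing is usually switched on site-wide in web.config; a minimal sketch follows (the attribute values are illustrative defaults, not taken from the guide mentioned above):

<!-- web.config: enable ASP.NET tracing for the whole site. -->
<!-- Values below are placeholders; adjust requestLimit etc. to taste. -->
<configuration>
  <system.web>
    <trace enabled="true"
           pageOutput="true"
           requestLimit="40"
           localOnly="true" />
  </system.web>
</configuration>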

One thing to focus on is the size of the page (especially its view state), which is an important part of how long the page takes to download once the code has executed. The other is how fast particular parts of the code execute, which you can measure by making calls to Trace.Write before and after the code being perf tested.
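For example (the page and method names here are hypothetical, just to show the shape of the calls), the Trace.Write bracketing in a code-behind might look like:

using System;
using System.Web.UI;

public partial class Orders : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Bracket the code under test with trace messages; the elapsed time
        // between them appears in the "Trace Information" section of the
        // page's trace output (or trace.axd).
        Trace.Write("Perf", "Begin LoadOrders");
        LoadOrders();   // hypothetical method being measured
        Trace.Write("Perf", "End LoadOrders");
    }

    private void LoadOrders()
    {
        // ... data access and binding ...
    }
}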

The perf testing part of using ASP.NET tracing is really just a guide as it's not easily repeatable, but a good one nonetheless. If you want to step it up, there are tools in Visual Studio 2008 Team System that might be better (although I haven't used those). There's also JetBrains dotTrace and ANTS Profiler.

Alex Angas
+2  A: 

A project I worked on recently needed to time our SQL queries and output the results to the Debug listener when running in Debug mode. This way we could see how long each query took to execute, as well as debug our website code.

I did this by centralising our SQL queries into wrapper methods for the three types of SQL calls we used:

  • ExecuteQuery
  • ExecuteNonQuery
  • ExecuteScalar

These wrappers used the Stopwatch class, and also contained code to transform the command into a SQL statement similar to what you would see in SQL Server Profiler.

Each method was similar to the following:

// Requires System.Data.SqlClient and System.Diagnostics.
protected static object ExecuteScalar(SqlCommand comm, SqlParameter[] sqlParameters)
{
    Stopwatch st = new Stopwatch();
    bool errorDetected = false;

    try
    {
        // AddRange, not Add: we are attaching a whole array of parameters.
        comm.Parameters.AddRange(sqlParameters);

        using (comm)
        {
            // Time only the round trip to the database.
            st.Start();
            object returnValue = comm.ExecuteScalar();
            st.Stop();
            return returnValue;
        }
    }
    catch (Exception)
    {
        errorDetected = true;
        st.Stop();
        throw;
    }
    finally
    {
        // Rebuild a Profiler-style statement and log it with the elapsed time.
        string output = GetSqlStringForParameters(sqlParameters, st.Elapsed, QueryType.Scalar);

        if (errorDetected)
        {
            Debug.WriteLine("/*SQL (Errored)*/" + output, "DataAccess.SqlAdapter.ExecuteScalar");
        }
        else
        {
            Debug.WriteLine("/*SQL*/" + output, "DataAccess.SqlAdapter.ExecuteScalar");
        }
    }
}

This would then output our SQL statements in DebugView.exe like this:

/*SQL*/ exec spsGetOrder @OrderNumber='234567' -- Rows returned = 1 ;  1 params |--> completed NonQuery in 0.0016144 seconds.

The beauty of this is that, although the logging adds a small overhead to each statement, we can paste the logged statement straight into a query window and reproduce the query's output, just as we would from a SQL Server Profiler capture. It also means that if you have a tool to view the log files, you can use regular expressions to find queries whose execution time falls within a particular range.

So if you want to look for queries taking over 0.5 seconds, you could search for terms like:

  • "in 0\.[5-9]\d+" <-- anything from 0.5 up to 1 second
  • "in [1-9]\d*\.\d+" <-- anything of 1 second or more

This has helped us focus our efforts immensely. It also helps with identifying whether a problem is database related or, as covered above, an ASP.NET issue.

Finally, there is a tool called Fiddler which can also help you diagnose pages as they are downloaded to your browser. It gives you information such as file sizes, references to images/CSS, and download times. This is also very useful for diagnosing ViewState size issues.

I hope this helps

Dominic Zukiewicz