views: 338
answers: 3

I assume that most analysis and tracking is done with data gathered from browser actions like page requests. Tools such as AWStats, Google Analytics and Omniture operate in this space.

But there is also a good amount of data available in databases or service-level logs. For example, a GWT-based application might be a bit tricky to analyze from page requests alone. Or, in the case of a financial application, the customer might be interested in spotting suspicious transfers.

So, please share your best practices:

  • What kinds of approaches have you implemented for DB or log analysis?
  • Do you use existing tools or your own in-house products?
  • Are you happy just to track which functionality is used most and how quickly it is processed?
  • Or do you actually store user action paths and use those to spot unusual patterns?
+1  A: 

I know that in Oracle you can put hints in the form of SQL comments. The optimizer will see the comments and try to use the hints to make the SQL run faster, among other functions.
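As a minimal illustration of what such a hint comment looks like, here is one embedded in a query string; the table name `transfers`, its alias, and the index name `ix_transfers_date` are hypothetical, not from any real schema:

```python
# Illustration only: an Oracle optimizer hint written as a SQL comment.
# The /*+ ... */ syntax marks the comment as a hint block.
# Table "transfers" and index "ix_transfers_date" are made-up names.
query = (
    "SELECT /*+ INDEX(t ix_transfers_date) */ t.id, t.amount "
    "FROM transfers t "
    "WHERE t.transfer_date > :cutoff"
)
print(query)
```

A plain comment (`/* ... */` without the `+`) is ignored by the optimizer; only the `/*+ ... */` form is treated as a hint.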

When it comes to DB logging, recording which users did what will only decrease database performance, because you create extra overhead writing the log file or the table rows that store the user-related information.

There are other tools for enterprise DBs like Oracle that let you see concurrent transactions, which can help you find bottlenecks when slowdowns occur.

The best metadata you can get for analyzing web traffic comes from the technologies listed in the question.

When it comes to banking transactions and the movement and storage of very large amounts of data, anything you do to log how users move around your site or the database will create more overhead. You can write code to check for suspicious behavior, but you would pay for it with slower performance.
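One way to avoid that inline cost is to run the suspicious-behavior checks offline, over an exported batch of transactions, rather than in the transaction path itself. A rough sketch, where the record layout, thresholds, and field names are all my own assumptions:

```python
# Sketch: flag suspicious transfers from an exported batch, offline,
# so the live transaction path pays no logging/analysis overhead.
# The (id, account, day, amount) layout and both limits are assumptions.
from collections import defaultdict

def flag_suspicious(transfers, amount_limit=10_000, daily_count_limit=5):
    """Return the ids of transfers that exceed an amount limit, or that
    push an account past a per-day transfer-count limit."""
    flagged = set()
    per_account_day = defaultdict(int)
    for tid, account, day, amount in transfers:
        if amount > amount_limit:
            flagged.add(tid)
        per_account_day[(account, day)] += 1
        if per_account_day[(account, day)] > daily_count_limit:
            flagged.add(tid)
    return flagged

transfers = [
    (1, "A", "2010-01-01", 500),
    (2, "A", "2010-01-01", 25_000),  # over the amount limit
]
print(flag_suspicious(transfers))  # {2}
```

Real fraud detection is far more involved, but the structural point stands: batch analysis of logs or DB exports keeps the overhead off the live system.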

mugafuga
+1  A: 

Our application keeps an in-memory list of the SQL calls that have been made, storing for each one the class that made the call, the number of executions, the maximum execution time and the total execution time. There's a page we can go to in order to see this information for the period since the server came up.

This is mostly for performance monitoring, but I also use it to see how many times particular queries are run.
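The idea above can be sketched in a few lines; the class and field names here are my own, not the original implementation:

```python
# Sketch of an in-memory SQL call-statistics tracker, as described above.
# Names (SqlStats, record, timed) are my own; the real code is not shown.
import time
from collections import defaultdict

class SqlStats:
    def __init__(self):
        # sql text -> [executions, total_time_s, max_time_s]
        self.stats = defaultdict(lambda: [0, 0.0, 0.0])

    def record(self, sql, elapsed):
        entry = self.stats[sql]
        entry[0] += 1
        entry[1] += elapsed
        entry[2] = max(entry[2], elapsed)

    def timed(self, sql, run):
        """Run a query callable and record how long it took."""
        start = time.perf_counter()
        result = run()
        self.record(sql, time.perf_counter() - start)
        return result

stats = SqlStats()
stats.timed("SELECT 1", lambda: 1)
stats.timed("SELECT 1", lambda: 1)
execs, total, peak = stats.stats["SELECT 1"]
print(execs)  # 2
```

A status page then just iterates over `stats.stats`, sorted by total time, to show the hottest queries since startup.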

WW
+1  A: 

On the tool front, for Windows you can use the MS LogParser tool.

It basically turns your flat log files into a "database" you can run SQL-like queries against. You can even output the results as grids, charts and graphs.
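For a flavor of what that looks like, here is an illustrative invocation against IIS W3C logs; the log file pattern `ex*.log` is hypothetical, and this assumes LogParser 2.2 is on the PATH of a Windows machine:

```shell
rem Illustration only: top 10 requested pages from IIS W3C logs,
rem shown in LogParser's built-in datagrid window.
LogParser "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM ex*.log GROUP BY cs-uri-stem ORDER BY Hits DESC" -i:IISW3C -o:DATAGRID
```

Swapping the `-o:` output format lets the same query feed a chart, a CSV file, or another database instead of the grid.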

Patrick Cuff