I have a log table in SQL Server that stores the time an application started, the time the application is ready (i.e. finished loading), and the time it exited. Each of these occurs as a separate entry. The format (and sample data) is as follows:
Date/Time User Type Application Message
2009-11-03 12:26:12.403 uname1 Info app1 Started
2009-11-03 12:26:22.403 uname1 Info app1 Loaded
2009-11-03 12:27:15.403 uname2 Info app1 Started
2009-11-03 12:27:16.401 uname1 Info app1 Exited
2009-11-03 12:27:18.403 uname2 Info app1 Loaded
2009-11-03 12:29:12.403 uname2 Info app1 Exited
I would like to find out, per application and per user, how long it took the application to reach a ready state and how long the application was running. This would be a piece of cake if all the timestamps were in the same record, and it would also be easy (though tedious) to walk through the rows with a cursor and sift through the data, but I thought there must be some way to do this "correctly" in a set-based manner.
So, to reiterate, the following output (from the sample data above) would be expected (numbers are in seconds, rounded up):
User Application Ready Uptime
uname1 app1 10 64
uname2 app1 3 117
Any suggestions?
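For illustration, one set-based approach is to pivot the three event rows per user/application into a single row with conditional aggregation, then take the differences. A minimal sketch of the idea, using SQLite from Python as a stand-in for SQL Server (so `DATEDIFF` becomes a `julianday()` difference); the table and column names (`log`, `ts`, `user`, `app`, `msg`) are illustrative, not from the question:

```python
import sqlite3

# Load the sample data from the question into an in-memory table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE log (ts TEXT, user TEXT, type TEXT, app TEXT, msg TEXT);
INSERT INTO log VALUES
  ('2009-11-03 12:26:12.403', 'uname1', 'Info', 'app1', 'Started'),
  ('2009-11-03 12:26:22.403', 'uname1', 'Info', 'app1', 'Loaded'),
  ('2009-11-03 12:27:15.403', 'uname2', 'Info', 'app1', 'Started'),
  ('2009-11-03 12:27:16.401', 'uname1', 'Info', 'app1', 'Exited'),
  ('2009-11-03 12:27:18.403', 'uname2', 'Info', 'app1', 'Loaded'),
  ('2009-11-03 12:29:12.403', 'uname2', 'Info', 'app1', 'Exited');
""")

# One output row per (user, app): Ready = Loaded - Started,
# Uptime = Exited - Started, in seconds. MAX(CASE ...) picks the single
# matching timestamp per group (valid because each app starts only once).
rows = conn.execute("""
SELECT user, app,
       CAST(ROUND((julianday(MAX(CASE WHEN msg = 'Loaded'  THEN ts END))
                 - julianday(MAX(CASE WHEN msg = 'Started' THEN ts END))) * 86400) AS INT) AS Ready,
       CAST(ROUND((julianday(MAX(CASE WHEN msg = 'Exited'  THEN ts END))
                 - julianday(MAX(CASE WHEN msg = 'Started' THEN ts END))) * 86400) AS INT) AS Uptime
FROM log
GROUP BY user, app
ORDER BY user
""").fetchall()
print(rows)
```

In T-SQL the same shape would use `DATEDIFF(second, ...)` over the aggregated timestamps; a self-join of the table to itself on user/application (matching the `Started` row to the `Loaded` and `Exited` rows) is the other common way to express it.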
EDIT: The good news is that the application can only be started once. BUT, the log does not take into account whether the application crashed (though I suppose I could treat both "Exited" and "Crashed" as final conditions).
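The crash case from the edit folds into the same query by letting the final-event `CASE` accept either message. A hypothetical sketch (again SQLite standing in for SQL Server; `uname3` and its `Crashed` row are invented test data, and all names are illustrative):

```python
import sqlite3

# Invented data: a session that ends with 'Crashed' instead of 'Exited'.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE log (ts TEXT, user TEXT, type TEXT, app TEXT, msg TEXT);
INSERT INTO log VALUES
  ('2009-11-03 12:30:00.000', 'uname3', 'Info', 'app1', 'Started'),
  ('2009-11-03 12:30:05.000', 'uname3', 'Info', 'app1', 'Loaded'),
  ('2009-11-03 12:31:00.000', 'uname3', 'Info', 'app1', 'Crashed');
""")

# Same pivot as before, but the Uptime column matches either
# 'Exited' or 'Crashed' as the terminating event.
rows = conn.execute("""
SELECT user, app,
       CAST(ROUND((julianday(MAX(CASE WHEN msg = 'Loaded' THEN ts END))
                 - julianday(MAX(CASE WHEN msg = 'Started' THEN ts END))) * 86400) AS INT) AS Ready,
       CAST(ROUND((julianday(MAX(CASE WHEN msg IN ('Exited', 'Crashed') THEN ts END))
                 - julianday(MAX(CASE WHEN msg = 'Started' THEN ts END))) * 86400) AS INT) AS Uptime
FROM log
GROUP BY user, app
""").fetchall()
print(rows)
```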