
I am looking for best practices for implementing a TraceListener that writes logs to SQL Server from an ASP.NET application.

What should be taken into account when implementing such a class in order to avoid degrading performance?

Will stored procedures be faster than plain ADO.NET INSERT statements?

I really like the idea of writing the logs into a temporary in-memory buffer and flushing it to the database at some later point from a background thread, but what data structure is most suitable for such a scenario? Queue&lt;T&gt; seems like a good candidate, but I cannot add elements to it without some synchronization mechanism.
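For what it's worth, a minimal sketch of the buffering idea described above: a plain Queue&lt;string&gt; guarded by a lock, drained in batches by a background thread or timer. The class and member names here are illustrative, not from any existing API:

```csharp
using System.Collections.Generic;

public class BufferedLog
{
    private readonly Queue<string> _buffer = new Queue<string>();
    private readonly object _sync = new object();

    public void Write(string message)
    {
        // Queue<T> is not thread-safe by itself, so every access
        // goes through the same lock.
        lock (_sync)
        {
            _buffer.Enqueue(message);
        }
    }

    // Called periodically from a background thread/timer. The buffer is
    // swapped out under the lock so the (slow) database write can happen
    // outside the critical section.
    public List<string> DrainForFlush()
    {
        lock (_sync)
        {
            var batch = new List<string>(_buffer);
            _buffer.Clear();
            return batch;
        }
    }
}
```

The key design point is that the lock is held only long enough to copy and clear the queue; the database round-trip itself never blocks the threads calling Write.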

I found an article on the internet that shows an example of a custom TraceListener that writes to SQL Server, but before putting it into production code I would like some more feedback.

+1  A: 

Stored procedures won't be any faster than parameterized SQL. I prefer a stored procedure over hardcoding SQL in my application, but if you were going to generate the INSERT statements anyway, then that is even better.
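As a sketch of what the parameterized approach looks like (the Log table and its columns here are hypothetical), building the command with parameters lets SQL Server cache the execution plan just as it would for a stored procedure:

```csharp
using System;
using System.Data.SqlClient;

public static class LogCommands
{
    // Builds a parameterized INSERT against a hypothetical Log table.
    // The plan for this statement is cached and reused like an SP's plan.
    public static SqlCommand BuildInsert(SqlConnection conn,
                                         string source, string message)
    {
        var cmd = new SqlCommand(
            "INSERT INTO Log (LoggedAt, Source, Message) " +
            "VALUES (@loggedAt, @source, @message)", conn);
        cmd.Parameters.AddWithValue("@loggedAt", DateTime.UtcNow);
        cmd.Parameters.AddWithValue("@source", source);
        cmd.Parameters.AddWithValue("@message", message);
        return cmd;
    }
}
```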

Using a buffer is a good idea if you're ok with the possibility that you might lose data. If you want to decouple the client from the insert and you want it to be durable, you could use MSMQ. Then you could write a Windows service that processes the queue, completely decoupled from the application. It could also aggregate logs from multiple servers if you have a server farm.
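A rough sketch of the producer side of this, using the System.Messaging API (MSMQ must be installed on the machine; the private queue name is hypothetical). This one is environment-dependent, so treat it as an outline rather than drop-in code:

```csharp
using System.Messaging; // reference System.Messaging.dll; MSMQ must be installed

public static class DurableLog
{
    // Hypothetical private queue on the local machine.
    private const string QueuePath = @".\private$\AppLog";

    public static void Enqueue(string message)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            // The message is persisted by MSMQ, so it survives process
            // and app-pool recycles. A separate Windows service can
            // Receive() from this queue and write to SQL Server.
            queue.Send(message, "trace");
        }
    }
}
```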

JoshBerke
@Josh, MSMQ seems a good idea. I don't have any experience with it but I will dig it further. Thanks for the pointer.
Darin Dimitrov
+1  A: 

log4net will dump trace to a SQL db with a whole heap of flush options etc. Check it out: http://logging.apache.org/log4net/release/features.html

log4net is proven. If you can avoid "reinventing the wheel" it's good, right?
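For reference, the appender that does this in log4net is AdoNetAppender, which handles the in-memory buffering and flushing for you. A trimmed configuration sketch, with the connection string left as a placeholder and the Log table/columns as hypothetical names:

```xml
<appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender">
  <!-- buffer this many events in memory before flushing to the database -->
  <bufferSize value="100" />
  <connectionType value="System.Data.SqlClient.SqlConnection, System.Data" />
  <connectionString value="..." />
  <commandText value="INSERT INTO Log (LoggedAt, Message) VALUES (@log_date, @message)" />
  <parameter>
    <parameterName value="@log_date" />
    <dbType value="DateTime" />
    <layout type="log4net.Layout.RawTimeStampLayout" />
  </parameter>
  <parameter>
    <parameterName value="@message" />
    <dbType value="String" />
    <size value="4000" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%message" />
    </layout>
  </parameter>
</appender>
```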

cottsak
@cottsak, I completely agree with you. log4net is a great logging framework, but company policy doesn't allow me to use it. That's why I have to reinvent the wheel :-)
Darin Dimitrov
A: 

I can't speak as to whether SPs are faster than ADO.NET inserts, but I will always suggest keeping queries/non-queries in your application, NOT in the datastore. As you know, the datastore is for data, not logic. Avoid SPs.

MS SQL Server is a complex piece of machinery, and I don't think your application would be able to cause blocking in your code from too much logging to your db. Obviously this is subject to your particular implementation, and from your interest in performance I might assume you intend to support high volume.

What I'm saying is, I don't think you need an in-memory queue or a service to wrap the logging process. Just flush the trace to the db in your ASP.NET app and forget about caching/flushing. SQL Server will look after keeping the logging queries in memory and will look after writing them to disk; in fact it manages this very well. Its query buffer will ensure your code does not block when you flush your trace. If you don't believe me, test it with timestamps in your debug window or something.

If you do need to tweak it, the tweaking should be in your db's memory settings. You don't need to invent a wrapper, use SPs, or start another service on your server (like MSMQ); that will just eat up more of your precious CPU and memory.
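A minimal sketch of this direct-write approach, assuming a hypothetical Log table and connection string: a TraceListener whose overrides each do one synchronous parameterized INSERT, with no buffering in the application at all:

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

// Sketch of a TraceListener that writes straight to the db, leaving all
// buffering to SQL Server. Table name and columns are hypothetical.
public class SqlTraceListener : TraceListener
{
    private readonly string _connectionString;

    public SqlTraceListener(string connectionString)
    {
        _connectionString = connectionString;
    }

    public override void Write(string message)
    {
        WriteLine(message);
    }

    public override void WriteLine(string message)
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Log (LoggedAt, Message) VALUES (@loggedAt, @message)",
            conn))
        {
            cmd.Parameters.AddWithValue("@loggedAt", DateTime.UtcNow);
            cmd.Parameters.AddWithValue("@message", message);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

Registered via Trace.Listeners.Add, this keeps all the queueing and disk I/O scheduling on the SQL Server side, as the answer suggests.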

cottsak