We have a WebLogic batch application that processes multiple requests from consumers at the same time. We use log4j for logging. Right now we log to a single file for all requests, which makes it tedious to debug an issue for a given request, because the log lines for every request are interleaved in one file.

So the plan is to have one log file per request. The consumer sends a request ID for which processing has to be performed. In reality, multiple consumers may send request IDs to our application at the same time. So the question is how to segregate the log files by request.

We cannot start and stop the production server every time, so overriding the file appender at startup with a date-time stamp or request ID is ruled out. That is the approach explained in this article: http://veerasundar.com/blog/2009/08/how-to-create-a-new-log-file-for-each-time-the-application-runs/

I also tried playing around with these alternatives:

http://cognitivecache.blogspot.com/2008/08/log4j-writing-to-dynamic-log-file-for.html

http://www.mail-archive.com/[email protected]/msg05099.html

This approach gives the desired results, but it does not work properly when multiple requests are sent at the same time: due to concurrency issues, log entries end up in the wrong files.

Any help would be appreciated. Thanks in advance.

+1  A: 

Here's my question on the same topic: http://stackoverflow.com/questions/1239227/dynamically-creating-destroying-logging-appenders

I followed this up in a thread on the Logback mailing list, where I discuss doing exactly this: http://www.qos.ch/pipermail/logback-user/2009-August/001220.html

Ceki Gülcü (the inventor of log4j) didn't think it was a good idea and suggested using Logback instead.

We went ahead and did this anyway, using a custom file appender. See my discussions above for more details.

eqbridges
@eqbridges Have you used NDC in log4j? From the log4j API docs: "A Nested Diagnostic Context, or NDC in short, is an instrument to distinguish interleaved log output from different sources. Log output is typically interleaved when a server handles multiple clients near-simultaneously. Interleaved log output can still be meaningful if each log entry from different contexts had a distinctive stamp. This is where NDCs come into play."
Gaurav Saini
I have used NDC in the past, and it really is the proper solution to what you're trying to do. You're just scratching the surface of the problems you will encounter if you try to keep a separate log file per request.
eqbridges
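The NDC idea described in the comments above can be sketched without log4j at all: it is essentially a per-thread stack of context labels (such as request IDs) that gets prepended to every log line. The class and method names below (SimpleNdc, tag) are illustrative, not log4j's actual API, though push/pop/get mirror org.apache.log4j.NDC:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Iterator;

// Minimal sketch of the NDC concept: each thread keeps its own stack of
// context labels, and every log message is stamped with that context.
public class SimpleNdc {
    private static final ThreadLocal<Deque<String>> CTX =
            ThreadLocal.withInitial(ArrayDeque::new);

    public static void push(String context) {
        CTX.get().push(context);
    }

    public static void pop() {
        Deque<String> stack = CTX.get();
        if (!stack.isEmpty()) stack.pop();
    }

    // Space-separated context, outermost label first, like NDC.get().
    public static String get() {
        Deque<String> stack = CTX.get();
        StringBuilder sb = new StringBuilder();
        // ArrayDeque.push adds at the head, so walk it in reverse
        // to print the oldest (outermost) context first.
        for (Iterator<String> it = stack.descendingIterator(); it.hasNext(); ) {
            sb.append(it.next());
            if (it.hasNext()) sb.append(' ');
        }
        return sb.toString();
    }

    // Stamp a message with the current context, if any.
    public static String tag(String message) {
        String ctx = get();
        return ctx.isEmpty() ? message : "[" + ctx + "] " + message;
    }
}
```

In the real log4j API you would call NDC.push(requestId) when a request enters the batch processor and NDC.pop() (plus NDC.remove() at thread end) in a finally block; the %x conversion character in the PatternLayout then prints the context, so one shared log file still lets you grep out a single request.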
A: 

Look at the SiftingAppender shipped with logback (log4j's successor); it is designed to handle the creation of appenders based on runtime criteria.

If your application needs to create just one log file per session, simply create a discriminator based on the session ID. Writing a discriminator involves three or four lines of code and should be fairly easy. Shout on the logback-user mailing list if you need help.
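For the request-ID case described in the question, a sketch of a logback.xml using the stock MDC-based discriminator might look like the following. It assumes the application puts the ID into the MDC under the key `requestId` (that key name is an assumption) at the start of processing:

```xml
<configuration>
  <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- Routes each log event by the "requestId" value found in the MDC -->
    <discriminator class="ch.qos.logback.classic.sift.MDCBasedDiscriminator">
      <key>requestId</key>
      <defaultValue>unknown</defaultValue>
    </discriminator>
    <sift>
      <!-- A separate FileAppender is created lazily per distinct requestId -->
      <appender name="FILE-${requestId}" class="ch.qos.logback.core.FileAppender">
        <file>logs/request-${requestId}.log</file>
        <encoder>
          <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
        </encoder>
      </appender>
    </sift>
  </appender>

  <root level="INFO">
    <appender-ref ref="SIFT" />
  </root>
</configuration>
```

On the application side, call `MDC.put("requestId", id)` when a request enters the batch processor and `MDC.remove("requestId")` in a finally block, so events from concurrent requests are routed to their own files.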

Ceki