I'm writing a web app in Python with web.py, and I want to implement my own logging system. I'd like to log detailed information about each request that comes to the Python app (static files are handled by the web server).

Currently I'm thinking about writing the logs to a pipe, with cronolog on the other end.

My main concern is performance: how does the time and resource cost of piping the logs compare to the normal processing of a request (fewer than 5 database queries, plus page generation from templates)?

Or is there a better approach? I don't want to write the log file directly from Python, because FastCGI will start tens of processes.
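
For concreteness, a minimal sketch of the piping approach under consideration, assuming cronolog is on the PATH; the log-path template and the `log_request` helper are illustrative, not part of the question:

    import subprocess

    # Spawn cronolog and keep its stdin as the write end of the pipe.
    # cronolog rotates the file based on the date/time template.
    cronolog = subprocess.Popen(
        ["cronolog", "/var/log/app/%Y-%m-%d.log"],
        stdin=subprocess.PIPE,
    )

    def log_request(line):
        # Writes smaller than PIPE_BUF are atomic, so lines from
        # concurrent worker processes will not interleave mid-line.
        cronolog.stdin.write((line + "\n").encode("utf-8"))
        cronolog.stdin.flush()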

+1  A: 

Pipes are one of the fastest I/O mechanisms available. It's just a shared buffer. Nothing more. If the receiving end of your pipe is totally overwhelmed, you may have an issue. But you have no evidence of that right now.

If you have tens of processes started by FastCGI, each can have its own independent log file. That's the ideal situation: use the Python logging module and give each process a unique log file.
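
A minimal sketch of that per-process setup, keying the filename on the worker's PID; the directory, logger name, and format string are assumptions for illustration:

    import logging
    import os

    def setup_logging():
        # One log file per FastCGI worker, so no cross-process
        # contention on a shared file.
        handler = logging.FileHandler(
            "/var/log/app/worker-%d.log" % os.getpid()
        )
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(message)s")
        )
        logger = logging.getLogger("app")
        logger.setLevel(logging.INFO)
        logger.addHandler(handler)
        return logger

    logger = setup_logging()
    logger.info("worker started")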

In the rare event that you need to examine all log files, cat them together for analysis.

S.Lott