
I understand that Python loggers should not be instantiated directly, as the documentation suggests:

Note that Loggers are never instantiated directly, but always through the module-level function logging.getLogger(name)

.. which is reasonable, since you are not expected to create logger objects for every class/module when there is a better alternative.

However, there are cases where I want to create a logger object, attach a file handler to it exclusively for logging some app-specific output to that file, and then close the log file.

For instance, I have a program that builds all packages in PyPI. So basically assume there is a for loop going over every package. Inside the loop, I want to "create" a logger, attach a file handler (e.g. /var/logs/pypi/django/20090302_1324.build.log) and send the output of python setup.py build (along with other things) to this log file. Once that is done, I want to close/destroy the logger and continue building other packages in a similar fashion.

So you see .. the normal Pythonic way of calling logging.getLogger does not apply here. One needs to create temporary logger objects.

Currently, I achieve this by passing the file name itself as the logger name:

>>> packagelog = logging.getLogger('/var/..../..34.log')
>>> # attach handler, etc..
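
Spelled out a bit more, that workaround looks roughly like this (just a sketch of the handler setup that the comment above elides):

import logging

logfile = '/var/logs/pypi/django/20090302_1324.build.log'
packagelog = logging.getLogger(logfile)        # logger named after the file
handler = logging.FileHandler(logfile, 'w')
packagelog.addHandler(handler)
packagelog.info('build output and other app-specific messages go here')
packagelog.removeHandler(handler)
handler.close()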

I want to ask .. is there a better way to do this?

A: 

Assuming you're calling setup.py build as a subprocess, I think you really just want output redirection, which you can get via the subprocess invocation.

from subprocess import Popen
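# time_str below is assumed to be a pre-built timestamp string, e.g. '20090302_1324'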
with open('/var/logs/pypi/django/%s.build.log' % time_str, 'w') as fh:
    Popen('python setup.py build'.split(), stdout=fh, stderr=fh).communicate()

If you're calling setup.py build as a Python subroutine (i.e. importing that module and invoking its main routine) then you could try to add another logging.Handler (a FileHandler) to a logger in that module, if such a logger exists.
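
For illustration, a rough sketch of that idea; the module name, its logger name, and its entry point below are all hypothetical, since they depend on the module you import:

import logging
import buildmodule   # hypothetical module that performs the build and uses logging

# Assumption: buildmodule logs via logging.getLogger('buildmodule') or a child of it.
module_logger = logging.getLogger('buildmodule')
fh = logging.FileHandler('/var/logs/pypi/django/20090302_1324.build.log', 'w')
module_logger.addHandler(fh)
try:
    buildmodule.main()   # hypothetical entry point
finally:
    module_logger.removeHandler(fh)
    fh.close()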

Update

Per the comments, it sounds like you just want to add a new FileHandler to your current module's logger, log things to it, and then remove it from the logger later on. Is that more what you're looking for?

cdleary
You didn't understand my question. Calling 'python setup.py build' is just an example. What I actually want to do is log some specific things to a file .. temporarily.
Sridhar Ratnakumar
Oh, so you want to log your current program's logging statements to that build file as well as the output of `setup.py build`?
cdleary
Yes. You can ignore that setup.py example. It is a general question.
Sridhar Ratnakumar
I may be misinterpreting what you mean by "specific things". You mean a subset of the things that you log to the current module's logger? As in, you want some log messages to get logged to that file but also fall through to the module's logger?
cdleary
I think Vinay's elaboration below is more clear than what I intended to convey. :-)
Sridhar Ratnakumar
+2  A: 

Instead of many loggers, you could use one logger and many handlers. For example:

import logging

log = logging.getLogger(name)          # name, some_condition, filename are placeholders
while some_condition:
    handler = make_handler(filename)   # e.g. a logging.FileHandler
    log.addHandler(handler)
    try:
        # do something and log
        ...
    finally:
        log.removeHandler(handler)
        handler.close()
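
For the PyPI-build case in the question, make_handler could simply return a FileHandler pointed at the per-package log file (a sketch; the formatter is optional):

import logging

def make_handler(filename):
    handler = logging.FileHandler(filename, 'w')
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
    return handler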
ars
+1: exactly -- @srid: And I don't see anything un-Pythonic in the original code. See logging.getLogger(...) as a factory method: it is actually *creating* a logger for you (or returning an already existing one for the given name).
van
"(...) or returning already existing one for given name" - that is where I got problem with my code .. which expects a logger to be *created*. ars's solution is just what I needed.
Sridhar Ratnakumar
+1  A: 

There are two issues here:

  1. Being able to direct output to different log files during different phases of the process.
  2. Being able to redirect stdout/stderr of arbitrary commands to those same log files.

For point 1, I would go along with ars's answer: he's spot on about just using multiple handlers and one logger. His formatting is a little messed up, so I'll reiterate it below:

import datetime
import logging
import subprocess

logger = logging.getLogger("pypibuild")
now_as_string = datetime.datetime.utcnow().strftime("%Y%m%d_%H%M")

for package in get_pypi_packages():    # get_pypi_packages() is a placeholder
    fn = '/var/logs/pypi/%s/%s.log' % (package, now_as_string)
    h = logging.FileHandler(fn, 'w')
    logger.addHandler(h)
    perform_build(package)
    logger.removeHandler(h)
    h.close()

As for point 2, the perform_build() step, I'll assume for simplicity's sake that we don't need to worry about a multicore environment. Then, the subprocess module is your friend. In the snippet below, I've left out error handling, fancy formatting and a couple of other niceties, but it should give you a fair idea.

def perform_build(package):
    logger.debug("Starting build for package %r", package)
    command_line = compute_command_line_for_package(package)   # placeholder
    process = subprocess.Popen(command_line, shell=True,
                               stdin=subprocess.PIPE,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    stdout, stderr = process.communicate()
    logger.debug("Build stdout contents: %r", stdout)
    logger.debug("Build stderr contents: %r", stderr)
    logger.debug("Finished build for package %r", package)

That's about it.

Vinay Sajip
Perfect. Good enough for me .. and doesn't violate the logging module design. Oh, and one would use `logging.getLogger("app.module.pypibuild")` in app/module.py .. right?
Sridhar Ratnakumar
A good convention is to use `logging.getLogger(__name__)` in a module.
Vinay Sajip
OK, in this case I reverted to using `logging.getLogger("__pypibuild__")`, as using the module hierarchy also sends the log messages to parent handlers. Otherwise, normally .. I use `__name__`.
Sridhar Ratnakumar