Python: logging module - globally
I was wondering how to implement a global logger that can be used everywhere with my own settings:
I currently have a custom logger class:
class customLogger(logging.Logger):
...
The class is in a separate file with some formatters and other stuff. The logger works perfectly on its own.
I import this module in my main python file and create an object like this:
self.log = logModule.customLogger(arguments)
But obviously, I cannot access this object from other parts of my code. Am I using the wrong approach? Is there a better way to do this?
Use logging.getLogger(name)
to create a named global logger.
main.py
import log
logger = log.setup_custom_logger('root')
logger.debug('main message')
import submodule
log.py
import logging
def setup_custom_logger(name):
    formatter = logging.Formatter(fmt='%(asctime)s - %(levelname)s - %(module)s - %(message)s')
    handler = logging.StreamHandler()
    handler.setFormatter(formatter)
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)
    return logger
submodule.py
import logging
logger = logging.getLogger('root')
logger.debug('submodule message')
Output
2011-10-01 20:08:40,049 - DEBUG - main - main message
2011-10-01 20:08:40,050 - DEBUG - submodule - submodule message
Since I haven't found a satisfactory answer, I would like to elaborate a little in order to give some insight into the workings and intent of the logging library that comes with Python's standard library.
In contrast to the approach of the OP (original poster), the library clearly separates the interface to the logger from the configuration of the logger itself.
The configuration of handlers is the prerogative of the application developer who uses your library.
That means you should not create a custom logger class and configure the logger inside that class by adding any configuration whatsoever.
The logging
library introduces four components: loggers, handlers, filters, and formatters.
- Loggers expose the interface that application code directly uses.
- Handlers send the log records (created by loggers) to the appropriate destination.
- Filters provide a finer grained facility for determining which log records to output.
- Formatters specify the layout of log records in the final output.
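To illustrate how these four components fit together, here is a small sketch; the logger name and the InfoOnlyFilter class are made up for the example and are not part of the library:
import logging

class InfoOnlyFilter(logging.Filter):
    def filter(self, record):
        # filter: finer-grained control over which records this handler outputs
        return record.levelno <= logging.INFO

logger = logging.getLogger('example')      # logger: the interface application code calls
handler = logging.StreamHandler()          # handler: sends records to stderr by default
handler.addFilter(InfoOnlyFilter())
handler.setFormatter(logging.Formatter('%(name)s - %(levelname)s - %(message)s'))  # formatter: output layout
logger.addHandler(handler)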
A common project structure looks like this:
Project/
|-- .../
| |-- ...
|
|-- project/
| |-- package/
| | |-- __init__.py
| | |-- module.py
| |
| |-- __init__.py
| |-- project.py
|
|-- ...
|-- ...
Inside your code (in module.py, for example) you refer to your module's logger instance to log events at their specific levels.
A good convention to use when naming loggers is to use a module-level logger, in each module which uses logging, named as follows:
logger = logging.getLogger(__name__)
The special variable __name__
refers to your module's name and looks something like project.package.module
depending on your application's code structure.
module.py (and any other class) could essentially look like this:
import logging
...
log = logging.getLogger(__name__)
class ModuleClass:
    def do_something(self):
        log.debug('do_something() has been called!')
The logger in each module will propagate any event to the parent logger, which in turn passes the information to its attached handler! Analogously to the Python package/module structure, the parent logger is determined by the namespace using "dotted module names". That's why it makes sense to initialize the logger with the special __name__
variable (in the example above, the name matches the string "project.package.module").
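To make the propagation concrete, here is a minimal sketch (the logger names are placeholders): the child logger has no handler of its own, yet its records are emitted by the parent's handler.
import logging

parent = logging.getLogger('project')                 # parent logger owns the handler
parent.setLevel(logging.DEBUG)
parent.addHandler(logging.StreamHandler())

child = logging.getLogger('project.package.module')   # child, determined by the dotted name
child.debug('this record propagates up and is emitted by the parent handler')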
There are two options to configure the logger globally:
1. Instantiate a logger in project.py with the name __package__, which equals "project" in this example and is therefore the parent logger of the loggers of all submodules. It is only necessary to add an appropriate handler and formatter to this logger (a sketch of this option follows the list).
2. Set up a logger with a handler and formatter in the executing script (like main.py) with the name of the topmost package.
When developing a library which uses logging, you should take care to document how the library uses logging - for example, the names of loggers used.
The executing script, like main.py for example, might finally look something like this:
import logging
from project import App
def setup_logger():
    # create logger
    logger = logging.getLogger('project')
    logger.setLevel(logging.DEBUG)

    # create console handler and set level to debug
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)

    # create formatter
    formatter = logging.Formatter('%(asctime)s [%(levelname)s] %(name)s: %(message)s')

    # add formatter to ch
    ch.setFormatter(formatter)

    # add ch to logger
    logger.addHandler(ch)

if __name__ == '__main__' and __package__ is None:
    setup_logger()
    app = App()
    app.do_some_funny_stuff()
The method call logger.setLevel(...)
specifies the lowest-severity log message a logger will handle, but not necessarily output! It simply means the message is passed to the handler as long as the message's severity level is higher than (or equal to) the one that is set. The handler is then responsible for actually dealing with the log message (by printing or storing it, for example).
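A small sketch of this distinction (the names are illustrative): the logger accepts DEBUG records and passes them on, but the attached handler only emits WARNING and above.
import logging

logger = logging.getLogger('levels_demo')
logger.setLevel(logging.DEBUG)            # the logger handles everything from DEBUG upwards

handler = logging.StreamHandler()
handler.setLevel(logging.WARNING)         # the handler only outputs WARNING and above
logger.addHandler(handler)

logger.debug('reaches the handler, but the handler drops it')
logger.warning('reaches the handler and is actually printed')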
Hence the logging
library offers a structured and modular approach that only needs to be applied according to one's needs.
Logging documentation
Create an instance of customLogger in your log module and use it as a singleton - just use the imported instance rather than the class.
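A minimal sketch of that idea, assuming the class lives in logModule.py as in the question; the constructor argument and the handler setup inside the class are placeholders for whatever the real class does:
logModule.py
import logging

class customLogger(logging.Logger):
    def __init__(self, name):
        super().__init__(name)
        # placeholder configuration; the question's class adds its own formatters etc.
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(message)s'))
        self.addHandler(handler)
        self.setLevel(logging.DEBUG)

# module-level instance: every importing module shares this single object
log = customLogger('root')   # 'root' stands in for whatever arguments the class expects
Any other module then imports the instance, not the class:
from logModule import log

log.debug('same logger instance everywhere')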