I am using the built-in Python "logging" module in my script. When I set the verbosity to "info", it seems like my "debug" messages are still significantly slowing down my script.

Some of my "debug" messages print large dictionaries, and I'm guessing the f-string is fully expanded before the logging module realizes "debug" messages are disabled. Example:

import logging
import pprint

logger = logging.getLogger(__name__)
pp = pprint.PrettyPrinter(indent=4)
logger.debug(f"Large Dict Object: {pp.pformat(obj)}")

How can I improve performance? I'd prefer to keep using Python's built-in logging module, but I need to figure out a "clean" way to solve this issue.


Solution 1:

The logging module already has a built-in method for the check mentioned by dankal444, which is slightly neater:

if logger.isEnabledFor(logging.DEBUG):
    logger.debug(f"Large Dict Object: {pp.pformat(obj)}")

Another possible approach is to use %-formatting, which only does the formatting when actually needed (the logging event has to be processed by a handler as well as a logger to get to that point). I know f-strings are the new(ish) hotness and are performant, but it all depends on the exact circumstances as to which will offer the best result.
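
As a minimal illustration (assuming obj is the large dictionary from the question), passing the object itself as a logging argument means it is only converted to a string if the record actually gets formatted:

# str(obj) only runs if a handler formats the record; nothing
# expensive happens while DEBUG is disabled.
logger.debug("Large Dict Object: %s", obj)

Plain str() won't pretty-print the dictionary, though, so the helper below also defers the pp.pformat call.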

An example of taking advantage of lazy %-formatting:

class DeferredFormatHelper:
    def __init__(self, func, *args, **kwargs):
        self.func = func  # assumed to return a string
        self.args = args
        self.kwargs = kwargs

    def __str__(self):
        # This is called because the format string contains
        # a %s for this logging argument, lazily if and when
        # the formatting needs to happen
        return self.func(*self.args, **self.kwargs)

if logger.isEnabledFor(logging.DEBUG):
    arg = DeferredFormatHelper(pp.pformat, obj)
    logger.debug('Large Dict Object: %s', arg)
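
For reference, a small usage sketch (the basicConfig setup and the sample dictionary are assumptions for illustration; DeferredFormatHelper is the class defined above):

import logging
import pprint

logging.basicConfig(level=logging.INFO)  # DEBUG records are filtered out
logger = logging.getLogger(__name__)
pp = pprint.PrettyPrinter(indent=4)

obj = {i: list(range(20)) for i in range(1000)}  # stand-in for a large dict

# Even without the isEnabledFor() guard, pp.pformat is never called here:
# the logger drops the DEBUG record before the message is ever formatted.
logger.debug('Large Dict Object: %s', DeferredFormatHelper(pp.pformat, obj))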

Solution 2:

Check whether the effective level is low enough before doing the expensive formatting:

if logger.getEffectiveLevel() <= logging.DEBUG:
    logger.debug(f"Large Dict Object: {pp.pformact(obj)}")
    

This is not super clean, but it's the best I can think of. You just need to wrap the logging calls that are performance bottlenecks in this check.