Python best practice in terms of logging

Best practice is to follow Python's rules for software (de)composition - the module is the unit of Python software, not the class. Hence, the recommended approach is to use

logger = logging.getLogger(__name__)

in each module, and to configure logging (using basicConfig() or dictConfig()) from the main script.

Loggers are effectively singletons - logging.getLogger(name) always returns the same logger instance for a given name, so there is no point in passing loggers around or storing them in instances of your classes.
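
As a minimal sketch of this pattern (the file names my_module.py and main.py are made up for illustration):

# my_module.py (hypothetical library module)
import logging

logger = logging.getLogger(__name__)

def do_work():
    logger.info("doing work")

# main.py (application entry point - the only place that configures logging)
import logging
import my_module

if __name__ == "__main__":
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    )
    my_module.do_work()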


Use JSON or YAML logging configuration - Since Python 2.7, logging.config.dictConfig() accepts the logging configuration as a dict, which means you can keep the configuration in a JSON or YAML file and load it at startup.

YAML example:

version: 1
disable_existing_loggers: False
formatters:
    simple:
        format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"

handlers:
    console:
        class: logging.StreamHandler
        level: DEBUG
        formatter: simple
        stream: ext://sys.stdout

    info_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: INFO
        formatter: simple
        filename: info.log
        maxBytes: 10485760 # 10MB
        backupCount: 20
        encoding: utf8

    error_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: ERROR
        formatter: simple
        filename: errors.log
        maxBytes: 10485760 # 10MB
        backupCount: 20
        encoding: utf8

loggers:
    my_module:
        level: ERROR
        handlers: [console]
        propagate: no

root:
    level: INFO
    handlers: [console, info_file_handler, error_file_handler]
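
To apply a configuration like this, one possible sketch (assuming the YAML above is saved as logging.yaml and PyYAML is installed):

import logging
import logging.config

import yaml  # PyYAML

# Load the YAML file (saved here as logging.yaml - an assumed file name)
# and hand the resulting dict to dictConfig().
with open("logging.yaml", "rt") as f:
    config = yaml.safe_load(f)

logging.config.dictConfig(config)

logger = logging.getLogger(__name__)
logger.info("logging configured from YAML")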

Ref: Good logging practice in Python

Tags:

Python

Logging