Scrapy - logging to file and stdout simultaneously, with spider names

For all those folks who came here before reading the current version of the documentation: in recent Scrapy releases you can drop Scrapy's default root handler and configure logging yourself:

import logging
from scrapy.utils.log import configure_logging

configure_logging(install_root_handler=False)  # stop Scrapy from installing its own root handler
logging.basicConfig(
    filename='log.txt',
    filemode='a',  # append to the existing log file
    format='%(levelname)s: %(message)s',
    level=logging.DEBUG
)
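
Note that basicConfig as written only sends messages to the file. If you also want them on stdout, and you want the spider name visible (spiders log through a logger named after the spider), a sketch along these lines should work; the handler setup and the format string below are just one possible configuration, not something prescribed by the Scrapy docs:

import logging
import sys
from scrapy.utils.log import configure_logging

configure_logging(install_root_handler=False)

# %(name)s is the logger name, which is the spider's name for messages
# logged through spider.logger
formatter = logging.Formatter('%(asctime)s [%(name)s] %(levelname)s: %(message)s')

file_handler = logging.FileHandler('log.txt', mode='a')
file_handler.setFormatter(formatter)

console_handler = logging.StreamHandler(sys.stdout)
console_handler.setFormatter(formatter)

root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)
root_logger.addHandler(file_handler)
root_logger.addHandler(console_handler)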

I know this is old, but it was a really helpful post, since the class still isn't properly documented in the Scrapy docs. Also, we can skip importing logging and use Scrapy's own log module directly (note that the scrapy.log observer API only exists in older Scrapy releases). Thanks all!

from scrapy import log

logfile = open('testlog.log', 'a')  # append to the existing log file
log_observer = log.ScrapyFileLogObserver(logfile, level=log.DEBUG)
log_observer.start()  # start sending log messages to the file as well

It is very easy to redirect output from the shell:

scrapy <your scrapy command and args> 2>&1 | tee -a logname

This way, everything Scrapy outputs to stdout and stderr is redirected to the logname file and also printed to the screen.
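
For example, assuming a spider named myspider (made up) and a project created the usual way:

scrapy crawl myspider 2>&1 | tee -a scrapy.log

tee -a appends to scrapy.log across runs; drop the -a to start a fresh file each time.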


You want to use the ScrapyFileLogObserver.

import logging
from scrapy.log import ScrapyFileLogObserver

logfile = open('testlog.log', 'w')  # overwrite the log file on each run
log_observer = ScrapyFileLogObserver(logfile, level=logging.DEBUG)
log_observer.start()  # messages are now written to the file in addition to the console
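
If you want this wired up per spider (for instance, one log file named after each spider), one place to start the observer is the spider's __init__. This is only a sketch against the old scrapy.log API; the spider name, URL, and file naming scheme are made up:

import logging
from scrapy.log import ScrapyFileLogObserver
from scrapy.spider import BaseSpider


class ExampleSpider(BaseSpider):
    name = 'example'                        # made-up spider name
    start_urls = ['http://example.com/']

    def __init__(self, *args, **kwargs):
        super(ExampleSpider, self).__init__(*args, **kwargs)
        # one log file per spider, named after the spider
        logfile = open('%s.log' % self.name, 'w')
        ScrapyFileLogObserver(logfile, level=logging.DEBUG).start()

    def parse(self, response):
        # self.log() attaches the spider, so its name shows up in the log lines
        self.log('parsed %s' % response.url)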

I'm glad you asked this question; I've been wanting to do this myself.