Logging in Scrapy

I was unable to make @Rafael Almeda's solution work until I added the following to the import section of my spider.py code:

from scrapy.utils.log import configure_logging 
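
For reference, this is roughly how the top of spider.py might look with that line in place; the surrounding imports are only an illustrative assumption, not from the original answer:

import logging

import scrapy
from scrapy.utils.log import configure_logging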

It seems that you're never calling your parse_page method. Try commenting out your parse method and you'll get a NotImplementedError, because the spider starts but you're effectively telling it to 'do nothing'.

Maybe if you implement your parse_page method and call it from parse, it will work:

def parse(self, response):
    self.logger.info('Parse function called on %s', response.url)
    # Delegate to parse_page; if parse_page yields items or requests,
    # use "yield from self.parse_page(response)" instead of a bare call
    self.parse_page(response)
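
The original question never showed a parse_page body, so here is a minimal hypothetical sketch of what the full spider could look like; the spider name, start URL, CSS selector, and item fields are all assumptions made purely for illustration:

import scrapy


class MySpider(scrapy.Spider):
    name = 'my_spider'                     # assumed name, not from the question
    start_urls = ['https://example.com']   # placeholder URL

    def parse(self, response):
        self.logger.info('Parse function called on %s', response.url)
        # Hand the response to parse_page and pass its output back to Scrapy
        yield from self.parse_page(response)

    def parse_page(self, response):
        # Illustrative extraction logic -- the selector is an assumption
        for title in response.css('h1::text').getall():
            self.logger.info('Found title: %s', title)
            yield {'title': title, 'url': response.url}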

Hope it helps you.


For logging I just put this in the spider class:

import logging

import scrapy
from scrapy.utils.log import configure_logging


class SomeSpider(scrapy.Spider):
    name = 'some_spider'  # placeholder name so the spider can actually be run

    # Replace Scrapy's default root handler and send all log output to log.txt
    configure_logging(install_root_handler=False)
    logging.basicConfig(
        filename='log.txt',
        format='%(levelname)s: %(message)s',
        level=logging.INFO
    )

This will write all Scrapy log output to a log.txt file in the directory you run the crawl from (usually the project root).
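
Alternatively, Scrapy can write its log to a file through its own settings instead of logging.basicConfig; a short sketch using the documented LOG_FILE and LOG_LEVEL settings per spider (the spider name and file path are placeholders):

import scrapy


class SomeSpider(scrapy.Spider):
    name = 'some_spider'  # placeholder name
    # Per-spider overrides of Scrapy's logging settings
    custom_settings = {
        'LOG_FILE': 'log.txt',   # write Scrapy's log output to this file
        'LOG_LEVEL': 'INFO',     # same threshold as the basicConfig example
    }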

If you want to log something manually, don't use the old scrapy.log module, which is deprecated. Just use Python's standard logging module:

import logging
logging.error("Some error")
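
If you'd rather have the records carry a meaningful logger name instead of going through the root logger, the usual Python pattern is a module-level logger; a brief sketch (the messages are just examples):

import logging

# Module-level logger so records show which module they came from
logger = logging.getLogger(__name__)

logger.error("Some error")
logger.info("Parsed %d items", 10)  # placeholder count, illustrative only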