Passing arguments to process.crawl in Scrapy (Python)

If you have Scrapyd and want to schedule the spider, pass the spider arguments as extra -d fields:

curl http://localhost:6800/schedule.json -d project=projectname -d spider=spidername -d first='James' -d last='Bond'
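
If you prefer to make the same schedule.json call from Python rather than curl, a minimal sketch with the requests library (assuming Scrapyd is running locally on the default port 6800 and the project has already been deployed as projectname) looks like this:

import requests

# project and spider identify what to run; the extra fields (first, last)
# are passed through to the spider as spider arguments.
response = requests.post(
    "http://localhost:6800/schedule.json",
    data={
        "project": "projectname",
        "spider": "spidername",
        "first": "James",
        "last": "Bond",
    },
)
print(response.json())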


If you run the spider from a script, pass the spider arguments as keyword arguments to the process.crawl method:

process.crawl(spider, input='inputargument', first='James', last='Bond')
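Every keyword argument after the spider class (input, first, last) is forwarded to the spider's __init__. A minimal runnable sketch, assuming a hypothetical linkedin_anonymous spider and a placeholder search URL:

import scrapy
from scrapy.crawler import CrawlerProcess


class LinkedinAnonymousSpider(scrapy.Spider):
    name = "linkedin_anonymous"

    def __init__(self, first=None, last=None, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.first = first
        self.last = last

    def start_requests(self):
        # Build the start URL from the arguments supplied at crawl time
        # (the URL is a placeholder, not a real endpoint).
        url = f"https://www.example.com/search?first={self.first}&last={self.last}"
        yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        yield {"first": self.first, "last": self.last, "url": response.url}


process = CrawlerProcess(settings={"LOG_LEVEL": "INFO"})
process.crawl(LinkedinAnonymousSpider, first="James", last="Bond")
process.start()  # blocks until the crawl finishes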

Or you can do it the easy way and drive the scrapy crawl command from your script with cmdline.execute:

from scrapy import cmdline

cmdline.execute("scrapy crawl linkedin_anonymous -a first=James -a last=Bond -o output.json".split())
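The -a key=value pairs here reach the spider the same way as the keyword arguments passed to process.crawl above. If you rely on Scrapy's default Spider.__init__, they are simply set as instance attributes, so the spider from the earlier sketch could drop its explicit __init__ and read them with getattr (again with a placeholder URL):

import scrapy


class LinkedinAnonymousSpider(scrapy.Spider):
    name = "linkedin_anonymous"

    def start_requests(self):
        # first/last are set as attributes by the default Spider.__init__
        # from the -a arguments; getattr() gives a fallback if one is missing.
        first = getattr(self, "first", None)
        last = getattr(self, "last", None)
        yield scrapy.Request(
            f"https://www.example.com/search?first={first}&last={last}",
            callback=self.parse,
        )

    def parse(self, response):
        yield {"url": response.url}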