好得很程序员自学网


Starting multiple Scrapy spiders at once

1. Create a new commands folder inside the project directory.

2. Inside the commands folder, create a file named crawlall.py.

3. In crawlall.py, write a Command class that inherits from scrapy.commands.ScrapyCommand:

from scrapy.commands import ScrapyCommand


class Command(ScrapyCommand):
    # The command only makes sense inside a Scrapy project
    requires_project = True

    def syntax(self):
        return "[options]"

    def short_desc(self):
        return "Runs all of the spiders"

    def run(self, args, opts):
        # spider_loader replaces the deprecated crawler_process.spiders
        spider_list = self.crawler_process.spider_loader.list()
        # Schedule every spider, then start the reactor once so they
        # all run in the same process
        for name in spider_list:
            self.crawler_process.crawl(name, **opts.__dict__)
        self.crawler_process.start()
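For Scrapy to discover the new command, the commands folder must be an importable package (it needs an empty __init__.py) and must be registered in settings.py via the COMMANDS_MODULE setting. A minimal sketch, where `myproject` is a placeholder for your project's actual package name:

```python
# settings.py -- register the package that holds crawlall.py
# "myproject" is a placeholder; substitute your project's package name
COMMANDS_MODULE = "myproject.commands"
```

After this, running `scrapy crawlall` from the project root schedules and starts every spider registered in the project.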
