Python.Scrapy.14-scrapy-source-code-analysis-part-4
Scrapy Source Code Analysis Series - 4: The scrapy.commands Subpackage
The scrapy.commands subpackage defines the subcommands available through the scrapy command: bench, check, crawl, deploy, edit, fetch,
genspider, list, parse, runspider, settings, shell, startproject, version, and view. Each subcommand module defines a class named Command
that inherits from ScrapyCommand.
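A user-defined subcommand follows the same pattern. The sketch below is a hypothetical example, not part of Scrapy: the module contents are made up, and only the requires_project attribute and the syntax()/short_desc()/run() hooks are assumed from the ScrapyCommand interface (the import path of ScrapyCommand varies between Scrapy versions).

# Hypothetical example of a subcommand module, not part of Scrapy itself.
from scrapy.command import ScrapyCommand      # scrapy.commands.ScrapyCommand in newer versions
from scrapy.exceptions import UsageError

class Command(ScrapyCommand):

    requires_project = False                  # may also run outside a Scrapy project

    def syntax(self):
        return "<name>"                       # shown in "scrapy <cmd> -h"

    def short_desc(self):
        return "Print a greeting (demo command)"

    def run(self, args, opts):
        if len(args) != 1:
            raise UsageError()                # cmdline.py turns this into a usage message
        print("hello, %s" % args[0])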
Let's start with the crawl subcommand, which launches a spider.
1. crawl.py
The method of interest is run(self, args, opts):
def run(self, args, opts):
    if len(args) < 1:
        raise UsageError()
    elif len(args) > 1:
        raise UsageError("running 'scrapy crawl' with more than one spider is no longer supported")
    spname = args[0]

    crawler = self.crawler_process.create_crawler()         # A
    spider = crawler.spiders.create(spname, **opts.spargs)  # B
    crawler.crawl(spider)                                    # C
    self.crawler_process.start()                             # D
So where is this run() method actually called from? Go back to Python.Scrapy.11-scrapy-source-code-analysis-part-1,
section "1.2 cmdline.py command.py", and the discussion of _run_print_help().
A: Creates a Crawler object named crawler. When the Crawler is constructed, its instance attribute spiders (a SpiderManager) is created as well, as shown below:
class Crawler(object):

    def __init__(self, settings):
        self.configured = False
        self.settings = settings
        self.signals = SignalManager(self)
        self.stats = load_object(settings['STATS_CLASS'])(self)
        self._start_requests = lambda: ()
        self._spider = None
        # TODO: move SpiderManager to CrawlerProcess
        spman_cls = load_object(self.settings['SPIDER_MANAGER_CLASS'])
        self.spiders = spman_cls.from_crawler(self)  # self.spiders is a SpiderManager instance
A Crawler object holds one SpiderManager object, and the SpiderManager manages multiple Spiders.
B: Obtains the Spider object by name (via crawler.spiders, the SpiderManager created in step A).
C: Binds the Spider object to the Crawler object (attaches the crawler to the spider).
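What crawl() does here is small: it records the spider and its start requests on the crawler so that Crawler.start() (shown below under D) can open it later. A simplified paraphrase follows; minor details may differ between Scrapy versions.

# Simplified paraphrase of Crawler.crawl() -- not a verbatim copy.
def crawl(self, spider, requests=None):
    spider.set_crawler(self)                        # give the spider a back-reference to its crawler
    if requests is None:
        self._start_requests = spider.start_requests
    else:
        self._start_requests = lambda: requests
    self._spider = spider                           # picked up later by Crawler.start()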
D: The start() method of the CrawlerProcess class looks like this:
def start(self):
    if self.start_crawling():
        self.start_reactor()

def start_crawling(self):
    log.scrapy_info(self.settings)
    return self._start_crawler() is not None

def start_reactor(self):
    if self.settings.getbool('DNSCACHE_ENABLED'):
        reactor.installResolver(CachingThreadedResolver(reactor))
    reactor.addSystemEventTrigger('before', 'shutdown', self.stop)
    reactor.run(installSignalHandlers=False)  # blocking call

def _start_crawler(self):
    if not self.crawlers or self.stopping:
        return

    name, crawler = self.crawlers.popitem()
    self._active_crawler = crawler
    sflo = log.start_from_crawler(crawler)
    crawler.configure()
    crawler.install()
    crawler.signals.connect(crawler.uninstall, signals.engine_stopped)
    if sflo:
        crawler.signals.connect(sflo.stop, signals.engine_stopped)
    crawler.signals.connect(self._check_done, signals.engine_stopped)
    crawler.start()  # calls Crawler.start()
    return name, crawler
The start() method of the Crawler class looks like this:
def start(self):
    yield defer.maybeDeferred(self.configure)
    if self._spider:
        yield self.engine.open_spider(self._spider, self._start_requests())  # establishes the link to the engine (ExecutionEngine)
    yield defer.maybeDeferred(self.engine.start)
The ExecutionEngine class will be covered in the analysis of the scrapy.core subpackage.
2. startproject.py
3. How subcommands are loaded
The execute() function in cmdline.py contains the following lines:
inproject = inside_project()
cmds = _get_commands_dict(settings, inproject)
cmdname = _pop_command_name(argv)
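_pop_command_name() pulls the subcommand name (for example "crawl") out of argv. A plausible sketch of the idea follows; this is not the verbatim Scrapy implementation.

# Plausible sketch, not the verbatim Scrapy code: treat the first token in argv
# that is not an option flag as the subcommand name and remove it from argv.
def _pop_command_name(argv):
    for i, arg in enumerate(argv[1:], start=1):
        if not arg.startswith('-'):
            return argv.pop(i)          # e.g. 'crawl' for "scrapy crawl myspider"
    return None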
_get_commands_dict():
def _get_commands_dict(settings, inproject):
    cmds = _get_commands_from_module('scrapy.commands', inproject)
    cmds.update(_get_commands_from_entry_points(inproject))
    cmds_module = settings['COMMANDS_MODULE']
    if cmds_module:
        cmds.update(_get_commands_from_module(cmds_module, inproject))
    return cmds
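Note the COMMANDS_MODULE setting in the middle of this function: it lets a project contribute its own subcommands on top of the built-in scrapy.commands and any commands registered via entry points. For example (the project and package names below are made up):

# settings.py of a hypothetical project: also collect Command subclasses
# from the package myproject/commands/.
COMMANDS_MODULE = 'myproject.commands'

Each module inside that package defines its own Command subclass, exactly like the built-in modules under scrapy/commands/, and is then dispatched like any other subcommand.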
_get_commands_from_module():
def _get_commands_from_module(module, inproject):
    d = {}
    for cmd in _iter_command_classes(module):
        if inproject or not cmd.requires_project:
            cmdname = cmd.__module__.split('.')[-1]
            d[cmdname] = cmd()
    return d
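_iter_command_classes() is the piece that actually discovers the Command classes: it walks every module under the given package and yields the ScrapyCommand subclasses defined there. A simplified paraphrase follows (the real function relies on scrapy.utils.misc.walk_modules; minor details may differ between versions).

# Simplified paraphrase of _iter_command_classes() -- not a verbatim copy.
import inspect

from scrapy.command import ScrapyCommand      # scrapy.commands.ScrapyCommand in newer versions
from scrapy.utils.misc import walk_modules

def _iter_command_classes(module_name):
    for module in walk_modules(module_name):  # import every module in the package, recursively
        for obj in vars(module).values():
            if (inspect.isclass(obj)
                    and issubclass(obj, ScrapyCommand)
                    and obj.__module__ == module.__name__):  # skip classes merely imported here
                yield obj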
To Be Continued
Next, we will look at the settings-related logic: Python.Scrapy.15-scrapy-source-code-analysis-part-5