2019-10-21 19:01:00 [scrapy.core.engine] INFO: Spider opened
2019-10-21 19:01:00 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-10-21 19:01:00 [scrapy.extensions.telnet] INFO: Telnet console listening on 127.0.0.1:6023
2019-10-21 19:01:01 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://amp-api-search-edge.apps.apple.com/v1/catalog/cn/search?term=%E7%AE%A1%E8%B4%A6&platform=iphone&include=apps,top-apps&bubble[search]=apps,developers,groupings,editorial-items,app-bundles,in-apps&l=zh-Hans-CN&extend=editorialBadgeInfo,messagesScreenshots,minimumOSVersion,requiredCapabilities,screenshotsByType,supportsFunCamera,videoPreviewsByType> (referer: None)
2019-10-21 19:01:01 [scrapy.core.scraper] DEBUG: Scraped from <200 https://amp-api-search-edge.apps.apple.com/v1/catalog/cn/search?term=%E7%AE%A1%E8%B4%A6&platform=iphone&include=apps,top-apps&bubble[search]=apps,developers,groupings,editorial-items,app-bundles,in-apps&l=zh-Hans-CN&extend=editorialBadgeInfo,messagesScreenshots,minimumOSVersion,requiredCapabilities,screenshotsByType,supportsFunCamera,videoPreviewsByType>
None
2019-10-21 19:01:01 [scrapy.core.engine] INFO: Closing spider (finished)
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/__init__.py", line 861, in emit
msg = self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 734, in format
return fmt.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 465, in format
record.message = record.getMessage()
File "/usr/lib/python2.7/logging/__init__.py", line 329, in getMessage
msg = msg % self.args
File "/home/project/release/venv2.7/local/lib/python2.7/site-packages/scrapy/spiders/__init__.py", line 107, in __str__
return "<%s %r at 0x%0x>" % (type(self).__name__, self.name, id(self))
...... the two frames above repeat n more times (red log lines) .......
File "/home/project/release/venv2.7/local/lib/python2.7/site-packages/scrapy/spiders/__init__.py", line 107, in __str__
return "<%s %r at 0x%0x>" % (type(self).__name__, self.name, id(self))
RuntimeError: maximum recursion depth exceeded while calling a Python object
Logged from file signal.py, line 57
2019-10-21 19:01:01 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 812,
'downloader/request_count': 1,
'downloader/request_method_count/GET': 1,
'downloader/response_bytes': 19255,
'downloader/response_count': 1,
'downloader/response_status_count/200': 1,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2019, 10, 21, 11, 1, 1, 765291),
'item_scraped_count': 1,
'log_count/DEBUG': 2,
'log_count/ERROR': 1,
'log_count/INFO': 9,
'memusage/max': 1424973824,
'memusage/startup': 1424973824,
'response_received_count': 1,
'scheduler/dequeued': 1,
'scheduler/dequeued/memory': 1,
'scheduler/enqueued': 1,
'scheduler/enqueued/memory': 1,
'start_time': datetime.datetime(2019, 10, 21, 11, 1, 0, 418993)}
2019-10-21 19:01:01 [scrapy.core.engine] INFO: Spider closed (finished)

Process finished with exit code 0

Cause: the spider's `closed()` function raised an exception! When Scrapy's signal dispatcher catches the failure and tries to log it (hence "Logged from file signal.py, line 57"), formatting the log message recurses inside `Spider.__str__` (the repeated frames above) until Python's recursion limit is hit, so the real exception gets masked as a RuntimeError.
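For illustration, here is a minimal sketch of the failure mode and a defensive fix. The spider name and the `flush_results` helper are made up for this example, and whether the log formatting actually recurses depends on the environment; the underlying problem, an exception escaping `closed()`, is the same either way.

# -*- coding: utf-8 -*-
import logging
import traceback

import scrapy


class DemoSpider(scrapy.Spider):
    # Hypothetical spider; name, URL, and helpers are illustrative only.
    name = "demo"
    start_urls = ["https://example.com"]

    def parse(self, response):
        yield {"title": response.css("title::text").extract_first()}

    # Scrapy invokes closed() via the spider_closed signal when the
    # crawl ends.  An unguarded version like
    #     def closed(self, reason):
    #         self.flush_results()   # raises -> masked as RecursionError
    # lets the exception escape into the signal dispatcher, which then
    # tries to log it and can blow the recursion limit as shown above.
    def closed(self, reason):
        try:
            self.flush_results()  # hypothetical cleanup helper
        except Exception:
            # Surface the real error ourselves instead of letting the
            # logging machinery swallow it behind a RecursionError.
            logging.error("closed() failed:\n%s", traceback.format_exc())

    def flush_results(self):
        # Stand-in for whatever cleanup the real spider performs.
        pass

With the unguarded version, `scrapy crawl demo` ends in a traceback like the one above; with the try/except in place, the real exception is printed and the crawl still shuts down cleanly.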

