
json.dumps raises error when the scrapy-fieldstats extension is enabled #16

Closed
my8100 opened this issue Aug 20, 2019 · 1 comment
Labels: bug (Something isn't working)

Comments

my8100 (Owner) commented Aug 20, 2019

my8100/scrapydweb#88 (comment)

Hello,

I'm getting this error:

[2019-08-20 07:41:48,062] INFO in werkzeug: 36.84.63.175 - - [20/Aug/2019 07:41:48] "POST /1/api/daemonstatus/ HTTP/1.1" 200 -
{"downloader/request_bytes": 296, "downloader/request_count": 1, "downloader/request_method_count/GET": 1, "downloader/response_bytes": 2246568, "downloader/response_count": 1, "downloader/response_status_count/200": 1, "fields_coverage": {"distributor_id": "100%", "has_change": "100%", "last_update": "100%", "sk": "100%", "status": "100%", "stock": "100%"}
Traceback (most recent call last):
  File "/root/Virtualenvs/scrapy/local/lib/python2.7/site-packages/logparser/common.py", line 163, in parse_crawler_stats
    return json.loads(text)
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 380, in raw_decode
    obj, end = self.scan_once(s, idx)
ValueError: Expecting object: line 12 column 37 (char 469)
[2019-08-20 07:41:57,411] ERROR in logparser.logparser:
Traceback (most recent call last):
  File "/root/Virtualenvs/scrapy/local/lib/python2.7/site-packages/logparser/logparser.py", line 467, in run
    data = self.handle_logfile(log_path)
  File "/root/Virtualenvs/scrapy/local/lib/python2.7/site-packages/logparser/logparser.py", line 255, in handle_logfile
    self.parse_appended_log(data, appended_log)
  File "/root/Virtualenvs/scrapy/local/lib/python2.7/site-packages/logparser/logparser.py", line 315, in parse_appended_log
    self.logger.debug("Parsed data_ from appended_log:\n%s", self.json_dumps(data_))
  File "/root/Virtualenvs/scrapy/local/lib/python2.7/site-packages/logparser/logparser.py", line 284, in json_dumps
    return json.dumps(obj, ensure_ascii=False, indent=4, sort_keys=sort_keys)
  File "/usr/lib/python2.7/json/__init__.py", line 251, in dumps
    sort_keys=sort_keys, **kw).encode(obj)
  File "/usr/lib/python2.7/json/encoder.py", line 209, in encode
    chunks = list(chunks)
  File "/usr/lib/python2.7/json/encoder.py", line 434, in _iterencode
    for chunk in _iterencode_dict(o, _current_indent_level):
  File "/usr/lib/python2.7/json/encoder.py", line 408, in _iterencode_dict
    for chunk in chunks:
  File "/usr/lib/python2.7/json/encoder.py", line 408, in _iterencode_dict
    for chunk in chunks:
  File "/usr/lib/python2.7/json/encoder.py", line 442, in _iterencode
    o = _default(o)
  File "/usr/lib/python2.7/json/encoder.py", line 184, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: ValueError('Expecting object: line 12 column 37 (char 469)',) is not JSON serializable

I think this is due to the use of:

https://github.com/stummjr/scrapy-fieldstats

which stores its field coverage stats as a nested object:

"fields_coverage": {"distributor_id": "100%",
"has_change": "100%",
"last_update": "100%",
"sk": "100%",
"status": "100%",
"stock": "100%"}

my8100 added the bug label Aug 20, 2019
my8100 (Owner, Author) commented Aug 21, 2019

@webtekindo
The bug is fixed in PR #17.
You can run pip install --upgrade git+https://github.com/my8100/logparser.git to get the fix.
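For reference, a generic way to keep json.dumps from crashing on values it cannot serialize is to pass a default callable; the safe_json_dumps helper below is only a hypothetical illustration, not necessarily what PR #17 changed:

import json

def safe_json_dumps(obj, sort_keys=False):
    # Hypothetical helper: fall back to repr() for anything the encoder
    # cannot handle natively (e.g. an exception object stored in the data).
    return json.dumps(obj, ensure_ascii=False, indent=4,
                      sort_keys=sort_keys, default=repr)

# No longer raises; the ValueError is dumped as its repr() string.
print(safe_json_dumps({"crawler_stats": ValueError("Expecting object")}))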

my8100 closed this as completed Aug 30, 2019