support EventSource/long polling in scrapyd #55
Comments
Could something like this work through a register functionality, where the client can request that scrapyd push results or a "finished" or "error" message about a specific job using SSE? I would be interested in trying to write code for it.
Yes, you subscribe/register to an EventSource resource with a GET request.
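For context, Server-Sent Events messages are just plain text over a long-lived GET response: an optional `event:` line, one or more `data:` lines, and a blank line terminating each message. A minimal sketch of framing such an event (the payload fields here are illustrative, not a real Scrapyd schema):

```python
import json

def format_sse(data, event=None):
    """Frame a payload as a Server-Sent Events message.

    An SSE message is plain text: an optional "event:" line, a
    "data:" line, and a blank line terminating the message.
    """
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    lines.append(f"data: {json.dumps(data)}")
    return "\n".join(lines) + "\n\n"

# The kind of message Scrapyd could push when a job ends
# (hypothetical field names, for illustration only):
msg = format_sse({"job": "c9514588", "status": "finished"}, event="finished")
```

A browser-side `EventSource` would then receive these as `finished` events without any polling.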
I'm interested in the enhancement too, can I help with the implementation?
@aleroot, sure.
@Digenis In the meantime I've added a simple status.json to be able to poll the status of a specific job from the client. Once I know that the status is finished, I can then call the items URL: http://localhost:6800/items/myscraper/myspider/c9514588cf9511e7a2140242ac110003.jl See pull request #260.
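The client-side polling loop this implies can be sketched as follows. This is a minimal sketch, not Scrapyd's API: the response key `currstate` and the terminal states are assumptions, and `fetch_status` is an injected callable (e.g. a wrapper around an HTTP GET of status.json) so the loop stays testable:

```python
import time

def poll_job(fetch_status, job_id, interval=5.0, timeout=300.0, sleep=time.sleep):
    """Poll a status endpoint until the job reaches a terminal state.

    fetch_status(job_id) is expected to return a dict such as
    {"currstate": "finished"}; the key name and state values are
    assumptions here, not a documented Scrapyd schema.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = fetch_status(job_id).get("currstate")
        if state in ("finished", "error"):
            return state
        sleep(interval)  # e.g. the 5-second interval mentioned below
    raise TimeoutError(f"job {job_id} not finished after {timeout}s")
```

Once `poll_job` returns `"finished"`, the client can fetch the items feed URL shown above.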
Currently I am polling every 5 seconds to get the result of the scrapyd job. Can't we just use the crawler callback method from the scrapyd server?
The solution in my projects is to use the
I will close this for now as wontfix, since to push messages to known subscribers, you can simply create a Scrapy extension and implement the push logic in it.

If there is a real need for pushing messages to unknown subscribers (like a pub/sub model), then that would have to be done at the Scrapyd level, I believe, since the subscriber could subscribe after the crawl started.
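To illustrate the pub/sub concern, here is a minimal sketch of the bookkeeping Scrapyd would need (plain Python, no Scrapy/Twisted wiring; all names are hypothetical). Replaying the last event to new subscribers is what makes the subscribe-after-start case work:

```python
from collections import defaultdict

class JobEventHub:
    """Per-job publish/subscribe with last-event replay (sketch).

    Subscribers are plain callables; in a real Scrapyd integration they
    would instead write SSE frames to an open HTTP response.
    """
    def __init__(self):
        self._subscribers = defaultdict(list)
        self._last_event = {}

    def subscribe(self, job_id, callback):
        self._subscribers[job_id].append(callback)
        # Replay the latest event so a subscriber who joins after the
        # crawl started (or even finished) still learns its outcome.
        if job_id in self._last_event:
            callback(self._last_event[job_id])

    def publish(self, job_id, event):
        self._last_event[job_id] = event
        for callback in self._subscribers[job_id]:
            callback(event)
```

A Scrapy extension, by contrast, only needs the `publish` half: it can connect a handler to the `spider_closed` signal and notify its known endpoints directly, which is why the known-subscriber case needs nothing from Scrapyd.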
Moved from: scrapy/scrapy#335
Originally by: @graingert