
Can I feed a crawler with multi urls? #4

Open
chzhcpu opened this issue Dec 12, 2016 · 1 comment
chzhcpu commented Dec 12, 2016

Can I feed a crawler with more than one URL?

@chzhcpu chzhcpu changed the title Can I feed a crawler multi urls? Can I feed a crawler with multi urls? Dec 12, 2016
@esbencarlsen (Owner) commented

Hi,

Yes, just call the extension method CrawlSeed as many times as you have URLs, or create multiple instances of the crawler.
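A rough sketch of the first approach described above. This assumes a `Crawler` type whose `CrawlSeed` extension method accepts a start URL, as mentioned in the reply; the exact class names, namespaces, and signatures are illustrative, not taken from the library's documented API:

```csharp
// Sketch only: assumes a crawler object exposing the CrawlSeed extension
// method mentioned above. Names and signatures are illustrative.
var crawler = new Crawler();

// Seed the same crawler instance once per start URL.
crawler.CrawlSeed(new Uri("http://example.com/"));
crawler.CrawlSeed(new Uri("http://example.org/"));

// Run the crawl over all seeded URLs.
crawler.Crawl();
```

The alternative mentioned in the reply is simply to construct one crawler instance per URL and run each independently, which keeps the crawls isolated at the cost of some duplicated setup.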
