lyubomirr/crawlalyzer

Concurrent web crawler that tries to detect the technologies used by the websites it crawls

The crawler uses Wappalyzer's technology fingerprints to detect the technologies a website uses. After crawling finishes, all detected technologies are aggregated per root URL.
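
As an illustration only (this is not the project's actual code, and the type and function names are assumptions), a Wappalyzer-style fingerprint can be thought of as a technology name plus patterns tested against a fetched page:

    package main

    import (
        "fmt"
        "regexp"
    )

    // fingerprint is a simplified stand-in for a Wappalyzer technology entry:
    // a technology name plus a pattern tested against the page HTML.
    type fingerprint struct {
        Name        string
        HTMLPattern *regexp.Regexp
    }

    // detect returns the names of all technologies whose pattern matches the HTML.
    func detect(html string, fps []fingerprint) []string {
        var found []string
        for _, fp := range fps {
            if fp.HTMLPattern.MatchString(html) {
                found = append(found, fp.Name)
            }
        }
        return found
    }

    func main() {
        fps := []fingerprint{
            {Name: "jQuery", HTMLPattern: regexp.MustCompile(`jquery[\w.-]*\.js`)},
        }
        fmt.Println(detect(`<script src="/js/jquery-3.6.0.min.js"></script>`, fps))
        // Output: [jQuery]
    }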

Usage:

There are two command-line arguments:

  • urls - Comma-separated list of URLs to crawl.
  • follow-external - Specifies whether to follow external links.

Example usage: go run main.go -urls=https://google.com -follow-external=true
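
A minimal sketch of how the two flags could be parsed with Go's standard flag package (shown for illustration; the project's actual main.go may differ):

    package main

    import (
        "flag"
        "fmt"
        "strings"
    )

    func main() {
        // The two flags described above; the defaults here are assumptions.
        urls := flag.String("urls", "", "Comma-separated list of URLs to crawl")
        followExternal := flag.Bool("follow-external", false, "Whether to follow external links")
        flag.Parse()

        roots := strings.Split(*urls, ",")
        fmt.Println("root URLs:", roots, "follow external:", *followExternal)
    }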

When you press a key, crawling stops and the technologies aggregated for the root URLs and their links are saved to a fingerprints.json file.
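
A rough sketch of that shutdown step, assuming the crawler exposes some stop hook and the aggregated results are a map from root URL to technology names (both are assumptions, not the project's actual API):

    package main

    import (
        "bufio"
        "encoding/json"
        "fmt"
        "os"
    )

    // waitAndSave blocks until Enter is pressed, signals the crawler to stop via
    // the provided hook, and writes the aggregated results to fingerprints.json.
    func waitAndSave(stop func(), results map[string][]string) error {
        fmt.Println("Press Enter to stop crawling...")
        _, _ = bufio.NewReader(os.Stdin).ReadString('\n')

        stop() // hypothetical hook that tells the crawler goroutines to finish

        data, err := json.MarshalIndent(results, "", "  ")
        if err != nil {
            return err
        }
        return os.WriteFile("fingerprints.json", data, 0644)
    }

    func main() {
        results := map[string][]string{"https://example.com": {"nginx", "jQuery"}}
        if err := waitAndSave(func() { /* stop crawler here */ }, results); err != nil {
            fmt.Println("saving fingerprints failed:", err)
        }
    }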
