Netscan is a network scanner built for large-scope pentesting. It lets you run your recon phase across more than 20 protocols very quickly. All results are stored in an Elasticsearch database and browsable with the power of Kibana. Scan, Filter, Exploit!
- FTP scanner
- MySQL scanner
- MongoDB scanner
- Ping scanner
- Port scanner
- Postgres scanner
- RDP scanner
- Redis scanner
- Rsync scanner
- RTSP scanner
- SMB scanner
- SSH scanner
- Telnet scanner
- VNC scanner
- WinRM scanner
- AD scanner
- HTTP scanner
- DNS scanner
- SNMP scanner
- TLS scanner
- Display a specific module help menu
- Run a ping scan to discover devices on the network
- Run a port scan to get all open ports with nmap options
- Display the results in a way-too-cool interface!
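As a rough sketch of what a session can look like (the flags and module names below are assumptions, not the tool's documented syntax; use the help menus for the real ones):
$> netscan -h             # list the available modules (entry point named in the install notes below)
$> netscan ping -h        # display a specific module's help menu (module name assumed)
$> netscan portscan -h    # same for the port scanner (module name assumed)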
Run the following command and enjoy immediately:
~/netscan$> ./configure_docker.sh
The previous command will build and/or start all the Docker containers used by the netscan framework. It will create and configure:
- an elasticsearch container
- a kibana container
- a neo4j container
When everything is up and running, you can use the netscan command and enjoy.
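To confirm the containers are actually up before launching netscan, a quick check with the standard Docker CLI and the exposed ports works (container names may differ from the ones created by configure_docker.sh):
$> docker ps --format 'table {{.Names}}\t{{.Status}}\t{{.Ports}}'
$> curl http://127.0.0.1:9200          # Elasticsearch answers with its cluster banner
$> curl -I http://127.0.0.1:5601       # Kibana answers on its web port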
- Install dependencies
$> pip3 install -r requirements.txt
- Create the configuration file
$> cp config.cfg.sample config.cfg
- If needed, deploy Elasticsearch and Kibana on your system (see the sketch below).
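If you are not going through configure_docker.sh, one way to get Elasticsearch and Kibana up is with the official images. This is only a sketch for a local lab (single node, example version tag, no security hardening), not the project's official procedure:
$> docker network create netscan-net
$> docker run -d --name elasticsearch --net netscan-net -p 9200:9200 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.17.9
$> docker run -d --name kibana --net netscan-net -p 5601:5601 -e "ELASTICSEARCH_HOSTS=http://elasticsearch:9200" docker.elastic.co/kibana/kibana:7.17.9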
Note:
The docker version is already configured with default settings. You're good to go.
On your system or in the docker container:
- Edit the config.cfg file to set the name of your current pentest session under the [Global] section (a sketch follows below).
- Enable Elasticsearch if you want to send all your scan outputs to the database under the [Elasticsearch] section.
- Configure the Kibana dashboards (via the GUI or the CLI, as described below).
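For the first two items, a minimal config.cfg could look like the sketch below. The key names are assumptions used for illustration; mirror whatever config.cfg.sample actually provides:
[Global]
# Name of the current pentest session (key name assumed)
session = acme_internal
[Elasticsearch]
# Push scan output to the database (key names and values assumed)
enabled = true
host = 127.0.0.1:9200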
Via GUI
The Kibana dashboards are located at kibana/kibana_dashboards.ndjson.
- Open kibana at http://127.0.0.1:5601/
- Go to "Management > Stack Management"
- Go to "Kibana > Saved Objects"
- Click on "Import"
- Select the kibana_dashboards.ndjson file provided in this repo
- Click on "Import"
Via CLI
$> curl -X POST 'http://127.0.0.1:5601/api/saved_objects/_import?createNewCopies=true' -H "kbn-xsrf: true" --form "file=@$(pwd)/kibana/kibana_dashboards.ndjson"
The dashboards should now be available within Kibana.
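To double-check from the command line, the standard Kibana saved objects API can list what was imported:
$> curl 'http://127.0.0.1:5601/api/saved_objects/_find?type=dashboard' -H "kbn-xsrf: true"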
Problem: Elasticsearch does not have enough memory-mapped areas to run smoothly.
Solution: Run the following command on your system
$> sudo sysctl -w vm.max_map_count=262144
Doc: https://www.elastic.co/guide/en/elasticsearch/reference/current/_maximum_map_count_check.html
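A setting applied with sysctl -w only lasts until the next reboot. To make it persistent, drop it into a sysctl configuration file (the file name below is just an example):
$> echo 'vm.max_map_count=262144' | sudo tee /etc/sysctl.d/99-elasticsearch.conf
$> sudo sysctl --system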
Problem: Elasticsearch needs at least 10% free space on your hard disk (whatever the disk size).
Solution: You can disable the disk usage threshold by running the following command on your system
$> curl -X PUT -H "Content-Type: application/json" http://localhost:9200/_cluster/settings -d '{ "transient": { "cluster.routing.allocation.disk.threshold_enabled": false } }'
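Once you have freed up some disk space, the threshold can be restored by resetting the transient setting to null, and cluster health can be checked at any time (both are standard Elasticsearch cluster APIs):
$> curl -X PUT -H "Content-Type: application/json" http://localhost:9200/_cluster/settings -d '{ "transient": { "cluster.routing.allocation.disk.threshold_enabled": null } }'
$> curl 'http://localhost:9200/_cluster/health?pretty'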