Exnoscan is a simple Bash script that can help you identify gaps. We often only monitor what we know about, so Exnoscan aims to identify what you don't. It relies on the following tools to do the job:
- Nmap Parser: https://github.com/laconicwolf/Nmap-Scan-to-CSV.git
- Python 3.7+
- Optional: TheHarvester [https://github.com/laramies/theHarvester/wiki/Installation]
I personally run it on a Kali Linux box, as most of these tools come pre-installed.
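Before the first run it can be worth checking the core tools are on your PATH. A minimal sketch (the exact tool names Exnoscan checks for are my assumption based on the list above):

```shell
#!/usr/bin/env bash
# Hedged sketch: warn about any missing dependencies before running Exnoscan.
# The tool names below are assumptions, not the script's own checks.
for tool in nmap python3 git; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "missing: $tool" >&2
    fi
done
```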
How To Run
Once the dependencies are met, download the script and run: bash exnoscan.sh
This will download the necessary files into the directory it is run from (minus TheHarvester):
Nothing will scan yet, as domains.txt needs to be populated first. It is stored within the scan folder:
domains.txt is required, while iplist.txt and urls.txt are optional. Here is the intended purpose of each:
- domains.txt – Populated with your email domains so that subdomain enumeration can run.
- iplist.txt – Entries here are added to the list scanned by Nmap. It can contain single IPs or CIDR notation.
- urls.txt – Custom URLs you want scanned by both Nmap and DirSearch.
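To illustrate, the three input files might be populated like this (the values are purely examples using reserved documentation addresses, not real targets):

```shell
# Example input files for Exnoscan (hypothetical values only).
mkdir -p scan

# scan/domains.txt - email domains for subdomain enumeration (required)
printf '%s\n' 'example.com' 'example.org' > scan/domains.txt

# scan/iplist.txt - single IPs or CIDR ranges added to the Nmap scan (optional)
printf '%s\n' '192.0.2.10' '198.51.100.0/24' > scan/iplist.txt

# scan/urls.txt - extra URLs scanned by both Nmap and DirSearch (optional)
printf '%s\n' 'https://portal.example.com' > scan/urls.txt
```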
The following settings can be tweaked to change the type of scan. Use the creators' links above to help shape them.
Once the files are populated and you are happy with them, run the script again to start the scan: bash exnoscan.sh
The script does the following:
- Takes domains.txt and runs subdomain enumeration.
- Runs nc to find common web hosts.
- Scans each site to find hidden directories.
- Compiles the scanning list using all three txt files mentioned above.
- Runs an Nmap scan against the compiled list.
- Parses the log into a CSV.
- [If installed] Runs email enumeration.
- Moves everything to a result folder and zips it using the date as the name.
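The later steps in that list can be sketched roughly as follows. This is an illustration of the flow, not the actual script; the Nmap flags and parser invocation in the comments are assumptions:

```shell
#!/usr/bin/env bash
# Simplified sketch of the Exnoscan flow (illustrative only).
set -euo pipefail

D=$(date +%F)    # date used to name the result folder and zip

# Compile one scanning list from the three input files.
mkdir -p scan results
touch scan/domains.txt scan/iplist.txt scan/urls.txt
cat scan/domains.txt scan/iplist.txt scan/urls.txt | sort -u > scan/scanlist.txt

# Nmap scan of the compiled list (flags are an assumption):
#   nmap -sV -iL scan/scanlist.txt -oA "results/$D-nmap"
# Then parse the XML log into a CSV with the Nmap Parser linked above.

# Move everything into a dated result folder and zip it:
mkdir -p "results/$D"
#   zip -r "results/$D.zip" "results/$D"
echo "compiled $(wc -l < scan/scanlist.txt) targets into scan/scanlist.txt"
```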
Once finished, the zip contains the following files:
This folder contains the scan results, which will hopefully show you what is externally exposed. The purpose of the emailsfound file is to make you aware of the email addresses that are most likely to be heavily phished.
I like the results to be emailed to me, and because I run my machines within Azure, I use a combination of a storage account and a Logic App.
The storage account is mounted to the Kali box using BlobFuse. Once the script runs, the $D folder is moved to a container within my blob storage. The Logic App notices the write and emails the zip to me:
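The upload step looks roughly like this. The mount point is a placeholder for wherever BlobFuse is mounted on your box, and the mount check is my own sketch, not part of Exnoscan:

```shell
#!/usr/bin/env bash
# Sketch: move the dated result folder onto a BlobFuse mount.
# /mnt/blobfuse is a placeholder for the actual mount point.
D=$(date +%F)
MOUNT=/mnt/blobfuse

if grep -qs "$MOUNT" /proc/mounts; then
    mv "results/$D" "$MOUNT/"    # the Logic App fires on this write
else
    echo "blob storage not mounted at $MOUNT" >&2
fi
```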
Because of this, the script can be run on the box using cron, meaning the majority of the process is automated.
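For example, a crontab entry to kick it off nightly might look like this (the install path and log location are placeholders for wherever you keep the script):

```
# Hypothetical crontab entry: run Exnoscan nightly at 02:00.
# /opt/exnoscan and the log path are placeholders.
0 2 * * * cd /opt/exnoscan && bash exnoscan.sh >> /var/log/exnoscan.log 2>&1
```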