Dirhunt
Dirhunt is a web crawler optimized for searching and analyzing directories. This tool can find interesting things if the server has the "index of" mode enabled. Dirhunt is also useful when directory listing is not enabled: it detects directories with fake 404 errors, directories where an empty index file has been created to hide things, and much more.
$ dirhunt http://website.com/
Dirhunt does not use brute force, but it is not just a crawler either. It is faster than similar tools because it minimizes requests to the server. A scan generally takes between 5 and 30 seconds, depending on the website and the server.
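Several sites can also be scanned in a single run by listing them on the command line (the URLs below are placeholders):
$ dirhunt http://website1.com/ http://website2.com/blog/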
Read more about how to use Dirhunt in the documentation.
Features
- Process one or multiple sites at a time.
- Process 'Index Of' pages and report interesting files.
- Detect redirectors.
- Detect blank index files created in a directory to hide its contents.
- Process some HTML files in search of new directories.
- Process 404 error pages and detect fake 404 errors.
- Filter results by flags (see the example command after this list).
- Analyze results at the end, including the date and size reported on 'Index Of' pages.
- Get new directories using robots.txt, VirusTotal, Google, CommonCrawl (NEW!), SSL Certificate (NEW!), Crt.sh (NEW!) & Wayback (NEW!).
- Delay between requests.
- One or multiple proxies option. It can also search for free proxies.
- Save the results to a JSON file.
- Resume aborted scans.
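As a sketch of how some of these options combine, the command below excludes results flagged with 401 and 403 status codes, adds a one-second delay between requests, and saves the report to a JSON file. Option names reflect recent Dirhunt releases and may differ in yours; check dirhunt --help for the exact flags available.
$ dirhunt http://website.com/ --exclude-flags 401,403 --delay 1 --to-file report.json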
Install
If you have Pip installed on your system, you can use it to install the latest Dirhunt stable version:
$ sudo pip3 install dirhunt
Python 2.7 and 3.5-3.10 are supported, but Python 3.x is recommended. Use pip2 to install for Python 2.
There are other installation methods available.
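If you prefer not to install system-wide with sudo, a user-level install works as well; this is standard pip behavior rather than anything specific to Dirhunt:
$ pip3 install --user dirhunt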
Disclaimer
This software must not be used on third-party servers without permission. Dirhunt has been created to be used by audit teams with the consent of the owners of the website analyzed. The author is not responsible for the use of this tool outside the law.
This software is released under the MIT license. The author provides no warranty, but issues and pull requests are welcome.