• Stars: 259
  • Rank: 157,669 (Top 4%)
  • Language: HTML
  • License: MIT License
  • Created: almost 4 years ago
  • Updated: almost 3 years ago


Repository Details

A command-line WEB scanning tool implemented in Python 3 that integrates multiple scanning tools from GitHub.

HXnineTails 花溪九尾

English | 简体中文

Simple, brute-force, powerful, self-extending: a stitched-together monster.

+-+-+-+-+-+-+-+-+-+-+-+-+
|H|X|n|i|n|e|T|a|i|l|s|
+-+-+-+-+-+-+-+-+-+-+-+-+

Content List🚀

Introduction

🐾 A command-line WEB scanning tool implemented in Python 3 that integrates several scanning tools from GitHub.

🔱 The goal is to dig for vulnerabilities while lying down, i.e. with as little manual effort as possible.

The project code has been tested against the latest community edition of Xray 1.7 without errors.

Currently integrated in this project: crawlergo, OneForAll, subDomainsBrute, Subfinder, Sublist3r, Xray, JSfinder, pppXray, and Server酱 (ServerChan).

The next project I plan to integrate is ARL (Asset Lighthouse System).

The project fuses these individually powerful components into a single application, suitable for batch scanning of SRC assets, climbing the CNVD rankings, and similar tasks.

Project structure:

[Project structure diagram]

Installation

Install Python 3 (Python 2 is not supported).

Download the code for this project: git clone https://github.com/Cl0udG0d/HXnineTails

Install the required libraries: pip3 install -r requirements.txt

For users in mainland China, the first line of requirements.txt uses the Aliyun PyPI mirror.

If you are installing the Python libraries on a server outside China, delete the first line of requirements.txt to speed things up.
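
For reference, the mirror directive referred to above typically looks like the following requirements.txt line; the exact URL in this project's file may differ:

-i https://mirrors.aliyun.com/pypi/simple/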

The following projects need to be installed, with their paths configured in config.py:

Google Chrome

Xray (better with the premium version)

crawlergo

OneForAll

subDomainsBrute

subfinder

For example, on my personal laptop, the path information in config.py is

'''
Paths where each project is located.
'''
Chrome_Path='C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe'
Xray_Path='D:\\Xray\\xray.exe'
crawlergo_Path='C:\\Users\\Administrator\\Desktop\\test_tools\\crawlergo.exe'
OneForAll_Path='C:\\Users\\Administrator\\Desktop\\test_tools\\OneForAll-master\\'
subDomainsBrute_Path='C:\\Users\\Administrator\\Desktop\\test_tools\\subDomainsBrute-master\\'
subfinder_Path='C:\\Users\\Administrator\\Desktop\\test_tools\\subfinder\\'
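
Since every external tool is launched through these config.py paths, it can help to verify them before a scan. The following is a minimal sketch that only assumes the variable names shown above; it is not part of the project itself:

import os
import config

# Each (name, path) pair mirrors a *_Path variable from config.py above.
TOOL_PATHS = {
    "Chrome": config.Chrome_Path,
    "Xray": config.Xray_Path,
    "crawlergo": config.crawlergo_Path,
    "OneForAll": config.OneForAll_Path,
    "subDomainsBrute": config.subDomainsBrute_Path,
    "subfinder": config.subfinder_Path,
}

for name, path in TOOL_PATHS.items():
    # Flag anything that is missing; missing tools are simply skipped at runtime.
    status = "ok" if os.path.exists(path) else "missing"
    print(f"[{status}] {name}: {path}")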

Open the command line in the HXnineTails folder and enter the scan parameters

Instructions

The tool is used from the command line, with the following parameters:

-h --help      Output help information, e.g. python3 scan.py --help
-a --attone    Scan a single URL with crawlergo dynamic crawling + Xray scanning only, e.g. for the Baidu homepage: python3 scan.py -a https://www.baidu.com
-s --attsrc    Scan SRC assets: information gathering + crawlergo + Xray, e.g. for Baidu SRC: python3 scan.py -s baidu.com
-d --attdetail Scan SRC assets in depth: information gathering + crawlergo + Xray + C-segment information gathering + JS sensitive-information gathering, e.g. for Baidu SRC: python3 scan.py -d baidu.com
-t --thread    Number of threads, default 5, e.g. python3 scan.py -t 10 -a http://testphp.vulnweb.com/
-r             Read a txt file of targets, one URL per line, and run an -a scan on each URL, e.g. python3 scan.py -t 10 -r target.txt (see the batch-scan sketch below)
-c             Clean up the saved vulnerability reports, i.e. delete the files in the save folder
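
The -r mode is effectively an -a scan repeated for every line of the target file. A rough illustration of that behaviour, assuming scan.py sits in the current directory (this is a sketch, not the project's internal implementation):

import subprocess
from concurrent.futures import ThreadPoolExecutor

def scan_one(url):
    # Equivalent to running: python3 scan.py -a <url>
    subprocess.run(["python3", "scan.py", "-a", url], check=False)

with open("target.txt", encoding="utf-8") as f:
    targets = [line.strip() for line in f if line.strip()]

# Mirror the -t option with a small thread pool (5 workers by default).
with ThreadPoolExecutor(max_workers=5) as pool:
    pool.map(scan_one, targets)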

It is recommended to use the -a or -s parameter for scanning
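
For the single-URL (-a) mode, the underlying pattern is the usual crawlergo-to-Xray chain: Xray listens as a passive proxy and crawlergo pushes every crawled request into it. A minimal sketch of that pattern using the upstream tools' documented flags; the paths and listen port are placeholders, and this is an illustration rather than the project's exact code:

import subprocess

XRAY = "/path/to/xray"            # placeholder, see Xray_Path in config.py
CRAWLERGO = "/path/to/crawlergo"  # placeholder, see crawlergo_Path in config.py
CHROME = "/path/to/chrome"        # placeholder, see Chrome_Path in config.py
TARGET = "http://testphp.vulnweb.com/"

# 1. Start Xray as a passive listener that writes an HTML report.
xray_proc = subprocess.Popen(
    [XRAY, "webscan", "--listen", "127.0.0.1:7777", "--html-output", "report.html"]
)

# 2. Run crawlergo and push the crawled requests into the Xray listener.
subprocess.run(
    [CRAWLERGO, "-c", CHROME, "--push-to-proxy", "http://127.0.0.1:7777/", TARGET],
    check=False,
)

# 3. Stop Xray once crawling has finished (in practice, wait for in-flight scans first).
xray_proc.terminate()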

There are also some global settings in config.py that you can modify yourself, for example:

SERVERKEY=''

portlist=['80','8080','8000','8081','8001']
blacklist=["spider", "org"]

ThreadNum=5
PYTHON="python3"

SERVERKEY is the key issued to you when you register with Server酱 (ServerChan).

portlist is the default list of ports scanned during C-segment scanning

A URL that contains any string from blacklist will be skipped and not scanned.

ThreadNum is the default number of threads.

PYTHON is the name of the host Python interpreter; the default is python3.
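
To make the blacklist and portlist settings concrete, the sketch below shows how they are typically applied: blacklisted URLs are skipped, and the port list is combined with a C segment to produce scan targets. The helper names are illustrative, not the project's own functions:

import config

def is_blacklisted(url):
    # Skip a URL if it contains any blacklist substring (e.g. "spider", "org").
    return any(word in url for word in config.blacklist)

def c_segment_targets(base_ip):
    # Expand the C segment of base_ip across every configured port,
    # e.g. "192.168.1.10" -> "http://192.168.1.1:80", ..., "http://192.168.1.254:8001".
    prefix = ".".join(base_ip.split(".")[:3])
    return [
        f"http://{prefix}.{host}:{port}"
        for host in range(1, 255)
        for port in config.portlist
    ]

print(is_blacklisted("https://www.example.org/"))   # True: contains "org"
print(len(c_segment_targets("192.168.1.10")))       # 254 hosts * 5 ports = 1270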

Some of the external programs and settings mentioned above are optional: if you do not need them for your scan, you can leave them uninstalled and they will simply be skipped while the program runs.

Screenshots

python3 scan.py --help

Screenshot 1

python3 scan.py -t 3 -a http://testphp.vulnweb.com/

Screenshot 2

View saved reports

Screenshot 3

TODO

  • Write an English README (thanks to brother wenyurush)
  • Streamline and add modules
  • Add ARL module
  • ...

Maintainer

@春告鳥 @Throokie

Contribute

🍺 You're very welcome to join us! Raise an Issue or submit a Pull Request.

🍻 And of course, feel free to send me an email at [email protected] and join us!

🍻 You are also welcome to contact Throokie by email at [email protected].

Reward

  • 背人语
  • 掌控安全-hab

License

MIT © Spring Teller

Appreciation Code

If this project helps you, how about buying the author a cup of milk tea? (hehehe) 👍 (Please leave a message with your ID when you tip.)

[Tip QR code]

More Repositories

1. SZhe_Scan — SZhe_Scan web vulnerability scanner based on the Python Flask framework; performs comprehensive information gathering and vulnerability scanning on an input domain/IP, with user-addable POCs (HTML, 847 stars)
2. Fofa-hack — FOFA data collection tool for non-paying members (Python, 462 stars)
3. pppXray — Batch automated scanning with Xray (HTML, 117 stars)
4. QQFishing — QQ phishing and social engineering (HTML, 104 stars)
5. Fofa-script — FOFA crawler (Python, 80 stars)
6. edusrc_POC — Simple Python POC scripts written for checking some common vulnerabilities on the edusrc platform (Python, 53 stars)
7. pochub — Vulnerability POC repository (HTML, 30 stars)
8. AWDDocker — Standardized AWD range in Docker (Dockerfile, 29 stars)
9. testAWD — AWD platform (CSS, 23 stars)
10. GPTHack — A GPTHack better suited to Chinese users; use GPT without an API key, currently supports 3.5 (Python, 12 stars)
11. pythonStunt — Python 3 code for "Python绝技" (Violent Python: using Python to become a top hacker) (Python, 7 stars)
12. WebShow — A collection of ready-to-use web front-end frameworks (HTML, 6 stars)
13. FanDuDocument — Documentation for the FanDu (凡渡) AWD platform (HTML, 4 stars)
14. JavaWebStudy — Java Web programming study (Java, 4 stars)
15. szhe-docs — Documentation for the SZhe (碎遮-风起) web vulnerability scanner; project: https://github.com/Cl0udG0d/SZhe_Scan (HTML, 3 stars)
16. Cl0udG0d — (2 stars)
17. car_bus_security_simulator — Automotive bus security simulator (Python, 2 stars)
18. Jing_Scan — Jing Lake A/B/C-segment scanner: scans network segments and their common ports, collecting title information and geolocation (Python, 1 star)
19. scrapy_demo — Python programs written while practicing the Scrapy framework (Python, 1 star)
20. Shell-script — Linux shell programming study scripts (Shell, 1 star)
21. HTML_CSS_JS_Study — Notes from learning front-end development (HTML, 1 star)
22. blog-comment — Blog comment storage (1 star)
23. httprobe — Subdomain liveness detection, partially modified from the original (Go, 1 star)