The API Scraper is a Python 3.x tool designed to find "hidden" API calls powering a website.
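At its core, the tool works from HAR captures of a page's network traffic. As a rough sketch of that idea (not the tool's actual implementation), HAR files are JSON documents whose captured requests live under `log.entries[].request.url`, so "hidden" API calls can be surfaced by filtering those URLs:

```python
import json

def find_api_calls(har_path, search_term=None):
    """Scan a HAR capture for request URLs, optionally filtered by a search term.

    Illustrative sketch only: the real scraper does more (crawling, parameter
    pruning), but every HAR file has this log.entries[].request.url structure.
    """
    with open(har_path) as f:
        har = json.load(f)
    urls = [entry["request"]["url"] for entry in har["log"]["entries"]]
    if search_term:
        urls = [u for u in urls if search_term in u]
    return urls
```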
## Installation

The following Python libraries should be installed (with pip, or the package manager of your choice):
- Selenium
- Requests
You will also need the chromedriver and BrowserMob Proxy binaries; their paths, along with the HAR output directory, are set where the Browser is created (adjust them to match your installation):
self.browser = Browser("chromedriver/chromedriver", "browsermob-proxy-2.1.4/bin/browsermob-proxy", self.harDirectory)
## Usage

The script can be run from the command line using:

$ python3 consoleservice.py [commands]
If you're unsure which commands to use, pass the -h flag:
$ python3 consoleservice.py -h
usage: consoleService.py [-h] [-u [U]] [-d [D]] [-s [S]] [-c [C]] [--p]

optional arguments:
  -h, --help  show this help message and exit
  -u [U]      Target URL. If not provided, target directory will be scanned
              for har files.
  -d [D]      Target directory (default is "hars"). If URL is provided,
              directory will store har files. If URL is not provided,
              directory will be scanned.
  -s [S]      Search term
  -c [C]      Count of pages to crawl (with target URL only)
  --p         Flag, remove unnecessary parameters (may dramatically increase
              run time)
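For reference, the documented flags map naturally onto an argparse parser. The snippet below is a sketch of how such a parser could look; the real consoleservice.py may differ in its details:

```python
import argparse

def build_parser():
    # Mirrors the documented options; defaults and help text taken from the
    # -h output above. This is an illustration, not the tool's actual code.
    parser = argparse.ArgumentParser(prog="consoleService.py")
    parser.add_argument("-u", nargs="?", help="Target URL; if omitted, the target directory is scanned for har files")
    parser.add_argument("-d", nargs="?", default="hars", help="Target directory (default: hars)")
    parser.add_argument("-s", nargs="?", help="Search term")
    parser.add_argument("-c", nargs="?", type=int, help="Count of pages to crawl (with target URL only)")
    parser.add_argument("--p", action="store_true", help="Remove unnecessary parameters (may dramatically increase run time)")
    return parser
```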
## Running the API

Hey, I heard you like APIs, so I'm writing an API to find you APIs so you can API while you API.
Kicking this off over HTTP is by no means necessary, but since a Flask wrapper around the API Finder is included, it might as well be documented!
Install Flask, then:

$ export FLASK_APP=webservice.py
$ flask run
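As a rough idea of what a Flask wrapper like webservice.py could look like, here is a minimal sketch. The `/search` route and its `url` query parameter are illustrative assumptions, not the repo's actual endpoints:

```python
# Hypothetical minimal shape of a Flask wrapper around the API Finder.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/search")
def search():
    # 'url' query parameter is an assumption for illustration
    target = request.args.get("url")
    if not target:
        return jsonify(error="missing 'url' parameter"), 400
    # The API Finder's results for the target would be returned here.
    return jsonify(target=target, apis=[])
```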