English | 简体中文
ScrapydWeb: Web app for Scrapyd cluster management, with support for Scrapy log analysis & visualization.
Scrapyd · ScrapydWeb · LogParser
Recommended Reading
- How to efficiently manage your distributed web scraping projects
- How to set up Scrapyd cluster on Heroku
Demo
Features
- Scrapyd Cluster Management
  - All Scrapyd JSON API endpoints supported (see the API sketch after this list)
  - Group, filter and select any number of nodes
  - Execute commands on multiple nodes with just a few clicks
- Scrapy Log Analysis
  - Stats collection
  - Progress visualization
  - Logs categorization
- Enhancements
  - Auto packaging
  - Integrated with LogParser
  - Timer tasks
  - Monitor & Alert
  - Mobile UI
  - Basic auth for web UI
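Under the hood, every node is driven through Scrapyd's own JSON API. The sketch below illustrates those endpoints with the requests library, Scrapyd's default address 127.0.0.1:6800, and placeholder project/spider names; it shows Scrapyd's API, not ScrapydWeb source code.

# Illustration of the Scrapyd JSON API that ScrapydWeb calls on your behalf.
# 'myproject' and 'myspider' are placeholders for your own project and spider.
import requests

scrapyd = 'http://127.0.0.1:6800'

# Check that the Scrapyd daemon is up
print(requests.get(scrapyd + '/daemonstatus.json').json())

# Schedule a spider run
job = requests.post(scrapyd + '/schedule.json',
                    data={'project': 'myproject', 'spider': 'myspider'}).json()
print(job)  # e.g. {'status': 'ok', 'jobid': '...'}

# List pending, running and finished jobs of the project
print(requests.get(scrapyd + '/listjobs.json',
                   params={'project': 'myproject'}).json())

ScrapydWeb adds the cluster layer on top: the same action can be fanned out to any selection of nodes from the web UI.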
Getting Started
Prerequisites
Make sure that Scrapyd has been installed and started on all of your hosts.
Install
- Use pip:
pip install scrapydweb
Note that you may need to execute python -m pip install --upgrade pip first in order to get the latest version of scrapydweb, or download the tar.gz file from https://pypi.org/project/scrapydweb/#files and install it via pip install scrapydweb-x.x.x.tar.gz.
- Use git:
pip install --upgrade git+https://github.com/my8100/scrapydweb.git
Or:
git clone https://github.com/my8100/scrapydweb.git
cd scrapydweb
python setup.py install
Start
- Start ScrapydWeb via command:
scrapydweb
  (A config file for customizing settings is generated on first startup; see the sketch of its key entries below.)
- Visit http://127.0.0.1:5000 (It's recommended to use Google Chrome for a better experience.)
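The generated config file is where you tell ScrapydWeb about your Scrapyd nodes and enable the optional basic auth for the web UI. The sketch below shows the kind of entries involved; the file name is version-suffixed (e.g. scrapydweb_settings_vN.py) and option names may differ between releases, so the comments in your own generated file are authoritative.

# Sketch of typical entries in the generated settings file (assumed names from
# a recent release -- verify against the comments in your generated file).

# Scrapyd servers to manage: plain 'ip:port' strings, or
# ('username', 'password', 'ip', 'port', 'group') tuples for nodes behind auth.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',
    ('user', 'pass', 'scrapyd-host-2', '6800', 'group-a'),
]

# Basic auth for the ScrapydWeb UI itself
ENABLE_AUTH = True
USERNAME = 'admin'
PASSWORD = 'change-me'

ScrapydWeb reads the settings at startup, so restart it after editing the file.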
Browser Support
The latest version of Google Chrome, Firefox, and Safari.
Running the tests
$ git clone https://github.com/my8100/scrapydweb.git
$ cd scrapydweb
# To create isolated Python environments
$ pip install virtualenv
$ virtualenv venv/scrapydweb
# Or specify your Python interpreter: $ virtualenv -p /usr/local/bin/python3.7 venv/scrapydweb
$ source venv/scrapydweb/bin/activate
# Install dependent libraries
(scrapydweb) $ python setup.py install
(scrapydweb) $ pip install pytest
(scrapydweb) $ pip install coverage
# Make sure Scrapyd has been installed and started, then update the custom_settings item in tests/conftest.py
(scrapydweb) $ vi tests/conftest.py
(scrapydweb) $ curl http://127.0.0.1:6800
# '-x': stop on first failure
(scrapydweb) $ coverage run --source=scrapydweb -m pytest tests/test_a_factory.py -s -vv -x
(scrapydweb) $ coverage run --source=scrapydweb -m pytest tests -s -vv --disable-warnings
(scrapydweb) $ coverage report
# To create an HTML report, check out htmlcov/index.html
(scrapydweb) $ coverage html
Built With
Changelog
Detailed changes for each release are documented in HISTORY.md.
Author
my8100
Contributors
Kaisla
License
This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details.