many_requests

Dead easy interface for executing many HTTP requests asynchronously. It has been tested in the wild with over 10 million requests, and it automatically handles errors and retries failed requests.

Built on top of Trio and asks. Interface heavily inspired by Requests and joblib.

Also provides helper functions for executing embarrassingly parallel async coroutines.

To install:

pip install many-requests

Example Usage

Execute 10 GET requests for example.org:

from many_requests import ManyRequests

# 5 workers sharing 5 connections issue 10 identical GET requests
responses = ManyRequests(n_workers=5, n_connections=5)(
                method='GET',
                url=['https://example.org' for _ in range(10)])
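Each element of responses should be an asks Response object. Assuming asks mirrors the familiar Requests response API (attributes such as status_code), the results can be checked like this:

statuses = [r.status_code for r in responses]
print(statuses)  # e.g. [200, 200, ...] when all 10 requests succeed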

Query the Hacker News API for 10 items and parse the JSON output:

# json=True: parse each response body as JSON
responses = ManyRequests(n_workers=5, n_connections=5, json=True)(
                method='GET',
                url=[f'https://hacker-news.firebaseio.com/v0/item/{i}.json?print=pretty'
                     for i in range(1, 11)])  # Hacker News item ids start at 1
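Because json=True was passed, each element here should be the decoded JSON body (a dict for a Hacker News item) rather than a Response object, so fields can be read directly; the 'type' and 'by' fields below come from the Hacker News API:

item = responses[0]              # item 1, decoded from JSON into a dict
print(item['type'], item['by'])  # e.g. 'story' and the submitter's username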

To use basic authentication with all requests:

from asks import BasicAuth
username = 'user'
password = 'pw'
# asks' BasicAuth takes a (username, password) tuple
responses = ManyRequests(n_workers=5, n_connections=5)(
                method='GET',
                url=['https://example.org' for _ in range(10)],
                auth=BasicAuth((username, password)))
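As noted above, failed requests are retried automatically. A minimal sketch of tuning that behaviour; the retries and retry_sleep keyword names here are assumptions drawn from the project's documentation, so verify them against your installed version:

requester = ManyRequests(n_workers=5, n_connections=5,
                         retries=3,      # assumed keyword: attempts per failed request
                         retry_sleep=2)  # assumed keyword: seconds to wait between attempts
responses = requester(method='GET',
                      url=['https://example.org' for _ in range(10)])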

To execute embarrassingly parallel async coroutines, for example 10 trio.sleep calls:

from many_requests import EasyAsync, delayed
import trio

# delayed() captures the coroutine function and its arguments without calling
# them; EasyAsync runs the resulting coroutines across 4 workers
outputs = EasyAsync(n_workers=4)(delayed(trio.sleep)(i) for i in range(10))
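EasyAsync is not limited to library coroutines like trio.sleep; any Trio-compatible coroutine function works. A minimal sketch with a toy coroutine of our own, assuming (as with joblib) that outputs come back in input order:

import trio
from many_requests import EasyAsync, delayed

async def double_after_delay(x):
    # Toy coroutine: wait briefly, then return twice the input
    await trio.sleep(0.1)
    return 2 * x

outputs = EasyAsync(n_workers=4)(delayed(double_after_delay)(i) for i in range(10))
# Assuming input order is preserved: [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]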