This library is currently being beta-tested. See something that's broken? Did we get something wrong? Create an issue and let us know!
Databay is a Python interface for scheduled data transfer. It facilitates transfer of (any) data from A to B on a scheduled interval.
Installation
pip install databay
Documentation
See full Databay documentation.
Or more specifically:
- Overview - Learn what Databay is.
- Examples - See Databay in use.
- Extending Databay - Use Databay in your project.
- API Reference - Read the API documentation.
Features
- Simple, decoupled interface — Easily implement data production and consumption that fits your needs.
- Granular control over data transfer — Multiple ways of passing information between producers and consumers.
- Asyncio supported — You can produce or consume asynchronously.
- We'll handle the rest — scheduling, startup and shutdown, exception handling, logging.
- Support for custom scheduling — Use your own scheduling logic if you like.
Overview
In Databay, data transfer is expressed with three components:
- Inlets - for data production.
- Outlets - for data consumption.
- Links - for handling the data transit between inlets and outlets.
Scheduling is implemented using third-party libraries, exposed through the BasePlanner interface. Two BasePlanner implementations are currently available - one using Advanced Python Scheduler and one using Schedule.
import datetime

from databay import Link
from databay.inlets import HttpInlet
from databay.outlets import MongoOutlet
from databay.planners import ApsPlanner

# Data producer
inlet = HttpInlet('https://some.test.url.com/')
# Data consumer
outlet = MongoOutlet('databay', 'test_collection')
# Data transfer between the two
link = Link(inlet, outlet, datetime.timedelta(seconds=5))
# Start scheduling
planner = ApsPlanner(link)
planner.start()
Every 5 seconds this snippet will pull data from a test URL and write it to MongoDB.
While Databay comes with a handful of built-in inlets and outlets, its strength lies in its extensibility. To use Databay in your project, create concrete implementations of the Inlet and Outlet classes that handle the data production and consumption functionality you require. Databay will then make sure data can repeatedly flow between the inlets and outlets you create. Extending inlets and outlets is easy and offers a wide range of customization. Head over to the Extending Databay section for a detailed explanation, or to Examples for real use cases.
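To give a feel for the pattern, here is a minimal, self-contained sketch. The base classes below are stand-ins that mimic the shape of Databay's contract (an inlet produces records via a pull hook, an outlet consumes them via a push hook); the `CounterInlet` and `ListOutlet` names are hypothetical, and in real Databay you would subclass the classes from the `databay` package and let a Link and planner drive the transfers rather than calling the hooks yourself.

```python
# Stand-in base classes illustrating the inlet/outlet contract.
# These are NOT Databay's real classes - just the same shape.
class Inlet:
    def pull(self, update):
        """Produce and return a list of records."""
        raise NotImplementedError

class Outlet:
    def push(self, records, update):
        """Consume a list of records."""
        raise NotImplementedError

class CounterInlet(Inlet):
    """Hypothetical inlet: produces an incrementing integer per transfer."""
    def __init__(self):
        self.count = 0

    def pull(self, update):
        self.count += 1
        return [self.count]

class ListOutlet(Outlet):
    """Hypothetical outlet: collects every record it receives."""
    def __init__(self):
        self.received = []

    def push(self, records, update):
        self.received.extend(records)

# One manual transfer cycle per iteration - in Databay, a Link and a
# planner would run this repeatedly on the configured interval.
inlet, outlet = CounterInlet(), ListOutlet()
for _ in range(3):
    outlet.push(inlet.pull(update=None), update=None)

print(outlet.received)  # [1, 2, 3]
```

The point of the split is that `CounterInlet` knows nothing about where its data goes and `ListOutlet` knows nothing about where its data comes from; the transfer loop (a Link, in Databay) is the only piece that connects them.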
Supported Python Versions
| Python Version | <3.6 | 3.6 | 3.7 | 3.8 | 3.9 |
|---|---|---|---|---|---|
| Supported | ❌ | ✅ | ✅ | ✅ | ✅ |
Community Contributions
We aim to support the ecosystem of Databay users by collating and promoting inlets and outlets that implement popular functionalities. We encourage you to share the inlets and outlets you write with the community - start by reading the guidelines on contributing to the Databay community.
Did you write a cool inlet or outlet that you'd like to share with others? Put it on a public repo, send us an email and we'll list it here!
Inlets
- FileInlet - File input inlet (built-in).
- HttpInlet - Asynchronous HTTP request inlet using aiohttp (built-in).
Outlets
- FileOutlet - Generic file outlet (built-in).
- CsvOutlet - CSV file outlet (built-in).
- MongoOutlet - MongoDB outlet (built-in).
Requests
The following are inlets and outlets that others would like to see implemented. Feel free to build an item from this list and share your implementation! Let us know if you'd like to add an item to this list.
- PostgreSqlOutlet - PostgreSQL Outlet
Roadmap
See the full Databay Roadmap. Bear in mind this is a live document, shared to give you an idea of what to expect in future releases rather than a locked schedule. Priorities and order of implementation may change without warning.
v1.0
- Beta test the pre-release.
- Complete 100% test coverage.
- Add more advanced examples.
- Release v1.0.
- Buy a carrot cake and celebrate.
v1.1
- Filters and translators - callbacks for processing data between inlets and outlets.
- Advanced scheduling - conditional, non-uniform intervals.
Licence
See LICENSE