A set of tools to keep your pinned Python dependencies fresh.

pip-tools = pip-compile + pip-sync

A set of command line tools to help you keep your pip-based packages fresh, even when you've pinned them. You do pin them, right? (In building your Python application and its dependencies for production, you want to make sure that your builds are predictable and deterministic.)

Installation

Similar to pip, pip-tools must be installed in each of your project's virtual environments:

$ source /path/to/venv/bin/activate
(venv) $ python -m pip install pip-tools

Note: all of the remaining example commands assume you've activated your project's virtual environment.

Example usage for pip-compile

The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either pyproject.toml, setup.cfg, setup.py, or requirements.in.

Run it with pip-compile or python -m piptools compile (or pipx run --spec pip-tools pip-compile if pipx was installed with the appropriate Python version). If you use multiple Python versions, you can also run py -X.Y -m piptools compile on Windows and pythonX.Y -m piptools compile on other systems.

pip-compile should be run from the same virtual environment as your project so conditional dependencies that require a specific Python version, or other environment markers, resolve relative to your project's environment.

Note: If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available. To compile from scratch, first delete the existing requirements.txt file, or see Updating requirements for alternative approaches.
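
For example, a minimal way to force a fresh compile (assuming your input file is requirements.in):

$ rm requirements.txt
$ pip-compile requirements.in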

Requirements from pyproject.toml

The pyproject.toml file is the latest standard for configuring packages and applications, and is recommended for new projects. pip-compile supports installing both your project.dependencies and your project.optional-dependencies. Because this is an official standard, you can use pip-compile to pin the dependencies of projects that use modern, standards-adhering packaging tools such as Setuptools, Hatch or flit.

Suppose you have a 'foobar' Python application that is packaged using Setuptools, and you want to pin it for production. You can declare the project metadata as:

[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
requires-python = ">=3.9"
name = "foobar"
dynamic = ["dependencies", "optional-dependencies"]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements.in"] }
optional-dependencies.test = { file = ["requirements-test.txt"] }

Now suppose you have a Django application that is packaged using Hatch and that you want to pin for production, while also pinning your development tools in a separate file. You declare django as a dependency and create an optional dependency dev that includes pytest:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "my-cool-django-app"
version = "42"
dependencies = ["django"]

[project.optional-dependencies]
dev = ["pytest"]

You can produce your pin files as easily as:

$ pip-compile -o requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --output-file=requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django

$ pip-compile --extra dev -o dev-requirements.txt pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --extra=dev --output-file=dev-requirements.txt pyproject.toml
#
asgiref==3.6.0
    # via django
attrs==22.2.0
    # via pytest
django==4.1.7
    # via my-cool-django-app (pyproject.toml)
exceptiongroup==1.1.1
    # via pytest
iniconfig==2.0.0
    # via pytest
packaging==23.0
    # via pytest
pluggy==1.0.0
    # via pytest
pytest==7.2.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.3
    # via django
tomli==2.0.1
    # via pytest

This is great not only for pinning your applications, but also for keeping the CI of your open-source Python package stable.

Requirements from setup.py and setup.cfg

pip-compile also has full support for setup.py- and setup.cfg-based projects that use setuptools.

Just define your dependencies and extras as usual and run pip-compile as above.
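
As a rough sketch, a setup.cfg for the foobar application from earlier might declare its dependencies and a dev extra like this (the version number and extra contents are illustrative):

[metadata]
name = foobar
version = 0.1.0

[options]
install_requires =
    django

[options.extras_require]
dev =
    pytest

Compiling then works the same way as with pyproject.toml, e.g. pip-compile setup.cfg or pip-compile --extra dev --output-file=dev-requirements.txt setup.cfg.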

Requirements from requirements.in

You can also use plain text files for your requirements (e.g. if you don't want your application to be a package). To use a requirements.in file to declare the Django dependency:

# requirements.in
django

Now, run pip-compile requirements.in:

$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile requirements.in
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

And it will produce your requirements.txt, with all the Django dependencies (and all underlying dependencies) pinned.

Updating requirements

pip-compile generates a requirements.txt file using the latest versions that fulfil the dependencies you specify in the supported files.

If pip-compile finds an existing requirements.txt file that fulfils the dependencies then no changes will be made, even if updates are available.

To force pip-compile to update all packages in an existing requirements.txt, run pip-compile --upgrade.

To update a specific package to the latest or a specific version use the --upgrade-package or -P flag:

# only update the django package
$ pip-compile --upgrade-package django

# update both the django and requests packages
$ pip-compile --upgrade-package django --upgrade-package requests

# update the django package to the latest, and requests to v2.0.0
$ pip-compile --upgrade-package django --upgrade-package requests==2.0.0

You can combine --upgrade and --upgrade-package in one command to provide constraints on the allowed upgrades. For example, to upgrade all packages whilst constraining requests to the latest version less than 3.0:

$ pip-compile --upgrade --upgrade-package 'requests<3.0'

Using hashes

If you would like to use the Hash-Checking Mode available in pip since version 8.0, pip-compile offers the --generate-hashes flag:

$ pip-compile --generate-hashes requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile --generate-hashes requirements.in
#
asgiref==3.6.0 \
    --hash=sha256:71e68008da809b957b7ee4b43dbccff33d1b23519fb8344e33f049897077afac \
    --hash=sha256:9567dfe7bd8d3c8c892227827c41cce860b368104c3431da67a0c5a65a949506
    # via django
django==4.1.7 \
    --hash=sha256:44f714b81c5f190d9d2ddad01a532fe502fa01c4cb8faf1d081f4264ed15dcd8 \
    --hash=sha256:f2f431e75adc40039ace496ad3b9f17227022e8b11566f4b363da44c7e44761e
    # via -r requirements.in
sqlparse==0.4.3 \
    --hash=sha256:0323c0ec29cd52bceabc1b4d9d579e311f3e4961b98d174201d5622a23b85e34 \
    --hash=sha256:69ca804846bb114d2ec380e4360a8a340db83f0ccf3afceeb1404df028f57268
    # via django

Output File

To output the pinned requirements in a filename other than requirements.txt, use --output-file. This might be useful for compiling multiple files, for example with different constraints on django to test a library with both versions using tox:

$ pip-compile --upgrade-package 'django<1.0' --output-file requirements-django0x.txt
$ pip-compile --upgrade-package 'django<2.0' --output-file requirements-django1x.txt

Or to output to standard output, use --output-file=-:

$ pip-compile --output-file=- > requirements.txt
$ pip-compile - --output-file=- < requirements.in > requirements.txt

Forwarding options to pip

Any valid pip flags or arguments may be passed on with pip-compile's --pip-args option, e.g.

$ pip-compile requirements.in --pip-args "--retries 10 --timeout 30"

Configuration

You can define project-level defaults for pip-compile and pip-sync by writing them to a configuration file in the same directory as your requirements input files (or the current working directory if piping input from stdin). By default, both pip-compile and pip-sync will look first for a .pip-tools.toml file and then in your pyproject.toml. You can also specify an alternate TOML configuration file with the --config option.
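
For example, to point pip-compile at an alternate configuration file (the path here is illustrative):

$ pip-compile --config ci/pip-tools.toml requirements.in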

Configuration values can be specified both globally and per command. For example, to generate pip hashes in the resulting requirements file output by default, you can specify in a configuration file:

[tool.pip-tools]
generate-hashes = true

Options to pip-compile and pip-sync that may be used more than once must be defined as lists in a configuration file, even if they only have one value.
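
For example, --extra can be passed multiple times on the command line, so in a configuration file it would be written as a list (assuming the key name mirrors the long option; the extra names are illustrative):

[tool.pip-tools]
extra = ["dev", "test"]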

pip-tools supports default values for all valid command-line flags of its subcommands. Configuration keys may contain underscores instead of dashes, so the above could also be specified in this format:

[tool.pip-tools]
generate_hashes = true

Configuration defaults specific to pip-compile and pip-sync can be put beneath separate sections. For example, to perform a dry run with pip-compile by default:

[tool.pip-tools.compile] # "sync" for pip-sync
dry-run = true

This does not affect the pip-sync command, which also has a --dry-run option. Note that command-specific settings take precedence over global settings of the same name whenever both are declared; thus the following would still make pip-compile generate hashes, but discard the global dry-run setting:

[tool.pip-tools]
generate-hashes = true
dry-run = true

[tool.pip-tools.compile]
dry-run = false

You might be wrapping the pip-compile command in another script. To avoid confusing consumers of your custom script you can override the update command generated at the top of requirements files by setting the CUSTOM_COMPILE_COMMAND environment variable.

$ CUSTOM_COMPILE_COMMAND="./pipcompilewrapper" pip-compile requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    ./pipcompilewrapper
#
asgiref==3.6.0
    # via django
django==4.1.7
    # via -r requirements.in
sqlparse==0.4.3
    # via django

Workflow for layered requirements

If you have different environments for which you need to install different but compatible sets of packages, you can create layered requirements files and use one layer to constrain the other.

For example, if you have a Django project where you want the newest 2.1 release in production and when developing you want to use the Django debug toolbar, then you can create two *.in files, one for each layer:

# requirements.in
django<2.2

At the top of the development requirements dev-requirements.in you use -c requirements.txt to constrain the dev requirements to packages already selected for production in requirements.txt.

# dev-requirements.in
-c requirements.txt
django-debug-toolbar<2.2

First, compile requirements.txt as usual:

$ pip-compile
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile
#
django==2.1.15
    # via -r requirements.in
pytz==2023.3
    # via django

Now compile the dev requirements and the requirements.txt file is used as a constraint:

$ pip-compile dev-requirements.in
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
#    pip-compile dev-requirements.in
#
django==2.1.15
    # via
    #   -c requirements.txt
    #   django-debug-toolbar
django-debug-toolbar==2.1
    # via -r dev-requirements.in
pytz==2023.3
    # via
    #   -c requirements.txt
    #   django
sqlparse==0.4.3
    # via django-debug-toolbar

As you can see above, even though a 2.2 release of Django is available, the dev requirements only include a 2.1 version of Django because they were constrained. Now both compiled requirements files can be installed safely in the dev environment.

To install requirements in the production environment, use:

$ pip-sync

To install requirements in the development environment, use:

$ pip-sync requirements.txt dev-requirements.txt

Version control integration

You can use pip-compile as a pre-commit hook. See the pre-commit docs for instructions. Sample .pre-commit-config.yaml:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile

You might want to customize pip-compile args by configuring args and/or files, for example:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        files: ^requirements/production\.(in|txt)$
        args: [--index-url=https://example.com, requirements/production.in]

If you have multiple requirement files, make sure you create a hook for each file:

repos:
  - repo: https://github.com/jazzband/pip-tools
    rev: 7.4.1
    hooks:
      - id: pip-compile
        name: pip-compile setup.py
        files: ^(setup\.py|requirements\.txt)$
      - id: pip-compile
        name: pip-compile requirements-dev.in
        args: [requirements-dev.in]
        files: ^requirements-dev\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements-lint.in
        args: [requirements-lint.in]
        files: ^requirements-lint\.(in|txt)$
      - id: pip-compile
        name: pip-compile requirements.in
        args: [requirements.in]
        files: ^requirements\.(in|txt)$

Example usage for pip-sync

Now that you have a requirements.txt, you can use pip-sync to update your virtual environment to reflect exactly what's in there. This will install/upgrade/uninstall everything necessary to match the requirements.txt contents.

Run it with pip-sync or python -m piptools sync. If you use multiple Python versions, you can also run py -X.Y -m piptools sync on Windows and pythonX.Y -m piptools sync on other systems.

pip-sync must be installed into and run from the same virtual environment as your project to identify which packages to install or upgrade.

Be careful: pip-sync is meant to be used only with a requirements.txt generated by pip-compile.

$ pip-sync
Uninstalling flake8-2.4.1:
    Successfully uninstalled flake8-2.4.1
Collecting click==4.1
    Downloading click-4.1-py2.py3-none-any.whl (62kB)
    100% |................................| 65kB 1.8MB/s
    Found existing installation: click 4.0
    Uninstalling click-4.0:
        Successfully uninstalled click-4.0
Successfully installed click-4.1

To sync multiple *.txt dependency lists, just pass them in via command line arguments, e.g.

$ pip-sync dev-requirements.txt requirements.txt

If no files are passed, pip-sync defaults to requirements.txt.

Any valid pip install flags or arguments may be passed with pip-sync's --pip-args option, e.g.

$ pip-sync requirements.txt --pip-args "--no-cache-dir --no-deps"

Note: pip-sync will not upgrade or uninstall packaging tools like setuptools, pip, or pip-tools itself. Use python -m pip install --upgrade to upgrade those packages.
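
For example, to upgrade pip and pip-tools themselves in the active virtual environment:

(venv) $ python -m pip install --upgrade pip pip-tools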

Should I commit requirements.in and requirements.txt to source control?

Generally, yes. If you want a reproducible environment installation to be available from your source control, you should commit both requirements.in and requirements.txt.

Note that if you are deploying on multiple Python environments (read the section below), then you must commit a separate output file for each Python environment. We suggest using the {env}-requirements.txt format (e.g. win32-py3.7-requirements.txt, macos-py3.10-requirements.txt, etc.).

Cross-environment usage of requirements.in/requirements.txt and pip-compile

The dependencies of a package can change depending on the Python environment in which it is installed. Here, we define a Python environment as the combination of Operating System, Python version (3.7, 3.8, etc.), and Python implementation (CPython, PyPy, etc.). For an exact definition, refer to the possible combinations of PEP 508 environment markers.

As the resulting requirements.txt can differ for each environment, users must run pip-compile on each Python environment separately to generate a requirements.txt valid for that environment. The same requirements.in can be used as the source file for all environments, using PEP 508 environment markers as needed, the same way it would be done for regular pip cross-environment usage.

If the generated requirements.txt remains exactly the same for all Python environments, then it can be used across them safely. But users should be careful, as any package update can introduce environment-dependent dependencies, making any newly generated requirements.txt environment-dependent too. As a general rule, users should still run pip-compile on each targeted Python environment to avoid issues.
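
One possible workflow, following the {env}-requirements.txt naming suggested above, is to run pip-compile under each targeted interpreter and write a per-environment output file (a sketch; the file names and Python versions are illustrative):

$ python3.9 -m piptools compile --output-file=py3.9-requirements.txt requirements.in
$ python3.12 -m piptools compile --output-file=py3.12-requirements.txt requirements.in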

Maximizing reproducibility

pip-tools is a great tool to improve the reproducibility of builds. But there are a few things to keep in mind.

  • pip-compile will produce different results in different environments as described in the previous section.
  • pip must be used with the PIP_CONSTRAINT environment variable to lock dependencies in build environments as documented in #8439.
  • Dependencies come from many sources: your project's dependencies, its optional dependencies (extras), and its build dependencies.

Continuing the pyproject.toml example from earlier, a single lock file could be created as follows:

$ pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
# This file is autogenerated by pip-compile with Python 3.9
# by the following command:
#
#    pip-compile --all-build-deps --all-extras --output-file=constraints.txt --strip-extras pyproject.toml
#
asgiref==3.5.2
    # via django
attrs==22.1.0
    # via pytest
backports-zoneinfo==0.2.1
    # via django
django==4.1
    # via my-cool-django-app (pyproject.toml)
editables==0.3
    # via hatchling
hatchling==1.11.1
    # via my-cool-django-app (pyproject.toml::build-system.requires)
iniconfig==1.1.1
    # via pytest
packaging==21.3
    # via
    #   hatchling
    #   pytest
pathspec==0.10.2
    # via hatchling
pluggy==1.0.0
    # via
    #   hatchling
    #   pytest
py==1.11.0
    # via pytest
pyparsing==3.0.9
    # via packaging
pytest==7.1.2
    # via my-cool-django-app (pyproject.toml)
sqlparse==0.4.2
    # via django
tomli==2.0.1
    # via
    #   hatchling
    #   pytest
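
One way to consume this constraints file during installation, including in build environments (see the PIP_CONSTRAINT note above), is a sketch like the following; the dev extra name comes from the earlier Hatch example:

$ PIP_CONSTRAINT=constraints.txt python -m pip install --editable '.[dev]'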

Some build backends may also request build dependencies dynamically using the get_requires_for_build_ hooks described in PEP 517 and PEP 660. This will be indicated in the output with one of the following suffixes:

  • (pyproject.toml::build-system.backend::editable)
  • (pyproject.toml::build-system.backend::sdist)
  • (pyproject.toml::build-system.backend::wheel)

Deprecations

This section lists pip-tools features that are currently deprecated.

  • In the next major release, the --allow-unsafe behavior will be enabled by default (#989). Use --no-allow-unsafe to keep the old behavior. It is recommended to pass --allow-unsafe now to adapt to the upcoming change.
  • The legacy resolver is deprecated and will be removed in future versions. The new default is --resolver=backtracking.
  • In the next major release, the --strip-extras behavior will be enabled by default (#1613). Use --no-strip-extras to keep the old behavior.
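
For example, to opt in now to both of the upcoming defaults mentioned above:

$ pip-compile --allow-unsafe --strip-extras requirements.in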

A Note on Resolvers

You can choose either the default backtracking resolver or the deprecated legacy resolver.

The legacy resolver will occasionally fail to resolve dependencies. The backtracking resolver is more robust, but can take longer to run in general.

You can continue using the legacy resolver with --resolver=legacy, although note that it is deprecated and will be removed in a future release.
