google-drive-to-sqlite

PyPI Changelog Tests License

Create a SQLite database containing metadata from Google Drive

For background on this project, see Google Drive to SQLite on my blog.

If you use Google Drive, and especially if you have shared drives with other people, there's a good chance you have hundreds or even thousands of files that you may not be fully aware of.

This tool can download metadata about those files - their names, sizes, folders, content types, permissions, creation dates and more - and store them in a SQLite database.

This lets you use SQL to analyze your Google Drive contents, using Datasette or the SQLite command-line tool or any other SQLite database browsing software.
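
Once you have built a database (see the Quickstart below), a query like this counts your files by content type. This is a minimal sketch using Python's built-in sqlite3 module, assuming a database file called files.db and the drive_files table documented in the schema section of this README:

import sqlite3

# Hypothetical example: count Google Drive files by MIME type
conn = sqlite3.connect("files.db")
for mime_type, count in conn.execute(
    "select mimeType, count(*) from drive_files group by mimeType order by count(*) desc"
):
    print(count, mime_type)
conn.close()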

Installation

Install this tool using pip:

pip install google-drive-to-sqlite

Quickstart

Authenticate with Google Drive by running:

google-drive-to-sqlite auth

Now create a SQLite database with metadata about all of the files you have starred using:

google-drive-to-sqlite files starred.db --starred

You can explore the resulting database using Datasette:

$ pip install datasette
$ datasette starred.db
INFO:     Started server process [24661]
INFO:     Uvicorn running on http://127.0.0.1:8001

Authentication

⚠️ This application has not yet been verified by Google - you may find you are unable to authenticate until that verification is complete. #10

You can work around this issue by creating your own OAuth client ID and secret and passing them to the auth command using --google-client-id and --google-client-secret.

First, authenticate with Google Drive using the auth command:

$ google-drive-to-sqlite auth
Visit the following URL to authenticate with Google Drive

https://accounts.google.com/o/oauth2/v2/auth?...

Then return here and paste in the resulting code:
Paste code here: 

Follow the link, sign in with Google Drive and then copy and paste the resulting code back into the tool.

This will save an authentication token to a file called auth.json in the current directory.

To specify a different location for that file, use the --auth option:

google-drive-to-sqlite auth --auth ~/google-drive-auth.json

The auth command also provides options for using a different scope, Google client ID and Google client secret. You can use these to create your own custom authentication tokens that work with other Google APIs - see issue #5 for details.

Full --help:

Usage: google-drive-to-sqlite auth [OPTIONS]

  Authenticate user and save credentials

Options:
  -a, --auth FILE              Path to save token, defaults to auth.json
  --google-client-id TEXT      Custom Google client ID
  --google-client-secret TEXT  Custom Google client secret
  --scope TEXT                 Custom token scope
  --help                       Show this message and exit.

To revoke the token that is stored in auth.json, such that it cannot be used to access Google Drive in the future, run the revoke command:

google-drive-to-sqlite revoke

Or if your token is stored in another location:

google-drive-to-sqlite revoke -a ~/google-drive-auth.json

You will need to obtain a fresh token using the auth command in order to continue using this tool.

google-drive-to-sqlite files

To retrieve metadata about the files in your Google Drive, or a folder or search within it, use the google-drive-to-sqlite files command.

This will default to writing details about every file in your Google Drive to a SQLite database:

google-drive-to-sqlite files files.db

Files and folders will be written to database tables, which will be created if they do not yet exist. The database schema is shown below.

If a file or folder already exists, based on a matching id, it will be replaced with fresh data.

Instead of writing to SQLite you can use --json to output as JSON, or --nl to output as newline-delimited JSON:

google-drive-to-sqlite files --nl
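
The newline-delimited output is easy to post-process with a few lines of Python. Here is a minimal sketch, assuming the --nl output has been redirected to a hypothetical file called files.nl and that each record includes the id, name and mimeType keys shown in the API examples later in this README:

import json

# Hypothetical example: read newline-delimited JSON saved from the --nl output
with open("files.nl") as fp:
    for line in fp:
        record = json.loads(line)
        print(record["id"], record.get("mimeType"), record.get("name"))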

Use --folder ID to retrieve everything in a specified folder and its sub-folders:

google-drive-to-sqlite files files.db --folder 1E6Zg2X2bjjtPzVfX8YqdXZDCoB3AVA7i

Use -q QUERY to use a custom search query:

google-drive-to-sqlite files files.db -q "viewedByMeTime > '2022-01-01'"

The following shortcut options help build queries:

  • --full-text TEXT to search for files where the full text matches a search term
  • --starred for files and folders you have starred
  • --trashed for files and folders in the trash
  • --shared-with-me for files and folders that have been shared with you
  • --apps for Google Apps documents, spreadsheets, presentations and drawings (equivalent to setting all of the next four options)
  • --docs for Google Apps documents
  • --sheets for Google Apps spreadsheets
  • --presentations for Google Apps presentations
  • --drawings for Google Apps drawings

You can combine these - for example, this returns all files that you have starred and that were shared with you:

google-drive-to-sqlite files highlights.db \
  --starred --shared-with-me

Multiple options are treated as AND, with the exception of the Google Apps options which are treated as OR - so the following would retrieve all spreadsheets and presentations that have also been starred:

google-drive-to-sqlite files highlights.db \
  --starred --sheets --presentations

You can use --stop-after X to stop after retrieving X files, useful for trying out a new search pattern and seeing results straight away.

The --import-json and --import-nl options are mainly useful for testing and developing this tool. They allow you to replay the JSON or newline-delimited JSON that was previously fetched using --json or --nl and use it to create a fresh SQLite database, without needing to make any outbound API calls:

# Fetch all starred files from the API, write to starred.json
google-drive-to-sqlite files -q 'starred = true' --json > starred.json
# Now import that data into a new SQLite database file
google-drive-to-sqlite files starred.db --import-json starred.json

Full --help:

Usage: google-drive-to-sqlite files [OPTIONS] [DATABASE]

  Retrieve metadata for files in Google Drive, and write to a SQLite database or
  output as JSON.

      google-drive-to-sqlite files files.db

  Use --json to output JSON, --nl for newline-delimited JSON:

      google-drive-to-sqlite files files.db --json

  Use a folder ID to recursively fetch every file in that folder and its sub-
  folders:

      google-drive-to-sqlite files files.db --folder
      1E6Zg2X2bjjtPzVfX8YqdXZDCoB3AVA7i

  Fetch files you have starred:

      google-drive-to-sqlite files starred.db --starred

Options:
  -a, --auth FILE       Path to auth.json token file
  --folder TEXT         Files in this folder ID and its sub-folders
  -q TEXT               Files matching this query
  --full-text TEXT      Search for files with text match
  --starred             Files you have starred
  --trashed             Files in the trash
  --shared-with-me      Files that have been shared with you
  --apps                Google Apps docs, spreadsheets, presentations and
                        drawings
  --docs                Google Apps docs
  --sheets              Google Apps spreadsheets
  --presentations       Google Apps presentations
  --drawings            Google Apps drawings
  --json                Output JSON rather than write to DB
  --nl                  Output newline-delimited JSON rather than write to DB
  --stop-after INTEGER  Stop paginating after X results
  --import-json FILE    Import from this JSON file instead of the API
  --import-nl FILE      Import from this newline-delimited JSON file
  -v, --verbose         Send verbose output to stderr
  --help                Show this message and exit.

google-drive-to-sqlite download FILE_ID

The download command can be used to download files from Google Drive.

You'll need one or more file IDs, which look something like 0B32uDVNZfiEKLUtIT1gzYWN2NDI4SzVQYTFWWWxCWUtvVGNB.

To download the file, run this:

google-drive-to-sqlite download 0B32uDVNZfiEKLUtIT1gzYWN2NDI4SzVQYTFWWWxCWUtvVGNB

This will detect the content type of the file and use it to pick the file extension - so if the file is a JPEG it will be downloaded as:

0B32uDVNZfiEKLUtIT1gzYWN2NDI4SzVQYTFWWWxCWUtvVGNB.jpeg

You can pass multiple file IDs to the command at once.

To hide the progress bar and filename output, use -s or --silent.

If you are downloading a single file you can use the -o option to specify a filename and location:

google-drive-to-sqlite download 0B32uDVNZfiEKLUtIT1gzYWN2NDI4SzVQYTFWWWxCWUtvVGNB \
  -o my-image.jpeg

Use -o - to write the file contents to standard output:

google-drive-to-sqlite download 0B32uDVNZfiEKLUtIT1gzYWN2NDI4SzVQYTFWWWxCWUtvVGNB \
  -o - > my-image.jpeg

Full --help:

Usage: google-drive-to-sqlite download [OPTIONS] FILE_IDS...

  Download one or more files to disk, based on their file IDs.

  The file content will be saved to a file with the name:

      FILE_ID.ext

  Where the extension is automatically picked based on the type of file.

  If you are downloading a single file you can specify a filename with -o:

      google-drive-to-sqlite download MY_FILE_ID -o myfile.txt

Options:
  -a, --auth FILE    Path to auth.json token file
  -o, --output FILE  File to write to, or - for standard output
  -s, --silent       Hide progress bar and filename
  --help             Show this message and exit.

google-drive-to-sqlite export FORMAT FILE_ID

The export command can be used to export Google Docs documents, spreadsheets and presentations in a number of different formats.

You'll need one or more document IDs, which look something like 10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU. You can find these by looking at the URL of your document on the Google Docs site.

To export that document as PDF, run this:

google-drive-to-sqlite export pdf 10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU

The file will be exported as:

10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU-export.pdf

You can pass multiple file IDs to the command at once.

For the FORMAT option you can use any of the export MIME types listed in the Google Drive export formats documentation (https://developers.google.com/drive/api/v3/ref-export-formats) - for example, to export as an Open Office document you could use:

google-drive-to-sqlite export \
 application/vnd.oasis.opendocument.text \
 10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU

For convenience the following shortcuts for common file formats are provided:

  • Google Docs: html, txt, rtf, pdf, doc, zip, epub
  • Google Sheets: xls, pdf, csv, tsv, zip
  • Presentations: ppt, pdf, txt
  • Drawings: jpeg, png, svg

The zip option returns a zip file of HTML. txt returns plain text. The others should be self-evident.
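
One quick way to inspect a zip export is to list its contents with Python's zipfile module. A minimal sketch, assuming an export saved as FILE_ID-export.zip (substitute your own file ID):

import zipfile

# Hypothetical example: list the HTML files inside a zip export
with zipfile.ZipFile("FILE_ID-export.zip") as zf:
    for name in zf.namelist():
        print(name)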

To hide the filename output, use -s or --silent.

If you are exporting a single file you can use the -o option to specify a filename and location:

google-drive-to-sqlite export pdf 10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU \
  -o my-document.pdf

Use -o - to write the file contents to standard output:

google-drive-to-sqlite export pdf 10BOHGDUYa7lBjUSo26YFCHTpgEmtXabdVFaopCTh1vU \
  -o - > my-document.pdf

Full --help:

Usage: google-drive-to-sqlite export [OPTIONS] FORMAT FILE_IDS...

  Export one or more files to the specified format.

  Usage:

      google-drive-to-sqlite export pdf FILE_ID_1 FILE_ID_2

  The file content will be saved to a file with the name:

      FILE_ID-export.ext

  Where the extension is based on the format you specified.

  Available export formats can be seen here:
  https://developers.google.com/drive/api/v3/ref-export-formats

  Or you can use one of the following shortcuts:

  - Google Docs: html, txt, rtf, pdf, doc, zip, epub
  - Google Sheets: xls, pdf, csv, tsv, zip
  - Presentations: ppt, pdf, txt
  - Drawings: jpeg, png, svg

  "zip" returns a zip file of HTML.

  If you are exporting a single file you can specify a filename with -o:

      google-drive-to-sqlite export zip MY_FILE_ID -o myfile.zip

Options:
  -a, --auth FILE    Path to auth.json token file
  -o, --output FILE  File to write to, or - for standard output
  -s, --silent       Hide progress bar and filename
  --help             Show this message and exit.

google-drive-to-sqlite get URL

The get command makes authenticated requests to the specified URL, using credentials derived from the auth.json file.

For example:

$ google-drive-to-sqlite get 'https://www.googleapis.com/drive/v3/about?fields=*'
{
    "kind": "drive#about",
    "user": {
        "kind": "drive#user",
        "displayName": "Simon Willison",
# ...

If the resource you are fetching supports pagination, you can use --paginate key to paginate through all of the rows in a specified key. For example, the following API has a nextPageToken key and a files list, suggesting it supports pagination:

$ google-drive-to-sqlite get https://www.googleapis.com/drive/v3/files
{
    "kind": "drive#fileList",
    "nextPageToken": "~!!~AI9...wogHHYlc=",
    "incompleteSearch": false,
    "files": [
        {
            "kind": "drive#file",
            "id": "1YEsITp_X8PtDUJWHGM0osT-TXAU1nr0e7RSWRM2Jpyg",
            "name": "Title of a spreadsheet",
            "mimeType": "application/vnd.google-apps.spreadsheet"
        },

To paginate through everything in the files list you would use --paginate files like this:

$ google-drive-to-sqlite get https://www.googleapis.com/drive/v3/files --paginate files
[
  {
    "kind": "drive#file",
    "id": "1YEsITp_X8PtDUJWHGM0osT-TXAU1nr0e7RSWRM2Jpyg",
    "name": "Title of a spreadsheet",
    "mimeType": "application/vnd.google-apps.spreadsheet"
  },
  # ...

Add --nl to stream paginated data as newline-delimited JSON:

$ google-drive-to-sqlite get https://www.googleapis.com/drive/v3/files --paginate files --nl
{"kind": "drive#file", "id": "1YEsITp_X8PtDUJWHGM0osT-TXAU1nr0e7RSWRM2Jpyg", "name": "Title of a spreadsheet", "mimeType": "application/vnd.google-apps.spreadsheet"}
{"kind": "drive#file", "id": "1E6Zg2X2bjjtPzVfX8YqdXZDCoB3AVA7i", "name": "Subfolder", "mimeType": "application/vnd.google-apps.folder"}

Add --stop-after 5 to stop after 5 records - useful for testing.
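
Under the hood, pagination works by sending each nextPageToken back to the API as a pageToken query parameter until no token is returned. Here is a minimal sketch of that loop in Python, assuming you already have a valid OAuth access token (the ACCESS_TOKEN value is a hypothetical placeholder):

import json
import urllib.parse
import urllib.request

ACCESS_TOKEN = "ya29.example"  # hypothetical placeholder - use a real token
url = "https://www.googleapis.com/drive/v3/files"
page_token = None

while True:
    # Pass the previous nextPageToken back as pageToken, if we have one
    params = {"pageToken": page_token} if page_token else {}
    request = urllib.request.Request(
        url + "?" + urllib.parse.urlencode(params),
        headers={"Authorization": "Bearer {}".format(ACCESS_TOKEN)},
    )
    with urllib.request.urlopen(request) as response:
        data = json.load(response)
    for file in data.get("files", []):
        print(file["id"], file["name"])
    page_token = data.get("nextPageToken")
    if not page_token:
        break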

Full --help:

Usage: google-drive-to-sqlite get [OPTIONS] URL

  Make an authenticated HTTP GET to the specified URL

Options:
  -a, --auth FILE       Path to auth.json token file
  --paginate TEXT       Paginate through all results in this key
  --nl                  Output paginated data as newline-delimited JSON
  --stop-after INTEGER  Stop paginating after X results
  -v, --verbose         Send verbose output to stderr
  --help                Show this message and exit.

Database schema

The database created by this tool has the following schema:

CREATE TABLE [drive_users] (
   [permissionId] TEXT PRIMARY KEY,
   [kind] TEXT,
   [displayName] TEXT,
   [photoLink] TEXT,
   [me] INTEGER,
   [emailAddress] TEXT
);
CREATE TABLE [drive_folders] (
   [id] TEXT PRIMARY KEY,
   [_parent] TEXT,
   [_owner] TEXT,
   [lastModifyingUser] TEXT,
   [kind] TEXT,
   [name] TEXT,
   [mimeType] TEXT,
   [starred] INTEGER,
   [trashed] INTEGER,
   [explicitlyTrashed] INTEGER,
   [parents] TEXT,
   [spaces] TEXT,
   [version] TEXT,
   [webViewLink] TEXT,
   [iconLink] TEXT,
   [hasThumbnail] INTEGER,
   [thumbnailVersion] TEXT,
   [viewedByMe] INTEGER,
   [createdTime] TEXT,
   [modifiedTime] TEXT,
   [modifiedByMe] INTEGER,
   [shared] INTEGER,
   [ownedByMe] INTEGER,
   [viewersCanCopyContent] INTEGER,
   [copyRequiresWriterPermission] INTEGER,
   [writersCanShare] INTEGER,
   [folderColorRgb] TEXT,
   [quotaBytesUsed] TEXT,
   [isAppAuthorized] INTEGER,
   [linkShareMetadata] TEXT,
   FOREIGN KEY([_parent]) REFERENCES [drive_folders]([id]),
   FOREIGN KEY([_owner]) REFERENCES [drive_users]([permissionId]),
   FOREIGN KEY([lastModifyingUser]) REFERENCES [drive_users]([permissionId])
);
CREATE TABLE [drive_files] (
   [id] TEXT PRIMARY KEY,
   [_parent] TEXT,
   [_owner] TEXT,
   [lastModifyingUser] TEXT,
   [kind] TEXT,
   [name] TEXT,
   [mimeType] TEXT,
   [starred] INTEGER,
   [trashed] INTEGER,
   [explicitlyTrashed] INTEGER,
   [parents] TEXT,
   [spaces] TEXT,
   [version] TEXT,
   [webViewLink] TEXT,
   [iconLink] TEXT,
   [hasThumbnail] INTEGER,
   [thumbnailVersion] TEXT,
   [viewedByMe] INTEGER,
   [createdTime] TEXT,
   [modifiedTime] TEXT,
   [modifiedByMe] INTEGER,
   [shared] INTEGER,
   [ownedByMe] INTEGER,
   [viewersCanCopyContent] INTEGER,
   [copyRequiresWriterPermission] INTEGER,
   [writersCanShare] INTEGER,
   [quotaBytesUsed] TEXT,
   [isAppAuthorized] INTEGER,
   [linkShareMetadata] TEXT,
   FOREIGN KEY([_parent]) REFERENCES [drive_folders]([id]),
   FOREIGN KEY([_owner]) REFERENCES [drive_users]([permissionId]),
   FOREIGN KEY([lastModifyingUser]) REFERENCES [drive_users]([permissionId])
);
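
The quotaBytesUsed column is stored as text but can be cast to an integer, which makes it easy to find your largest files. A minimal sketch using Python's sqlite3 module, assuming a database created as files.db:

import sqlite3

# Hypothetical example: show the ten largest files by quota usage
conn = sqlite3.connect("files.db")
rows = conn.execute(
    """
    select name, mimeType, cast(quotaBytesUsed as integer) as bytes
    from drive_files
    order by bytes desc
    limit 10
    """
)
for name, mime_type, size in rows:
    print(size, mime_type, name)
conn.close()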

Thumbnails

You can construct a thumbnail image URL for a known file ID using the following pattern:

https://drive.google.com/thumbnail?sz=w800-h800&id=FILE_ID

Users who are signed into Google Drive and have permission to view a file will be redirected to a thumbnail version of that file. You can tweak the w800 and h800 parameters to request different thumbnail sizes.
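
You can combine this URL pattern with the hasThumbnail column in the database to generate thumbnail links in bulk. A minimal sketch, assuming a files.db database created by the files command:

import sqlite3

# Hypothetical example: print thumbnail URLs for files that have thumbnails
conn = sqlite3.connect("files.db")
for (file_id,) in conn.execute("select id from drive_files where hasThumbnail = 1"):
    print("https://drive.google.com/thumbnail?sz=w800-h800&id={}".format(file_id))
conn.close()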

Privacy policy

This tool requests access to your Google Drive account in order to retrieve metadata about your files there. It also offers a feature that can download the content of those files.

The credentials used to access your account are stored in the auth.json file on your computer. The metadata and content retrieved from Google Drive is also stored only on your own personal computer.

At no point do the developers of this tool gain access to any of your data.

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:

cd google-drive-to-sqlite
python -m venv venv
source venv/bin/activate

Or if you are using pipenv:

pipenv shell

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
