Timescale NFT Starter Kit

The Timescale NFT Starter Kit is a step-by-step guide to get up and running with collecting, storing, analyzing and visualizing NFT data from OpenSea, using PostgreSQL and TimescaleDB.

The NFT Starter Kit gives you a foundation for analyzing NFT trends, so you can bring some data to your purchasing decisions or simply learn about the NFT space from a data-driven perspective. It also serves as a solid starting point for more complex NFT analysis projects in the future.

We recommend following along with the NFT Starter Kit tutorial to get familiar with the contents of this repository.

For more information about the NFT Starter Kit, see the announcement blog post.

Project components

We provide multiple standalone components to support each step of your data exploration journey:

  • Design database schema (a simplified sketch follows below)

  • Get data

  • Build dashboards

  • Analyze data
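
As a taste of the schema-design component, here is a minimal sketch that creates a simplified nft_sales table and turns it into a TimescaleDB hypertable from Python. The column list (other than time) and the connection string are illustrative placeholders; the full schema (the accounts, collections, assets and nft_sales tables) is defined in the repository.

    # schema_sketch.py -- illustrative only; the real schema is defined in the repository
    import psycopg2

    DSN = "postgres://tsdbadmin:password@localhost:6543/tsdb"  # placeholder connection string

    DDL = """
    CREATE TABLE IF NOT EXISTS nft_sales (
        time        TIMESTAMPTZ NOT NULL,  -- sale timestamp (the real table also uses a time column)
        asset_id    BIGINT,                -- illustrative column
        total_price NUMERIC                -- illustrative column
    );
    -- turn the plain table into a hypertable partitioned by time
    SELECT create_hypertable('nft_sales', 'time', if_not_exists => TRUE);
    """

    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)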

Get started

Whichever component you are most interested in, first clone the repository:

git clone https://github.com/timescale/nft-starter-kit.git
cd nft-starter-kit

Setting up the pre-built Superset dashboards

This part of the project is fully Dockerized. TimescaleDB and the Superset dashboards are built automatically using docker-compose. After completing the steps below, you will have local TimescaleDB and Superset instances running in containers, preloaded with 500K+ NFT transactions from OpenSea.

(Screenshot: pre-built Superset dashboard)

The Docker services use ports 8088 (Superset) and 6543 (TimescaleDB), so make sure no other services are using those ports before starting the installation process.
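
If you want to check that those ports are free before you start, a quick sketch (not part of the starter kit, just a convenience):

    # check_ports.py -- reports whether the ports used by the stack are already taken
    import socket

    for port in (8088, 6543):   # 8088 = Superset, 6543 = TimescaleDB
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            in_use = s.connect_ex(("127.0.0.1", port)) == 0
        print(f"port {port}: {'already in use' if in_use else 'free'}")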

Prerequisites

  • Docker

  • Docker compose

    Verify that both are installed:

    docker --version && docker-compose --version

Instructions

  1. Run docker-compose up --build in the /pre-built-dashboards folder:

    cd pre-built-dashboards
    docker-compose up --build

    Wait until the process finishes (it can take a couple of minutes); you should see:

    timescaledb_1      | PostgreSQL init process complete; ready for start up.
  2. Go to http://0.0.0.0:8088/ in your browser and log in with these credentials:

    user: admin
    password: admin
  3. Open the Databases page inside Superset (http://0.0.0.0:8088/databaseview/list/). You will see exactly one item there called NFT Starter Kit.

  4. Go check out your NFT dashboards!

    Collections dashboard: http://0.0.0.0:8088/superset/dashboard/1

    Assets dashboard: http://0.0.0.0:8088/superset/dashboard/2
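
If you also want to query the bundled TimescaleDB instance directly (outside Superset), something like the following sketch works; the database name and credentials below are placeholders, so check the docker-compose configuration in pre-built-dashboards for the actual values.

    # query_local_tsdb.py -- sanity-check the containerized database (credentials are placeholders)
    import psycopg2

    conn = psycopg2.connect(
        host="localhost",
        port=6543,              # TimescaleDB port exposed by the Docker setup
        dbname="tsdb",          # assumed database name
        user="postgres",        # placeholder -- check the docker-compose configuration
        password="password",    # placeholder -- check the docker-compose configuration
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT count(*) FROM nft_sales;")
        print("rows in nft_sales:", cur.fetchone()[0])
    conn.close()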

Running the data ingestion script

If you'd like to ingest data straight from the OpenSea API into your database (either a local TimescaleDB instance or one running in Timescale Cloud), follow these steps to configure and run the ingestion script:

Prerequisites

  • A TimescaleDB database to ingest into (local or in Timescale Cloud)

  • Python 3 and virtualenv

  • An OpenSea API key

Instructions

  1. Go to the root folder of the project:
    cd nft-starter-kit
  2. Create a new Python virtual environment and install the requirements:
    virtualenv env && source env/bin/activate
    pip install -r requirements.txt
  3. Replace the parameters in the config.py file:
    DB_NAME="tsdb"
    HOST="YOUR_HOST_URL"
    USER="tsdbadmin"
    PASS="YOUR_PASSWORD_HERE"
    PORT="PORT_NUMBER"
    OPENSEA_START_DATE="2021-10-01T00:00:00" # example start date (UTC)
    OPENSEA_END_DATE="2021-10-06T23:59:59" # example end date (UTC)
    OPENSEA_APIKEY="YOUR_OPENSEA_APIKEY" # need to request from OpenSea's docs
  4. Run the Python script:
    python opensea_ingest.py
    This will start ingesting data in batches, ~300 rows at a time:
    Start ingesting data between 2021-10-01 00:00:00+00:00 and 2021-10-06 23:59:59+00:00
    ---
    Fetching transactions from OpenSea...
    Data loaded into temp table!
    Data ingested!
    Data has been backfilled until this time: 2021-10-06 23:51:31.140126+00:00
    ---
    You can stop the ingestion process at any time (Ctrl+C); otherwise, the script runs until all transactions from the given time period have been ingested. (A simplified sketch of such an ingestion loop is shown below.)
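
The real logic lives in opensea_ingest.py. Purely to illustrate the batching pattern described above, a stripped-down sketch could look like this; the endpoint, request parameters and column names are assumptions based on the OpenSea v1 events API and the config.py values, not a copy of the actual script.

    # ingest_sketch.py -- illustrative only, NOT the repository's opensea_ingest.py
    import requests
    import psycopg2

    import config  # DB_NAME, HOST, USER, PASS, PORT, OPENSEA_* values from config.py

    EVENTS_URL = "https://api.opensea.io/api/v1/events"  # OpenSea v1 events endpoint (may have changed)

    def fetch_batch(occurred_after, occurred_before, offset, limit=300):
        """Fetch one batch of successful sale events from OpenSea."""
        resp = requests.get(
            EVENTS_URL,
            headers={"X-API-KEY": config.OPENSEA_APIKEY},
            params={
                "event_type": "successful",
                "occurred_after": occurred_after,
                "occurred_before": occurred_before,
                "offset": offset,
                "limit": limit,
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("asset_events", [])

    conn = psycopg2.connect(dbname=config.DB_NAME, host=config.HOST,
                            user=config.USER, password=config.PASS, port=config.PORT)

    offset = 0
    while True:
        events = fetch_batch(config.OPENSEA_START_DATE, config.OPENSEA_END_DATE, offset)
        if not events:
            break
        with conn, conn.cursor() as cur:
            for ev in events:   # column names are simplified for the sketch
                cur.execute("INSERT INTO nft_sales (time, total_price) VALUES (%s, %s)",
                            ((ev.get("transaction") or {}).get("timestamp"), ev.get("total_price")))
        offset += len(events)
        print(f"ingested {offset} rows so far")
    conn.close()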

Ingest the sample data

If you don't want to wait for a decent amount of data to be ingested, you can use our sample dataset, which contains 500K+ sale transactions from OpenSea (the same sample that powers the pre-built Superset dashboards).

Prerequisites

  • A TimescaleDB database with the NFT Starter Kit tables (accounts, collections, assets, nft_sales) created

  • psql installed locally

Instructions

  1. Go to the folder with the sample CSV files (or you can also download them from here):
    cd pre-built-dashboards/database/data
  2. Connect to your database with PSQL:
    psql -x "postgres://host:port/tsdb?sslmode=require"
    If you're using Timescale Cloud, the instructions under How to Connect provide a customized psql command to connect directly to your database.
  3. Import the CSV files in this order (it can take a few minutes in total):
    \copy accounts FROM 001_accounts.csv CSV HEADER;
    \copy collections FROM 002_collections.csv CSV HEADER;
    \copy assets FROM 003_assets.csv CSV HEADER;
    \copy nft_sales FROM 004_nft_sales.csv CSV HEADER;
  4. Try running some queries on your database:
    SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
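
From here you can keep exploring, for example bucketing sales per day with TimescaleDB's time_bucket function. A minimal Python sketch; the connection parameters are placeholders and only the time column is taken from the query above:

    # daily_sales.py -- count sales per day using time_bucket (connection parameters are placeholders)
    import psycopg2

    conn = psycopg2.connect(
        host="YOUR_HOST", port=5432, dbname="tsdb",
        user="tsdbadmin", password="YOUR_PASSWORD", sslmode="require",
    )
    with conn, conn.cursor() as cur:
        cur.execute("""
            SELECT time_bucket('1 day', time) AS day, count(*) AS sales
            FROM nft_sales
            GROUP BY day
            ORDER BY day;
        """)
        for day, sales in cur.fetchall():
            print(day, sales)
    conn.close()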

More Repositories

 1. timescaledb: An open-source time-series SQL database optimized for fast ingest and complex queries. Packaged as a PostgreSQL extension. (C, 16,259 stars)
 2. promscale: [DEPRECATED] Promscale is a unified metric and trace observability backend for Prometheus, Jaeger and OpenTelemetry built on PostgreSQL and TimescaleDB. (Go, 1,330 stars)
 3. tsbs: Time Series Benchmark Suite, a tool for comparing and evaluating databases for time series data (Go, 1,196 stars)
 4. tobs: The Observability Stack for Kubernetes. Easy install of a full observability stack into a k8s cluster with Helm charts. (Shell, 549 stars)
 5. timescaledb-tune: A tool for tuning TimescaleDB for better performance by adjusting settings to match your system's CPU and memory resources. (Go, 397 stars)
 6. timescaledb-parallel-copy: A binary for parallel copying of CSV data into a TimescaleDB hypertable (Go, 345 stars)
 7. prometheus-postgresql-adapter: Use PostgreSQL as a remote storage database for Prometheus (Go, 335 stars)
 8. timescaledb-toolkit: Extension for more hyperfunctions, fully compatible with TimescaleDB and PostgreSQL 📈 (Rust, 324 stars)
 9. timescaledb-docker: Release Docker builds of TimescaleDB (Dockerfile, 279 stars)
10. helm-charts: Configuration and documentation to run TimescaleDB in your Kubernetes cluster (Shell, 260 stars)
11. pg_prometheus: PostgreSQL extension for Prometheus data (C, 213 stars)
12. timescaledb-docker-ha: Create Docker images containing TimescaleDB and Patroni, to be used by developers and Kubernetes. (Python, 134 stars)
13. examples: Collection of example applications and tools to help you get familiar with TimescaleDB (JavaScript, 119 stars)
14. outflux: Export data from InfluxDB to TimescaleDB (Go, 83 stars)
15. vector-cookbook: Timescale Vector Cookbook. A collection of recipes to build applications with LLMs using PostgreSQL and Timescale Vector. (Jupyter Notebook, 77 stars)
16. opentelemetry-demo: A demo system for exploring the tracing features of Promscale (Python, 61 stars)
17. streaming-replication-docker: TimescaleDB streaming replication in Docker (Shell, 56 stars)
18. docs: Timescale product documentation 📖 (JavaScript, 46 stars)
19. timescaledb-extras: Helper functions and procedures for TimescaleDB (PLpgSQL, 39 stars)
20. benchmark-postgres: Tools for benchmarking TimescaleDB vs PostgreSQL (Go, 37 stars)
21. promscale_extension: [DEPRECATED] Tables, types and functions supporting Promscale (PLpgSQL, 37 stars)
22. docs.timescale.com-content: Content pages for TimescaleDB documentation (JavaScript, 35 stars)
23. timescaledb-backup (Go, 33 stars)
24. pgspot: Spot vulnerabilities in PostgreSQL SQL scripts (Python, 28 stars)
25. timescaledb-wale: Dockerized WAL-E with an HTTP API (Python, 21 stars)
26. terraform-provider-timescale: Timescale Cloud Terraform provider (Go, 17 stars)
27. homebrew-tap: TimescaleDB Homebrew tap, containing formulas for the database, tools, etc. (Ruby, 17 stars)
28. pg_influx: InfluxDB Line Protocol listener for PostgreSQL (C, 16 stars)
29. python-vector (Jupyter Notebook, 16 stars)
30. tsv-timemachine: Sample application for time-aware RAG with Streamlit, LlamaIndex and Timescale Vector. Learn more at https://www.timescale.com/ai (Python, 8 stars)
31. promscale-benchmark (Makefile, 8 stars)
32. templates: Templates to get started with Timescale on finance or sensors (IoT) (PLpgSQL, 7 stars)
33. timescale-extension-utils-rs (Rust, 5 stars)
34. doctor: Rule-based recommendations about your time-series database. (Python, 4 stars)
35. web-developer-assignment (HTML, 3 stars)
36. wikistream-docker: A Docker environment for https://github.com/timescale/wikistream (Shell, 3 stars)
37. mta-timescale: Demo: Load MTA bus feeds into TimescaleDB (3 stars)
38. cloud-actions: Cloud public actions (Shell, 3 stars)
39. aws-lambda-example: A sample serverless AWS Lambda time-series application. (Python, 2 stars)
40. frontend-developer-assignment (HTML, 2 stars)
41. pg_traceam: Simple table access method that prints out which access-method and related functions are called. (C, 2 stars)
42. state_of_postgres: 2019 (SCSS, 1 star)
43. build-actions: GitHub Actions for release pipelines (building, publishing, checking, etc.) (Shell, 1 star)
44. pgschema (1 star)
45. docs-htmltojsx: A fork of react-magic html-to-jsx specifically modified to parse Timescale docs (JavaScript, 1 star)
46. postgres_cheat_sheet (1 star)
47. promscale_specs: Formal specifications for Promscale components (TLA, 1 star)