  • Stars: 572
  • Rank: 77,436 (Top 2%)
  • Language: Python
  • License: Apache License 2.0
  • Created: over 5 years ago
  • Updated: 18 days ago


Repository Details

📘 The experiment tracker for foundation model training
 

neptune.ai

Quickstart   •   Website   •   Docs   •   Examples   •   Resource center   •   Blog  

What is neptune.ai?

neptune.ai makes it easy to log, store, organize, compare, register, and share all your ML model metadata in a single place.

  • Automate and standardize as your modeling team grows.
  • Collaborate on models and results with your team and across the org.
  • Use the hosted version, deploy on-premises, or run it in a private cloud. Integrate with any MLOps stack.

Play with a live neptune.ai app →

Getting started

Step 1: Create a free account

Step 2: Install the Neptune client library

pip install neptune

Step 3: Add an experiment tracking snippet to your code

import neptune

# Start a run in your project (the API token is read from the NEPTUNE_API_TOKEN environment variable)
run = neptune.init_run(project="Me/MyProject")

# Assign fields to log hyperparameters and metrics
run["parameters"] = {"lr": 0.1, "dropout": 0.4}
run["test_accuracy"] = 0.84

Open in Colab  
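For longer training jobs, you typically log metrics as a series rather than a single value. A minimal sketch under the same assumptions as above (the loop and loss values are illustrative):

import neptune

run = neptune.init_run(project="Me/MyProject")  # assumes NEPTUNE_API_TOKEN is set

# Appending to a field builds a series you can chart live in the app
for epoch in range(10):
    loss = 1.0 / (epoch + 1)  # placeholder for your real training loss
    run["train/loss"].append(loss)

run.stop()  # flush pending data and close the run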

 

Core features

Log and display

Add a snippet to any step of your ML pipeline once. Decide what and how you want to log. Run a million times.

  • Any framework: any code, PyTorch, PyTorch Lightning, TensorFlow/Keras, scikit-learn, LightGBM, XGBoost, Optuna, Kedro.

  • Any metadata type: metrics, parameters, dataset and model versions, images, interactive plots, videos, hardware (GPU, CPU, memory), code state.

  • From anywhere in your ML pipeline: multinode pipelines, distributed computing, log during or after execution, log offline, and sync when you are back online.  
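A hedged sketch of mixing several metadata types in one run (field names such as "data/version" and "charts/loss_curve" are arbitrary, and the figure stands in for any matplotlib plot):

import neptune
from neptune.types import File
import matplotlib.pyplot as plt

run = neptune.init_run(project="Me/MyProject")

# Parameters and a dataset version logged side by side
run["parameters"] = {"lr": 0.1, "batch_size": 64}
run["data/version"].track_files("data/train.csv")  # records the file hash as an artifact

# Upload an image next to the metrics, e.g. a loss curve or confusion matrix
fig, ax = plt.subplots()
ax.plot([0.9, 0.6, 0.4, 0.3])
run["charts/loss_curve"].upload(File.as_image(fig))

run.stop()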

 

[Screenshot: all metadata and metrics]

Organize experiments

Organize logs in a fully customizable nested structure. Display model metadata in user-defined dashboard templates.

  • Nested metadata structure: the flexible API lets you customize the metadata logging structure however you want. Talk to a dictionary at the code level. See a folder structure in the app. Organize nested parameter configs or the results of k-fold validation splits the way they should be.

  • Custom dashboards: combine different metadata types in one view. Define it for one run and reuse it anywhere. Look at GPU, memory consumption, and load times to debug training speed. See learning curves, image predictions, and the confusion matrix to debug model quality.

  • Table views: create different views of the runs table and save them for later. You can have separate table views for debugging, comparing parameter sets, or best experiments.  
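A minimal sketch of the nested structure in code (the namespace names below are arbitrary choices, not a required schema):

import neptune

run = neptune.init_run(project="Me/MyProject")

# Slash-separated paths become folders in the app
run["config/optimizer/name"] = "Adam"
run["config/optimizer/lr"] = 3e-4

# Group k-fold results under one namespace
for fold, acc in enumerate([0.81, 0.84, 0.79]):
    run[f"cv/fold_{fold}/accuracy"] = acc

run.stop()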

 

[Screenshot: organize dashboards]

Compare results

Visualize training live in the neptune.ai web app. See how different parameters and configs affect the results. Optimize models faster.

  • Compare: learning curves, parameters, images, datasets.

  • Search, sort, and filter: experiments by any field you logged. Use our query language to filter runs based on parameter values, metrics, execution times, or anything else.

  • Visualize and display: runs table, interactive display, folder structure, dashboards.

  • Monitor live: hardware consumption metrics, GPU, CPU, memory.

  • Group by: dataset versions, parameters.  
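Comparison can also be done programmatically by pulling the runs table; a hedged sketch using a read-only project handle (the "test_accuracy" column assumes a field of that name was logged):

import neptune

project = neptune.init_project(project="Me/MyProject", mode="read-only")

# Fetch the runs table as a pandas DataFrame and rank runs by a logged metric
runs_df = project.fetch_runs_table().to_pandas()
print(runs_df.sort_values("test_accuracy", ascending=False).head())

project.stop()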

 

[Screenshot: compare, search, filter]

Register models

Version, review, and access production-ready models and metadata associated with them in a single place.

  • Version models: register models, create model versions, version external model artifacts.

  • Review and change stages: look at validation and test metrics and other model metadata. You can move models between None/Staging/Production/Archived.

  • Access and share models: every model and model version is accessible via the neptune.ai web app or through the API.  
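A hedged sketch of the model registry calls (the model key "MOD" and the ID "PROJ-MOD" are placeholders following Neptune's key/ID conventions; the metric values are made up):

import neptune

# Register a model once, then create versions of it as you retrain
model = neptune.init_model(key="MOD", project="Me/MyProject")
model_version = neptune.init_model_version(model="PROJ-MOD", project="Me/MyProject")

# Attach validation metrics and the serialized weights to this version
model_version["validation/acc"] = 0.91
model_version["model/binary"].upload("model.pt")

# Promote the version through the lifecycle stages
model_version.change_stage("staging")

model_version.stop()
model.stop()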

 

[Screenshot: register models]

Share results

Have a single place where your team can see the results and access all models and experiments.

  • Send a link: share every chart, dashboard, table view, or anything else you see in the neptune.ai app by copying and sending persistent URLs.

  • Query API: access all model metadata via neptune.ai API. Whatever you logged, you can query in a similar way.

  • Manage users and projects: create different projects, add users to them, and grant different permissions levels.

  • Add your entire org: get unlimited users on every paid plan. So you can invite your entire organization, including product managers and subject matter experts at no extra cost.  
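Whatever was logged can be pulled back through the same API; a minimal sketch that reopens an existing run read-only (the run ID "PROJ-123" is a placeholder):

import neptune

# Reopen a finished run without modifying it
run = neptune.init_run(project="Me/MyProject", with_id="PROJ-123", mode="read-only")

accuracy = run["test_accuracy"].fetch()         # single value
loss_series = run["train/loss"].fetch_values()  # series as a pandas DataFrame
print(accuracy, len(loss_series))

run.stop()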

 

[Screenshot: share persistent link]

Integrate with any MLOps stack

neptune.ai integrates with 25+ frameworks: PyTorch, PyTorch Lightning, TensorFlow/Keras, LightGBM, scikit-learn, XGBoost, Optuna, Kedro, 🤗 Transformers, fastai, Prophet, and more.



PyTorch Lightning

Example:

from neptune import ANONYMOUS_API_TOKEN
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import NeptuneLogger

# Create the NeptuneLogger instance
neptune_logger = NeptuneLogger(
    api_key=ANONYMOUS_API_TOKEN,
    project="common/pytorch-lightning-integration",
    tags=["training", "resnet"],  # optional
)

# Pass the logger to the Trainer
trainer = Trainer(max_epochs=10, logger=neptune_logger)

# Run the Trainer (my_model and my_dataloader are defined elsewhere)
trainer.fit(my_model, my_dataloader)

GitHub code • Jupyter notebook • Open in Colab

neptune.ai is trusted by great companies

Read how various customers use Neptune to improve their workflow.

 

Support

If you get stuck or simply want to talk to us about something, reach out through the support channels linked from neptune.ai.

People behind

Created with ❤️ by the neptune.ai team:

Piotr, Paulina, Chaz, Prince, Parth, Kshitij, Siddhant, Jakub, Patrycja, Dominika, Karolina, Stephen, Artur, Aleksiej, Martyna, Małgorzata, Magdalena, Karolina, Marcin, Michał, Tymoteusz, Rafał, Aleksandra, Sabine, Tomek, Piotr, Rafał, Adam, Hubert, Marcin, Jakub, Paweł, Franciszek, Bartosz, Aleksander, Dawid, Patryk, Krzysztof, Aurimas, Jakub, Bartosz, and you?

More Repositories

1. open-solution-mapping-challenge (Python, 380 stars): Open solution to the Mapping Challenge 🌎
2. open-solution-salt-identification (Python, 120 stars): Open solution to the TGS Salt Identification Challenge
3. examples (Jupyter Notebook, 76 stars): 📝 Examples of how to use Neptune for different use cases and with various MLOps tools
4. blog-binary-classification-metrics (Jupyter Notebook, 55 stars): Codebase for the blog post "24 Evaluation Metrics for Binary Classification (And When to Use Them)"
5. neptune-notebooks (Python, 34 stars): 📚 Jupyter Notebooks extension for versioning, managing, and sharing notebook checkpoints in your machine learning and data science projects
6. neptune-mlflow (Python, 31 stars): Neptune - MLflow integration 🧩 Experiment tracking with advanced UI, collaborative features, and user access management
7. neptune-contrib (Python, 27 stars): This library is the location of the LegacyLogger for PyTorch Lightning
8. neptune-examples (Jupyter Notebook, 26 stars): Examples of using Neptune to keep track of your experiments (maintenance only)
9. kedro-neptune (Python, 18 stars): 📌 Track & manage metadata, visualize & compare Kedro pipelines in a nice UI
10. neptune-r (R, 14 stars): 📒 The MLOps stack component for experiment tracking (R interface)
11. neptune-tensorboard (Python, 13 stars): Neptune - TensorBoard integration 🧩 Experiment tracking with advanced UI, collaborative features, and user access management
12. neptune-optuna (Python, 11 stars): 🚀 Optuna visualization dashboard that lets you log and monitor hyperparameter sweeps live
13. blog-hyperparameter_optimization (Python, 8 stars): Codebase for the series of blog posts on Medium
14. model-fairness-in-practice (Jupyter Notebook, 8 stars): Materials for the ODSC West 2019 workshop "Model fairness in practice"
15. neptune-sklearn (Python, 6 stars): Experiment tracking for scikit-learn. 🧩 Log, organize, visualize, and compare model metrics, parameters, dataset versions, and more
16. neptune-tensorflow-keras (Python, 6 stars): 💡 Experiment tracking for TensorFlow/Keras. Log, organize, and compare model metrics, learning curves, hyperparameters, dataset versions, and more
17. neptune-xgboost (Python, 6 stars): Experiment tracking for XGBoost. 🧩 Log, organize, visualize, and compare machine learning model metrics, parameters, dataset versions, and more
18. project-time-series-forecasting (Python, 5 stars): Experiment tracking and model registry in the time series forecasting project
19. neptune-lib (Python, 5 stars): Deprecated; please go to neptune-client
20. neptune-sacred (Python, 5 stars): Sacred-compatible UI for experiment tracking. 📊 Log and visualize machine learning model metrics, hyperparameters, code, dataset versions, and more
21. neptune-fastai (Python, 4 stars): Experiment tracking for fastai. 🧩 Log, organize, visualize, and compare model metrics, hyperparameters, dataset versions, and more
22. docs (Shell, 4 stars): Neptune documentation
23. kaggle-ieee-fraud-detection (Jupyter Notebook, 4 stars): Example of a project with experiment management
24. neptune-action (Python, 3 stars): Continuous integration with GitHub Actions and Neptune
25. neptune-prophet (Python, 3 stars): Experiment tracking for Prophet. 🧩 Log, organize, visualize, and compare model parameters, forecasts, and more
26. neptune-lightgbm (Python, 3 stars): Experiment tracking for LightGBM. 🧩 Log, organize, visualize, and compare model metrics, parameters, dataset versions, and more
27. neptune-pytorch-lightning (Python, 3 stars): PyTorch Lightning logger for experiment tracking. 🧩 Monitor model training live, track metrics & hyperparameters, visualize models, and more
28. neptune-client-experimental (Python, 3 stars)
29. neptune-airflow (Python, 2 stars)
30. project-images-segmentation (Jupyter Notebook, 2 stars): Experiment tracking and model registry in the image segmentation project
31. neptune-client-scale (Python, 2 stars)
32. neptune-fetcher (Python, 2 stars)
33. workshops (Jupyter Notebook, 1 star): Code for Neptune workshops
34. tour-pytorch (Python, 1 star): Example project with PyTorch and Neptune
35. example-project-code (Python, 1 star): Code used in the "example-project" in Neptune
36. neptune-client-e2e (Python, 1 star)
37. automation-pipelines (Python, 1 star)
38. tour-scikit-learn (Python, 1 star): Example project with scikit-learn and Neptune
39. neptune (Python, 1 star)
40. neptune-integration-template (Python, 1 star)
41. tour-tf-keras (Python, 1 star): Source code for the tour-tf-keras project
42. examples-r (R, 1 star)
43. neptune-aws (Python, 1 star)
44. project-nlp (Jupyter Notebook, 1 star): Experiment tracking and model registry in the NLP project
45. neptune-detectron2 (Python, 1 star): Experiment tracking for Detectron2. 🧩 Log, organize, visualize, and compare model metrics, hyperparameters, dataset versions, and more
46. neptune-api (Python, 1 star)