
Explicitly Parameterized Neural Networks in Julia

Community chat: https://julialang.zulipchat.com (#machine-learning stream)

The 🔥 Deep Learning Framework

Installation

] add Lux
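Equivalently, from a script or notebook you can use the Pkg API instead of the REPL's Pkg mode (this is the standard library equivalent of the command above):

```julia
# Install Lux from the General registry via the Pkg standard library
using Pkg
Pkg.add("Lux")

# Optional GPU backends, only needed if you have the corresponding hardware:
# Pkg.add(["LuxCUDA", "LuxAMDGPU"])
```
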

Getting Started

using Lux, Random, Optimisers, Zygote
# using LuxCUDA, LuxAMDGPU # Optional packages for GPU support

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Construct the layer
model = Chain(BatchNorm(128), Dense(128, 256, tanh), BatchNorm(256),
              Chain(Dense(256, 1, tanh), Dense(1, 10)))

# Get the device determined by Lux
device = gpu_device()

# Parameter and State Variables
ps, st = Lux.setup(rng, model) .|> device

# Dummy Input
x = rand(rng, Float32, 128, 2) |> device

# Run the model
y, st = Lux.apply(model, x, ps, st)

# Gradients
gs = gradient(p -> sum(Lux.apply(model, x, p, st)[1]), ps)[1]

# Optimization
st_opt = Optimisers.setup(Optimisers.Adam(0.0001), ps)
st_opt, ps = Optimisers.update(st_opt, ps, gs)
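The snippet above performs a single gradient step. A minimal training loop combining the same pieces might look like the following (a sketch, not from the Lux docs: the dummy data, the made-up targets `y_target`, the mean-squared-error loss, and the epoch count are all illustrative assumptions):

```julia
using Lux, Random, Optimisers, Zygote

rng = Random.default_rng()
Random.seed!(rng, 0)

# A small stateless model for illustration
model = Dense(4 => 2)
ps, st = Lux.setup(rng, model)

# Dummy data: 4 features x 8 samples, with made-up regression targets
x = rand(rng, Float32, 4, 8)
y_target = rand(rng, Float32, 2, 8)

st_opt = Optimisers.setup(Optimisers.Adam(0.001f0), ps)

for epoch in 1:100
    # Mean squared error; the layer state `st` is threaded through each forward pass
    gs = gradient(ps) do p
        y_pred, _ = Lux.apply(model, x, p, st)
        sum(abs2, y_pred .- y_target) / size(x, 2)
    end
    # Optimisers.update returns the new optimizer state and updated parameters
    st_opt, ps = Optimisers.update(st_opt, ps, gs[1])
end
```

For stateful layers (e.g. BatchNorm) you would also capture and carry forward the updated `st` returned by `Lux.apply` between iterations.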

Examples

Look in the examples directory for self-contained usage examples. The documentation also organizes examples by category.

Ecosystem

Check out our Ecosystem page for more details.

Getting Help

For usage-related questions, please use GitHub Discussions or the JuliaLang Discourse (Machine Learning domain), both of which keep questions and answers indexed and searchable. To report bugs, open a GitHub issue, or better yet, send in a pull request.

Package Ecosystem Structure

Structure of the packages that form the Lux.jl universe¹ (rounded rectangles denote packages maintained by the Lux.jl developers):

flowchart LR
    subgraph Interface
        LuxCore(LuxCore)
    end
    subgraph Backend
        LuxLib(LuxLib)
        NNlib
        CUDA
    end
    subgraph ExternalML[External ML Packages]
        Flux
        Metalhead
    end
    subgraph CompViz[Computer Vision]
        Boltz(Boltz)
    end
    subgraph SciML[Scientific Machine Learning]
        DeepEquilibriumNetworks(DeepEquilibriumNetworks)
        DiffEqFlux(DiffEqFlux)
        NeuralPDE[Neural PDE: PINNs]
    end
    subgraph AD[Automatic Differentiation]
        Zygote
        Enzyme["Enzyme (experimental)"]
    end
    subgraph Dist[Distributed Training]
        FluxMPI(FluxMPI)
    end
    subgraph SerializeModels[Serialize Models]
        Serial[Serialization]
        JLD2
        BSON
    end
    subgraph Opt[Optimization]
        Optimisers
        Optimization
    end
    subgraph Parameters
        ComponentArrays
    end
    Lux(Lux)
    Parameters --> Lux
    LuxCore --> Lux
    Backend --> Lux
    Lux --> SciML
    AD --> Lux
    Lux --> Dist
    Lux --> SerializeModels
    Lux --> Opt
    Lux --> CompViz
    ExternalML -.-> CompViz
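As an illustration of the LuxCore interface at the top of the diagram, a custom layer only needs to declare its parameters, its states, and its call behavior. The sketch below uses the v0.4-era API, where layers subtype Lux.AbstractExplicitLayer; the Scale layer itself is a hypothetical example, not part of Lux:

```julia
using Lux, Random

# A hypothetical elementwise scaling layer: y = weight .* x, with no internal state
struct Scale <: Lux.AbstractExplicitLayer
    dims::Int
end

# Trainable parameters live in a NamedTuple returned here, not inside the struct
Lux.initialparameters(rng::AbstractRNG, l::Scale) = (weight = ones(Float32, l.dims),)
Lux.initialstates(::AbstractRNG, ::Scale) = NamedTuple()

# Every Lux layer call takes (x, ps, st) and returns (output, new_state)
(l::Scale)(x, ps, st) = (ps.weight .* x, st)

rng = Random.default_rng()
layer = Scale(3)
ps, st = Lux.setup(rng, layer)
y, st = layer(rand(rng, Float32, 3, 2), ps, st)
```

Because parameters are explicit, this layer composes with Chain, device transfer, and Optimisers exactly like the built-in layers do.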

Related Projects

  • Flux.jl -- We share most of the backend infrastructure with Flux (its roadmap hints at making Flux explicit-parameter first)
  • Knet.jl -- One of the mature, original Julia deep learning frameworks
  • SimpleChains.jl -- Extremely efficient for small neural networks on CPU
  • Avalon.jl -- Uses the tracing-based AD package Yota.jl

Citation

If you found this library to be useful in academic work, then please cite:

@software{pal2023lux,
  author       = {Pal, Avik},
  title        = {{Lux: Explicit Parameterization of Deep Neural Networks in Julia}},
  month        = apr,
  year         = 2023,
  note         = {If you use this software, please cite it as below.},
  publisher    = {Zenodo},
  version      = {v0.4.50},
  doi          = {10.5281/zenodo.7808904},
  url          = {https://doi.org/10.5281/zenodo.7808904}
}

Also consider starring our GitHub repo.

Footnotes

  1. These packages constitute only a subset of the ecosystem; specifically, they are the packages that the maintainers of Lux.jl have personally tested. If you want a new package to be listed here, please open an issue.
