• Stars: 1,431
• Rank: 31,635 (Top 0.7%)
• Language: Julia
• License: Other
• Created: almost 6 years ago
• Updated: 3 months ago

Repository Details

21st century AD

Install Zygote from the Julia Pkg REPL (press ] at the julia> prompt):

] add Zygote
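Equivalently, the package can be added from ordinary Julia code rather than the Pkg REPL:

julia> using Pkg

julia> Pkg.add("Zygote")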

Zygote provides source-to-source automatic differentiation (AD) in Julia, and is the next-gen AD system for the Flux differentiable programming framework. For more details and benchmarks of Zygote's technique, see our paper. You may want to check out Flux for more interesting examples of Zygote usage; the documentation here focuses on internals and advanced AD usage.

Zygote supports Julia 1.6 onwards, but we highly recommend using Julia 1.8 or later.

julia> using Zygote

julia> f(x) = 5x + 3

julia> f(10), f'(10)
(53, 5.0)

julia> @code_llvm f'(10)
define i64 @"julia_#625_38792"(i64) {
top:
  ret i64 5
}

"Source-to-source" means that Zygote hooks into Julia's compiler, and generates the backwards pass for you – as if you had written it by hand.

Zygote supports the flexibility and dynamism of the Julia language, including control flow, recursion, closures, structs, dictionaries, and more. Mutation and exception handling are currently not supported.

julia> fs = Dict("sin" => sin, "cos" => cos, "tan" => tan);

julia> gradient(x -> fs[readline()](x), 1)
sin
0.5403023058681398
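Because differentiation works on ordinary Julia code, control flow and recursion need no special handling. A minimal sketch (pow here is just an illustrative helper, not part of Zygote):

julia> pow(x, n) = n <= 0 ? one(x) : x * pow(x, n - 1);

julia> gradient(x -> pow(x, 3), 2.0)
(12.0,)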

Zygote benefits from using the ChainRules.jl ruleset. Custom gradients can be defined by extending ChainRulesCore.jl's rrule:

julia> using ChainRulesCore

julia> add(a, b) = a + b

julia> function ChainRulesCore.rrule(::typeof(add), a, b)
           # Pullback: maps the output cotangent dy to cotangents for
           # (the function itself, a, b).
           add_pb(dy) = (NoTangent(), dy, dy)
           return add(a, b), add_pb
       end
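With this rule defined, Zygote should pick it up automatically; gradient drops the NoTangent() entry for the function itself and returns one gradient per argument:

julia> gradient(add, 1.0, 2.0)
(1.0, 1.0)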

To support large machine learning models with many parameters, Zygote can differentiate implicitly used parameters, as opposed to just function arguments.

julia> W, b = rand(2, 3), rand(2);

julia> predict(x) = W*x .+ b;

julia> g = gradient(Params([W, b])) do
         sum(predict([1,2,3]))
       end
Grads(...)

julia> g[W], g[b]
([1.0 2.0 3.0; 1.0 2.0 3.0], [1.0, 1.0])
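The same gradients can also be taken in the explicit style, passing the parameters as ordinary function arguments; a minimal sketch using the W and b defined above:

julia> gradient((W, b) -> sum(W*[1, 2, 3] .+ b), W, b)
([1.0 2.0 3.0; 1.0 2.0 3.0], [1.0, 1.0])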

More Repositories

1. Flux.jl – Relax! Flux is the ML library that doesn't make you tensor (Julia, 4,359 stars)
2. model-zoo – Please do not feed the models (Julia, 878 stars)
3. FastAI.jl – Repository of best practices for deep learning in Julia, inspired by fastai (Julia, 578 stars)
4. GeometricFlux.jl – Geometric Deep Learning for Flux (Julia, 348 stars)
5. Metalhead.jl – Computer vision models for Flux (Julia, 314 stars)
6. MacroTools.jl – MacroTools provides a library of tools for working with Julia code and expressions (Julia, 301 stars)
7. Torch.jl – Sensible extensions for exposing torch in Julia (Julia, 200 stars)
8. NNlib.jl – Neural Network primitives with multiple backends (Julia, 188 stars)
9. MLJFlux.jl – Wrapping deep learning models from the package Flux.jl for use in the MLJ.jl toolbox (Julia, 137 stars)
10. ONNX.jl – Read ONNX graphs in Julia (Julia, 137 stars)
11. FluxTraining.jl – A flexible neural net training library inspired by fast.ai (Julia, 114 stars)
12. IRTools.jl – Mike's Little Intermediate Representation (Julia, 107 stars)
13. Functors.jl – Parameterise all the things (Julia, 107 stars)
14. Flux3D.jl – 3D computer vision library in Julia (Julia, 100 stars)
15. Mjolnir.jl – A little less conversation, a little more abstraction (Julia, 87 stars)
16. Optimisers.jl – Optimisers.jl defines many standard optimisers and utilities for learning loops (Julia, 68 stars)
17. DaggerFlux.jl – Distributed computation of differentiation pipelines to use multiple workers, devices, GPU, etc. since Julia wasn't fast enough already (Julia, 65 stars)
18. Gym.jl – Gym environments in Julia (Julia, 54 stars)
19. FluxML-Community-Call-Minutes – The FluxML Community Team repo (51 stars)
20. XLA.jl – "Maybe we have our own magic." (Julia, 47 stars)
21. Tracker.jl – Flux's ex AD (Julia, 44 stars)
22. FluxJS.jl – I heard you like compile times (Julia, 42 stars)
23. DataAugmentation.jl – Flexible data augmentation library for machine and deep learning (Julia, 41 stars)
24. HuggingFaceApi.jl (Julia, 33 stars)
25. Hydra.jl – SPMD + Neural Nets (Julia, 31 stars)
26. ParameterSchedulers.jl – Common hyperparameter scheduling for ML (Julia, 28 stars)
27. Alloc.jl (Julia, 26 stars)
28. Trebuchet.jl – throw stuff (Julia, 21 stars)
29. fluxml.github.io – Flux Website (HTML, 20 stars)
30. YaoFlux.jl – Differentiable programming on quantum circuits with Flux (Julia, 19 stars)
31. OneHotArrays.jl – Memory efficient one-hot array encodings (Julia, 17 stars)
32. ZygoteRules.jl (Julia, 15 stars)
33. FluxBench.jl – Benchmarks for the FluxML ecosystem for deep learning, scientific machine learning, differentiable programming etc. including AD and CUDA accelerated workloads (Julia, 14 stars)
34. NNlibCUDA.jl – CUDA integration for the NNlib API (Julia, 14 stars)
35. DiffImages.jl – Differentiable Computer Vision using pure Julia (Julia, 14 stars)
36. Fluxperimental.jl – Experimental features for Flux.jl (Julia, 13 stars)
37. SafeTensors.jl (Julia, 8 stars)
38. MetalheadWeights – Pre-trained model weight artifacts for Metalhead.jl (Julia, 7 stars)
39. FluxMLBenchmarks.jl – A benchmarking suite for the FluxML org (Julia, 6 stars)
40. IArrays.jl (Julia, 5 stars)
41. differentiable.dev (HTML, 3 stars)
42. FluxCUDA.jl (Julia, 3 stars)
43. FluxMLDocs – Unified documentation across the FluxML ecosystem (Julia, 1 star)
44. .github – Repository for default community health files (1 star)