
PreallocationTools.jl

Tools for building non-allocating pre-cached functions in Julia, allowing for GC-free usage of automatic differentiation in complex codes.

Badges: Zulip chat (https://julialang.zulipchat.com, #sciml-bridged), Global Docs, codecov, Build Status, ColPrac: Contributor's Guide on Collaborative Practices for Community Packages, SciML Code Style.

PreallocationTools.jl is a set of tools for helping build non-allocating pre-cached functions for high-performance computing in Julia. Its tools handle edge cases of automatic differentiation to make it easier for users to get high performance even in the cases where code generation may change the function that is being called.

DiffCache

DiffCache is a type for doubly-preallocated vectors which are compatible with non-allocating forward-mode automatic differentiation by ForwardDiff.jl. Since ForwardDiff uses chunked duals in its forward pass, two vector sizes are required in order for the arrays to be properly defined. DiffCache creates a dispatching type to solve this, so that by passing a qualifier it can automatically switch between the required cache. This method is fully type-stable and non-dynamic, made for when the highest performance is needed.

Using DiffCache

DiffCache(u::AbstractArray, N::Int = ForwardDiff.pickchunksize(length(u)); levels::Int = 1)
DiffCache(u::AbstractArray, N::AbstractArray{<:Int})

The DiffCache function builds a DiffCache object that stores both a version of the cache for u and for the Dual version of u, allowing use of pre-cached vectors with forward-mode automatic differentiation. Note that DiffCache, due to its design, is only compatible with arrays that contain concretely typed elements.

To access the caches, one uses:

get_tmp(tmp::DiffCache, u)

When the element type of u is a ForwardDiff Dual number, get_tmp returns the Dual version of the cache. Otherwise, it returns the standard cache (for use in calls without automatic differentiation).

In order to preallocate to the right size, the DiffCache needs to be specified to have the correct N matching the chunk size of the dual numbers or larger. If the chunk size N specified is too large, get_tmp will automatically resize when dispatching; this remains type-stable and non-allocating, but comes at the expense of additional memory.
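
For instance, the dispatch can be sketched as follows (the cache size N = 12 and the variable names here are illustrative, not prescribed by the package):

using ForwardDiff, PreallocationTools

u = rand(10)
cache = DiffCache(u, 12)   # N = 12: at least as large as any chunk size we expect to see

get_tmp(cache, u)                              # u is Float64-valued => returns the plain cache
get_tmp(cache, ForwardDiff.Dual.(u, one.(u)))  # u is Dual-valued => returns the Dual cache
                                               # (chunk size 1 here is smaller than N, which get_tmp handles)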

In a differential equation, optimization, etc., the default chunk size is computed from the state vector u, and thus if one creates the DiffCache via DiffCache(u) it will match the default chunking of the solver libraries.

DiffCache is also compatible with nested automatic differentiation calls, either through the levels keyword (with N for each level computed from the size of the state vector) or by specifying N as an array of integer chunk sizes, which gives full control over the chunk size at every differentiation level.
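
For example, a two-level nested cache could be declared in either of the following ways (a sketch; the explicit chunk sizes are placeholders):

using PreallocationTools

u = zeros(5)
cache_levels = DiffCache(u, levels = 2)   # chunk size for each level picked from length(u)
cache_manual = DiffCache(u, [4, 4])       # explicit chunk sizes, one entry per differentiation level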

DiffCache Example 1: Direct Usage

using ForwardDiff, PreallocationTools
randmat = rand(5, 3)
sto = similar(randmat)
stod = DiffCache(sto)

function claytonsample!(sto, τ, α; randmat = randmat)
    sto = get_tmp(sto, τ)
    sto .= randmat
    τ == 0 && return sto

    n = size(sto, 1)
    for i in 1:n
        v = sto[i, 2]
        u = sto[i, 1]
        sto[i, 1] = (1 - u^(-τ) + u^(-τ) * v^(-(τ / (1 + τ))))^(-1 / τ) * α
        sto[i, 2] = (1 - u^(-τ) + u^(-τ) * v^(-(τ / (1 + τ))))^(-1 / τ)
    end
    return sto
end

ForwardDiff.derivative(τ -> claytonsample!(stod, τ, 0.0), 0.3)
ForwardDiff.jacobian(x -> claytonsample!(stod, x[1], x[2]), [0.3; 0.0])

In the above, the chunk size of the dual numbers has been selected based on the size of randmat, resulting in a chunk size of 8 in this case. However, since the derivative is calculated with respect to τ and the Jacobian is calculated with respect to τ and α, specifying the DiffCache with stod = DiffCache(sto, 1) or stod = DiffCache(sto, 2), respectively, would have been the most memory efficient way of performing these calculations (only really relevant for much larger problems).
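
Concretely, those more memory-efficient variants of the calls above would look like the following sketch (stod1 and stod2 are illustrative names):

stod1 = DiffCache(sto, 1)   # one differentiation variable (τ)
ForwardDiff.derivative(τ -> claytonsample!(stod1, τ, 0.0), 0.3)

stod2 = DiffCache(sto, 2)   # two differentiation variables (τ and α)
ForwardDiff.jacobian(x -> claytonsample!(stod2, x[1], x[2]), [0.3; 0.0])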

DiffCache Example 2: ODEs

using LinearAlgebra, OrdinaryDiffEq
function foo(du, u, (A, tmp), t)
    mul!(tmp, A, u)
    @. du = u + tmp
    nothing
end
prob = ODEProblem(foo, ones(5, 5), (0.0, 1.0), (ones(5, 5), zeros(5, 5)))
solve(prob, TRBDF2())

This fails because tmp holds only real numbers, but during automatic differentiation tmp needs to be a cache of dual numbers. Since u is the value that will carry the dual numbers, we dispatch based on it:

using LinearAlgebra, OrdinaryDiffEq, PreallocationTools
function foo(du, u, (A, tmp), t)
    tmp = get_tmp(tmp, u)
    mul!(tmp, A, u)
    @. du = u + tmp
    nothing
end
chunk_size = 5
prob = ODEProblem(foo,
    ones(5, 5),
    (0.0, 1.0),
    (ones(5, 5), DiffCache(zeros(5, 5), chunk_size)))
solve(prob, TRBDF2(chunk_size = chunk_size))

or just using the default chunking:

using LinearAlgebra, OrdinaryDiffEq, PreallocationTools
function foo(du, u, (A, tmp), t)
    tmp = get_tmp(tmp, u)
    mul!(tmp, A, u)
    @. du = u + tmp
    nothing
end
prob = ODEProblem(foo, ones(5, 5), (0.0, 1.0), (ones(5, 5), DiffCache(zeros(5, 5))))
solve(prob, TRBDF2())

DiffCache Example 3: Nested AD calls in an optimization problem involving a Hessian matrix

using LinearAlgebra, OrdinaryDiffEq, PreallocationTools, Optimization, OptimizationOptimJL
function foo(du, u, p, t)
    tmp = p[2]
    A = reshape(p[1], size(tmp.du))
    tmp = get_tmp(tmp, u)
    mul!(tmp, A, u)
    @. du = u + tmp
    nothing
end

coeffs = -collect(0.1:0.1:0.4)
cache = DiffCache(zeros(2, 2), levels = 3)
prob = ODEProblem(foo, ones(2, 2), (0.0, 1.0), (coeffs, cache))
realsol = solve(prob, TRBDF2(), saveat = 0.0:0.1:10.0, reltol = 1e-8)

function objfun(x, prob, realsol, cache)
    prob = remake(prob, u0 = eltype(x).(prob.u0), p = (x, cache))
    sol = solve(prob, TRBDF2(), saveat = 0.0:0.1:10.0, reltol = 1e-8)

    ofv = 0.0
    if sol.retcode != :Success
        ofv = 1e12
    else
        ofv = sum((sol .- realsol) .^ 2)
    end
    return ofv
end
fn(x, p) = objfun(x, p[1], p[2], p[3])
optfun = OptimizationFunction(fn, Optimization.AutoForwardDiff())
optprob = OptimizationProblem(optfun, zeros(length(coeffs)), (prob, realsol, cache))
solve(optprob, Newton())

This solves an optimization problem for the coefficients coeffs appearing in a differential equation. The optimization is performed with Optim.jl's Newton() algorithm. Since this involves automatic differentiation inside the ODE solver as well as the calculation of Hessians, three automatic differentiations are nested within each other, so the DiffCache is specified with levels = 3.

FixedSizeDiffCache

FixedSizeDiffCache is a lot like DiffCache, but it stores dual numbers in its caches instead of a flat array. Because of this, it can avoid a view, making it a little bit more performant for generating caches of non-Array types. However, it is a lot less flexible than DiffCache, and is thus only recommended for cases where the chunk size is known in advance (for example, ODE solvers) and where u is not an Array.

The interface is almost exactly the same, except with the constructor:

FixedSizeDiffCache(u::AbstractArray, chunk_size = Val{ForwardDiff.pickchunksize(length(u))})
FixedSizeDiffCache(u::AbstractArray, chunk_size::Integer)

Note that the FixedSizeDiffCache can support duals that are of a smaller chunk size than the preallocated ones, but not a larger size. Nested duals are not supported with this construct.
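
Since the accessor interface mirrors DiffCache, a minimal usage sketch (with an assumed chunk size of 4) is:

using ForwardDiff, PreallocationTools

u = zeros(5)
cache = FixedSizeDiffCache(u, 4)   # chunk size fixed at construction time

get_tmp(cache, u)                              # plain (Float64) cache
get_tmp(cache, ForwardDiff.Dual.(u, one.(u)))  # Dual cache; chunk size 1 ≤ 4 is allowed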

LazyBufferCache

LazyBufferCache(f::F = identity)

A LazyBufferCache is a Dict-like type that allocates new cache arrays on demand when they are first required. The function f maps the size of u to the size of the cache, i.e. size_of_cache = f(size(u)); the default (identity) creates cache arrays of the same size as u.

Note that LazyBufferCache is type-stable and contains no dynamic dispatch. This gives it a ~15ns overhead. The upside of LazyBufferCache is that the user does not have to worry about potential issues with chunk sizes and such: LazyBufferCache is much easier!
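
For illustration, a buffer size different from size(u) can be requested by passing a custom f (a sketch; the doubling rule below is an arbitrary choice, not a package default):

using PreallocationTools

lbc_same = LazyBufferCache()                 # default: buffer size == size(u)
lbc_double = LazyBufferCache(sz -> 2 .* sz)  # buffer twice the size of the indexing array

u = rand(5)
lbc_same[u]     # 5-element buffer, allocated once and reused on later calls
lbc_double[u]   # 10-element buffer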

Example

using LinearAlgebra, OrdinaryDiffEq, PreallocationTools
function foo(du, u, (A, lbc), t)
    tmp = lbc[u]
    mul!(tmp, A, u)
    @. du = u + tmp
    nothing
end
prob = ODEProblem(foo, ones(5, 5), (0.0, 1.0), (ones(5, 5), LazyBufferCache()))
solve(prob, TRBDF2())

GeneralLazyBufferCache

GeneralLazyBufferCache(f = identity)

A GeneralLazyBufferCache is a Dict-like type for caches which automatically creates new caches on demand when they are required. The function f generates the cache matching the type of u, and subsequent indexing reuses that cache if that type of u has already been seen.

Note that GeneralLazyBufferCache's return is not type-inferred. This means it's the slowest of the preallocation methods, but it's the most general.

Example

In all of the previous cases our cache was an array. However, in this case we want to preallocate a DifferentialEquations ODEIntegrator object. This object is created via DifferentialEquations.init(ODEProblem(ode_fnc, y₀, (0.0, T), p), Tsit5(); saveat = t), and we want to optimize p in a way that changes its type to a ForwardDiff dual number. Thus we can make a GeneralLazyBufferCache which holds these integrator objects, defined by p, and index it with p in order to retrieve the cache. The first time it is called it builds the integrator, and subsequent calls reuse the cache.

Defining the cache as a function of p to build an integrator thus looks like:

lbc = GeneralLazyBufferCache(function (p)
    DifferentialEquations.init(ODEProblem(ode_fnc, y₀, (0.0, T), p), Tsit5(); saveat = t)
end)

then lbc[p] (or, equivalently, get_tmp(lbc, p)) will be smart and reuse the caches. A full example looks like the following:

using Random, DifferentialEquations, LinearAlgebra, Optimization, OptimizationNLopt,
      OptimizationOptimJL, PreallocationTools

lbc = GeneralLazyBufferCache(function (p)
    DifferentialEquations.init(ODEProblem(ode_fnc, y₀, (0.0, T), p), Tsit5(); saveat = t)
end)

Random.seed!(2992999)
λ, y₀, σ = -0.5, 15.0, 0.1
T, n = 5.0, 200
Δt = T / n
t = [j * Δt for j in 0:n]
y = y₀ * exp.(λ * t)
yᵒ = y .+ [0.0, σ * randn(n)...]
ode_fnc(u, p, t) = p * u
function loglik(θ, data, integrator)
    yᵒ, n, ε = data
    λ, σ, u0 = θ
    integrator.p = λ
    reinit!(integrator, u0)
    solve!(integrator)
    ε = yᵒ .- integrator.sol.u
    ℓ = -0.5n * log(2π * σ^2) - 0.5 / σ^2 * sum(ε .^ 2)
end
θ₀ = [-1.0, 0.5, 19.73]
negloglik = (θ, p) -> -loglik(θ, p, lbc[θ[1]])
fnc = OptimizationFunction(negloglik, Optimization.AutoForwardDiff())
ε = zeros(n)
prob = OptimizationProblem(fnc,
    θ₀,
    (yᵒ, n, ε),
    lb = [-10.0, 1e-6, 0.5],
    ub = [10.0, 10.0, 25.0])
solve(prob, LBFGS())

Similar Projects

AutoPreallocation.jl tries to do this automatically at the compiler level. Alloc.jl tries to do this with a bump allocator.
