
FluxArchitectures


Complex neural network examples for Flux.jl.

This package contains a loose collection of (slightly) more advanced neural network architectures, mostly centered around time series forecasting.

Installation

To install FluxArchitectures, type ] in the Julia REPL to activate the package manager, then enter

add FluxArchitectures

After loading the package with using FluxArchitectures, the following functions are exported:

  • prepare_data
  • get_data
  • DARNN
  • DSANet
  • LSTnet
  • TPALSTM

See their docstrings, the documentation, and the examples folder for details.
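As a rough illustration of the kind of windowing such data helpers perform, a sliding-window split over a time series can be sketched in plain Julia. Note this is not the package's actual prepare_data; the function name, arguments, and shapes below are illustrative only — see the docstrings for the real interface.

```julia
# Hypothetical sketch of sliding-window data preparation: each input is a
# window of `poollength` consecutive values, and its target lies `horizon`
# steps beyond the end of that window.
function sliding_windows(series::AbstractVector, poollength::Int, horizon::Int)
    n = length(series) - poollength - horizon + 1
    inputs  = [series[i:i+poollength-1] for i in 1:n]
    targets = [series[i+poollength+horizon-1] for i in 1:n]
    return inputs, targets
end

inputs, targets = sliding_windows(collect(1.0:20.0), 10, 5)
# inputs[1] is the window 1.0:10.0; its target is the value 5 steps
# after the window ends, i.e. series[15] == 15.0.
```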

Models

  • LSTnet: This "Long- and Short-term Time-series network" follows the paper by Lai et al.

  • DARNN: The "Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction" is based on the paper by Qin et al.

  • TPA-LSTM: The Temporal Pattern Attention LSTM network is based on the paper "Temporal Pattern Attention for Multivariate Time Series Forecasting" by Shih et al.

  • DSANet: The "Dual Self-Attention Network for Multivariate Time Series Forecasting" is based on the paper by Siteng Huang et al.

Quickstart

Activate the package and load some sample data:

using FluxArchitectures
poollength = 10; horizon = 15; datalength = 1000;
input, target = get_data(:exchange_rate, poollength, datalength, horizon) 

Define a model and a loss function:

model = LSTnet(size(input, 1), 2, 3, poollength, 120)
loss(x, y) = Flux.mse(model(x), y')
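Flux.mse computes the mean squared error between predictions and targets. Conceptually (a plain-Julia sketch, not Flux's implementation):

```julia
# Mean squared error, equivalent in spirit to Flux.mse:
# the average of the squared differences between predictions ŷ and targets y.
mse(ŷ, y) = sum(abs2, ŷ .- y) / length(y)

mse([1.0, 2.0], [1.0, 4.0])  # (0^2 + 2^2) / 2 = 2.0
```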

Train the model:

Flux.train!(loss, Flux.params(model), Iterators.repeated((input, target), 20), Adam(0.01))
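The train! call above runs 20 gradient steps over the same batch. The underlying pattern — evaluate the loss gradient, step the parameters — can be illustrated with a self-contained toy example (plain Julia, with a hand-written gradient standing in for Flux's automatic differentiation; names are illustrative only):

```julia
# Toy version of the gradient-step loop that training performs:
# fit a slope w so that w * x ≈ y for a single data point.
function fit_slope(x, y; lr = 0.1, steps = 100)
    w = 0.0
    for _ in 1:steps
        grad = 2 * (w * x - y) * x   # derivative of the squared error (w*x - y)^2
        w -= lr * grad               # one optimiser step (plain gradient descent)
    end
    return w
end

fit_slope(2.0, 6.0)  # converges towards the true slope 3.0
```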