
Self-contained Machine Learning and Natural Language Processing library in Go





If you like the project, please ★ star this repository to show your support! 🤩

If you're interested in NLP-related functionalities, be sure to explore the Cybertron package!

Spago is a Machine Learning library written in pure Go designed to support relevant neural architectures in Natural Language Processing.

Spago is self-contained: it uses its own lightweight computational graph for both training and inference, and it is easy to understand from start to finish.

It provides:

  • Automatic differentiation via dynamic define-by-run execution (sketched below)
  • Feed-forward layers (Linear, Highway, Convolution...)
  • Recurrent layers (LSTM, GRU, BiLSTM...)
  • Attention layers (Self-Attention, Multi-Head Attention...)
  • Gradient descent optimizers (Adam, RAdam, RMS-Prop, AdaGrad, SGD)
  • Gob-compatible neural models for serialization
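
A minimal sketch of the define-by-run behaviour mentioned in the list above, reusing only the ag and mat calls that appear in the examples further down. The expression d = a*b + a and its input values are illustrative assumptions, not part of the library's documentation:

package main

import (
  "fmt"

  "github.com/nlpodyssey/spago/ag"
  "github.com/nlpodyssey/spago/mat"
)

func main() {
  // variables that track gradients
  a := mat.Scalar(3.0, mat.WithGrad(true))
  b := mat.Scalar(4.0, mat.WithGrad(true))

  // the graph is built as ordinary Go code runs (define-by-run): d = a*b + a
  d := ag.Add(ag.Mul(a, b), a)
  fmt.Printf("d = %v\n", d.Value())

  // seed the output gradient with 1 and back-propagate
  d.AccGrad(mat.Scalar(1.0))
  ag.Backward(d)

  // expected: ∂d/∂a = b + 1 = 5, ∂d/∂b = a = 3
  fmt.Printf("ga = %v\n", a.Grad())
  fmt.Printf("gb = %v\n", b.Grad())
}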

Usage

Requirements: a working Go toolchain (the minimum supported Go version is declared in the repository's go.mod).

Clone this repo or get the library:

go get -u github.com/nlpodyssey/spago

Getting Started

A good place to start is by looking at the implementation of built-in neural models, such as the LSTM.

Example 1

Here is an example of how to calculate the sum of two variables:

package main

import (
  "fmt"

  "github.com/nlpodyssey/spago/ag"
  "github.com/nlpodyssey/spago/mat"
)

type T = float32

func main() {
  // create a new node of type variable with a scalar
  a := mat.Scalar(T(2.0), mat.WithGrad(true))
  // create another node of type variable with a scalar
  b := mat.Scalar(T(5.0), mat.WithGrad(true))
  // create an addition operator (the calculation is actually performed here)
  c := ag.Add(a, b)

  // print the result
  fmt.Printf("c = %v (float%d)\n", c.Value(), c.Value().Scalar().BitSize())

  c.AccGrad(mat.Scalar(T(0.5)))
  ag.Backward(c)
  fmt.Printf("ga = %v\n", a.Grad())
  fmt.Printf("gb = %v\n", b.Grad())
}

Output:

c = [7] (float32)
ga = [0.5]
gb = [0.5]
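
Since c = a + b, the local derivatives ∂c/∂a and ∂c/∂b are both 1, so the 0.5 seeded into c via AccGrad flows unchanged to both a and b.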

Example 2

Here is a simple implementation of the perceptron formula y = σ(w·x + b):

package main

import (
  . "github.com/nlpodyssey/spago/ag"
  "github.com/nlpodyssey/spago/mat"
)

func main() {
  // input, weight and bias as scalar variables
  x := mat.Scalar(-0.8)
  w := mat.Scalar(0.4)
  b := mat.Scalar(-0.2)

  // y = sigmoid(w*x + b)
  y := Sigmoid(Add(Mul(w, x), b))
  _ = y
}
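
Example 2 discards y. As a small extension, here is a sketch that reuses only the calls shown in Example 1; marking w and b with mat.WithGrad and seeding the output gradient with 1.0 are assumptions made for illustration. It reads the activation and back-propagates to the weight and bias:

package main

import (
  "fmt"

  "github.com/nlpodyssey/spago/ag"
  "github.com/nlpodyssey/spago/mat"
)

func main() {
  // input, weight and bias; w and b track gradients
  x := mat.Scalar(-0.8)
  w := mat.Scalar(0.4, mat.WithGrad(true))
  b := mat.Scalar(-0.2, mat.WithGrad(true))

  // y = sigmoid(w*x + b)
  y := ag.Sigmoid(ag.Add(ag.Mul(w, x), b))
  fmt.Printf("y = %v\n", y.Value())

  // seed the output gradient and back-propagate to w and b
  y.AccGrad(mat.Scalar(1.0))
  ag.Backward(y)
  fmt.Printf("gw = %v\n", w.Grad())
  fmt.Printf("gb = %v\n", b.Grad())
}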

Contributing

If you think something is missing or could be improved, please open issues and pull requests.

To start contributing, check the Contributing Guidelines.

Contact

We highly encourage you to create an issue, as it contributes to the growth of the community. However, if you prefer to communicate with us privately, feel free to email Matteo Grella with any questions or comments you may have.

Acknowledgments

Spago is part of the open-source NLP Odyssey initiative, launched by members of the EXOP team (now part of Crisis24).

Sponsors

See our Open Collective page if you too are interested in becoming a sponsor.