  • Stars: 136
  • Rank: 258,581 (Top 6%)
  • Language: Clojure
  • Created: over 11 years ago
  • Updated: about 8 years ago


Repository Details

Clojure wrapper for Encog (v3) (Machine-Learning framework that specialises in neural-nets)

enclog

Clojure wrapper for the Encog (v3) machine-learning framework.

-from the official encog website:

"Encog is an open source Machine Learning framework for both Java and DotNet. Encog is primarily focused on neural networks and bot programming. It allows you to create many common neural network forms, such as feedforward perceptrons, self organizing maps, Adaline, bidirectional associative memory, Elman, Jordan and Hopfield networks and offers a variety of training schemes."

-from me:

Encog has been around for almost 5 years, so it can be considered fairly mature and optimised. Apart from neural nets, version 3 introduced SVM and Bayesian classification. With this library, which is a thin wrapper around Encog, you can construct and train many types of neural nets in less than 10 lines of pure Clojure code. The whole idea, from the start, was to expose the user as little as possible to the Java side of things, thus eliminating the potential sharp edges of a rather big library like Encog. Hopefully I've done a good job... feel free to try it out, and more importantly, feel free to drop any comments/opinions/advice/critique etc...

P.S.: This is still a work in progress. Nonetheless, the neural nets, training methods, randomization and normalisation are pretty much complete - what's left at this point is the Bayesian stuff, if I'm not mistaken... also, I'm pretty sure we need tests :) ...

Usage

The jar(s)?

Clojars Project
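
If you use Leiningen, the dependency vector (version taken from the quick demo below) goes straight into your project.clj:

;; in project.clj
:dependencies [[enclog "0.6.3"]]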

Quick demo:

-quick & dirty (needs lein2):

;; pull enclog into a running REPL via pomegranate, then load its namespaces
(use '[cemerick.pomegranate :only (add-dependencies)])
(add-dependencies :coordinates  '[[enclog "0.6.3"]]
                  :repositories (merge cemerick.pomegranate.aether/maven-central
                                       {"clojars" "http://clojars.org/repo"}))
(use '[enclog nnets training])

OK, most of the networks are already functional, so let's go ahead and make one. Let's assume that, for some reason, we need a feed-forward net with 2 input neurons, 1 output neuron (classification) and 1 hidden layer with 2 neurons, for the XOR problem.

(def net
  (network (neural-pattern :feed-forward)
           :activation :sigmoid
           :input   2
           :output  1
           :hidden  [2])) ;; a single hidden layer

...and voilà! We get back the complete network, initialised with random weights.

Most of the constructor-functions (make-something) accept keyword-based arguments. For the full list of options, refer to the documentation or the source code. Don't worry if you accidentally pass wrong parameters to a network, e.g. the wrong activation function for a specific net-type. Each concrete implementation of the 'network' multi-method ignores arguments that are not settable by a particular neural pattern!
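
For example (a sketch purely for illustration - :learning-rate below is not something the :feed-forward pattern can set, and may not be an enclog option at all), an inapplicable argument is simply dropped, so net2 ends up with the same structure as net above:

(def net2
  (network (neural-pattern :feed-forward)
           :activation :sigmoid
           :input   2
           :output  1
           :hidden  [2]
           :learning-rate 0.7)) ;; hypothetical/inapplicable option => silently ignored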

Of course, now that we have the network, we need to train it... well, that's easy too! First, we are going to need some dummy data...

(let [xor-input [[0.0 0.0] [1.0 0.0] [0.0 1.0] [1.0 1.0]]
      xor-ideal [[0.0] [1.0] [1.0] [0.0]]
      dataset   (data :basic-dataset xor-input xor-ideal)
      trainer   (trainer :back-prop :network net :training-set dataset)]
  (train trainer 0.01 500 []))
  
;; train expects a training method, error tolerance, iteration limit & strategies (possibly none).
;; in this case we're using simple back-propagation as our training scheme of preference.
;; feed-forward networks can be used with a variety of activations/trainers.

And that's it, really! After training finishes, you can start using the network as normal. For more in-depth instructions, consider looking at the 2 examples found in the examples.clj ns. These include the classic XOR example (trained with resilient propagation) and the lunar lander example (trained with a genetic algorithm) from the Encog wiki/books.
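
For instance, here is a minimal sketch of what "using the network" could look like, assuming net has been trained and dataset is still bound (e.g. by lifting the let-bindings above into defs), and going through Encog's Java API via interop rather than any enclog convenience function:

(doseq [pair dataset]                           ;; a BasicMLDataSet is Iterable<MLDataPair>
  (let [output (.compute net (.getInput pair))] ;; BasicNetwork.compute returns an MLData
    (println (vec (.getData (.getInput pair)))  ;; the input values
             "->"
             (.getData output 0))))             ;; the network's single output value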

In general you should always remember:

  • Most (if not all) of the constructor-functions (e.g. network, data, trainer etc.) accept keywords for arguments. The documentation tells you exactly what your options are. Some constructor-functions return other functions (closures), which then need to be called again, potentially with extra arguments, in order to get the full object.

  • 'network' is a big multi-method that is responsible for looking at what type of neural pattern has been passed in and dispatching the appropriate method. This is the 'spine' of creating networks with enclog.

  • NeuroEvolution of Augmenting Topologies (NEAT) networks don't need to be initialised as separate networks like all the other network types do. Instead, we usually initialise a NEATPopulation, which we then pass to NEATTraining via

(trainer :neat :fitness-fn #(...) :population-object (NEATPopulation. 2 1 1000)) ;; a settable, pre-built population object
(trainer :neat :fitness-fn #(...) :input 2 :output 1 :population-size 1000)      ;; a brand new population with default parameters

  • Simple convenience functions do exist for quickly evaluating a trained network, and also for implementing the CalculateScore class which is needed for doing GA or simulated-annealing training (see the sketch after this list).

  • Ideally, check the source when any 'strange' error occurs. You don't even have to go online - it's in the jar!
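
As a quick illustration of the CalculateScore point above (a minimal sketch using Encog's stock TrainingSetScore class via interop, not an enclog helper), a score object for GA/simulated-annealing training can be built directly from a dataset, assuming the XOR dataset from earlier is bound to dataset:

(import 'org.encog.neural.networks.training.TrainingSetScore)

;; a ready-made CalculateScore implementation backed by a training set --
;; lower error on `dataset` means a better score
(def xor-score (TrainingSetScore. dataset))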

Notes

This project is no longer under active development.

License

Copyright © 2012 Dimitrios Piliouras

Distributed under the Eclipse Public License, the same as Clojure.

More Repositories

1. duratom - A durable atom type for Clojure (Clojure, 192 stars)
2. fudje - Unit testing library for Clojure (Clojure, 75 stars)
3. jedi-time - Datafiable/Navigable protocol extensions for the core java.time objects (Clojure, 33 stars)
4. clojure-encog - Clojure wrapper for Encog (v3) (Machine-Learning framework that specialises in neural-nets) - deprecated (Clojure, 25 stars)
5. clj-bom - BOM reading/writing for Clojure (Clojure, 18 stars)
6. cryptohash-clj - Cryptographic hashing facilities (pbkdf2/bcrypt/scrypt/argon2) for Clojure (Clojure, 16 stars)
7. clojuima - A demo/tutorial about working with UIMA from Clojure (Clojure, 12 stars)
8. Clondie24 - Blondie24 goes Functional (Clojure, 11 stars)
9. annotator-clj - A parallel, dictionary-based annotator for Text-mining & NLP-related tasks (Clojure, 9 stars)
10. clambda - Utilities for idiomatic consumption of Java Streams from Clojure, or Clojure seqs from Java (Clojure, 9 stars)
11. hotel-nlp - An NLP toolkit for Clojure. It is a hotel because it aims to provide a common roof for several foreign and potentially incompatible libraries. (Clojure, 9 stars)
12. clamda - Utilities for idiomatic consumption of Java8 streams from Clojure, or Clojure seqs from Java (Clojure, 6 stars)
13. rapio - Random Access Parallel IO (Clojure, 6 stars)
14. circuit-breaker-fn - Circuit-breaker primitives for Clojure (Clojure, 6 stars)
15. asynctopia - High-level core.async helpers (Clojure, 3 stars)
16. bites - A bite-sized Clojure library for converting things to/from bytes (Clojure, 3 stars)
17. DER - drug NER with Clojure and openNLP (3 stars)
18. flog - Painless (despite the name) logging (Clojure, 3 stars)
19. bankio - The canonical 'bank-accounts' example done right in Clojure (Clojure, 1 star)
20. treajure - Bike-shedding of various precious Clojure utilities/experiments (Clojure, 1 star)
21. MultiSnake - A rudimentary snake game adopted and extended from the book "Programming Clojure" (Clojure, 1 star)
22. ajenda - Clojure utilities for retrying/timeout-ing/scheduling side-effects (Clojure, 1 star)
23. circlecast - An immutable DB distributed via Hazelcast (Clojure, 1 star)
24. dblocks - Clojure macros for leveraging PostgreSQL advisory locks (Clojure, 1 star)