• Stars: 1
• Language: Jupyter Notebook
• Created over 5 years ago
• Updated over 5 years ago


Repository Details

Small school project for the CS Games 2019 qualification. An afternoon of coding; the data loading and validation code could be better.

More Repositories

1. LSTM-Human-Activity-Recognition (Jupyter Notebook, 3,278 stars)
   Human Activity Recognition example using TensorFlow on a smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
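
To make the description concrete, here is a minimal Keras sketch of such an LSTM classifier (not the repository's TensorFlow code); the 128-timestep, 9-channel window shape and the 6 classes are assumptions based on typical smartphone HAR data:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128, 9)),                 # (timesteps, sensor channels), assumed
    tf.keras.layers.LSTM(32, return_sequences=True),
    tf.keras.layers.LSTM(32),                       # last hidden state summarizes the window
    tf.keras.layers.Dense(6, activation="softmax"), # one probability per activity class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Dummy data just to show the expected array shapes.
X = np.random.randn(64, 128, 9).astype("float32")
y = np.random.randint(0, 6, size=(64,))
model.fit(X, y, epochs=1, batch_size=32)
```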

2. Awesome-Deep-Learning-Resources (1,631 stars)
   Rough list of my favorite deep learning resources, useful for revisiting topics or for reference. I have gone through all of the content listed there, carefully. - Guillaume Chevalier

3. seq2seq-signal-prediction (Jupyter Notebook, 1,078 stars)
   Signal forecasting with a Sequence-to-Sequence (seq2seq) Recurrent Neural Network (RNN) model in TensorFlow - Guillaume Chevalier
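
A rough Keras sketch of the idea (the repository itself uses lower-level TensorFlow): encode the past window with an LSTM, repeat its summary for each future step, and decode a multi-step forecast. The window lengths and the toy sine signal below are illustrative only:

```python
import numpy as np
import tensorflow as tf

past_steps, future_steps, n_features = 30, 10, 1   # assumed window sizes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(past_steps, n_features)),
    tf.keras.layers.LSTM(64),                          # encoder: compress the past window
    tf.keras.layers.RepeatVector(future_steps),        # feed the summary to every future step
    tf.keras.layers.LSTM(64, return_sequences=True),   # decoder: unroll the forecast
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")

# Toy signal: predict the next 10 points of a noisy sine wave from the previous 30.
t = np.arange(0, 200, 0.1)
signal = np.sin(t) + 0.1 * np.random.randn(len(t))
n = len(signal) - past_steps - future_steps
X = np.stack([signal[i:i + past_steps] for i in range(n)])[..., None]
Y = np.stack([signal[i + past_steps:i + past_steps + future_steps] for i in range(n)])[..., None]
model.fit(X, Y, epochs=1, batch_size=32)
```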

4. How-to-Grow-Neat-Software-Architecture-out-of-Jupyter-Notebooks (521 stars)
   Growing the code out of your notebooks - the right way.

5. HAR-stacked-residual-bidir-LSTMs (Python, 308 stars)
   Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we do Human Activity Recognition (HAR). Classifying the type of movement amongst 6 categories or 18 categories on 2 different datasets.
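
A hedged Keras sketch of the stacking pattern described, assuming 128-timestep, 9-channel windows and 6 classes; the real repository has its own TensorFlow implementation and hyperparameters:

```python
import tensorflow as tf

def residual_bidir_block(x, units):
    # One bidirectional LSTM layer plus a residual (skip) connection around it.
    y = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units, return_sequences=True))(x)
    return tf.keras.layers.Add()([x, y])   # requires x and y to have the same feature size

inputs = tf.keras.Input(shape=(128, 9))    # assumed: 128 timesteps x 9 sensor channels
x = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32, return_sequences=True))(inputs)  # -> 64 features
x = residual_bidir_block(x, 32)            # stacked residual bidirectional layers
x = residual_bidir_block(x, 32)
x = tf.keras.layers.GlobalAveragePooling1D()(x)   # summarize the sequence
outputs = tf.keras.layers.Dense(6, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```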

6. Spiking-Neural-Network-SNN-with-PyTorch-where-Backpropagation-engenders-STDP (Jupyter Notebook, 249 stars)
   What about coding a Spiking Neural Network using an automatic differentiation framework? In SNNs, there is a time axis and the neural network sees data throughout time; activations are instead spikes, emitted once a pre-activation threshold is crossed. Pre-activation values constantly fade if neurons aren't excited enough.
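
A minimal illustration of those dynamics with a leaky integrate-and-fire step in PyTorch (decay, threshold, spike, reset). This is not the repository's model; a trainable version would also need a surrogate gradient through the non-differentiable threshold:

```python
import torch

def lif_step(x, potential, decay=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire neuron layer.

    The membrane potential (pre-activation) decays at every step and only
    rises when inputs excite it; a spike is emitted once it crosses the
    threshold, after which the potential is reset by subtraction.
    """
    potential = decay * potential + x                 # leak + integrate
    spikes = (potential >= threshold).float()         # binary spikes past the threshold
    potential = potential - spikes * threshold        # soft reset after spiking
    return spikes, potential

# Run a small layer of 4 neurons over 10 time steps on random input currents.
potential = torch.zeros(4)
for t in range(10):
    x = torch.rand(4)                                 # input current at time t
    spikes, potential = lif_step(x, potential)
    print(t, spikes.tolist())
```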

7. Linear-Attention-Recurrent-Neural-Network (Jupyter Notebook, 142 stars)
   A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
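
A greatly simplified, hypothetical sketch of that idea in PyTorch: an LSTM cell whose cell state attends over a short window of its own past cell states. The real LARNN cell differs in its formulas and gating:

```python
import torch
import torch.nn as nn

class TinyLARNNCell(nn.Module):
    """Simplified sketch, not the actual LARNN: attend over past cell states."""

    def __init__(self, input_size, hidden_size, window=5, heads=4):
        super().__init__()
        self.lstm = nn.LSTMCell(input_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        h, c = self.lstm(x, state)
        past_cells = (past_cells + [c])[-self.window:]   # keep only the last `window` cell states
        memory = torch.stack(past_cells, dim=1)          # (batch, window, hidden)
        attended, _ = self.attn(c.unsqueeze(1), memory, memory)
        c = c + attended.squeeze(1)                      # mix the attended past into the cell state
        return h, c, past_cells

cell = TinyLARNNCell(input_size=8, hidden_size=32)
h, c, past = torch.zeros(2, 32), torch.zeros(2, 32), []
for t in range(10):                                      # used inside a loop, like any other RNN cell
    x = torch.randn(2, 8)
    h, c, past = cell(x, (h, c), past)
```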

8. Hyperopt-Keras-CNN-CIFAR-100 (Python, 106 stars)
   Auto-optimizing a neural net (and its architecture) on the CIFAR-100 dataset. Could be easily transferred to another dataset or another classification task.
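
The general Hyperopt pattern looks roughly like this; the search space and objective below are placeholders, not the repository's actual ones:

```python
from hyperopt import fmin, tpe, hp, Trials

# Hypothetical search space: the real repository tunes many more
# architecture and optimizer hyperparameters than this.
space = {
    "lr": hp.loguniform("lr", -10, -2),
    "n_filters": hp.choice("n_filters", [32, 64, 128]),
    "dropout": hp.uniform("dropout", 0.0, 0.5),
}

def objective(params):
    # Build and train a Keras CNN with `params`, then return the value to
    # minimize (e.g. validation loss). A constant stands in for it here.
    return 1.0

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=10, trials=trials)
print(best)
```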

9. GloVe-as-a-TensorFlow-Embedding-Layer (Jupyter Notebook, 90 stars)
   Taking a pretrained GloVe model and using it as a TensorFlow embedding weight layer **inside the GPU**. Therefore, you only need to send the indices of the words through the GPU data transfer bus, reducing data transfer overhead.
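
A TF2-style sketch of the trick (the original notebook targets an older TensorFlow): keep the whole GloVe matrix on the device and look rows up by index. The shapes and weights below are placeholders:

```python
import numpy as np
import tensorflow as tf

# Assume `glove` is a (vocab_size, embedding_dim) matrix parsed from a
# pretrained GloVe file; random values stand in for it here.
vocab_size, embedding_dim = 10_000, 100
glove = np.random.randn(vocab_size, embedding_dim).astype("float32")

# The whole embedding matrix lives on the device as a single variable, so
# only small integer word indices ever cross the host-to-GPU transfer bus.
embeddings = tf.Variable(glove, trainable=False, name="glove_embeddings")

word_ids = tf.constant([[1, 42, 7], [3, 3, 999]])        # token indices for 2 sentences
vectors = tf.nn.embedding_lookup(embeddings, word_ids)   # (2, 3, 100) float tensor
```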

10. filtering-stft-and-laplace-transform (Jupyter Notebook, 65 stars)
    Simple demo of filtering signal with an LPF and plotting its Short-Time Fourier Transform (STFT) and Laplace transform, in Python.
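
A small SciPy sketch of the low-pass filtering and STFT part (the Laplace transform plot is omitted); the sampling rate, cutoff, and test signal are made up:

```python
import numpy as np
from scipy import signal

fs = 1000.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)   # 5 Hz tone + 120 Hz "noise"

# Low-pass filter: 4th-order Butterworth with a 50 Hz cutoff, applied
# forward and backward so the result has no phase shift.
b, a = signal.butter(4, 50, btype="low", fs=fs)
x_filtered = signal.filtfilt(b, a, x)

# Short-Time Fourier Transform of the filtered signal.
f, tt, Zxx = signal.stft(x_filtered, fs=fs, nperseg=256)
print(Zxx.shape)    # (frequencies, time frames), complex-valued
```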

11. ReuBERT (Python, 43 stars)
    A question-answering chatbot, simply.

12. PyTorch-Dynamic-RNN-Attention-Decoder-Tree (Python, 31 stars)
    Code I wrote in under an hour to roughly draft how I would implement a Dynamic RNN Attention Decoder Tree with PyTorch.

13. LinkedIn-Connections-Growth-Analysis (Jupyter Notebook, 23 stars)
    Assessing personal growth on LinkedIn with charts. Plot LinkedIn connections over time. Discover what your connections do most and where they work most.
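
A hedged pandas sketch of the idea; the file name and the "Connected On", "Company", and "Position" column names are assumptions about LinkedIn's CSV export and may need adjusting:

```python
import pandas as pd

# Assumed layout of LinkedIn's exported Connections.csv.
df = pd.read_csv("Connections.csv")
df["Connected On"] = pd.to_datetime(df["Connected On"])

# Cumulative number of connections over time.
sorted_df = df.sort_values("Connected On")
growth = pd.Series(range(1, len(sorted_df) + 1), index=sorted_df["Connected On"])
growth.plot(title="LinkedIn connections over time")

# Where connections work and what they do, by frequency.
print(df["Company"].value_counts().head(10))
print(df["Position"].value_counts().head(10))
```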

14. SGNN-Self-Governing-Neural-Networks-Projection-Layer (Jupyter Notebook, 23 stars)
    Attempt at reproducing an SGNN's projection layer, but with word n-grams instead of skip-grams. Paper and more: http://aclweb.org/anthology/D18-1105
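
One way to sketch that projection layer with scikit-learn, hashing word n-grams and then randomly projecting them; this is an approximation of the idea, not the notebook's exact pipeline:

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.random_projection import SparseRandomProjection

sentences = ["the quick brown fox", "jumps over the lazy dog"]

# Hash word n-grams (the repository's variation; the SGNN paper works on
# skip-grams) into a fixed-size sparse feature space with no vocabulary...
hasher = HashingVectorizer(ngram_range=(1, 2), n_features=2**16, norm=None, alternate_sign=False)
hashed = hasher.transform(sentences)

# ...then randomly project them down to a small representation, which plays
# the role of the projection layer (no trained embedding matrix at all).
projector = SparseRandomProjection(n_components=128, random_state=0)
projected = projector.fit_transform(hashed)
print(projected.shape)    # (2, 128)
```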

15. Predict-if-salary-is-over-50k-with-Keras (Jupyter Notebook, 21 stars)
    Predict whether income exceeds $50K/yr based on census data from the "Adult" dataset, also known as the "Census Income" dataset.
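
A minimal Keras sketch of such a tabular classifier; the file name and column names (e.g. "income") are assumptions about the Adult/Census Income CSV:

```python
import pandas as pd
import tensorflow as tf

# Assumed: an adult.csv with the usual Census Income columns plus "income".
df = pd.read_csv("adult.csv")
y = (df["income"].str.strip() == ">50K").astype(int)
X = pd.get_dummies(df.drop(columns=["income"]))           # one-hot encode categorical columns

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(1, activation="sigmoid"),        # P(income > 50K)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X.values.astype("float32"), y.values, epochs=5, batch_size=128, validation_split=0.2)
```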

16. AI-Planning-Solver-Shakeys-World-PDDL (Python, 15 stars)
    Solving a planning problem (Shakey's World) with the FF and IPP planners, the PDDL language and some Python meta-programming to glue things together.

17. EDA-for-Cybersecurity-Intrusion-Detection-KDD-Cup-99 (Jupyter Notebook, 14 stars)

18. caffe-cifar-10-and-cifar-100-datasets-preprocessed-to-HDF5 (Python, 13 stars)
    Both deep learning datasets are preprocessed to the HDF5 format so they can be imported directly in Python with h5py, or converted with a Python script.
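
Reading such a preprocessed dataset with h5py looks like this; the file path and dataset keys ("data", "label") are assumptions to adapt to the actual files:

```python
import h5py

# Assumed file and dataset names; check the keys of your own HDF5 file first.
with h5py.File("cifar_100_caffe_hdf5/train.h5", "r") as f:
    print(list(f.keys()))          # e.g. ['data', 'label']
    images = f["data"][:]          # load the full array into memory as NumPy
    labels = f["label"][:]
print(images.shape, labels.shape)
```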

19. python-caffe-custom-cifar-100-conv-net (Jupyter Notebook, 10 stars)
    Custom convolutional neural network on the CIFAR-100 dataset for image classification. Images and their labels are processed to the HDF5 data format for use in Caffe.

20. python-conv-lib (Python, 7 stars)
    A lightweight library to do for-loop-styled convolution passes on your iterable objects (e.g., on a list). Note: this is not a convolution; it is about exposing what the kernel would pass over in your loops in the first place.
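
The idea in a few lines of plain Python (an illustration, not the library's actual API):

```python
def sliding_window(iterable, kernel_size, stride=1):
    """Yield the consecutive windows a 1D convolution kernel would see,
    without computing any convolution: just expose what the kernel passes over."""
    items = list(iterable)
    for start in range(0, len(items) - kernel_size + 1, stride):
        yield items[start:start + kernel_size]

for window in sliding_window([1, 2, 3, 4, 5, 6], kernel_size=3, stride=1):
    print(window)
# [1, 2, 3]
# [2, 3, 4]
# [3, 4, 5]
# [4, 5, 6]
```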

21. CS-Games-2018-Google-Challenge-TensorFlow (Jupyter Notebook, 5 stars)

22. SGNN-Transformer-Sentence-Model-SimilarityBXEnt (Jupyter Notebook, 3 stars)
    SGNN-Transformer Sentence Model trained by the paragraph-skip-gram-like SimilarityBXENT. Also see: https://github.com/guillaume-chevalier/SGNN-Self-Governing-Neural-Networks-Projection-Layer

23. guillaume-chevalier (2 stars)

24. dotfiles (Shell, 2 stars)

25. scikit-learn-digit-recognition (Python, 2 stars)
    Digit Recognition with scikit-learn's Bernoulli RBM and Logistic Classifier
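
A short scikit-learn sketch of that combination; the hyperparameters are illustrative, not necessarily the repository's:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0                                   # scale pixel values to [0, 1] for the Bernoulli RBM
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Unsupervised RBM feature extraction followed by a logistic classifier.
model = Pipeline([
    ("rbm", BernoulliRBM(n_components=100, learning_rate=0.06, n_iter=20, random_state=0)),
    ("logistic", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```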

26. CSGames-2019-AI (Python, 2 stars)
    Our team's solution for the Artificial Intelligence competition of the Computer Science Games 2019 (a programming contest). Deep Reinforcement Learning.

27. Wikipedia-XML-Markup-Code-to-Plain-Text-Parser-of-Hell (Jupyter Notebook, 1 star)
    Parsing a Wikipedia XML dump of all articles into lots of raw txt files, removing most of the wiki markup (not perfect: see issues first). For more info on wiki markup, see: https://en.wikipedia.org/wiki/Wikipedia:Tutorial/Formatting#Wiki_markup

28. Sentiment-Classification-and-Language-Detection (Jupyter Notebook, 1 star)