  • Stars: 317
  • Rank: 132,216 (Top 3%)
  • Language: Python
  • License: Apache License 2.0
  • Created: about 8 years ago
  • Updated: over 5 years ago


Repository Details

Collection of Keras models used for classification

Keras-Classification-Models

A set of models which allow easy creation of Keras models to be used for classification purposes. Also contains modules which offer implementations of recent papers.

NOTE

Since this readme is getting very large, I will post most of these projects on titu1994.github.io

Image Classification Models

Keras Octave Convolutions

Keras implementation of the Octave Convolution blocks from the paper Drop an Octave: Reducing Spatial Redundancy in Convolutional Neural Networks with Octave Convolution.
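
The core idea can be sketched with standard Keras layers: feature maps are split into a full-resolution high-frequency branch and a half-resolution low-frequency branch, with four convolution paths (H->H, H->L, L->H, L->L) exchanged via pooling and upsampling. The snippet below is an illustrative sketch of that block, not the repository's own API:

from keras.layers import Input, Conv2D, AveragePooling2D, UpSampling2D, add
from keras.models import Model

def octave_conv_sketch(x_high, x_low, filters, alpha=0.5, kernel_size=(3, 3)):
    # alpha is the fraction of channels assigned to the low-frequency branch
    low_ch = int(filters * alpha)
    high_ch = filters - low_ch
    # High-frequency outputs: H->H plus upsampled L->H
    h2h = Conv2D(high_ch, kernel_size, padding='same')(x_high)
    l2h = UpSampling2D()(Conv2D(high_ch, kernel_size, padding='same')(x_low))
    # Low-frequency outputs: L->L plus pooled H->L
    l2l = Conv2D(low_ch, kernel_size, padding='same')(x_low)
    h2l = Conv2D(low_ch, kernel_size, padding='same')(AveragePooling2D()(x_high))
    return add([h2h, l2h]), add([l2l, h2l])

ip_high = Input(shape=(32, 32, 16))  # full-resolution branch
ip_low = Input(shape=(16, 16, 16))   # half-resolution branch
out_high, out_low = octave_conv_sketch(ip_high, ip_low, filters=32)
model = Model([ip_high, ip_low], [out_high, out_low])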


Sparse Neural Networks (SparseNets) in Keras

An implementation of "SparseNets" from the paper Sparsely Connected Convolutional Networks in Keras 2.0+.

SparseNets are a modification of DenseNet and its dense connectivity pattern to reduce memory requirements drastically while still having similar or better performance.


Non-Local Neural Networks in Keras

Keras implementation of Non-local blocks from the paper "Non-local Neural Networks".

  • Support for "Gaussian", "Embedded Gaussian" and "Dot" instantiations of the Non-Local block.
  • Support for shielded computation mode (reduces computation by 4x)
  • Support for "Concatenation" instantiation will be supported when authors release their code.

Available at : Non-Local Neural Networks in Keras
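
A minimal usage sketch, assuming the repository exposes a non_local_block helper in a non_local module (the import path and argument names here are assumptions, not the confirmed API):

from keras.layers import Input, Conv2D
from keras.models import Model
from non_local import non_local_block  # hypothetical import path

ip = Input(shape=(32, 32, 64))
x = Conv2D(64, (3, 3), padding='same')(ip)
x = non_local_block(x, mode='embedded', compression=2)  # "Embedded Gaussian" instantiation
model = Model(ip, x)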


Neural Architecture Search Net (NASNet) in Keras

An implementation of "NASNet" models from the paper Learning Transferable Architectures for Scalable Image Recognitio in Keras 2.0+.

Supports building NASNet Large (6 @ 4032), NASNet Mobile (4 @ 1056) and custom NASNets.

Available at : Neural Architecture Search Net (NASNet) in Keras
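
A minimal sketch, assuming the script exposes NASNetLarge and NASNetMobile builders similar to the later keras.applications models (the import path and arguments are assumptions):

from nasnet import NASNetLarge, NASNetMobile  # hypothetical import path

large = NASNetLarge(input_shape=(331, 331, 3), classes=1000)    # NASNet Large (6 @ 4032)
mobile = NASNetMobile(input_shape=(224, 224, 3), classes=1000)  # NASNet Mobile (4 @ 1056)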


Squeeze and Excite Networks in Keras

Implementation of Squeeze and Excite networks in Keras. Currently supports ResNet and Inception v3 models; support for Inception v4 and Inception-ResNet-v2 will be added once the paper is released.

Available at : Squeeze and Excite Networks in Keras
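
The squeeze-and-excite operation itself can be sketched with standard Keras layers: global average pooling produces a channel descriptor, a small two-layer bottleneck learns per-channel gates, and the input is rescaled by those gates. This is an illustrative sketch, not the repository's own code:

from keras import backend as K
from keras.layers import Input, Conv2D, GlobalAveragePooling2D, Dense, Reshape, multiply
from keras.models import Model

def squeeze_excite_block(x, ratio=16):
    filters = K.int_shape(x)[-1]
    se = GlobalAveragePooling2D()(x)                      # squeeze: global channel descriptor
    se = Dense(filters // ratio, activation='relu')(se)   # bottleneck
    se = Dense(filters, activation='sigmoid')(se)         # per-channel gates
    se = Reshape((1, 1, filters))(se)
    return multiply([x, se])                              # excite: rescale the channels

ip = Input(shape=(32, 32, 64))
x = Conv2D(64, (3, 3), padding='same')(ip)
x = squeeze_excite_block(x)
model = Model(ip, x)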


Dual Path Networks in Keras

Implementation of Dual Path Networks, which combine the grouped convolutions of ResNeXt with the dense connections of DenseNet into two paths.

Available at : Dual Path Networks in Keras
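
A minimal sketch, assuming the repository provides preset builders such as DPN92 (the import path and builder names are assumptions):

from dual_path_network import DPN92  # hypothetical import path

model = DPN92(input_shape=(224, 224, 3), classes=1000)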


MobileNets in Keras

Implementation of MobileNet models from the paper MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications in Keras 2.0+.

Contains code for building the MobileNet model (optimized for datasets similar to ImageNet) and weights for the model trained on ImageNet.

Also contains MobileNet V2 model implementations + weights.

Available at : MobileNets in Keras
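
A minimal sketch, assuming a MobileNets builder with a width multiplier alpha as described in the paper (the import path and argument names are assumptions):

from mobilenets import MobileNets  # hypothetical import path

model = MobileNets(input_shape=(224, 224, 3), alpha=1.0, classes=1000)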


ResNeXt in Keras

Implementation of ResNeXt models from the paper Aggregated Residual Transformations for Deep Neural Networks in Keras 2.0+.

Contains code for building the general ResNeXt model (optimized for datasets similar to CIFAR) and ResNeXtImageNet (optimized for the ImageNet dataset).

Available at : ResNeXt in Keras
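
A minimal sketch, assuming ResNext and ResNextImageNet builders as described above (the import path and argument names are assumptions):

from resnext import ResNext, ResNextImageNet  # hypothetical import path

cifar_model = ResNext(input_shape=(32, 32, 3), depth=29, cardinality=8, classes=10)
imagenet_model = ResNextImageNet(input_shape=(224, 224, 3), classes=1000)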


Inception v4 in Keras

Implementations of the Inception-v4, Inception-ResNet-v1 and Inception-ResNet-v2 architectures in Keras using the Functional API. The paper on these architectures is available at "Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning".

The models are plotted and shown in the architecture sub-folder. Due to the lack of suitable training data (the ILSVRC 2015 dataset) and limited GPU processing power, the weights are not provided.

Contains : Inception v4, Inception-ResNet-v1 and Inception-ResNet-v2

Available at : Inception v4 in Keras


Wide Residual Networks in Keras

Implementation of Wide Residual Networks from the paper Wide Residual Networks

Usage

It can be used by importing the wide_residual_network script and calling the create_wide_residual_network() method. There are several parameters which can be changed to increase the depth or width of the network.

Note that the number of layers can be calculated by the formula : nb_layers = 4 + 6 * N

import wide_residual_network as wrn
from keras.layers import Input
from keras.models import Model

ip = Input(shape=(3, 32, 32)) # For CIFAR 10 (channels-first ordering)

# N = 4, k = 10 builds WRN-28-10
wrn_28_10 = wrn.create_wide_residual_network(ip, nb_classes=10, N=4, k=10, dropout=0.0, verbose=1)

model = Model(ip, wrn_28_10)

Contains weights for WRN-16-8 and WRN-28-8 models trained on the CIFAR-10 Dataset.

Available at : Wide Residual Network in Keras


DenseNet in Keras

Implementation of DenseNet from the paper Densely Connected Convolutional Networks.

Usage

  1. Run the cifar10.py script to train the DenseNet 40 model
  2. Comment out the model.fit_generator(...) line and uncomment the model.load_weights("weights/DenseNet-40-12-CIFAR10.h5") line to test the classification accuracy.

Contains weights for DenseNet-40-12 and DenseNet-Fast-40-12, trained on CIFAR 10.

Available at : DenseNet in Keras


Residual Networks of Residual Networks in Keras

Implementation of the paper "Residual Networks of Residual Networks: Multilevel Residual Networks"

Usage

To create RoR ResNet models, use the ror.py script :

import ror
from keras import backend as K

input_dim = (3, 32, 32) if K.image_dim_ordering() == 'th' else (32, 32, 3)
model = ror.create_residual_of_residual(input_dim, nb_classes=100, N=2, dropout=0.0) # creates RoR-3-110 (ResNet)

To create RoR Wide Residual Network models, use the ror_wrn.py script :

import ror_wrn as ror
from keras import backend as K

input_dim = (3, 32, 32) if K.image_dim_ordering() == 'th' else (32, 32, 3)
model = ror.create_pre_residual_of_residual(input_dim, nb_classes=100, N=6, k=2, dropout=0.0) # creates RoR-3-WRN-40-2 (WRN)

Contains weights for RoR-3-WRN-40-2 trained on CIFAR 10

Available at : Residual Networks of Residual Networks in Keras


Neural Architecture Search

Sequential Halving and Classification

PySHAC is a Python library for using the Sequential Halving and Classification algorithm from the paper Parallel Architecture and Hyperparameter Search via Successive Halving and Classification with ease.

Available at : Sequential Halving and Classification
Documentation available at : PySHAC Documentation


Progressive Neural Architecture Search in Keras

Basic implementation of the Encoder RNN from the paper "Progressive Neural Architecture Search" (https://arxiv.org/abs/1712.00559), which is an improvement over the original Neural Architecture Search paper since it requires far less time and resources.

  • Uses Keras to define and train children / generated networks, which are defined in Tensorflow by the Encoder RNN.
  • Define a state space by using StateSpace, a manager which adds states and handles communication between the Encoder RNN and the user. Submit custom operations and parse locally as required.
  • The Encoder RNN is trained using a modified Sequential Model-Based Optimization algorithm from the paper, with some stability modifications to prevent extreme variance during training from causing training failures.
  • NetworkManager handles the training and reward computation of a Keras model.

Available at : Progressive Neural Architecture Search in Keras


Neural Architecture Search in Keras

Basic implementation of the Controller RNN from the papers "Neural Architecture Search with Reinforcement Learning" and "Learning Transferable Architectures for Scalable Image Recognition".

  • Uses Keras to define and train children / generated networks, which are defined in Tensorflow by the Controller RNN.
  • Define a state space by using StateSpace, a manager which adds states and handles communication between the Controller RNN and the user.
  • Reinforce manages the training and evaluation of the Controller RNN
  • NetworkManager handles the training and reward computation of a Keras model

Available at : Neural Architecture Search in Keras


Keras Segmentation Models

A set of models which allow easy creation of Keras models to be used for segmentation tasks.

Fully Connected DenseNets for Semantic Segmentation

Implementation of the paper The One Hundred Layers Tiramisu : Fully Convolutional DenseNets for Semantic Segmentation

Usage

Simply import the densenet_fc.py script and call the create method:

import densenet_fc as dc

model = dc.create_fc_dense_net(img_dim=(3, 224, 224), nb_dense_block=5, growth_rate=12,
                               nb_filter=16, nb_layers=4)

Keras Recurrent Neural Networks

A set of scripts which can be used to add custom Recurrent Neural Networks to Keras.


Neural Arithmetic Logic Units

A Keras implementation of the Neural Arithmetic and Logic Unit from the paper Neural Arithmetic Logic Units by Andrew Trask, Felix Hill, Scott Reed, Jack Rae, Chris Dyer, Phil Blunsom.

  • Contains the layers for the Neural Arithmetic Logic Unit (NALU) and Neural Accumulator (NAC); see the usage sketch below.
  • Also contains the results of the static function learning toy tests.
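
A minimal usage sketch, assuming the script exports NALU and NAC as Dense-like Keras layers (the import path and signatures are assumptions):

from keras.layers import Input
from keras.models import Model
from nalu import NALU, NAC  # hypothetical import path

ip = Input(shape=(2,))   # e.g. two operands of an arithmetic toy task
x = NAC(8)(ip)           # Neural Accumulator layer
x = NALU(1)(x)           # Neural Arithmetic Logic Unit producing the result
model = Model(ip, x)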

Chrono Initializer, Chrono LSTM and JANET

Keras implementation of the paper The unreasonable effectiveness of the forget gate and the Chrono initializer and Chrono LSTM from the paper Can Recurrent Neural Networks Warp Time?.

This model utilizes just 2 of the 4 gates in a regular LSTM RNN - the forget (f) and context (c) gates - and uses Chrono Initialization to achieve better performance than regular LSTMs while using fewer parameters and a less complicated gating structure.

Usage

Simply import the janet.py file into your repo and use the JANET layer.

It is not advisable to use the JANETCell directly wrapped in an RNN layer, as this does not allow the max-timesteps calculation that is needed for proper training with the Chrono Initializer for the forget gate.

The chrono_lstm.py script contains the ChronoLSTM model, as it requires minimal modifications to the original LSTM layer to use the ChronoInitializer for the forget and input gates.

The same usage restrictions apply as for the JANET layer: use the ChronoLSTM layer directly instead of wrapping the ChronoLSTMCell in an RNN layer.

from janet import JANET
from chrono_lstm import ChronoLSTM

...

To use just the ChronoInitializer, import the chrono_initializer.py script.
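
A minimal usage sketch, assuming the JANET layer follows the standard Keras recurrent-layer call signature:

from keras.layers import Input
from keras.models import Model
from janet import JANET

ip = Input(shape=(100, 10))                # (timesteps, input_dim)
x = JANET(64, return_sequences=False)(ip)  # drop-in for an LSTM layer
model = Model(ip, x)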


Independently Recurrent Neural Networks (IndRNN)

Implementation of the paper Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN for Keras 2.0+. IndRNN is a recurrent unit that can run over extremely long time sequences and is able to learn the adding problem over 5000 timesteps, where most other models fail.

Usage

Usage of IndRNNCells

from ind_rnn import IndRNNCell, RNN

cells = [IndRNNCell(128), IndRNNCell(128)]
ip = Input(...)
x = RNN(cells)(ip)
...

Usage of IndRNN layer

from ind_rnn import IndRNN

ip = Input(...)
x = IndRNN(128)(ip)
...

Simple Recurrent Unit (SRU)

Implementation of the paper Training RNNs as Fast as CNNs for Keras 2.0+. SRU is a recurrent unit that, when implemented with a custom CUDA kernel, can run over 10 times faster than the cuDNN LSTM without loss of accuracy on many tasks.

This is a naive implementation with some speed gains over generic LSTM cells; however, its speed is not yet 10x that of cuDNN LSTMs.
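
A minimal usage sketch, assuming an SRU layer that follows the standard Keras recurrent-layer signature (the import path is an assumption):

from keras.layers import Input
from keras.models import Model
from sru import SRU  # hypothetical import path

ip = Input(shape=(100, 10))              # (timesteps, input_dim)
x = SRU(128, return_sequences=False)(ip)
model = Model(ip, x)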


Multiplicative LSTM

Implementation of the paper Multiplicative LSTM for sequence modelling for Keras 2.0+. Multiplicative LSTMs have been shown to achieve state-of-the-art or close to SotA results for sequence modelling datasets. They also perform better than stacked LSTM models for the Hutter-prize dataset and the raw wikipedia dataset.

Usage

Add the multiplicative_lstm.py script into your repository, and import the MultiplicativeLSTM layer.

Eg. You can replace Keras LSTM layers with MultiplicativeLSTM layers.

from multiplicative_lstm import MultiplicativeLSTM
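
A minimal usage sketch, assuming MultiplicativeLSTM is a drop-in replacement for keras.layers.LSTM with the same call signature:

from keras.layers import Input
from keras.models import Model
from multiplicative_lstm import MultiplicativeLSTM

ip = Input(shape=(100, 10))      # (timesteps, input_dim)
x = MultiplicativeLSTM(128)(ip)  # used in place of LSTM(128)
model = Model(ip, x)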

Minimal RNN

Implementation of the paper MinimalRNN: Toward More Interpretable and Trainable Recurrent Neural Networks for Keras 2.0+. Minimal RNNs are a recurrent neural network architecture that achieves performance comparable to popular gated RNNs with a simplified structure. They employ minimal updates within the RNN, which not only leads to efficient learning and testing but, more importantly, to better interpretability and trainability.

Usage

Import minimal_rnn.py and use either the MinimalRNNCell or MinimalRNN layer

from minimal_rnn import MinimalRNN  # imports the layer rather than the cell

ip = Input(...)  # rank-3 input shape (batch, timesteps, features)
x = MinimalRNN(units=128)(ip)
...

Nested LSTM

Implementation of the paper Nested LSTMs for Keras 2.0+. Nested LSTMs add depth to LSTMs via nesting as opposed to stacking. The value of a memory cell in an NLSTM is computed by an LSTM cell, which has its own inner memory cell. In the paper's experiments on various character-level language modeling tasks, Nested LSTMs outperform both stacked and single-layer LSTMs with similar numbers of parameters, and the inner memories learn longer-term dependencies than the higher-level units of a stacked LSTM.

Usage

from nested_lstm import NestedLSTM

ip = Input(shape=(nb_timesteps, input_dim))
x = NestedLSTM(units=64, depth=2)(ip)
...

Keras Modules

A set of scripts which can be used to add advanced functionality to Keras.


Switchable Normalization for Keras

Switchable Normalization is a normalization technique that is able to learn different normalization operations for different normalization layers in a deep neural network in an end-to-end manner.

Keras port of the implementation of the paper Differentiable Learning-to-Normalize via Switchable Normalization.

Code ported from the switchnorm official repository.

Note

This only implements the moving-average version of the batch normalization component from the paper. The batch-average technique cannot be easily implemented in Keras as a layer, and is therefore not supported.

Usage

Simply import switchnorm.py and replace BatchNormalization layer with this layer.

from switchnorm import SwitchNormalization

ip = Input(...)
...
x = SwitchNormalization(axis=-1)(x)
...

Group Normalization for Keras

A Keras implementation of Group Normalization by Yuxin Wu and Kaiming He.

Useful for fine-tuning large models with smaller batch sizes than in the research setting (where the batch size is very large due to multiple GPUs). Similar to Batch Renormalization, but performs significantly better on ImageNet.

GN is independent of batch size, which is crucial for fine-tuning large models that cannot be retrained with small batch sizes, due to Batch Normalization's dependence on large batch sizes to compute per-batch statistics and update its moving-average parameters properly.

Usage

Drop-in replacement for BatchNormalization layers from Keras. The important parameter that differs from BatchNormalization is called groups. It must be set appropriately, subject to the following constraints:

  1. It must be an integer by which the number of channels is divisible.
  2. 1 <= G <= #channels, where #channels is the number of channels in the incoming layer.

from keras.layers import Input
from group_norm import GroupNormalization

ip = Input(shape=(...))
x = GroupNormalization(groups=32, axis=-1)(ip)
...

Normalized Optimizers for Keras

Keras wrapper class for Normalized Gradient Descent from kmkolasinski/max-normed-optimizer, which can be applied to almost all Keras optimizers.

Partially implements Block-Normalized Gradient Method: An Empirical Study for Training Deep Neural Network for all base Keras optimizers, and allows flexibility to choose any normalizing function. It does not implement adaptive learning rates however.

Usage

from keras.optimizers import Adam, SGD
from optimizer import NormalizedOptimizer

sgd = SGD(0.01, momentum=0.9, nesterov=True)
sgd = NormalizedOptimizer(sgd, normalization='l2')

adam = Adam(0.001)
adam = NormalizedOptimizer(adam, normalization='l2')
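
The wrapped optimizer is then passed to model.compile like any other Keras optimizer; a minimal sketch continuing from the snippet above:

from keras.layers import Input, Dense
from keras.models import Model

ip = Input(shape=(10,))
out = Dense(2, activation='softmax')(ip)
model = Model(ip, out)

model.compile(optimizer=sgd, loss='categorical_crossentropy', metrics=['accuracy'])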

Tensorflow Eager with Keras APIs

A set of example notebooks and scripts which detail the usage and pitfalls of Eager Execution Mode in Tensorflow using Keras high level APIs.


One Cycle Learning Rate Policy for Keras

Implementation of One-Cycle Learning rate policy from the papers by Leslie N. Smith.


Batch Renormalization

Batch Renormalization algorithm implementation in Keras 1.2.1. Original paper by Sergey Ioffe, Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models.

Usage

Add the batch_renorm.py script into your repository, and import the BatchRenormalization layer.

Eg. You can replace Keras BatchNormalization layers with BatchRenormalization layers.

from batch_renorm import BatchRenormalization
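
A minimal usage sketch, assuming BatchRenormalization is a drop-in replacement for keras.layers.BatchNormalization:

from keras.layers import Input, Conv2D
from keras.models import Model
from batch_renorm import BatchRenormalization

ip = Input(shape=(32, 32, 3))
x = Conv2D(16, (3, 3), padding='same')(ip)
x = BatchRenormalization()(x)  # in place of BatchNormalization()
model = Model(ip, x)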

Snapshot Ensembles in Keras

Implementation of the paper Snapshot Ensembles

Usage

The technique is simple to implement in Keras, using a custom callback. These callbacks can be built using the SnapshotCallbackBuilder class in snapshot.py. Other models can simply use this callback builder to train themselves in a similar manner.
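
A minimal sketch of building the snapshot callbacks (the constructor and method names below are assumptions about snapshot.py, not the confirmed API):

from snapshot import SnapshotCallbackBuilder  # hypothetical import path

nb_epochs, nb_snapshots, init_lr = 200, 5, 0.1
builder = SnapshotCallbackBuilder(nb_epochs, nb_snapshots, init_lr)
callbacks = builder.get_callbacks(model_prefix='WRN-16-4')  # hypothetical method

# model.fit(x_train, y_train, epochs=nb_epochs, callbacks=callbacks)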

  1. Download the 6 WRN-16-4 weights that are provided in the Release tab of the project and place them in the weights directory
  2. Run the train_cifar_10.py script to train the WRN-16-4 model on CIFAR-10 dataset (not required since weights are provided)
  3. Run the predict_cifar_10.py script to make an ensemble prediction.

Contains weights for WRN-CIFAR100-16-4 and WRN-CIFAR10-16-4 (snapshot ensemble weights - ranging from 1-5 and including single best model)

Available at : Snapshot Ensembles in Keras


More Repositories

1

Neural-Style-Transfer

Keras Implementation of Neural Style Transfer from the paper "A Neural Algorithm of Artistic Style" (http://arxiv.org/abs/1508.06576) in Keras 2.0+
Jupyter Notebook
2,271
star
2

Image-Super-Resolution

Implementation of Super Resolution CNN in Keras.
Python
832
star
3

neural-image-assessment

Implementation of NIMA: Neural Image Assessment in Keras
Python
780
star
4

LSTM-FCN

Codebase for the paper LSTM Fully Convolutional Networks for Time Series Classification
Python
755
star
5

DenseNet

DenseNet implementation in Keras
Python
706
star
6

MLSTM-FCN

Multivariate LSTM Fully Convolutional Networks for Time Series Classification
Python
490
star
7

neural-architecture-search

Basic implementation of [Neural Architecture Search with Reinforcement Learning](https://arxiv.org/abs/1611.01578).
Python
431
star
8

keras-squeeze-excite-network

Implementation of Squeeze and Excitation Networks in Keras
Python
400
star
9

Inception-v4

Inception-v4, Inception - Resnet-v1 and v2 Architectures in Keras
Python
385
star
10

Snapshot-Ensembles

Snapshot Ensemble in Keras
Python
305
star
11

keras-non-local-nets

Keras implementation of Non-local Neural Networks
Python
290
star
12

keras-one-cycle

Implementation of One-Cycle Learning rate policy (adapted from Fast.ai lib)
Python
285
star
13

Super-Resolution-using-Generative-Adversarial-Networks

An implementation of SRGAN model in Keras
Python
283
star
14

tf-TabNet

A Tensorflow 2.0 implementation of TabNet.
Python
238
star
15

Keras-ResNeXt

Implementation of ResNeXt models from the paper Aggregated Residual Transformations for Deep Neural Networks in Keras 2.0+.
Python
224
star
16

tfdiffeq

Tensorflow implementation of Ordinary Differential Equation Solvers with full GPU support
Python
218
star
17

Keras-NASNet

"NASNet" models in Keras 2.0+ with weights
Python
200
star
18

keras-efficientnets

Keras Implementation of EfficientNets
Python
187
star
19

tf_SIREN

Tensorflow 2.0 implementation of Sinusoidal Representation Networks (SIREN)
Python
149
star
20

keras-coordconv

Keras implementation of CoordConv for all Convolution layers
Python
148
star
21

MobileNetworks

Keras implementation of Mobile Networks
Python
132
star
22

keras-adabound

Keras implementation of AdaBound
Python
130
star
23

progressive-neural-architecture-search

Implementation of Progressive Neural Architecture Search in Keras and Tensorflow
Python
120
star
24

keras-attention-augmented-convs

Keras implementation of Attention Augmented Convolutional Neural Networks
Python
120
star
25

Keras-DualPathNetworks

Dual Path Networks for Keras 2.0+
Python
114
star
26

Wide-Residual-Networks

Wide Residual Networks in Keras
Python
112
star
27

Fast-Neural-Style

Implementation of "Perceptual Losses for Real-Time Style Transfer and Super-Resolution" in Keras
Python
109
star
28

Keras-Group-Normalization

A Keras implementation of https://arxiv.org/abs/1803.08494
Python
103
star
29

BatchRenormalization

Batch Renormalization algorithm implementation in Keras
Python
98
star
30

Nested-LSTM

Keras implementation of Nested LSTMs
Python
90
star
31

keras-SRU

Implementation of Simple Recurrent Unit in Keras
Python
89
star
32

Fully-Connected-DenseNets-Semantic-Segmentation

Fully Connected DenseNet for Image Segmentation (https://arxiv.org/pdf/1611.09326v1.pdf)
Python
84
star
33

keras-LAMB-Optimizer

Implementation of the LAMB optimizer for Keras from the paper "Reducing BERT Pre-Training Time from 3 Days to 76 Minutes"
Python
76
star
34

tf-eager-examples

A set of simple examples ported from PyTorch for Tensorflow Eager Execution
Jupyter Notebook
73
star
35

keras_rectified_adam

Implementation of Rectified Adam in Keras
Python
69
star
36

Keras-IndRNN

Implementation of IndRNN in Keras
Python
67
star
37

LSTM-FCN-Ablation

Repository for the ablation study of "Long Short-Term Memory Fully Convolutional Networks for Time Series Classification"
Python
55
star
38

keras-octconv

Keras implementation of Octave Convolutions
Python
53
star
39

keras-global-context-networks

Keras implementation of Global Context Attention blocks
Python
46
star
40

Neural-Style-Transfer-Windows

Windows Form application written in C# to ease usage of neural style transfer script
Python
43
star
41

tf_fourier_features

Tensorflow 2.0 implementation of Fourier Feature Mapping Networks.
Python
42
star
42

Keras-Multiplicative-LSTM

Multiplicative LSTM for Keras 2.0+
Python
42
star
43

keras_mixnets

Keras Implementation of MixNets: Mixed Depthwise Convolutions
Python
39
star
44

Keras-just-another-network-JANET

Keras implementation of [The unreasonable effectiveness of the forget gate](https://arxiv.org/abs/1804.04849)
Jupyter Notebook
35
star
45

keras-switchnorm

Switch Normalization implementation for Keras 2+
Python
30
star
46

keras-neural-alu

A Keras implementation of the Neural Arithmetic and Logic Unit
Python
27
star
47

keras-mobile-colorizer

U-Net Model conditioned with MobileNet features for Grayscale -> Color mapping
Python
25
star
48

Deep-Columnar-Convolutional-Neural-Network

Deep Columnar Convolutional Neural Network architecture, which is based on Multi Columnar DNN (Ciresan 2012).
Python
24
star
49

keras-SparseNet

Keras Implementation of SparseNets
Python
23
star
50

Residual-of-Residual-Networks

Residual Network of Residual Networks in Keras
Python
22
star
51

pyshac

A Python library for the Sequential Halving and Classification algorithm
Python
21
star
52

keras_novograd

Keras implementation of NovoGrad
Python
20
star
53

Adversarial-Attacks-Time-Series

Codebase for the paper "Adversarial Attacks on Time Series"
Python
20
star
54

simple_diffusion

Simple notebooks to learn diffusion models on toy datasets
Jupyter Notebook
17
star
55

keras-normalized-optimizers

Wrapper for Normalized Gradient Descent in Keras
Jupyter Notebook
17
star
56

keras-padam

Keras implementation of Padam from "Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks"
Python
17
star
57

pytorch_odegan

Partial implementation of ODE-GAN technique from the paper Training Generative Adversarial Networks by Solving Ordinary Differential Equations
Python
16
star
58

tf-sha-rnn

Tensorflow port implementation of Single Headed Attention RNN
Python
16
star
59

warprnnt_numba

WarpRNNT loss ported in Numba CPU/CUDA for Pytorch
Jupyter Notebook
16
star
60

Advanced_Machine_Learning

Python
16
star
61

dtw-numba

Implementation of Dynamic Time Warping algorithm with speed improvements based on Numba.
Python
16
star
62

keras-minimal-rnn

Keras implementation of MinimalRNN: Toward More Interpretable and Trainable Recurrent Neural Networks
Python
16
star
63

TweetSentimentAnalysis

CS583 course project
Python
14
star
64

lambda_networks_pt

Lambda Networks implemented in PyTorch
Python
13
star
65

tf_GON

Tensorflow 2.x implementation of Gradient Origin Networks
Python
13
star
66

tf_neural_deconvolution

Neural Deconvolutions in Tensorflow
Python
12
star
67

Python-Work

Python scripts to facilitate easy working
Jupyter Notebook
11
star
68

PyCTakesParser

Utilities to parse the output of cTAKES
Python
10
star
69

tf_star_rnn

Tensorflow 2.0 implementation of STAR RNN
Python
10
star
70

Deep-Dream

Deep Dream implementation in Keras
Python
9
star
71

Kaggle

Kaggle competition library. Uses Python 3.4.1 with almost all known python libraries for Machine Learning
Python
7
star
72

Music-Recognition

C# project to perform Frequency Analysis of music
C#
5
star
73

Rabin-Karp-String-Matching

C
4
star
74

Data-Science

Library of Data Science classes
Python
3
star
75

diffusion_model_nemo

Python
3
star
76

Ragial-Searcher

The Core Java library used to parse and store Ragial.com data
HTML
3
star
77

MSApriori

Multiple support apriori algorithm in Java
Java
3
star
78

RagialNotifier

Android App to parse ragial.com using the Ragial Searcher library to track items and notify the user if the item is on sale. Developed for the game Ragnarok Online, developed and owned by Gravity Inc.
Java
3
star
79

IDS-Course-Project

Intro to Data Science Project
Python
2
star
80

ML-Tools

Python
2
star
81

braindrain-uncommonhacks

JavaScript
2
star
82

Tiger-Game

Tiger Game in Python 2.7 / 3.4+
Python
2
star
83

8086-Microprocessor

An attempt to emulate an 8086 microprocessor, with its ASM instruction set.
Java
2
star
84

titu1994.github.io

HTML
2
star
85

Adaptive-Sorting-Algorithm

Analysis and implementation of Machine Learning Decision Tree to classify best algorithm for given data set
C#
2
star
86

Optimal-Binary-Search-Tree

C
2
star
87

Naive-String-Matching

C
2
star
88

Recurstion-C

Recursion in C
C
2
star
89

Java-Adaptive-Sorting-Algorithm

Adaptive Sorting Algorithm using Decision Trees to decide which algorithm will be optimal to sort a given dataset.
Java
2
star
90

Quick-Sort

Quick Sort in Java
1
star
91

Rate-Monotonic-Scheduling-Algorithm

Java
1
star
92

WT-Mini-Project

CSS
1
star
93

Kruskals-Algorithm

C
1
star
94

Stack

Stack
C
1
star
95

Doublu-Linked-List

Doubly Linked List
C
1
star
96

CircularLinkedList

Circular Linked List in C
C
1
star
97

Knuth-Morris-Pratt

C
1
star
98

MyLib

1
star
99

Polynomial-Linked-List

Polynomial Linked List
C
1
star
100

SOOAD-Mini-Project

Java
1
star