TsetlinMachine
Code and datasets for the Tsetlin Machine
Fire-Detection-Image-Dataset
This dataset contains normal images and images with fire. It is highly unbalanced to reflect real-world situations. It covers a variety of scenarios and different fire situations (intensity, luminosity, size, environment, etc.).
deep-rts
A Real-Time-Strategy game for Deep Learning research
pyTsetlinMachine
Implements the Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, Weighted Tsetlin Machine, and Embedding Tsetlin Machine, with support for continuous features, multigranularity, clause indexing, and literal budget.
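A minimal usage sketch for pyTsetlinMachine follows, assuming the MultiClassTsetlinMachine class from pyTsetlinMachine.tm with its (number_of_clauses, T, s) constructor; the tiny XOR dataset and the chosen hyperparameters are illustrative, not taken from the repository.

```python
# Minimal sketch, assuming pyTsetlinMachine is installed; the XOR data and
# hyperparameters (10 clauses, T=15, s=3.9) are illustrative choices.
import numpy as np
from pyTsetlinMachine.tm import MultiClassTsetlinMachine

# Tiny binary dataset: 2-bit XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.uint32)
Y = np.array([0, 1, 1, 0], dtype=np.uint32)

tm = MultiClassTsetlinMachine(10, 15, 3.9)  # (number_of_clauses, T, s)
tm.fit(X, Y, epochs=200)

print(tm.predict(X))  # expected: [0 1 1 0]
```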
tmu
Implements the Tsetlin Machine, Coalesced Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, and Weighted Tsetlin Machine, with support for continuous features, drop clause, Type III Feedback, focused negative sampling, multi-task classifier, autoencoder, literal budget, and one-vs-one multi-class classifier. TMU is written in Python with wrappers for C- and CUDA-based clause evaluation and updating.
pyVNC
VNC Client Library for Python
fast-tsetlin-machine-with-mnist-demo
A fast Tsetlin Machine implementation employing bit-wise operators, with MNIST demo.
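As an illustration of the bit-wise approach (a sketch, not code from the repository), a Tsetlin Machine clause over binary features can be evaluated with a couple of bitwise operations once its included literals are packed into integer masks:

```python
# Illustrative sketch of bit-packed clause evaluation. A clause is a conjunction
# of literals; the included positive and negated literals are stored as integer
# bitmasks over the input features.

def clause_output(pos_include: int, neg_include: int, x: int) -> int:
    """Return 1 if the clause fires on the bit-packed input x, else 0.

    pos_include: bitmask of features that must be 1
    neg_include: bitmask of features that must be 0
    """
    # Every included positive literal must be set in x, and every included
    # negated literal must be clear in x -- both checks reduce to bitwise ops.
    return int((pos_include & ~x) == 0 and (neg_include & x) == 0)

# Example: clause "x0 AND NOT x2" over 3 features (bit 0 = x0, bit 2 = x2).
print(clause_output(0b001, 0b100, 0b011))  # x = 011 -> clause fires (1)
print(clause_output(0b001, 0b100, 0b101))  # x = 101 -> blocked by NOT x2 (0)
```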
convolutional-tsetlin-machine-tutorial
Tutorial on the Convolutional Tsetlin Machine
TextUnderstandingTsetlinMachine
Using the Tsetlin Machine to learn human-interpretable rules for high-accuracy text categorization with medical applications
PyTsetlinMachineCUDA
Massively Parallel and Asynchronous Architecture for Logic-based AI
pyTsetlinMachineParallel
Multi-threaded implementation of the Tsetlin Machine, Convolutional Tsetlin Machine, Regression Tsetlin Machine, and Weighted Tsetlin Machine, with support for continuous features and multigranularity.
TsetlinMachineBook
Python code accompanying the book "An Introduction to Tsetlin Machines".
FlashRL
fast-tsetlin-machine-in-cuda-with-imdb-demo
A CUDA implementation of the Tsetlin Machine based on bitwise operators
deep_maze
open-tsetlin-machine
Open Source Tsetlin Machine framework
TsetlinMachineC
A C implementation of the Tsetlin Machine
rl
awesome-tsetlin-machine
A curated list of Tsetlin Machine research
regression-tsetlin-machine
Implementation of the Regression Tsetlin Machine
deep-warehouse
A simulator for complex logistics environments
TM-XOR-proof
#tsetlin-machine #machine-learning #game-theory #propositional-logic #pattern-recognition #bandit-learning #frequent-pattern-mining #learning-automata
Axis_and_Allies
A simple Axis & Allies engine.
python-fast-tsetlin-machine
Python wrapper for https://github.com/cair/fast-tsetlin-machine-with-mnist-demo
tmu-datasets
A repository of datasets for tmu
ICML-Massively-Parallel-and-Asynchronous-Tsetlin-Machine-Architecture
Code repository for the ICML 2021 paper "Massively Parallel and Asynchronous Tsetlin Machine Architecture"
ikt111
notebooks
A collection of Jupyter notebooks
Fire-Scene-Parsing
deep-line-wars
Docker-Tutorial
A docker tutorial for cair-gpu's
ray-bugfix
A workaround for issues where RLlib does not work with your current Gym environment; CarRacing-v0 is one such environment.
fire
deep-line-wars-2
Deterministic-Tsetlin-Machine
Due to the high energy consumption and scalability challenges of deep learning, there is a critical need to shift research focus towards dealing with energy constraints. Tsetlin Machines (TMs) are a recent machine learning approach that has demonstrated significantly reduced energy usage compared to neural networks, while performing competitively on several benchmarks in terms of accuracy. However, TMs rely heavily on energy-costly random number generation to stochastically guide a team of Tsetlin Automata to a Nash equilibrium of the TM game. In this paper, we propose a novel finite-state learning automaton that can replace the Tsetlin Automata in TM learning, for increased determinism. The new automaton uses multi-step deterministic state jumps to reinforce sub-patterns. Simultaneously, flipping a coin to skip every d'th state update ensures diversification by randomization. The d-parameter thus allows the degree of randomization to be finely controlled: d=1 makes every update random, while d=infinity makes the automaton completely deterministic. Our empirical results show that, overall, only substantial degrees of determinism reduce accuracy. Energy-wise, random number generation accounts for the switching energy consumption of the TM, so high d values save up to 11 mW of power on larger datasets. The new d-parameter can thus be used to trade accuracy off against energy consumption, facilitating low-energy machine learning.
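Below is a minimal, illustrative Python sketch of the d-parameter mechanism described in the abstract above; the automaton class, its state layout, and the jump size are assumptions made for illustration and are not taken from the Deterministic-Tsetlin-Machine code.

```python
# Illustrative sketch of the d-parameter idea, not the authors' implementation.
# A two-action automaton with n_states states per action is reinforced with
# multi-step deterministic jumps; every d'th update is randomized by a coin flip.
import random

class ArbitrarilyDeterministicAutomaton:
    def __init__(self, n_states=100, jump=3, d=10):
        self.n_states = n_states      # states per action
        self.jump = jump              # multi-step deterministic state jump
        self.d = d                    # every d'th update is randomized
        self.state = n_states         # start at the action boundary
        self.updates = 0              # counts reward/penalty updates

    def action(self):
        # States 1..n_states select action 0; states n_states+1..2*n_states select action 1.
        return 0 if self.state <= self.n_states else 1

    def _apply(self, step):
        # Clamp the state to the valid range 1..2*n_states.
        self.state = min(max(self.state + step, 1), 2 * self.n_states)

    def update(self, reward):
        self.updates += 1
        randomized = (self.updates % self.d == 0)   # every d'th update
        if randomized and random.random() < 0.5:    # coin flip: skip this update
            return
        # Reward pushes the state deeper into the current action's half;
        # penalty pushes it towards the opposite action.
        direction = -1 if self.action() == 0 else 1
        step = self.jump * direction if reward else -self.jump * direction
        self._apply(step)
```

With d=1 every update is subject to the coin flip, recovering fully randomized behaviour, while a very large d makes essentially every update deterministic.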
DeepAxie
Implementation of a simplified Axie Infinity environment in C++, used to train an agent to play the game with the DQN reinforcement learning algorithm.
Tsetlin-Machine-Deep-Neural-Network-Recommendation-System-Comparison
LogicalTransformerWithTsetlinMachine