• Stars: 285
• Rank: 142,142 (Top 3%)
• Language: Python
• License: MIT License
• Created: over 4 years ago
• Updated: over 1 year ago

Repository Details

Brain-Computer interface stuff

OpenBCI Headset

Brain-Computer Interface workspace

My objective here is to share some of the code, models, and data from the OpenBCI 16-channel headset. I suspect many people are not going to be able to get their hands on the headset, but that doesn't mean you can't still play with some of the data!

Headset used is the OpenBCI Ultracortex Mark IV. You can check out OpenBCI's products here: https://shop.openbci.com/

Objectives

To start, my objective is to train a neural network to detect thoughts of left/right movements. From here, I would like to apply this BCI control to GTA V.

Files

training.py - This is merely an example of training a model with this data. I have yet to find any truly great model, though at the end of this readme, I will try to keep an updated confusion matrix of my best-yet models. This way, you can easily tell if you've been able to find something better than what I've got.

If people are able to beat my model and are willing to share their models, I will post some sort of high-score list somewhere on this repo.

Since some people won't be able to resist making a model on validation data... I will use my own separate validation data to actually create the scores. If you're not cheating, this shouldn't impact you ;)

analysis.py - You can use this to run through the validation data and see confusion matrices for your models on out-of-sample data.

Example of a % accuracy confusion matrix (the default graphed):

[confusion matrix image]

Model used for the above: https://github.com/Sentdex/BCI/tree/master/models#614-acc-loss-239-topmodel

In the above confusion matrix, we can see that if the thought is left, the model accurately predicts this 53% of the time, predicts none 15% of the time, and predicts right 32% of the time.

For the "right" thought, we can see the model predicted that correctly 64% of the time, predicted none 16% of the time, and predicted left 21% of the time.

An "ideal" confusion matrix would be a perfectly green diagonal line of boxes from the top left to the bottom right. This isn't too bad so far.

testing_and_making_data.py - This is just here if you happen to have your own OpenBCI headset and want to actually play with the model and/or build on the dataset, or if you just want to help audit/improve my code. This file will load whatever model you wish to use. You specify the action you intend to think of ahead of time in the ACTION var, then you run the script. The environment will pop up and collect all of your FFT data, storing it to a numpy file in a directory named after whatever you set the ACTION thought to be.
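
For a rough idea of what that collection loop looks like, here is a minimal sketch that pulls FFT frames from the OpenBCI GUI over LSL with pylsl and saves them in the same layout as the dataset. The stream type, samples-per-frame, and timing here are assumptions, not an exact copy of testing_and_making_data.py:

import os
import time
import numpy as np
from pylsl import StreamInlet, resolve_stream

ACTION = "left"        # the thought you intend to hold for this recording
RECORD_SECONDS = 10    # each file is targeted to be ~10 seconds long
CHANNELS = 16

# The stream type depends on how the OpenBCI GUI networking tab is configured;
# 'EEG' is an assumption here.
streams = resolve_stream('type', 'EEG')
inlet = StreamInlet(streams[0])

fft_data = []
start = time.time()
while time.time() - start < RECORD_SECONDS:
    frame = []
    for _ in range(CHANNELS):          # one LSL sample per channel, assumed
        sample, timestamp = inlet.pull_sample()
        frame.append(sample[:60])      # keep the 0-60Hz bins
    fft_data.append(frame)

# Save as data/<ACTION>/<unix timestamp>.npy, matching the dataset layout.
os.makedirs(f"data/{ACTION}", exist_ok=True)
np.save(f"data/{ACTION}/{int(time.time())}.npy", np.array(fft_data))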

Requirements

• Numpy
• TensorFlow 2.0 (you need 2.0 if you intend to load the models)
• pylsl (if you intend to run on an actual headset)
• OpenBCI GUI (using the networking tab: https://docs.openbci.com/docs/06Software/01-OpenBCISoftware/GUIDocs)

The data

Currently, the data available is 16-channel FFT from 0-60Hz, sampled at a rate of about 25 per second. Data is contained in directories labeled left, right, or none. These directories contain numpy arrays of FFT data collected while I was thinking of moving a square on the screen in that direction.

I am not sure where I want to put the data permanently, but, for now, it's available for download here: https://hkinsley.com/static/downloads/bci/model_data_v2.7z

I plan to upload more and more as I create more data.

File structure (for both the data and validation_data directories):

  • data
    • left
    • none
    • right
  • validation_data
    • left
    • none
    • right

Contained within the left, none, and right directories are .npy files with unix timestamps as their name. Each of the files is a numpy array of shape:

import numpy as np

d = np.load("data/left/1572814991.npy")
print(d.shape)

>>> (250, 16, 60)

Each file is targeted to be 10 seconds long, which, at 25 iterations/sec, gives us the 250 (though you should not depend on/assume all files will be exactly 250 long). Then you have the number of channels (16), and then 60 values, for up to 60Hz. For example, if you do:

import numpy as np
import matplotlib.pyplot as plt

d = np.load("data/left/1572814991.npy")

plt.plot(d[0][0])
plt.show()

You will see a graph of the data for Channel 0 for the very first sample:

[FFT graph, single channel]

If you want to see all 16 channels:

import numpy as np
import matplotlib.pyplot as plt

d = np.load("data/left/1572814991.npy")

for channel in d[175]:
    plt.plot(channel)
plt.show()

[FFT graph, all 16 channels]
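
To give a sense of how these files can be assembled into a training set (the sort of thing training.py does), here is a minimal sketch that loads the left/none/right directories into arrays and fits a small TensorFlow 2.0 model. The architecture and training settings are placeholders for illustration, not the repo's actual model:

import os
import numpy as np
import tensorflow as tf

ACTIONS = ["left", "none", "right"]
DATA_DIR = "data"

X, y = [], []
for label, action in enumerate(ACTIONS):
    action_dir = os.path.join(DATA_DIR, action)
    for fname in os.listdir(action_dir):
        d = np.load(os.path.join(action_dir, fname))   # roughly (250, 16, 60)
        for fft_frame in d:                            # one (16, 60) FFT snapshot
            X.append(fft_frame)
            y.append(label)

X = np.array(X).reshape((-1, 16, 60, 1))
y = np.array(y)

# Placeholder model: a small CNN over the (channels, frequency bins) "image".
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (2, 3), activation="relu", input_shape=(16, 60, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=64, validation_split=0.1)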

Best model so far (on the validation/out-of-sample data):

[confusion matrix image]

More info: https://github.com/Sentdex/BCI/tree/master/models#all-cnn-model-6323-acc-loss-252model

More Repositories

1. pygta5 - Explorations of Using Python to play Grand Theft Auto 5. (Python, 3,864 stars)
2. NNfSiX - Neural Networks from Scratch in various programming languages (C++, 1,358 stars)
3. GANTheftAuto (Python, 843 stars)
4. socialsentiment - Sentiment Analysis application created with Python and Dash, hosted at socialsentiment.net (Python, 467 stars)
5. TermGPT - Giving LLMs like GPT-4 the ability to plan and execute terminal commands (Jupyter Notebook, 395 stars)
6. Carla-RL - Reinforcement Learning codebase for self-driving car in Carla (Python, 339 stars)
7. ChatGPT-at-Home - ChatGPT @ Home: Large Language Model (LLM) chatbot application, written by ChatGPT (Python, 325 stars)
8. ChatGPT-API-Basics (Jupyter Notebook, 292 stars)
9. nnfs_book - Sample code from the Neural Networks from Scratch book. (Python, 261 stars)
10. BLOOM_Examples - Some quick BLOOM LLM examples (Jupyter Notebook, 258 stars)
11. nnfs - Neural Networks from Scratch (Python, 177 stars)
12. Falcon-LLM - Helper scripts and examples for exploring the Falcon LLM models (Jupyter Notebook, 168 stars)
13. SC2RL - Reinforcement Learning + Starcraft 2 (Python, 139 stars)
14. Simple-kNN-Gzip - A simplistic linear and multiprocessed approach to sentiment analysis using Gzip Normalized Compression Distances with k nearest neighbors (Jupyter Notebook, 138 stars)
15. QuantumComputing - Collection of Tutorials and other Quantum Computer programming related things. (Jupyter Notebook, 134 stars)
16. cyberpython2077 - Using Python to Play Cyberpunk 2077 (Python, 122 stars)
17. GPT-Journey - Building a text and image-based journey game powered by, and with, GPT 3.5 (Python, 79 stars)
18. OpenAssistant_API_Pythia_12B - Creating and Using an Open Assistant API locally (Pythia 12B GPT model) (Jupyter Notebook, 75 stars)
19. neural-net-internals-visualized - Visualizing some of the internals of a neural network during training and inference. (Jupyter Notebook, 59 stars)
20. reddit_spam_detector_bot - Bot that detects spam/affiliate marketing authors, and posts some stats on their threads. (Python, 58 stars)
21. Together-API-Basics - Some information for working with the Together inference API for Open Source AI models (Jupyter Notebook, 55 stars)
22. sentdebot - Code for Sentdebot in the Sentdex discord channel (discord.gg/sentdex) (Python, 53 stars)
23. NEAT-samples - Samples of NEAT code (Python, 50 stars)
24. Lambda-Cloud - Helpers and such for working with Lambda Cloud (Python, 49 stars)
25. LLM-Finetuning - Some helpers and examples for creating an LLM fine-tuning dataset (Jupyter Notebook, 46 stars)
26. uarm - uArm Things (Python, 29 stars)
27. satisfunctions - Fighting arthritis from Satisfactory one function at a time. (Python, 23 stars)
28. PyGTA5_Reboot - Python Plays GTA V Reboot (18 stars)
29. TTSentdex9000 - I am a human just like you! (16 stars)
30. chatbotrnd - Working with chatbot response scoring. (Python, 14 stars)
31. HF-Cache-Cleanup - Cleanup cached models. (Python, 10 stars)
32. cellvolution - Evolutionary cell-based simulation (Python, 1 star)