DeepLabCut-Utils
This repository contains various scripts as well as links to other packages related to DeepLabCut. Feel free to contribute your own analysis methods, and perhaps a short notebook showing how to use them. Thanks!
Example scripts for scaling up your DLC analysis & training:
These two scripts illustrate how to train, test, and analyze videos for multiple projects automatically (scale_training_and_evaluation.py), and how to automatically analyze videos that are organized in subfolders (scale_analysis_oversubfolders.py). Feel free to adjust them for your needs! A minimal sketch of the subfolder-analysis idea is shown below.
Contributed by Alexander Mathis
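For orientation only, here is a minimal sketch of batch-analyzing videos across subfolders with the standard deeplabcut.analyze_videos call; the config path, video root, and video extension are placeholders and are not taken from the linked scripts:

```python
import os
import deeplabcut

# Placeholders -- point these at your own trained project and video tree.
config_path = '/home/user/MyProject-Me-2020-01-01/config.yaml'
video_root = '/home/user/videos'

# Walk every subfolder and analyze all .mp4 files found with the trained network.
for dirpath, _, filenames in os.walk(video_root):
    videos = [os.path.join(dirpath, f) for f in filenames if f.endswith('.mp4')]
    if videos:
        deeplabcut.analyze_videos(config_path, videos, videotype='.mp4')
```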
Using your DLC outputs: loading, simple ROI analysis, and visualization examples:
Time spent by a body part in a particular region of interest (ROI)
You can compute the time (in frames) a body part spends in particular ROIs. This demo Jupyter Notebook shows you how to load the outputs of DLC and perform the analysis (plus other plotting functions); a minimal sketch also follows below:
code: https://github.com/DeepLabCut/DLCutils/blob/master/Demo_loadandanalyzeDLCdata.ipynb
code: https://github.com/DeepLabCut/DLCutils/blob/master/time_in_each_roi.py
Contributed by Federico Claudi, with the Jupyter Notebook from Alexander Mathis
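As a rough illustration of the ROI idea (the notebook and time_in_each_roi.py above are more complete), here is a minimal sketch; the file name, body part, ROI bounds, likelihood cutoff, and frame rate below are placeholders, and a standard single-animal DLC .h5 output is assumed:

```python
import pandas as pd

# Placeholders -- adjust to your own DLC output, body part, ROI, and frame rate.
df = pd.read_hdf('videoDLC_resnet50.h5')        # DLC output with multi-index columns
scorer = df.columns.get_level_values(0)[0]      # network/scorer name
bp = df[scorer]['snout']                        # one body part: columns x, y, likelihood

x_min, x_max, y_min, y_max = 100, 300, 50, 250  # rectangular ROI in pixels
fps = 30.0

# Keep only confident detections, then count frames whose (x, y) falls inside the ROI.
confident = bp[bp['likelihood'] > 0.9]
in_roi = confident[confident['x'].between(x_min, x_max) &
                   confident['y'].between(y_min, y_max)]
print(f'Time in ROI: {len(in_roi) / fps:.2f} s ({len(in_roi)} frames)')
```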
DeepLabCut-Display GUI
Open and view data to understand pose-estimation errors and trends, and filter data by likelihood threshold. A minimal likelihood-filtering sketch is shown below.
code: https://github.com/jakeshirey/DeepLabCut-Display
Contributed by Jacob Shirey
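Independent of the GUI above, likelihood filtering on raw DLC output can be sketched as follows; the file name and threshold are placeholders, and a single-animal DLC output (three column levels) is assumed:

```python
import numpy as np
import pandas as pd

# Placeholders -- adjust the file name and threshold to your own data.
df = pd.read_hdf('videoDLC_resnet50.h5')
threshold = 0.9

# Mask x/y with NaN wherever the network's likelihood falls below the threshold,
# so low-confidence points are excluded from downstream plots and statistics.
for scorer, bodypart in {(s, b) for s, b, _ in df.columns}:
    low = df[(scorer, bodypart, 'likelihood')] < threshold
    df.loc[low, (scorer, bodypart, 'x')] = np.nan
    df.loc[low, (scorer, bodypart, 'y')] = np.nan
```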
A GUI-based ROI tool for measuring the time spent by a body part in a defined region of interest
code: https://github.com/PolarBean/DLC_ROI_tool
Contributed by Harry Carey
Clustering tools (using the output of DLC):
Identifying Behavioral Structure from Deep Variational Embeddings of Animal Motion
paper: https://www.biorxiv.org/content/10.1101/2020.05.14.095430
code: https://github.com/LINCellularNeuroscience/VAME
Behavior clustering with MotionMapper
- (adapted from https://github.com/gordonberman/MotionMapper)
code: https://github.com/DeepLabCut/DLCutils/tree/master/DLC_2_MotionMapper
Contributed by Mackenzie Mathis
Behavior clustering with B-SOiD
B-SOiD: An Open Source Unsupervised Algorithm for Discovery of Spontaneous Behaviors <-- you can feed the outputs of DLC directly into B-SOiD (in MATLAB).
paper: https://www.biorxiv.org/content/10.1101/770271v1.abstract
code: https://github.com/YttriLab/B-SOiD
Machine-learning helper packages (using the output of DLC):
Behavior analysis with machine learning in R (ETH-DLCAnalyzer)
Deep learning based behavioral analysis enables high precision rodent tracking and is capable of outperforming commercial solutions. Oliver Sturman, Lukas von Ziegler, Christa Schläppi, Furkan Akyol, Benjamin Grewe, Johannes Bohacek
paper: https://www.biorxiv.org/content/10.1101/2020.01.21.913624v1
code: https://github.com/ETHZ-INS/DLCAnalyzer
Behavior analysis with machine-learning classifiers (SimBA)
A pipeline for using pose estimation (i.e. DeepLabCut), then behavioral annotation and generation of supervised machine-learning-based classifiers. <-- you can feed the outputs of DLC directly into SimBA (in Python).
Code written by Simon Nilsson (please direct usage questions to Simon).
paper: https://www.biorxiv.org/content/10.1101/2020.04.19.049452v2
code: https://github.com/sgoldenlab/simba
3D DeepLabCut helper packages:
A wrapper package for DeepLabCut 2.0 for 3D videos (anipose)
code: https://github.com/lambdaloop/anipose
maintainer: Pierre Karashchuk
3D reconstruction with the EasyWand/Argus DLT system using DeepLabCut data:
Written by Brandon Jackson after our DLC workshop in January 2020:
A small set of utilities that allow conversion between the data storage formats of DeepLabCut (DLC) and one of the DLT-based 3D tracking systems: either Ty Hedrick's DigitizingTools in MATLAB, or the Python-based Argus. These functions should allow you to use data previously digitized in a DLT system to create the files needed to train a DLC model, and to import DLC-tracked points back into a DLT 3D calibration to reconstruct 3D points.
code: https://github.com/haliaetus13/DLCconverterDLT
Pupil Tracking
- From Tom Vaissie - [email protected]
- Please see the README.txt file at https://github.com/DeepLabCut/DLCutils/tree/master/pupilTracking for details; this code produces the video shown in case study 7 at http://www.mousemotorlab.org/deeplabcut/.
Using DeepLabCut for USB-GPIO feedback
paper: https://www.biorxiv.org/content/early/2018/11/28/482349
code: https://github.com/bf777/DeepCutRealTime
maintainer: Brandon Forys
LEGACY utility functions (no longer required in DLC 2+):
DLC1 to DLC 2 conversion code
This code allows you to import labeled data from DLC 1 into DLC 2 projects. Note that it is not streamlined and should be used with care.
https://github.com/DeepLabCut/DLCutils/tree/master/conversion_scripts_LEGACY
Contributed by Alexander Mathis
Running project created on Windows on Colaboratory
UPDATE: from DeepLabCut 2.0.4 onwards you no longer need this code! You can simply create the training set on the cloud and it will automatically convert your project for you.
- This solves a path problem that arises when creating a project and annotating data on Windows (see DeepLabCut/DeepLabCut#172). This functionality is now included in DLC 2 (DONE!): https://github.com/DeepLabCut/DLCutils/tree/master/conversion_scripts_LEGACY
Usage: change lines 70 and 71 of https://github.com/DeepLabCut/DLCutils/tree/master/conversion_scripts_LEGACY/convertWin2Unix.py to point to your project, e.g.
basepath='/content/drive/My Drive/DeepLabCut/examples/'
projectname='Reaching-Mackenzie-2018-08-30'
Then run this script on Colaboratory after uploading your labeled data to your Google Drive. The project will thereby be converted to Unix format; then create a training set (with deeplabcut) and proceed as usual. A minimal sketch of the path-conversion idea is shown below.
Contributed by Alexander Mathis
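For context, the core of the conversion is simply rewriting Windows-style backslash paths as Unix-style forward slashes in the labeled-data files. The sketch below only illustrates that idea under an assumed standard project layout; the legacy script linked above performs the full conversion (including config.yaml handling):

```python
import glob
import os

# Same placeholder values as in the usage note above.
basepath = '/content/drive/My Drive/DeepLabCut/examples/'
projectname = 'Reaching-Mackenzie-2018-08-30'

# Rewrite backslash paths to forward slashes in each CollectedData_*.csv file.
labeled_data = os.path.join(basepath, projectname, 'labeled-data')
for csv_file in glob.glob(os.path.join(labeled_data, '*', 'CollectedData_*.csv')):
    with open(csv_file) as f:
        text = f.read()
    with open(csv_file, 'w') as f:
        f.write(text.replace('\\', '/'))
```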
Please direct inquiries to the contributors/maintainers of the respective code. Note that the software is provided "as is", without warranty of any kind, express or implied.