DeepPose (stg-1) on TensorFlow
NOTE: This is not an official implementation. The original paper is DeepPose: Human Pose Estimation via Deep Neural Networks.
This is an implementation of DeepPose (stage 1).
The code includes training and testing on two popular pose benchmarks: the LSP Extended Dataset and the MPII Human Pose Dataset.
The performance of AlexNet pretrained on ImageNet and fine-tuned on LSP is close to the performance reported in the original paper.
Requirements
- Python 2.7
- TensorFlow r1.0
- Chainer 1.17.0+ (for background data processing only)
- numpy 1.12+
- OpenCV 2.4.8+
- tqdm 4.8.4+
For TensorFlow versions 0.11.0rc0 and 0.12.0rc0, check out branch r0.12.
RAM requirements
Requires around 10 GB of free RAM.
Installation of dependencies
- Install TensorFlow
- Install other dependencies via pip: pip install chainer numpy opencv tqdm
- In scripts/config.py set ROOT_DIR to point to the root dir of the project (see the sketch below).
- Download the weights of AlexNet pretrained on ImageNet (bvlc_alexnet.tf) and put them into the weights/ dir.
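For reference, a minimal sketch of the ROOT_DIR setting in scripts/config.py. The path and the extra WEIGHTS_DIR line are illustrative assumptions; the real file may define different or additional variables.

```python
# scripts/config.py (sketch)
import os

ROOT_DIR = '/home/user/deeppose_tf'  # hypothetical path; set this to your project root
# Assumption: the pretrained weights (bvlc_alexnet.tf) live under ROOT_DIR/weights.
WEIGHTS_DIR = os.path.join(ROOT_DIR, 'weights')
```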
Dataset preparation
cd datasets
bash download.sh
cd ..
python datasets/lsp_dataset.py
python datasets/mpii_dataset.py
- LSP dataset (1000 train / 1000 test images)
- LSP Extended dataset (10000 train images)
- MPII dataset (we use the original train set and split it into 17928 train / 1991 test images)
Training
examples/ provides several scripts for training on LSP + LSP_EXT and MPII:
- examples/train_lsp_alexnet_scratch.sh trains AlexNet on LSP + LSP_EXT from scratch.
- examples/train_lsp_alexnet_imagenet.sh trains AlexNet on LSP + LSP_EXT starting from weights pretrained on ImageNet.
- examples/train_mpii_alexnet_scratch.py trains AlexNet on MPII from scratch.
- examples/train_mpii_alexnet_imagenet.py trains AlexNet on MPII starting from weights pretrained on ImageNet.
Example: bash examples/train_lsp_alexnet_scratch.sh
All these scripts call train.py.
To check which options it accepts and which default values are set, please look into cmd_options.py.
- The network is trained with the Adagrad optimizer and a learning rate of 0.0005, as specified in the paper (see the sketch below).
- For training we use cropped persons (not the full image).
- To use your own network architecture, set it accordingly in the create_regression_net method in scripts/regressionnet.py.
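For reference, a minimal TensorFlow r1.0 sketch of this training setup. This is not the project's actual code (which lives in scripts/regressionnet.py and train.py); the placeholder shapes, the stand-in network, and the plain L2 joint loss are illustrative assumptions.

```python
import tensorflow as tf

def toy_regression_net(images, n_joints=14):
    # Hypothetical stand-in for the AlexNet regressor built by
    # create_regression_net in scripts/regressionnet.py: a single linear layer
    # mapping the flattened image to 2D coordinates of n_joints joints.
    flat = tf.reshape(images, [-1, 227 * 227 * 3])
    return tf.layers.dense(flat, n_joints * 2)

# Placeholders for a batch of cropped person images and ground-truth joints
# (shapes are illustrative assumptions).
images = tf.placeholder(tf.float32, [None, 227, 227, 3], name='images')
gt_joints = tf.placeholder(tf.float32, [None, 14 * 2], name='gt_joints')

pred_joints = toy_regression_net(images)

# Regression loss on joint coordinates (reported as pose_loss in the logs).
pose_loss = tf.reduce_mean(tf.reduce_sum(tf.square(pred_joints - gt_joints), axis=1))

# Adagrad with learning rate 0.0005, as stated above.
train_op = tf.train.AdagradOptimizer(learning_rate=0.0005).minimize(pose_loss)
```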
The network will be tested during training, and you will see the following output every T iterations:
8it [00:06, 1.31it/s]
Step 0 test/pose_loss = 0.116
Step 0 test/mPCP 0.005
Step 0 test/parts_PCP:
Head Torso U Arm L Arm U Leg L Leg mean
0.000 0.015 0.001 0.003 0.013 0.001 0.006
Step 0 test/mPCKh 0.029
Step 0 test/mSymmetricPCKh 0.026
Step 0 test/parts_mSymmetricPCKh:
Head Neck Shoulder Elbow Wrist Hip Knee Ankle
0.003 0.016 0.019 0.043 0.044 0.028 0.053 0.003
Here you can see the PCP and PCKh scores at step (iteration) 0.
test/METRIC_NAME means that the metric was calculated on the test set.
val/METRIC_NAME means that the metric was calculated on the validation set. As a sanity check on LSP, I took the first 1000 images from the train set as validation.
- pose_loss is the regression loss of the joint prediction.
- mPCP is the mean PCP@0.5 score over all sticks.
- parts_PCP is the PCP@0.5 score for every stick.
- mRelaxedPCP is a relaxed PCP@0.5 score, where a stick is considered correct when the average error of its two joints is less than the threshold (0.5).
- mPCKh is the mean PCKh score over all joints.
- mSymmetricPCKh is the mean PCKh score over all joints, where the scores for symmetric left/right joints are averaged.
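To make these definitions concrete, here is a small NumPy sketch of the PCP@0.5 and PCKh@0.5 criteria. It is an illustrative reimplementation, not the project's evaluation code; the array shapes and the stick list are assumptions.

```python
import numpy as np

def pcp_at_05(pred, gt, sticks):
    """PCP@0.5: a stick (limb) is correct if both of its endpoints are predicted
    within 0.5 * stick length of their ground-truth positions.
    pred, gt: arrays of shape [n_joints, 2]; sticks: list of (joint_a, joint_b) indices."""
    correct = []
    for a, b in sticks:
        thresh = 0.5 * np.linalg.norm(gt[a] - gt[b])
        err_a = np.linalg.norm(pred[a] - gt[a])
        err_b = np.linalg.norm(pred[b] - gt[b])
        # For the relaxed variant, one would instead test (err_a + err_b) / 2 <= thresh.
        correct.append(err_a <= thresh and err_b <= thresh)
    return np.mean(correct)

def pckh_at_05(pred, gt, head_size):
    """PCKh@0.5: a joint is correct if its error is below 0.5 * head segment size."""
    errors = np.linalg.norm(pred - gt, axis=1)
    return np.mean(errors <= 0.5 * head_size)
```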
Testing
To test the model, use tests/test_snapshot.py.
- The script will produce PCP@0.5 and PCKh@0.5 scores computed on cropped persons.
- Scores will be computed for different crops.
- BBOX EXTENSION=1 means that the person was tightly cropped; BBOX EXTENSION=1.5 means that the bounding box of the person was enlarged by a factor of 1.5 and then the image was cropped (see the sketch after the example below).
Usage: python tests/test_snapshot.py DATASET_NAME SNAPSHOT_PATH, where DATASET_NAME is 'lsp' or 'mpii' and SNAPSHOT_PATH is the path to the snapshot.
Example: python tests/test_snapshot.py lsp out/lsp_alexnet_scratch/checkpoint-10000
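For illustration, a small sketch of the bounding-box extension described above. This is not the project's cropping code, which may handle image borders and aspect ratio differently; image is assumed to be an H x W x C NumPy array (e.g., loaded with OpenCV).

```python
def extend_and_crop(image, bbox, extension=1.5):
    """Enlarge a person bounding box by the given factor around its center
    and crop the image to it. bbox is (x_min, y_min, x_max, y_max) in pixels."""
    x_min, y_min, x_max, y_max = bbox
    cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0
    w = (x_max - x_min) * extension
    h = (y_max - y_min) * extension
    h_img, w_img = image.shape[:2]
    # Clamp the extended box to the image borders before cropping.
    x0 = int(max(0, round(cx - w / 2.0)))
    y0 = int(max(0, round(cy - h / 2.0)))
    x1 = int(min(w_img, round(cx + w / 2.0)))
    y1 = int(min(h_img, round(cy + h / 2.0)))
    return image[y0:y1, x0:x1]
```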
Results
Results for random initialization and AlexNet initialization from our CVPR 2017 paper Deep Unsupervised Similarity Learning using Partially Ordered Sets. Check the paper for more results using our initialization and the Shuffle&Learn initialization.
PCP@0.5

LSP | Random Init. | AlexNet
---|---|---
Torso | 87.3 | 92.8
Upper legs | 52.3 | 68.1
Lower legs | 35.4 | 53.0
Upper arms | 25.4 | 39.8
Lower arms | 7.6 | 17.5
Head | 44.0 | 62.8
Total | 42.0 | 55.7
PCKh@0.5

MPII | Random Init. | AlexNet
---|---|---
Head | 79.5 | 87.2
Neck | 87.1 | 93.2
LR Shoulder | 71.6 | 85.2
LR Elbow | 52.1 | 69.6
LR Wrist | 34.6 | 52.0
LR Hip | 64.1 | 81.3
LR Knee | 58.3 | 69.7
LR Ankle | 51.2 | 62.0
Thorax | 85.5 | 93.4
Pelvis | 70.1 | 86.6
Total | 65.4 | 78.0
Notes
If you use this code, please cite the repo.
License
GNU General Public License