PARE: Part Attention Regressor for 3D Human Body Estimation [ICCV 2021]
PARE: Part Attention Regressor for 3D Human Body Estimation,
Muhammed Kocabas, Chun-Hao Paul Huang, Otmar Hilliges, Michael J. Black,
International Conference on Computer Vision (ICCV), 2021
Features
PARE is an occlusion-robust human pose and shape estimation method. This repository includes the demo and evaluation code for PARE, implemented in PyTorch.
Updates
- 13/10/2021: Demo and evaluation code is released.
Getting Started
PARE has been implemented and tested on Ubuntu 18.04 with Python >= 3.7. If you don't have a suitable device, try running our Colab demo.
Clone the repo:
git clone https://github.com/mkocabas/PARE.git
Install the requirements using virtualenv or conda:
# pip
source scripts/install_pip.sh
# conda
source scripts/install_conda.sh
Demo
First, you need to download the required data (i.e., our trained model and the SMPL body model parameters), approximately 1.3 GB in total. To do this, run:
source scripts/prepare_data.sh
Video Demo
Run the command below. See scripts/demo.py for more options.
python scripts/demo.py --vid_file data/sample_video.mp4 --output_folder logs/demo
Sample demo output:
Image Folder Demo
python scripts/demo.py --image_folder <path to image folder> --output_folder logs/demo
Output format
If the demo finishes successfully, it creates a file named pare_output.pkl in the --output_folder.
We can inspect the contents of this file with:
>>> import joblib # the native pickle module works here as well
>>> output = joblib.load('pare_output.pkl')
>>> print(output.keys())
dict_keys([1, 2, 3, 4]) # these are the track ids for each subject appearing in the video
>>> for k,v in output[1].items(): print(k,v.shape)
pred_cam (n_frames, 3) # weak perspective camera parameters in cropped image space (s,tx,ty)
orig_cam (n_frames, 4) # weak perspective camera parameters in original image space (sx,sy,tx,ty)
verts (n_frames, 6890, 3) # SMPL mesh vertices
pose (n_frames, 72) # SMPL pose parameters
betas (n_frames, 10) # SMPL body shape parameters
joints3d (n_frames, 49, 3) # SMPL 3D joints
joints2d (n_frames, 21, 3) # 2D keypoint detections by STAF if pose tracking enabled otherwise None
bboxes (n_frames, 4) # bbox detections (cx,cy,w,h)
frame_ids (n_frames,) # frame ids in which subject with tracking id #1 appears
smpl_joints2d (n_frames, 49, 2) # SMPL 2D joints
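The weak-perspective cameras above can be used to overlay the predictions on the original video. Below is a minimal sketch of weak-perspective projection, assuming the common (sx, sy, tx, ty) convention in which the translation is applied before scaling; check the repository's rendering code for the exact convention PARE uses:

```python
import numpy as np

def weak_perspective_project(points3d, orig_cam):
    """Project 3D points with a weak-perspective camera (sx, sy, tx, ty).

    points3d: (N, 3) array, e.g. joints3d or verts for one frame
    orig_cam: (4,) array holding (sx, sy, tx, ty)
    Returns an (N, 2) array of image-plane coordinates.
    """
    sx, sy, tx, ty = orig_cam
    # translate in the x-y plane, then scale per axis; depth is dropped
    x2d = sx * (points3d[:, 0] + tx)
    y2d = sy * (points3d[:, 1] + ty)
    return np.stack([x2d, y2d], axis=1)
```

For example, projecting joints3d of track 1 at frame 0 would be `weak_perspective_project(output[1]['joints3d'][0], output[1]['orig_cam'][0])`.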
Google Colab
Training
Training instructions will follow soon.
Evaluation
You need to download the 3DPW and 3DOH datasets before running the evaluation script.
After the download, the data folder should look like:
data/
├── body_models
│   └── smpl
├── dataset_extras
├── dataset_folders
│   ├── 3doh
│   └── 3dpw
└── pare
    └── checkpoints
Then, you can evaluate PARE by running:
python scripts/eval.py \
--cfg data/pare/checkpoints/pare_config.yaml \
--opts DATASET.VAL_DS 3doh_3dpw-all
python scripts/eval.py \
--cfg data/pare/checkpoints/pare_w_3dpw_config.yaml \
--opts DATASET.VAL_DS 3doh_3dpw-all
You should obtain the results in this table on the 3DPW test set:

| | MPJPE | PAMPJPE | V2V |
|---|---|---|---|
| PARE | 82 | 50.9 | 97.9 |
| PARE (w. 3DPW) | 74.5 | 46.5 | 88.6 |
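For reference, MPJPE is the mean Euclidean distance between predicted and ground-truth 3D joints, and PA-MPJPE is the same error after a rigid Procrustes alignment (scale + rotation + translation) of the prediction to the ground truth. A minimal sketch of how these metrics are typically computed (not the repository's evaluation code, which lives in scripts/eval.py):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: mean Euclidean distance over joints."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """MPJPE after similarity (Procrustes) alignment of pred to gt."""
    mu_p, mu_g = pred.mean(0), gt.mean(0)
    p, g = pred - mu_p, gt - mu_g
    # optimal rotation via SVD of the 3x3 covariance matrix
    u, s, vt = np.linalg.svd(p.T @ g)
    r = vt.T @ u.T
    if np.linalg.det(r) < 0:  # guard against reflections
        vt[-1] *= -1
        s[-1] *= -1
        r = vt.T @ u.T
    scale = s.sum() / (p ** 2).sum()  # optimal isotropic scale
    aligned = scale * p @ r.T + mu_g
    return mpjpe(aligned, gt)
```

By construction, a prediction that differs from the ground truth only by a rigid transform and a global scale has a PA-MPJPE of (near) zero while its MPJPE can still be large.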
Occlusion Sensitivity Analysis
We provide a script to run the occlusion sensitivity analysis proposed in our paper. The analysis slides an occluding patch over the image and visualizes how the human pose and shape estimation results are affected.
python scripts/occlusion_analysis.py \
--cfg data/pare/checkpoints/pare_config.yaml \
--ckpt data/pare/checkpoints/pare_checkpoint.ckpt
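Conceptually, the analysis re-runs the model on many copies of the image, each with a gray occluder placed at a different grid position, and records the resulting error per position. A minimal sketch of the idea, where estimate_fn and metric_fn are hypothetical stand-ins for the model forward pass and the error metric (the actual implementation is in scripts/occlusion_analysis.py):

```python
import numpy as np

def occlusion_error_map(image, estimate_fn, metric_fn,
                        patch_size=40, stride=40):
    """Slide a gray patch over the image and record the error per location.

    estimate_fn(image) -> pose/shape estimate (stand-in for the model)
    metric_fn(estimate) -> scalar error, e.g. MPJPE vs. ground truth
    Returns a 2D array with one error value per patch position.
    """
    h, w = image.shape[:2]
    rows = (h - patch_size) // stride + 1
    cols = (w - patch_size) // stride + 1
    errors = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            occluded = image.copy()
            y, x = i * stride, j * stride
            occluded[y:y + patch_size, x:x + patch_size] = 128  # gray patch
            errors[i, j] = metric_fn(estimate_fn(occluded))
    return errors
```

Visualizing the resulting error map as a heatmap shows which image regions the model depends on most, which is how the sensitivity figures in the paper are produced.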
Sample occlusion test output:
Citation
@inproceedings{Kocabas_PARE_2021,
title = {{PARE}: Part Attention Regressor for {3D} Human Body Estimation},
author = {Kocabas, Muhammed and Huang, Chun-Hao P. and Hilliges, Otmar and Black, Michael J.},
booktitle = {Proc. International Conference on Computer Vision (ICCV)},
pages = {11127--11137},
month = oct,
year = {2021},
doi = {},
month_numeric = {10}
}
License
This code is available for non-commercial scientific research purposes as defined in the LICENSE file. By downloading and using this code you agree to the terms in the LICENSE. Third-party datasets and software are subject to their respective licenses.
References
We indicate if a function or script is borrowed externally inside each file. Consider citing these works if you use them in your project.
Contact
For questions, please contact [email protected].
For commercial licensing (and all related questions for business applications), please contact [email protected].