  • Stars: 590
  • Rank: 75,302 (Top 2%)
  • Language: Python
  • License: GNU General Publi...
  • Created: over 5 years ago
  • Updated: over 1 year ago

Repository Details

MANO layer for PyTorch, generating hand meshes as a differentiable layer

Manopth

MANO layer for PyTorch (tested with v0.4 and v1.x)

ManoLayer is a differentiable PyTorch layer that deterministically maps from pose and shape parameters to hand joints and vertices. It can be integrated into any architecture as a differentiable layer to predict hand meshes.

ManoLayer takes batched hand pose and shape vectors and outputs corresponding hand joints and vertices.
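Because the layer is differentiable, it can sit at the end of a regression network. Below is a minimal sketch of such an integration; the HandMeshHead module and its feature dimension are hypothetical stand-ins, and only the ManoLayer construction and call follow the interface shown in this README (the MANO pickle files must already be in mano/models):

import torch
from torch import nn
from manopth.manolayer import ManoLayer

class HandMeshHead(nn.Module):
    # Hypothetical head: regresses MANO pose and shape parameters from an
    # image feature vector, then decodes them into a hand mesh.
    def __init__(self, feat_dim=512, ncomps=6):
        super().__init__()
        self.ncomps = ncomps
        # ncomps PCA pose components + 3 global rotation values + 10 shape values
        self.regressor = nn.Linear(feat_dim, ncomps + 3 + 10)
        self.mano_layer = ManoLayer(mano_root='mano/models', use_pca=True, ncomps=ncomps)

    def forward(self, feats):
        params = self.regressor(feats)
        pose = params[:, :self.ncomps + 3]
        shape = params[:, self.ncomps + 3:]
        hand_verts, hand_joints = self.mano_layer(pose, shape)
        return hand_verts, hand_joints

head = HandMeshHead()
feats = torch.rand(4, 512)
verts, joints = head(feats)
# Gradients flow from a vertex loss back to the regressor weights
loss = verts.pow(2).mean()
loss.backward()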

The code is mostly a port of the original MANO model from chumpy to PyTorch. It therefore builds directly upon the work of Javier Romero, Dimitrios Tzionas and Michael J. Black.

This layer was developed and used for the CVPR 2019 paper Learning joint reconstruction of hands and manipulated objects. See the project page and the demo+training code.

It reuses part of the great code from the PyTorch layer for the SMPL body model by Zhang Xiong (MandyMo) to compute the rotation utilities!

It also includes, in mano/webuser, partial content of files from the original MANO code (posemapper.py, serialization.py, lbs.py, verts.py, smpl_handpca_wrapper_HAND_only.py).

If you find this code useful for your research, consider citing:

  • the original MANO publication:
@article{MANO:SIGGRAPHASIA:2017,
  title = {Embodied Hands: Modeling and Capturing Hands and Bodies Together},
  author = {Romero, Javier and Tzionas, Dimitrios and Black, Michael J.},
  journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
  publisher = {ACM},
  month = nov,
  year = {2017},
  url = {http://doi.acm.org/10.1145/3130800.3130883},
  month_numeric = {11}
}
  • the publication this PyTorch port was developed for:
@INPROCEEDINGS{hasson19_obman,
  title     = {Learning joint reconstruction of hands and manipulated objects},
  author    = {Hasson, Yana and Varol, G{\"u}l and Tzionas, Dimitris and Kalevatykh, Igor and Black, Michael J. and Laptev, Ivan and Schmid, Cordelia},
  booktitle = {CVPR},
  year      = {2019}
}

The training code associated with this paper, compatible with manopth, can be found here. The release includes a model trained on a variety of hand datasets.

Installation

Get code and dependencies

  • git clone https://github.com/hassony2/manopth
  • cd manopth
  • Install the dependencies listed in environment.yml
    • In an existing conda environment, run conda env update -f environment.yml
    • In a new environment, run conda env create -f environment.yml, which will create a conda environment named manopth

Download MANO pickle data-structures

  • Go to the MANO website
  • Create an account by clicking Sign Up and provide your information
  • Download Models and Code (the downloaded file should have the format mano_v*_*.zip). Note that all code and data from this download falls under the MANO license.
  • Unzip it and copy the models folder into the manopth/mano folder
  • Your folder structure should look like this:
manopth/
  mano/
    models/
      MANO_LEFT.pkl
      MANO_RIGHT.pkl
      ...
  manopth/
    __init__.py
    ...

To check that everything went well, run python examples/manopth_mindemo.py, which should generate a random hand using the MANO layer!
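If the demo complains about missing model files, a quick check like the following (paths taken from the folder layout above) can confirm that the MANO pickles ended up in the expected place:

import os

model_dir = 'mano/models'  # run from the root of the manopth checkout
for name in ('MANO_LEFT.pkl', 'MANO_RIGHT.pkl'):
    path = os.path.join(model_dir, name)
    print(path, 'found' if os.path.isfile(path) else 'MISSING')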

Install manopth package

To be able to import and use ManoLayer in another project, go to your manopth folder and run pip install .

cd /path/to/other/project

You can now use from manopth import ManoLayer in this other project!
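To quickly confirm the install, the import can be exercised from anywhere outside the repository (a trivial check, assuming only the import path shown above):

# Run from any directory after pip install .; a successful import means the
# package is available outside the manopth checkout.
from manopth import ManoLayer
print(ManoLayer)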

Usage

Minimal usage script

See examples/manopth_mindemo.py

Simple forward pass with random pose and shape parameters through the MANO layer:

import torch
from manopth.manolayer import ManoLayer
from manopth import demo

batch_size = 10
# Select number of principal components for pose space
ncomps = 6

# Initialize MANO layer
mano_layer = ManoLayer(mano_root='mano/models', use_pca=True, ncomps=ncomps)

# Generate random shape parameters
random_shape = torch.rand(batch_size, 10)
# Generate random pose parameters, including 3 values for global axis-angle rotation
random_pose = torch.rand(batch_size, ncomps + 3)

# Forward pass through MANO layer
hand_verts, hand_joints = mano_layer(random_pose, random_shape)
demo.display_hand({'verts': hand_verts, 'joints': hand_joints}, mano_faces=mano_layer.th_faces)

Result:

random hand
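To sanity-check the outputs of the minimal script, the tensor shapes can be inspected. The MANO mesh has 778 vertices; the number of returned joints depends on the layer options, so the comment below is indicative rather than exact:

# Continuing from the minimal script above
print(hand_verts.shape)   # torch.Size([10, 778, 3]): batched mesh vertices
print(hand_joints.shape)  # torch.Size([10, num_joints, 3]): batched hand joints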

Demo

For more options, a forward and backward pass, and a loop for quick profiling, look at examples/manopth_demo.py.

You can run it locally with:

python examples/manopth_demo.py
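For reference, the kind of forward-and-backward profiling loop the demo performs looks roughly like the sketch below; this is not the demo's actual code, just an illustration using the same layer interface as the minimal example (MANO pickles must be in mano/models):

import time
import torch
from manopth.manolayer import ManoLayer

batch_size = 10
ncomps = 6
mano_layer = ManoLayer(mano_root='mano/models', use_pca=True, ncomps=ncomps)

for step in range(10):
    start = time.time()
    pose = torch.rand(batch_size, ncomps + 3, requires_grad=True)
    shape = torch.rand(batch_size, 10, requires_grad=True)
    verts, joints = mano_layer(pose, shape)
    # Dummy loss so gradients flow back to the pose and shape parameters
    loss = verts.mean() + joints.mean()
    loss.backward()
    print('step {}: {:.3f}s'.format(step, time.time() - start))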

More Repositories

1. kinetics_i3d_pytorch: Inflated i3d network with inception backbone, weights transferred from tensorflow (Python, 511 stars)
2. useful-computer-vision-phd-resources: Lists of resources useful for my PhD in computer vision (495 stars)
3. torch_videovision: Transforms for video datasets in pytorch (Python, 263 stars)
4. obman_train: [cvpr19] Demo, training and evaluation code for generating dense hand+object reconstructions from single rgb images (Python, 186 stars)
5. obman: [cvpr19] Hands+Objects synthetic dataset, instructions to download and code to load the dataset (Python, 143 stars)
6. inflated_convnets_pytorch: Inflate DenseNet and ResNet as per I3D with ImageNet weight transfer (Python, 129 stars)
7. handobjectconsist: [cvpr 20] Demo, training and evaluation code for joint hand-object pose estimation in sparsely annotated videos (Python, 118 stars)
8. homan: [3dv 2021] Joint fitting of hands and object from short RGB video clips (Python, 87 stars)
9. obman_render: [cvpr19] Code to generate images from the ObMan dataset, synthetic renderings of hands holding objects (or hands in isolation) (Python, 77 stars)
10. interview-prep: Notes on preparing for coding interviews during my PhD (Python, 61 stars)
11. inria-research-wiki: Wiki for my research notes (52 stars)
12. shape_sdf (Python, 44 stars)
13. libyana: Utility functions that I reuse across different projects (Python, 14 stars)
14. synthetic-hands (Python, 12 stars)
15. pyrender_sdf: Minimal rendering in python for shapes defined implicitly through signed distance functions (Python, 11 stars)
16. conda_colmap (CMake, 8 stars)
17. multiperson: pytorch 1.6-compatible NMR (Python, 6 stars)
18. hourglass-hands (Jupyter Notebook, 3 stars)
19. flow-toolbox (C++, 3 stars)
20. mva: Homework and projects for the MVA (Mathematics, Vision, Learning) master (Jupyter Notebook, 2 stars)