
ControlVAE: Model-Based Learning of Generative Controllers for Physics-Based Characters

Heyuan Yao, Zhenhua Song, Baoquan Chen, Libin Liu


A reimplementation of the SIGGRAPH Asia 2022 paper ControlVAE: Model-Based Learning of Generative Controllers for Physics-Based Characters.

Please note that we cannot guarantee the code will work when run with settings we did not intend during development. We would appreciate it if users could report any problems they find.

Install

Create a conda environment from requirements.yml:

conda env create -f requirements.yml
conda activate control-vae
conda install pytorch=*=*cuda* torchvision torchaudio cudatoolkit=11.3 -c pytorch
pip install panda3d

Then change into the folder of this project and run:

pip install -e .
cd ModifyODESrc
pip install -e .

Make sure your PyTorch version is >= 1.11, because we use torch.linalg.cross to accelerate the code.
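For reference, torch.linalg.cross computes the 3-D cross product batch-wise over tensors. A pure-Python sketch of the per-vector operation it vectorizes (the function name here is illustrative, not from the repo):

```python
def cross(a, b):
    """3-D cross product of two vectors: the per-element operation
    that torch.linalg.cross applies batch-wise on tensors."""
    return (
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    )
```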

Training

Our code requires mpi4py. The main process trains the networks, while the remaining processes collect simulation data. You can simply run:

mpiexec -n 5 python train_controlvae.py --YOUR_ARGS

You do not need YOUR_ARGS by default.
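The rank split above can be sketched as follows; this is a minimal illustration of the trainer/sampler pattern implied by `mpiexec -n 5`, not the repo's actual code, and the function name is hypothetical:

```python
def role_for_rank(rank: int) -> str:
    """With `mpiexec -n 5`, rank 0 trains the networks while
    ranks 1-4 run the simulator and send back rollout data.
    In the real script this branch would be driven by
    MPI.COMM_WORLD.Get_rank() from mpi4py."""
    return "trainer" if rank == 0 else "sampler"
```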

Playing

We currently offer two tasks: random sampling and joystick control. Both are in the PlayGround folder and can be played by running the code directly.

We provide a Panda3D viewer. The camera can be controlled with the mouse, and you can throw a box at the character by pressing 'b' or 'SPACE'.

For joystick control, the character is steered with w/a/s/d for direction and 1/2/3 for speed. The direction is relative to the camera's forward direction.

Press w/a/s/d continuously to keep steering. A known issue is that on some computers the left/right directions are reversed; this may be caused by different loading methods of the glTF model.
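Camera-relative steering of this kind is typically implemented by offsetting the camera's yaw by a fixed angle per key; a hedged sketch under that assumption (all names are illustrative, not from the repo). Note that flipping the sign of the left/right offsets would produce exactly the reversed-direction symptom described above:

```python
import math

# Angular offset of each key relative to the camera's forward direction.
KEY_OFFSETS = {
    "w": 0.0,              # forward
    "a": math.pi / 2,      # left
    "s": math.pi,          # backward
    "d": -math.pi / 2,     # right
}

def target_direction(camera_yaw: float, key: str) -> tuple:
    """Return a unit heading vector for the pressed key,
    measured relative to the camera's forward (yaw) angle."""
    yaw = camera_yaw + KEY_OFFSETS[key]
    return (math.cos(yaw), math.sin(yaw))
```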

For example, you can run PlayGround\random_playground.py, which will ask for a config file (.yml) and trained parameters (.data). The pretrained model can be obtained from

OneDrive:
https://1drv.ms/u/s!AhVw0PSSGV0TmSfCdXQO7iwTyFwN?e=wKelcs

PKU Disk:
https://disk.pku.edu.cn:443/link/664B7E3BC3E7FF3F240E3C99312A5C6C
valid until: 2027-08-31 23:59

Please download the config and data files into Data\Pretrained, then run:

python PlayGround\random_playground.py
# select yml and controlvae.data
python PlayGround\joystick_playground.py
# select yml and joystick.data

Citation

@article{ControlVAE,
    author = {Yao, Heyuan and Song, Zhenhua and Chen, Baoquan and Liu, Libin},
    title = {ControlVAE: Model-Based Learning of Generative Controllers for Physics-Based Characters},
    year = {2022},
    issue_date = {December 2022},
    volume = {41},
    number = {6},
    url = {https://doi.org/10.1145/3550454.3555434},
    journal = {ACM Trans. Graph.},
    articleno = {183},
}


Some demos:

Prediction of world model:


Random sampling in the latent space


Speed/Style and direction control:


Resistance to external perturbations: