MT-VAE for Multimodal Human Motion Synthesis

This is the code for the ECCV 2018 paper MT-VAE: Learning Motion Transformations to Generate Multimodal Human Dynamics by Xinchen Yan, Akash Rastogi, Ruben Villegas, Kalyan Sunkavalli, Eli Shechtman, Sunil Hadap, Ersin Yumer, and Honglak Lee.

Please follow the instructions below to run the code.

Requirements

MT-VAE requires or works with

  • Mac OS X or Linux
  • NVIDIA GPU

Installing Dependencies

Data Preprocessing

bash prep_human36m_joints.sh
  • Disclaimer: Please check the license of the Human3.6M dataset if you download this preprocessed version.

Training (MT-VAE)

  • If you want to train the MT-VAE human motion generator, please run the following script (training usually takes about one day on a single Titan GPU; a toy sketch of the underlying objective follows this section).
bash demo_human36m_trainMTVAE.sh
  • Alternatively, you can download the pre-trained MT-VAE model by running the following script.
bash prep_human36m_model.sh
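
For intuition only, below is a toy sketch of a conditional-VAE style objective (reconstruction + KL) of the kind MT-VAE builds on. This is not the repository's training code; the actual loss in the paper includes further terms, and every name and shape below is hypothetical, with random linear maps standing in for the networks so the snippet runs.

import numpy as np
rng = np.random.default_rng(0)
n_joints, past_len, future_len, latent_dim = 32, 16, 16, 8
# Hypothetical encoder: map a (past, future) motion pair to a Gaussian posterior
# over the latent transformation z; a random linear map so the example runs.
W_enc = rng.standard_normal((2 * latent_dim, (past_len + future_len) * n_joints)) * 0.01
# Hypothetical decoder: predict the future motion from the past motion and z.
W_dec = rng.standard_normal((future_len * n_joints, past_len * n_joints + latent_dim)) * 0.01
past = rng.standard_normal((past_len, n_joints))
future = rng.standard_normal((future_len, n_joints))
mu, log_var = np.split(W_enc @ np.concatenate([past.ravel(), future.ravel()]), 2)
z = mu + np.exp(0.5 * log_var) * rng.standard_normal(latent_dim)  # reparameterization trick
recon = (W_dec @ np.concatenate([past.ravel(), z])).reshape(future_len, n_joints)
recon_loss = np.mean((recon - future) ** 2)                          # reconstruction term
kl_loss = -0.5 * np.mean(1.0 + log_var - mu ** 2 - np.exp(log_var))  # KL(q(z|x) || N(0, I))
print("reconstruction:", recon_loss, "KL:", kl_loss)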

Motion Synthesis Using Pre-trained MT-VAE Model

  • Please run the following command to generate multiple diverse human motions given an initial motion (the idea is sketched below).
bash demo_human36m_inferMTVAE.sh
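
Conceptually, the trained model decodes several latent samples drawn from the prior into distinct futures for the same observed motion. The stand-in decoder below is a random linear map purely so the snippet runs; it is not the repository's API, and all names are hypothetical.

import numpy as np
rng = np.random.default_rng(0)
n_joints, past_len, future_len, latent_dim = 32, 16, 16, 8
past = rng.standard_normal((past_len, n_joints))  # the observed (initial) motion
W_dec = rng.standard_normal((future_len * n_joints, past_len * n_joints + latent_dim)) * 0.01
def decode(past_motion, z):
    # Stand-in for a trained decoder: future = f(past, z).
    return (W_dec @ np.concatenate([past_motion.ravel(), z])).reshape(future_len, n_joints)
# Each latent sample z ~ N(0, I) decodes to a different plausible continuation.
futures = [decode(past, rng.standard_normal(latent_dim)) for _ in range(5)]
print(len(futures), futures[0].shape)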

Motion Analogy-making Using Pre-trained MT-VAE Model

  • Please run the following command to perform motion analogy-making (the idea is sketched below).
bash demo_human36m_analogyMTVAE.sh
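
Conceptually, analogy-making infers the latent transformation that turns motion A into motion B and applies it to a third motion C to synthesize D (A : B :: C : D). The encode_transform and decode functions below are random stand-ins so the snippet runs; they are not the repository's API, and all names are hypothetical.

import numpy as np
rng = np.random.default_rng(0)
n_joints, seq_len, latent_dim = 32, 16, 8
W_enc = rng.standard_normal((latent_dim, 2 * seq_len * n_joints)) * 0.01
W_dec = rng.standard_normal((seq_len * n_joints, seq_len * n_joints + latent_dim)) * 0.01
def encode_transform(a, b):
    # Stand-in encoder: summarize the change from motion a to motion b as a latent code z.
    return W_enc @ np.concatenate([a.ravel(), b.ravel()])
def decode(c, z):
    # Stand-in decoder: apply the latent transformation z to motion c.
    return (W_dec @ np.concatenate([c.ravel(), z])).reshape(seq_len, n_joints)
A, B, C = (rng.standard_normal((seq_len, n_joints)) for _ in range(3))
z_ab = encode_transform(A, B)  # "A is to B ..."
D = decode(C, z_ab)            # "... as C is to D"
print(D.shape)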

Hierarchical Video Synthesis Using a Pre-trained Image Generation Model

CUDA_VISIBLE_DEVICES=0 python h36m_hierach_gensample.py
  • Disclaimer: Please double-check the license of that repository and cite the HierchVid paper when you use it.

Citation

If you find this work useful, please cite it as follows:

@inproceedings{yan2018mt,
  title={MT-VAE: Learning Motion Transformations to Generate Multimodal Human Dynamics},
  author={Yan, Xinchen and Rastogi, Akash and Villegas, Ruben and Sunkavalli, Kalyan and Shechtman, Eli and Hadap, Sunil and Yumer, Ersin and Lee, Honglak},
  booktitle={European Conference on Computer Vision},
  pages={276--293},
  year={2018},
  organization={Springer}
}

Acknowledgements

We would like to thank the amazing developers and the open-source community. Our implementation especially benefited from the following excellent repositories: