The Power of Points for Modeling Humans in Clothing (ICCV 2021)

This repository contains the official PyTorch implementation of the ICCV 2021 paper:

The Power of Points for Modeling Humans in Clothing <br> Qianli Ma, Jinlong Yang, Siyu Tang, and Michael J. Black <br>Paper | Supp | Video | Project website | Dataset

Installation

Run POP

Download pre-trained models and data

Run inference

With the data and pre-trained model ready, run the following command:

python main.py --config configs/config_demo.yaml --mode test

The command will run inference with the pre-trained model on both the seen and unseen outfit scenarios.

Remark: The results for the unseen scenario correspond to single scan animation. When testing on unseen outfits, the code first takes a single scan from the unseen outfit data, optimizes the geometric feature tensor w.r.t. it, and then animates the outfit according to the given query poses (see the sketch below). See the command line outputs for more detailed information.
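For intuition, here is a minimal, self-contained PyTorch sketch of that single-scan fitting step. The names (`decoder`, `geom_feat`) and the plain Chamfer loss are hypothetical stand-ins, not the repository's actual modules; the key point is that only the geometric feature tensor receives gradients while the trained network stays frozen.

```python
import torch

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3)."""
    d = torch.cdist(a, b)                                # (N, M) pairwise distances
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

# Stand-ins for the trained network and data; in the real code these come from
# the downloaded checkpoint and the unseen-outfit scan.
decoder = torch.nn.Linear(64, 3)                         # placeholder for the trained decoder
scan_points = torch.rand(2048, 3)                        # a single scan of the unseen outfit
geom_feat = torch.randn(2048, 64, requires_grad=True)    # geometric feature tensor to optimize

for p in decoder.parameters():
    p.requires_grad_(False)                              # keep the network weights fixed

opt = torch.optim.Adam([geom_feat], lr=3e-4)
for step in range(200):
    opt.zero_grad()
    pred = decoder(geom_feat)                            # decode features to a point set
    loss = chamfer_distance(pred, scan_points)           # fit the single scan
    loss.backward()
    opt.step()

# The optimized geom_feat is then kept fixed and combined with the query poses
# to animate the unseen outfit.
```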

Visualize the results

To render images of the point sets generated as above, run the following command:

python render/o3d_render_pcl.py --name POP_pretrained_ReSynthdata_12outfits --case seen --query_resolution 256

The rendered images of each outfit will be saved in their respective subfolders under results/rendered_imgs/POP_pretrained_ReSynthdata_12outfits/.
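If you prefer to render the generated point sets yourself, below is a minimal sketch of offscreen rendering with Open3D, in the spirit of render/o3d_render_pcl.py. The input glob is hypothetical; point it at wherever your generated .ply files live.

```python
import glob
import open3d as o3d

vis = o3d.visualization.Visualizer()
vis.create_window(visible=False)                 # render offscreen, no GUI window

# Hypothetical location of the generated point sets; adjust to your output folder.
for ply_path in sorted(glob.glob("results/**/*.ply", recursive=True)):
    pcd = o3d.io.read_point_cloud(ply_path)      # load one generated point cloud
    vis.add_geometry(pcd)
    vis.capture_screen_image(ply_path.replace(".ply", ".png"), do_render=True)
    vis.remove_geometry(pcd)                     # clear the scene for the next cloud

vis.destroy_window()
```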

Notes

Train POP

Training demo with our data examples

Training with your own data

News

License

Software Copyright License for non-commercial scientific research purposes. Please read carefully the terms and conditions and any accompanying documentation before you download and/or use the POP code, including the scripts, animation demos and pre-trained models. By downloading and/or using the Model & Software (including downloading, cloning, installing, and any other use of this GitHub repository), you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Model & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

The SMPL/SMPL-X body related files (including all .obj files, uv masks and barycentric coordinate values under the assets/ folder) are subject to the license of the SMPL model / SMPL-X model. The provided demo data (including the body pose and the meshes of clothed human bodies) are subject to the license of the CAPE Dataset. The Chamfer Distance implementation is subject to its original license.

Citations

If you find the code of this work or the associated ReSynth dataset helpful to your research, please consider citing:

@inproceedings{POP:ICCV:2021,
  title = {The Power of Points for Modeling Humans in Clothing},
  author = {Ma, Qianli and Yang, Jinlong and Tang, Siyu and Black, Michael J.},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month = oct,
  year = {2021},
  month_numeric = {10}
}

Related Research

SCALE: Modeling Clothed Humans with a Surface Codec of Articulated Local Elements (CVPR 2021)<br> Qianli Ma, Shunsuke Saito, Jinlong Yang, Siyu Tang, Michael J. Black

Our previous point-based model for humans: it explicitly models the pose-dependent shapes of clothed humans with hundreds of articulated surface elements, and the clothing deforms naturally even in the presence of topological changes.

SCANimate: Weakly Supervised Learning of Skinned Clothed Avatar Networks (CVPR 2021)<br> Shunsuke Saito, Jinlong Yang, Qianli Ma, Michael J. Black

Our implicit solution to pose-dependent shape modeling: cycle-consistent implicit skinning fields + a locally pose-aware implicit function = a fully animatable avatar with an implicit surface, learned from raw scans without surface registration!

Learning to Dress 3D People in Generative Clothing (CVPR 2020)<br> Qianli Ma, Jinlong Yang, Anurag Ranjan, Sergi Pujades, Gerard Pons-Moll, Siyu Tang, Michael J. Black

CAPE: a generative model and a large-scale dataset of 3D clothed human meshes in varied poses and garment types. We trained POP using the CAPE dataset; check it out!