# DeepMetaHandles (CVPR 2021 Oral)
<img src="fig/teaser.jpg" align="center">

<div float="center">
  <img src="fig/chair0/5c70ab.gif" width="10.2%">
  <img src="fig/chair0/11e521.gif" width="10.2%">
  <img src="fig/chair0/587ee5.gif" width="10.2%">
  <img src="fig/chair7/4a0e7f.gif" width="10.2%">
  <img src="fig/chair7/37a095.gif" width="10.2%">
  <img src="fig/chair7/a2bffa.gif" width="10.2%">
  <img src="fig/chair6/4a0e7f.gif" width="10.2%">
  <img src="fig/chair6/9aa05f.gif" width="10.2%">
  <img src="fig/chair6/39fee0.gif" width="10.2%">
</div>
<div float="center">
  <img src="fig/chair5/7e4335.gif" width="10.2%">
  <img src="fig/chair5/104256.gif" width="10.2%">
  <img src="fig/chair5/f76d50.gif" width="10.2%">
  <img src="fig/chair9/11e521.gif" width="10.2%">
  <img src="fig/chair9/f1563f.gif" width="10.2%">
  <img src="fig/chair9/fde8c8.gif" width="10.2%">
  <img src="fig/chair13/3e72bf.gif" width="10.2%">
  <img src="fig/chair13/5c6c95.gif" width="10.2%">
  <img src="fig/chair13/5c70ab.gif" width="10.2%">
</div>

[project] [paper] [demo] [animations]
DeepMetaHandles is a shape-deformation technique that learns a set of meta-handles for each given shape. The disentangled meta-handles factorize all plausible deformations of the shape, and each meta-handle corresponds to an intuitive deformation direction. A new deformation can then be generated as a "linear combination" of the meta-handles. Although the approach is trained in an unsupervised manner, the learned meta-handles possess strong interpretability and consistency.
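Concretely, each meta-handle is a field of control-point offsets, and biharmonic coordinates propagate the combined offsets to every mesh vertex. Below is a minimal NumPy sketch of that composition; the array shapes and variable names are illustrative, not the repo's API:

```python
# Sketch: a deformation is a linear combination of meta-handles,
# propagated to the mesh vertices by biharmonic coordinates.
import numpy as np

num_verts, num_ctrl, num_handles = 2000, 50, 15
V = np.random.rand(num_verts, 3)                       # surface-mesh vertices (N x 3)
W = np.random.rand(num_verts, num_ctrl)                # biharmonic coordinates (N x K)
M = 0.01 * np.random.randn(num_handles, num_ctrl, 3)   # meta-handles: control-point offset fields
c = np.random.uniform(-1.0, 1.0, num_handles)          # combination coefficients

ctrl_offset = np.einsum('h,hkd->kd', c, M)  # "linear combination" of the meta-handles
V_deformed = V + W @ ctrl_offset            # propagate control-point offsets to every vertex
```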
## Environment setup
- Create a conda environment: `conda env create -f environment.yml`.
- Build and install torch-batch-svd.
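After building, you can sanity-check the install with a couple of lines like the following. This assumes the package is importable as `torch_batch_svd` and a CUDA device is available:

```python
# Quick check that torch-batch-svd built correctly (requires CUDA).
import torch
from torch_batch_svd import svd

A = torch.randn(32, 3, 3, device='cuda')
U, S, V = svd(A)  # batched SVD: A = U diag(S) V^T
print(torch.dist(A, U @ torch.diag_embed(S) @ V.transpose(-2, -1)))  # should be ~0
```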
## Demo
- Download `data/demo` and `checkpoints/chair_15.pth` from here and place them in the corresponding folders. The pre-processed demo data contains the manifold mesh, the sampled control points, a sampled surface point cloud, and the corresponding biharmonic coordinates.
- Run `src/demo_target_driven_deform.py` to deform a source shape to match a target shape (see the conceptual sketch after this list).
- Run `src/demo_meta_handle.py` to generate deformations along the direction of each learned meta-handle.
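For intuition only, target-driven deformation can be thought of as fitting combination coefficients so that the deformed source matches the target. The PyTorch sketch below illustrates that idea on random placeholder data; it is not `src/demo_target_driven_deform.py`, which uses the trained network and the pre-processed inputs:

```python
# Conceptual sketch: fit meta-handle coefficients to a target point cloud
# by minimizing a symmetric chamfer distance. All data are placeholders.
import torch

def chamfer(a, b):
    d = torch.cdist(a, b)  # pairwise distances (P x Q)
    return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

P = torch.rand(2000, 3)                                 # source surface points
W = torch.rand(2000, 50); W /= W.sum(1, keepdim=True)   # biharmonic coordinates
M = 0.01 * torch.randn(15, 50, 3)                       # meta-handles (placeholders)
target = torch.rand(2000, 3)                            # target point cloud
c = torch.zeros(15, requires_grad=True)                 # coefficients to optimize

opt = torch.optim.Adam([c], lr=1e-2)
for _ in range(200):
    offset = torch.einsum('h,hkd->kd', c, M)  # combine the meta-handles
    loss = chamfer(P + W @ offset, target)
    opt.zero_grad(); loss.backward(); opt.step()
```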
## Train
- Download `data/chair` from here and place it in the corresponding folder.
- Start the visdom server. (We use visdom to visualize the training process; see the connectivity check after this list.)
- Run `src/train.py` to start training.
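If you are new to visdom: start the server with `python -m visdom.server`, then verify the connection before launching training. The plot below is just a dummy example:

```python
# Minimal visdom connectivity check (assumes the default server at localhost:8097).
import numpy as np
import visdom

viz = visdom.Visdom()
assert viz.check_connection(), 'start the server first: python -m visdom.server'
viz.line(X=np.arange(5), Y=np.random.rand(5), opts={'title': 'dummy loss'})
```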
Note: for different categories, you may need to adjust the number of meta-handles and tune the weights of the loss functions; different sets of weights can produce significantly different results.
## Pre-process your own data
- Compile the code in `data_preprocessing/`.
- Build and run manifold to convert your meshes into watertight manifolds.
- Run `data_preprocessing/normalize_bin` to normalize each manifold into a unit bounding sphere.
- Build and run fTetWild to convert your manifolds into tetrahedral meshes. Use the `--output xxx.mesh` option to generate the tet mesh in `.mesh` format. You will also get a `xxx.mesh__sf.obj` file for the surface mesh. We use `xxx.mesh` and `xxx.mesh__sf.obj` to calculate the biharmonic weights; only `xxx.mesh__sf.obj` is deformed later.
- Run `data_preprocessing/sample_key_points_bin` to sample control points from `xxx.mesh__sf.obj`. We use the FPS algorithm over edge distances to sample the control points.
- Run `data_preprocessing/calc_weight_bin` to calculate the biharmonic weights. It takes `xxx.mesh`, `xxx.mesh__sf.obj`, and the control-point file as input, and outputs a text file containing the weight matrix for the vertices in `xxx.mesh__sf.obj`.
- Run `data_preprocessing/sample_surface_points_bin` to sample points on `xxx.mesh__sf.obj` and calculate the corresponding biharmonic weights for the sampled point cloud.
- In our training, we remove shapes (about 10%) whose biharmonic weight matrix contains elements smaller than -1.5 or greater than 1.5. We find this helps training converge faster.
- To reduce I/O time during training, you may compress the data into a compact form and load it into memory. For example, you can use the Python scripts in `data_preprocessing/merge_data` to convert the C++ output into NumPy files (see the sketch after this list).
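As a rough illustration of the last two steps, filtering by the weight range and packing a text weight matrix into a NumPy file could look like the following. The file names are hypothetical; the real scripts live in `data_preprocessing/merge_data`:

```python
# Hedged sketch: load a text weight matrix produced by calc_weight_bin,
# apply the [-1.5, 1.5] filter, and save a compact NumPy copy.
import numpy as np

W = np.loadtxt('xxx_weight.txt', dtype=np.float32)  # one row of weights per vertex
if W.min() < -1.5 or W.max() > 1.5:
    print('skip: extreme biharmonic weights (this shape would be filtered out)')
else:
    np.save('xxx_weight.npy', W)                    # faster to load during training
```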
## Citation
If you find our work useful, please consider citing our paper:
```bibtex
@article{liu2021deepmetahandles,
  title={DeepMetaHandles: Learning Deformation Meta-Handles of 3D Meshes with Biharmonic Coordinates},
  author={Liu, Minghua and Sung, Minhyuk and Mech, Radomir and Su, Hao},
  journal={arXiv preprint arXiv:2102.09105},
  year={2021}
}
```