Neural Kernel Surface Reconstruction
Jiahui Huang, Zan Gojcic, Matan Atzmon, Or Litany, Sanja Fidler, Francis Williams
Paper, Project Page

Abstract: We present a novel method for reconstructing a 3D implicit surface from a large-scale, sparse, and noisy point cloud. Our approach builds upon the recently introduced Neural Kernel Fields (NKF) representation. It enjoys similar generalization capabilities to NKF, while simultaneously addressing its main limitations: (a) We can scale to large scenes through compactly supported kernel functions, which enable the use of memory-efficient sparse linear solvers. (b) We are robust to noise, through a gradient fitting solve. (c) We minimize training requirements, enabling us to learn from any dataset of dense oriented points, and even mix training data consisting of objects and scenes at different scales. Our method is capable of reconstructing millions of points in a few seconds, and handling very large scenes in an out-of-core fashion. We achieve state-of-the-art results on reconstruction benchmarks consisting of single objects, indoor scenes, and outdoor scenes.

For business inquiries, please visit our website and submit the form: NVIDIA Research Licensing

News

Environment setup

We recommend using the latest Python and PyTorch to run our algorithm. To install all dependencies using conda:

# Clone the repository
git clone git@github.com:nv-tlabs/nksr.git
cd nksr

# Create conda environment
conda env create

# Activate it
conda activate nksr

# Install NKSR
pip install nksr -f https://nksr.huangjh.tech/whl/torch-2.0.0+cu118.html

For Docker users, we suggest starting from the nvidia/cuda base image with tag 11.8.0-cudnn8-devel-ubuntu22.04 and applying the conda setup above inside it.

Testing NKSR on your own data

We have tested our algorithm on multiple spatial scales: it can reconstruct scenes spanning kilometers and containing millions of points on a single RTX 3090 GPU. To use our kitchen-sink model (released under the CC-BY-SA 4.0 license), the following code snippet suffices:

import torch
import numpy as np
import nksr

device = torch.device("cuda:0")

# load_bunny_example() is provided with the NKSR examples; it returns an
# oriented point cloud exposing .points and .normals arrays.
bunny_geom = load_bunny_example()

input_xyz = torch.from_numpy(np.asarray(bunny_geom.points)).float().to(device)
input_normal = torch.from_numpy(np.asarray(bunny_geom.normals)).float().to(device)

reconstructor = nksr.Reconstructor(device)
field = reconstructor.reconstruct(input_xyz, input_normal, detail_level=1.0)
mesh = field.extract_dual_mesh(mise_iter=1)
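The extracted mesh exposes vertex and face arrays (`mesh.v` and `mesh.f` in the snippet above, convertible via `.cpu().numpy()`). As a minimal sketch of saving such a result without extra dependencies, a tiny Wavefront OBJ writer (our own helper, not part of the NKSR API) could look like:

```python
# Minimal OBJ export sketch. `save_obj` is a hypothetical helper;
# pass it mesh.v.cpu().numpy() and mesh.f.cpu().numpy() from the snippet above.
def save_obj(vertices, faces, path):
    """Write a triangle mesh to a Wavefront OBJ file.

    vertices: iterable of (x, y, z) floats
    faces:    iterable of (i, j, k) 0-based vertex indices
    """
    with open(path, "w") as f:
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for i, j, k in faces:
            # OBJ face indices are 1-based
            f.write(f"f {i + 1} {j + 1} {k + 1}\n")

# Tiny usage example: a single triangle
save_obj([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)], "triangle.obj")
```

Libraries such as open3d or trimesh offer richer exporters if you already have them installed.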

Detailed instructions on data preparation and further example usages are available at the NKSR Documentation Page.
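If your own points live in a plain-text file, one possible loader is sketched below. The `x y z nx ny nz` row format is an assumption for illustration only; adapt the slicing to your data. Conversion to GPU tensors then mirrors the snippet above.

```python
import numpy as np

# Hypothetical input format: whitespace-separated rows of x y z nx ny nz.
def load_oriented_points(path):
    data = np.atleast_2d(np.loadtxt(path, dtype=np.float32))
    xyz, normals = data[:, :3], data[:, 3:6]
    # Defensively normalize the orientation vectors to unit length.
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    return xyz, normals

# Downstream conversion (as in the reconstruction snippet above):
#   input_xyz = torch.from_numpy(xyz).to(device)
#   input_normal = torch.from_numpy(normals).to(device)
```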

Reproducing results from the paper

Our training and inference system is based on the Zeus deep learning infrastructure, which supports both TensorBoard and wandb (recommended) as loggers. To configure Zeus, copy the default YAML file and modify the relevant paths:

cp configs/default/zeus.yaml zeus_config.yaml

Edit zeus_config.yaml to set your wandb account name and the directories where checkpoints and test results should be saved.

Training

Data download links:

The main training script is train.py. We provide a config file for each dataset benchmarked in our paper:

# ShapeNet small noise 1K input
python train.py configs/shapenet/train_1k_perfect.yaml
# ShapeNet medium noise 3K input
python train.py configs/shapenet/train_3k_noise.yaml
# ShapeNet big noise 3K input
python train.py configs/shapenet/train_3k_noiser.yaml
# Points2Surf dataset noisy input
python train.py configs/points2surf/train.yaml
# CARLA dataset
python train.py configs/carla/train.yaml

In addition, you can manually specify different training settings to obtain models that suit your needs. Common flags include:

Inference

You can either infer using your own trained models or our pre-trained checkpoints.

# From pre-trained checkpoints
python test.py configs/shapenet/train_3k_noise.yaml --url https://nksr.huangjh.tech/snet-n3k-wnormal.pth --exec udf.enabled=False
python test.py configs/points2surf/train.yaml --url https://nksr.huangjh.tech/p2s.pth --include configs/points2surf/data_abc_test.yaml
python test.py configs/carla/train.yaml --url https://nksr.huangjh.tech/carla.pth --include configs/carla/data_no_patch.yaml

# From your own trained models
python test.py none --ckpt wdb:<WANDB_USER_NAME>/<WANDB_PROJECT>/<WANDB_RUN_ID>

Useful flags for test.py include:

License

Copyright © 2023, NVIDIA Corporation & affiliates. All rights reserved. This work is made available under the NVIDIA Source Code License.

Related Works

NKSR builds heavily on the following existing works:

Citation

@inproceedings{huang2023nksr,
  title={Neural Kernel Surface Reconstruction},
  author={Huang, Jiahui and Gojcic, Zan and Atzmon, Matan and Litany, Or and Fidler, Sanja and Williams, Francis},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={4369--4379},
  year={2023}
}