LayoutNet

New: please check out our official PyTorch implementation of LayoutNet v2.

Torch implementation of our CVPR 18 paper: "LayoutNet: Reconstructing the 3D Room Layout from a Single RGB Image"

See the sample video of 3D layouts reconstructed by our method.

<img src='figs/teasor.jpg' width=400>

Prerequisites

matio: https://github.com/tbeu/matio
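
The training and preprocessing scripts read .mat files from Lua. As a quick sanity check that matio is visible from Torch, here is a minimal sketch; it assumes a Lua binding for matio (e.g. matio-ffi.torch) is installed on top of the C library above, and the file name is a hypothetical placeholder:

-- quick check that .mat files can be read from Torch
-- (assumes a Lua matio binding such as matio-ffi.torch on top of the C library above)
local matio = require 'matio'

-- 'sample.mat' is a placeholder; point it at one of the ground-truth .mat files
local data = matio.load('sample.mat')
for name, value in pairs(data) do
  print(name, value)  -- list the variables stored in the file
end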

Data

This includes the panoramas from both the PanoContext dataset and our labeled Stanford 2D-3D dataset.

This includes the ground-truth 2D positions of room corners in .mat format for the two datasets. We've corrected some wrong corner labels in PanoContext so that they match the layout boundaries.

Pretrained model

  1. The full approach pretrained on the PanoContext dataset, the joint boundary and corner prediction branch, the single boundary prediction branch, and the 3D layout box regressor;

  2. The full approach pretrained on the LSUN dataset (we've corrected the roughly 10% of labels that were wrong), the joint boundary and corner prediction branch, and the single boundary prediction branch.
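
As a rough illustration of how a downloaded pretrained .t7 model can be used from Torch (a sketch only: the file name, input channels and resolution below are placeholders, not necessarily what this repo expects):

require 'nn'
-- if the model was trained on GPU, you may also need: require 'cunn'; require 'cudnn'

-- placeholder path; use the downloaded pretrained model file
local model = torch.load('pretrained_pano_full.t7')
model:evaluate()  -- inference mode (affects dropout / batch normalization)

-- placeholder input: 1 sample, 6 channels (RGB + Manhattan line maps), 512x1024 panorama
local input = torch.rand(1, 6, 512, 1024)
local output = model:forward(input)
print(output)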

Image preprocessing

We provide a sample script, ./matlab/getManhattanAndAlign.m, that extracts Manhattan lines and aligns the panorama.

To generate the ground-truth edge maps, corner maps and box parameters, see the sample script ./matlab/preprocessPano.m.

To convert the ground-truth data to .t7 files, see the sample code in preProcess_pano.lua.
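
The conversion step amounts to reading the .mat ground truth and serializing it as Torch objects; a simplified sketch of the idea is below (field and file names are hypothetical, see preProcess_pano.lua for the actual logic):

require 'torch'
local matio = require 'matio'

-- placeholder file/field names; the real ones are defined in preProcess_pano.lua
local gt = matio.load('pano_gt_0001.mat')
local sample = {
  cor  = gt.cor,   -- 2D corner positions (hypothetical field name)
  edge = gt.edge,  -- ground-truth edge map (hypothetical field name)
}
torch.save('pano_gt_0001.t7', sample)  -- .t7 file consumed by the training drivers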

Train network

th driver_pano_full.lua

Note that this loads the pretrained joint prediction branch and the 3D layout box regressor.

th driver_pano_joint.lua

Note that this loads the pretrained boundary prediction branch.

th driver_pano_edg.lua
th driver_pano_box.lua

Test network

th testNet_pano_full.lua

This saves the predicted boundary maps, corner maps and 3D layout parameters in the "result/" folder.
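
If you want to inspect those predictions from Lua before the Matlab optimization step, something along these lines works; note that the file and field names below are hypothetical placeholders, check testNet_pano_full.lua for the exact output format:

require 'torch'
require 'image'

-- placeholder names; see testNet_pano_full.lua for what is actually written to result/
local pred = torch.load('result/pano_0001.t7')
print(pred)                                  -- inspect the stored fields
image.save('boundary_0001.png', pred.edge)   -- e.g. dump a predicted boundary map to PNG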

Optimization

cd matlab
panoOptimization.m

This loads the predictions saved by the network and performs the layout sampling step.

Evaluation

We provide Matlab evaluation code for 3D IoU (compute3dOcc_eval.m) and for generating 2D layout labels (getSegMask_eval.m) used to evaluate layout pixel accuracy.
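
For reference, 3D IoU is the volume of the intersection over the volume of the union of the predicted and ground-truth layout boxes, and layout pixel accuracy is the fraction of pixels whose predicted layout label matches the ground truth. A minimal stand-alone sketch of the pixel-accuracy part (not the repo's Matlab implementation; the random label maps are placeholders):

require 'torch'

-- pred and gt would be integer label maps of the same size (e.g. wall/floor/ceiling ids);
-- random tensors are used here only as placeholders
local pred = torch.LongTensor(512, 1024):random(1, 5)
local gt   = torch.LongTensor(512, 1024):random(1, 5)

local correct  = torch.eq(pred, gt):sum()   -- number of pixels labeled correctly
local pixelAcc = correct / gt:nElement()    -- fraction of correctly labeled pixels
print(string.format('layout pixel accuracy: %.4f', pixelAcc))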

Extension to perspective images

th driver_persp_joint_lsun_type.lua

Note that this loads the pretrained joint corner and boundary prediction branch.

th driver_persp_joint_lsun.lua

Note that this loads the pretrained boundary prediction branch.

th driver_persp_lsun.lua
th testNet_persp_full_lsun.lua

Note that this saves the predicted boundary maps, corner maps and room types in the "result/" folder. To get the exact 2D corner positions on the image, run the following in Matlab:

cd matlab
getLSUNRes.m

You need to download the LSUN data and the toolbox to run through the experiment.


Citation

Please cite our paper if you use this code or data for any purpose.

@inproceedings{zou2018layoutnet,
  title={LayoutNet: Reconstructing the 3D Room Layout from a Single RGB Image},
  author={Zou, Chuhang and Colburn, Alex and Shan, Qi and Hoiem, Derek},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={2051--2059},
  year={2018}
}