OctAttention: Octree-Based Large-Scale Contexts Model for Point Cloud Compression (AAAI 2022 paper)

Branches

There are two branches, obj and lidar, which implement object and LiDAR point cloud coding respectively. They share the same network. Note that the checkpoint files are saved separately in the corresponding branches; the pretrained model for LiDAR compression is provided in the lidar branch.

Requirements

Download and Prepare Training and Testing Data

Set oriDir in dataPrepare.py first, then run:

```
python dataPrepare.py
```

This prepares the training and testing data and generates *.mat files in the Data directory.
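
For illustration, a minimal sketch of the setting mentioned above; the path is a placeholder, and the exact location of the variable inside dataPrepare.py may differ:

```python
# Hypothetical excerpt of dataPrepare.py; replace the placeholder path
# with the directory that holds your raw point clouds.
oriDir = '/path/to/your/point/clouds'
```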

Train

```
python octAttention.py
```

You should set the network parameters (expName, DataRoot, etc.) in networkTool.py. Checkpoints will be written to the expName folder, e.g. Exp/Kitti. (Note: for a new dataset, you should run DataFolder.calcdataLenPerFile() in dataset.py; you can comment it out once you have obtained the dataLenPerFile parameter.)
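
For reference, a hedged sketch of the settings named above; the identifiers come from the text, but the values are placeholders and the actual layout of networkTool.py may differ:

```python
# Hypothetical excerpt of networkTool.py; adjust the values to your setup.
expName = 'Exp/Kitti'  # folder where training checkpoints are written
DataRoot = 'Data/'     # directory containing the prepared *.mat files
```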

Encode and Decode

You may need to run the following command to grant execute permission to pc_error and tmc13v14_r (release version):

```
chmod +x file/pc_error file/tmc13v14_r
python encoder.py
```

This outputs the binary codes as .bin files in Exp(expName)/data and generates *.mat data in the directory Data/testPly.
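
As a quick sanity check (not part of the repository), you can estimate the bitrate of an encoded file; the file name and point count below are placeholders:

```python
# Minimal sketch for computing bits per point (bpp) of an encoded bitstream.
import os

num_points = 100_000  # placeholder: number of points in the input cloud
bits = os.path.getsize('Exp/Kitti/data/example.bin') * 8
print(f'{bits / num_points:.3f} bpp')
```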

```
python decoder.py
```

This loads the *.mat data to verify the reconstruction and computes PSNR with pc_error.
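
decoder.py invokes pc_error for you; if you want to run the metric by hand, a sketch along the following lines should work. The flag names follow common mpeg-pcc-dmetric usage (verify with ./file/pc_error --help), and the .ply paths are placeholders:

```python
# Hedged sketch: computing point-to-point (D1) PSNR between the original
# and the reconstructed point cloud with the pc_error tool.
import subprocess

subprocess.run([
    './file/pc_error',
    '--fileA=original.ply',  # placeholder: reference point cloud
    '--fileB=decoded.ply',   # placeholder: reconstructed point cloud
    '--resolution=1023',     # placeholder: peak value used for PSNR
], check=True)
```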

Test TMC

We provide test code for TMC13 v14 (G-PCC) on object and LiDAR point cloud compression.

```
python testTMC.py
```
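
testTMC.py wraps the TMC13 binary shipped in file/. For reference, a hedged sketch of a direct encode call; the flag names follow mpeg-pcc-tmc13 conventions and the paths are placeholders:

```python
# Hedged sketch: encoding a point cloud directly with the TMC13 release binary.
import subprocess

subprocess.run([
    './file/tmc13v14_r',
    '--mode=0',                          # 0 = encode, 1 = decode
    '--uncompressedDataPath=input.ply',  # placeholder: input point cloud
    '--compressedStreamPath=out.bin',    # placeholder: output bitstream
], check=True)
```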

Citation

If this work is useful for your research, please consider citing:

```
@article{OctAttention,
  title={OctAttention: Octree-Based Large-Scale Contexts Model for Point Cloud Compression},
  author={Fu, Chunyang and Li, Ge and Song, Rui and Gao, Wei and Liu, Shan},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={36},
  number={1},
  pages={625--633},
  year={2022},
  month={Jun.},
  url={https://ojs.aaai.org/index.php/AAAI/article/view/19942},
  DOI={10.1609/aaai.v36i1.19942}
}
```