# OASim: an Open and Adaptive Simulator based on Neural Rendering for Autonomous Driving
https://github.com/PJLab-ADG/OASim/assets/24562717/ddb580c4-d329-419e-87f2-f3c8f487d8f7
## 🔍 Framework Overview
<img src="assets/framework.jpg" width=100%>

OASim generates new, highly customizable autonomous driving data through neural implicit reconstruction and rendering. This enables a wide range of applications, such as large-scale data and scene generation, corner-case generation, closed-loop training for autonomous driving, and autonomous driving stack testing.
## 🌟 Highlights
- **2024-02-08** Code is now released!
- **2024-02-07** Explore our project page, now live here🔗!
- **2024-02-07** Our paper is available on arXiv📄!
## 🚀 Getting Started
First, clone with submodules:

```bash
git clone git@github.com:PJLab-ADG/OASim.git --recurse-submodules -j8
```
### 1. Installation 📦
Our code is developed on Ubuntu 22.04 using Python 3.9 and PyTorch 2.0 (CUDA 11.8). Please note that the code has only been tested with these versions. We recommend using conda to install the dependencies; installation might take more than 30 minutes. A quick environment sanity check is sketched after the dependency list below.
```bash
conda create -n oasim python=3.9
conda activate oasim
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.8 -c pytorch -c nvidia
```
- pytorch_scatter

  ```bash
  conda install pytorch-scatter -c pyg
  ```
- other pip packages

  ```bash
  pip install opencv-python-headless kornia imagesize omegaconf addict imageio imageio-ffmpeg scikit-image scikit-learn pyyaml pynvml psutil seaborn==0.12.0 trimesh plyfile ninja icecream tqdm tensorboard PySide6 vtk dash-vtk numpy==1.24.2 dearpygui==1.8.0 matplotlib==3.7.1 pandas==2.0.0 pynput==1.7.6 rich==13.4.2 sumolib==1.16.0 traci==1.16.0
  ```
- the most recent SUMO

  ```bash
  sudo add-apt-repository ppa:sumo/stable
  sudo apt-get update
  sudo apt-get install sumo sumo-tools sumo-doc
  ```
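Optionally, a quick sanity check like the one below (a minimal sketch, not part of OASim) can confirm from the `oasim` environment that the pinned PyTorch/CUDA build, `torch_scatter`, and the SUMO command-line tools are all visible:

```python
# Minimal environment sanity check (not part of OASim): verifies the versions
# pinned above and that the SUMO binaries are reachable on PATH.
import os
import shutil

import torch
import torch_scatter  # installed via `conda install pytorch-scatter -c pyg`

print("PyTorch:", torch.__version__)          # expect 2.0.x
print("CUDA available:", torch.cuda.is_available())
print("CUDA build:", torch.version.cuda)      # expect 11.8
print("torch_scatter:", torch_scatter.__version__)
print("sumo binary:", shutil.which("sumo"))   # None means SUMO is not on PATH
print("SUMO_HOME:", os.environ.get("SUMO_HOME", "<not set>"))
```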
### 2. Configuration ⚙️
`cd` to the `neuralsim/nr3d_lib` directory, then run (note the trailing dot `.`):

```bash
pip install -v .
```
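Optionally, you can verify the install from Python. This assumes the package is importable as `nr3d_lib` (inferred from the directory name); adjust the module name if it differs:

```python
# Optional check that the nr3d_lib install succeeded.
# Assumes the module is importable as `nr3d_lib` (inferred from the directory name).
import importlib

try:
    mod = importlib.import_module("nr3d_lib")
    print("nr3d_lib imported from:", mod.__file__)
except ImportError as exc:
    print("nr3d_lib is not importable yet:", exc)
```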
Download the model file from the link below.

- Link: https://pan.baidu.com/s/1HtRb_8mCSFeNtg9iwBtICg?pwd=vj56
- Extraction code: vj56
### 3. Running OASim
Running OASim is straightforward:

```bash
bash scripts/run_oasim.sh
```
Note: Set the `--resume_dir` parameter in `run_oasim.sh` to the directory of the model file you downloaded.
**3D Preview:**

First, move the mouse over the 3D preview pane in the upper-right part of the window and right-click it. You can then inspect the 3D implicit reconstruction results with the keyboard. The controls are listed below; a conceptual sketch of how key presses could map to camera extrinsics follows the table.
| Extrinsic Params | Keyboard Input | Extrinsic Params | Keyboard Input |
|---|---|---|---|
| +x degree | q | -x degree | a |
| +y degree | w | -y degree | s |
| +z degree | e | -z degree | d |
| +x trans | r | -x trans | f |
| +y trans | t | -y trans | g |
| +z trans | y | -z trans | h |
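OASim handles these key presses inside its GUI; purely to illustrate the idea, the sketch below (not OASim's actual code; the step sizes, `KEYMAP`, and helper functions are hypothetical) shows how such key presses could be accumulated into a 4x4 camera extrinsic matrix, using NumPy and SciPy (which the dependencies above already pull in via scikit-image/scikit-learn):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical step sizes (not from OASim): 1 degree per rotation key, 0.1 m per translation key.
ROT_STEP_DEG, TRANS_STEP = 1.0, 0.1

# key -> (kind, axis index, sign), mirroring the table above.
KEYMAP = {
    "q": ("rot", 0, +1), "a": ("rot", 0, -1),
    "w": ("rot", 1, +1), "s": ("rot", 1, -1),
    "e": ("rot", 2, +1), "d": ("rot", 2, -1),
    "r": ("trans", 0, +1), "f": ("trans", 0, -1),
    "t": ("trans", 1, +1), "g": ("trans", 1, -1),
    "y": ("trans", 2, +1), "h": ("trans", 2, -1),
}

def apply_key(euler_deg, trans, key):
    """Update accumulated Euler angles (deg) and translation (m) for one key press."""
    kind, axis, sign = KEYMAP[key]
    if kind == "rot":
        euler_deg[axis] += sign * ROT_STEP_DEG
    else:
        trans[axis] += sign * TRANS_STEP
    return euler_deg, trans

def extrinsic(euler_deg, trans):
    """Compose a 4x4 camera-to-world matrix from the accumulated state."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("xyz", euler_deg, degrees=True).as_matrix()
    T[:3, 3] = trans
    return T

euler, t = [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]
for key in "qqwr":  # e.g. two +x rotations, one +y rotation, one +x translation
    euler, t = apply_key(euler, t, key)
print(extrinsic(euler, t))
```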
**Traffic flow editor:**

First, click the `Choose Start Lane` button in the middle panel and click any lane on the HD map to select the starting lane. Then click the `Choose Arrival Lane` button and click any lane on the HD map to select the destination lane. Finally, click the `Confirm` button to generate data. A minimal `sumolib` sketch of finding a route between two such lanes is shown after this paragraph.
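OASim generates the traffic flow itself once you click `Confirm`; the snippet below is only an illustration of the start-lane/arrival-lane idea, not OASim's code. It shows how a route between the edges of two lanes could be queried with `sumolib` (installed above); the network file `map.net.xml` and the lane IDs are hypothetical placeholders:

```python
import sumolib

# Hypothetical network file and lane IDs -- replace with your own HD map data.
net = sumolib.net.readNet("map.net.xml")

start_edge = net.getLane("start_edge_0").getEdge()      # lane -> parent edge
arrival_edge = net.getLane("arrival_edge_0").getEdge()

# Shortest path between the two edges (list of edges plus total cost).
edges, cost = net.getShortestPath(start_edge, arrival_edge)
if edges is None:
    print("No route found between the selected lanes.")
else:
    print("Route:", " -> ".join(e.getID() for e in edges), f"(cost {cost:.1f})")
```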
## 🔖 Citation
If you find our paper and code useful, please cite us via:
```bibtex
@misc{yan2024oasim,
      title={OASim: an Open and Adaptive Simulator based on Neural Rendering for Autonomous Driving},
      author={Guohang Yan and Jiahao Pi and Jianfei Guo and Zhaotong Luo and Min Dou and Nianchen Deng and Qiusheng Huang and Daocheng Fu and Licheng Wen and Pinlong Cai and Xing Gao and Xinyu Cai and Bo Zhang and Xuemeng Yang and Yeqi Bai and Hongbin Zhou and Botian Shi},
      year={2024},
      eprint={2402.03830},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```
## 📝 License
OASim is released under the Apache 2.0 license.
## Contact

If you have questions about this repo, please contact Yan Guohang (`yanguohang@pjlab.org.cn`).