<div align="center">

# 🤖 AGRNav: Efficient and Energy-Saving Autonomous Navigation for Air-Ground Robots in Occlusion-Prone Environments

</div>

## 🤗 AGR-Family Works
- OMEGA (RA-L 2024.12): The First AGR-Tailored Dynamic Navigation System.
- HE-Nav (RA-L 2024.09): The First AGR-Tailored ESDF-Free Navigation System.
- AGRNav (ICRA 2024.01): The First AGR-Tailored Occlusion-Aware Navigation System.
## 🎉 Chinese Media Reports/Interpretations
- AMOV Lab Research Scholarship -- 2024.11: 5000 RMB
- AMOV Lab Research Scholarship -- 2024.10: 5000 RMB
## 📢 News
- [2024/01]: AGRNav is accepted to ICRA 2024.
- [2023/11]: The SCONet training code is available in a separate repository.
- [2023/09]: The 3D model used in the simulation environment can be downloaded from OneDrive.
- [2023/08]: 🔥 We released the code of AGRNav in the simulation environment.
If you find this work helpful, kindly show your support by giving us a free ⭐️. Your recognition is truly valued.
<p align="center"> <img src="figs/sim1.gif" width="400" height="260" border="1" style="display:inline;"/> </p>

If you find this work useful in your research, please consider citing:
```bibtex
@INPROCEEDINGS{wang2024agrnav,
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Zhang, Zongyuan and Duan, Tianyang and Huang, Dong and Zhao, Shixiong and Cui, Heming},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  title={AGRNav: Efficient and Energy-Saving Autonomous Navigation for Air-Ground Robots in Occlusion-Prone Environments},
  year={2024},
  pages={11133-11139}
}
```
## 🔧 Hardware List
<div align="center">

| Hardware | Link |
|---|---|
| AMOV Lab P600 UAV | link |
| AMOV Lab Allapark1-Jetson Xavier NX | link |
| Wheeltec R550 ROS Car | link |
| Intel RealSense D435i | link |
| Intel RealSense T265 | link |
| TFmini Plus | link |

</div>
❗ Considering that visual positioning is prone to drift along the Z-axis, we added a TFmini Plus rangefinder for height measurement. Additionally, GNSS-RTK positioning is recommended for better localization accuracy.
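As an illustration only (not AGRNav's actual fusion code), a complementary filter is one simple way to combine a drift-prone visual Z estimate with an absolute TFmini range reading; the function name and the 0.98 weight below are assumptions:

```python
# Illustrative sketch only; AGRNav's real height fusion may differ.
ALPHA = 0.98  # weight on the visual-odometry height increment (assumed value)

def fuse_height(z_prev, dz_vio, z_tfmini):
    """Complementary filter: the visual increment tracks fast motion,
    while the TFmini reading slowly pulls out accumulated Z drift."""
    return ALPHA * (z_prev + dz_vio) + (1.0 - ALPHA) * z_tfmini

# Example: previous height 1.20 m, VIO increment +0.05 m, TFmini reads 1.10 m.
print(fuse_height(1.20, 0.05, 1.10))  # ~1.247 m, nudged toward the rangefinder
```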
🤑 Our customized Aerial-Ground Robot cost about RMB 70,000.
## 🛠️ Installation
The code was tested with `python=3.6.9`, `pytorch=1.10.0+cu111`, and `torchvision=0.11.2+cu111`.
Please follow the instructions here to install both PyTorch and TorchVision dependencies. Installing both PyTorch and TorchVision with CUDA support is strongly recommended.
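After installing, a quick sanity check (a minimal snippet, not part of the AGRNav codebase) confirms that the expected versions are present and CUDA is visible:

```python
import torch
import torchvision

# Expect 1.10.0+cu111 / 0.11.2+cu111 on a machine with CUDA support.
print(torch.__version__, torchvision.__version__)
print("CUDA available:", torch.cuda.is_available())
```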
- Clone the repository locally:

```bash
git clone https://github.com/jmwang0117/AGRNav.git
```
- We recommend using Docker to run the project, as it reduces the burden of configuring the environment. You can find the Dockerfile in our project; then execute the following command:

```bash
docker build . -t skywalker_robot -f Dockerfile
```
- After the image is built, use our one-click startup script in the same directory:

```bash
bash create_container.sh
```
- Next, enter the container:

```bash
docker exec -it robot bash
```
- Clone the repository again, this time inside the container:

```bash
git clone https://github.com/jmwang0117/AGRNav.git
```
- Since the point cloud needs to be saved temporarily during inference, please check the paths in the following files (see the illustrative sketch below):

```
/root/AGRNav/src/perception/launch/inference.launch
/root/AGRNav/src/perception/SCONet/network/data/SemanticKITTI.py
/root/AGRNav/src/perception/script/pointcloud_listener.py
```
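For orientation only, here is a minimal, hypothetical sketch of a listener node that saves incoming clouds to a temporary directory. The actual pointcloud_listener.py may be organized differently; the topic name and save path below are assumptions, not repo values:

```python
#!/usr/bin/env python
# Hypothetical sketch; the real pointcloud_listener.py may differ.
import os
import numpy as np
import rospy
from sensor_msgs import point_cloud2
from sensor_msgs.msg import PointCloud2

SAVE_DIR = "/tmp/agrnav_clouds"  # assumed path; keep it consistent with inference.launch

def callback(msg):
    # Flatten the cloud into an (N, 3) float32 array and dump it to disk.
    pts = np.array(list(point_cloud2.read_points(msg, field_names=("x", "y", "z"))),
                   dtype=np.float32)
    pts.tofile(os.path.join(SAVE_DIR, "%d.bin" % msg.header.stamp.to_nsec()))

if __name__ == "__main__":
    os.makedirs(SAVE_DIR, exist_ok=True)
    rospy.init_node("pointcloud_listener")
    rospy.Subscriber("/camera/depth/color/points", PointCloud2, callback, queue_size=1)
    rospy.spin()
```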
- The SCONet pre-trained weights are in the folder below:

```
/root/AGRNav/src/perception/SCONet/network/weights
```
- If you want to use our 3D AGR model, please download the AGR model to the folder below:

```
/root/AGRNav/src/uav_simulator/Utils/odom_visualization/meshes
```

Then modify line 503 of the following file to load AGR.dae:

```
/root/AGRNav/src/uav_simulator/Utils/odom_visualization/src/odom_visualization.cpp
```
- Run the following commands:

```bash
catkin_make
source devel/setup.bash
sh src/run.sh
```
You've now started the project successfully; enjoy!
## 💽 Dataset
- SemanticKITTI
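SCONet is trained on SemanticKITTI. As a quick orientation (a minimal snippet with an illustrative path, not repo code), each LiDAR scan in the dataset is stored as a flat float32 `.bin` buffer of x, y, z, intensity values:

```python
import numpy as np

# Each SemanticKITTI scan is a flat float32 buffer of (x, y, z, intensity).
scan = np.fromfile("sequences/00/velodyne/000000.bin", dtype=np.float32).reshape(-1, 4)
print(scan.shape)  # (N, 4)
```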
## 🏆 Acknowledgement
Many thanks to these excellent open source projects: