<div align="center">
<h1>🤖 HE-Nav</h1>
<h2>A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments</h2>
<br>
<a href='https://jmwang0117.github.io/HE_Nav.pdf'><img src='https://img.shields.io/badge/arXiv-HE_Nav-green' alt='arxiv'></a>
<a href='https://jmwang0117.github.io/HE-Nav/'><img src='https://img.shields.io/badge/Project_Page-HE_Nav-green' alt='Project Page'></a>
</div>

## 🤖 AGR-Family Works
- OMEGA (RA-L 2024.12): The First AGR-Tailored Dynamic Navigation System.
- HE-Nav (RA-L 2024.09): The First AGR-Tailored ESDF-Free Navigation System.
- AGRNav (ICRA 2024.01): The First AGR-Tailored Occlusion-Aware Navigation System.
## 📰 Chinese Media Reports/Interpretations
- AMOV Lab Research Scholarship -- 2024.11: 5000 RMB
- AMOV Lab Research Scholarship -- 2024.10: 5000 RMB
## 📢 News
- [2024/07]: Experiment logs of HE-Nav and its key components (i.e., LBSCNet and AG-Planner):
| Task | Experiment Log |
|---|---|
| LBSCNet training log | link |
| HE-Nav navigation in square room | link |
| HE-Nav navigation in corridor | link |
| AGRNav navigation in square room | link |
| AGRNav navigation in corridor | link |
| TABV navigation in square room | link |
| TABV navigation in corridor | link |
- [2024/04]: The 3D model of the simulation environment can be downloaded from OneDrive.
- [2024/04]: 🔥 We released the code of HE-Nav for the simulation environment. The pre-trained model can be downloaded from OneDrive.
## 📝 Introduction
HE-Nav is a novel, efficient navigation system specialized for Aerial-Ground Robots (AGRs) in cluttered environments, optimizing both perception and path planning. It leverages a lightweight semantic scene completion network (LBSCNet) and an energy-efficient path planner (AG-Planner) to deliver high-performance, real-time navigation with notable energy savings and high planning success rates.
<p align="center">
<img src="misc/overview1.png" width="60%" height="60%"/>
</p>
<br>

@article{wang2024henav,
title={HE-Nav: A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments},
author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Huang, Dong and Zhang, Zongyuan and Duan, Tianyang and Liu, Fangming and Cui, Heming},
journal={IEEE Robotics and Automation Letters},
year={2024},
volume={9},
number={11},
pages={10383-10390},
publisher={IEEE}
}
<br>
Please kindly star ⭐️ this project if it helps you. We put great effort into developing and maintaining it.
## 🔧 Hardware List
<div align="center">

| Hardware | Link |
|---|---|
| AMOV Lab P600 UAV | link |
| AMOV Lab Allapark1-Jetson Xavier NX | link |
| Wheeltec R550 ROS Car | link |
| Intel RealSense D435i | link |
| Intel RealSense T265 | link |
| TFmini Plus | link |

</div>
❗ Since visual positioning is prone to drift along the Z-axis, we added a TFmini Plus for height measurement. GNSS-RTK positioning is also recommended for better localization accuracy.
🤖 Our customized Aerial-Ground Robot costs about RMB 70,000.
## 🛠️ Installation
The code was tested with `python=3.6.9`, `pytorch=1.10.0+cu111`, and `torchvision=0.11.2+cu111`.
Please follow the instructions here to install PyTorch and TorchVision; installing both with CUDA support is strongly recommended.
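For reference, below is a minimal sketch of one way to install the tested versions as CUDA 11.1 wheels with pip; the official PyTorch installation page is authoritative, and the wheel index URL here is an assumption that may have changed.

```bash
# Illustrative only: install the versions listed above with CUDA 11.1 support.
# Verify against the official PyTorch installation instructions for your platform.
pip install torch==1.10.0+cu111 torchvision==0.11.2+cu111 \
    -f https://download.pytorch.org/whl/torch_stable.html
```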
- Clone the repository locally:
git clone https://github.com/jmwang0117/HE-Nav.git
- We recommend using Docker to run the project, which reduces the burden of configuring the environment. You can find the Dockerfile in our project; then execute the following command:
docker build . -t skywalker_robot -f Dockerfile
- After the image build is complete, use our one-click startup script in the same directory:
bash create_container.sh
Make sure the script points to the Docker image you just built (switch the image name if needed); a sketch of what such a script typically does is shown below.
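The following is an illustrative sketch of the kind of `docker run` invocation a startup script like `create_container.sh` usually wraps; the provided script is authoritative, and the mount path, display forwarding, and GPU flags below are assumptions.

```bash
# Illustrative container start-up; adjust to match create_container.sh in this repo.
docker run -itd --name robot \
    --gpus all --net=host --privileged \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v "$(pwd)":/root/HE-Nav \
    skywalker_robot bash
```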
- Next, enter the container and clone our project with git (see the clone sketch below):
docker exec -it robot bash
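The repository URL below matches the one given above; the clone location inside the container is an assumption, so adjust it to your mounted workspace (or skip this step if the repo is already mounted).

```bash
# Inside the container: fetch the project (illustrative).
git clone https://github.com/jmwang0117/HE-Nav.git
cd HE-Nav
```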
- Then compile the project with `catkin_make`. First, install the required dependencies:
apt update && sudo apt-get install libarmadillo-dev ros-melodic-nlopt
- Run the following commands:
pip install pyyaml
pip install rospkg
pip install imageio
catkin_make
source devel/setup.bash
sh src/run_sim.sh
You have now started the project successfully. If you only want to use the path planner, you can remove the perception network from the ROS package. Enjoy!
## 💽 Dataset
- SemanticKITTI
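For completeness, the layout below is the standard SemanticKITTI release structure used by most semantic scene completion pipelines; whether LBSCNet's dataloader expects exactly this layout (and where it is configured) is an assumption to verify against the code in this repo.

```bash
# Typical SemanticKITTI layout after extracting the official archives (illustrative):
# SemanticKITTI/
# └── dataset/
#     └── sequences/
#         ├── 00/
#         │   ├── velodyne/   # *.bin LiDAR scans
#         │   ├── labels/     # *.label per-point semantic labels
#         │   └── voxels/     # *.bin / *.label / *.invalid / *.occluded SSC ground truth
#         └── ...
```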
## 🙏 Acknowledgement
Many thanks to these excellent open source projects: