<div align="center"> <h1>🤖 HE-Nav</h1> <h2>A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments</h2> <br> <a href='https://jmwang0117.github.io/HE_Nav.pdf'><img src='https://img.shields.io/badge/arXiv-HE_Nav-green' alt='arxiv'></a> <a href='https://jmwang0117.github.io/HE-Nav/'><img src='https://img.shields.io/badge/Project_Page-HE_Nav-green' alt='Project Page'></a> </div>

## 🤗 AGR-Family Works
- HE-Nav (RA-L'24): The First AGR-Tailored ESDF-Free Navigation System.
- AGRNav (ICRA'24): The First AGR-Tailored Occlusion-Aware Navigation System.
## 📢 News
- [2024/07]: Experiment logs of HE-Nav and its key components (i.e., LBSCNet and AG-Planner) are available:
| Task | Experiment Log |
| --- | --- |
| LBSCNet training log | link |
| HE-Nav navigation in square room | link |
| HE-Nav navigation in corridor | link |
| AGRNav navigation in square room | link |
| AGRNav navigation in corridor | link |
| TABV navigation in square room | link |
| TABV navigation in corridor | link |
- [2024/04]: The 3D models used in the simulation environment can be downloaded from OneDrive.
- [2024/04]: 🔥 We released the code of HE-Nav for the simulation environment. The pre-trained model can be downloaded from OneDrive.
## 📜 Introduction
HE-Nav introduces a novel, efficient navigation system specialized for Aerial-Ground Robots (AGRs) in highly obstructed settings, optimizing both perception and path planning. It leverages a lightweight semantic scene completion network (LBSCNet) and an energy-efficient path planner (AG-Planner) to deliver high-performance, real-time navigation with impressive energy savings and planning success rates.
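The perceive → complete → plan loop described above can be illustrated with a toy sketch. Everything here is a hypothetical stand-in (the `complete_scene` and `plan` functions, the grid, the inflation rule do not reflect the real LBSCNet/AG-Planner code): it only mirrors the idea of predicting occupancy in occluded regions first, then planning directly on the completed occupancy grid without building an ESDF.

```python
from collections import deque

def complete_scene(observed):
    """Toy 'scene completion': mark cells adjacent to observed obstacles
    as likely occupied (stand-in for LBSCNet's learned completion)."""
    rows, cols = len(observed), len(observed[0])
    completed = [row[:] for row in observed]
    for r in range(rows):
        for c in range(cols):
            if observed[r][c] == 1:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        completed[rr][cc] = 1  # inflate predicted occupancy
    return completed

def plan(grid, start, goal):
    """Toy ESDF-free planner: BFS directly on the completed occupancy
    grid (stand-in for AG-Planner; no distance field is built)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                prev[nxt] = cur
                queue.append(nxt)
    return None  # no collision-free path found

observed = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
path = plan(complete_scene(observed), (0, 0), (3, 4))
```

The planner detours around both the observed obstacle and the cells the completion step predicted as occupied, which is the qualitative behavior the real pipeline targets.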
<p align="center"> <img src="misc/overview1.png" width = 60% height = 60%/> </p>

If you find our work useful, please consider citing:

```
@article{wang2024he,
  title={HE-Nav: A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments},
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Huang, Dong and Zhang, Zongyuan and Duan, Tianyang and Liu, Fangming and Cui, Heming},
  journal={IEEE Robotics and Automation Letters},
  year={2024},
  publisher={IEEE}
}
```
Please kindly star ⭐️ this project if it helps you. We put great effort into developing and maintaining it 😁.
## 🛠️ Installation
The code was tested with `python=3.6.9`, `pytorch=1.10.0+cu111`, and `torchvision=0.11.2+cu111`.
Please follow the instructions here to install both PyTorch and TorchVision dependencies. Installing both PyTorch and TorchVision with CUDA support is strongly recommended.
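As a quick sanity check on the wheel you installed (this helper is not part of HE-Nav; it is a hypothetical illustration), note that PyTorch wheels built with CUDA append a `+cuXXX` suffix to the version string, while CPU-only wheels use `+cpu` or no suffix:

```python
def has_cuda_build(version_string):
    """Return True if a PyTorch-style version string indicates a CUDA
    build, e.g. '1.10.0+cu111' -> True, '1.10.0+cpu' -> False."""
    return "+cu" in version_string

ok = has_cuda_build("1.10.0+cu111")  # the version this repo was tested with

# With PyTorch installed, you can check the live environment instead:
# import torch
# print(torch.__version__, torch.cuda.is_available())
```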
- Clone the repository locally:
```
git clone https://github.com/jmwang0117/HE-Nav.git
```
- We recommend using Docker to run the project, which reduces the burden of configuring the environment. You can find the Dockerfile in our project; then execute the following command:
```
docker build . -t skywalker_robot -f Dockerfile
```
- After the build completes, use our one-click startup script in the same directory:
```
bash create_container.sh
```
Make sure the script refers to the Docker image you just built.
- Next, enter the container and use `git clone` to fetch our project:
```
docker exec -it robot bash
```
- Then compile the project with `catkin_make`. First install the required dependencies:
```
apt update && sudo apt-get install libarmadillo-dev ros-melodic-nlopt
```
- Run the following commands:
```
pip install pyyaml
pip install rospkg
pip install imageio
catkin_make
source devel/setup.bash
sh src/run_sim.sh
```
You have now launched the project successfully; enjoy!
## 💽 Dataset
- SemanticKITTI
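For a first look at the data, the sketch below reads a SemanticKITTI-style voxel `.label` file with only the standard library. The `load_voxel_labels` helper is hypothetical (not part of HE-Nav), and it assumes the voxel label format documented by SemanticKITTI: one little-endian uint16 semantic label per voxel; verify this against the official dataset documentation before relying on it.

```python
import array
import sys
import tempfile

def load_voxel_labels(path):
    """Read one uint16 label per voxel (assumed little-endian on disk)."""
    labels = array.array("H")  # 'H' = unsigned 16-bit integer
    with open(path, "rb") as f:
        labels.frombytes(f.read())
    if sys.byteorder == "big":
        labels.byteswap()  # normalize to host order on big-endian machines
    return labels

# Round-trip demo on synthetic bytes: little-endian uint16 values 0, 40, 48, 0
# (in the SemanticKITTI label map, 40 is 'road' and 48 is 'sidewalk').
with tempfile.NamedTemporaryFile(suffix=".label", delete=False) as f:
    f.write(b"\x00\x00\x28\x00\x30\x00\x00\x00")
    tmp_path = f.name

loaded = load_voxel_labels(tmp_path)
```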
## 🏆 Acknowledgement
Many thanks to these excellent open-source projects: