<div align="center"> <h1>๐Ÿค– HE-Nav</h1> <h2>A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments</h2> <br> <a href='https://jmwang0117.github.io/HE_Nav.pdf'><img src='https://img.shields.io/badge/arXiv-HE_Nav-green' alt='arxiv'></a> <a href='https://jmwang0117.github.io/HE-Nav/'><img src='https://img.shields.io/badge/Project_Page-HE_Nav-green' alt='Project Page'></a> </div>

## 🤗 AGR-Family Works

## 🎉 Chinese Media Reports/Interpretations

## 📢 News

<div align="center">
TaskExperiment Log
LBSCNet training loglink
HE-Nav navigation in square roomlink
HE-Nav navigation in corridorlink
AGRNav navigation in square roomlink
AGRNav navigation in corridorlink
TABV navigation in square roomlink
TABV navigation in corridorlink
</div> </br>

## 📜 Introduction

HE-Nav introduces a novel, efficient navigation system specialized for Aerial-Ground Robots (AGRs) in cluttered environments, optimizing both perception and path planning. It leverages a lightweight semantic scene completion network (LBSCNet) and an energy-efficient path planner (AG-Planner) to deliver high-performance, real-time navigation with impressive energy savings and planning success rates.
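For intuition only, the sketch below shows how such a two-stage pipeline could be wired together: a scene completion module densifies the partially observed occupancy grid, and the planner then searches over the completed map. All class and function names here are hypothetical placeholders, not HE-Nav's actual interfaces.

```python
# Hypothetical sketch of the two-stage perception -> planning pipeline described above.
# All class and function names are illustrative placeholders, not HE-Nav's API.
import numpy as np


class SceneCompletionStub:
    """Stands in for the lightweight semantic scene completion network (LBSCNet)."""

    def complete(self, partial_grid: np.ndarray) -> np.ndarray:
        # A real network would infer occupancy/semantics in occluded regions;
        # this stub simply returns the observed grid unchanged.
        return partial_grid


class PlannerStub:
    """Stands in for the energy-efficient aerial-ground planner (AG-Planner)."""

    def plan(self, grid: np.ndarray, start, goal):
        # A real planner would search for an energy-aware hybrid ground/air
        # trajectory; this stub returns a trivial two-waypoint path.
        return [start, goal]


def navigation_step(observed_grid, start, goal):
    completed = SceneCompletionStub().complete(observed_grid)  # perception
    return PlannerStub().plan(completed, start, goal)          # planning


if __name__ == "__main__":
    grid = np.zeros((64, 64, 8), dtype=np.uint8)  # toy occupancy grid
    print(navigation_step(grid, (0, 0, 0.0), (5, 5, 0.0)))
```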

<p align="center"> <img src="misc/overview1.png" width = 60% height = 60%/> </p> <br>
If you find HE-Nav useful, please cite our paper:

```bibtex
@article{wang2024henav,
  title={HE-Nav: A High-Performance and Efficient Navigation System for Aerial-Ground Robots in Cluttered Environments},
  author={Wang, Junming and Sun, Zekai and Guan, Xiuxian and Shen, Tianxiang and Huang, Dong and Zhang, Zongyuan and Duan, Tianyang and Liu, Fangming and Cui, Heming},
  journal={IEEE Robotics and Automation Letters},
  year={2024},
  volume={9},
  number={11},
  pages={10383-10390},
  publisher={IEEE}
}
```
<br>

Please kindly star ⭐️ this project if it helps you. We put great effort into developing and maintaining it 😁.

## 🔧 Hardware List

<div align="center">
HardwareLink
AMOV Lab P600 UAVlink
AMOV Lab Allapark1-Jetson Xavier NXlink
Wheeltec R550 ROS Carlink
Intel RealSense D435ilink
Intel RealSense T265link
TFmini Pluslink
</div>

โ— Considering that visual positioning is prone to drift in the Z-axis direction, we added TFmini Plus for height measurement. Additionally, GNSS-RTK positioning is recommended for better localization accuracy.

🤑 Our customized Aerial-Ground Robot costs about RMB 70,000.

๐Ÿ› ๏ธ Installation

The code was tested with `python=3.6.9`, `pytorch=1.10.0+cu111`, and `torchvision=0.11.2+cu111`.

Please follow the instructions here to install the PyTorch and TorchVision dependencies. Installing both with CUDA support is strongly recommended.
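Optionally, you can confirm the installed versions and CUDA availability with a short snippet like the one below (the expected versions reflect the configuration listed above):

```python
# Verify that PyTorch and TorchVision are installed with CUDA support.
import torch
import torchvision

print("torch:", torch.__version__)              # expected: 1.10.0+cu111
print("torchvision:", torchvision.__version__)  # expected: 0.11.2+cu111
print("CUDA available:", torch.cuda.is_available())
```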

1. Clone the repository locally:

   ```bash
   git clone https://github.com/jmwang0117/HE-Nav.git
   ```

2. We recommend using Docker to run the project, which reduces the burden of configuring the environment. You can find the Dockerfile in our project; then execute the following command:

   ```bash
   docker build . -t skywalker_robot -f Dockerfile
   ```

3. After the build completes, use our one-click startup script in the same directory:

   ```bash
   bash create_container.sh
   ```

   Pay attention to switching the Docker image.

4. Next, enter the container and use `git clone` to clone our project:

   ```bash
   docker exec -it robot bash
   ```

5. Install the dependencies required to build this project with `catkin_make`:

   ```bash
   apt update && sudo apt-get install libarmadillo-dev ros-melodic-nlopt
   ```

6. Run the following commands:

   ```bash
   pip install pyyaml
   pip install rospkg
   pip install imageio
   catkin_make
   source devel/setup.bash
   sh src/run_sim.sh
   ```

You have now started the project successfully. If you only want to use the path planner, you can delete the perception network from the ROS package. Enjoy!
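As an optional sanity check inside the container (a minimal snippet of ours, not part of the project), you can confirm that the Python packages installed above import correctly:

```python
# Confirm the Python dependencies installed via pip are importable.
import yaml      # provided by the pyyaml package
import rospkg
import imageio

print("pyyaml, rospkg and imageio imported successfully")
```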

## 💽 Dataset

๐Ÿ†Acknowledgement

Many thanks to these excellent open source projects: