- July 5, 2020: The implementation of the paper RAPTOR: Robust and Perception-aware Trajectory Replanning for Quadrotor Fast Flight (submitted to TRO, under review) will be released in the future.
- April 12, 2020: The implementation of the ICRA 2020 paper Robust Real-time UAV Replanning Using Guided Gradient-based Optimization and Topological Paths is available.
- Jan 30, 2020: The volumetric mapping is integrated with our planner. It takes depth image and camera pose pairs as input, does raycasting to fuse the measurements, and builds a Euclidean signed distance field (ESDF) for the planning module.
Fast-Planner is a robust and computationally efficient planning system that enables quadrotor fast flight in complex unknown environments. It contains a collection of carefully designed techniques:
- Kinodynamic path searching
- B-spline-based trajectory optimization
- Topological path searching and path-guided optimization
- Perception-aware planning strategy (to appear)
Complete videos: video1, video2, video3.
Demonstrations of the planner have been reported by IEEE Spectrum: page1, page2 (search for HKUST in the pages).
Authors: Boyu Zhou and Shaojie Shen from the HKUST Aerial Robotics Group, and Fei Gao from ZJU FAST Lab.
Key modules are contained in fast_planner and a lightweight uav_simulator is used for testing. Key components of fast_planner are:
- plan_env: The online mapping algorithms. This module takes depth images (or point clouds) and camera poses (odometry) as input, does raycasting to update a probabilistic volumetric map, and builds a Euclidean signed distance field (ESDF) for the planning system.
- path_searching: Front-end path searching algorithms. Currently it includes a kinodynamic path search that respects the dynamics of quadrotors, as well as a sampling-based topological path searching algorithm that generates multiple topologically distinctive paths capturing the structure of the 3D environment.
- bspline: An implementation of the B-spline-based trajectory representation (a minimal evaluation sketch follows this list).
- bspline_opt: Gradient-based trajectory optimization using the B-spline representation.
- active_perception: Perception-aware planning strategy, to appear.
- plan_manage: High-level modules that schedule and call the mapping and planning algorithms. Interfaces for launching the whole system, as well as the configuration files are contained here.
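As a rough illustration of the B-spline representation used by bspline (see above), the following self-contained sketch evaluates one segment of a uniform cubic B-spline from its local control points using the standard matrix formulation. It is only an illustrative example; the names are placeholders and it is not the repository's own class.

```cpp
// Illustrative sketch: evaluate one segment of a uniform cubic B-spline.
// The segment is controlled by P[i], P[i+1], P[i+2], P[i+3], with u in [0, 1].
#include <Eigen/Dense>
#include <iostream>
#include <vector>

Eigen::Vector3d evalUniformCubicBspline(const std::vector<Eigen::Vector3d>& P,
                                        int i, double u) {
  // Cubic uniform B-spline basis matrix (rows correspond to u^3, u^2, u, 1).
  Eigen::Matrix4d M;
  M << -1,  3, -3, 1,
        3, -6,  3, 0,
       -3,  0,  3, 0,
        1,  4,  1, 0;
  M /= 6.0;

  Eigen::RowVector4d U(u * u * u, u * u, u, 1.0);
  Eigen::RowVector4d w = U * M;  // blending weights of the four local control points

  return w(0) * P[i] + w(1) * P[i + 1] + w(2) * P[i + 2] + w(3) * P[i + 3];
}

int main() {
  std::vector<Eigen::Vector3d> ctrl;
  ctrl.push_back(Eigen::Vector3d(0, 0, 0));
  ctrl.push_back(Eigen::Vector3d(1, 0, 1));
  ctrl.push_back(Eigen::Vector3d(2, 1, 1));
  ctrl.push_back(Eigen::Vector3d(3, 1, 0));
  ctrl.push_back(Eigen::Vector3d(4, 0, 0));
  for (double u = 0.0; u <= 1.0; u += 0.25)  // sample the first segment
    std::cout << evalUniformCubicBspline(ctrl, 0, u).transpose() << std::endl;
  return 0;
}
```

A convenient property visible in the weights is that each curve segment lies in the convex hull of its four local control points, which is part of what makes B-splines well suited to bounding clearance and dynamic feasibility during optimization.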
- Our software is developed and tested on Ubuntu 16.04 with ROS Kinetic. Other versions may require minor modifications.
- We use NLopt to solve the non-linear optimization problem.
- The uav_simulator depends on the C++ linear algebra library Armadillo, which can be installed by:
sudo apt-get install libarmadillo-dev
- Optional: if you want to run the more realistic depth camera in uav_simulator, the CUDA Toolkit needs to be installed. Otherwise, a less realistic depth sensor model will be used (see the Use GPU Depth Rendering section below).
After the prerequisites are satisfied, you can clone this repository into your catkin workspace and run catkin_make. A new workspace is recommended:
cd ${YOUR_WORKSPACE_PATH}/src
git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
cd ../
catkin_make
The local_sensing package in uav_simulator can use either the GPU or the CPU to render the depth sensor measurements. By default, it is set to the CPU version in the CMakeLists:
set(ENABLE_CUDA false)
# set(ENABLE_CUDA true)
However, we STRONGLY recommend the GPU version, because it generates depth images much closer to those of a real depth camera. To enable GPU depth rendering, set ENABLE_CUDA to true, and also remember to change the 'arch' and 'code' flags according to your graphics card. You can look up the correct code here.
set(CUDA_NVCC_FLAGS
-gencode arch=compute_61,code=sm_61;
)
For installation of CUDA, please go to CUDA ToolKit.
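If you are not sure which values to use, a small stand-alone program (not part of this repository) can query the compute capability of your device via the CUDA runtime API; compile it with nvcc:

```cpp
// query_cc.cu -- print the compute capability of every visible CUDA device.
// Build and run:  nvcc query_cc.cu -o query_cc && ./query_cc
#include <cstdio>
#include <cuda_runtime.h>

int main() {
  int count = 0;
  cudaGetDeviceCount(&count);
  for (int i = 0; i < count; ++i) {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, i);
    // A device reporting major=6, minor=1 corresponds to
    // arch=compute_61,code=sm_61 in the CMake flags above.
    std::printf("Device %d: %s, compute capability %d.%d\n",
                i, prop.name, prop.major, prop.minor);
  }
  return 0;
}
```

For example, a device reporting 7.5 would use arch=compute_75,code=sm_75 in the flags shown above.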
First, run Rviz with our configuration:
<!-- go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage rviz.launch
Then run the quadrotor simulator and Fast-Planner. Several examples are provided below:
In this method, a kinodynamic path search finds a safe, dynamically feasible, and minimum-time initial trajectory in the discretized control space. The smoothness and clearance of the trajectory are then improved by a B-spline optimization. To test this method, run:
<!-- open a new terminal, go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage kino_replan.launch
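For intuition about the control-space search mentioned above, the sketch below expands one state with a discretized set of constant accelerations and forward-integrates a double-integrator model of the quadrotor. It is a simplified, hypothetical illustration; the actual search additionally performs collision checking against the map, heuristic cost evaluation, and pruning.

```cpp
// Illustrative motion-primitive expansion for kinodynamic path searching
// (double-integrator model: state = position + velocity).
#include <Eigen/Dense>
#include <vector>

struct State {
  Eigen::Vector3d pos;
  Eigen::Vector3d vel;
};

// Expand 'cur' by applying each sampled constant acceleration for a duration tau.
std::vector<State> expand(const State& cur, double tau, double max_acc, double max_vel) {
  std::vector<State> next_states;
  // Coarse discretization of the control space: {-a_max, 0, +a_max} per axis.
  for (int ix = -1; ix <= 1; ++ix)
    for (int iy = -1; iy <= 1; ++iy)
      for (int iz = -1; iz <= 1; ++iz) {
        Eigen::Vector3d acc(ix * max_acc, iy * max_acc, iz * max_acc);
        State nxt;
        // Closed-form integration of the double integrator over tau.
        nxt.pos = cur.pos + cur.vel * tau + 0.5 * acc * tau * tau;
        nxt.vel = cur.vel + acc * tau;
        // Discard primitives that violate the velocity limit; a real planner
        // would also discard those colliding with obstacles in the ESDF.
        if (nxt.vel.lpNorm<Eigen::Infinity>() > max_vel) continue;
        next_states.push_back(nxt);
      }
  return next_states;
}
```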
Normally, you will find the randomly generated map and the drone model in Rviz. At this time, you can trigger the planner using the 2D Nav Goal tool. When a point is clicked in Rviz, a new trajectory will be generated immediately and executed by the drone. A sample is displayed below:
If you use this algorithm for your application or research, please cite our related paper:
- Robust and Efficient Quadrotor Trajectory Generation for Fast Autonomous Flight, Boyu Zhou, Fei Gao, Luqi Wang, Chuhao Liu and Shaojie Shen, IEEE Robotics and Automation Letters (RA-L), 2019.
@article{zhou2019robust,
title={Robust and efficient quadrotor trajectory generation for fast autonomous flight},
author={Zhou, Boyu and Gao, Fei and Wang, Luqi and Liu, Chuhao and Shen, Shaojie},
journal={IEEE Robotics and Automation Letters},
volume={4},
number={4},
pages={3529--3536},
year={2019},
publisher={IEEE}
}
This method searches for multiple trajectories in distinctive topological classes. Thanks to this strategy, the solution space is explored more thoroughly, avoiding local minima and yielding better solutions. Similarly, run:
<!-- open a new terminal, go to your workspace and run: -->
source devel/setup.bash
roslaunch plan_manage topo_replan.launch
Then you will find the randomly generated map and can use the 2D Nav Goal tool to trigger the planner:
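One way to make "topologically distinctive" concrete is a visibility-based equivalence test: two paths are considered to belong to the same class if corresponding points on them can be connected by collision-free straight lines. The following is an illustrative sketch of such a test, with the collision check left as a user-supplied callback; it is not the repository's implementation.

```cpp
// Illustrative test for topological equivalence of two paths:
// sample both paths uniformly and require the connecting segments to be free.
#include <Eigen/Dense>
#include <algorithm>
#include <functional>
#include <vector>

using Vec3 = Eigen::Vector3d;
// User-supplied: returns true if the straight segment a -> b is collision-free.
using SegmentFree = std::function<bool(const Vec3&, const Vec3&)>;

// Linearly interpolate along a polyline at normalized arc parameter s in [0, 1].
Vec3 samplePath(const std::vector<Vec3>& path, double s) {
  double total = 0.0;
  for (size_t i = 1; i < path.size(); ++i) total += (path[i] - path[i - 1]).norm();
  double target = s * total, acc = 0.0;
  for (size_t i = 1; i < path.size(); ++i) {
    double seg = (path[i] - path[i - 1]).norm();
    if (acc + seg >= target)
      return path[i - 1] + (path[i] - path[i - 1]) * ((target - acc) / std::max(seg, 1e-9));
    acc += seg;
  }
  return path.back();
}

bool sameTopologyClass(const std::vector<Vec3>& p1, const std::vector<Vec3>& p2,
                       const SegmentFree& segment_free, int num_samples = 20) {
  for (int k = 0; k <= num_samples; ++k) {
    double s = static_cast<double>(k) / num_samples;
    if (!segment_free(samplePath(p1, s), samplePath(p2, s))) return false;
  }
  return true;
}
```

Keeping only one representative path per class lets the back-end optimize several guiding paths in parallel instead of being trapped by a single local minimum.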
If you use this algorithm for your application or research, please cite our related paper:
- Robust Real-time UAV Replanning Using Guided Gradient-based Optimization and Topological Paths, Boyu Zhou, Fei Gao, Jie Pan and Shaojie Shen, IEEE International Conference on Robotics and Automation (ICRA), 2020.
@article{zhou2019robust,
title={Robust Real-time UAV Replanning Using Guided Gradient-based Optimization and Topological Paths},
author={Zhou, Boyu and Gao, Fei and Pan, Jie and Shen, Shaojie},
journal={arXiv preprint arXiv:1912.12644},
year={2019}
}
To appear.
If you have successfully run the simulation and want to use Fast-Planner in your own project, please explore the files kino_replan.launch and topo_replan.launch. Important parameters that you may need to change for your use case are contained and documented there.
Note that in our configuration, the depth image size is 640x480. For higher map fusion efficiency we downsample it (in kino_algorithm.xml, skip_pixel = 2). If you use depth images with a lower resolution (like 256x144), you may disable the downsampling by setting skip_pixel = 1. Also, the depth_scaling_factor is set to 1000, which may need to be changed according to your device.
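As a rough sketch of how these two parameters typically enter the depth processing (illustrative only; the actual code lives in plan_env and may differ in detail), raw 16-bit depth values are divided by depth_scaling_factor to obtain meters, and the image is traversed with a stride of skip_pixel:

```cpp
// Illustrative depth-image traversal showing the roles of
// skip_pixel (downsampling stride) and depth_scaling_factor (raw units -> meters).
#include <cstdint>
#include <vector>

struct DepthParams {
  int skip_pixel = 2;                    // process every 2nd row/column of a 640x480 image
  double depth_scaling_factor = 1000.0;  // a raw value of 1000 corresponds to 1.0 m
};

// Returns the number of valid measurements that would be fused into the map.
int traverseDepth(const std::vector<uint16_t>& depth, int width, int height,
                  const DepthParams& p) {
  int valid = 0;
  for (int v = 0; v < height; v += p.skip_pixel) {
    for (int u = 0; u < width; u += p.skip_pixel) {
      double z = depth[v * width + u] / p.depth_scaling_factor;  // depth in meters
      if (z <= 0.0) continue;                                    // invalid pixel
      // ... back-project (u, v, z) with the camera intrinsics and raycast into the map ...
      ++valid;
    }
  }
  return valid;
}
```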
Finally, please kindly give a STAR to this repo if it helps your research or work, thanks! :)
We use NLopt for non-linear optimization.
The source code is released under GPLv3 license.
This is research code. It is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of merchantability or fitness for a particular purpose.