
SanghyunPark01/HFNet_SLAM

 
 


HFNet-SLAM

An accurate and real-time monocular SLAM system with deep features

HFNet-SLAM combines and extends the well-known ORB-SLAM3 framework with HF-Net, a unified CNN model. It fully replaces the hand-crafted ORB features and the bag-of-words (BoW) method of ORB-SLAM3 with image features from HF-Net. This results in better tracking and loop closure, boosting the accuracy of the entire system.

Better Tracking:

Better Loop Closure:

Better Runtime Performance:

HFNet-SLAM can run at 50 FPS with GPU support.

More details about the differences can be found in the HFNet-SLAM vs. ORB-SLAM3 document.

Prerequisites

HFNet-SLAM depends on OpenCV, CUDA, cuDNN, and TensorRT. The versions of these libraries should be chosen according to your device. The following configuration has been tested:

| Name | Version |
| --- | --- |
| Ubuntu | 20.04 |
| GPU | RTX 2070 Max-Q |
| NVIDIA Driver | 510.47 |
| OpenCV | 4.2.0 |
| CUDA Toolkit | 11.6.2 |
| cuDNN | 8.4.1.50 |
| TensorRT | 8.5.1 |
| TensorFlow (optional) | 1.15, 2.9 |
| ROS (optional) | Noetic |

OpenCV

We use OpenCV to manipulate images and features.

sudo apt-get install libopencv-dev

Note: While building, carefully check the compiler output and make sure the OpenCV version is correct. Only 4.2.0 has been tested; a different version may cause compilation problems.

build type: Release -- Found OpenCV: /usr (found suitable version "4.2.0", minimum required is "4.2.0")
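As a quick sanity check before building, the installed OpenCV version can be compared against the tested 4.2.0. This is a sketch; the `opencv4` pkg-config module name is an assumption and may differ on older installations:

```shell
# Compare the installed OpenCV version against the tested one (4.2.0).
# NOTE: the 'opencv4' pkg-config module name is an assumption; some
# installations register the module as 'opencv' instead.
required="4.2.0"
installed="$(pkg-config --modversion opencv4 2>/dev/null || echo "none")"
if [ "$installed" = "none" ]; then
    echo "OpenCV not found via pkg-config" >&2
elif [ "$installed" = "$required" ]; then
    echo "OpenCV $installed matches the tested version"
else
    echo "warning: OpenCV $installed differs from tested $required" >&2
fi
```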

TensorRT

We use TensorRT, CUDA, and cuDNN for model inference.

The download and install instructions of CUDA can be found at: https://developer.nvidia.com/cuda-toolkit.

The instructions of cuDNN can be found at: https://docs.nvidia.com/deeplearning/cudnn/install-guide/index.html.

The instructions of TensorRT can be found at: https://docs.nvidia.com/deeplearning/tensorrt/install-guide/index.html.

The converted TensorRT model can be downloaded from here. If you wish to convert the model yourself, more details about the process can be found in the HF-Net Model Converting document.

Building HFNet-SLAM library and examples

chmod +x build.sh
bash build.sh

TensorFlow C++ (optional)

The official HF-Net is built on TensorFlow. HFNet-SLAM also supports running the original TensorFlow HF-Net through the TensorFlow C++ API.

  1. Install TensorFlow C++: An easy method for building TensorFlow C++ API can be found at: https://github.com/FloopCZ/tensorflow_cc.

  2. Edit CMakeLists.txt and rebuild the project

# In line 19, set USE_TENSORFLOW from OFF to ON to enable TensorFlow functions. 
set(USE_TENSORFLOW ON)
# In line 132, indicate the installation path for TensorFlow.
set(Tensorflow_Root "PATH/tensorflow_cc/install")
  3. Download the converted TensorFlow model files from here.

ROS (optional)

Some examples using ROS are provided. Building these examples is optional. These have been tested with ROS Noetic under Ubuntu 20.04.

Evaluation on EuRoC dataset

Demo video: EuRoC.mp4

Evaluate a single sequence with the pure monocular configuration:

pathDataset='PATH/Datasets/EuRoC/'
pathEvaluation='./evaluation/Euroc/'
sequenceName='MH01'
./Examples/Monocular/mono_euroc ./Examples/Monocular/EuRoC.yaml "$pathEvaluation"/"$sequenceName"_MONO/ "$pathDataset"/"$sequenceName" ./Examples/Monocular/EuRoC_TimeStamps/"$sequenceName".txt
python3 ./evaluation/evaluate_ate_scale.py ./evaluation/Ground_truth/EuRoC_left_cam/"$sequenceName"_GT.txt "$pathEvaluation"/"$sequenceName"_MONO/trajectory.txt --verbose --save_path "$pathEvaluation"/"$sequenceName"_MONO/
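The single-sequence commands above can be scripted over several sequences. The following dry-run sketch only prints the commands it would execute; the sequence list and the `PATH` placeholder are assumptions to be replaced with your local dataset layout:

```shell
# Dry run: print the mono_euroc command for each sequence instead of
# executing it. Replace PATH and the sequence list for your local setup.
pathDataset='PATH/Datasets/EuRoC/'
pathEvaluation='./evaluation/Euroc/'
for sequenceName in MH01 MH02 MH03 V101 V201; do
    echo ./Examples/Monocular/mono_euroc ./Examples/Monocular/EuRoC.yaml \
        "$pathEvaluation/${sequenceName}_MONO/" \
        "$pathDataset/$sequenceName" \
        ./Examples/Monocular/EuRoC_TimeStamps/"$sequenceName".txt
done
```

Removing the `echo` turns the dry run into a real batch evaluation; the repository's `Examples/eval_euroc.sh` covers the whole dataset.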

Evaluate a single sequence with the monocular-inertial configuration:

pathDataset='PATH/Datasets/EuRoC/'
pathEvaluation='./evaluation/Euroc/'
sequenceName='MH01'
./Examples/Monocular-Inertial/mono_inertial_euroc ./Examples/Monocular-Inertial/EuRoC.yaml "$pathEvaluation"/"$sequenceName"_MONO_IN/ "$pathDataset"/"$sequenceName" ./Examples/Monocular-Inertial/EuRoC_TimeStamps/"$sequenceName".txt
python3 ./evaluation/evaluate_ate_scale.py "$pathDataset"/"$sequenceName"/mav0/state_groundtruth_estimate0/data.csv "$pathEvaluation"/"$sequenceName"_MONO_IN/trajectory.txt --verbose --save_path "$pathEvaluation"/"$sequenceName"_MONO_IN/

Evaluate the whole dataset:

bash Examples/eval_euroc.sh

Evaluation results:

Evaluation on TUM-VI dataset

Demo video: TUM-VI.mp4

Evaluate a single sequence with the monocular-inertial configuration:

Note: for the 'outdoors' sequences, use the './Examples/Monocular-Inertial/TUM-VI_far.yaml' configuration file instead.

pathDataset='PATH/Datasets/TUM-VI/'
pathEvaluation='./evaluation/TUM-VI/'
sequenceName='dataset-corridor1_512'
./Examples/Monocular-Inertial/mono_inertial_tum_vi ./Examples/Monocular-Inertial/TUM-VI.yaml "$pathEvaluation"/"$sequenceName"/ "$pathDataset"/"$sequenceName"_16/mav0/cam0/data ./Examples/Monocular-Inertial/TUM_TimeStamps/"$sequenceName".txt ./Examples/Monocular-Inertial/TUM_IMU/"$sequenceName".txt
python3 ./evaluation/evaluate_ate_scale.py "$pathDataset"/"$sequenceName"_16/mav0/mocap0/data.csv "$pathEvaluation"/"$sequenceName"/trajectory.txt --verbose --save_path "$pathEvaluation"/"$sequenceName"/

Evaluate the whole dataset:

bash Examples/eval_tum_vi.sh

Evaluation results:

Evaluation on TUM-RGBD dataset

Evaluate a single sequence with the RGB-D configuration:

pathDataset='PATH/Datasets/TUM-RGBD/'
pathEvaluation='./evaluation/TUM-RGBD/'
sequenceName='fr1_desk'
echo "Launching $sequenceName with RGB-D sensor"
./Examples/RGB-D/rgbd_tum ./Examples/RGB-D/TUM1.yaml "$pathEvaluation"/"$sequenceName"/ "$pathDataset"/"$sequenceName"/ ./Examples/RGB-D/associations/"$sequenceName".txt
python3 ./evaluation/evaluate_ate_scale.py "$pathDataset"/"$sequenceName"/groundtruth.txt "$pathEvaluation"/"$sequenceName"/trajectory.txt --verbose --save_path "$pathEvaluation"/"$sequenceName"/

Evaluate the whole dataset:

bash Examples/eval_tum_rgbd.sh

Evaluation with ROS

Tested with ROS Noetic under Ubuntu 20.04.

  1. Add the path including Examples/ROS/HFNet_SLAM to the ROS_PACKAGE_PATH environment variable.
export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH/HFNet_SLAM/Examples/ROS
  2. Execute the build_ros.sh script:
chmod +x build_ros.sh
./build_ros.sh
  3. Launch the provided example nodes on public benchmarks:
# Monocular configuration in EuRoC dataset
roslaunch HFNet_SLAM mono_euroc.launch

# Monocular Inertial configuration in EuRoC dataset
roslaunch HFNet_SLAM mono_inertial_euroc.launch

# RGB-D configuration in TUM-RGBD dataset
roslaunch HFNet_SLAM rgbd_tum.launch
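To confirm that step 1 took effect in the current shell before launching, the exported path can be checked. A minimal sketch; `PATH/HFNet_SLAM` is the same placeholder used above and must be substituted with your actual checkout location:

```shell
# Verify that the HFNet_SLAM ROS package directory is on ROS_PACKAGE_PATH.
# 'PATH/HFNet_SLAM' is a placeholder; substitute your checkout location.
pkg_dir="PATH/HFNet_SLAM/Examples/ROS"
export ROS_PACKAGE_PATH="${ROS_PACKAGE_PATH}:$pkg_dir"
# Wrap both sides in ':' so the match only succeeds on a whole path entry.
case ":$ROS_PACKAGE_PATH:" in
    *":$pkg_dir:"*) echo "ROS package path set" ;;
    *) echo "ROS package path missing" >&2 ;;
esac
```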
