Visual SLAM methods often struggle in dynamic scenes, where moving objects corrupt their core modules. To mitigate this, dynamic V-SLAM methods typically apply semantic cues, geometric constraints, or optical flow to exclude dynamic elements. However, their heavy reliance on precise segmentation, the a-priori selection of movable classes, and their inability to recognize unknown moving objects degrade their performance.
To address this, we introduce DynaPix, a novel visual SLAM system based on per-pixel motion probability estimation. Our method consists of a semantic-free motion estimation module and an improved pose optimization process. The motion probability is computed through static background differencing on both the images and the optical flows obtained from splatted frames. DynaPix fully integrates these motion probabilities into the tracking and optimization modules of the ORB-SLAM2 framework.
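For intuition only, here is a minimal sketch of the per-pixel idea, not the actual implementation: a pixel's motion probability grows with how strongly the dynamic frame and its optical flow deviate from their static-background counterparts. The function name, the min-max normalization, and the fusion weight `alpha` are all assumptions.

```python
import numpy as np

def motion_probability(rgb, rgb_bg, flow, flow_bg, alpha=0.5):
    """Illustrative sketch: fuse image-space and flow-space differences
    against the static background into a per-pixel motion probability.
    rgb, rgb_bg: (H, W, 3) images; flow, flow_bg: (H, W, 2) flow fields."""
    # Appearance difference between the dynamic frame and its static background
    diff_img = np.linalg.norm(rgb.astype(np.float32) - rgb_bg.astype(np.float32), axis=2)
    # Flow difference between the observed flow and the background (splatted) flow
    diff_flow = np.linalg.norm(flow.astype(np.float32) - flow_bg.astype(np.float32), axis=2)
    # Min-max normalize each cue to [0, 1] (assumed scaling, not the paper's)
    p_img = diff_img / (diff_img.max() + 1e-6)
    p_flow = diff_flow / (diff_flow.max() + 1e-6)
    # Convex combination of both cues; alpha is a hypothetical weight
    return alpha * p_img + (1.0 - alpha) * p_flow
```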
All reported experiments are detailed here, with additional results from other methods included for reference.
We provide two sequences here for direct testing: the FH sequence from the GRADE dataset and the halfsphere sequence from the TUM RGB-D dataset. Both follow this structure:
    Sequence
    ├── rgb/               # images in dynamic scenes
    ├── background/        # images with (estimated) static background
    ├── prob/              # estimated motion probability for SLAM
    ├── depth/             # depth maps
    ├── (depth_bg)/        # depth maps for the static background
    ├── groundtruth.txt    # camera pose ground truth
    └── association.txt    # correspondence between depth and rgb images
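The association file follows the TUM RGB-D tools' convention, pairing each RGB image with its closest depth image by timestamp. A minimal reader could look like the sketch below; the column order `rgb_ts rgb_path depth_ts depth_path` is an assumption, so verify it against the actual files.

```python
def read_associations(path):
    """Parse a TUM-style association file; column order is assumed to be
    'rgb_ts rgb_path depth_ts depth_path' -- check the shipped files."""
    pairs = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip comments and blank lines
            rgb_ts, rgb_path, depth_ts, depth_path = line.split()
            pairs.append((float(rgb_ts), rgb_path, float(depth_ts), depth_path))
    return pairs
```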
- Download the complete datasets: GRADE [Sequence] [Motion Prob] / TUM RGB-D [Sequence] [Inpainted Background/Motion Prob]
- Download the complete experiment results (estimated trajectory files) here
- Generate motion probabilities on the GRADE dataset:

      cd motion_estimation
      python3 motion_grade.py --output
- Generate motion probabilities on the TUM RGB-D dataset:

      cd motion_estimation
      python3 motion_tum.py --output
Note: Please specify the input and output directories in `configs/GRADE.yaml` and `configs/TUM.yaml` before generation; a sketch of such a config follows.
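The key names below are hypothetical; consult the shipped `configs/GRADE.yaml` for the actual layout.

```yaml
# Hypothetical keys -- check configs/GRADE.yaml for the real ones
input_dir: /path/to/GRADE/FH          # sequence root containing rgb/, background/, depth/
output_dir: /path/to/GRADE/FH/prob    # where the motion probability maps are written
```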
DynaPix is derived from ORB-SLAM2, while DynaPix-D is derived from DynaSLAM. In this version, we implement only RGB-D tracking for testing. Please follow the instructions in each folder for more details.
- Evaluate the absolute trajectory error (ATE) and tracking rate (TR):

      mkdir -p RESULT/DynaPix
      cp ${ESTIMATED_FILE} RESULT/DynaPix/
      python3 eval/eval.py ${SEQ}/groundtruth.txt ${SEQ}/association.txt RESULT/
Note: The evaluation scripts are derived from the TUM RGB-D benchmark evaluation tools, extended to test and evaluate multiple trajectories.
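For reference, ATE is the root-mean-square of the translational differences after rigidly aligning the estimated trajectory to the ground truth, as in the TUM tools. A self-contained sketch, assuming both trajectories are already time-associated (N, 3) position arrays:

```python
import numpy as np

def ate_rmse(est, gt):
    """Absolute trajectory error: RMSE of positions after a rigid
    (rotation + translation) alignment via the Kabsch/Horn method.
    est, gt: (N, 3) arrays of time-associated camera positions."""
    est_mean, gt_mean = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance between the centered trajectories
    H = (est - est_mean).T @ (gt - gt_mean)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps the alignment a proper rotation
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = (U @ S @ Vt).T
    t = gt_mean - R @ est_mean
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))
```

Tracking rate, roughly, reports the fraction of frames for which the system produces a pose at all; see `eval/eval.py` for the exact definitions used here.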
@inproceedings{xu2023dynapix,
  title={DynaPix SLAM: A Pixel-Based Dynamic SLAM Approach},
  author={Chenghao Xu and Elia Bonetto and Aamir Ahmad},
  booktitle={DAGM German Conference on Pattern Recognition 2024},
  year={2024},
  note={to appear},
  url={https://arxiv.org/abs/2309.09879}
}