Combines the output of YOLOv3 with gmapping to produce a filtered map during SLAM. This was my Master's thesis, part of my specialisation in Robotics and AI as an Industrial Engineer.

yolo_object_discrimination_slam

This is an open-source ROS package. It contains:

  • laser_darknet_filter: the engine of the filtering algorithm

  • struct_gmapping: a modified version of gmapping (http://wiki.ros.org/gmapping) that allows laser_darknet_filter to work properly

Dependencies: darknet_ros (https://github.com/leggedrobotics/darknet_ros)

Technical note: in YoloObjectDetector.cpp, at line 588, replace "if (num > 0 && num <= 100) {" with "if (num >= 0 && num <= 100) {".

Video: https://www.youtube.com/watch?v=92vfkuiwe_Y

This repository was made as part of my MSc thesis.

Implemented in ROS for a higher-fidelity approach to autonomous mobile robot exploration, using LIDAR sensors and a depth camera integrated with the MiR100 robot. In addition, by using a state-of-the-art deep learning object detector such as YOLOv3, the system filters undesired objects out of the navigation map.

Please do not forget to cite me if you use any part of my work. The documentation is still being completed.
