Project Name
- Development of multimodal sensor-based intelligent systems for outdoor surveillance robots

Project Total Period: 2017.04.01. ~ 2021.12.31.

Institutions
- LG Electronics
- ETRI
- KIRO
- SNU

Perception and Intelligence Laboratory (PIL)
- Professor
- Ph.D. Candidate

Machine Intelligence and Pattern Recognition Laboratory (MIPAL)
- Professor
- Ph.D. Candidate

Development System Information
- Developed on Ubuntu 16.04
- GPU: GeForce GTX 1070 ( also tested on GTX 1080Ti )
- Note that the "master-final" branch is the main branch!
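A quick way to confirm that a machine matches this setup is shown below ( standard Ubuntu / NVIDIA commands; the exact output will differ per machine ):
>> lsb_release -a     ( should report Ubuntu 16.04 )
>> nvidia-smi         ( lists the installed GPU and driver )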

Dependencies ( use Anaconda Environment )
- python 2.7
- PyTorch 1.1.0
- torchvision 0.3.0
- CUDA 10.0
- cuDNN 7.5.0
- ROS-kinetic ( install on the Ubuntu main system )
  - need "rospkg" module, install via pip
    ( the rospkg module is needed inside the Anaconda environment; do not install it via pip on the system python )
  - for the "pycharm" IDE, refer to THIS and also refer to THIS
  - import-(1): /opt/ros/<distro>/lib/python2.7/dist-packages
  - import-(2): /devel/lib/python2.7/dist-packages
  - [Note] : "catkin_make" is necessary
    ( check out step-02 in "How to use ROS-embedded current algorithm?" to build the custom ROS messages )
- opencv-python ( install via pip )
- empy ( pip )
- yaml
- numpy, numba, scipy, FilterPy, sklearn, yacs
- sip 4.18 ( for PyKDL support, version number is important! )
- motmetrics ( pip, for MOT Performance Evaluation )
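As a rough sketch of how such an environment could be created ( the environment name "snu_usr" is only an example; the PyTorch line follows the official install command for PyTorch 1.1.0 with CUDA 10.0 and may need adjusting, and sip 4.18 may have to be installed separately ):
>> conda create -n snu_usr python=2.7
>> source activate snu_usr
>> conda install pytorch==1.1.0 torchvision==0.3.0 cudatoolkit=10.0 -c pytorch
>> pip install rospkg opencv-python empy pyyaml numpy numba scipy filterpy scikit-learn yacs motmetrics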

How to use ROS-embedded current algorithm?

Install ROS-kinetic and set up the basics
- It is recommended to install the "ros-kinetic-desktop-full" package
>> sudo apt install ros-kinetic-desktop-full
- For convenience, add "source /opt/ros/kinetic/setup.bash" to the "~/.bashrc" file
>> echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
>> source ~/.bashrc
- Install other dependencies
>> sudo apt install python-rosdep python-rosinstall python-rosinstall-generator python-wstool build-essential
- Initialize rosdep
>> sudo apt install python-rosdep
>> sudo rosdep init
>> rosdep update
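To quickly verify the ROS installation ( standard ROS commands ):
>> rosversion -d      ( should print "kinetic" )
>> roscore            ( starts the ROS master; stop it with Ctrl+C )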

Custom ROS Messages and How to Build the Messages

ROS message types for publishing inference results through roscore:
- Defines the annotation of a single distinguishable object.
- Defines multiple annotation objects.
- Defines a bounding box object of format: XYWH.
- Defines the message object for MOT evaluation ( single frame ).
  This message stores "Annotations" and "Tracks" for evaluation.
  Note that this message is declared at a single frame index.
  To use it across frame indices, use the "Evaluators" message.
- Defines the message object for MOT evaluation ( multiple frames ).
- Defines the message object of a single trajectory, with a specific ID.
- Defines the message object of multiple trajectories.
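As an illustration only, a custom bounding-box message of the XYWH format could be laid out as follows. This is a hypothetical layout, not the actual definition shipped with osr_msgs; check the msg/ folder of the package for the real field names and types.

# BoundingBox.msg ( hypothetical layout )
float64 x        # x coordinate of the box
float64 y        # y coordinate of the box
float64 width    # box width in pixels
float64 height   # box height in pixels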

How to build osr_msgs
- In the master directory ( i.e. /path/to/SNU_USR_dev ), run the following:
>> catkin_make
- If successful, additionally import the following path into the python interpreter:
  /devel/lib/python2.7/dist-packages
- [Important] Before running the code, run the following in the terminal:
>> source /path/to/SNU_USR_dev/devel/setup.bash
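Put together, the build-and-import sequence might look like the following sketch ( the message name "Tracks" is one of the names mentioned above and is assumed here; substitute whatever is actually defined under the osr_msgs package ):
>> cd /path/to/SNU_USR_dev
>> catkin_make
>> source devel/setup.bash
>> python -c "from osr_msgs.msg import Tracks"    ( no error means the messages were built and are importable )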

Run SNU USR Integrated Algorithm
( for example, say <example>.py is the execution code )

- On a terminal, run
>> roscore
- Publish rostopics to the roscore
  - To successfully execute the SNU Integrated Module, the published messages should at least include the following topics
    ( a minimal sketch for checking them is given after this list ):
      /osr/image_color
      /osr/image_color_camerainfo
      /osr/image_thermal
      /osr/image_thermal_camerainfo
      /osr/lidar_pointcloud
  - For more thorough information, explore the configuration files in [Link]
  - There are several ways to publish ROS messages to the roscore. Here, we introduce the following ways.
    (1) Using a ROS Bag File
      - [1] Unpack the *.bag file
        - Use the "rosbag" command (CUI-based) [options]
          >> rosbag play <file_name>.bag
        - Use the "rqt_bag" command (GUI-based)
          >> rqt_bag <file_name>.bag
        - Use the "rqt_image_view" command (for image-like rostopics only)
          >> rqt_image_view <file_name>.bag
      - [2] Publish all the "rostopics-of-interest"
        (i.e.) "/osr/image_color", "/osr/lidar_pointcloud", ...
      - [3] Play the bag file
      - [4] Execute the integrated module
    (2) By Subscribing to Messages via ROS
      - [1] From another implemented ROS source, publish the appropriate messages.
      - [2] Execute the integrated module
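The following is a minimal sketch, assuming the color image arrives as sensor_msgs/Image and the LiDAR data as sensor_msgs/PointCloud2, for checking that the required topics are actually being published ( it is not part of the repository ):

#!/usr/bin/env python
# Minimal topic check ( sketch only ): prints a log line whenever the
# color image or LiDAR pointcloud arrives on the /osr topics listed above.
import rospy
from sensor_msgs.msg import Image, PointCloud2

def on_color(msg):
    rospy.loginfo("color image received: %dx%d", msg.width, msg.height)

def on_lidar(msg):
    rospy.loginfo("lidar pointcloud received: %d points", msg.width * msg.height)

if __name__ == "__main__":
    rospy.init_node("osr_topic_check")
    rospy.Subscriber("/osr/image_color", Image, on_color)
    rospy.Subscriber("/osr/lidar_pointcloud", PointCloud2, on_lidar)
    rospy.spin()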

Run the execution file
- For command-line:
>> rosrun snu_module path/to/scripts4/<example>.py
- For the PyCharm IDE
  - Run it the same way as ordinary PyCharm projects ( debugging is also possible )
- [ Note ]
  - For "rosbag" mode, execute [ ros__run_snu_module.py ]
  - For non-"rosbag" mode, execute [ run_osr_snu_module.py ]
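As a recap, a typical "rosbag"-mode run uses three terminals ( paths as above ):
( terminal 1 ) >> roscore
( terminal 2 ) >> rosbag play <file_name>.bag
( terminal 3 ) >> source /path/to/SNU_USR_dev/devel/setup.bash
               >> rosrun snu_module path/to/scripts4/ros__run_snu_module.py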

Other Important Source Codes or Parts
- module libraries
  - Currently, v4.5 is the latest version and the only public version.
  - We do have previous versions, but currently have no intention to make them public
    ( however, they might be found manually in the git history ).
- module_bridge.py
  - The next most important source code script after the execution file.
  - Defines the class that incorporates each individual module.
- models
  - All models are included ( no need to download them from external drives ).
- LiDAR-to-Camera Registration
  - Parameters for the unmanned robots are given [Here].
  - A class to simulate the ROS TF2 message is given in the same registration folder.
- Visualization Code
  - Defines the visualization codes.
- ROS Base Codes
  - base.py
    - Provides the multimodal sensor subscription functionality.
  - wrapper.py
    - Provides the publishers.
  - sync_subscriber.py
    - Provides the synchronization functionality for multimodal sensor subscription
      ( a rough sketch of this kind of synchronization is given below ).
  - Look through the codes for detailed implementations.
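For reference, the sketch below shows the kind of multimodal synchronization that sync_subscriber.py provides, written with the standard message_filters API; the actual implementation in this repository may differ.

#!/usr/bin/env python
# Sketch: approximately time-synchronize the color and thermal image topics.
import rospy
import message_filters
from sensor_msgs.msg import Image

def on_synced(color_msg, thermal_msg):
    # called only when a color/thermal pair falls within the "slop" time window
    rospy.loginfo("synced pair: color %s / thermal %s",
                  color_msg.header.stamp, thermal_msg.header.stamp)

if __name__ == "__main__":
    rospy.init_node("sync_sketch")
    color_sub = message_filters.Subscriber("/osr/image_color", Image)
    thermal_sub = message_filters.Subscriber("/osr/image_thermal", Image)
    sync = message_filters.ApproximateTimeSynchronizer(
        [color_sub, thermal_sub], queue_size=10, slop=0.1)
    sync.registerCallback(on_synced)
    rospy.spin()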

- rospkg.common.ResourceNotFound (tf2_ros) [on PyCharm IDE]
  - Run PyCharm from the terminal with the Anaconda environment activated
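A minimal sketch, assuming the Anaconda environment is named "snu_usr" and PyCharm is installed under /path/to/pycharm ( adjust both to your setup ):
>> source activate snu_usr
>> /path/to/pycharm/bin/pycharm.sh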

Detection Results
- Daytime
  - RGB
  - Thermal
- Nighttime
  - RGB
  - Thermal

MOT Results
- Daytime, RGB
- Nighttime, Thermal

Action Classification Results
- Daytime, RGB
- Nighttime, Thermal