Foosball Robot

This is a vision-controlled foosball table. A camera above the table processes the image and tracks the ball's position. An Arduino receives that information via ROS communication and controls six stepper motors simultaneously: a linear and a rotational axis for each of the three bars.

cameracallibrator.png

Demo

Demo video is here. → https://youtu.be/knL-YN4Qc_c

Environment

Hardware

Circuit diagram

Software

  • Ubuntu 18.04
  • ROS Melodic Morenia

ROS packages

System diagram

cameracallibrator.png

1. Camera Calibration

ROS Wiki

First, download the checkerboard pattern (check_7x6_27mm.pdf) and run the calibration program. Move the checkerboard around in the camera view.

$ ls /dev/ | grep video # check the device name

# run calibration programs (in different terminals)
$ roscore
$ rosrun uvc_camera uvc_camera_node _device:=/dev/video1
$ rosrun camera_calibration cameracalibrator.py --size 7x6 --square 0.027 image:=/image_raw

cameracallibrator.png

Then extract the tar.gz file and place the YAML file in the appropriate directory.

# make camera.yaml
$ mkdir camera_calibration; cd camera_calibration
$ mv /tmp/calibrationdata.tar.gz ./
$ tar -xvzf calibrationdata.tar.gz
$ mv ost.yaml camera.yaml
$ vi camera.yaml 
# rewrite "camera_name: narrow_stereo" -> "camera_name: camera", 
#         "camera_model: plumb_bob" -> "distortion_model: plumb_bob"
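To make sure the edits took effect, you can run a quick sanity check on the file. This is a minimal sketch, not part of the repository; it assumes the standard camera_info YAML layout produced by cameracalibrator.py and that you run it from the directory containing camera_calibration/.

# check_camera_yaml.py (hypothetical helper, not included in this repository)
import yaml

with open("camera_calibration/camera.yaml") as f:
    calib = yaml.safe_load(f)

# the two keys rewritten above should now be present
assert calib["camera_name"] == "camera"
assert calib["distortion_model"] == "plumb_bob"
print("calibrated image size: {}x{}".format(calib["image_width"], calib["image_height"]))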

Finally, check the result.

$ roscore
$ rosrun uvc_camera uvc_camera_node _device:=/dev/video1 _camera_info_url:=file://$PWD/camera_calibration/camera.yaml
$ rosrun image_proc image_proc
$ rqt_image_view

Alternatively, run this launch file (you still need to modify the absolute path to camera.yaml on line 4).

$ roslaunch launch/calibration_demo.launch # use "calibration_demo_usb.launch" instead for usb_cam package 

image_raw.png image_rect_color.png

2. Ball Tracking

1. Check the pixel coordinates of the field.

After launching the camera nodes, open rqt_image_view and select /image_rect_color. Then click the top-left and bottom-right corners of the field and check their pixel x, y values by echoing /image_rect_color_mouse_left.

$ roslaunch launch/track_ball.launch # use "track_ball_usb.launch" instead for usb_cam package
$ rqt_image_view # select /image_rect_color
$ rostopic echo /image_rect_color_mouse_left

Fill in these values in scripts/hsv.py and scripts/track_ball_demo.py.

field_area = [[33, 50], [605, 405]] # [[top-left x,y], [bottom-right x, y]]
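These corner values are typically used to crop the rectified image to the playing field and to express the ball position relative to it. Below is a minimal sketch of such helpers (hypothetical, not copied from scripts/hsv.py or scripts/track_ball_demo.py):

def crop_to_field(image, field_area):
    # keep only the playing field so the mask filter ignores the surroundings
    (x0, y0), (x1, y1) = field_area
    return image[y0:y1, x0:x1]

def to_field_coords(px, py, field_area):
    # normalize a pixel position to the 0..1 range inside the field rectangle
    (x0, y0), (x1, y1) = field_area
    return (px - x0) / float(x1 - x0), (py - y0) / float(y1 - y0)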

2. Optimize HSV values

In order to track only the ball, you have to optimize the HSV range of the mask filter.
To do so, save a camera-view image using the save button of rqt_image_view, and run scripts/hsv.py on the image.

$ python scripts/hsv.py [image file name]

Click the ball in the "hsv" image window, and check the HSV value printed to the terminal.

hsv_optimization.png


Lastly, to check whether the ball is correctly tracked, fill in the hsv_lower and hsv_upper values obtained above and run scripts/track_ball_demo.py. Depending on the result, you may need to adjust the HSV values and the filter sizes used for image processing.

(track_ball_demo.py)

# obtain from hsv.py
hsv_lower = np.array([20, -10, 100])
hsv_upper = np.array([50, 64, 300])
median_size = 7      # filter size for median filter
morpho_size = 13     # filter size for morphology processing
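For reference, a typical OpenCV pipeline built around these four parameters (hsv_lower, hsv_upper, median_size, morpho_size) looks roughly like the sketch below; the actual code in scripts/track_ball_demo.py may differ in its details.

import cv2
import numpy as np

def find_ball(bgr_image):
    # keep only pixels inside the tuned HSV range
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_lower, hsv_upper)
    # remove salt-and-pepper noise, then close small holes in the blob
    mask = cv2.medianBlur(mask, median_size)
    kernel = np.ones((morpho_size, morpho_size), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    # take the largest remaining contour as the ball ([-2] works for both OpenCV 3 and 4)
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    (x, y), radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return (x, y), radius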

In a terminal, run the following commands.

$ roslaunch launch/track_ball.launch
$ python scripts/track_ball_demo.py

hsv_optimization.png

3. Publishing ROS topic

Finally, publish the ball position as a ROS topic, which will be subscribed to by the Arduino. scripts/ball_position_publisher.py performs the entire image-processing pipeline of the previous sections and publishes the ROS topic /ball_position (opencv_apps/Circle). This script is used as the final form of this section.
Fill in the parameters on lines 11~18 and confirm that the ball position is correctly published.

$ roslaunch launch/track_ball.launch # use "track_ball_usb.launch" instead for usb_cam package
$ python scripts/ball_position_publisher.py
$ rostopic echo /ball_position
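For orientation, a stripped-down publisher might look like the sketch below (a minimal example with a dummy position; the real scripts/ball_position_publisher.py fills the message from the HSV tracking pipeline described above):

#!/usr/bin/env python
import rospy
from opencv_apps.msg import Circle  # center (Point2D) and radius (float64)

rospy.init_node("ball_position_publisher")
pub = rospy.Publisher("/ball_position", Circle, queue_size=1)
rate = rospy.Rate(30)

while not rospy.is_shutdown():
    msg = Circle()
    # dummy values; in the real script these come from the ball-tracking result
    msg.center.x, msg.center.y = 320.0, 240.0
    msg.radius = 10.0
    pub.publish(msg)
    rate.sleep()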

3. Let's play!

Now you've prepared all the pieces. Kickoff is almost here!

After uploading "sketchbook/L6470_SPI_stepMoter_sketch/L6470_SPI_stepMoter_sketch.ino" to the Arduino Mega, launch the uvc_camera node and the rosserial node with the following command.

$ roslaunch launch/stepper_camera_driver.launch # use "stepper_camera_driver_usb.launch" instead for usb_cam package

Once the nodes have been successfully launched, the motor positions will be initialized like this.

hsv_optimization.png

The last thing to do is launch the image-processing node. Once this program is running, the whole ROS pipeline is established and the robot starts moving.

$ python scripts/ball_position_publisher.py

Enjoy the match!

hsv_optimization.png

Appendix

Appendix A: Installing rosserial

First, you need to install the ROS packages.

$ sudo apt install ros-melodic-rosserial
$ sudo apt install ros-melodic-rosserial-arduino

Second, go to the libraries directory under the installed Arduino folder, and run the following setup script, which is provided by the rosserial package.

$ cd ~/arduino-1.8.13/libraries
$ rosrun rosserial_arduino make_libraries.py .
$ ls | grep ros_lib  # check if it's successfully installed
ros_lib
