Demo_en

段双达 edited this page Apr 15, 2019 · 5 revisions

How_to_run

  1. Setup the Baxter Robot

Please follow the instructions below to ensure you properly set up the Baxter robot.

  2. Online Update of the Robot URDF

Due to the installation of an FT sensor and a pair of tactile sensors on the right hand, we need to update Baxter's standard URDF description file to match the current physical configuration, so that the robot dynamics are not affected. This update must be done online, every time the robot is turned on, so run the command below each time you run these experiments.

rosrun birl_baxter_online_urdf_update update_urdf.py
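The contents of update_urdf.py are not reproduced here; conceptually, an online URDF update edits the robot description XML to add the extra hardware before handing it back to the robot. A minimal offline sketch of that kind of XML edit, with hypothetical link/joint names and a made-up mass value, might look like:

```python
import xml.etree.ElementTree as ET

# Tiny stand-in URDF for illustration; the real Baxter URDF is much larger.
urdf = """
<robot name="baxter">
  <link name="right_hand"/>
</robot>
"""

root = ET.fromstring(urdf)

# Append a link describing the FT sensor's mass so the dynamics model
# matches the physical robot (names and values are made up for this sketch).
ft_link = ET.SubElement(root, "link", name="right_ft_sensor")
inertial = ET.SubElement(ft_link, "inertial")
ET.SubElement(inertial, "mass", value="0.3")

# Rigidly attach the sensor link to the hand with a fixed joint.
joint = ET.SubElement(root, "joint", name="right_hand_to_ft", type="fixed")
ET.SubElement(joint, "parent", link="right_hand")
ET.SubElement(joint, "child", link="right_ft_sensor")

updated = ET.tostring(root, encoding="unicode")
print("right_ft_sensor" in updated)  # True
```

The real script additionally has to fetch the current description from the running robot and push the edited version back, which is why it must run after every power-on.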
  3. Launch Camera. Two methods are provided:

Scheme 1: Launch the Xtion camera mounted on Baxter's base

roslaunch openni2_launch openni2.launch 

Scheme 2: Launch Baxter's built-in left-hand camera

rosrun birl_kitting_experiment setup_baxter_left_hand_camera.py

Note that to help the robot's left-hand camera detect objects more reliably, we command the left arm to move to a position optimized for our experiment. These are hard-coded values.

rosrun birl_kitting_experiment set_left_arm2static_pose.py

In the current experimental setup, we use alvar markers attached to each plane of an object's surface to obtain the object's ID and pose. Because of vision error, and because the z-axis published by the marker points up, we apply a hard-coded transformation that minimizes the object pose error produced by image processing and sets the z-axis to point down.

roslaunch birl_kitting_experiment alvar_marker_demo.launch
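The z-axis flip mentioned above amounts to composing the marker's orientation with a fixed 180-degree rotation about one of its in-plane axes. A minimal sketch of that correction with plain quaternion arithmetic (the identity marker pose is just for illustration; the actual correction in the launch file may also include translation offsets):

```python
def qmult(q1, q2):
    # Hamilton product; quaternions stored as (x, y, z, w), the ROS convention.
    x1, y1, z1, w1 = q1
    x2, y2, z2, w2 = q2
    return (
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
    )

def rotate_vec(q, v):
    # Rotate vector v by quaternion q: q * (v, 0) * q_conjugate.
    x, y, z, w = q
    qv = (v[0], v[1], v[2], 0.0)
    qc = (-x, -y, -z, w)
    return qmult(qmult(q, qv), qc)[:3]

# A 180-degree rotation about the marker's local x-axis flips its z-axis
# from pointing up to pointing down.
q_flip = (1.0, 0.0, 0.0, 0.0)

q_marker = (0.0, 0.0, 0.0, 1.0)    # identity marker orientation, for the sketch
q_fixed = qmult(q_marker, q_flip)  # post-multiply: rotate in the marker's frame

print(rotate_vec(q_fixed, (0.0, 0.0, 1.0)))  # (0.0, 0.0, -1.0): z now points down
```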

TODO: Computer optimization: depending on your processing power, running all of these processes on a single computer may be too demanding, and you might need to spread them across multiple workstations.

TODO: Describe which processes to run on which computers. TODO: Describe how to connect different computers to the same ROS master.

  4. Sensor setup
  • Launch FT Sensors
rosrun robotiq_force_torque_sensor rq_sensor

Note: if the sensor port cannot be found (e.g. an error like cannot find /tty/~~), you might need to unplug and replug the sensor cables going into the computer; sometimes you will have to repeat this several times.

Also, if the sensor port is shown to be busy, try relaxing its permissions (the serial device typically appears under /dev/), for example: sudo chmod 776 /dev/ttyUSB*
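Once rq_sensor is running, it streams 6-axis wrench readings (three forces, three torques). As a purely hypothetical sketch of handling such a reading, this parses a comma-separated wrench line into named fields; the actual message format published by rq_sensor may differ:

```python
from collections import namedtuple

# Forces in newtons, torques in newton-meters (hypothetical line format).
Wrench = namedtuple("Wrench", "fx fy fz mx my mz")

def parse_wrench(line):
    """Parse a 'Fx, Fy, Fz, Mx, My, Mz' string into a Wrench."""
    values = [float(tok) for tok in line.split(",")]
    if len(values) != 6:
        raise ValueError("expected 6 comma-separated values, got %d" % len(values))
    return Wrench(*values)

w = parse_wrench("1.2, -0.4, 9.8, 0.01, 0.02, -0.03")
print(w.fz)  # 9.8
```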

  • Launch Tactile Sensor
rosrun tactilesensors4 PollData4

5. Launch MoveIt

  • Due to the installation of sensors on the end-effector, MoveIt also needs to be updated; otherwise, the robot planner and the IK solver cannot find a valid solution.
  • In order to communicate with the real robot, launch the JointActionServer shown below to synchronize data between MoveIt and Baxter.
rosrun baxter_interface joint_trajectory_action_server.py
  • Due to MoveIt's internal mechanics, it needs the topic robot/joint_state to mirror the real robot's behavior. However, the corresponding topic from Baxter has an additional 's': robot/joint_states. To match them, we wrote a script that maps robot/joint_states to robot/joint_state; this script has been added to the MoveIt launch file below:
roslaunch birl_moveit_r_ft_config birl_baxter_gripper.launch
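The mapping script itself is not shown in this page; conceptually it is just a relay node that subscribes to one topic name and republishes every message unchanged under the other. A minimal sketch of that pattern, using a toy in-process pub/sub in place of the ROS graph (no ROS required to run it):

```python
class MiniBus:
    """Toy in-process pub/sub standing in for the ROS graph."""
    def __init__(self):
        self.subs = {}

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        for callback in self.subs.get(topic, []):
            callback(msg)

def make_relay(bus, src, dst):
    # Forward every message arriving on src to dst, unmodified.
    bus.subscribe(src, lambda msg: bus.publish(dst, msg))

bus = MiniBus()
make_relay(bus, "robot/joint_states", "robot/joint_state")

received = []
bus.subscribe("robot/joint_state", received.append)
bus.publish("robot/joint_states", {"name": ["right_s0"], "position": [0.1]})
print(received)  # the message arrives on the remapped topic name
```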

6. Running the Experiment

To run the kitting experiment, run the command below:

rosrun birl_kitting_experiment smach_based_kitting_experiment_runner.py
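The runner's internals are not shown here; as its name suggests, it is built on SMACH, where each state executes and returns an outcome that selects the next state. A rough, hypothetical sketch of that structure without the smach library (state names and outcomes are made up, not the real runner's):

```python
# Each state is a function returning an outcome string; the machine follows
# the transition table until it reaches the terminal state.

def go_to_pick():
    print("moving to pick pose")
    return "succeeded"

def pick():
    print("closing gripper")
    return "succeeded"

def place():
    print("placing object in kit")
    return "succeeded"

# state -> (execute function, {outcome: next state})
STATES = {
    "GO_TO_PICK": (go_to_pick, {"succeeded": "PICK"}),
    "PICK": (pick, {"succeeded": "PLACE"}),
    "PLACE": (place, {"succeeded": "DONE"}),
}

def run(start="GO_TO_PICK"):
    state = start
    while state != "DONE":
        execute, transitions = STATES[state]
        outcome = execute()
        state = transitions[outcome]
    return state

print(run())  # after the per-state prints, outputs: DONE
```

In the real framework, anomaly flags from the detection service (step 7) can redirect these transitions to recovery behaviors instead of the nominal next state.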

Note: you need to distinguish the goal of running the experiment (training vs. testing). During training, you run the nominal version of the experiment, so there is no need to start the anomaly detection or anomaly classification services. During testing, those two services (listed below) should be started first. Anomaly detection publishes flags that are used by both the kitting experiment and the anomaly classification nodes. Warning messages will be published if a node is missing.

7. Launch the Anomaly Detection Service

This service has two modalities: training and testing. For further details click here.

  • Training: at least 5 sets of demonstrations are needed to train the model for anomaly detection
rosrun smach_based_introspection_framework anomaly_detection.py
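This page does not describe the detection algorithm itself. As a generic, hypothetical illustration of why several nominal demonstrations are needed, and not necessarily the framework's actual method: a detector can learn a score distribution from nominal runs and flag executions whose score falls far outside it.

```python
import statistics

# Hypothetical per-trial scores from nominal (training) executions,
# e.g. model log-likelihoods; at least 5 demos, as suggested above.
nominal_scores = [-10.2, -9.8, -10.5, -10.0, -9.9]

mean = statistics.mean(nominal_scores)
std = statistics.stdev(nominal_scores)
threshold = mean - 3.0 * std  # flag anything far below the nominal range

def is_anomalous(score):
    return score < threshold

print(is_anomalous(-10.1))  # False: within the nominal range
print(is_anomalous(-25.0))  # True: far below the learned threshold
```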

8. Launch Anomaly Classification

This service has two modalities: training and testing. For further details click here.

  • To train the anomaly classifier, some anomalous demonstrations are needed.
rosrun smach_based_introspection_framework redis_based_anomaly_classification.py
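The classifier's actual algorithm is not described on this page. Purely to illustrate why labeled anomalous demonstrations are required, here is a hypothetical nearest-centroid sketch: each anomaly type's demos are averaged into a centroid, and a new anomaly is labeled by its closest centroid (labels and feature values are made up):

```python
def centroid(vectors):
    # Component-wise mean of a list of equal-length feature vectors.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Made-up 2-D feature vectors extracted from labeled anomalous demos.
training = {
    "object_slip": [[0.9, 0.1], [1.0, 0.2]],
    "collision": [[0.1, 0.9], [0.2, 1.0]],
}

centroids = {label: centroid(vs) for label, vs in training.items()}

def classify(features):
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

print(classify([0.95, 0.15]))  # object_slip
print(classify([0.15, 0.95]))  # collision
```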