Piper Hand-Eye Calibration

  • Eye-in-Hand: The camera is mounted on the end effector of the robotic arm, and the calibration board is fixed in the workspace. Calibration yields the transformation matrix T_ee_cam (end effector to camera). For grasping applications: T_base_cam = T_base_ee × T_ee_cam

  • Eye-to-Hand: The calibration board is mounted on the end effector of the robotic arm, and the camera is fixed. Calibration yields the transformation matrix T_base_cam (base coordinate system to camera). For positioning applications: T_ee_cam = T_ee_base × T_base_cam
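
The two compositions above can be sketched with plain homogeneous transforms. This is a minimal numpy illustration (the rotations and translations are arbitrary placeholder values, not real calibration results):

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative values only: identity rotations, arbitrary translations.
T_base_ee = make_T(np.eye(3), [0.3, 0.0, 0.4])    # end effector pose in the base frame
T_ee_cam  = make_T(np.eye(3), [0.0, 0.05, 0.02])  # eye-in-hand calibration result

# Eye-in-hand: camera pose in the base frame, for grasping
T_base_cam = T_base_ee @ T_ee_cam

# Eye-to-hand: recover T_ee_cam from a fixed T_base_cam instead
T_ee_cam_check = np.linalg.inv(T_base_ee) @ T_base_cam
assert np.allclose(T_ee_cam_check, T_ee_cam)
```

With identity rotations the translations simply add, which makes the two formulas easy to sanity-check by hand.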

Prerequisites

  • ROS 2 Humble with colcon installed
  • An Intel RealSense camera (or another camera that publishes image and camera info topics)
  • A Piper robotic arm with a CAN adapter
  • A printed ArUco calibration board with a known marker ID and size

Calibration Steps

  1. Create a workspace and clone packages

    mkdir -p ~/handeye/src
    cd ~/handeye/src
    git clone -b humble-devel https://github.com/pal-robotics/aruco_ros.git
    git clone -b humble https://github.com/agilexrobotics/piper_ros.git
    git clone -b humble https://github.com/agilexrobotics/handeye_calibration_ros.git
    
  2. Build the workspace

    cd ~/handeye
    colcon build
    
  3. Run the ArUco marker detection program

    source ~/handeye/install/setup.bash
    
    # marker_id and marker_size must match the actual calibration board specifications
    ros2 launch aruco_ros single.launch.py eye:=left marker_id:=582 marker_size:=0.0677
    
  4. Connect the camera, run the camera program, and publish the camera image and camera info topics. This example uses an Intel RealSense camera.

    # "left" corresponds to the eye parameter in Step 3
    # Remap the original camera topics to the topics subscribed by the ArUco detection program
    ros2 run realsense2_camera realsense2_camera_node --ros-args \
    -p rgb_camera.color_profile:=640x480x60 \
    --remap /camera/camera/color/image_raw:=/stereo/left/image_rect_color \
    --remap /camera/camera/color/camera_info:=/stereo/left/camera_info
    
  5. Connect the Piper robotic arm, activate the CAN module, and run the robotic arm program

    bash ~/handeye/src/piper_ros/can_activate.sh
    source ~/handeye/install/setup.bash
    ros2 launch piper start_single_piper.launch.py can_port:=can0
    
  6. Run the Piper hand-eye calibration program

    # mode parameter: eye_in_hand or eye_to_hand (corresponding to eye-in-hand and eye-to-hand calibration modes)
    ros2 run handeye_calibration_ros handeye_calibration --ros-args \
    -p piper_topic:=/end_pose \
    -p marker_topic:=/aruco_single/pose \
    -p mode:=eye_in_hand
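
Internally, hand-eye calibration solves the classic AX = XB equation, where A is the relative motion of the end effector between two samples, B is the corresponding relative motion of the marker seen by the camera, and X is the unknown hand-eye transform. The numpy sketch below verifies this identity on synthetic data (all poses are made-up illustrative values); it is not the solver the calibration program uses, just a demonstration of the relation it exploits:

```python
import numpy as np

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# The unknown the calibration solves for, chosen here so we can verify the identity
X = make_T(rot_z(0.2), [0.0, 0.05, 0.02])           # hypothetical T_ee_cam

T_base_target = make_T(np.eye(3), [0.5, 0.0, 0.0])  # fixed board pose
poses = [make_T(rot_z(0.0), [0.3, 0.0, 0.4]),
         make_T(rot_z(0.5), [0.2, 0.1, 0.35])]      # two end effector poses

# What the camera would observe at each pose: T_cam_target = X^-1 @ T_ee_base @ T_base_target
obs = [np.linalg.inv(X) @ np.linalg.inv(T) @ T_base_target for T in poses]

# The relative motions satisfy the hand-eye equation A @ X = X @ B
A = np.linalg.inv(poses[1]) @ poses[0]
B = obs[1] @ np.linalg.inv(obs[0])
assert np.allclose(A @ X, X @ B)
```

This is also why Step 7 asks you to move the arm through several distinct poses: each pose pair contributes one A/B constraint on X.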
    
  7. Control the robotic arm and follow the terminal prompts from the calibration program. Two control methods are available:

    • Teach Mode: Press the teach button and drag the robotic arm directly. This uses MIT mode, which has lower precision than position-velocity mode.
    • Gamepad Control: Enables position-velocity mode for higher precision; this requires a gamepad teleoperation program.

Additional Notes

  1. View the image

    ros2 run image_view image_view --ros-args --remap /image:=/aruco_single/result
    
  2. View the ArUco marker pose

    ros2 topic echo /aruco_single/pose
    
  3. View the robotic arm end effector pose

    ros2 topic echo /end_pose
    
  4. The camera info topic contains intrinsic and distortion parameters, which are required by the ArUco marker detection program.

  5. For other cameras (e.g., industrial cameras, regular USB cameras), additional intrinsic and distortion calibration may be required. These parameters need to be encapsulated into the camera info topic.
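
Both pose topics above (/aruco_single/pose and /end_pose) publish a position plus a quaternion, which the calibration consumes as 4x4 homogeneous transforms. A minimal numpy sketch of that conversion (the pose values are illustrative, using the ROS x, y, z, w quaternion order):

```python
import numpy as np

def quat_to_matrix(x, y, z, w):
    """Convert a quaternion (x, y, z, w, ROS order) to a 3x3 rotation matrix."""
    n = x*x + y*y + z*z + w*w
    s = 2.0 / n
    return np.array([
        [1 - s*(y*y + z*z),     s*(x*y - z*w),     s*(x*z + y*w)],
        [    s*(x*y + z*w), 1 - s*(x*x + z*z),     s*(y*z - x*w)],
        [    s*(x*z - y*w),     s*(y*z + x*w), 1 - s*(x*x + y*y)],
    ])

def pose_to_T(position, orientation):
    """Build a 4x4 homogeneous transform from a Pose-style position + quaternion."""
    T = np.eye(4)
    T[:3, :3] = quat_to_matrix(*orientation)
    T[:3, 3] = position
    return T

# Illustrative pose: 90-degree rotation about z at (0.3, 0.0, 0.4)
T = pose_to_T([0.3, 0.0, 0.4], [0.0, 0.0, np.sin(np.pi/4), np.cos(np.pi/4)])
```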
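
For cameras that need offline calibration (note 5), the resulting parameters map onto the fields of a sensor_msgs/CameraInfo message: the 3x3 intrinsic matrix is stored row-major in the flat `k` array, and the distortion coefficients go in `d`. A sketch of that packaging with hypothetical intrinsic values (publishing the message itself would additionally require rclpy):

```python
import numpy as np

# Hypothetical intrinsics from an offline calibration
fx, fy = 615.0, 615.0  # focal lengths in pixels
cx, cy = 320.0, 240.0  # principal point for a 640x480 image
D = [0.1, -0.25, 0.001, 0.0005, 0.0]  # plumb_bob distortion: k1, k2, p1, p2, k3

# CameraInfo stores the 3x3 intrinsic matrix K row-major in its flat `k` field
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
k_field = K.flatten().tolist()  # the 9-element CameraInfo.k array
```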