Autonomous air hockey robot
Brief overview
The project aims to have a Franka Emika 7-DOF robotic arm autonomously play air hockey against an opponent. The project was inspired by the fast dynamics of the game and the variability in gameplay, which rules out hard-coded behavior: decisions must be made in real time. The project uses a RealSense camera with OpenCV to detect the puck and compute the transformation of the puck relative to the robot. The puck coordinates are then used to predict the puck's trajectory and estimate where the robot should hit the puck along the predicted trajectory line. The Robot Operating System (ROS2) and the MoveIt2 motion planning framework are used to interface with the robot and control its movements. A ROS2 Python API wrapper for MoveIt2 was developed because one was not available at the time this project was built; this custom API lives in the moveit_helper package.
Video demo
Collaborators
- Marno (Marthinus) Nel
- Team Leader, Systems Integrator (Git), trajectory calculations, robot manipulation, and assisted with computer vision.
- Ritika Ghosh
- Hanyin Yuan
- Ava Zahedi
Concepts and Overall System Architecture
The process loop of the robot is as follows:
Start-Up Sequence
- Upon startup, the robot follows a start-up sequence to reach its home position. It follows a series of waypoints to the home x- and y-coordinates with an offset in z, then reaches down to grasp the paddle (with an adapter) and moves back up slightly. This slight increase in height gives the robot flexibility while moving: if it pushes down during a motion, it does not apply force into the table, while the paddle stays level with the table. A minimal sketch of this sequence is shown below.
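The sketch below illustrates the start-up logic only; the pose values, the `MoveItHelper` class, and its `plan_and_execute` / `grasp` methods are hypothetical placeholders, not the actual moveit_helper API.

```python
# Hypothetical sketch of the start-up sequence. MoveItHelper and its methods
# are illustrative assumptions, not the real moveit_helper API.
from geometry_msgs.msg import Pose

Z_TABLE = 0.02    # paddle height at table level (assumed, meters)
Z_OFFSET = 0.01   # slight lift so the robot never presses into the table (assumed)

def startup_sequence(helper: "MoveItHelper") -> None:
    """Move through waypoints to home, grasp the paddle, then lift slightly."""
    home_above = Pose()
    home_above.orientation.w = 1.0          # paddle kept level with the table
    home_above.position.x = 0.45            # home x (assumed)
    home_above.position.y = 0.0             # home y (assumed)
    home_above.position.z = Z_TABLE + 0.10  # approach home with a z offset

    # Follow a series of waypoints down to the home x/y with a z offset.
    helper.plan_and_execute(home_above)

    # Reach down to the paddle (with adapter) and grasp it.
    at_paddle = Pose()
    at_paddle.orientation.w = 1.0
    at_paddle.position.x = home_above.position.x
    at_paddle.position.y = home_above.position.y
    at_paddle.position.z = Z_TABLE
    helper.plan_and_execute(at_paddle)
    helper.grasp()

    # Move back up slightly to keep flexibility during play.
    at_paddle.position.z = Z_TABLE + Z_OFFSET
    helper.plan_and_execute(at_paddle)
```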
Computer Vision
- An Intel RealSense D435i is used at 480x270 resolution and 90 fps, allowing the puck to be tracked at 90 frames per second. As soon as streaming is enabled, this node detects the center of the table in pixel coordinates. Then, using the depth camera, the RealSense deprojection function converts pixel coordinates into real-world coordinates with respect to the camera. Since the distance between the air hockey table and the Franka robot is known, points in the camera's frame of reference can be transformed into the robot's frame of reference. Next, OpenCV's HoughCircles function tracks the center of the puck in real time. For the trajectory calculation, the puck is only tracked while it moves toward the robot and only up to the center of the table. To reject noise, before publishing the puck's position the node checks whether the point lies within a fixed tolerance of the best-fit line through the previously observed positions. Note: the output video shows the tracked puck encircled with a black border regardless of whether these points are published (in other words, the video shows the contour for every direction of puck movement). A sketch of the detection and deprojection step follows.
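The sketch below shows the detection-and-deprojection step, assuming a color image, a depth frame, and the depth stream intrinsics are already available from the RealSense pipeline; the HoughCircles parameters and puck radius bounds are illustrative guesses, not the tuned values used in the project.

```python
import cv2
import numpy as np
import pyrealsense2 as rs

def locate_puck(color_image, depth_frame, intrinsics):
    """Detect the puck in the color image and deproject its center to 3D
    camera-frame coordinates. Returns [x, y, z] in meters, or None."""
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)

    # Hough circle transform; dp/param/radius values are illustrative guesses.
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
        param1=100, param2=30, minRadius=5, maxRadius=25)
    if circles is None:
        return None

    # Take the strongest detection as the puck center (pixel coordinates).
    px, py, _ = np.round(circles[0][0]).astype(int)

    # Depth at the puck center, then deproject pixel -> 3D point (camera frame).
    depth = depth_frame.get_distance(px, py)
    if depth <= 0.0:
        return None
    return rs.rs2_deproject_pixel_to_point(intrinsics, [float(px), float(py)], depth)
```

The intrinsics can be read from the depth stream profile (e.g., `depth_frame.profile.as_video_stream_profile().intrinsics`); the resulting camera-frame point is then transformed into the robot's frame using the known table-to-robot offset.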
Trajectory Calculations
This node calculates the predicted trajectory of the puck and the play waypoints for the robot from two puck coordinates provided by computer vision. It handles collisions with the table walls by reflecting the impact angle about the wall normal. The waypoints for hitting the puck are constrained to the robot's workspace on the air hockey table; candidate waypoints on all four sides of the workspace are considered and the best ones are selected. The robot then moves to the first waypoint on the predicted trajectory line and travels along the line to the second waypoint to hit the puck. A plot is dynamically generated and updated each time a new trajectory is calculated. If the trajectory falls outside the workspace and is unreachable, the robot blocks instead. A simplified sketch of the prediction is shown below.
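The sketch below is a simplified version of the trajectory prediction, assuming the puck travels toward the robot along +x and the side walls lie at known y positions; the wall locations, frame conventions, and hitting line are assumptions and much simpler than the node's actual waypoint selection over all four workspace sides.

```python
import numpy as np

TABLE_Y_MIN, TABLE_Y_MAX = -0.40, 0.40   # side wall positions (assumed, meters)

def predict_puck_y(p1, p2, x_hit):
    """Predict the puck's y position when it reaches x_hit, given two observed
    positions p1 and p2 (each (x, y)), reflecting off the side walls."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    dx, dy = p2 - p1
    if dx <= 0.0:
        return None  # puck not moving toward the robot

    # Extrapolate along the straight line to the hitting x-coordinate.
    y = p2[1] + dy / dx * (x_hit - p2[0])

    # Fold the unbounded y back into the table by reflecting about the walls,
    # which mirrors the impact angle about the wall normal at each bounce.
    width = TABLE_Y_MAX - TABLE_Y_MIN
    y = (y - TABLE_Y_MIN) % (2.0 * width)
    if y > width:
        y = 2.0 * width - y
    return TABLE_Y_MIN + y
```

For instance, with walls at ±0.40 m, observations (0.6, 0.10) and (0.7, 0.30) predict the puck at y ≈ -0.30 m when it reaches x = 1.2 m, after bouncing off both walls.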
Hit the Puck
- After the waypoint and goal positions are computed, the node issues service calls that move the robot to those points, thereby meeting the puck along its trajectory and hitting it. In edge cases where the robot cannot successfully meet the puck given its trajectory, it blocks instead. A hypothetical sketch of such a service call is shown below.
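The sketch below shows how such a service call could be issued with rclpy; the `HitPuck` service type, the `airhockey_interfaces` package, the request fields, and the service name are hypothetical placeholders, not the project's actual interfaces.

```python
# Hypothetical service client for sending hit waypoints to the motion node.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Point

# Assumed custom service: request carries two waypoints on the trajectory line.
from airhockey_interfaces.srv import HitPuck  # hypothetical interface package


class HitClient(Node):
    def __init__(self):
        super().__init__('hit_client')
        self.client = self.create_client(HitPuck, 'hit_puck')

    def send_hit(self, wp1: Point, wp2: Point):
        """Ask the motion node to move to wp1, then along the line to wp2."""
        self.client.wait_for_service(timeout_sec=2.0)
        request = HitPuck.Request()
        request.waypoint = wp1
        request.goal = wp2
        future = self.client.call_async(request)
        rclpy.spin_until_future_complete(self, future)
        return future.result()
```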