Collecting Game Pieces
With a calibrated robot arm and a calibrated vision system we are now in a position to let the robot arm autonomously react to its environment. The final goal is to make it play Tic-Tac-Toe against a human opponent, but as an intermediate (and simpler) task we make the robot collect game pieces from the base board. This requires that the system correctly detects the game pieces, then picks up each piece and drops it into a bin at a fixed location. The final 'product' looks like this:
The Collector application is part of the Collector.sln solution (http://code.google.com/p/robotic-tic-tac-toe-lynxmotion/source/browse/trunk/#trunk/Collector).
The two buttons at the bottom right trigger robot moves:
- The Home button causes the robot to move to its home position.
- The Collect button starts the collect task. The robot collects the detected game pieces and drops them into the collection bin. Once this is done it moves to its home position.
Before pressing either of the two buttons, the calibration data files that were created in the previous sections need to be loaded by clicking on the corresponding File menu entries:
Once the locations of the calibration files are defined, the application will try to automatically load the configuration data during subsequent start-ups of the app.
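One common way to implement this kind of persistence in a WinForms application is via .NET user settings. The sketch below is only illustrative; the property and method names (RobotCalibrationFilePath, LoadRobotCalibration) are ours, not necessarily the ones used in the Collector project:

```csharp
// After the user picks a calibration file, remember its path:
Properties.Settings.Default.RobotCalibrationFilePath = chosenPath;
Properties.Settings.Default.Save();

// On the next start-up, try to reload it automatically:
string path = Properties.Settings.Default.RobotCalibrationFilePath;
if (!String.IsNullOrEmpty(path) && File.Exists(path))
{
    LoadRobotCalibration(path);
}
```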
The image handling code is very similar to the code of the camera calibration application. Again OpenCV's FindContours function is used, but this time we are no longer just looking for rectangles. The app displays the detected blobs (game pieces) in the picture box at the bottom (see screenshot above). To make the order of the detected blobs visible, they are drawn with increasing line width.
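As a rough sketch of what this contour-based detection can look like with the Emgu CV wrapper for OpenCV: the method name, the threshold value of 128, and the minimum-area filter of 100 below are all illustrative choices of ours, not the values used by the Collector app (which may also target an older API version):

```csharp
using Emgu.CV;
using Emgu.CV.CvEnum;
using Emgu.CV.Structure;
using Emgu.CV.Util;

private void DrawDetectedBlobs(Image<Bgr, byte> frame)
{
    // Convert to gray and threshold; 128 and the 100-pixel area cutoff
    // are made-up illustration values.
    using (Image<Gray, byte> binary = frame.Convert<Gray, byte>()
        .ThresholdBinary(new Gray(128), new Gray(255)))
    using (VectorOfVectorOfPoint contours = new VectorOfVectorOfPoint())
    {
        CvInvoke.FindContours(binary, contours, null,
            RetrType.External, ChainApproxMethod.ChainApproxSimple);

        for (int i = 0; i < contours.Size; i++)
        {
            if (CvInvoke.ContourArea(contours[i]) < 100)
                continue; // skip small noise blobs

            // Increasing line width makes the detection order visible.
            CvInvoke.DrawContours(frame, contours, i,
                new MCvScalar(0, 255, 0), i + 1);
        }
    }
}
```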
Controlling the Robot Movements
The code that controls the arm movements resides in the MainFormModel class (MainFormModel.cs). As an example, let's look at the StartMoveToHomePosition function, which is executed when the Home button is pressed:
internal void StartMoveToHomePosition()
{
    this.IsBusy = true;
    Action action = new Action(MoveToHomePosition);
    action.BeginInvoke(new AsyncCallback(OnMoveToHomePositionComplete), null);
}

private void MoveToHomePosition()
{
    Debug.WriteLine("Moving to home position.");
    MoveLogicalCommand moveCommand = new MoveLogicalCommand(m_Servos);
    JointAngles jointAngles = GetJointAngles(m_HomeLocation.X, m_HomeLocation.Y, m_HomeLocation.Z);
    moveCommand[ChannelId.Base] = jointAngles.BaseAngle;
    moveCommand[ChannelId.Shoulder] = jointAngles.ShoulderAngle;
    moveCommand[ChannelId.Elbow] = jointAngles.ElbowAngle;
    moveCommand[ChannelId.WristUpDown] = jointAngles.WristTiltAngle;
    moveCommand[ChannelId.Gripper] = c_OpenGripperPosition;
}

private void OnMoveToHomePositionComplete(IAsyncResult result)
{
    this.IsBusy = false;
}
Since the move takes some time, the code that controls it is executed asynchronously. While the move is in progress the IsBusy flag is set to true, which prevents additional moves from being triggered. A new MoveLogicalCommand instance is created and initialized with the desired joint angles and gripper position. The desired joint angles are calculated in the function GetJointAngles from the coordinates of the target location:
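On later .NET versions the same busy-flag pattern can be expressed more compactly with Task.Run and async/await instead of delegate BeginInvoke. This is only a sketch of an equivalent alternative (the method name StartMoveToHomePositionAsync is ours), not how the Collector code is actually written:

```csharp
internal async void StartMoveToHomePositionAsync()
{
    this.IsBusy = true;
    try
    {
        // Run the blocking move off the UI thread.
        await Task.Run(() => MoveToHomePosition());
    }
    finally
    {
        // Clear the flag even if the move throws.
        this.IsBusy = false;
    }
}
```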
private JointAngles GetJointAngles(double x, double y, double z)
{
    z = z + RobotArm.EndEffectorLength;
    Vector3 wristLocation = new Vector3(x, y, z);
    JointAngles jointAngles = RobotArm.DoInverseKinematics(wristLocation);
    jointAngles.WristTiltAngle = GetWristTiltAngleForVertical(jointAngles);
    return jointAngles;
}

private double GetWristTiltAngleForVertical(JointAngles jointAngles)
{
    return -90 + jointAngles.ShoulderAngle - jointAngles.ElbowAngle;
}
The angle for the wrist is calculated so that the gripper points vertically downwards.
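To make the formula concrete, here is a quick numeric check with arbitrarily chosen angle values (all in degrees):

```csharp
// For a shoulder angle of 60° and an elbow angle of 30°,
// GetWristTiltAngleForVertical yields -90 + 60 - 30 = -60.
double shoulderAngle = 60.0;
double elbowAngle = 30.0;
double wristTilt = -90.0 + shoulderAngle - elbowAngle; // -60.0
```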
The remaining move-related code follows the same pattern and hopefully is fairly self-explanatory.
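For orientation, the collect task triggered by the Collect button boils down to a pick-and-place loop over the detected blobs. The following is a hypothetical sketch of that loop; the helper names (MoveTo, CloseGripper, OpenGripper, m_BinLocation, c_SafeHeight) are ours and not the project's actual API:

```csharp
private void CollectGamePieces(IEnumerable<Vector3> pieceLocations)
{
    foreach (Vector3 piece in pieceLocations)
    {
        MoveTo(piece.X, piece.Y, c_SafeHeight);  // approach from above
        MoveTo(piece.X, piece.Y, piece.Z);       // lower onto the piece
        CloseGripper();
        MoveTo(piece.X, piece.Y, c_SafeHeight);  // lift clear of the board
        MoveTo(m_BinLocation.X, m_BinLocation.Y, c_SafeHeight);
        OpenGripper();                           // drop into the bin
    }
    MoveToHomePosition();                        // finish at the home position
}
```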