In this project, computer vision and motion planning are used to control a robotic arm to grab a pen and place it inside a box.
The following video demonstrates the arm grabbing the pen:
Fig. 1 Grab The Pen
This project contains 2 parts: pen detection with computer vision, and arm control with motion planning.
To locate the pen in 3D space, we use OpenCV.
The first step is to convert the RGB image to HSV, where it is easier to filter the image and isolate the purple color of the pen (see Fig. 2, left image).
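A minimal sketch of this thresholding step, assuming a BGR frame from the camera; the purple HSV bounds are placeholders that would need tuning for the actual pen and lighting:

```python
import cv2
import numpy as np

def purple_mask(bgr_image):
    """Return a binary mask isolating purple regions of a BGR image."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = np.array([120, 60, 40])    # assumed lower HSV bound for purple
    upper = np.array([160, 255, 255])  # assumed upper HSV bound for purple
    return cv2.inRange(hsv, lower, upper)
```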
Having isolated the pen in the image, we detect contours and keep only those whose area exceeds a threshold (see Fig. 2, right image).
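A sketch of the contour filtering, taking the binary mask from the previous step; the area threshold is an assumed value:

```python
import cv2

MIN_AREA = 500  # pixels; assumed threshold, tuned experimentally

def find_pen(mask):
    """Keep contours above the area threshold and return the largest one with its centroid."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) > MIN_AREA]
    if not big:
        return None, None
    pen = max(big, key=cv2.contourArea)
    # The centroid of the contour gives the pen's pixel location.
    m = cv2.moments(pen)
    cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
    return pen, (cx, cy)
```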
Once the pen is detected in the RGB image, we know its pixel location in the depth image as well, and from there we retrieve its 3D coordinates.
Fig. 2 Pen recognition
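A sketch of retrieving the 3D coordinates from the pen's pixel location. The text above only states that a depth image is used; the Intel RealSense camera, the pyrealsense2 API, and the aligned depth stream here are assumptions:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()
align = rs.align(rs.stream.color)  # align depth pixels to the color image

frames = align.process(pipeline.wait_for_frames())
depth_frame = frames.get_depth_frame()
intrinsics = frames.get_color_frame().profile.as_video_stream_profile().get_intrinsics()

# (cx, cy) is the pen centroid found in the RGB image above (placeholder values here).
cx, cy = 320, 240
depth_m = depth_frame.get_distance(cx, cy)  # depth in meters at that pixel
x, y, z = rs.rs2_deproject_pixel_to_point(intrinsics, [cx, cy], depth_m)
print(f"Pen at ({x:.3f}, {y:.3f}, {z:.3f}) m in the camera frame")
```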
In this part, the Interbotix motion and control packages are used to move the arm to the specified coordinates and grab the pen.
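A minimal sketch of such a grab sequence using the Interbotix Python API (interbotix_xs_modules, ROS 1 version); the robot model and the target coordinates are placeholders, not the project's actual values:

```python
from interbotix_xs_modules.arm import InterbotixManipulatorXS

# Robot model is assumed; replace with the arm actually used.
bot = InterbotixManipulatorXS("px100", "arm", "gripper")

bot.gripper.open()
# Move the end effector to the pen's coordinates (expressed in the robot base frame).
bot.arm.set_ee_pose_components(x=0.25, y=0.0, z=0.05)
bot.gripper.close()
# Carry the pen over the box and release it.
bot.arm.set_ee_pose_components(x=0.10, y=0.20, z=0.15)
bot.gripper.open()
bot.arm.go_to_sleep_pose()
```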
The arm motion can be split into 4 parts: