Computer Vision-Controlled Robot Arm
Python, OpenCV, Transforms
Overview
Goal: Use the PincherX 100 robot arm to autonomously grab a purple pen.
GitHub: https://github.com/henryburon/pen-thief
Process (illustrative code sketches for each step follow the list):
- Detect Location of the Purple Pen
- First, I used the RGB image from an Intel RealSense camera to create an HSV mask that filtered out every color except purple.
- Identify Contour and Calculate Centroid
- I found contours around the masked pixels and computed the 2D pixel coordinates of the centroid of the largest contour, which I assumed to be the pen.
- Align the Images
- I then aligned the camera’s depth map with the RGB image and used the depth at the centroid pixel to recover the pen’s 3D coordinates in the camera’s reference frame.
- Transform to Robot Frame
- I converted these coordinates into the robot arm’s frame, accounting for the fixed 90° rotation and translational offset between the camera and the arm.
- Command the End-Effector
- Finally, I used the InterbotixManipulatorXS Python API to move the end-effector to the pen’s position and close the gripper around it.
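Below are short, illustrative sketches of each step. They are my reconstructions from the description above, with placeholder values and helper names, not code from the repository.

For the detection step, the mask can be built by converting the RealSense color image to HSV and thresholding; the purple bounds here are assumed placeholders that would need tuning for the actual lighting and pen:

```python
import cv2
import numpy as np

# Placeholder HSV bounds for "purple"; the real thresholds depend on lighting and the pen.
PURPLE_LOWER = np.array([120, 80, 60])
PURPLE_UPPER = np.array([155, 255, 255])

def purple_mask(bgr_image):
    """Keep only purple-ish pixels as a binary mask."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, PURPLE_LOWER, PURPLE_UPPER)
    # Morphological opening to suppress speckle noise in the mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask
```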
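For the contour and centroid step, a minimal version using OpenCV contours and image moments (the helper name pen_centroid is mine):

```python
def pen_centroid(mask):
    """Return the (x, y) pixel centroid of the largest contour in the mask, or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```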
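The alignment step would use pyrealsense2 to align depth frames to the color stream and deproject the centroid pixel into a 3D point in the camera frame; the stream settings below are assumptions:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

# Align depth frames to the color stream so the centroid pixel indexes both images.
align = rs.align(rs.stream.color)
frames = align.process(pipeline.wait_for_frames())
color_frame = frames.get_color_frame()
depth_frame = frames.get_depth_frame()

# (cx, cy) is the pen centroid from the previous sketches.
cx, cy = pen_centroid(purple_mask(np.asanyarray(color_frame.get_data())))

# Deproject the centroid pixel into a 3D point (meters) in the camera frame.
intrinsics = color_frame.profile.as_video_stream_profile().intrinsics
depth_m = depth_frame.get_distance(cx, cy)
pen_camera_xyz = rs.rs2_deproject_pixel_to_point(intrinsics, [cx, cy], depth_m)
```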
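The camera-to-robot transform is a fixed rotation plus a translational offset; the specific axis and offset values below are placeholders, not the calibration used in the project:

```python
# Placeholder extrinsics: 90° rotation about z plus a fixed translation (meters).
ROT_90_Z = np.array([
    [0.0, -1.0, 0.0],
    [1.0,  0.0, 0.0],
    [0.0,  0.0, 1.0],
])
CAMERA_OFFSET = np.array([0.10, 0.0, 0.05])  # camera origin expressed in the robot base frame

def camera_to_robot(point_camera):
    """Transform a 3D point from the camera frame to the robot base frame."""
    return ROT_90_Z @ np.asarray(point_camera) + CAMERA_OFFSET

pen_robot_xyz = camera_to_robot(pen_camera_xyz)
```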
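Finally, commanding the arm through the Interbotix Python API might look like the following sketch; the exact import path depends on the ROS version, and the pose and pitch values are assumptions:

```python
from interbotix_xs_modules.arm import InterbotixManipulatorXS  # import path differs under ROS 2

bot = InterbotixManipulatorXS(robot_model="px100", group_name="arm", gripper_name="gripper")

x, y, z = pen_robot_xyz  # pen position in the robot base frame from the transform above
bot.arm.go_to_home_pose()
bot.gripper.open()
bot.arm.set_ee_pose_components(x=x, y=y, z=z, pitch=0.0)  # move the end-effector to the pen
bot.gripper.close()                                        # grasp the pen
bot.arm.go_to_sleep_pose()
```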