Tasneef Robotic Arm

Tasneef is a miniature 5-DOF robotic arm that sorts integrated circuit boards using machine learning and computer vision.

Abstract:

This work presents the design and implementation of a pick-and-place, 5-degrees-of-freedom (DOF), 3D-printed robotic arm guided by computer vision. Machine learning is used to detect the objects to be sorted: a ResNet-50 model pre-trained on the Common Objects in Context (COCO) dataset was further trained on a GPU, using the Faster R-CNN architecture, to identify several microcontroller boards placed on a workspace table. A dataset of 300 manually labelled images, expanded to 1200 images through data augmentation, was used for training. The trained model identifies the objects with high accuracy and returns the pixel location of each item. These pixel locations are then mapped to real-world coordinates with the help of AprilTags and a mapping function. Forward and inverse kinematics formulas were derived to calculate the servo angles needed to move the arm to the desired coordinates. The angles are sent to a designated MQTT broker, which relays them to the microcontroller driving the robotic arm. The system is implemented as a multithreaded Python application, with the PyQt5 and TensorFlow (object detection) libraries as its backbone. A graphical user interface serves as the front end, presenting useful system information to the end user and starting and stopping the system. A simulation of the robotic arm, built with Python data science libraries (Matplotlib and NumPy), is used to test different real-life scenarios and configurations of the arm before deployment on the physical hardware.
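
To make the pipeline above more concrete, a few minimal sketches follow. They are illustrations under stated assumptions, not the project's actual source code.

Pixel-to-world mapping: the sketch below assumes the centers of four AprilTags have been detected in the image and that their positions on the workspace table are known, and uses an OpenCV planar homography as the mapping function. All coordinates, names, and units are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical pixel centers of four AprilTags detected in the camera frame,
# and the known positions (cm) of those tags on the workspace table.
tag_pixels = np.array([[102, 88], [538, 92], [545, 410], [98, 405]], dtype=np.float32)
tag_world = np.array([[0, 0], [40, 0], [40, 30], [0, 30]], dtype=np.float32)

# Planar homography mapping image pixels to table coordinates.
H, _ = cv2.findHomography(tag_pixels, tag_world)

def pixel_to_world(u, v):
    """Map a detected object's pixel location to table coordinates."""
    point = np.array([[[u, v]]], dtype=np.float32)
    x, y = cv2.perspectiveTransform(point, H)[0, 0]
    return x, y

print(pixel_to_world(320, 240))  # e.g. the center pixel returned by the detector
```

Inverse kinematics: the project derives its own formulas for a 5-DOF arm whose dimensions are not listed here, so the sketch below only shows the standard two-link planar solution that such derivations typically build on, with hypothetical link lengths.

```python
import numpy as np

L1, L2 = 10.0, 12.0  # hypothetical link lengths (cm)

def two_link_ik(x, z):
    """One of the two standard solutions for a two-link planar arm."""
    cos_elbow = (x**2 + z**2 - L1**2 - L2**2) / (2 * L1 * L2)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    shoulder = np.arctan2(z, x) - np.arctan2(L2 * np.sin(elbow), L1 + L2 * np.cos(elbow))
    return np.degrees(shoulder), np.degrees(elbow)

print(two_link_ik(15.0, 8.0))  # shoulder and elbow angles (degrees) for a reachable target
```

Publishing the angles: handing the computed servo angles to the MQTT broker could look roughly like this, assuming the paho-mqtt client library; the broker address, topic name, and JSON payload format are assumptions.

```python
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"  # hypothetical broker address
TOPIC = "tasneef/arm/angles"   # hypothetical topic name

client = mqtt.Client()  # paho-mqtt 1.x style; 2.x also takes a CallbackAPIVersion argument
client.connect(BROKER, 1883, keepalive=60)

# Servo angles (degrees) as produced by the kinematics stage; the payload format is assumed.
angles = {"base": 45.0, "shoulder": 60.0, "elbow": 30.0, "wrist": 90.0, "gripper": 10.0}
client.publish(TOPIC, json.dumps(angles), qos=1)
client.disconnect()
```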
