Abstract
The goal of this project is to control a robotic arm in real time through computer vision. A live video feed serves as input: hand gestures are detected in each frame using Google's MediaPipe framework together with OpenCV, and the recognized gesture is sent to an Arduino Uno over serial communication. The Arduino Uno in turn drives the servos of the robotic arm. The code was implemented in Python. This work explores the potential for intuitive control of machines and robots through human-machine interaction, and it addresses SDG 9 (Industry, Innovation and Infrastructure).
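As a minimal sketch of the pipeline summarized above (the helper names and the one-line serial protocol are hypothetical, not taken from the project), a normalized fingertip coordinate of the kind a hand-landmark detector such as MediaPipe produces can be mapped to a servo angle and packed into a command for the Arduino:

```python
def landmark_to_servo_angle(x_norm: float) -> int:
    """Map a normalized landmark coordinate (0.0-1.0, as returned by
    hand-landmark detectors such as MediaPipe) to a servo angle (0-180)."""
    x_norm = min(max(x_norm, 0.0), 1.0)  # clamp to the valid range
    return round(x_norm * 180)

def serial_command(servo_id: int, angle: int) -> bytes:
    """Encode a simple text command an Arduino sketch could parse,
    e.g. b'S1:90\\n' for servo 1 at 90 degrees (hypothetical protocol)."""
    return f"S{servo_id}:{angle}\n".encode("ascii")

# In the real system this would run once per video frame, roughly:
#   results = hands.process(frame)                     # MediaPipe Hands
#   x = results.multi_hand_landmarks[0].landmark[8].x  # index fingertip
#   port.write(serial_command(1, landmark_to_servo_angle(x)))  # pySerial
```

The camera loop, MediaPipe setup, and pySerial port are omitted here since they require hardware; only the coordinate-to-angle mapping and message framing are shown.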