The paper investigates how to recognize distinct robot-arm movements in noisy environments using machine learning. A vision system tracks the robot's movements, and a deep learning model extracts the arm's key points, from which actions are classified. The model's effectiveness and robustness are evaluated on a case study: a Tic-Tac-Toe game played on a 3-by-3 grid, where the system must accurately identify the arm's actions. The results show that the approach achieves precise key point detection and action classification despite noise and uncertainty in the dataset.
Publication date: 19 Jan 2024
Project Page: Not provided
Paper: https://arxiv.org/pdf/2401.09606
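To make the grid-based action classification concrete, here is a minimal sketch of one plausible step: mapping a detected end-effector key point to a cell of the 3-by-3 board. The function name, coordinate convention, and board bounds are assumptions for illustration, not taken from the paper.

```python
def keypoint_to_cell(x, y, board_x0, board_y0, board_x1, board_y1):
    """Map a key point (x, y) in image coordinates to the (row, col) of
    the 3-by-3 grid cell containing it, or None if it lies off the board.

    Board bounds are half-open pixel ranges [board_x0, board_x1) and
    [board_y0, board_y1); all names here are hypothetical."""
    if not (board_x0 <= x < board_x1 and board_y0 <= y < board_y1):
        return None
    cell_w = (board_x1 - board_x0) / 3
    cell_h = (board_y1 - board_y0) / 3
    col = int((x - board_x0) // cell_w)
    row = int((y - board_y0) // cell_h)
    return row, col

if __name__ == "__main__":
    # Board occupies pixels [100, 400) x [50, 350); a key point at
    # (250, 200) falls in the centre cell.
    print(keypoint_to_cell(250, 200, 100, 50, 400, 350))  # → (1, 1)
```

In a full pipeline, a step like this would run on the key points emitted by the detection model, with the classified cell sequence forming the recognized action.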