We are a group of undergraduate students who developed a prosthetic arm that mimics hand movements!
Our Project!
We created a prosthetic arm that uses surface electromyography (sEMG) data and machine learning to mimic the user's hand movements in real time.
Demo Day Final Product!
Unfortunately, despite our very best efforts, we couldn't get our final product to function as we intended for our stretch goals. This can mostly be attributed to our DC motors not being precise enough with the way we were controlling them. For future iterations of the project, it would be much easier to use regular servo motors: they can be connected directly to the Arduino, eliminating the motor shields, and they are much better at precise movement, which is why we used them in our first two sprints. Alternatively, we could keep the DC motors but implement a closed-loop control system with a high-resolution feedback sensor (an encoder) and a well-tuned controller.
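To make the closed-loop idea concrete, here is a minimal sketch of a PID position controller driving a toy simulated DC motor joint toward a target encoder angle. The gains, the plant model, and all names are illustrative assumptions, not values from our actual build; on real hardware the simulated plant line would be replaced by an encoder read and a PWM write.

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*∫e dt + kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def simulate(target_deg, steps=2000, dt=0.001):
    """Crude first-order motor model: joint velocity proportional to drive.
    Gains and the plant constant are made-up illustrative values."""
    pid = PID(kp=8.0, ki=0.5, kd=0.2, dt=dt)
    position = 0.0
    for _ in range(steps):
        drive = pid.update(target_deg, position)
        drive = max(-255.0, min(255.0, drive))  # clamp to a PWM-like range
        position += drive * 2.0 * dt            # toy plant dynamics
        # on real hardware: read the encoder here instead of integrating a model
    return position
```

Even with this crude plant, the controller settles near the target, which is exactly what open-loop PWM on our DC motors couldn't guarantee.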
Project Overview
This project aims to develop a prosthetic arm that uses surface electromyography (sEMG) data to replicate natural hand movements. The prosthetic will mimic various hand movements, including simple gestures like rock-paper-scissors, wrist motion, and finger flexion. By interpreting sEMG signals from sensors placed on the user’s arm, the system will control the prosthetic’s fingers and wrist to create intuitive, fluid movements.
MECHANICAL DESIGN
Design and 3D print a fully functional prosthetic arm with joints and moving fingers
Develop a forearm block that serves as a base, with motors controlling finger movements and wrist rotation
Stretch Goals: Add wrist movement (rotation and flexion) and ensure the arm's motors are optimized for size, strength, and cost
SOFTWARE SYSTEM
Train a machine learning model to identify hand gestures and map the signals to the prosthetic's motors
Develop a user interface for real-time feedback and testing, ensuring the model works seamlessly with the hardware
Develop an effective data collection system so that anyone who wants to use this code can collect their own data and train a model to recognize their own gestures
Stretch Goals: Achieve real-time gesture recognition for more complex actions, such as wrist movement, and continue tuning and testing the model for improved accuracy and user adaptability
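The software goals above can be sketched end to end: window the sEMG stream, extract a simple feature per channel (RMS amplitude is a common choice), and classify the feature vector. This is a hedged stand-in for our actual pipeline; the feature set, model, and all names here are illustrative (a tiny nearest-centroid classifier instead of whatever model a real user would train).

```python
import math
from collections import defaultdict


def rms(window):
    """Root-mean-square amplitude of one sEMG window, a common simple feature."""
    return math.sqrt(sum(x * x for x in window) / len(window))


def featurize(channels):
    """One RMS feature per sensor channel (channels: list of sample windows)."""
    return [rms(ch) for ch in channels]


class NearestCentroid:
    """Toy classifier: store the mean feature vector per gesture label,
    then predict the label whose centroid is closest."""

    def fit(self, X, y):
        sums = {}
        counts = defaultdict(int)
        for feats, label in zip(X, y):
            sums.setdefault(label, [0.0] * len(feats))
            for i, f in enumerate(feats):
                sums[label][i] += f
            counts[label] += 1
        self.centroids = {
            label: [s / counts[label] for s in total]
            for label, total in sums.items()
        }
        return self

    def predict(self, feats):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(feats, c))
        return min(self.centroids, key=lambda label: sq_dist(self.centroids[label]))
```

A user collecting their own data would record labeled windows per gesture, call `fit`, and then feed live feature vectors to `predict`, which is the shape of the real-time loop regardless of the model swapped in.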
ELECTRICAL DESIGN
Integrate individual motors to control each finger and wrist movement independently
Use MyoWare sEMG sensors to collect muscle data for training a model that produces accurate movements
Streamline the electrical components to fit within the arm or in an external, inconspicuous enclosure
FIRMWARE SYSTEM
Reliably send signals from the sensors to the Arduino and then to the Raspberry Pi, where our model makes its prediction
Use the Arduino to move the servos based on the model's prediction
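A sketch of the Pi-side glue for this firmware path. The framing is an assumption for illustration: the Arduino streams comma-separated sensor readings one line per sample, and accepts a short one-line gesture command back. The gesture codes and function names are made up; only the parse/encode helpers are shown runnable, with the pyserial loop indicated in comments.

```python
# Hypothetical command bytes the firmware would accept (illustrative).
GESTURES = {"rock": b"G0\n", "paper": b"G1\n", "scissors": b"G2\n"}


def parse_sample(line: bytes):
    """Decode one line like b'512,487,530\\n' into a list of ints,
    returning None for malformed lines rather than crashing the loop."""
    try:
        return [int(v) for v in line.decode("ascii").strip().split(",")]
    except (UnicodeDecodeError, ValueError):
        return None


def encode_command(gesture: str) -> bytes:
    """Map a predicted gesture to the command bytes the firmware expects."""
    return GESTURES[gesture]


# With pyserial (sketch only, assuming a port name and baud rate):
#   port = serial.Serial("/dev/ttyACM0", 115200)
#   while True:
#       sample = parse_sample(port.readline())
#       if sample is not None:
#           port.write(encode_command(model.predict(sample)))
```

Tolerating malformed lines matters here: serial reads can start mid-line, and a single bad sample shouldn't take down the prediction loop.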
Sprint 1!
We've created a cardboard hand prototype that uses data from a single sEMG sensor to drive a servo motor that opens and closes the hand. We have also been working on the CAD model simultaneously!
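The Sprint 1 logic boils down to a threshold on one sensor. A toy version, with made-up thresholds and a hysteresis band so noise near the threshold doesn't make the servo chatter:

```python
OPEN_ANGLE, CLOSED_ANGLE = 0, 90     # servo angles in degrees (illustrative)
CLOSE_ABOVE, OPEN_BELOW = 600, 400   # hysteresis band on the raw ADC reading


def next_angle(reading, current_angle):
    """Map one sEMG envelope reading to a servo angle.
    Inside the band, hold the current state instead of toggling."""
    if reading > CLOSE_ABOVE:
        return CLOSED_ANGLE
    if reading < OPEN_BELOW:
        return OPEN_ANGLE
    return current_angle
```

On the Arduino this same logic would sit in `loop()` between an `analogRead` and a servo write; the hysteresis is the only non-obvious part.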
Sprint 2!
We've printed the CAD model. We have also trained several models on sEMG data to learn how best to categorize our data. We are looking into wrist movement and a stand for the arm, and we are adapting our sensor placements to perfect our data collection methods.
Sprint 3!
We have redone the entire arm CAD with a hand that can house its own motors! We have also refined our sensor placements to target new muscles, and we now have a working software implementation that predicts gestures in real time with high accuracy.