Throughout this sprint, we created a cardboard hand sketch model of our final vision, while simultaneously working on CAD and researching signal processing for Sprint 2.
We made the cardboard hand sketch model to test the functionality of the electrical and software components. This freed up our mechanical leads to spend more time on the CAD for Sprint 2.
Electrical
On the electrical side, we connected a single sEMG sensor to an Arduino to control a single servo motor. The system identifies whether the user's hand is clenched or relaxed and drives the servo based on that.
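The clench/relax logic can be sketched as a simple threshold check on the smoothed sEMG reading. This is an illustration in Python rather than our Arduino sketch, and the threshold value and function names are placeholders (the real threshold is tuned per user and sensor placement):

```python
# Hypothetical sketch of the single-sensor control logic: compare the
# smoothed sEMG reading against a threshold, then map the resulting
# hand state to a servo angle. On the Arduino this would run in loop().

CLENCH_THRESHOLD = 300  # assumed ADC units; tuned empirically


def classify(envelope_reading):
    """Return 'clenched' or 'relaxed' from a smoothed sEMG reading."""
    return "clenched" if envelope_reading >= CLENCH_THRESHOLD else "relaxed"


def servo_angle(state):
    """Map the hand state to a servo position in degrees (fist = 180)."""
    return 180 if state == "clenched" else 0
```

A strong muscle contraction pushes the envelope above the threshold, so `servo_angle(classify(500))` would command the closed-fist position.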
Software
We learned how to process raw sEMG signals and wrote an initial MATLAB script for signal processing. We have also begun learning about machine learning and how we can apply it to our goal of making the hand respond to specific gestures from the user.
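The usual first steps in sEMG processing are to remove the DC offset, full-wave rectify the signal, and smooth it into an envelope. Here is a minimal sketch of that pipeline in Python for illustration (our actual script is in MATLAB, and the window size here is an arbitrary placeholder):

```python
# Hedged sketch of a basic sEMG processing pipeline: subtract the mean
# (remove DC offset), rectify, then smooth with a trailing moving
# average to produce an amplitude envelope.

def emg_envelope(samples, window=5):
    """Return a smoothed amplitude envelope for a list of raw samples."""
    mean = sum(samples) / len(samples)          # DC offset
    rectified = [abs(s - mean) for s in samples]  # full-wave rectification
    envelope = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        chunk = rectified[lo:i + 1]
        envelope.append(sum(chunk) / len(chunk))  # trailing moving average
    return envelope
```

The envelope is what a threshold or classifier would operate on, since raw sEMG oscillates around zero and is too noisy to threshold directly.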
Cardboard Sketch Model
We have a cardboard hand that actuates all of the fingers and the thumb from one sEMG sensor. We are using one servo motor and an Arduino Uno R4.
What we hope to accomplish in the next sprint!
We aim to create a fully fabricated, 3D-printed mechanical arm with the necessary detailed features, such as a casing that holds all electrical components and channels for the strings in the pulley system, with an individual servo motor for each finger. We will also determine where to place our sEMG sensors to best collect data for training our model. We hope to train a machine learning model that can consistently identify at least three gestures (rock, paper, and scissors). If time allows, we will connect the servos to control the hand and test the hardware.
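To make the three-gesture goal concrete, here is an illustrative sketch (not our trained model) of one of the simplest classifiers we could start with: nearest centroid over per-channel sEMG features such as RMS amplitude from each sensor. The gesture names match our plan, but the feature vectors and function names are placeholders:

```python
# Hypothetical nearest-centroid gesture classifier: average the feature
# vectors for each gesture during training, then predict the gesture
# whose centroid is closest to a new feature vector.
import math


def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(col) for col in zip(*vectors)]


def train(examples):
    """examples: {gesture: [feature_vector, ...]} -> {gesture: centroid}."""
    return {gesture: centroid(vecs) for gesture, vecs in examples.items()}


def predict(model, features):
    """Return the gesture whose centroid is nearest to `features`."""
    return min(model, key=lambda g: math.dist(model[g], features))
```

A model this simple probably won't be our final approach, but it gives us a baseline to beat once we have real sensor data from the placements we choose.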