Throughout this sprint, we created an entirely new 3D-printed hand that houses its own motors, and we finalized our sensor placements on the muscles. We also finished training our models and completed a script that can identify gestures in real time. We will now start full integration, building a stand for the arm, making space for the circuitry, and debugging for Demo Day!
We have finished printing the entire hand! Inside the front plate there are spaces for each finger's motor to sit in, along with dedicated pulleys that route the wires through.
Electrical
We have connected multiple sEMG sensors to the user's arm in the ideal locations. We have also updated our Arduino code to take in the prediction from our model and move the DC motors based on what the model predicts from the incoming sEMG data.
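To give a rough idea of how the prediction reaches the Arduino, here is a minimal Python sketch of the computer/Raspberry Pi side sending a gesture index over serial. The port name, baud rate, and one-byte-per-prediction protocol are assumptions for illustration, not our exact implementation; the Arduino sketch then maps that index to motor movements.

```python
# Hypothetical sender sketch: forward the model's gesture prediction to the Arduino.
# Port name, baud rate, and the single-byte protocol are assumed, not our real setup.
import serial

arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # assumed port and baud

def send_gesture(gesture_id: int) -> None:
    """Send one gesture index (0-4) for the Arduino to translate into DC motor moves."""
    arduino.write(bytes([gesture_id]))
```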
Software
We have now trained another model with five sensors and five gestures. It achieved a testing accuracy of 100% and can confidently identify gestures in real time using a script.
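For context, the real-time loop looks roughly like the sketch below. It assumes a scikit-learn model saved with joblib, a helper that returns the latest window of five-channel sEMG samples, and simple RMS-per-channel features; the file name, labels, and helper are placeholders, not our actual script.

```python
# Rough sketch of real-time gesture identification (names and features are illustrative).
import numpy as np
import joblib

GESTURES = ["rest", "fist", "point", "pinch", "open"]  # placeholder gesture labels

model = joblib.load("gesture_model.joblib")  # assumed saved model file

def classify(window: np.ndarray) -> str:
    """window: (n_samples, 5) array of sEMG readings for one time window."""
    features = np.sqrt(np.mean(window ** 2, axis=0))   # RMS per channel -> shape (5,)
    prediction = model.predict(features.reshape(1, -1))[0]
    return GESTURES[int(prediction)]
```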
Almost Demo Day Ready!
This will be the basis for our final project! We have changed the sensor placements, moved to five sensors, and shelled out spaces in the CAD for our motors to sit in. We are using nine motors (eight DC and one AC), three motor shields, an Arduino Uno R4, and a Raspberry Pi.
What we hope to accomplish for the final!
We will create a stand and a wrist CAD model to hold the mechanical arm. We will also train the model on more gestures, hopefully reaching our goal of a model that can confidently identify up to 20 gestures. Finally, we will integrate everything and fully put it all together!