Developed embedded software and hardware for an external neural prosthetic using a custom EMG armband.
This project was a deep dive into building a functional, affordable prosthetic arm controlled by muscle signals. It grew from a simple desire: to bring advanced neurotechnology to life and make it accessible. It wasn't just about building a device; it was about truly understanding how to bridge the gap between human intent and robotic action.
The project started as a general-interest build for Triton NeuroTech. The main idea was to teach beginners how to build a brain-computer interface (BCI) by constructing an external prosthetic hand and controlling it with neural signals.
We explored both electroencephalography (EEG) and electromyography (EMG) and settled on EMG: it was a more approachable starting point for beginners and offered finer control with the hardware we had available.
Prosthetic arms have come a long way, but they tend to be either incredibly complex and expensive or too basic to be truly functional. Our goal was to build something in between: a 3D-printed arm that was both controllable and accurate, driven by EMG.
The biggest challenge was hardware: we had to work with the OpenBCI Cyton (a low-cost 8-channel biosensing board) but didn't have the corresponding EMG or EEG electrodes, so we had to create our own. We borrowed from our drone project the year before and continued iterating on the penny-based armband we had created. This was our first iteration.
As the Software Lead, my role was about more than writing code: it was about coordinating software, hardware, and machine learning into one working system. I led a dedicated team through the programming, signal processing, and real-time control work, and our success hinged on tight collaboration with the hardware and machine learning teams. We were constantly feeding off each other's progress, troubleshooting, and brainstorming.
We started with the basics: Arduino C++ to program each individual finger movement of the prosthetic hand. It was a journey of trial and error, from the raw code to getting the Arduino IDE and VSCode working smoothly together.
Then came the push for a user-friendly experience. I'm actively developing a web application that provides a clear, real-time window into the EMG signals from the armband and gives users intuitive control over the hand. This wasn't just about functionality; it was about making complex tech approachable.
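For context, here's a minimal sketch of how a real-time EMG feed could be pushed to a browser using Flask and server-sent events. This isn't our actual app: the framework choice and the emg_source() generator are stand-ins for whatever feeds samples from the armband.

```python
# Illustrative only: a minimal server-sent-events endpoint for streaming
# EMG samples to a browser, which the front end can consume via EventSource.
import json
import time

from flask import Flask, Response

app = Flask(__name__)

def emg_source():
    """Hypothetical generator yielding one 4-channel EMG sample at a time."""
    while True:
        # Replace with real samples from the Cyton stream.
        yield [0.0, 0.0, 0.0, 0.0]
        time.sleep(1 / 250)  # the Cyton samples at 250 Hz

@app.route("/emg-stream")
def emg_stream():
    def events():
        for sample in emg_source():
            # SSE frames have the form "data: <payload>\n\n"
            yield f"data: {json.dumps(sample)}\n\n"
    return Response(events(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5000)
```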
A crucial piece of the puzzle was the communication backbone. We used the Firmata protocol, with a Python script running PyFirmata, to bridge the gap between our machine learning models (running on a Jetson) and the Arduino microcontroller on the arm, so classified hand positions translated directly into servo movements. I personally crafted the findValue and moveFinger functions, the digital "fingers" that interpret the ML model's output and bring the prosthetic to life.
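To give a flavor of that bridge, here's a minimal sketch in the spirit of our PyFirmata script. The pin assignments, angles, and the bodies of findValue and moveFinger shown here are illustrative stand-ins, not the project's exact code:

```python
# A minimal sketch of the Python-to-Arduino bridge using pyFirmata.
# Pin numbers, angles, and function bodies are illustrative.
from pyfirmata import Arduino

board = Arduino("/dev/ttyACM0")  # serial port is machine-specific

# One servo per finger; 'd:<pin>:s' configures a digital pin in servo mode.
FINGERS = {
    "thumb": board.get_pin("d:3:s"),
    "index": board.get_pin("d:5:s"),
    "middle": board.get_pin("d:6:s"),
    "ring": board.get_pin("d:9:s"),
    "pinky": board.get_pin("d:10:s"),
}

def findValue(is_closed):
    """Map a classifier output (open/closed) to a servo angle in degrees."""
    return 180 if is_closed else 0

def moveFinger(name, is_closed):
    """Drive one finger's servo to the angle for the classified position."""
    FINGERS[name].write(findValue(is_closed))

# Example: the ML model classifies a fist, so close every finger.
for finger in FINGERS:
    moveFinger(finger, is_closed=True)
```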
Beyond the code, I also got my hands dirty (literally!) with the hardware. I was deeply involved in the manufacturing of our custom penny electrodes and the 3D printing and design of the armband components. We adapted an existing EMG collection device, adding more channels to our armband to boost classification accuracy. We also redesigned the armband's 3D-printed casings to ensure consistent electrode placement, which is paramount for clean signal acquisition. This hands-on experience gave me invaluable insight into the full scope of the project.
The culmination of our efforts is a prosthetic hand built from meticulously 3D-printed parts, controlled by servo motors and an Arduino, with fishing lines providing fluid, natural finger movements.
The magic happens when the custom 4-channel penny-electrode armband captures EMG signals from the forearm and transmits them wirelessly via the Cyton board and its Bluetooth dongle. We used the OpenBCI GUI and LabRecorder to calibrate and record these signals, syncing them with participant prompts to create rich, labeled datasets for our machine learning models.
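As a rough illustration of that prompt-syncing step: LabRecorder records Lab Streaming Layer (LSL) streams, so pushing a text marker each time a participant is prompted lets every EMG window be labeled afterward. The prompt list and timing below are made up for the example:

```python
# Sketch of prompt-synced labeling via an LSL marker stream. LabRecorder
# records this alongside the EMG stream, timestamped on a common clock.
import time

from pylsl import StreamInfo, StreamOutlet

info = StreamInfo(name="Prompts", type="Markers", channel_count=1,
                  nominal_srate=0, channel_format="string",
                  source_id="prosthetic-prompts")
outlet = StreamOutlet(info)

PROMPTS = ["rest", "fist", "open", "rest"]  # example prompt sequence

for label in PROMPTS:
    print(f"Perform: {label}")
    outlet.push_sample([label])  # marker lands next to the EMG samples
    time.sleep(3)                # hold each position for a few seconds
```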
Unlike EEG, EMG offers much higher fidelity for arm and hand movements, which was key to our project's success. The ultimate aim is for the arm to respond instinctively to classified hand positions, offering a truly intuitive experience.
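To make the classification step concrete, here's an illustrative sketch of one common EMG approach: per-channel RMS features over short windows fed to an off-the-shelf classifier. The window length, features, and model choice are assumptions for illustration, not our ML team's actual pipeline:

```python
# Illustrative EMG classification: windowed RMS features per channel fed to
# a standard classifier. Data below is a random placeholder for the real
# labeled recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rms_features(emg, window=125):
    """Split (samples, channels) EMG into windows and take per-channel RMS.

    125 samples is half a second at the Cyton's 250 Hz sampling rate.
    """
    n_windows = emg.shape[0] // window
    trimmed = emg[: n_windows * window].reshape(n_windows, window, -1)
    return np.sqrt((trimmed ** 2).mean(axis=1))  # shape: (n_windows, channels)

# Placeholders standing in for the recorded dataset: 4-channel EMG plus
# one "open"/"closed" label per window.
emg = np.random.randn(250 * 60, 4)          # one minute of fake signal
X = rms_features(emg)
y = np.random.choice(["open", "closed"], size=len(X))

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.predict(X[:5]))  # classified hand positions for the first windows
```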
This wasn't just a project; it was a testament to interdisciplinary teamwork and persistent problem-solving. My role as Software Lead, guiding my team and collaborating across disciplines, was incredibly rewarding. We've built software that accurately classifies hand open/close movements, seen significant gains in machine learning classification accuracy, and developed innovative hardware designs.
The journey continues (now with a different team at Triton NeuroTech), with plans to refine the mechanical design, further improve the machine learning, integrate everything seamlessly, and build an even more user-friendly graphical interface. It's a powerful reminder that the most impactful solutions often come from collective effort, driven by curiosity and a bit of creative ingenuity.
Here's a small collection of the resources our team put together for presentations, demos, and more.
2022_Prosthetics_Banquet_Slides.pdf