EMG-Controlled Mouse (aka aeromus)

An award-winning neurotech project using tripolar concentric ring electrodes for high-resolution control interfaces.

neurotechnology
wearable devices
electromyography (emg)
manufacturing
signal processing

About this project

This project started as a wild idea and turned into a full-fledged EMG-based mouse controller. The goal was to make neurotechnology more tangible and accessible—something you could wear, interact with, and build yourself. I led the hardware side of things, designing and fabricating the armband that would translate muscle contractions into cursor movements and clicks.

We submitted it to the international NeuroTechX Student Club Competition—and to our surprise, it took 2nd place globally and won the Neuroethics Award 🧠✨.

The problem we wanted to solve

Brain-computer interfaces (BCIs) are powerful but often hard to access—expensive, invasive, or just intimidating. We wanted to build a wearable, muscle-based system that was intuitive and hands-on, especially for people new to neurotech. Basically: what if your arm could become a mouse?

My role (Hardware Lead)

I focused on building the physical armband and electrodes:

  • Designed custom EMG armbands that were secure but flexible enough for muscle movement
  • Manufactured and integrated electrodes, testing different configurations
  • Ultimately decided on tripolar concentric ring electrodes (CREs), inspired by literature showing their improved spatial resolution (see the short sketch after this list)
  • Worked closely with our signal and software teams to make sure our physical setup matched the needs of the pipeline
  • Added copper shielding and ensured consistent skin contact for clean signal acquisition
  • Iterated through multiple versions of the band to optimize comfort and signal quality
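
Why CREs? Each tripolar concentric ring electrode gives three concentric contacts (center disc, middle ring, outer ring), and taking a weighted difference of the ring potentials relative to the disc approximates a surface-Laplacian measurement, which emphasizes muscle activity directly under the electrode and attenuates cross-talk from farther away. Here's a minimal sketch of that combining step; the weights are left as parameters because the actual coefficients come from the CRE literature and from calibration, not from anything shown here.

```python
import numpy as np

def tripolar_laplacian(disc: np.ndarray, middle: np.ndarray, outer: np.ndarray,
                       w_middle: float, w_outer: float) -> np.ndarray:
    """Surface-Laplacian-style estimate from one tripolar CRE.

    Weighted difference of the middle- and outer-ring potentials relative to
    the center disc; callers supply weights from the CRE literature or from
    their own calibration (deliberately not hard-coded here).
    """
    return w_middle * (middle - disc) - w_outer * (outer - disc)
```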

How it works

  1. Muscle contractions in the forearm are captured using our dry CRE electrodes
    1. We used stainless steel since we could laser cut it at our makerspace (but not zinc…check out Zinc-Based Wearable Electrodes to see how a few others and I expanded on this project!)
  2. The OpenBCI Cyton board amplifies and digitizes those signals
  3. Data is streamed to a computer where a custom signal processing pipeline classifies the input (e.g. a “clench” = click)
  4. At the same time, the accelerometer on the Cyton captures 3D movement—which the on-screen cursor mirrors
  5. Using packages like PyAutoGUI, the system translates movement + gestures into real-time mouse control (see the sketch after this list)
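
To make the pipeline concrete, here's a rough, self-contained sketch of the control loop in Python, assuming the BrainFlow API for streaming from the Cyton and PyAutoGUI for the cursor. The serial port, RMS threshold, cursor gain, and window length are illustrative placeholders (not our tuned values), and the simple RMS-threshold "clench" detector stands in for the custom classification pipeline from step 3.

```python
import time

import numpy as np
import pyautogui
from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams

SERIAL_PORT = "/dev/ttyUSB0"   # placeholder: wherever the Cyton dongle shows up
CLENCH_THRESHOLD_UV = 150.0    # placeholder RMS amplitude that counts as a "clench"
CURSOR_GAIN = 40.0             # placeholder accelerometer-to-pixel scaling
WINDOW_SECONDS = 0.2           # sliding window for the RMS envelope


def rms(x: np.ndarray) -> float:
    """Root-mean-square amplitude of a 1-D window of samples."""
    return float(np.sqrt(np.mean(np.square(x))))


def main() -> None:
    params = BrainFlowInputParams()
    params.serial_port = SERIAL_PORT
    board_id = BoardIds.CYTON_BOARD.value
    board = BoardShim(board_id, params)

    fs = BoardShim.get_sampling_rate(board_id)
    emg_channels = BoardShim.get_eeg_channels(board_id)    # Cyton biopotential channels, used as EMG here
    accel_channels = BoardShim.get_accel_channels(board_id)
    window = int(fs * WINDOW_SECONDS)

    board.prepare_session()
    board.start_stream()
    pyautogui.FAILSAFE = True   # fling the cursor into a screen corner to abort
    clicked = False             # simple latch so one clench = one click

    try:
        while True:
            data = board.get_current_board_data(window)     # channels x samples
            if data.shape[1] < window:
                time.sleep(0.01)
                continue

            # Step 3: classify the gesture. Here that's just a pooled RMS
            # envelope against a threshold; the real pipeline did more.
            emg = data[emg_channels, :]
            emg = emg - emg.mean(axis=1, keepdims=True)     # remove per-channel DC offset
            envelope = float(np.mean([rms(ch) for ch in emg]))
            if envelope > CLENCH_THRESHOLD_UV and not clicked:
                pyautogui.click()
                clicked = True
            elif envelope <= CLENCH_THRESHOLD_UV:
                clicked = False

            # Steps 4-5: mirror forearm motion from the Cyton's accelerometer.
            accel = data[accel_channels, :].mean(axis=1)
            pyautogui.moveRel(int(CURSOR_GAIN * accel[0]),
                              int(CURSOR_GAIN * accel[1]))

            time.sleep(WINDOW_SECONDS)
    finally:
        board.stop_stream()
        board.release_session()


if __name__ == "__main__":
    main()
```

The threshold detector above is only a stand-in for our actual classifier, but the overall loop structure (stream → window → classify gesture → mirror accelerometer motion with PyAutoGUI) is the same.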

Our team

This was a fully interdisciplinary effort:

  • Hardware: Me (Cassia Rizq), Lea Winner, William Zhang
  • Software: Joelle Faybishenko, Akhil Subbarao, Edward McGee
  • Data Science: Aidan Truel, Sawyer Figueroa, Nakshatra Bansal
  • Mouse Integration: Nakshatra Bansal, Gavin Roberts
  • Video editing: William Zhang
  • Shoutout to our Triton NeuroTech mentors + community!

Awards & Recognition

🏆 2nd Place Winner – NeuroTechX Student Club Competition 2023

🧠 Neuroethics Award – for designing an accessible system and thoughtfully considering data privacy, inclusivity, and long-term user impact

Reflections

This project was hands-down one of my favorite builds. It brought together biomedical engineering, signal processing, ethics, and creativity—plus it was just really cool to demo. I learned a lot about hardware constraints in wearable neurotech, and how critical the electrode interface is for downstream performance.

It was also amazing to work on something that resonated with the broader neurotech community and felt like it could actually help people.

Watch the demo

🎥 Watch our full video demo on YouTube

Let me know if you want the CAD files, STL files, or references—I’m always happy to share or help others build their own version.