A wireless controller for real-time music interaction.

The goal of this project was to create a playful musical instrument controller that can be enjoyed by all musicians, regardless of skill level.

Project Info:

May 2020
Personal Project
4 Weeks
Master Digital Design


Interaction Design
Sound Design
Machine Learning
Physical Prototyping

Project Background

Most gestural electronic instruments require precise movements to produce pleasant sounds, and learning them can be tedious. ml.cubes instead detects broad cube-face positions rather than exact movements, allowing musicians to freely explore an extensive and diverse soundscape without first having to master the instrument.

ml.cubes prototype on a wooden desk next to music speakers, laptop, and audio interface.


I was inspired to start this project after taking an online course on using machine learning technologies to build real-time controllers for music, games, and interactive art. The course showed me that machine learning can enable quick but accurate working prototypes, and so the ml.cubes project began.

As for the physical design, the ml.cubes are intentionally colourful and meant to stand out. The geometric patterns on their sides were hand-painted in a hard-edge style using acrylics and masking tape. The wireless design lets the cubes be positioned easily on a table, piano, guitar amplifier, or wherever else feels appropriate during a live performance. With two cubes, the sound combinations one can experiment with are virtually endless.

colourful painted sides of the ml.cubes music controllers laid out on a wooden desk.


  • BBC micro:bit wireless accelerometers
  • Cycling 74's Max
  • ml.lib machine learning library (SVM)
  • IRCAM's Modalys
  • Mi-Creative's Mi-Gen

The ml.cubes send accelerometer data over a Bluetooth connection to the ml.cubes program. The data is fed to an SVM machine learning model that detects which face of each cube is up. Each unique combination of cube positions triggers a change in the audio effects. I used physically modelled instruments to achieve realistic sounds.
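The face-detection step can be sketched as follows. The real project uses the ml.lib SVM object inside Max; scikit-learn stands in for it here, and the training vectors and noise levels are illustrative assumptions, not values from the project.

```python
# Sketch: an SVM classifier maps a 3-axis accelerometer reading to one
# of the six cube faces. scikit-learn stands in for ml.lib's SVM;
# the training data below is simulated, not real ml.cubes data.
import numpy as np
from sklearn.svm import SVC

# Idealized gravity vectors for each face-up orientation (labels 0-5).
faces = np.array([
    [0, 0, 1], [0, 0, -1],   # top up / bottom up
    [0, 1, 0], [0, -1, 0],   # front up / back up
    [1, 0, 0], [-1, 0, 0],   # right up / left up
], dtype=float)

# Simulate noisy training readings clustered around each ideal vector.
rng = np.random.default_rng(0)
X = np.vstack([f + rng.normal(0, 0.05, (50, 3)) for f in faces])
y = np.repeat(np.arange(6), 50)

clf = SVC(kernel="rbf").fit(X, y)

# A live reading close to "top face up" classifies as face 0.
reading = np.array([[0.03, -0.02, 0.98]])
print(clf.predict(reading))  # -> [0]
```

In the actual patch, each incoming Bluetooth reading would be classified this way in real time, and the pair of predicted face labels selects the active sound.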

wireless accelerometer inside of one of the unfinished ml.cubes music controllers.


Minimizing the delay between gestural actions and their associated sounds was my biggest challenge. Musical instrument controllers must respond within a few milliseconds to feel natural. My solution was to smooth the accelerometer data with a Kalman filter, which reduces sensor noise with less lag than simpler smoothing methods such as moving averages.
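A minimal sketch of the idea, assuming a one-dimensional constant-value Kalman filter per accelerometer axis (the process- and measurement-noise values here are illustrative, not the tuned values from the project):

```python
# Sketch: 1-D Kalman smoothing for a single accelerometer axis.
# q = process noise, r = measurement noise (illustrative values).
def kalman_1d(measurements, q=1e-3, r=1e-1):
    x, p = 0.0, 1.0          # state estimate and its variance
    smoothed = []
    for z in measurements:
        p = p + q            # predict: uncertainty grows over time
        k = p / (p + r)      # Kalman gain: trust in the new reading
        x = x + k * (z - x)  # update estimate toward the measurement
        p = (1 - k) * p      # uncertainty shrinks after the update
        smoothed.append(x)
    return smoothed
```

Three such filters running in parallel, one per axis, would clean up the incoming readings before they reach the classifier, without introducing the lag of a long averaging window.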

Another challenge was giving users feedback on how their cube movements were influencing the sounds. User comments during testing made it clear that a visual cue was needed to help people learn the interactions and reduce errors. I added a simple graphical user interface that shows:

  • How the cubes are positioned in real-time (via accelerometer data)
  • Which sound (classifier number) is currently being triggered

ml.cubes graphical user interface and performance set-up.
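The classifier number shown in the interface comes from the pair of detected cube faces. One simple way to derive a unique number per combination is sketched below; the numbering scheme is an illustrative assumption, not the project's actual mapping.

```python
# Sketch: map a two-cube face combination to a unique sound index.
# With six faces per cube there are 6 * 6 = 36 combinations.
def sound_index(face_a, face_b, faces_per_cube=6):
    """Return a unique sound/classifier number for a face pair."""
    return face_a * faces_per_cube + face_b

print(sound_index(2, 4))  # -> 16
```

Displaying this index alongside the live accelerometer data lets performers see exactly which combination, and therefore which sound, is active.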

Future Direction

This is a first-version prototype. My future goals for the ml.cubes design include:

  • Interfacing directly (via MIDI) with other electronic music controllers such as synthesizers and drum machines
  • Adding soft lighting controls for poorly lit performance situations