Improvising a Life Between Music, Math and Machines
How my lifelong love of music, science and engineering became the Music Intelligence Lab
I started playing the piano when I was seven. To be honest, I hated it at first. Sitting at the bench, grinding through classical exercises, felt more like a punishment than a gift. A couple of years in, I wanted to give up.
Things changed when I switched instructors and discovered improvisation. Suddenly, music came alive. I could create simple melodies, craft tunes, and experiment with harmonies: things that felt mine. That taste of creative freedom changed everything. Eventually, I grew back into classical music; I wanted to compose like Beethoven, Bach, and Wagner, and I wanted to learn the vocabulary.
That’s when I threw myself into conservatory studies. But what drove me wasn’t just performance; it was the joy of improvisation and composition. At the same time, I was falling in love with science: math, physics, equations, and the beauty of abstraction. I wanted to understand how the world works, and to contribute to that understanding. These two worlds, music and science, have been running in parallel in my life ever since.
At every fork in the road, I had the choice: go deeper into music, or pursue science and engineering. Every time I chose the latter. But I never left music behind. It came along on the journey with me.
The Glove
During my first year as a graduate student at UC San Diego, I taught a music technology course for high school students. That’s where I first discovered the magic of programming music: writing Python code that could generate melodies. It opened up a whole new world, not just of composing, but of designing instruments themselves.
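Something like this toy sketch captures the idea (not what I actually wrote back then, just an illustration): a few lines of Python that random-walk over a C-major scale and print the result as note names and MIDI numbers.

```python
import random

# One octave of C major as MIDI note numbers, starting at middle C (60).
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]
NOTE_NAMES = ["C", "D", "E", "F", "G", "A", "B", "C'"]

def generate_melody(length=8, seed=None):
    """Random-walk melody: each note moves one or two scale degrees from the last."""
    rng = random.Random(seed)
    idx = rng.randrange(len(C_MAJOR))       # start on a random scale degree
    degrees = []
    for _ in range(length):
        degrees.append(idx)
        step = rng.choice([-2, -1, 1, 2])   # small melodic steps up or down
        idx = max(0, min(len(C_MAJOR) - 1, idx + step))
    return degrees

melody = generate_melody(seed=7)
print(" ".join(NOTE_NAMES[i] for i in melody))
print("MIDI:", [C_MAJOR[i] for i in melody])
```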
One idea in particular stuck with me. I thought: most instruments are played with hands; what if I could create a glove that plays all of them? A wearable interface where your hand movements become sound. It wasn’t new (people had been trying it since the 90s), but the challenge fascinated me. Gloves have too many degrees of freedom, no haptic feedback, and are hard to engineer. But what if we could make it work, not just as a novelty, but as a genuine standalone instrument?
I started experimenting. A glove that learns from your playing. A glove that conducts a virtual choir like Jacob Collier does with his audience. A glove that lets you tap on a table and recalls what you just played on a keyboard. Not all of these experiments worked perfectly, but each opened a door.
And it made me realize: this isn’t just about gloves. It’s about motion. About mapping the intelligence of movement into the intelligence of sound.
Three Kinds of Intelligence
The glove stayed a side project for a long time, until I joined the American University of Beirut as an assistant professor. It was time to bring my music and engineering backgrounds together. That’s how the idea for the Music Intelligence Lab was born. But why “intelligence”?
To me, music is woven from at least three kinds of intelligence:
Physical intelligence — The body knows music before the brain does. Fingers remember patterns, muscles learn expression, and the act of playing is deeply embodied. Physical intelligence is as complex and hard to understand as intellectual intelligence. Motion and sound both involve complex physics, and that physics gives rise to beautiful mathematical symmetries.
Mathematical intelligence — Ratios, harmonics, scales, maqamat. From Pythagoras to Arab theorists, music theory is built on simple and complex relationships of sound. Understanding those symmetries is foundational to modern music, and it has so much potential to transform what music gets created and how we feel about it (the short sketch after this list makes the ratio idea concrete).
Computational intelligence — The modern layer: analyzing signals, retrieving information, and now using machine learning to find hidden patterns in music and performance. Most modern music is created on computers. As art creation is increasingly automated, how do we preserve what is deeply human about music?
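To make the ratios point concrete, here is a toy illustration, not a claim about any particular tradition: a just-intonation major scale built from simple whole-number ratios over an assumed tonic of C4 at roughly 261.63 Hz, printed next to the equal-tempered pitches most modern instruments use.

```python
# A just-intonation major scale from simple frequency ratios over an assumed
# tonic (C4 ≈ 261.63 Hz), next to 12-tone equal temperament, where every
# semitone is a factor of 2 ** (1 / 12).
TONIC_HZ = 261.63

JUST_RATIOS = [          # (scale degree, ratio to the tonic)
    ("unison",      1 / 1),
    ("major 2nd",   9 / 8),
    ("major 3rd",   5 / 4),
    ("perfect 4th", 4 / 3),
    ("perfect 5th", 3 / 2),
    ("major 6th",   5 / 3),
    ("major 7th",  15 / 8),
    ("octave",      2 / 1),
]
SEMITONES = [0, 2, 4, 5, 7, 9, 11, 12]   # the same degrees in equal temperament

for (name, ratio), st in zip(JUST_RATIOS, SEMITONES):
    just = TONIC_HZ * ratio
    equal = TONIC_HZ * 2 ** (st / 12)
    print(f"{name:12s} just: {just:7.2f} Hz   equal-tempered: {equal:7.2f} Hz")
```

Even in that tiny table, the pure ratios and the equal-tempered compromise disagree slightly on most degrees.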
The lab is about exploring these intelligences, connecting them, and creating new ways for people to experience music, whether that’s through building instruments, reviving cultural traditions, or designing AI systems that don’t replace musicians but empower them.
Why this matters
Science and engineering ask me, as an academic, to be objective and detached. Music does the opposite: it pulls me back into myself. It’s personal. Subjective. Human.
And that’s why this work matters. At a time when so much of our world, even art, is commoditized, automated, or mass-produced, music reminds us who we are. Not every project we try in the lab will work. Not every experiment will resonate. But the North Star is clear:
to empower music creation in ways that are personal, human, and deeply meaningful
For me, music has always been that place where I reconnect with myself. With the lab, I want to make space for others to do the same, whether through a new instrument, a revived tuning system, or simply an experiment that sparks joy.
