The Atomic Flute project’s aim is to bring to life the sounds of atomic particles. Our team comprises nuclear physicists, coders, software engineers and musicians, working to create something musical out of data coming out of the physics department at the University of York. We were lucky enough to get some starter funding from the SoVoT Network, which has allowed us to produce the initial particle: hydrogen.

The project started when nuclear physicists Dan Watts and Nick Zachariou mapped out the energy states of a hydrogen atom: resonances that start low and jump up through the states. Tom Collins, a professor in music technology, translated these energies into frequencies and programmed them into a scale that sounds like this.
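To give a flavour of how energy states can become pitches, here’s a rough sketch in Python. The exact mapping Tom used isn’t spelled out above, so the Bohr energy formula and the octave-transposition step are illustrative assumptions: the real transition frequencies sit far above human hearing, and shifting them down by whole octaves is one common way to pull them into a playable range.

```python
# Illustrative sketch only: the project's actual mapping may differ.
PLANCK_EV_S = 4.135667696e-15   # Planck constant in eV*s
RYDBERG_EV = 13.605693          # hydrogen ground-state binding energy in eV

def level_energy(n: int) -> float:
    """Bohr energy of level n, in eV (negative for a bound state)."""
    return -RYDBERG_EV / n**2

def transition_frequency(n_low: int, n_high: int) -> float:
    """Photon frequency in Hz for the n_high -> n_low transition."""
    return (level_energy(n_high) - level_energy(n_low)) / PLANCK_EV_S

def fold_to_audible(freq_hz: float, low: float = 200.0, high: float = 2000.0) -> float:
    """Shift a frequency by whole octaves until it sits in a flute-friendly range."""
    while freq_hz > high:
        freq_hz /= 2.0
    while freq_hz < low:
        freq_hz *= 2.0
    return freq_hz

# The first few transitions down to the ground state, folded into the audible range.
scale = [fold_to_audible(transition_frequency(1, n)) for n in range(2, 8)]
print([round(f, 1) for f in scale])
```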
I then converted this scale onto flute so that I could write a piece in the scale of a hydrogen atom. There are a few practical challenges with mapping these scales onto the flute. The first is that these particles don’t resonate in the 12-note Western scale that the flute normally plays; they aren’t vibrating conveniently on a Bb or a D, but somewhere between two notes that we play. These in-between pitches are called microtones, and they can be a challenge for a listener who hasn’t heard much microtonal music. Your ear wants to put these notes into the boxes it’s used to, so if a note is close to B, you might hear it as a B, but everything sounds almost out of tune. Here’s the scale on flute.
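If you want to see how far these pitches sit from the notes a flute normally plays, the gap is easy to measure in cents (hundredths of a semitone). The little Python sketch below is purely illustrative, not a tool we used in the project: it finds the nearest note of the 12-note equal-tempered scale and reports how many cents sharp or flat a given frequency sits.

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz: float, a4: float = 440.0):
    """Name the closest 12-tone equal-tempered note and the offset in cents.

    Positive cents means the frequency is sharp of the named note.
    """
    semitones_from_a4 = 12 * math.log2(freq_hz / a4)
    nearest = round(semitones_from_a4)
    cents_off = 100 * (semitones_from_a4 - nearest)
    midi_number = 69 + nearest                     # A4 is MIDI note 69
    name = NOTE_NAMES[midi_number % 12] + str(midi_number // 12 - 1)
    return name, round(cents_off, 1)

# A made-up example: 452 Hz lands almost halfway between A4 and Bb4.
print(nearest_note(452.0))   # ('A4', 46.6) -- nearly a quarter-tone sharp of A
```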
As a composer I chose to lean into that. The purpose of the piece is to focus on these ‘unusual’ frequencies, so I didn’t want to hide them; I wanted to highlight them.
Once the piece was written, I teamed up with research software engineer Phil Harrison. We wanted to create a performance that was also visually impactful, so that the audience could see as well as hear these changing energy states. Phil wrote a program that recognises each frequency as a different input and adds a sphere for each note the flute plays, creating a real-time image that he can manipulate live. The colours Phil chose were extracted from a photo of a nebula made up mostly of hydrogen.
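Phil’s actual code isn’t reproduced here, but the note-to-sphere idea is simple to sketch. In the hypothetical snippet below the palette values, the sizing rule and the callback name are all invented for illustration; only the basic behaviour, one new sphere per detected note coloured from a nebula-derived palette, comes from the description above. The pitch-tracking itself is assumed and left out.

```python
import random

# Hypothetical palette: in the real system the colours were sampled from a
# photograph of a hydrogen-rich nebula.
NEBULA_PALETTE = ["#7a1f2b", "#c0392b", "#e8a87c", "#f4e1d2", "#3b2a4d"]

spheres = []   # everything currently on screen

def on_note_detected(freq_hz: float, amplitude: float) -> None:
    """Add one sphere per detected note; louder notes get bigger spheres."""
    spheres.append({
        "freq_hz": freq_hz,
        "radius": 0.2 + amplitude,
        "colour": random.choice(NEBULA_PALETTE),
        "position": (random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)),
    })

# A pitch tracker listening to the flute would call on_note_detected for every
# note; here we feed in a few frequencies by hand.
for f in (452.0, 678.0, 904.0):
    on_note_detected(f, amplitude=0.4)
print(len(spheres), "spheres on screen")
```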

This project has massive potential. In the lab we’re working on creating flutes in hydrogen scales that could be produced cheaply and played by anyone. We’re looking at creating an immersive performance using 360° projections and VR technology. We’re able to use Phil’s software to show students how breath affects tuning. There are more particles that we want to sonify and bring to life. It’s been so exciting to work with people from radically different disciplines and learn how they problem-solve and plan projects. It allows us to share the work of these scientists beyond the university, and to understand their findings in a whole new way.
