February 2022 - April 2022
An algorithmically generated bass flute composition and a complementary 3D environment that listens and reacts to a live performer, delivering a narrative inspired by that of a nautilus mollusc traversing the deep sea floor.
Skills used: Unity Engine, PureData, audio processing
Nautilus is a case study from the wider project 'The Digital Score' (Digiscore) by Professor Craig Vear at De Montfort University, Leicester. This is a five-year project funded by the European Research Council that aims to determine how the traditional idea of a music score can be enhanced by digital technologies such as robotics, immersive technology, and gaming; Nautilus explores the last of these. The project was developed in close cooperation with Carla Rees, Professor of Low Flutes & Contemporary Flute at the Royal Academy of Music in London, who was our collaborating performer and tester.
The concept behind the project is to immerse the performer in a digital world that goes beyond the music alone. A backing track is algorithmically generated by a neural network trained on classic jazz, with source material from a recorded bass flute improvisation session by the performer. The software then creates a score for the performer to play alongside the backing track.
My role was to create an interactive visual experience for the performer to immerse themselves in while playing. I built a Unity scene that listens to both the generated backing track and the live performance, processing and interpreting the audio data in real time to control elements of the 3D scene, such as camera movement and visual effects, and to trigger transitions between phases when a specific note from the performer is detected.
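The project's actual analysis runs inside Unity and PureData, but the underlying idea can be sketched in a few lines: extract simple features from each incoming audio frame, such as loudness and an approximate pitch, and map them onto scene parameters. The function names and the autocorrelation approach below are my own illustration, not the project's code.

```python
import numpy as np

def rms_loudness(frame: np.ndarray) -> float:
    """Root-mean-square amplitude of one audio frame (roughly 0..1 for normalised input)."""
    return float(np.sqrt(np.mean(frame ** 2)))

def estimate_pitch(frame: np.ndarray, sample_rate: int) -> float:
    """Rough fundamental-frequency estimate via autocorrelation."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Skip the zero-lag peak: find the first rising point after the
    # initial decay, then take the strongest peak from there onwards.
    rising = np.diff(corr) > 0
    start = int(np.argmax(rising))
    peak = start + int(np.argmax(corr[start:]))
    return sample_rate / peak if peak > 0 else 0.0

# Example: a 220 Hz sine wave, 2048 samples at 44.1 kHz.
sr = 44100
t = np.arange(2048) / sr
frame = 0.5 * np.sin(2 * np.pi * 220 * t)
print(f"loudness: {rms_loudness(frame):.2f}, pitch: {estimate_pitch(frame, sr):.1f} Hz")
```

In the scene, values like these would then be smoothed over time and fed into camera drift speed, particle emission rates, and similar parameters.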
The visual design of the project is based on the deep sea: complete darkness over a bare, desolate landscape. Particles drift in all directions, while others descend from above, glowing blue with a music note floating above them. These notes spawn in random patterns, leaving the performer free to interpret them as they wish; for example, far-away notes can be read as quieter and fast notes as louder.
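The Unity spawning code isn't reproduced here, but the idea of scattering notes randomly and letting distance suggest dynamics can be sketched as follows; the class, threshold, and coordinate ranges are all illustrative assumptions.

```python
import random
from dataclasses import dataclass

@dataclass
class FallingNote:
    """A glowing note descending through the scene."""
    x: float
    y: float
    z: float

    def implied_loudness(self, listener=(0.0, 0.0, 0.0), near=30.0) -> str:
        # Distance from the listener/camera suggests a dynamic marking;
        # the 30-unit threshold is an arbitrary example value.
        pos = (self.x, self.y, self.z)
        dist = sum((a - b) ** 2 for a, b in zip(pos, listener)) ** 0.5
        return "loud" if dist <= near else "quiet"

def spawn_pattern(n: int, spread: float = 50.0, height: float = 40.0, rng=random):
    """Scatter n notes at random horizontal positions, all starting high up."""
    return [
        FallingNote(rng.uniform(-spread, spread), height, rng.uniform(-spread, spread))
        for _ in range(n)
    ]
```

Because the placement is random rather than scripted, no two passes through the scene suggest quite the same phrase to the performer.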
Walking the fine line between immersive visual experience and music-controlled game was a major concern throughout development. We wanted to avoid Carla feeling as though she was in a game with defined objectives, playing notes to progress rather than out of artistic expression. To achieve this, I implemented the control systems subtly, so it is never obvious when or how the performance is influencing the game world. There is inherent randomness in the systems: when listening for a certain note to trigger an event, for instance, the system may randomly decide to ignore the first, second, and third matching input and then trigger on the fourth.
The final performance for this phase of the project took place on the 7th of April. Carla Rees performed in front of a small live audience; the session was recorded, and audience members were asked to give feedback. The recording can be viewed via the YouTube link in the section below. A second performance with Franziska Baumann also took place, which I unfortunately couldn't attend.
Digiscore website - https://digiscore.dmu.ac.uk/
Digiscore blog post - https://digiscore.dmu.ac.uk/2022/01/27/nautilus/
Craig Vear - https://twitter.com/craigvear
Carla Rees - https://twitter.com/rarescale
Carla's performance - https://youtu.be/XK-9eXCJxCg
Franziska's performance - https://youtu.be/SV6TqzJkiX4
I would like to extend the random generation of the backing track. The original plans for the backing track generation were far more complex, but were scaled back during development due to unforeseen issues.