Theory
I strive to build tactile instruments with dynamic visual outputs. I see myself as a translator, converting musical instruments into video instruments. My video instruments function the same way as musical instruments, except that each specific physical input produces a specific video output instead of a specific audio output.
The translation becomes tricky when you ask yourself questions like, 'How do I convert a concept such as a musical chord into a visual chord?'
Each visual world I make is an attempt to answer these kinds of questions.
How I Perform
I perform concerts using a TouchDesigner file with preloaded visual worlds. Before each show, I review the run of show (setlist) with my boss, and then assign one visual world to each song.
During the show, I load a visual world by pressing one of the orange buttons on my Ableton Push.
This does two things: it runs the visual I selected and loads that visual’s unique MIDI layout. The unique MIDI layout helps the user understand the visual world and how to perform it. Additionally, it gives each visual world its own personality through a different flavor of interaction logic.
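The button-press dispatch described above can be sketched in plain Python (the language TouchDesigner scripts use). The note numbers, world names, and layout names here are hypothetical placeholders, not the actual show file; in the real setup, loading a world would also switch the active TouchDesigner network and remap the Push controls.

```python
# Hypothetical sketch: map an incoming MIDI note from an orange
# Push button to a visual world and that world's MIDI layout.
# All note numbers and names below are illustrative assumptions.
WORLDS = {
    36: {"world": "particles", "layout": "particles_layout"},
    37: {"world": "feedback",  "layout": "feedback_layout"},
}

def on_button_press(note):
    """Return (world, layout) to load for a Push button press,
    or None if no visual world is assigned to that button."""
    entry = WORLDS.get(note)
    if entry is None:
        return None  # ignore unassigned buttons
    # One press does two things: select the visual and its layout.
    return entry["world"], entry["layout"]
```

The key design point is that a single press carries both pieces of state, so the visual and its control scheme can never fall out of sync.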
Then I, or the user, can improvise the rest of the song like a musician soloing on their instrument. This lets me match the rhythm and react to improvisation from the musicians.