TAM singer Claire Parsons had worked with Vienna-based animator Astrid Rothaug on a previous music clip for her album *In Geometry*. For TAM, we wanted to give Astrid as much freedom as possible in creating a three-minute short film of hand-drawn animated frames, with a story loosely spun around the Aquatic Museum theme. For inspiration, we provided Astrid with demo and pre-production material during our first meetings and settled on a simple pop tune (with some twists) co-written[^1] by Claire and me: *The Souvenir Shop*, a constant of every museum visit.
The process[^2] of creating a clip for music, or music for a short video, was never clearly laid out. The final song choice came later, during the album's writing and production. Because we had the unique opportunity to work with an artist who spent weeks creating high-resolution, frame-by-frame hand-drawn animation, we discarded several other music-app ideas based on more traditional video game mechanics. Hand-drawn graphics are rare in video games (cf. the literature review chapter) because translating hand-drawn artwork into computer-animated sprites can lead to aesthetic clashes; flat, paper-style graphics do not lend themselves easily to an interactive graphic style. Also, to respect Astrid's vision and stay true to the original, we kept the clip linear.
Graphical user interface design is a challenging domain of human–computer interaction: how do you guide the user through a game or piece of software? For TAM, I chose color to indicate possible interactions. Astrid's footage is kept in black and white[^3], with color used sparingly to mark interactive elements.
In the TAM app, the interactions that influence the music are indicated by colored spots and surfaces. The doorbell from clip 1 and the transforming whale from clip 2 are two examples among other larger and smaller color cues:

The tree example shows multiple user interactions: different trees trigger random voice tracks. Later in the clip, a chessboard appears as the *Souvenir Shop* tune reaches its solo section. Pressing it triggers another instrument playing a solo, and pressing the same spot while a solo instrument is playing slowly fades into another solo version. As an Easter egg, some instrumentalists agreed to record a funny or even deliberately poorly executed version of their solo, which is triggered only rarely (1 in 1,000 times). Interaction time is limited to a few seconds, the solo section is short, and the user has no built-in recording option.
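The solo mechanic above can be sketched in a few lines. This is a minimal illustration, not the TAM implementation: the track names, the two-instrument pool, and the linear crossfade are all hypothetical stand-ins for the recorded material and the app's real playback engine.

```python
import random

# Hypothetical solo pool; real TAM asset names are not shown here.
SOLO_TAKES = {
    "trumpet": "trumpet_solo",
    "sax": "sax_solo",
}
# The rarely triggered "Easter egg" takes, where the player recorded one.
RARE_TAKES = {
    "trumpet": "trumpet_blooper",
}

def pick_solo(current=None, rare_chance=1 / 1000, rng=random):
    """Pick the next solo, avoiding the instrument already playing."""
    candidates = [name for name in SOLO_TAKES if name != current]
    instrument = rng.choice(candidates)
    # Roughly 1 press in 1,000 swaps in the funny/badly played variant.
    if rng.random() < rare_chance and instrument in RARE_TAKES:
        return RARE_TAKES[instrument]
    return SOLO_TAKES[instrument]

def crossfade_gains(t, duration=2.0):
    """Linear gains (out, in) for fading between the two solo tracks."""
    x = min(max(t / duration, 0.0), 1.0)
    return 1.0 - x, x
```

Pressing the chessboard while the trumpet solo plays would call `pick_solo("trumpet")` and ramp the two gains over a couple of seconds, matching the short window the solo section allows.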
Basic single-finger touchscreen interactions are supported, and they translate directly to mouse navigation: tap (single and double), drag, and slide. Some user actions have visual feedback cues, but audio is always front and center. The app is semi-generative[^4] rather than generative in the original sense of the term, because it only uses the loop and soundscape material that we recorded and that fits the song.
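Because every gesture has a single-pointer equivalent, touch and mouse input can share one dispatch table. The sketch below is illustrative only; the gesture names and handler return values are assumptions, and the real app would route these events to audio playback and visual feedback.

```python
# Hypothetical handlers: each returns a record of the recognized gesture.
def on_tap(pos):
    return ("tap", pos)

def on_double_tap(pos):
    return ("double_tap", pos)

def on_drag(start, end):
    return ("drag", start, end)

# One table serves both touchscreen and mouse events.
GESTURES = {
    "tap": on_tap,
    "double_tap": on_double_tap,
    "drag": on_drag,
}

def dispatch(gesture, *args):
    handler = GESTURES.get(gesture)
    if handler is None:
        return None  # unrecognized input is ignored; playback continues
    return handler(*args)
```

Unsupported gestures (e.g. multi-finger pinches) simply fall through, which keeps the interaction model identical on desktop and touch devices.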
[^1]: The majority of pieces for the TAM album were written by Claire Parsons, with song and lyric contributions by me on several tunes.
[^2]: Jacobs, L. (2015). *Film rhythm after sound: Technology, music, and performance*. University of California Press.
[^3]: For another example of digital bi-color, cf. 1-bit pixel games, a category dating back to the 1980s: Pedro Medeiros's tutorial on 1-bit pixel art.
[^4]: An inspiring project by Joonas Turner.