Making albums with AI from our backyard: Claire Evans and Jona Bechtolt of YACHT

Original article was published by Cristobal Valenzuela on Artificial Intelligence on Medium


Last year, dance-pop band YACHT released Chain Tripping, a Grammy-nominated immersive album designed by band members Claire Evans and Jona Bechtolt using Runway. Based in LA, the duo used Runway to create the album’s songs and visual assets, including music videos, band photos, an album cover, and live installations. They even named the album using GPT-2.

YACHT’s interest in using machine learning (ML) as part of their songwriting process began five years ago. With her background as a science editor and writer, Claire was curious about the implications AI might have for the arts and creativity. She mentions reading the writings of David Cope and seeing the work of artists like Mario Klingemann, Memo Akten, and Robbie Barrat.

Starting a Conversation with a New Technology that’s Not For Us

“We don’t really know what’s the incorrect way of doing things … You start with brute force, try stuff, and end up making things maybe a trained programmer wouldn’t think to do.”

YACHT’s Grammy-nominated 4K visual album Chain Tripping, designed using Runway.

When it came to Chain Tripping, the band saw the album as an opportunity to better understand machine learning. “We’re always interested in starting a conversation with a new technology that’s not for us,” Claire says. “Runway was the first user-facing tool we saw.” Through music, visual art, photography, video, typography, design, and layout, the band created what they call a “document” of the current state of art and ML.

With Runway, the band found an accessible tool with a marketplace of models they could experiment with. Since they’re not computer programmers, Claire and Jona had previously outsourced that technical work to more experienced collaborators. Relying on other people was frustrating, Claire says. “Runway is the only tool that we were able to actually mess around with ourselves,” she adds. “That’s how you learn how systems work.”

Looking back at their process, Jona sees the creative value in misusing the models. “We don’t really know what’s the incorrect way of doing things,” Claire reflects. “You start with brute force, try stuff, and end up making things maybe a trained programmer wouldn’t think to do.”

YACHT’s Scatterhead music video, designed using Runway.

In the Scatterhead music video, the duo pulled dance archive footage and ran it through pose-mapping models to create glitchy dance forms, then layered in forms from DensePose, a model that allowed them to map their own 3D forms from 2D video images. “What’s interesting and important is that these tools are not perfect,” Jona says, describing how their limbs would disappear and reappear on a different part of their body. “But where they fail is often where the most interesting aesthetic stuff happens.”
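To give a rough sense of the kind of pose extraction the band describes, here is a minimal illustrative sketch, not YACHT’s actual Runway pipeline: it runs an off-the-shelf pose-estimation model (MediaPipe, a stand-in assumption) over a video file with a hypothetical name and draws the detected skeleton back onto each frame.

```python
# Illustrative only: off-the-shelf pose estimation over a video clip.
# This is a stand-in sketch, not YACHT's actual Runway workflow.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture("archive_dance_clip.mp4")  # hypothetical input file
with mp_pose.Pose(static_image_mode=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV decodes frames as BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Draw the detected skeleton back onto the frame
            mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                      mp_pose.POSE_CONNECTIONS)
        cv2.imshow("pose", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The glitchy look the band describes comes from exactly the failure cases such models exhibit: dropped or misplaced landmarks on unusual poses and low-quality archive footage.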

Finding Narrative Meaning in ML

“We’re all making the meaning in time, in my performance, and listening to the song. What someone thinks the song is about is as valid as what I think the song is about.”

Listening back to the album now, the band describes the material as living in the space between traditional narrative songwriting and AI’s anti-narrative generative processes. “The big question was how can we use ML to create something that isn’t just technically music,” Claire says, “but actually fits in a body of work we can be proud of.”

For most artists, a work’s meaning comes from the creator. ML disrupts this relationship. For YACHT, the creative process meant sitting in front of AI-generated text and language, aka “nonsensical gobbledygook.” The challenge, then, is to find meaning in this raw language, which arrives through a process built by humans from input selected by humans. “That process of interpretation and arrangement is a huge part of where meaning actually works,” Claire tells us. In other words, it’s the difference between how you say something and what you say.

Before Covid-19 canceled live shows, the band performed Chain Tripping for their fans with Runway-generated live projections. “Because [this album] came from the space between us and the machine, there’s this lovely collective experience that we get to have with our audience,” Claire says. “We’re all making the meaning in time, in my performance, and listening to the song. What someone thinks the song is about is as valid as what I think the song is about.”

Jona mentions the aftershow conversations, where the band would talk to fans about the ML processes behind the performance. “We told them to just go home and download Runway and play with that,” Jona recalls. “These tools create completely unpredictable results. You have to make the best of those, but it’s really fun.” Looking back, he admits, ML tools were a pretty weird conversation to be having at 1 AM in a rock club.

Making Massive Video Installations Using ML In Your Backyard