Rage with the machine: artificial intelligence takes on Eurovision – The Australian Financial Review


The team – Can AI Kick It – used AI techniques to generate a hit predictor based on the melodies and rhythms of more than 200 classics from the Eurovision Song Contest, an annual celebration of pop music and kitsch. These included Abba’s Waterloo (Sweden’s 1974 winner) and Loreen’s Euphoria (2012, also Sweden).

‘We do not condone these lyrics’

But to generate the lyrics for the song Abuss, the team also used a separate AI system – one based on the social media platform Reddit. It was this that resulted in a rallying cry for a revolution.

As with the notorious Tay chatbot developed by Microsoft in 2016, which started spewing racist and sexist sentiments after being trained on Twitter data, the fault lay with the human sources of data, not the algorithms.

“We do not condone these lyrics!” stresses Janne Spijkervet, a student who worked with Can AI Kick It and ran the lyric generator. She says the Dutch team nevertheless decided to keep the anarchist sentiment to show the perils of applying AI even to the relatively risk-free environment of Europop.

Alongside Abuss, which its creators describe as atonal and creepy, sits the Australian entry, which has the sheen of a chart-topping dance hit but carries a distorted, subliminal AI-generated chorus of koalas, kookaburras and Tasmanian devils.

Meanwhile, the song I’ll Marry You, Punk Come, composed by German team Dadabots x Portrait XO, used seven neural networks in its creation. The resulting piece of music blends lyrics drawn from babble generated from 1950s a cappella music with AI-generated death-metal vocal styles and a chromatic bass line spat out by a neural network trained on Bach’s canon.

The contest was judged along the same lines as the established competition, with a public vote tallied against the opinions of a panel of expert judges. Ed Newton-Rex, who founded the British AI compositional start-up Jukedeck, is one of them. He explains that the panel looked at the process of how machine learning was applied, as well as creative uses of algorithms – such as the “koala synth” – and the quality of the song. The judges also factored “Eurovisioness” into their thinking, although he admits: “I have no idea what that means.”

About 20,000 people tuned in to the event, a far cry from the 182 million who watched last year’s human contest, but the hope is that the computer version will pave the way for AI to influence Eurovision proper through song composition or, over time, robotic performance.

“That is my dream,” says Karen Van Dijk, the VPRO producer who came up with the concept.

Coding for a hit

A performance given by the Sex Pistols at Manchester’s Lesser Free Trade Hall in 1976 – where, legend has it, almost everyone in the tiny audience went on to form their own band – became known as the “gig that changed the world”, and was deemed a genesis point for a musical revolution. The equivalent for AI music took place in the winter of 2019 in Delft, the picturesque Dutch town known for its fine pottery and as the birthplace of the painter Johannes Vermeer. The city’s university was hosting the 20th conference of the International Society for Music Information Retrieval when a proposition was put to the academics in attendance.

Van Dijk announced that she was organising the first “Eurovision for computers” and needed entries. When Holland’s Duncan Laurence won the Eurovision Song Contest in 2019, amid her euphoria Van Dijk pondered whether AI could be harnessed to lock in more hit songs for the country. “I was naive. I thought we could create the next Eurovision hit with the press of a button,” she says.

Van Dijk arrived in Delft bearing data gifts. An Israeli composer had created a spoof Eurovision song the year before, called Blue Jeans and Bloody Tears, using a cache of data extracted from the Eurovision catalogue. That data was bought by VPRO and provided to the entrants as a spur for their own experiments. For some, it also allowed them to rekindle pop-star ambitions.

Tom Collins, a music lecturer at the University of York, and his wife Nancy Carlisle, an academic at Lehigh University in Pennsylvania, had a garage band called The Love Rats when they were doctoral students. When Collins heard about the AI Song Contest, he was inspired to “dust off his code” and get the band back together by using AI to write a song. He initially worked with Imogen Heap, the English singer-songwriter and audio engineer, but coronavirus-related travel restrictions halted those efforts. Instead, he and Carlisle worked over a weekend on Hope Rose High, which he describes as an “eerie power ballad” inspired by the lockdown.

The husband-and-wife team turned to an AI lyric engine called “theselyricsdonotexist” to generate robotic poetry with an optimistic feel. Carlisle says the AI’s suggested lyric “and then the mist will dance” seemed ridiculous until she listened again to some of her favourite songs and started hearing what sounded like nonsense. “Radiohead don’t make a lot of sense but I still love them,” she admits. Collins adds that the mist lyric also fits with the Eurovision theme: “You can imagine the massive smoke machines kicking in.”

While the duo did not enter the contest with the aim of winning, others saw an opportunity to test whether AI could be used not just to write a song but to pen a hit. Ashley Burgoyne, a lecturer in computational musicology at the University of Amsterdam and a member of the team behind Abuss, used the Blue Jeans dataset to create a “Eurovision hit predictor”.

That data suggested that melodies with hooks of three to seven notes and songs with simple rhythmic patterns scored the highest. It also showed that a certain level of atonality – where it is hard for the ear to identify the key – was crucial to Eurovision success. Yet Burgoyne believes that despite a handful of “stinkers” being included in the data, the results reflected a paucity of the negative information that is needed to successfully train the system – in this case, songs that didn’t reach the finals.

He compared the issue to Netflix recommendations that suggest “a load of crap” after you have watched a high-quality TV series. “If you believe quality exists, then AI isn’t good at finding it. How do you define [what is] a good song even in the world of Eurovision?” he says.
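Burgoyne’s point about missing negatives can be sketched with a toy “hit predictor” – a purely illustrative snippet, not the team’s actual code, with made-up features. Fitted only on finalists, the model can measure how much a new song resembles past hits, but with no flops in the data there is nothing to draw a decision boundary against:

```python
# Toy illustration of training without negative examples (hypothetical
# features and numbers -- not the Abuss team's real system).

def centroid(rows):
    """Average each feature across the training songs."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def hit_score(song, hits_centroid):
    """Closer to the average hit = higher score (negative distance)."""
    return -sum((a - b) ** 2 for a, b in zip(song, hits_centroid)) ** 0.5

# Hypothetical features: (hook length in notes, rhythmic complexity 0-1)
finalists = [(4, 0.2), (5, 0.3), (3, 0.25), (7, 0.35)]  # positives only
c = centroid(finalists)

# Every candidate is scored purely by resemblance to past winners; an
# outlier merely scores "less hit-like" -- the model has never seen a
# flop, so it cannot say what one looks like.
print(hit_score((5, 0.3), c))   # close to the average hit: high score
print(hit_score((20, 0.9), c))  # far from it: lower, but never "a flop"
```

This is why songs that failed to reach the finals would have mattered: without them, the predictor ranks everything on a single “resembles a winner” axis rather than separating hits from misses.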

Subliminal devil voices

The use of subliminal voices supposedly encouraging devil worship in heavy metal music was a cause célèbre in the 1980s. Few would have expected that subliminal Tasmanian devil voices would be influencing Europop 30 years later.

Caroline Pegram, head of innovation at Uncanny Valley, the music technology company behind the Australian entry, wanted to pay homage to the wildlife killed during the 2019-20 bushfires in Australia. A zookeeper friend gave her videos of Tasmanian devils going absolutely wild; the team blended the screeches with the sounds of koalas and laughing kookaburras to train an audio-generating neural network, using technology developed by Google’s creative AI research project Magenta. They called it the “koala synth”.

It proved that AI can create unexpected results. “It was a happy accident. Everyone thought I was insane – literally insane – but the koalas have sent out a positive message and it is a strong and catchy sound,” says Pegram.


The koala synth adds a new Antipodean angle to the Eurovision story – Australia has only competed in the contest since 2015, when the European Broadcasting Union first allowed its entry.

Justin Shave, who produced the song, explains that the DDSP – differentiable digital signal processing – technology behind it has since been used to generate the sounds of violins, trumpets and even a choir of drunken men. “That one didn’t work so well,” he admits.

Unlike the more academic entrants, Uncanny Valley comes from a musical background, having produced songs for Aphex Twin and Sia. The group had already planned to enter an AI-composed song in the main song contest.

They now hope that the AI Song Contest will help to dispel concerns in some parts of the traditional music community that the technology could lead to musicians losing their jobs if computers take over.

Geoff Taylor, chief executive of the BPI, Britain’s music trade body, and head of the Brit Awards, says the new horizons of AI are exciting but urges caution.

“We also need to guard against the risk that AI might in certain respects be deployed to supplant human creativity or undermine the cultural economy driven by artists. Such an outcome would leave our societies and our cultures worse off,” he says.

Rage with the machine

His fears have been stoked as some of the world’s largest technology companies, including Google and TikTok owner ByteDance, have moved into the compositional space. But Anna Huang, a resident at Google’s Magenta and a judge on the AI Song Contest, says Big Tech is attracted to AI musical composition by scientific curiosity, not a desire to take over the music world.

“Music is a very complex domain. In contrast to language, which is a single sequence, music comprises arrangement, timbre, multiple instruments, harmony and is perceptually driven. It is also very referential,” she says.

AI could also have a democratising impact on the creation of new music, says Huang. She cites her own experience at high school in Hong Kong, when some of her classmates were already composing for full orchestras. Huang was a musician too and believed that computer science could develop new methods of musical composition, something AI can potentially deliver.

That was demonstrated via an interactive Google Doodle launched in March last year that encouraged users to input a simple melody. The AI, developed by Magenta, then generated harmonies in the style of Bach. Within two days, the lighthearted doodle had created 55 million snippets of music.

Newton-Rex, who sold his company to China’s ByteDance last year, says musicians need to see AI as a tool to stimulate creativity – a spur that helps new ideas or disrupts habits – rather than a threat. “Every time I sit down at the piano, I play the same thing,” he says, adding that AI is already creeping into sophisticated drum machines, arpeggiators and mastering software, and that it will always need human curation. “What does AI music sound like? It sounds like nothing without a human element.”

As Pegram says: “Some musicians fear we will end up building machines pumping out terrible music – but we need to rage with the machine, not against it.”

Eurovision 2020: Big Night In is on SBS from 7.30pm on Saturday.

— Financial Times