Lost tapes of the 27 club
Every time one of our idols dies, especially while still young, we are left wondering how the songs they never got to write would have sounded. A prime example is Kurt Cobain: since he took his own life in April 1994, leaving Nirvana fans desolate, people have not stopped imagining the music he might have made had he lived. But other than “You Know You're Right,” the song Nirvana recorded a few months before his suicide, and a few rumours about a collaboration with REM's Michael Stipe or even a solo career, he mainly left behind question marks. The same goes for the other members of the so-called 27 Club, artists such as Jim Morrison, Jimi Hendrix and Amy Winehouse who died unexpectedly on reaching that fateful age, in one way or another as a result of the mental health problems produced by a savage music industry.
With the hook of this macabre club, which has even acquired a certain air of romanticism over the years, the project Lost Tapes of the 27 Club has used AI to imagine what these artists might have created if they were still with us. The initiative was developed by Over The Bridge, a Toronto-based organization that aims to change the conversation about mental health in the music community while providing a compassionate environment for its members to thrive.
Everything in the songs the project has released is the work of computers, with the sole exception of the vocals. In “Drowned in the Sun,” the track in the style of Nirvana, for example, the vocals are performed by Eric Hogan, frontman of the Atlanta Nirvana tribute band Nevermind. However amazing this may sound, let's not forget that the project was meant to raise awareness of mental health among working musicians and to encourage them to ask for help before depression leads to suicides like Kurt Cobain's.
Back to the songs: all of them were made with Google's AI program Magenta, which analyzed many songs by each artist as MIDI files. A MIDI file works much like a player-piano scroll, translating pitch and rhythm into a digital code that can be fed through a synthesizer to recreate a song. After examining each artist's note choices, rhythmic quirks and harmonic preferences in those files, the computer generated new music, which the staff could then pore over to pick the best moments. The lyrics came from a separate AI program, an artificial neural network, through a very similar process: the team fed it the artist's lyrics, seeded it with a few words, and the program inferred the cadence and tone of the poetry to complete it.
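The learn-the-style-then-generate idea described above can be sketched in miniature with a toy Markov chain over (pitch, duration) events, the same pitch-and-rhythm code a MIDI file carries. To be clear, this is not the project's actual Magenta pipeline (which uses far more capable neural models); the function names and the training melody below are purely illustrative.

```python
import random

# Toy stand-in for style modelling: a first-order Markov chain over
# (MIDI pitch, duration) events. The model counts which event tends to
# follow which in the source melody, then walks those counts to produce
# a new melody with similar note choices and rhythmic quirks.

def train(events):
    """Build a transition table: event -> list of observed next events."""
    table = {}
    for cur, nxt in zip(events, events[1:]):
        table.setdefault(cur, []).append(nxt)
    return table

def generate(table, seed, length, rng=None):
    """Walk the transition table to produce a new event sequence."""
    rng = rng or random.Random(0)
    out = [seed]
    for _ in range(length - 1):
        choices = table.get(out[-1])
        if not choices:  # dead end: restart from the seed event
            choices = [seed]
        out.append(rng.choice(choices))
    return out

# Illustrative training melody: (MIDI pitch, duration in beats),
# where pitch 60 is middle C.
melody = [(60, 1), (62, 1), (64, 2), (62, 1), (60, 1), (64, 2)]
table = train(melody)
new_melody = generate(table, seed=(60, 1), length=8)
```

Every event the sketch emits was seen somewhere in the training melody, which mirrors, in a crude way, how a style model can only recombine patterns it learned from the artist's catalogue.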
Amazing technology, used to create a sort of monster in the form of surprising songs, and one that certainly raises a lot of interesting ethical questions about the possibilities of AI.