They didn’t have much of a problem finding musicians to compose tunes; Peter had deep connections to the New York music scene, and folks like Jarrett were in need of something to occupy their time.
With Jarrett and a small group of furloughed Broadway composers, arrangers, and orchestrators to write the music, Dynascore’s technical team stumbled into a best-case scenario when it came to in-house talent. Who better to write pieces that evoke visual drama than those who are already paid to do it nightly?
The Human Problem
Human composers’ golden ears are really the key to Dynascore’s ultimate success. The historic problem with AI-based, or “algorithmically composed,” music, according to Saatchi, was that it mostly attempted to teach software instruments to write music from scratch, rather than reinterpreting pieces that had already been written.
“Making music that actually resonates with people is a human problem, so you have to start with the people,” Saatchi says. “AI is the supercharger for the humans.”
In the early days of Dynascore, Saatchi and the team worked to develop a way of breaking down original music and out-of-copyright classics (think Grieg’s “In the Hall of the Mountain King”) into segments they call “morphones.” They’d teach the AI a song, then ask it to recompose something similar using the original song’s morphones as a guide. Afterwards, they’d ask the musicians to critique the AI’s composition.
Getting music to fit perfectly in a video isn’t as simple as chopping up existing songs in predictable ways. Organic transitions require a more thorough understanding of key, rhythm, and intensity, among other musical markers. As such, morphones don’t just indicate the speed and key of a song. They also indicate various other tonal and musical characteristics, all of which allow the AI to know which types of musical Lego blocks snap together in which ways.
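The “Lego blocks” idea can be pictured as segments tagged with musical metadata plus a rule for which segments join cleanly. Dynascore’s actual morphone format is not public, so the fields and thresholds below are illustrative assumptions only:

```python
from dataclasses import dataclass

# Hypothetical sketch of a "morphone" as a tagged musical segment.
# All field names and compatibility thresholds are assumptions,
# not Dynascore's real data model.

@dataclass
class Morphone:
    key: str           # e.g. "C minor"
    tempo_bpm: float   # speed of the segment
    intensity: int     # 1 (calm) .. 10 (dramatic)
    duration_s: float  # length in seconds

def can_snap_together(a: Morphone, b: Morphone) -> bool:
    """Toy rule: two segments join organically only if they share a key
    and their tempo and intensity are close enough for a smooth transition."""
    return (
        a.key == b.key
        and abs(a.tempo_bpm - b.tempo_bpm) <= 10
        and abs(a.intensity - b.intensity) <= 2
    )

calm = Morphone("C minor", 72, 3, 8.0)
build = Morphone("C minor", 78, 4, 6.5)
climax = Morphone("E major", 140, 9, 4.0)

print(can_snap_together(calm, build))    # True: same key, close tempo
print(can_snap_together(build, climax))  # False: key and intensity jump
```

The point of the extra markers is exactly what the toy rule shows: speed alone isn’t enough to decide whether two blocks can follow each other.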
After they developed the morphone system, the team would feed the AI songs and have it recompose them. It took a while before it was musically literate enough to make good choices.
“The AI would compose a piece, and the musicians would go, ‘That was bad,’” Saatchi says, chuckling at the simplicity of the test. “It gets that feedback and it learns from it, and it gets to the point where, suddenly, it creates coherent compositions.”
The AI soon became smart enough to fit transitions, fades, breaks, and other user-dictated timing shifts into each tune it wrote. The version of Dynascore that I witnessed reshaping the Moonlight Sonata was born.
Take a Load Off
Dynascore’s achievement represents a dramatic improvement to what’s historically been a cumbersome workflow.
“When you’re working as an editor or filmmaker, you spend so much time with the music, because you have to make it fit frame by frame,” says DiGiovanna, who has worked on everything from feature films to TV ads. “With Dynascore you can do it on the fly.”
A tool that enables dynamic musical composition is particularly useful on commercial projects, where shots often get cut late in the process. DiGiovanna gives the example of a director asking to have a purse removed from a finished ad.
“You have to remove five seconds of video, and now the ending of the song doesn’t work, the transition to the next song doesn’t work,” he says. “That’s when Dynascore is going to save me a lot of time.”