Researchers at Deezer have designed an AI system that can associate songs with moods and intensities. The technology is detailed in a recently published research paper titled “Music Mood Detection Based on Audio and Lyrics With Deep Neural Net”. To gauge a song’s mood, the team used both its audio and its lyrics. The first step was to feed the audio signal into a neural network; alongside it, they built a word-embedding model to capture the linguistic context of the lyrics. To teach the system to recognize moods, they drew on the Million Song Dataset (MSD), a collection of metadata for more than a million songs. In particular, they used the database of last.fm, a website for streaming music online, which assigns identifiers to tracks and holds more than 500,000 unique user-generated tags, many of which describe moods. The team then linked this labeled data to Deezer’s catalog using metadata such as song titles and album titles.
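The bimodal idea described above can be illustrated with a minimal sketch: audio features and averaged lyric word embeddings are fused and mapped to mood scores (often framed as valence and arousal in mood-detection work). All names, dimensions, and weights below are hypothetical stand-ins, not Deezer's actual model.

```python
import numpy as np

# Hypothetical sketch of a bimodal mood predictor. The random weights
# stand in for a trained network; a real system would learn them from
# labeled data such as mood tags.
rng = np.random.default_rng(42)
EMBED_DIM = 4

# Toy word-embedding table standing in for a learned lyric model.
EMBEDDINGS = {w: rng.standard_normal(EMBED_DIM)
              for w in ["love", "dark", "dance", "alone", "night", "happy"]}

def lyric_vector(lyrics: str) -> np.ndarray:
    """Average the embeddings of known words (zero vector if none match)."""
    vecs = [EMBEDDINGS[w] for w in lyrics.lower().split() if w in EMBEDDINGS]
    return np.mean(vecs, axis=0) if vecs else np.zeros(EMBED_DIM)

# Fusion layer: 3 audio features + EMBED_DIM lyric dims -> (valence, arousal).
W = rng.standard_normal((2, 3 + EMBED_DIM))
b = rng.standard_normal(2)

def predict_mood(audio_features: np.ndarray, lyrics: str) -> np.ndarray:
    """Concatenate both modalities and squash scores to [-1, 1] with tanh."""
    x = np.concatenate([audio_features, lyric_vector(lyrics)])
    return np.tanh(W @ x + b)

valence, arousal = predict_mood(np.array([0.8, 0.1, 0.5]), "dance all night")
print(f"valence={valence:+.2f} arousal={arousal:+.2f}")
```

The point of the sketch is the fusion step: each modality is reduced to a fixed-length vector before a shared layer combines them, which is the general pattern behind combining audio and lyrics in one model.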
The resulting AI system can effectively estimate how calm, soothing, or upbeat a song is. This opens the door to further study of how music, lyrics, and mood correlate, and of listeners’ mindsets while they play music. It also points to ways of sorting through high volumes of data using deep learning models.