By Alice Klein
We use the left and right sides of our brain to process the lyrics and melodies of songs
Speech and music are highly complex forms of communication that are mostly unique to humans. How our brains distinguish both at once when they are blended together in songs has been a mystery until now.
Previous research has studied how the brain processes speech and other sounds, but not music. Philippe Albouy at McGill University in Canada and his colleagues created 100 unique a cappella songs by crossing 10 sentences in French or English with 10 original melodies.
They played the songs to 27 French speakers and 22 English speakers, manipulating different elements of the songs to understand how the participants perceived the words and melodies. They tested two languages to see if the results would be the same for both.
The researchers found that the ability to recognise lyrics is heavily reliant on a song's timing patterns. Speech contains multiple syllables per second, making its time structure more important than that of melodies, which tend to be more fluid. When the team distorted time elements in the songs, the participants could still identify the melodies but could no longer understand the lyrics.
In contrast, our capacity to recognise melodies seems to depend more on their frequency patterns. When the researchers distorted the frequencies of the songs, the participants could still understand the words but could no longer identify the melodies.
Next, the experiments were repeated while the participants' brains were scanned using functional MRI. This revealed that the left half of their brains detected the timing information that allowed word recognition, while their right halves detected the frequency information required to identify melodies.
The findings are consistent with previous observations that damage to the brain's left hemisphere is more likely to affect speech abilities, while damage to the right is more likely to impair musical abilities.
It makes sense for our brains to perceive words and melodies separately, says Albouy. Having one region deal with both timing and frequency information might be too demanding, he says.
The team now wants to test whether people who speak languages like Mandarin and Vietnamese, which are more melodic than French and English, also process words and melodies on opposite sides of their brain.
Journal reference: Science, DOI: 10.1126/science.aaz3468