
Proverbio, A., de Benedetto, F., Guazzone, M. (2020). Shared neural mechanisms for processing emotions in music and vocalizations. EUROPEAN JOURNAL OF NEUROSCIENCE, 51(9), 1987-2007 [10.1111/ejn.14650].

Shared neural mechanisms for processing emotions in music and vocalizations

Proverbio, AM; de Benedetto, F (Member of the Collaboration Group)
2020

Abstract

The neural mechanisms involved in processing vocalizations and music were compared in order to identify possible similarities in the encoding of their emotional content. Positive and negative emotional vocalizations (e.g. laughing, crying) and violin musical stimuli digitally extracted from them were used as stimuli; they shared the melodic profile and the main pitch/frequency characteristics. Participants listened to vocalizations or music while detecting rare auditory targets (bird tweeting or piano arpeggios). EEG was recorded from 128 sites. The P2, N400 and late positivity (LP) components of the ERPs were analysed. The P2 peak was earlier in response to vocalizations, while P2 amplitude was larger to positive than to negative stimuli. The N400 was greater to negative than to positive stimuli. The LP was greater to vocalizations than to music, and to positive than to negative stimuli. Source modelling using swLORETA suggested that, among the N400 generators, the left middle temporal gyrus and the right uncus responded to both music and vocalizations, and more to negative than to positive stimuli. The right parahippocampal region of the limbic lobe and the right cingulate cortex were active during music listening, while the left superior temporal cortex responded only to human vocalizations. Negative stimuli always activated the right middle temporal gyrus, whereas positively valenced stimuli always activated the inferior frontal cortex. The processing of emotional vocalizations and music therefore seems to involve common neural mechanisms. Musical notation obtained from the acoustic signals showed that emotionally negative stimuli tended to be in a minor key and positive stimuli in a major key, shedding some light on the brain's ability to understand music.
Journal article - Scientific article
ERP; Emotions; Language; Music; N400; minor major key; sadness; valence
English
14 Dec 2019
2020
Volume 51, Issue 9, pp. 1987-2007
Open access
Files in this record:
File: ejn2020.pdf
Access: open access
Attachment type: Submitted Version (Pre-print)
Size: 2.4 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/254609
Citations
  • Scopus 17
  • ISI 16