AUTHOR
Petri Toiviainen
Musical training predicts cerebello-hippocampal coupling during music listening.
Cerebello-hippocampal interactions occur during accurate spatiotemporal prediction of movements. In the context of music listening, differences in cerebello-hippocampal functional connectivity may result from differences in predictive listening accuracy. Using functional MRI, we studied differences in this network between 18 musicians and 18 nonmusicians while they listened to music. Musicians possess a predictive listening advantage over nonmusicians, facilitated by strengthened coupling between produced and heard sounds through lifelong musical experience. Thus, we hypothesized that musicians would exhibit greater functional connectivity than nonmusicians as a marker of accurate online pr…
Semi-blind Independent Component Analysis of functional MRI elicited by continuous listening to music
This study presents a method to analyze blood-oxygen-level-dependent (BOLD) functional magnetic resonance imaging (fMRI) signals associated with listening to continuous music. Semi-blind independent component analysis (ICA) was applied to decompose the fMRI data into source-level activation maps and their respective temporal courses. The unmixing matrix in the source separation process of ICA was constrained by a variety of acoustic features derived from the piece of music used as the stimulus in the experiment. This allowed more stable estimation and the extraction of more activation maps of interest compared to conventional ICA methods.
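As a loose illustration of anchoring source extraction to a stimulus-derived temporal reference (a crude stand-in for the constrained-ICA unmixing described above, not the paper's algorithm; all data and names are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two latent sources mixed into 50 "voxels" over 200 time points.
t = np.arange(200)
s1 = np.sin(2 * np.pi * t / 20)              # pretend this tracks an acoustic feature
s2 = rng.standard_normal(200)                # unrelated background activity
S = np.c_[s1, s2]                            # time x sources
A = rng.standard_normal((2, 50))             # mixing matrix: sources -> voxels
X = S @ A + 0.1 * rng.standard_normal((200, 50))

# Reference-based extraction: use the known feature time course as a
# temporal anchor and recover its spatial map by least squares.
ref = s1 - s1.mean()
spatial_map = X.T @ ref / (ref @ ref)        # one weight per voxel
recovered = X @ spatial_map                  # extracted component time course

corr = float(np.corrcoef(recovered, s1)[0, 1])
```

The recovered time course correlates strongly with the reference, which is the intuition behind constraining the unmixing with stimulus features.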
Interpersonal Coordination in Dyadic Performance
Dyadic musical performance provides an excellent framework for studying interpersonal coordination because it involves multiple agents performing matched, rhythmic, and/or interactive behaviors. In this chapter, we explore interpersonal coordination using Canonical Correlation Analysis (CCA) as a coupling measure. To provide some context for interpreting the output of CCA, musicians performed in different expressive manners (deadpan, normal, exaggerated). Overall, the results showed that the normal performances were slightly more interpersonally coordinated than the deadpan and exaggerated ones.
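CCA as a coupling measure can be sketched in a few lines of NumPy; this is generic textbook CCA on synthetic "movement feature" matrices, not the chapter's analysis pipeline:

```python
import numpy as np

def first_canonical_corr(X, Y, eps=1e-9):
    """First canonical correlation between two multivariate time series
    (rows = time frames, columns = movement features)."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    # Whiten each set via its SVD; the canonical correlations are then the
    # singular values of the cross-product of the whitened variables.
    Ux, sx, _ = np.linalg.svd(X, full_matrices=False)
    Uy, sy, _ = np.linalg.svd(Y, full_matrices=False)
    Ux = Ux[:, sx > eps]
    Uy = Uy[:, sy > eps]
    s = np.linalg.svd(Ux.T @ Uy, compute_uv=False)
    return float(np.clip(s[0], 0.0, 1.0))

rng = np.random.default_rng(1)
shared = rng.standard_normal((300, 1))            # common "coupling" signal
X = shared @ rng.standard_normal((1, 4)) + 0.5 * rng.standard_normal((300, 4))
Y = shared @ rng.standard_normal((1, 3)) + 0.5 * rng.standard_normal((300, 3))
coupled = first_canonical_corr(X, Y)
independent = first_canonical_corr(rng.standard_normal((300, 4)),
                                   rng.standard_normal((300, 3)))
```

Two feature sets driven by a common signal yield a much higher first canonical correlation than two independent ones, which is what makes CCA usable as a dyadic coupling measure.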
Perception of Segment Boundaries in Musicians and Non-Musicians
In the act of music listening, many people break down musical pieces into chunks such as verses and choruses. Recent work on music segmentation has shown that highly agreed segment boundaries are also considered strong and are described by using multiple cues. However, these studies could not pinpoint the effects of data collection methods and of musicianship on boundary perception. Our study investigated the differences between segmentation tasks performed by musicians in real-time and non real-time listening contexts. Further, we assessed the effect of musical training on the perception of boundaries in real-time listening. We collected perceived boundaries by 18 musicians and 18 non-musi…
The Role of Music in Everyday Life During the First Wave of the Coronavirus Pandemic : A Mixed-Methods Exploratory Study
Although music is known to be a part of everyday life and a resource for mood and emotion management, everyday life has changed significantly for many due to the global coronavirus pandemic, making the role of music in everyday life less certain. An online survey in which participants responded to Likert scale questions as well as providing free text responses was used to explore how participants were engaging with music during the first wave of the pandemic, whether and how they were using music for mood regulation, and how their engagement with music related to their experiences of worry and anxiety resulting from the pandemic. Results indicated that, for the majority of participants, whi…
Embodiment in Electronic Dance Music: Effects of musical content and structure on body movement
Electronic dance music (EDM) is music produced with the foremost aim to make people move. While research has revealed relationships between movement features and, for example, musical, emotional, or personality characteristics, systematic investigations of genre differences and specifically of EDM are rather rare. This article aims at offering insights into the embodiment of EDM from three different angles: first from a genre-comparison perspective, then by comparing different EDM stimuli with each other, and finally by investigating embodiments in one specific EDM stimulus. Sixty participants moved freely to 16 stimuli of four different genres (EDM, Latin, Funk, Jazz – four stimuli/genre)…
Personality modulates brain responses to emotion in music: Comparing whole-brain and regions-of-variance approaches
Whether and how personality traits explain the individual variance in neural responses to emotion in music remains unclear, and the sparse studies on this topic report inconsistent findings. The present study extends previous work by using regions of variance (ROVs) as regions of interest, compared with whole-brain analysis. Fifty-five subjects listened to happy, sad, and fearful music during functional magnetic resonance imaging. Personality was measured with the Big Five Questionnaire. Results confirmed previous observations of Neuroticism being positively related to activation during sad music, in the left inferior parietal lobe. In an exploratory analysis, Openness was positively relat…
Musical interaction in music therapy for depression treatment
Music therapy is efficacious for the treatment of depression. Compared to other psychotherapeutic forms, it allows for the emergence of various modes of mutual interaction, thus enabling multiple channels for emotional expression and fostering therapeutic alliance. Although musical interaction patterns between client and therapist have been regarded as predictors of therapeutic outcome in depression, this has not yet been systematically investigated. We aim to address this gap by analyzing the possible linkage between musical interaction features and changes in depression score. In a clinical trial, digital piano improvisations from 58 Finnish clients and their therapists were recorded ove…
Dance on cortex: enhanced theta synchrony in experts when watching a dance piece
When watching performing arts, a wide and complex network of brain processes emerges. These processes can be shaped by professional expertise. Compared to laymen, dancers show enhanced processing when observing short dance movements and when listening to music. But how do the cortical processes of musicians and dancers differ when they watch an audio-visual dance performance? In our study, we presented participants with long excerpts from the contemporary dance choreography of Carmen. During multimodal movement of a dancer, theta phase synchrony over the fronto-central electrodes was stronger in dancers compared to musicians and laymen. In addition, alpha synchrony was decreased in all gr…
Early auditory processing in musicians and dancers during a contemporary dance piece
The neural responses to simple tones and short sound sequences have been studied extensively. However, in reality the sounds surrounding us are spectrally and temporally complex, dynamic and overlapping. Thus, research using natural sounds is crucial in understanding the operation of the brain in its natural environment. Music is an excellent example of natural stimulation which, in addition to sensory responses, elicits vast cognitive and emotional processes in the brain. Here we show that the preattentive P50 response evoked by rapid increases in timbral brightness during continuous music is enhanced in dancers when compared to musicians and laymen. In dance, fast changes in brigh…
Turning Heads on the Dance Floor: Synchrony and Social Interaction Using a Silent Disco Paradigm
Music and dance appear to have a social bonding effect, which some have theorized is part of their ultimate evolutionary function. Prior research has also found a social bonding effect of synchronized movement, and it is possible that interpersonal synchrony could be considered the “active ingredient” in the social bonding consequences of music or dance activity. The present study aimed to separate the effects of synchrony from other factors associated with joint experience of dancing by using a “silent disco” manipulation, in which the timing of a musical stimulus was varied within a dyad in a freestyle dance setting. Three conditions were included: synchrony, tempo-shifted (in which the …
Tapping Doesn't Help : Synchronized Self-Motion and Judgments of Musical Tempo
For both musicians and music psychologists, beat rate (BPM) has often been regarded as a transparent measure of musical speed or tempo, yet recent research has shown that tempo is more than just BPM. In a previous study, London, Burger, Thompson, and Toiviainen (Acta Psychologica, 164, 70–80, 2016) presented participants with original as well as “time-stretched” versions of classic R&B songs; time stretching slows down or speeds up a recording without changing its pitch or timbre. In that study we discovered a tempo anchoring effect (TAE): Although relative tempo judgments (original vs. time-stretched versions of the same song) were correct, they were at odds with BPM rates of each stimulus…
sj-docx-1-pom-10.1177_03057356221084368 – Supplemental material for Musical interaction in music therapy for depression treatment
Supplemental material, sj-docx-1-pom-10.1177_03057356221084368 for Musical interaction in music therapy for depression treatment by Martin Hartmann, Anastasios Mavrolampados, Petri Toiviainen, Suvi Saarikallio, Katrien Foubert, Olivier Brabant, Nerdinga Snape, Esa Ala-Ruona, Christian Gold and Jaakko Erkkilä in Psychology of Music
Exploring Frequency-Dependent Brain Networks from Ongoing EEG Using Spatial ICA During Music Listening
Recently, exploring brain activity based on functional networks during naturalistic stimuli, especially music and video, has represented an attractive challenge because of the low signal-to-noise ratio of the collected brain data. Although most efforts to explore the listening brain have been made with functional magnetic resonance imaging (fMRI) or sensor-level electro- or magnetoencephalography (EEG/MEG) techniques, little is known about how neural rhythms are involved in brain network activity under naturalistic stimuli. This study examined cortical oscillations through joint analysis of ongoing EEG and musical features during free listening to music. We used a data-driven method that co…
Music we move to : Spotify audio features and reasons for listening
Previous literature has shown that music preferences (and thus preferred musical features) differ depending on the listening context and reasons for listening (RL). Yet, to our knowledge no research has investigated how features of music that people dance or move to relate to particular RL. Consequently, in two online surveys, participants (N = 173) were asked to name songs they move to (“dance music”). Additionally, participants (N = 105) from Survey 1 provided RL for their selected songs. To investigate relationships between the two, we first extracted audio features from dance music using the Spotify API and compared those features with a baseline dataset that is considered to represent …
Multi-scale Modelling of Segmentation
While listening to music, people often unwittingly break down musical pieces into constituent chunks such as verses and choruses. Music segmentation studies have suggested that some consensus regarding boundary perception exists, despite individual differences. However, neither the effects of experimental task (i.e., real-time vs. annotated segmentation), nor of musicianship on boundary perception are clear. Our study assesses musicianship effects and differences between segmentation tasks. We conducted a real-time experiment to collect segmentations by musicians and nonmusicians from nine musical pieces. In a second experiment on non-real-time segmentation, musicians indicated boundaries a…
An Interactive MIDI Accompanist
The ability to infer beat and meter from music is one of the basic activities of musical cognition. After hearing only a short fraction of music, we are able to develop a sense of beat and to tap our foot along with the music. Even if the music is rhythmically complex, containing a range of different time values and possibly syncopation as well, we are capable of inferring the different periodicities present in the music and synchronizing to them. Simulating this activity with a computer program might seem, at first glance, to be simple. If a note onset (that is, an attack) occurs before the system expects it to occur, the estimated tempo is increased, and vice versa. In practice, however, …
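The naive tempo-correction rule described above (speed up when an onset arrives earlier than expected, slow down when later) can be sketched as a one-line update; the gain parameter is an illustrative assumption, and real accompanists need far more than this:

```python
def update_tempo(bpm, expected_onset, actual_onset, gain=0.5):
    """Naive tempo follower: if an onset arrives earlier than expected,
    speed up; if later, slow down (gain controls correction strength)."""
    beat_period = 60.0 / bpm                 # seconds per beat
    error = actual_onset - expected_onset    # positive = performer is late
    new_period = beat_period + gain * error
    return 60.0 / new_period

# A performer plays an onset 50 ms early relative to the 120 BPM expectation:
faster = update_tempo(120.0, expected_onset=1.0, actual_onset=0.95)
slower = update_tempo(120.0, expected_onset=1.0, actual_onset=1.05)
```

As the abstract goes on to note, in practice such a rule is far too brittle: expressive timing, syncopation, and multiple metrical levels all confound the simple early/late heuristic.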
Beauty and the brain: Investigating the neural and musical attributes of beauty during a naturalistic music listening experience
Evaluative beauty judgments are very common, but despite this commonality they are rarely studied in cognitive neuroscience. Here we investigated the neural and musical attributes of musical beauty using a naturalistic free-listening paradigm applied to behavioral and neuroimaging recordings and validated by experts’ judgments. In Study 1, 30 Western healthy adult participants continuously rated the perceived beauty of three musical pieces using a motion sensor. This allowed us to identify the passages in the three musical pieces that were inter-subjectively judged as beautiful or ugly. This informed the analysis for Study 2, where an additional 36 participants were recorded with funct…
Cross-cultural music cognition: cognitive methodology applied to North Sami yoiks
This article is a study of melodic expectancy in North Sami yoiks, a style of music quite distinct from Western tonal music. Three different approaches were taken. The first approach was a statistical style analysis of tones in a representative corpus of 18 yoiks. The analysis determined the relative frequencies of tone onsets and two- and three-tone transitions. It also identified style characteristics, such as pentatonic orientation, the presence of two reference pitches, the frequency of large consonant intervals, and a relatively large set of possible melodic continuations. The second approach was a behavioral experiment in which listeners made judgments about melodic continuations. Thr…
Perception of emotional content in musical performances by 3–7-year- old children
The emotional content expressed through musical performance has become a widely discussed topic in music psychology during the past two decades. However, empirical evidence regarding children’s abilities in interpreting the emotional content of a musical performance is sparse. We investigated 3–7-year-old children’s abilities to interpret the emotional content expressed through performance features in music. Short musical pieces previously rated as inexpressive of emotion were recorded by three musicians with five emotional expressions (happy, sad, fearful, angry and neutral) and played to 3–7-year-old children (N = 94), adult non-musicians (N = 83), and adult musicians (N = 118) who ma…
Capturing the musical brain with Lasso: Dynamic decoding of musical features from fMRI data.
We investigated neural correlates of musical feature processing with a decoding approach. To this end, we used a method that combines computational extraction of musical features with regularized multiple regression (LASSO). Optimal model parameters were determined by maximizing the decoding accuracy using a leave-one-out cross-validation scheme. The method was applied to functional magnetic resonance imaging (fMRI) data that were collected using a naturalistic paradigm, in which participants' brain responses were recorded while they were continuously listening to pieces of real music. The dependent variables comprised musical feature time series that were computationally extracted from the…
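The core of the approach, regularized (LASSO) regression from brain responses to a musical-feature time series, can be sketched with a plain coordinate-descent solver on synthetic data. The paper's leave-one-out tuning of the regularization parameter is omitted here, and all names and dimensions are illustrative:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO minimizing 0.5/n * ||y - Xw||^2 + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]       # residual excluding feature j
            rho = X[:, j] @ r / n
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

rng = np.random.default_rng(2)
n, p = 120, 30
X = rng.standard_normal((n, p))                  # "voxel" responses (time x voxels)
true_w = np.zeros(p)
true_w[:3] = [2.0, -1.5, 1.0]                    # only three voxels carry the feature
y = X @ true_w + 0.1 * rng.standard_normal(n)    # musical-feature time series

w = lasso_cd(X, y, lam=0.1)
n_active = int((np.abs(w) > 1e-6).sum())         # L1 penalty yields a sparse solution
pred_corr = float(np.corrcoef(X @ w, y)[0, 1])
```

The L1 penalty drives most weights to exactly zero, which is what makes the decoded maps interpretable as localized feature processing.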
Kinematics of perceived dyadic coordination in dance
We investigated the relationships between perceptions of similarity and interaction in spontaneously dancing dyads, and movement features extracted using novel computational methods. We hypothesized that dancers’ movements would be perceived as more similar when they exhibited spatially and temporally comparable movement patterns, and as more interactive when they spatially oriented more towards each other. Pairs of dancers were asked to move freely to two musical excerpts while their movements were recorded using optical motion capture. Subsequently, in two separate perceptual experiments we presented stick figure animations of the dyads to observers, who rated degree of interaction and si…
Musical expertise modulates functional connectivity of limbic regions during continuous music listening.
Music is known to be an important facet of all human cultures (Merriam, 1964). Listening to music in order to influence moods, evoke strong emotions, and derive pleasure is becoming increasingly common, especially in this day and age when access to music is easy and quick. In recent years, exploring the neural correlates of musical emotions has attracted the attention of neuroscientists (Brattico & Pearce, 2013; Koelsch, Fritz, v. Cramon, Muller, & Friederici, 2006). However, the majority of these studies have not accounted for the effect of musical expertise, despite increasing evidence of structural and functional differences between musicians and nonmusicians, particularly in the regions…
On application of kernel PCA for generating stimulus features for fMRI during continuous music listening
Background: There has been growing interest in naturalistic neuroimaging experiments, which deepen our understanding of how the human brain processes and integrates incoming streams of multifaceted sensory information, as commonly occurs in the real world. Music is a good example of such a complex continuous phenomenon. In a few recent fMRI studies examining the neural correlates of music in continuous listening settings, multiple perceptual attributes of the music stimulus were represented by a set of high-level features, produced as linear combinations of acoustic descriptors computationally extracted from the stimulus audio. New method: fMRI data from a naturalistic music listening experi…
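A generic from-scratch kernel PCA (RBF kernel, double centering, eigendecomposition) gives the flavor of such a nonlinear feature-combination step; this is a textbook sketch, not the study's exact feature-generation pipeline, and the toy descriptor matrix is invented:

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=0.1):
    """Kernel PCA with an RBF kernel: build the kernel matrix,
    double-center it in feature space, then eigendecompose."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # centering in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so rows give the training-point projections.
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

rng = np.random.default_rng(3)
# Toy "acoustic descriptor" matrix: 40 time frames x 6 descriptors.
X = rng.standard_normal((40, 6))
Z = rbf_kernel_pca(X, n_components=2, gamma=0.05)
```

Each row of `Z` is a nonlinear low-dimensional summary of one stimulus time frame, analogous to the high-level stimulus features fed to the fMRI analysis.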
Timbre Similarity: Convergence of Neural, Behavioral, and Computational Approaches
The present study compared the degree of similarity of timbre representations as observed with brain recordings, behavioral studies, and computer simulations. To this end, the electrical brain activity of subjects was recorded while they were repetitively presented with five sounds differing in timbre. Subjects read simultaneously so that their attention was not focused on the sounds. The brain activity was quantified in terms of a change-specific mismatch negativity component. Thereafter, the subjects were asked to judge the similarity of all pairs along a five-step scale. A computer simulation was made by first training a Kohonen self-organizing map with a large set of instrumental sounds…
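A minimal 1-D Kohonen self-organizing map, trained on toy timbre-feature vectors, illustrates the kind of simulation described; the learning-rate schedule, map size, and features are invented for the example:

```python
import numpy as np

def train_som(data, n_units=10, n_epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal 1-D Kohonen self-organizing map trained on sound feature
    vectors; returns the learned codebook (n_units x n_features)."""
    rng = np.random.default_rng(seed)
    codebook = rng.standard_normal((n_units, data.shape[1]))
    grid = np.arange(n_units)
    n_steps = n_epochs * len(data)
    step = 0
    for _ in range(n_epochs):
        for x in data[rng.permutation(len(data))]:
            frac = step / n_steps
            lr = lr0 * (1 - frac)                     # decaying learning rate
            sigma = sigma0 * (1 - frac) + 0.5         # shrinking neighborhood
            bmu = np.argmin(((codebook - x) ** 2).sum(1))   # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            codebook += lr * h[:, None] * (x - codebook)
            step += 1
    return codebook

rng = np.random.default_rng(4)
# Two timbre "clusters" in a toy 3-D feature space (e.g. brightness, flux, attack).
data = np.vstack([rng.normal(0, 0.1, (30, 3)), rng.normal(2, 0.1, (30, 3))])
som = train_som(data)
# Distance between best-matching units on the map estimates timbre dissimilarity.
bmu_a = int(np.argmin(((som - data[0]) ** 2).sum(1)))
bmu_b = int(np.argmin(((som - data[30]) ** 2).sum(1)))
```

Dissimilar sounds map onto different, distant units, so map distance can be compared against behavioral similarity ratings and MMN amplitudes.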
Diurnal changes in the perception of emotions in music: Does the time of day matter?
According to the Hindustani music tradition, the ability of a song to induce certain emotions depends on the time of day: playing a song at the right time is said to maximise its emotional effect. The present exploratory study investigated this claim by combining findings from chronobiology, mood research and music perception. It has already been established that some aspects of our mood fluctuations follow a cyclical pattern. Moreover, our current mood is known to influence our perception and assessment of emotions. However, these elements have never been linked together in a study examining the effect of mood cyclicity on perceived emotions in music. To test the hypothesis of a…
Statistical features and perceived similarity of folk melodies
Listeners are sensitive to pitch distributional information in music (N. Oram & L. L. Cuddy, 1995; C. L. Krumhansl, J. Louhivuori, P. Toiviainen, T. Järvinen, & T. Eerola, 1999). However, it is uncertain whether frequency-based musical features are sufficient to explain the similarity judgments that underlie listeners' classification processes. A similarity rating experiment was designed to determine the effectiveness of these features in predicting listeners' similarity ratings. The material consisted of 15 melodies representing five folk music styles. A multiple regression analysis showed that the similarity of frequency-based musical properties could account for a moderate amount …
The chronnectome of musical beat
Keeping time is fundamental for our everyday existence. Various isochronous activities, such as locomotion, require us to use internal timekeeping. This phenomenon comes into play also in other human pursuits such as dance and music. When listening to music, we spontaneously perceive and predict its beat. The process of beat perception comprises both beat inference and beat maintenance, their relative importance depending on the salience of beat in the music. To study functional connectivity associated with these processes in a naturalistic situation, we used functional magnetic resonance imaging to measure brain responses of participants while they were listening to a piece of music contai…
The Relationship Between Musical Structure and Emotion in Classical Piano Scores : A Case Study on the Theme of La Folia
We explored the relationship between musical structure and emotion in different variations of La Folia, a musical theme of Portuguese origin based on a standard harmonic progression. Our approach aims to extend previous research by investigating more factors and comparing different models of music emotion. In a pilot study, 12 participants rated the emotion associated with the stimuli on a graded scale from 1 to 10, according to three different models of music emotion: the valence/arousal-based emotion model (Russell, 1980), a discrete emotion approach (Izard, 1972), and the Geneva Emotional Music Scale (GEMS) (Zentner et al., 2008). Stimuli were commercial recordings of the first 8 bars of …
Crossing Phrase Boundaries In Music
This paper presents a new model for segmenting symbolic music data into phrases. It is based on the idea that melodic phrases tend to consist of notes, which increase rather than decrease in length towards the phrase end. Previous research implies that the timing of note events might be a stronger predictor of both theoretical and perceived segmentation than pitch information. Our approach therefore relies only on temporal information about note onsets. Phrase boundaries are predicted at those points in a melody where the difference between subsequent note-to-note intervals reaches minimal values. On its own, the proposed model is parameter-free, does not require adjustments to fit a partic…
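Since the model relies only on note-timing information, it is easy to sketch: predict a boundary wherever the difference between subsequent note-to-note intervals reaches a local minimum (a sudden shortening after progressive lengthening). The indexing convention below is an assumption for illustration:

```python
def predict_boundaries(iois):
    """Predict phrase boundaries at local minima of the difference between
    subsequent inter-onset intervals (strongly negative = sudden shortening)."""
    diffs = [b - a for a, b in zip(iois, iois[1:])]
    boundaries = []
    for i in range(1, len(diffs) - 1):
        if diffs[i] < diffs[i - 1] and diffs[i] < diffs[i + 1]:
            # Boundary placed after the note that ends the lengthening pattern.
            boundaries.append(i + 1)
    return boundaries

# Two phrases whose note values lengthen toward each phrase ending:
iois = [0.25, 0.25, 0.5, 1.0,   # phrase 1 slows down...
        0.25, 0.25, 0.5, 1.0]   # ...then resets at the phrase boundary
```

On this toy melody the model places a single boundary at the reset between the two phrases, with no parameters to tune.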
The reliability of continuous brain responses during naturalistic listening to music
Low-level (timbral) and high-level (tonal and rhythmical) musical features during continuous listening to music, studied by functional magnetic resonance imaging (fMRI), have been shown to elicit large-scale responses in cognitive, motor, and limbic brain networks. Using a similar methodological approach and a similar group of participants, we aimed to study the replicability of previous findings. Participants' fMRI responses during continuous listening of a tango Nuevo piece were correlated voxelwise against the time series of a set of perceptually validated musical features computationally extracted from the music. The replicability of previous results and the present study was assessed b…
Increasing Stability of EEG Components Extraction Using Sparsity Regularized Tensor Decomposition
Tensor decomposition has been widely employed for EEG signal processing in recent years. Constrained and regularized tensor decomposition often attains more meaningful and interpretable results. In this study, we applied sparse nonnegative CANDECOMP/PARAFAC tensor decomposition to ongoing EEG data recorded under a naturalistic music stimulus. Interesting temporal, spectral and spatial components highly related to musical features were extracted. We explored the ongoing EEG decomposition results and their properties across a wide range of sparsity levels, and propose a paradigm for selecting reasonable sparsity regularization parameters. The stability of extracting the components of interest from fourteen subjects’ dat…
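A plain (unregularized) nonnegative CANDECOMP/PARAFAC via multiplicative updates conveys the basic machinery; the paper's sparsity regularization and parameter-selection paradigm are not reproduced here, and the toy tensor is invented:

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of two factor matrices."""
    r = U.shape[1]
    return (U[:, None, :] * V[None, :, :]).reshape(-1, r)

def nncp(T, rank=2, n_iter=500, eps=1e-9, seed=0):
    """Nonnegative CANDECOMP/PARAFAC of a 3-way tensor via multiplicative updates."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.random((I, rank)) + 0.1
    B = rng.random((J, rank)) + 0.1
    C = rng.random((K, rank)) + 0.1
    X1 = T.reshape(I, -1)                            # mode-1 unfolding (I x JK)
    X2 = np.moveaxis(T, 1, 0).reshape(J, -1)         # mode-2 unfolding (J x IK)
    X3 = np.moveaxis(T, 2, 0).reshape(K, -1)         # mode-3 unfolding (K x IJ)
    for _ in range(n_iter):
        A *= (X1 @ khatri_rao(B, C)) / (A @ ((B.T @ B) * (C.T @ C)) + eps)
        B *= (X2 @ khatri_rao(A, C)) / (B @ ((A.T @ A) * (C.T @ C)) + eps)
        C *= (X3 @ khatri_rao(A, B)) / (C @ ((A.T @ A) * (B.T @ B)) + eps)
    return A, B, C

rng = np.random.default_rng(5)
# Exact nonnegative rank-2 tensor (e.g. channels x frequencies x time windows).
A0, B0, C0 = rng.random((6, 2)), rng.random((5, 2)), rng.random((4, 2))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = nncp(T)
rel_err = float(np.linalg.norm(T - np.einsum('ir,jr,kr->ijk', A, B, C))
                / np.linalg.norm(T))
```

Multiplicative updates preserve nonnegativity by construction; the sparsity-regularized variant studied in the paper adds a penalty term to these same updates.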
On application of kernel PCA for generating stimulus features for fMRI during continuous music listening
Background: There has been growing interest in naturalistic neuroimaging experiments, which deepen our understanding of how the human brain processes and integrates incoming streams of multifaceted sensory information, as commonly occurs in the real world. Music is a good example of such a complex continuous phenomenon. In a few recent fMRI studies examining the neural correlates of music in continuous listening settings, multiple perceptual attributes of the music stimulus were represented by a set of high-level features, produced as linear combinations of acoustic descriptors computationally extracted from the stimulus audio. New method: fMRI data from a naturalistic music listening experiment were…
Musical Feature and Novelty Curve Characterizations as Predictors of Segmentation Accuracy
Novelty detection is a well-established method for analyzing the structure of music based on acoustic descriptors. Work on novelty-based segmentation prediction has mainly concentrated on enhancement of features and similarity matrices, novelty kernel computation and peak detection. Less attention, however, has been paid to characteristics of musical features and novelty curves, and their contribution to segmentation accuracy. This is particularly important as it can help unearth acoustic cues prompting perceptual segmentation and find new determinants of segmentation model performance. This study focused on spectral, rhythmic and harmonic prediction of perceptual segmentation density, whic…
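The standard novelty-curve computation (a checkerboard kernel slid along the main diagonal of a feature self-similarity matrix) can be sketched as follows; kernel size and the toy two-section feature sequence are illustrative:

```python
import numpy as np

def novelty_curve(features, kernel_size=8):
    """Foote-style novelty: correlate a checkerboard kernel along the main
    diagonal of the cosine self-similarity matrix of the feature frames."""
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    S = f @ f.T                              # cosine self-similarity matrix
    half = kernel_size // 2
    sign = np.r_[-np.ones(half), np.ones(half)]
    kernel = np.outer(sign, sign)            # +within-block, -across-block
    n = len(S)
    nov = np.zeros(n)
    for i in range(half, n - half):
        nov[i] = (kernel * S[i - half:i + half, i - half:i + half]).sum()
    return nov

rng = np.random.default_rng(6)
# Two contrasting "sections": one feature profile, then another.
a = rng.random(12)
b = rng.random(12)
features = np.vstack([a + 0.02 * rng.standard_normal((20, 12)),
                      b + 0.02 * rng.standard_normal((20, 12))])
nov = novelty_curve(features)
peak = int(np.argmax(nov))
```

The novelty curve peaks at the section change (frame 20 here); peak picking on such curves yields the predicted segment boundaries whose characteristics the study relates to segmentation accuracy.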
Connectivity Patterns During Music Listening: Evidence for Action-Based Processing in Musicians
Musical expertise is visible both in the morphology and functionality of the brain. Recent research indicates that functional integration between multi-sensory, somato-motor, default-mode (DMN), and salience (SN) networks of the brain differentiates musicians from non-musicians during resting state. Here, we aimed at determining whether brain networks differentially exchange information in musicians as opposed to non-musicians during naturalistic music listening. Whole-brain graph-theory analyses were performed on participants' fMRI responses. Group-level differences revealed that musicians' primary hubs comprised cerebral and cerebellar sensorimotor regions whereas non-musicians' dominant …
Motown, Disco, and Drumming
In a study of tempo perception, London, Burger, Thompson, and Toiviainen (2016) presented participants with digitally “tempo-shifted” R&B songs (i.e., sped up or slowed down without otherwise altering their pitch or timbre). They found that while participants’ relative tempo judgments of original versus altered versions were correct, they no longer corresponded to the beat rate of each stimulus. Here we report on three experiments that further probe the relation(s) between beat rate, tempo-shifting, beat salience, melodic structure, and perceived tempo. Experiment 1 is a replication of London et al. (2016) using the original stimuli. Experiment 2 replaces the Motown stimuli with disco…
Hunting for the beat in the body: on period and phase locking in music-induced movement.
Music has the capacity to induce movement in humans. Such responses during music listening are usually spontaneous and range from tapping to full-body dancing. However, it is still unclear how humans embody musical structures to facilitate entrainment. This paper describes two experiments, one dealing with period locking to different metrical levels in full-body movement and its relationships to beat- and rhythm-related musical characteristics, and the other dealing with phase locking in the more constrained condition of sideways swaying motions. In Experiment 1, it was expected that music with clear and strong beat structures would facilitate more period-locked movement. Experiment 2 was assum…
Trait empathy modulates music-related functional connectivity
Behavioural studies have well established that empathy significantly influences music perception. Such individual differences typically manifest as variability in whole-brain functional connectivity patterns. To date, however, no study has examined the modulatory effect of empathy on functional connectivity patterns during continuous music listening. In the present study, we investigate the global and local connectivity patterns of 36 participants who underwent fMRI scanning using a naturalistic paradigm in which they listened to a continuous piece of music. We used graph-based measures of functional connectivity to identify how cognitive and affective components of emp…
Synchronization to metrical levels in music depends on low-frequency spectral components and tempo
Previous studies have found relationships between music-induced movement and musical characteristics at more general levels, such as tempo or pulse clarity. This study focused on synchronization abilities to music of finely varying tempi and varying degrees of low-frequency spectral change (flux). Excerpts from six classic Motown/R&B songs at three different tempos (105, 115, and 130 BPM) were used as stimuli in this experiment. Each was then time-stretched by 5% with regard to the original tempo, yielding a total of 12 stimuli that were presented to 30 participants. Participants were asked to move along with the stimuli while being recorded with an optical motion capture system.…
Relationships Between Spectral Flux, Perceived Rhythmic Strength, and the Propensity to Move
The tendency to move to music seems to be built into human nature. Previous studies have shown a relationship between movement and the degree of spectral flux in music, particularly in the lower sub-bands. In this study, listeners’ perceptions of a range of frequency-restricted musical stimuli were investigated in order to find relationships between perceived musical aspects (rhythm, melody, and fluctuation) and the spectral flux in three different frequency bands. Additionally, the relationship between the perception of features in specific frequency bands and participants’ desire to move was studied. Participants were presented with clips of frequency-restricted musical stimuli and answer…
Mocap Toolbox - A Matlab Toolbox For Computational Analysis Of Movement Data
The MoCap Toolbox is a set of functions written in Matlab for analyzing and visualizing motion capture data. It is aimed at investigating music-related movement, but can be beneficial for other research areas as well. Since the toolbox code is available as open source, users can freely adapt the functions according to their needs. Users can also make use of the additional functionality that Matlab offers, such as other toolboxes, to further analyze the features extracted with the MoCap Toolbox within the same environment. This paper describes the structure of the toolbox and its data representations, and gives an introduction to the use of the toolbox for research and analysis purposes. The…
Exploring Frequency-dependent Brain Networks from ongoing EEG using Spatial ICA during music listening
Recently, exploring brain activity based on functional networks during naturalistic stimuli, especially music and video, has represented an attractive challenge because of the low signal-to-noise ratio of the collected brain data. Although most efforts to explore the listening brain have been made with functional magnetic resonance imaging (fMRI) or sensor-level electro- or magnetoencephalography (EEG/MEG) techniques, little is known about how neural rhythms are involved in brain network activity under naturalistic stimuli. This study examined cortical oscillations through joint analysis of ongoing EEG and musical features during free listening to music. We used a data-driven method t…
Group analysis of ongoing EEG data based on fast double-coupled nonnegative tensor decomposition
Background: Ongoing EEG data are recorded as mixtures of stimulus-elicited EEG, spontaneous EEG and noise, which require advanced signal processing techniques for separation and analysis. Existing methods cannot simultaneously consider common and individual characteristics among/within subjects when extracting stimulus-elicited brain activities from ongoing EEG elicited by 512-s long modern tango music. New method: Aiming to discover the brain activities commonly elicited by music across subjects, we provide a comprehensive framework based on a fast double-coupled nonnegative tensor decomposition (FDC-NTD) algorithm. The proposed algorithm with a generalized model is capable of simultaneo…
Probing neural mechanisms of music perception, cognition, and performance using multivariate decoding.
Recent neuroscience research has shown increasing use of multivariate decoding methods and machine learning. These methods, by uncovering the source and nature of informative variance in large data sets, invert the classical direction of inference that attempts to explain brain activity from mental state variables or stimulus features. However, these techniques are not yet commonly used among music researchers. In this position article, we introduce some key features of machine learning methods and review their use in the field of cognitive and behavioral neuroscience of music. We argue for the great potential of these methods in decoding multiple data types, specifically audio waveforms, e…
Born to dance but beat deaf: A new form of congenital amusia
Humans move to the beat of music. Despite the ubiquity and early emergence of this response, some individuals report being unable to feel the beat in music. We report a sample of people without special training, all of whom were proficient at perceiving and producing the musical beat with the exception of one case (“Mathieu”). Motion capture and psychophysical tests revealed that people synchronized full-body motion to music and detected when a model dancer was not in time with the music. In contrast, Mathieu failed to period- and phase-lock his movement to the beat of most music pieces, and failed to detect most asynchronies of the model dancer. Mathieu’s near-normal synchronization with a…
Modelling the relationships between emotional responses to, and musical content of, music therapy improvisations
This article reports a study in which listeners were asked to provide continuous ratings of perceived emotional content of clinical music therapy improvisations. Participants were presented with 20 short excerpts of music therapy improvisations, and had to rate perceived activity, pleasantness and strength using a computer-based slider interface. A total of nine musical features relating to various aspects of the music (timing, register, dynamics, tonality, pulse clarity and sensory dissonance) were extracted from the excerpts, and relationships between these features and participants' emotion ratings were investigated. The data were analysed in three stages. First, inter-dimension correla…
Predicting Individual Differences from Brain Responses to Music using Functional Network Centrality
Individual differences are known to modulate brain responses to music. Recent neuroscience research suggests that each individual has unique and fundamentally stable functional brain connections irrespective of the task they perform. We measured 77 participants' functional Magnetic Resonance Imaging (fMRI) responses while they continuously listened to music. Using a graph-theory-based approach, we modeled whole-brain functional connectivity. We then calculated voxel-wise eigenvector centrality and subsequently used it to classify gender and musical expertise using a binary Support Vector Machine (SVM). We achieved cross-validated classification accuracies of 97% and 96% for gender and musical exp…
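To give a concrete sense of the centrality measure mentioned in this abstract, here is a minimal sketch (not the study's actual pipeline) of eigenvector centrality computed by power iteration on a correlation-based connectivity graph. The absolute-correlation edge weights and the toy time series are assumptions made for the example, not details from the paper.

```python
import numpy as np

def eigenvector_centrality(ts, n_iter=200, tol=1e-9):
    """Eigenvector centrality of a functional connectivity graph.

    ts: array of shape (timepoints, nodes), e.g. voxel or parcel time series.
    Edges are absolute Pearson correlations, a common (here hypothetical)
    choice that keeps the matrix nonnegative, so the leading (Perron)
    eigenvector is unique and nonnegative.
    """
    fc = np.abs(np.corrcoef(ts.T))        # nodes x nodes similarity graph
    np.fill_diagonal(fc, 0.0)             # ignore self-connections
    v = np.ones(fc.shape[1]) / np.sqrt(fc.shape[1])
    for _ in range(n_iter):               # power iteration -> leading eigenvector
        v_new = fc @ v
        v_new /= np.linalg.norm(v_new)
        if np.linalg.norm(v_new - v) < tol:
            v = v_new
            break
        v = v_new
    return v

# toy data: three mutually correlated nodes plus one independent node
rng = np.random.default_rng(0)
base = rng.standard_normal(500)
ts = np.column_stack([base + 0.1 * rng.standard_normal(500) for _ in range(3)]
                     + [rng.standard_normal(500)])
c = eigenvector_centrality(ts)
# the three coupled nodes receive higher centrality than the isolated one
```

Such a centrality vector, computed voxel-wise, is the kind of feature that can then be fed to a classifier such as an SVM.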
Music therapeutic emotional processing (MEP): Expression, awareness, and pain predict therapeutic outcome
Successful emotional processing is pivotal for therapeutic change, and music can support emotional processing. However, we know little about how music-based emotional processing actually predicts clinical outcomes. This study investigated music therapeutic emotional processing (MEP) as a predictor of therapeutic outcome in treatment for depression. Data consisted of self-reports of 64 clients (age range 19–57, 74% female) from a clinical trial (12 sessions) of integrative improvisational music therapy (IIMT). A 19-item MEP questionnaire was developed for assessing clients' experiences after sessions. Emergent MEP factors were correlated with clients' perceptions of the therapeutic value o…
Analyzing multidimensional movement interaction with generalized cross-wavelet transform
Humans are able to synchronize with musical events whilst coordinating their movements with others. Interpersonal entrainment phenomena, such as dance, involve multiple body parts and movement directions. Along with being multidimensional, dance movement interaction is plurifrequential, since it can occur at different frequencies simultaneously. Moreover, it is prone to nonstationarity, due to, for instance, displacements around the dance floor. Various methodological approaches have been adopted for the study of human entrainment, but only spectrogram-based techniques allow for an integral analysis thereof. This article proposes an alternative approach based upon the cross-wavelet transfor…
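As an illustration of the cross-wavelet idea this abstract builds on, the following is a minimal numpy sketch (not the generalized transform proposed in the article) of a Morlet-based cross-wavelet transform for two one-dimensional signals. The sampling rate, the conventional centre frequency w0 = 6, and the two phase-shifted "dancer" signals are assumptions made for the example.

```python
import numpy as np

def morlet_cwt(x, freqs, fs, w0=6.0):
    """Continuous wavelet transform with a complex Morlet wavelet.

    Returns an array of shape (len(freqs), len(x)).
    """
    n = len(x)
    tau = (np.arange(n) - n // 2) / fs            # wavelet time axis, centred
    out = np.empty((len(freqs), n), dtype=complex)
    for i, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                  # scale for this frequency
        psi = (np.pi ** -0.25) * np.exp(1j * w0 * tau / s) * np.exp(-(tau / s) ** 2 / 2)
        psi /= np.sqrt(s)
        # correlation with the conjugate wavelet, centred at each time point
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same")
    return out

fs = 50.0
t = np.arange(0, 20, 1 / fs)
# two movement signals oscillating at 1 Hz with a constant pi/4 phase lag
x = np.sin(2 * np.pi * 1.0 * t)
y = np.sin(2 * np.pi * 1.0 * t - np.pi / 4)
freqs = np.array([0.5, 1.0, 2.0])
wx, wy = morlet_cwt(x, freqs, fs), morlet_cwt(y, freqs, fs)
xwt = wx * np.conj(wy)                   # cross-wavelet transform
power = np.abs(xwt).mean(axis=1)         # coupling strength per frequency
phase = np.angle(xwt[1, len(t) // 2])    # relative phase at the 1 Hz band
```

The cross-wavelet power localizes the shared 1 Hz periodicity, while its angle recovers the phase lag between the two signals — the two quantities a plurifrequential, nonstationarity-tolerant entrainment analysis needs.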
Decoding Individual differences and musical preference via music-induced movement.
Movement is a universal response to music, with dance often taking place in social settings. Although previous work has suggested that socially relevant information, such as personality and gender, is encoded in dance movement, the generalizability of previous work is limited. The current study aims to decode dancers' gender, personality traits, and music preference from music-induced movements. We propose a method that predicts such individual differences from free dance movements, and demonstrate the robustness of the proposed method by using two data sets collected using different musical stimuli. In addition, we introduce a novel measure to explore the relative importance of dif…
Groove as a multidimensional participatory experience
Groove is a popular and widely used concept in the field of music. Yet, its precise definition remains elusive. Upon closer inspection, groove appears to be used as an umbrella term with various connotations depending on the musical era, the musical context, and the individual using the term. Our aim in this article was to explore different definitions and connotations of the term groove so as to reach a more detailed understanding of it. Consequently, in an online survey, 88 participants provided free-text descriptions of the term groove. A thematic analysis revealed that groove is a multifaceted phenomenon, and participants’ descriptions fit into two main categories: music- and experience…
Action in Perception: Prominent Visuo-Motor Functional Symmetry in Musicians during Music Listening.
Musical training leads to sensory and motor neuroplastic changes in the human brain. Motivated by findings on enlarged corpus callosum in musicians and asymmetric somatomotor representation in string players, we investigated the relationship between musical training, callosal anatomy, and interhemispheric functional symmetry during music listening. Functional symmetry was increased in musicians compared to nonmusicians, and in keyboardists compared to string players. This increased functional symmetry was prominent in visual and motor brain networks. Callosal size did not significantly differ between groups except for the posterior callosum in musicians compared to nonmusicians. We conclude…
Motivic matching strategies for automated pattern extraction
This article proposes an approach to the problem of automated extraction of motivic patterns in monodies. Different musical dimensions, restricted in current approaches to the most prominent melodic and rhythmic features at the surface level, are defined. The proposed strategy of detection of repeated patterns consists of an exact matching of the successive parameters forming the motives. We suggest a generalization of the multiple-viewpoint approach that allows a variability of the types of parameters (melodic, rhythmic, etc.) defining each successive extension of these motives. This enables us to take into account a more general class of motives, called heterogeneous motives, which inclu…
Visualization in comparative music research
Computational analysis of large musical corpora provides an approach that overcomes some of the limitations of manual analysis related to small sample sizes and subjectivity. The present paper aims to provide an overview of the computational approach to music research. It discusses the issues of music representation, musical feature extraction, digital music collections, and data mining techniques. Moreover, it provides examples of visualization of large musical collections.
Autocorrelation in meter induction: the role of accent structure.
The performance of autocorrelation-based meter induction was tested with two large collections of folk melodies, consisting of approximately 13 000 melodies for which the correct meters were available. The performance was measured by the proportion of melodies whose meter was correctly classified by a discriminant function. Furthermore, it was examined whether including different melodic accent types would improve classification performance. By determining the components of the autocorrelation functions that were significant in the classification, it was found that periodicity in note onset locations was the most important cue for the determination of meter. Of the melodic accents includ…
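The periodicity cue described above can be sketched in a few lines. This is a toy version of the idea, not the study's discriminant-function classifier: note onsets (here in hypothetical quarter-note units) are sampled onto a grid, repeated onsets act as crude accent weights, and the autocorrelation peak at a lag of two versus three beats hints at duple versus triple meter.

```python
import numpy as np

def onset_autocorrelation(onsets, step=0.25, max_lag_beats=12):
    """Autocorrelation of a note-onset accent function (toy sketch).

    onsets: onset times in quarter-note units, sampled on a grid of
    `step` quarter notes; coinciding onsets accumulate as accents.
    Returns the autocorrelation at lags 1..max_lag_beats/step grid steps,
    normalized to a maximum of 1.
    """
    grid = np.zeros(int(round(max(onsets) / step)) + 1)
    for t in onsets:
        grid[int(round(t / step))] += 1.0
    lags = int(max_lag_beats / step)
    ac = np.array([np.dot(grid[:-k], grid[k:]) for k in range(1, lags + 1)])
    return ac / ac.max()

# quarter-note pulse plus an accent layer every three beats (waltz-like)
onsets = [float(i) for i in range(24)] + [i * 3.0 for i in range(8)]
ac = onset_autocorrelation(onsets)
lag = lambda beats: int(beats / 0.25) - 1
# the periodicity at 3 beats now dominates the one at 2 beats
triple_wins = ac[lag(3)] > ac[lag(2)]
```

A classifier would use several such autocorrelation components (and accent variants) as input features.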
Emotion-driven encoding of music preference and personality in dance
Thirty rhythmic music excerpts were presented to 60 individuals. Dance movements to each excerpt were recorded using an optical motion-capture system, preference for each excerpt recorded on a 5-point Likert scale, and personality assessed using the 44-item version of the Big Five Inventory. From the movement data, a large number of postural, kinematic and kinetic features were extracted, a subset of which were chosen for further analysis using sequential backward elimination with variance inflation factor (VIF) selection. Multivariate analyses revealed significant effects on these 11 features of both preference and personality, as well as a number of interactions between the two. As regar…
Measuring and modeling real-time responses to music: the dynamics of tonality induction.
We examined a variety of real-time responses evoked by a single piece of music, the organ Duetto BWV 805 by J. S. Bach. The primary data came from a concurrent probe-tone method in which the probe tone is sounded continuously with the music. Listeners judged how well the probe tone fit with the music at each point in time. The process was repeated for all probe tones of the chromatic scale. A self-organizing map (SOM) [Kohonen 1997 Self-organizing Maps (Berlin: Springer)] was used to represent the developing and changing sense of key reflected in these judgments. The SOM was trained on the probe-tone profiles for 24 major and minor keys (Krumhansl and Kessler 1982 Psychological Review 89 334–…
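The probe-tone profiles referenced here are also the basis of the classic Krumhansl–Schmuckler key-finding idea: correlate a piece's pitch-class distribution with all 24 rotated key profiles. A minimal sketch follows (not the SOM approach of this study); the profile values are those commonly reported for the Krumhansl–Kessler data and the input weighting is a made-up toy example.

```python
import numpy as np

# Krumhansl–Kessler probe-tone profiles (values as commonly reported;
# treat them as approximate if reusing this sketch)
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
NOTE = "C C# D D# E F F# G G# A A# B".split()

def estimate_key(pc_weights):
    """Correlate a 12-dim pitch-class distribution against all 24
    rotated key profiles and return the best-matching key."""
    best, best_r = None, -2.0
    for tonic in range(12):
        for mode, prof in (("major", MAJOR), ("minor", MINOR)):
            r = np.corrcoef(pc_weights, np.roll(prof, tonic))[0, 1]
            if r > best_r:
                best, best_r = (NOTE[tonic], mode), r
    return best, best_r

# toy input: equal weight on the C-major diatonic set, extra tonic emphasis
weights = np.zeros(12)
for pc in [0, 2, 4, 5, 7, 9, 11]:   # C D E F G A B
    weights[pc] = 1.0
weights[0] += 1.0
key, r = estimate_key(weights)      # -> ('C', 'major')
```

Tracking such correlations over a sliding window gives a simple dynamic picture of tonality induction, which the SOM in the study represents spatially.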
Dynamics of brain activity underlying working memory for music in a naturalistic condition
We aimed at determining the functional neuroanatomy of working memory (WM) recognition of musical motifs that occurs while listening to music by adopting a non-standard procedure. Western tonal music provides naturally occurring repetition and variation of motifs. These serve as WM triggers, thus allowing us to study the phenomenon of motif tracking within real music. Adopting a modern tango as stimulus, a behavioural test helped to identify the stimulus motifs and build a time-course regressor of WM neural responses. This regressor was then correlated with the participants' (musicians') functional magnetic resonance imaging (fMRI) signal obtained during a continuous listening condition. In…
Key issues in decomposing fMRI during naturalistic and continuous music experience with independent component analysis
Background: Independent component analysis (ICA) has often been used to decompose fMRI data, mostly for resting-state, block and event-related designs, owing to its advantages. For fMRI data acquired during free-listening experiences, only a few exploratory studies have applied ICA. New method: To process the fMRI data elicited by a 512-s modern tango, an FFT-based band-pass filter was used to further pre-process the fMRI data and remove sources of no interest and noise. Then, a fast model order selection method was applied to estimate the number of sources. Next, both individual ICA and group ICA were performed. Subsequently, ICA components whose temporal courses were significantly correlated …
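The filter-then-ICA pipeline outlined above can be sketched on synthetic data. This is an illustrative stand-in, not the paper's implementation: a Butterworth band-pass substitutes for the FFT-based filter, the TR, band edges, and toy "sources" are all assumptions, and FastICA replaces whichever ICA variant the study used.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
fs = 0.5                                   # sampling rate in Hz (TR = 2 s, an assumption)
t = np.arange(0, 512, 1 / fs)              # 512-s acquisition

# two hypothetical temporal sources: a slow stimulus-driven signal and a distractor
s1 = np.sin(2 * np.pi * 0.03 * t)
s2 = np.sign(np.sin(2 * np.pi * 0.11 * t))
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0], [0.8, 0.3]])   # mixing into 3 "voxels"
X = S @ A.T + 0.05 * rng.standard_normal((len(t), 3))

# band-pass pre-filtering to suppress out-of-band noise before ICA
b, a = butter(4, [0.01, 0.15], btype="band", fs=fs)
Xf = filtfilt(b, a, X, axis=0)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(Xf)            # estimated temporal courses
maps = ica.mixing_                         # spatial weights per component
```

In the study, the recovered temporal courses would then be correlated with stimulus features to pick out components of interest.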
Tiede musiikin takana (Finnish: "The Science Behind Music")
Multi-Scale Modelling of Segmentation: Effect of Music Training and Experimental Task
While listening to music, people, often unwittingly, break down musical pieces into constituent chunks such as verses and choruses. Music segmentation studies have suggested that some consensus regarding boundary perception exists, despite individual differences. However, neither the effects of experimental task (i.e. realtime vs annotated segmentation), nor of musicianship on boundary perception are clear. Our study assesses musicianship effects and differences between segmentation tasks. We conducted a real-time task experiment to collect segmentations by musicians and non-musicians from 9 musical pieces; in a second experiment on non-realtime segmentation, musicians indicated boundaries …
Predicting Music Therapy Clients’ Type of Mental Disorder Using Computational Feature Extraction and Statistical Modelling Techniques
Background. Previous work has shown that improvisations produced by clients during clinical music therapy sessions are amenable to computational analysis. For example, it has been shown that the perception of emotion in such improvisations is related to certain musical features, such as note density, tonal clarity, and note velocity. Other work has identified relationships between an individual’s level of mental retardation and features such as amount of silence, integration of tempo with the therapist, and amount of dissonance. The present study further develops this work by attempting to predict music therapy clients’ type of mental disorder, as clinically diagnosed, from their improvisat…
MIDI toolbox: MATLAB tools for music research
Conscientiousness and Extraversion relate to responsiveness to tempo in dance
Previous research has shown broad relationships between personality and dance, but the relationship between personality and specific structural features of music has not been explored. The current study explores the influence of personality and trait empathy on dancers' responsiveness to small tempo differences between otherwise musically identical stimuli, measured as differences in the amount of acceleration of key joints. Thirty participants were recorded using motion capture while dancing to excerpts from six popular songs that were time-stretched to be slightly faster or slower than their original tempi. Analysis revealed that higher conscientiousness and lower extraversion both correla…
Postural and gestural synchronization, sequential imitation, and mirroring predict perceived coupling of dancing dyads
Body movement is a primary nonverbal communication channel in humans. Coordinated social behaviors, such as dancing together, encourage multifarious rhythmic and interpersonally coupled movements from which observers can extract socially and contextually relevant information. The investigation of relations between visual social perception and kinematic motor coupling is important for social cognition. Perceived coupling of dyads spontaneously dancing to pop music has been shown to be highly driven by the degree of frontal orientation between dancers. The perceptual salience of other aspects, including postural congruence, movement frequencies, time-delayed relations, and horizontal mirrorin…
What makes music memorable? Relationships between acoustic musical features and music-evoked emotions and memories in older adults
Music has a unique capacity to evoke both strong emotions and vivid autobiographical memories. Previous music information retrieval (MIR) studies have shown that the emotional experience of music is influenced by a combination of musical features, including tonal, rhythmic, and loudness features. Here, our aim was to explore the relationship between music-evoked emotions and music-evoked memories and how musical features (derived with MIR) can predict them both. Methods: Healthy older adults (N = 113, age ≥ 60 years) participated in a listening task in which they rated a total of 140 song excerpts comprising folk songs and popular…
Differential Effects of Trait Empathy on Functional Network Centrality
Previous research has shown that empathy, a fundamental component of human social functioning, is engaged when listening to music. Neuroimaging studies of empathy processing in music have, however, been limited. fMRI analysis methods based on graph theory have recently gained popularity as they are capable of illustrating global patterns of functional connectivity, which could be very useful in studying complex traits such as empathy. The current study examines the role of trait empathy, including cognitive and affective facets, on whole-brain functional network centrality in 36 participants listening to music in a naturalistic setting. Voxel-wise eigenvector centrality mapping was calculat…
Relationships between perceived emotions in music and music-induced movement
Listening to music makes us move in various ways. Several factors can affect the characteristics of these movements, including individual factors and musical features. Additionally, music-induced movement may also be shaped by the emotional content of the music, since emotions are an important element of musical expression. This study investigates possible relationships between emotional characteristics of music and music-induced, quasi-spontaneous movement. We recorded music-induced movement of 60 individuals, and computationally extracted features from the movement data. Additionally, the emotional content of the stimuli was assessed in a perceptual experiment. A subsequent correlational …
Embodied Meter Revisited: Entrainment, Musical Content, and Genre in Music-Induced Movement
Previous research has shown that humans tend to embody musical meter at multiple beat levels during spontaneous dance. This work has been based on identifying typical periodic movement patterns, or eigenmovements, and has relied on time-domain analyses. The current study: 1) presents a novel method of using time-frequency analysis in conjunction with group-level tensor decomposition; 2) compares its results to time-domain analysis; and 3) investigates how the amplitude of eigenmovements depends on musical content and genre. Data comprised three-dimensional motion capture of 72 participants' spontaneous dance movements to 16 stimuli including eight different genres. Each trial was subjected…
Combining PCA and multiset CCA for dimension reduction when group ICA is applied to decompose naturalistic fMRI data
An extension of group independent component analysis (GICA) is introduced, in which multi-set canonical correlation analysis (MCCA) is combined with principal component analysis (PCA) for three-stage dimension reduction. The method is applied to naturalistic functional MRI (fMRI) images acquired during a task-free continuous music listening experiment, and the results are compared with the outcome of conventional GICA. The extended GICA resulted in slightly faster ICA convergence and, more interestingly, extracted more stimulus-related components than its conventional counterpart. Therefore, we think the extension is a beneficial enhancement for GICA, especially when applied to challenging fMRI d…
Dynamic Functional Connectivity Captures Individuals’ Unique Brain Signatures
Recent neuroimaging evidence suggests that there exists a unique individual-specific functional connectivity (FC) pattern consistent across tasks. The objective of our study is to utilize FC patterns to identify an individual using a supervised machine learning approach. To this end, we use two previously published data sets that comprise resting-state and task-based fMRI responses. We use static FC measures as input to a linear classifier to evaluate its performance. We additionally extend this analysis to capture dynamic FC using two approaches: the common sliding-window approach and the more recent phase synchrony-based measure. We found that the classification models using dynamic FC pa…
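The sliding-window approach mentioned above is simple to sketch: correlate the regional time series within each window and vectorize the upper triangle as a feature vector per window. This is a generic illustration with made-up window parameters and toy data, not the study's configuration.

```python
import numpy as np

def sliding_window_fc(ts, win, step):
    """Sliding-window functional connectivity (a common dynamic-FC estimator).

    ts: (timepoints, regions). Returns one vectorized upper-triangular
    correlation matrix per window, usable as classifier features.
    """
    n_t, n_r = ts.shape
    iu = np.triu_indices(n_r, k=1)
    feats = []
    for start in range(0, n_t - win + 1, step):
        c = np.corrcoef(ts[start:start + win].T)
        feats.append(c[iu])
    return np.array(feats)

rng = np.random.default_rng(3)
ts = rng.standard_normal((200, 5))
ts[:, 1] = ts[:, 0] + 0.2 * rng.standard_normal(200)   # one coupled pair
F = sliding_window_fc(ts, win=40, step=10)
# F.shape == (17, 10): 17 windows, 10 region pairs
```

Each row of `F` is then a "connectome snapshot"; stacking them over time is what lets dynamic-FC classifiers exploit temporal variation that static FC averages away.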
Dynamic Functional Connectivity in the Musical Brain
Musical training causes structural and functional changes in the brain due to its sensory-motor demands. This leads to differences in how musicians perceive and process music as compared to non-musicians, thereby providing insights into brain adaptations and plasticity. Correlational studies and network analysis investigations have indicated the presence of large-scale brain networks involved in the processing of music and have highlighted differences between musicians and non-musicians. However, studies on functional connectivity in the brain during music listening tasks have thus far focused solely on static network analysis. Dynamic Functional Connectivity (DFC) studies have lately been …
Identifying musical pieces from fMRI data using encoding and decoding models.
Encoding models can reveal and decode neural representations in the visual and semantic domains. However, a thorough understanding of how distributed information in auditory cortices and the temporal evolution of music contribute to model performance is still lacking in the musical domain. We measured fMRI responses during naturalistic music listening and constructed a two-stage approach that first mapped musical features in auditory cortices and then decoded novel musical pieces. We then probed the influence of stimulus duration (number of time points) and spatial extent (number of voxels) on decoding accuracy. Our approach revealed a linear increase in accuracy with duration and a poin…
Effects of the Big Five and musical genre on music-induced movement
Nine-hundred-and-fifty-two individuals completed the Big Five Inventory, and 60 extreme scorers were presented with 30 music excerpts from six popular genres. Music-induced movement was recorded by an optical motion-capture system, the data from which 55 postural, kinematic, and kinetic movement features were computed. These features were subsequently reduced to five principal components of movement representing Local Movement, Global Movement, Hand Flux, Head Speed, and Hand Distance. Multivariate Analyses revealed significant effects on these components of both personality and genre, as well as several interactions between the two. Each personality dimension was associated with a differen…
Influences of rhythm- and timbre-related musical features on characteristics of music-induced movement
Music makes us move. Several factors can affect the characteristics of such movements, including individual factors or musical features. For this study, we investigated the effect of rhythm- and timbre-related musical features as well as tempo on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music, and instructed to move along with the music. Optical motion capture was used to record participants’ movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while the tempo was assessed in a perceptual experiment. A subsequent c…
Perceived complexity of western and African folk melodies by western and African listeners
Stylistic knowledge and enculturation play a significant role in music perception, although the importance of psychophysical cues in perception of emotions in music has been acknowledged. The psychophysical cues, such as melodic complexity, are assumed to be independent of musical experience. A cross-cultural comparison was used to investigate the ratings of melodic complexity of western and African participants for western (Experiment 1) and African folk songs (Experiment 2). A range of melodic complexity measures was developed to discover what factors contribute to complexity. On the whole, the groups gave similar patterns of responses in both experiments. In Experiment 1, western folk s…
Musicianship can be decoded from magnetic resonance images
Learning induces structural changes in the brain. Especially repeated, long-term behaviors, such as extensive training in playing a musical instrument, are likely to produce characteristic features in brain structure. However, it is not clear to what extent such structural features can be extracted from magnetic resonance images of the brain. Here we show that it is possible to predict whether a person is a musician or a non-musician based on the thickness of the cerebral cortex measured at 148 brain regions encompassing the whole cortex. Using a supervised machine-learning technique, we achieved a significant (κ = 0.321, p < 0.001) agreement between the actual and predicted par…
Cochlear implant users move in time to the beat of drum music.
Cochlear implant users show a profile of residual, yet poorly understood, musical abilities. An ability that has received little to no attention in this population is entrainment to a musical beat. We show for the first time that a heterogeneous group of cochlear implant users is able to find the beat and move their bodies in time to Latin Merengue music, especially when the music is presented in unpitched drum tones. These findings not only reveal a hidden capacity for feeling musical rhythm through the body in the deaf and hearing impaired population, but illuminate promising avenues for designing early childhood musical training that can engage implanted children in social musical activi…
Dance Like Someone is Watching
Although dancing often takes place in social contexts such as a club or party, previous study of such music-induced movement has focused mainly on individuals. The current study explores music-induced movement in a naturalistic dyadic context, focusing on the influence of personality, using five-factor model (FFM) traits, and trait empathy on participants’ responses to their partners. Fifty-four participants were recorded using motion capture while dancing to music excerpts alone and in dyads with three different partners, using a round-robin approach. Analysis using the Social Relations Model (SRM) suggested that the unique combination of each pair caused more variation in participants’ a…
Altered EEG Oscillatory Brain Networks During Music-Listening in Major Depression
To examine the electrophysiological underpinnings of the functional networks involved in music listening, approaches based on spatial independent component analysis (ICA) have recently been applied to ongoing electroencephalography (EEG) and magnetoencephalography (MEG). However, those studies focused on healthy subjects and did not examine group-level comparisons during music listening. Here, we combined group-level spatial Fourier ICA with acoustic feature extraction to enable group comparisons in frequency-specific brain networks of musical feature processing. It was then applied to healthy subjects and subjects with major depressive disorder (MDD). The music-induced oscil…
Effect of Enculturation on the Semantic and Acoustic Correlates of Polyphonic Timbre
Polyphonic timbre perception was investigated in a cross-cultural context wherein Indian and Western nonmusicians rated short Indian and Western popular music excerpts (1.5 s, n = 200) on eight bipolar scales. Intrinsic dimensionality estimation revealed a higher number of perceptual dimensions in the timbre space for music from one's own culture. Factor analyses of Indian and Western participants' ratings resulted in highly similar factor solutions. The acoustic features that predicted the perceptual dimensions were similar across the two participant groups. Furthermore, both the perceptual dimensions and their acoustic correlates matched closely with the results of a previous study perfor…
Speed on the dance floor: auditory and visual cues for musical tempo
Musical tempo is most strongly associated with the rate of the beat or “tactus,” which may be defined as the most prominent rhythmic periodicity present in the music, typically in a range of 1.67–2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingstone, 2004; Spence, 2015). A five-part experiment w…
Towards Multimodal MIR: Predicting individual differences from music-induced movement
As the field of Music Information Retrieval grows, it is important to take into consideration the multi-modality of music and how aspects of musical engagement such as movement and gesture might be taken into account. Bodily movement is universally associated with music and reflective of important individual features related to music preference such as personality, mood, and empathy. Future multimodal MIR systems may benefit from taking these aspects into account. The current study addresses this by identifying individual differences, specifically Big Five personality traits, and scores on the Empathy and Systemizing Quotients (EQ/SQ) from participants' free dance movements. Our model succe…
Neuroanatomical substrate of noise sensitivity.
Recent functional studies suggest that noise sensitivity, a trait describing attitudes towards noise and predicting noise annoyance, is associated with altered processing in the central auditory system. In the present work, we examined whether noise sensitivity could be related to the structural anatomy of auditory and limbic brain areas. Anatomical MR brain images of 80 subjects were parcellated with FreeSurfer to measure grey matter volume, cortical thickness, cortical area and folding index of anatomical structures in the temporal lobe and insular cortex. The grey matter volume of amygdala and hippocampus was measured as well. According to our findings, noise sensitivity is associated wi…
Large-scale brain networks emerge from dynamic processing of musical timbre, key and rhythm
We investigated the neural underpinnings of timbral, tonal, and rhythmic features of a naturalistic musical stimulus. Participants were scanned with functional Magnetic Resonance Imaging (fMRI) while listening to a stimulus with a rich musical structure, a modern tango. We correlated temporal evolutions of timbral, tonal, and rhythmic features of the stimulus, extracted using acoustic feature extraction procedures, with the fMRI time series. Results corroborate those obtained with controlled stimuli in previous studies and highlight additional areas recruited during musical feature processing. While timbral feature processing was associated with activations in cognitive areas of the cerebel…
Naturalistic music and dance: Cortical phase synchrony in musicians and dancers
Expertise in music has been investigated for decades, and the results have been applied not only in composition, performance, and music education, but also in understanding brain plasticity in a larger context. Several studies have revealed a strong connection between auditory and motor processes during music listening, performance, and imagination. Recently, as a logical next step from music to movement, the cognitive and affective neurosciences have turned towards expertise in dance. To understand the versatile and overlapping processes elicited by artistic stimuli such as music and dance, it is necessary to study them with continuous naturalistic stimuli. Thus, we used long exce…
Amusic does not mean unmusical: Beat perception and synchronization ability despite pitch deafness
Pitch deafness, the most commonly known form of congenital amusia, refers to a severe deficit in musical pitch processing (i.e., melody discrimination and recognition) that can leave time processing--including rhythm, metre, and "feeling the beat"--preserved. In Experiment 1, we show that by presenting musical excerpts in nonpitched drum timbres, rather than pitched piano tones, amusics show normal metre recognition. Experiment 2 reveals that body movement influences amusics' interpretation of the beat of an ambiguous drum rhythm. Experiment 3 and a subsequent exploratory study show an ability to synchronize movement to the beat of popular dance music and potential for improvement when give…
Personality and musical preference using social-tagging in excerpt-selection.
Music preference has been related to individual differences like social identity, cognitive style, and personality, but quantifying music preference can be a challenge. Self-report measures may be too presumptive of shared genre definitions between listeners, while listener ratings of expert-selected music may fail to reflect typical listeners’ genre boundaries. The current study aims to address this by using a social-tagging approach to select music for studying preference. In this study, 2,407 tracks were collected and subsampled from the Last.fm social-tagging service and the EchoNest platform based on attributes such as genre, tempo, and danceability. The set was further subsampled acco…
Tuning the brain for music
The Sound of Emotion
What is the effect of performers’ experienced emotions on the auditory characteristics of their performances? By asking performers to play a music phrase in response to three different instructions we attempted to answer this question. Performers were instructed to do the following: 1) play while focusing on the technical aspects of their playing; 2) give an expressive performance; and 3) focus on their experienced emotions, prior to which they were subjected to a sadness-inducing mood induction task. Performers were interviewed after each playing condition. We analyzed the tempo, articulation, dynamics, timbre, and vibrato of the performances obtained as well as the interview data. A focus…
Interaction features for prediction of perceptual segmentation: Effects of musicianship and experimental task
As music unfolds in time, structure is recognised and understood by listeners, regardless of their level of musical expertise. A number of studies have found spectral and tonal changes to quite successfully model boundaries between structural sections. However, the effects of musical expertise and experimental task on computational modelling of structure are not yet well understood. These issues need to be addressed to better understand how listeners perceive the structure of music and to improve automatic segmentation algorithms. In this study, computational prediction of segmentation by listeners was investigated for six musical stimuli via a real-time task and an annotation (non real-tim…
Diffusion map for clustering fMRI spatial maps extracted by Independent Component Analysis
Functional magnetic resonance imaging (fMRI) produces data about activity inside the brain, from which spatial maps can be extracted by independent component analysis (ICA). In datasets, there are n spatial maps that contain p voxels. The number of voxels is very high compared to the number of analyzed spatial maps. Clustering of the spatial maps is usually based on correlation matrices. This generally works well, although such a similarity matrix can inherently explain only a certain amount of the total variance contained in the high-dimensional data, where n is relatively small but p is large. For high-dimensional space, it is reasonable to perform dimensionality reduction before clustering.…
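The dimensionality-reduction step described above can be illustrated with a minimal diffusion-map sketch. This is a hypothetical illustration on random data, not the study's implementation: it builds an affinity kernel from the correlation matrix of the n spatial maps, normalizes it into a Markov matrix, and embeds each map into the leading non-trivial diffusion coordinates before clustering.

```python
# Hypothetical diffusion-map embedding of n spatial maps (rows of X,
# n maps x p voxels) prior to clustering; parameters are illustrative.
import numpy as np

def diffusion_map(X, eps=1.0, n_dims=2):
    C = np.corrcoef(X)                      # n x n correlation matrix
    D = 1.0 - C                             # correlation distance
    K = np.exp(-D**2 / eps)                 # Gaussian affinity kernel
    P = K / K.sum(axis=1, keepdims=True)    # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)          # largest eigenvalue first
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Skip the trivial first eigenvector (constant, eigenvalue 1)
    return vecs[:, 1:n_dims + 1] * vals[1:n_dims + 1]

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 500))          # 20 maps, 500 voxels
emb = diffusion_map(X)
print(emb.shape)                            # (20, 2)
```

The low-dimensional coordinates in `emb` can then be passed to any standard clustering routine, which is the general workflow the abstract describes.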
Visualization of tonal content with self-organizing maps and self-similarity matrices
This article presents a dynamic model of tonality perception based on a short-term memory model and a self-organizing map (SOM). The model can be used for dynamic visualization of perceived tonal content, making it possible to examine the clarity and locus of tonality at any given point of time. This article also presents a method for the visualization of tonal structure using self-similarity matrices. The methods are applied to compositions of J. S. Bach, S. Barber, and J. Pachelbel. Finally, a real-time application embracing the tonality perception model is presented.
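A self-organizing map of the kind used in the tonality model above can be sketched in a few lines. The following is a generic Kohonen SOM training loop on random data, not the article's model: a 2-D grid of weight vectors is trained with a best-matching-unit search and a shrinking Gaussian neighborhood; grid size, learning rate, and input dimensionality are all illustrative assumptions.

```python
# Minimal self-organizing map sketch (standard Kohonen training),
# trained here on hypothetical 12-dimensional input vectors.
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.standard_normal((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1 - t)                    # decaying learning rate
            sigma = sigma0 * (1 - t) + 0.5        # shrinking neighborhood
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
            g = np.exp(-np.sum((coords - np.array(bmu))**2, axis=-1)
                       / (2 * sigma**2))          # Gaussian neighborhood
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

data = np.random.default_rng(1).standard_normal((100, 12))
som = train_som(data)
print(som.shape)  # (8, 8, 12)
```

Projecting each incoming input onto its best-matching unit over time yields the kind of dynamic activation trajectory that the article visualizes for perceived tonal content.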
Dance moves reflect current affective state illustrative of approach–avoidance motivation.
Emotions of music listening in Finland and in India : Comparison of an individualistic and a collectivistic culture
Music is appreciated for emotional reasons across cultures, but knowledge on the cross-cultural similarities and differences of music-evoked emotions is still sparse. The current study compared music-evoked emotions in Finland and in India, contextualizing them within the perceived psychological functionality of music in an individualistic versus collectivistic culture. Participants (N = 230) answered an online survey on music-evoked emotions and related personal meanings. A mixed-method approach using factor analysis and qualitative content analysis was used to identify the concepts for cross-cultural comparison. Results show that both cultures value music for positive emotional experien…
Influence of Musical Expertise on the processing of Musical Features in a Naturalistic Setting
Musical training causes structural and functional changes in the brain due to its sensory-motor demands, but the modulatory effect of musical training on musical feature processing in a continuous music listening paradigm has not been investigated thus far. In this work, we investigate the differences between musicians and non-musicians in the encoding of musical features encompassing musical timbre, rhythm, and tone. Eighteen musicians and 18 non-musicians were scanned using fMRI while listening to three varied stimuli. Acoustic features corresponding to timbre, rhythm and tone were computationally extracted from the stimuli and correlated with brain responses, followed by t-tests on grou…
Coupling of Action-Perception Brain Networks during Musical Pulse Processing: Evidence from Region-of-Interest-Based Independent Component Analysis
Our sense of rhythm relies on orchestrated activity of several cerebral and cerebellar structures. Although functional connectivity studies have advanced our understanding of rhythm perception, this phenomenon has not been sufficiently studied as a function of musical training and beyond the General Linear Model (GLM) approach. Here, we studied pulse clarity processing during naturalistic music listening using a data-driven approach (independent component analysis; ICA). Participants’ (18 musicians and 18 controls) functional magnetic resonance imaging (fMRI) responses were acquired while listening to music. A targeted region of interest (ROI) related to pulse clarity processing was defined…
Dance to your own drum: Identification of musical genre and individual dancer from motion capture using machine learning
Machine learning has been used to accurately classify musical genre using features derived from audio signals. Musical genre, as well as lower-level audio features of music, have also been shown to...
Modeling the Target-Note Technique of Bebop-Style Jazz Improvisation: An Artificial Neural Network Approach
In cognitive science and research on artificial intelligence, there are two central paradigms: symbolic and analogical. Within the analogical paradigm, artificial neural networks (ANNs) have recently been successfully used to model and simulate cognitive phenomena. One of the most prominent features of ANNs is their ability to learn by example and, to a certain extent, generalize what they have learned. Improvisation, the art of spontaneously creating music while playing or singing, fundamentally has an imitative nature. Regardless of how much one studies and analyzes, the art of improvisation is learned mostly by example. Instead of memorizing explicit rules, the student mimics the playing…
Fractionating auditory priors: A neural dissociation between active and passive experience of musical sounds
Learning, attention and action play a crucial role in determining how stimulus predictions are formed, stored, and updated. Years-long experience with the specific repertoires of sounds of one or more musical styles is what characterizes professional musicians. Here we contrasted professional musicians' active experience of sounds, namely long-lasting motor practice, theoretical study, and engaged listening to the acoustic features characterizing a musical style of choice, with laypersons' mainly passive experience of sounds. We hypothesized that long-term active experience of sounds would influence the neural predictions of the stylistic features in professional musicians in a distinct…
Response to Discussion on Y. Zhu, X. Wang, K. Mathiak, P. Toiviainen, T. Ristaniemi, J. Xu, Y. Chang and F. Cong, Altered EEG Oscillatory Brain Networks During Music-Listening in Major Depression, International Journal of Neural Systems, Vol. 31, No. 3 (2021) 2150001 (14 pages).
Optimizing self-organizing timbre maps: Two approaches
The effect of using different auditory images and distance metrics on the final configuration of a self-organized timbre map is examined by comparing distance matrices obtained from simulations with a similarity rating matrix, obtained using the same set of stimuli as in the simulations. Two approaches are described. In the static approach, each stimulus is represented as a single multi-component vector. Gradient images, which are intended to represent idealizations of physiological gradient maps in the auditory pathway, are constructed. The optimal auditory image and distance metric, with respect to the similarity rating data, are searched using the gradient method. In the dynamic approach…
Exploring relationships between audio features and emotion in music
In this paper, we present an analysis of the associations between emotion categories and audio features automatically extracted from raw audio data. This work is based on 110 excerpts from film soundtracks evaluated by 116 listeners. This data is annotated with 5 basic emotions (fear, anger, happiness, sadness, tenderness) on a 7 points scale. Exploiting state-of-the-art Music Information Retrieval (MIR) techniques, we extract audio features of different kind: timbral, rhythmic and tonal. Among others we also compute estimations of dissonance, mode, onset rate and loudness. We study statistical relations between audio descriptors and emotion categories confirming results from psychological …
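The core analysis described above, relating extracted audio descriptors to emotion ratings, amounts to computing correlations between per-excerpt feature values and per-excerpt mean ratings. A minimal sketch on hypothetical random data (the feature and emotion names mirror the abstract; the values are not the study's data):

```python
# Illustrative sketch: correlating audio descriptors with emotion
# ratings across excerpts. All data here is randomly generated.
import numpy as np

rng = np.random.default_rng(0)
n_excerpts = 110
features = {"onset_rate": rng.random(n_excerpts),
            "loudness": rng.random(n_excerpts),
            "dissonance": rng.random(n_excerpts)}
ratings = {"fear": rng.random(n_excerpts),
           "happiness": rng.random(n_excerpts)}

for f_name, f in features.items():
    for e_name, e in ratings.items():
        r = np.corrcoef(f, e)[0, 1]         # Pearson correlation
        print(f"{f_name} vs {e_name}: r = {r:+.2f}")
```

In practice each correlation would be accompanied by a significance test and a correction for multiple comparisons, given the number of feature-emotion pairs involved.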
From Vivaldi to Beatles and back: predicting lateralized brain responses to music.
We aimed at predicting the temporal evolution of brain activity in naturalistic music listening conditions using a combination of neuroimaging and acoustic feature extraction. Participants were scanned using functional Magnetic Resonance Imaging (fMRI) while listening to two musical medleys, including pieces from various genres with and without lyrics. Regression models were built to predict voxel-wise brain activations which were then tested in a cross-validation setting in order to evaluate the robustness of the hence created models across stimuli. To further assess the generalizability of the models we extended the cross-validation procedure by including another dataset, which comprised …
Tapping doesn't help: Synchronized self-motion and judgments of musical tempo.
For both musicians and music psychologists, beat rate (BPM) has often been regarded as a transparent measure of musical speed or tempo, yet recent research has shown that tempo is more than just BPM. In a previous study, London, Burger, Thompson, and Toiviainen (Acta Psychologica, 164, 70–80, 2016) presented participants with original as well as “time-stretched” versions of classic R&B songs; time stretching slows down or speeds up a recording without changing its pitch or timbre. In that study we discovered a tempo anchoring effect (TAE): Although relative tempo judgments (original vs. time-stretched versions of the same song) were correct, they were at odds with BPM rates of each stimulus…
Empathy, Entrainment, and Perceived Interaction in Complex Dyadic Dance Movement
The current study explores how individuals' tendency to empathize with others (trait empathy) modulates interaction and social entrainment in dyadic dance in a free movement context using perceptual and computationally derived measures. Stimuli consisting of 24 point-light animations were created using motion capture data selected from a sample of 99 dyads, based on self-reported trait empathy. Individuals whose Empathy Quotient (EQ) scores were in the top or bottom quartile of all scores were considered to have high or low empathy, respectively, and twelve dyads comprised of four high-high, four low-low, and four high-low empathy combinations were identified. Animations of these dyads were…
Exploiting ongoing EEG with multilinear partial least squares during free-listening to music
Determining the stimulus-relevant brain activity during real-world experiences is highly appealing but very challenging, particularly in electroencephalography. Here, spectrograms of one participant's ongoing electroencephalogram (EEG) formed a third-order tensor with the three factors of time, frequency, and space; the stimulus data, consisting of acoustic features derived from the naturalistic and continuous music, formed a matrix with the two factors of time and number of features. Multilinear partial least squares (PLS), conforming to the canonical polyadic (CP) model, was then performed on the tensor and the matrix to decompose the ongoing EEG. Consequently, we …
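The canonical polyadic (CP) model underlying multilinear PLS factorizes a third-order tensor into per-mode loading matrices. The following is a simplified CP decomposition by alternating least squares on synthetic data; it illustrates the tensor model only, not the study's N-way PLS algorithm, and all sizes and ranks are illustrative.

```python
# Illustrative CP (canonical polyadic) decomposition of a third-order
# tensor by alternating least squares; a sketch, not N-way PLS itself.
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: (I*J) x R
    return np.einsum("ir,jr->ijr", A, B).reshape(-1, A.shape[1])

def cp_als(T, rank=3, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    T0 = T.reshape(I, -1)                       # mode-1 unfolding
    T1 = np.moveaxis(T, 1, 0).reshape(J, -1)    # mode-2 unfolding
    T2 = np.moveaxis(T, 2, 0).reshape(K, -1)    # mode-3 unfolding
    for _ in range(n_iter):                     # alternate over modes
        A = T0 @ np.linalg.pinv(khatri_rao(B, C)).T
        B = T1 @ np.linalg.pinv(khatri_rao(A, C)).T
        C = T2 @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C

rng = np.random.default_rng(1)
T = np.einsum("ir,jr,kr->ijk",
              rng.standard_normal((10, 2)),     # e.g. time factor
              rng.standard_normal((8, 2)),      # e.g. frequency factor
              rng.standard_normal((6, 2)))      # e.g. spatial factor
A, B, C = cp_als(T, rank=2)
print(A.shape, B.shape, C.shape)  # (10, 2) (8, 2) (6, 2)
```

In the PLS setting, the temporal mode of the tensor decomposition is additionally constrained to covary with the stimulus feature matrix, which is the step this sketch omits.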
Connectivity patterns during music listening: Evidence for action-based processing in musicians
Musical expertise is visible both in the morphology and functionality of the brain. Recent research indicates that functional integration between multi-sensory, somato-motor, default-mode (DMN), and salience (SN) networks of the brain differentiates musicians from non-musicians during resting state. Here, we aimed at determining whether brain networks differentially exchange information in musicians as opposed to non-musicians during naturalistic music listening. Whole-brain graph-theory analyses were performed on participants' fMRI responses. Group-level differences revealed that musicians' primary hubs comprised cerebral and cerebellar sensorimotor regions whereas non-musicians' dominant …
A matlab toolbox for music information retrieval
We present MIRToolbox, an integrated set of functions written in Matlab dedicated to the extraction from audio files of musical features related, among others, to timbre, tonality, rhythm, or form. The objective is to offer the state of the art of computational approaches in the area of Music Information Retrieval (MIR). The design is based on a modular framework: the different algorithms are decomposed into stages, formalized using a minimal set of elementary mechanisms, and integrate different variants proposed by alternative approaches, including new strategies we have developed, which users can select and parametrize. These functions accept a wide range of input objects.
Effects of musicianship and experimental task on perceptual segmentation
The perceptual structure of music is a fundamental issue in music psychology that can be systematically addressed via computational models. This study estimated the contribution of spectral, rhythmic and tonal descriptors for prediction of perceptual segmentation across stimuli. In a real-time task, 18 musicians and 18 non-musicians indicated perceived instants of significant change for six ongoing musical stimuli. In a second task, 18 musicians parsed the same stimuli using audio editing software to provide non-real-time segmentation annotations. We built computational models based on a non-linear fuzzy integration of basic and interaction descriptors of local musical novelty. We found tha…
Optimizing auditory images and distance metrics for self‐organizing timbre maps*
Abstract The effect of using different auditory images and distance metrics on the final configuration of a self‐organized timbre map is examined by comparing distance matrices, obtained from simulations, with a similarity rating matrix, obtained using the same set of stimuli as in the simulations. Gradient images, which are intended to represent idealizations of physiological gradient maps in the auditory pathway, are constructed. The optimal auditory image and distance metric, with respect to the similarity rating data, are searched using the gradient method.
See how it feels to move: Relationships between movement characteristics and perception of emotions in dance
Music makes humans move in ways found to relate to, for instance, musical characteristics, personality, or emotional content of the music. In this study, we investigated associations between embodiments of musical emotions and the perception thereof. After collecting motion capture data of dancers moving to emotionally distinct musical stimuli, silent stick-figure animations were rated by a set of observers regarding perceived discrete emotions, while 10 movement features were computationally extracted from the motion capture data. Results indicate kinematic profiles—emotion-specific sets of movement characteristics—that furthermore conform with dimensional models of valence and arousal, su…
Motown, Disco, and Drumming : An Exploration of the Relationship Between Beat Salience, Melodic Structure, and Perceived Tempo
In a study of tempo perception, London, Burger, Thompson, and Toiviainen (2016) presented participants with digitally "tempo-shifted" R&B songs (i.e., sped up or slowed down without otherwise altering their pitch or timbre). They found that while participants' relative tempo judgments of original versus altered versions were correct, they no longer corresponded to the beat rate of each stimulus. Here we report on three experiments that further probe the relation(s) between beat rate, tempo-shifting, beat salience, melodic structure, and perceived tempo. Experiment 1 is a replication of London et al. (2016) using the original stimuli. Experiment 2 replaces the Motown stimuli with disco mus…
sj-docx-1-pom-10.1177_03057356221084368 – Supplemental material for Musical interaction in music therapy for depression treatment
Supplemental material, sj-docx-1-pom-10.1177_03057356221084368 for Musical interaction in music therapy for depression treatment by Martin Hartmann, Anastasios Mavrolampados, Petri Toiviainen, Suvi Saarikallio, Katrien Foubert, Olivier Brabant, Nerdinga Snape, Esa Ala-Ruona, Christian Gold and Jaakko Erkkilä in Psychology of Music
Decoding Musical Training from Dynamic Processing of Musical Features in the Brain
Pattern recognition on neural activations from naturalistic music listening has been successful at predicting neural responses of listeners from musical features, and vice versa. Inter-subject differences in the decoding accuracies have arisen partly from musical training that has widely recognized structural and functional effects on the brain. We propose and evaluate a decoding approach aimed at predicting the musicianship class of an individual listener from dynamic neural processing of musical features. Whole brain functional magnetic resonance imaging (fMRI) data was acquired from musicians and nonmusicians during listening of three musical pieces from different genres. Six mus…
Embodied Metre in Spontaneous Movement to Music
Listening to music is often associated with spontaneous body movements, frequently synchronized with its periodic structure. The notion of embodied cognition assumes that intelligent behavior does not emerge from mere passive perception, but requires goal-directed interactions between the organism and its environment. According to this view, one could postulate that we use our bodily movements to help parse the metric structure of music. The aim of the study was to investigate how pulsations on different metrical levels are manifested in spontaneous movement to music. Participants were presented with a piece of instrumental music in 4/4 time, played in five different tempi ranging from 92 B…
Hot or Not? Personality and attraction on the dance floor
Previous research has shown that personality plays a significant role in interpersonal attraction. We took this issue to the dance floor, and investigated how personality characteristics of both observers and dancers affect the former’s attractiveness ratings of the latter. Sixty-two heterosexual adult participants watched 48 short audio-visual point-light animations of eight male and eight female adults dancing to Techno, Pop, and Latin music. Participants rated perceived skill of each dancer, and the likelihood with which they would go on a date with them. Both dancers’ and observers’ personality characteristics were assessed using the Big Five Inventory. Multivariate analyses of variance…
On Happy Dance : Emotion Recognition in Dance Movements
Movements are capable of conveying emotions, as shown for instance in studies on both non-verbal gestures and music-specific movements performed by instrumentalists or professional dancers. Since dancing/moving to music is a common human activity, this study aims at investigating whether quasi-spontaneous music-induced movements of non-professional dancers can convey emotional qualities as well. From a movement data pool of 60 individuals dancing to 30 musical stimuli, the performances of four dancers that moved most notably, and four stimuli representing happiness, anger, sadness, and tenderness were chosen to create a set of stimuli containing the four audio excerpts, 16 video excerpts (w…
MATLAB codes implementing the generalized cross-wavelet transform (GXWT) algorithm described in the paper "Analyzing multidimensional movement interaction with generalized cross-wavelet transform" (Toiviainen & Hartmann, 2021)
MATLAB codes implementing the generalized cross-wavelet transform (GXWT) algorithm described in the paper "Analyzing multidimensional movement interaction with generalized cross-wavelet transform" (Toiviainen & Hartmann, 2021). Basic workflow for multivariate (time by channel) signals d1 and d2:

[w1,f] = cwtensor(d1,FS,MINF,MAXF);
[w2,f] = cwtensor(d2,FS,MINF,MAXF);
[xs p1 p2] = getxwt(w1,w2);

NOTE: cwtensor.m requires the Wavelet Toolbox for MATLAB.