Search results for "Timbre"
Showing 10 of 46 documents
Semantic structures of timbre emerging from social and acoustic descriptions of music
2011
The perceptual attributes of timbre have inspired a considerable amount of multidisciplinary research, but because of the complexity of the phenomena, the approach has traditionally been confined to laboratory conditions, much to the detriment of its ecological validity. In this study, we present a purely bottom-up approach for mapping the concepts that emerge from sound qualities. A social media service ( http://www.last.fm ) is used to obtain a wide sample of verbal descriptions of music (in the form of tags) that go beyond the commonly studied concept of genre, and from this the underlying semantic structure of the sample is extracted. The structure that is thereby obtained is then evaluated th…
Earlier timbre processing of instrumental tones compared to equally complex spectrally rotated sounds as revealed by the mismatch negativity.
2014
Harmonically rich sounds have been shown to be processed more efficiently by the human brain than single sinusoidal tones. To control for stimulus complexity as a potentially confounding factor, tones and equally complex spectrally rotated sounds were used in the present study to investigate the role of the overtone series in sensory auditory processing in non-musicians. Timbre differences in instrumental tones with equal pitch elicited an MMN that was earlier than that elicited by the spectrally rotated sounds, indicating that harmonically rich tones are processed faster than non-musical sounds without an overtone series, even when pitch is not the relevant infor…
The sound of music: differentiating musicians using a fast, musical multi-feature mismatch negativity paradigm.
2011
Abstract Musicians’ skills in auditory processing depend strongly on their instrument, performance practice, and level of expertise. Yet it is not known whether the style/genre of music might shape auditory processing in the brains of musicians. Here, we aimed at tackling the role of musical style/genre in modulating neural and behavioral responses to changes in musical features. Using a novel, fast and musical-sounding multi-feature paradigm, we measured the mismatch negativity (MMN), a pre-attentive brain response, to six types of musical feature change in musicians playing three distinct styles of music (classical, jazz, rock/pop) and in non-musicians. Jazz and classical musicians sco…
Tapping doesn't help: Synchronized self-motion and judgments of musical tempo.
2019
For both musicians and music psychologists, beat rate (BPM) has often been regarded as a transparent measure of musical speed or tempo, yet recent research has shown that tempo is more than just BPM. In a previous study, London, Burger, Thompson, and Toiviainen (Acta Psychologica, 164, 70–80, 2016) presented participants with original as well as “time-stretched” versions of classic R&B songs; time stretching slows down or speeds up a recording without changing its pitch or timbre. In that study we discovered a tempo anchoring effect (TAE): Although relative tempo judgments (original vs. time-stretched versions of the same song) were correct, they were at odds with BPM rates of each stimulus…
Decoding Musical Training from Dynamic Processing of Musical Features in the Brain
2018
Abstract Pattern recognition on neural activations from naturalistic music listening has been successful at predicting neural responses of listeners from musical features, and vice versa. Inter-subject differences in the decoding accuracies have arisen partly from musical training, which has widely recognized structural and functional effects on the brain. We propose and evaluate a decoding approach aimed at predicting the musicianship class of an individual listener from dynamic neural processing of musical features. Whole-brain functional magnetic resonance imaging (fMRI) data was acquired from musicians and non-musicians during listening to three musical pieces from different genres. Six mus…
The Sound of Emotion
2014
What is the effect of performers’ experienced emotions on the auditory characteristics of their performances? By asking performers to play a music phrase in response to three different instructions we attempted to answer this question. Performers were instructed to do the following: 1) play while focusing on the technical aspects of their playing; 2) give an expressive performance; and 3) focus on their experienced emotions, prior to which they were subjected to a sadness-inducing mood induction task. Performers were interviewed after each playing condition. We analyzed the tempo, articulation, dynamics, timbre, and vibrato of the performances obtained as well as the interview data. A focus…
A matlab toolbox for music information retrieval
2008
We present MIRToolbox, an integrated set of functions written in Matlab, dedicated to extracting from audio files musical features related, among others, to timbre, tonality, rhythm, and form. The objective is to offer a state of the art of computational approaches in the area of Music Information Retrieval (MIR). The design is based on a modular framework: the different algorithms are decomposed into stages, formalized using a minimal set of elementary mechanisms, and integrating different variants proposed by alternative approaches, including new strategies we have developed, that users can select and parametrize. These functions can be applied to a wide range of input objects.
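The kind of timbre feature such a toolbox extracts can be illustrated with the spectral centroid, the amplitude-weighted mean frequency of a sound, often used as a "brightness" descriptor. The following is a minimal Python sketch, not MIRToolbox itself (which is Matlab), using a naive DFT on a synthetic tone purely for illustration; real toolboxes use an FFT with windowing:

```python
import math

def dft_magnitudes(signal):
    """Naive DFT magnitude spectrum up to the Nyquist bin (illustration only)."""
    n = len(signal)
    mags = []
    for k in range(n // 2):
        re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
        mags.append(math.hypot(re, im))
    return mags

def spectral_centroid(signal, sample_rate):
    """Amplitude-weighted mean frequency: a common 'brightness' timbre feature."""
    mags = dft_magnitudes(signal)
    freqs = [k * sample_rate / len(signal) for k in range(len(mags))]
    total = sum(mags)
    return sum(f * m for f, m in zip(freqs, mags)) / total if total else 0.0

# A pure 500 Hz sine (chosen to fall exactly on a DFT bin at 8 kHz, N = 512):
# the centroid lands on the tone's frequency.
sr = 8000
tone = [math.sin(2 * math.pi * 500 * t / sr) for t in range(512)]
print(spectral_centroid(tone, sr))  # → approximately 500.0
```

A harmonically rich tone with strong upper partials would pull the centroid upward, which is why the feature tracks perceived brightness.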
Event-related brain responses while listening to entire pieces of music
2017
Brain responses to discrete short sounds have been studied intensively using the event-related potential (ERP) method, in which the electroencephalogram (EEG) signal is divided into epochs time-locked to stimuli of interest. Here we introduce and apply a novel technique which enables one to isolate ERPs in humans elicited by continuous music. The ERPs were recorded during listening to a Tango Nuevo piece, a deep techno track, and an acoustic lullaby. Acoustic features related to timbre, harmony, and dynamics of the audio signal were computationally extracted from the musical pieces. Negative deflection occurring around 100 milliseconds after the stimulus onset (N100) and positive deflection occ…
Comprehensive auditory discrimination profiles recorded with a fast parametric musical multi-feature mismatch negativity paradigm
2016
Abstract Objective Mismatch negativity (MMN), a component of the auditory event-related potential (ERP) in response to auditory-expectancy violation, is sensitive to central auditory processing deficits associated with several clinical conditions and to auditory skills deriving from musical expertise. This sensitivity is more evident for stimuli integrated in complex sound contexts. This study tested whether increasing magnitudes of deviation (levels) entail increasing MMN amplitude (or decreasing latency), aiming to create a balanced version of the musical multi-feature paradigm towards measurement of extensive auditory discrimination profiles in auditory expertise or deficits. Methods Usi…
Harmonic priming in an amusic patient: the power of implicit tasks.
2008
Our study investigated with an implicit method (i.e., a priming paradigm) whether I.R. - a brain-damaged patient exhibiting severe amusia - processes musical structures implicitly. The task consisted of identifying one of two phonemes (Experiment 1) or timbres (Experiment 2) on the last chord of eight-chord sequences (i.e., the target). The targets were harmonically related or less related to the prior chords. I.R. displayed harmonic priming effects: phoneme and timbre identification was faster for related than for less related targets (Experiments 1 and 2). However, I.R.'s explicit judgements of completion for the same sequences did not differ between related and less related contexts (Experimen…