Search results for "auditor."
Showing 10 of 737 documents
Auditory event-related potentials over medial frontal electrodes express both negative and positive prediction errors
2015
While the neuronal activation in the medial frontal cortex is thought to reflect higher-order evaluation processes of reward prediction errors when a reward deviates from our expectation, there is increasing evidence that the medial frontal activity might express prediction errors in general. However, given that several studies examined the medial frontal event-related potentials (ERPs) by comparing signals triggered by different stimuli and different anticipations, it remains an open question whether the medial frontal signals are sensitive to the valence of prediction errors. Here we orthogonally manipulated expectation magnitude (i.e., large/small expectation) and…
Effects of Selective Attention on Syntax Processing in Music and Language
2010
The present study investigated the effects of auditory selective attention on the processing of syntactic information in music and speech using event-related potentials. Spoken sentences or musical chord sequences were presented either in isolation or simultaneously. When both were presented simultaneously, participants had to focus their attention either on the speech or on the music. Final words of sentences and final harmonies of chord sequences were syntactically either correct or incorrect. Irregular chords elicited an early right anterior negativity (ERAN), whose amplitude was decreased when music was presented simultaneously with speech, compared to when only music was presented. However, t…
The Effect of Adaptive Nonlinear Frequency Compression on Phoneme Perception
2017
Purpose: This study implemented a fitting method, developed for use with frequency-lowering hearing aids, across multiple testing sites, participants, and hearing aid conditions to evaluate speech perception with a novel type of frequency lowering. Method: A total of 8 participants, including children and young adults, took part in real-world hearing aid trials. A blinded crossover design, including posttrial withdrawal testing, was used to assess aided phoneme perception. The hearing aid conditions included adaptive nonlinear frequency compression (NFC), static NFC, and conventional processing. Results: Enabling either adaptive NFC or static NFC improved group-level detection and recognit…
Synchronization to metrical levels in music depends on low-frequency spectral components and tempo
2016
Previous studies have found relationships between music-induced movement and musical characteristics at more general levels, such as tempo or pulse clarity. This study focused on the ability to synchronize with music of finely varying tempi and varying degrees of low-frequency spectral change/flux. Excerpts from six classic Motown/R&B songs at three different tempos (105, 115, and 130 BPM) were used as stimuli in this experiment. Each was then time-stretched by 5% relative to the original tempo, yielding a total of 12 stimuli that were presented to 30 participants. Participants were asked to move along with the stimuli while being recorded with an optical motion capture system.…
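(Note on the tempo arithmetic, added for orientation; the snippet does not state whether excerpts were sped up or slowed down. A 5% change moves a 115 BPM excerpt to roughly 115 × 1.05 ≈ 121 BPM if faster, or 115 / 1.05 ≈ 110 BPM if slower; six excerpts, each presumably in two tempo versions, account for the stated 12 stimuli.)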
The auditory N1 suppression rebounds as prediction persists over time
2016
The predictive coding model of perception proposes that neuronal responses reflect prediction errors. Repeated as well as predicted stimuli trigger suppressed neuronal responses because they are associated with reduced prediction errors. However, many predictable events in our environment are not isolated but sequential, yet there is little empirical evidence documenting how suppressed neuronal responses reflecting reduced prediction errors change in the course of a predictable sequence of events. Here we conceived an auditory electroencephalography (EEG) experiment where prediction persists over series of four tones to allow for the delineation of the dynamics of th…
Speed on the dance floor: auditory and visual cues for musical tempo
2016
Musical tempo is most strongly associated with the rate of the beat or “tactus,” which may be defined as the most prominent rhythmic periodicity present in the music, typically in a range of 1.67–2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingstone, 2004; Spence, 2015). A five-part experiment w…
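(Note on the beat-rate range, added for orientation and not part of the original abstract: beats per minute = 60 × beat frequency in Hz, so 1.67–2 Hz corresponds to roughly 100–120 BPM, e.g. 2 Hz × 60 = 120 BPM.)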
Processing of auditory stimuli during tonic and phasic periods of REM sleep as revealed by event-related brain potentials
1996
The brain has been reported to be more preoccupied with dreams during phasic than during tonic REM sleep. Whether these periods also differ in the processing of external stimuli was examined. Event-related brain potentials (ERPs) to a frequent standard tone of 1000 Hz (P = 97%) and infrequent deviant tones of 1100 and 2000 Hz (P = 1.5% for each) were recorded (n = 13) during wakefulness and nocturnal sleep. An ERP wave (called REM-P3) resembling a waking P3 wave was larger for the 2000 Hz deviant during tonic than during phasic REM sleep. The P210 wave was also larger during tonic than during phasic REM sleep. A reliable mismatch negativity component appeared only in wakefulness. I…
The feeling of familiarity for music in patients with a unilateral temporal lobe lesion: A gating study
2015
Previous research has indicated that the medial temporal lobe (MTL), and more specifically the perirhinal cortex, plays a role in the feeling of familiarity for non-musical stimuli. Here, we examined the contribution of the MTL to the feeling of familiarity for music by testing patients with unilateral MTL lesions. We used a gating paradigm: segments of familiar and unfamiliar musical excerpts were played with increasing durations (250, 500, 1000, 2000, and 4000 ms, and complete excerpts), and participants provided familiarity judgments for each segment. Based on the hypothesis that patients might need longer segments than healthy controls (HC) to identify excerpts as familia…
Speech- and sound-segmentation in dyslexia: evidence for a multiple-level cortical impairment
2006
Developmental dyslexia involves deficits in the visual and auditory domains, but is primarily characterized by an inability to translate the written linguistic code to the sound structure. Recent research has shown that auditory dysfunctions in dyslexia might originate from impairments in early pre-attentive processes, which affect behavioral discrimination. Previous studies have shown that whereas dyslexic individuals are deficient in discriminating sound distinctions involving consonants or simple pitch changes, discrimination of other sound aspects, such as tone duration, is intact. We hypothesized that such contrasts that can be discriminated by dyslexic individuals when heard in isolat…
Learning-induced neural plasticity of speech processing before birth
2013
Learning, the foundation of adaptive and intelligent behavior, is based on plastic changes in neural assemblies, reflected by the modulation of electric brain responses. In infancy, auditory learning implicates the formation and strengthening of neural long-term memory traces, improving discrimination skills, in particular those forming the prerequisites for speech perception and understanding. Although previous behavioral observations show that newborns react differentially to unfamiliar sounds vs. familiar sound material that they were exposed to as fetuses, the neural basis of fetal learning has not thus far been investigated. Here we demonstrate direct neural correlates of human fetal l…