WorldWideScience

Sample records for auditory attention activates

  1. Auditory attention activates peripheral visual cortex.

    Directory of Open Access Journals (Sweden)

    Anthony D Cate

    Full Text Available BACKGROUND: Recent neuroimaging studies have revealed that putatively unimodal regions of visual cortex can be activated during auditory tasks in sighted as well as in blind subjects. However, the task determinants and functional significance of auditory occipital activations (AOAs) remain unclear. METHODOLOGY/PRINCIPAL FINDINGS: We examined AOAs in an intermodal selective attention task to distinguish whether they were stimulus-bound or recruited by higher-level cognitive operations associated with auditory attention. Cortical surface mapping showed that auditory occipital activations were localized to retinotopic visual cortex subserving the far peripheral visual field. AOAs depended strictly on the sustained engagement of auditory attention and were enhanced in more difficult listening conditions. In contrast, unattended sounds produced no AOAs regardless of their intensity, spatial location, or frequency. CONCLUSIONS/SIGNIFICANCE: Auditory attention, but not passive exposure to sounds, routinely activated peripheral regions of visual cortex when subjects attended to sound sources outside the visual field. Functional connections between auditory cortex and visual cortex subserving the peripheral visual field appear to underlie the generation of AOAs, which may reflect the priming of visual regions to process soon-to-appear objects associated with unseen sound sources.

  2. Dissociable influences of auditory object vs. spatial attention on visual system oscillatory activity.

    Directory of Open Access Journals (Sweden)

    Jyrki Ahveninen

    Full Text Available Given that both auditory and visual systems have anatomically separate object identification ("what") and spatial ("where") pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory "what" vs. "where" attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic ("what") vs. spatial ("where") aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7-13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri; lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location centered at the alpha range 400-600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual system oscillatory activity during auditory attention to sound object identity ("what") vs. sound location ("where"). The alpha modulations could be interpreted to reflect enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during "what" vs. "where" auditory attention.
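
    The alpha-band (7-13 Hz) power comparisons above reduce to estimating spectral power within a frequency band. Below is a minimal, illustrative sketch in plain Python on a synthetic signal; it is not the authors' MEG pipeline, which used cortically constrained source estimates. The sampling rate and component frequencies are arbitrary choices for the demonstration.

```python
import math

def band_power(x, fs, f_lo, f_hi):
    """Mean squared DFT magnitude over the frequency bins in [f_lo, f_hi].

    Plain single-bin DFTs for clarity; a real MEG pipeline would use
    FFTs, tapering, and source-space projection.
    """
    n = len(x)
    df = fs / n                                  # bin spacing in Hz
    bins = range(math.ceil(f_lo / df), math.floor(f_hi / df) + 1)
    powers = []
    for k in bins:
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        powers.append((re * re + im * im) / n)
    return sum(powers) / len(powers)

# Synthetic trace: a 10 Hz "alpha" rhythm plus a weaker 40 Hz component.
fs, dur = 250, 2.0
n = int(fs * dur)
sig = [math.sin(2 * math.pi * 10 * t / fs) + 0.2 * math.sin(2 * math.pi * 40 * t / fs)
       for t in range(n)]
alpha_power = band_power(sig, fs, 7, 13)
gamma_power = band_power(sig, fs, 35, 45)        # much smaller than alpha_power
```

    With this kind of estimate in hand, "stronger alpha power under condition A than B" is simply a comparison of `band_power` values per region and condition.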

  3. Association of blood antioxidants status with visual and auditory sustained attention.

    Science.gov (United States)

    Shiraseb, Farideh; Siassi, Fereydoun; Sotoudeh, Gity; Qorbani, Mostafa; Rostami, Reza; Sadeghi-Firoozabadi, Vahid; Narmaki, Elham

    2015-01-01

    A low antioxidant status has been shown to result in oxidative stress and cognitive impairment. Because antioxidants can protect the nervous system, it is expected that a better blood antioxidant status might be related to sustained attention. However, the relationship between the blood antioxidant status and visual and auditory sustained attention has not been investigated. The aim of this study was to evaluate the association of fruit and vegetable intake and the blood antioxidant status with visual and auditory sustained attention in women. This cross-sectional study was performed on 400 healthy women (20-50 years) who attended the sports clubs of Tehran Municipality. Sustained attention was evaluated based on the Integrated Visual and Auditory Continuous Performance Test using the Integrated Visual and Auditory (IVA) software. The 24-hour food recall questionnaire was used for estimating fruit and vegetable intake. Serum total antioxidant capacity (TAC), and erythrocyte superoxide dismutase (SOD) and glutathione peroxidase (GPx) activities were measured in 90 participants. After adjusting for energy intake, age, body mass index (BMI), years of education and physical activity, higher reported fruit and vegetable intake was associated with better visual and auditory sustained attention. The blood antioxidant measures were likewise associated with better visual and auditory sustained attention after adjusting for age, years of education, physical activity, energy, BMI, and caffeine intake. In conclusion, better visual and auditory sustained attention is associated with a better blood antioxidant status. Therefore, improvement of the antioxidant status through an appropriate dietary intake can possibly enhance sustained attention.

  4. Attentional modulation of auditory steady-state responses.

    Science.gov (United States)

    Mahajan, Yatin; Davis, Chris; Kim, Jeesun

    2014-01-01

    Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex.
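
    The frequency-tagging logic used here can be sketched numerically: amplitude-modulate a noise stream at a tagging rate, then read out envelope power at each candidate modulation frequency. This is an illustrative simulation of the stimulus-and-readout idea only, not the EEG recording or ASSR analysis itself; the sampling rate and random seed are arbitrary, and only the 16 and 23.5 Hz rates are taken from the study.

```python
import math
import random

def envelope_power(x, fs, f_mod):
    """Squared single-bin DFT magnitude of the rectified envelope at f_mod (Hz)."""
    env = [abs(s) for s in x]                    # crude envelope via rectification
    n = len(env)
    re = sum(env[t] * math.cos(2 * math.pi * f_mod * t / fs) for t in range(n))
    im = sum(env[t] * math.sin(2 * math.pi * f_mod * t / fs) for t in range(n))
    return (re * re + im * im) / n

random.seed(1)
fs, dur, f_tag = 1000, 2.0, 16.0                 # tag this noise stream at 16 Hz
n = int(fs * dur)
stream = [random.uniform(-1, 1) * (1 + math.sin(2 * math.pi * f_tag * t / fs))
          for t in range(n)]
p_tagged = envelope_power(stream, fs, 16.0)      # large: 16 Hz tag is present
p_untagged = envelope_power(stream, fs, 23.5)    # small: no 23.5 Hz modulation
```

    Because each stream carries a unique modulation rate, power at that rate in the neural response can be attributed to that stream, which is what makes the paradigm suitable for separating attended and ignored inputs.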

  5. Attentional Modulation of Auditory Steady-State Responses

    Science.gov (United States)

    Mahajan, Yatin; Davis, Chris; Kim, Jeesun

    2014-01-01

    Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex. PMID:25334021

  6. Attentional modulation of auditory steady-state responses.

    Directory of Open Access Journals (Sweden)

    Yatin Mahajan

    Full Text Available Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex.

  7. Measuring Auditory Selective Attention using Frequency Tagging

    Directory of Open Access Journals (Sweden)

    Hari M Bharadwaj

    2014-02-01

    Full Text Available Frequency tagging of sensory inputs (presenting stimuli that fluctuate periodically at rates to which the cortex can phase lock) has been used to study attentional modulation of neural responses to inputs in different sensory modalities. For visual inputs, the visual steady-state response (VSSR) at the frequency modulating an attended object is enhanced, while the VSSR to a distracting object is suppressed. In contrast, the effect of attention on the auditory steady-state response (ASSR) is inconsistent across studies. However, most auditory studies analyzed results at the sensor level or used only a small number of equivalent current dipoles to fit cortical responses. In addition, most studies of auditory spatial attention used dichotic stimuli (independent signals at the ears) rather than more natural, binaural stimuli. Here, we asked whether these methodological choices help explain discrepant results. Listeners attended to one of two competing speech streams, one simulated from the left and one from the right, that were modulated at different frequencies. Using distributed source modeling of magnetoencephalography results, we estimate how spatially directed attention modulates the ASSR in neural regions across the whole brain. Attention enhances the ASSR power at the frequency of the attended stream in the contralateral auditory cortex. The attended-stream modulation frequency also drives phase-locked responses in the left (but not right) precentral sulcus (lPCS), a region implicated in control of eye gaze and visual spatial attention. Importantly, this region shows no phase locking to the distracting stream, suggesting that the lPCS is engaged in an attention-specific manner. Modeling results that take account of the geometry and phases of the cortical sources phase locked to the two streams (including hemispheric asymmetry of lPCS activity) help partly explain why past ASSR studies of auditory spatial attention yield seemingly contradictory results.

  8. Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks

    OpenAIRE

    Moisala, Mona; Salmela, Viljami; Salo, Emma; Carlson, Synnove; Vuontela, Virve; Salonen, Oili; Alho, Kimmo

    2015-01-01

    Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dual tasking) …

  9. The effects of divided attention on auditory priming.

    Science.gov (United States)

    Mulligan, Neil W; Duke, Marquinn; Cooper, Angela W

    2007-09-01

    Traditional theorizing stresses the importance of attentional state during encoding for later memory, based primarily on research with explicit memory. Recent research has begun to investigate the role of attention in implicit memory but has focused almost exclusively on priming in the visual modality. The present experiments examined the effect of divided attention on auditory implicit memory, using auditory perceptual identification, word-stem completion and word-fragment completion. Participants heard study words under full attention conditions or while simultaneously carrying out a distractor task (the divided attention condition). In Experiment 1, a distractor task with low response frequency failed to disrupt later auditory priming (but diminished explicit memory as assessed with auditory recognition). In Experiment 2, a distractor task with greater response frequency disrupted priming on all three of the auditory priming tasks as well as the explicit test. These results imply that although auditory priming is less reliant on attention than explicit memory, it is still greatly affected by at least some divided-attention manipulations. These results are consistent with research using visual priming tasks and have relevance for hypotheses regarding attention and auditory priming.

  10. Changes in otoacoustic emissions during selective auditory and visual attention.

    Science.gov (United States)

    Walsh, Kyle P; Pasanen, Edward G; McFadden, Dennis

    2015-05-01

    Previous studies have demonstrated that the otoacoustic emissions (OAEs) measured during behavioral tasks can have different magnitudes when subjects are attending selectively or not attending. The implication is that the cognitive and perceptual demands of a task can affect the first neural stage of auditory processing-the sensory receptors themselves. However, the directions of the reported attentional effects have been inconsistent, the magnitudes of the observed differences typically have been small, and comparisons across studies have been made difficult by significant procedural differences. In this study, a nonlinear version of the stimulus-frequency OAE (SFOAE), called the nSFOAE, was used to measure cochlear responses from human subjects while they simultaneously performed behavioral tasks requiring selective auditory attention (dichotic or diotic listening), selective visual attention, or relative inattention. Within subjects, the differences in nSFOAE magnitude between inattention and attention conditions were about 2-3 dB for both auditory and visual modalities, and the effect sizes for the differences typically were large for both nSFOAE magnitude and phase. These results reveal that the cochlear efferent reflex is differentially active during selective attention and inattention, for both auditory and visual tasks, although they do not reveal how attention is improved when efferent activity is greater.
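
    For scale, the 2-3 dB attention effects reported above can be converted to linear magnitude ratios with the standard decibel definition for amplitude quantities. A small helper (generic dB arithmetic, not code from the study):

```python
import math

def db_diff(mag_a, mag_b):
    """Difference between two response magnitudes, expressed in decibels."""
    return 20.0 * math.log10(mag_a / mag_b)

def ratio_from_db(db):
    """Linear magnitude ratio implied by a decibel difference."""
    return 10.0 ** (db / 20.0)
```

    For example, `ratio_from_db(2)` is about 1.26 and `ratio_from_db(3)` about 1.41, so the 2-3 dB differences between inattention and attention conditions correspond to roughly 26-41% changes in nSFOAE magnitude.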

  11. Changes in otoacoustic emissions during selective auditory and visual attention

    Science.gov (United States)

    Walsh, Kyle P.; Pasanen, Edward G.; McFadden, Dennis

    2015-01-01

    Previous studies have demonstrated that the otoacoustic emissions (OAEs) measured during behavioral tasks can have different magnitudes when subjects are attending selectively or not attending. The implication is that the cognitive and perceptual demands of a task can affect the first neural stage of auditory processing—the sensory receptors themselves. However, the directions of the reported attentional effects have been inconsistent, the magnitudes of the observed differences typically have been small, and comparisons across studies have been made difficult by significant procedural differences. In this study, a nonlinear version of the stimulus-frequency OAE (SFOAE), called the nSFOAE, was used to measure cochlear responses from human subjects while they simultaneously performed behavioral tasks requiring selective auditory attention (dichotic or diotic listening), selective visual attention, or relative inattention. Within subjects, the differences in nSFOAE magnitude between inattention and attention conditions were about 2–3 dB for both auditory and visual modalities, and the effect sizes for the differences typically were large for both nSFOAE magnitude and phase. These results reveal that the cochlear efferent reflex is differentially active during selective attention and inattention, for both auditory and visual tasks, although they do not reveal how attention is improved when efferent activity is greater. PMID:25994703

  12. Higher dietary diversity is related to better visual and auditory sustained attention.

    Science.gov (United States)

    Shiraseb, Farideh; Siassi, Fereydoun; Qorbani, Mostafa; Sotoudeh, Gity; Rostami, Reza; Narmaki, Elham; Yavari, Parvaneh; Aghasi, Mohadeseh; Shaibu, Osman Mohammed

    2016-04-01

    Attention is a complex cognitive function that is necessary for learning, for following social norms of behaviour and for effective performance of responsibilities and duties. It is especially important in sensitive occupations requiring sustained attention. Improvement of dietary diversity (DD) is recognised as an important factor in health promotion, but its association with sustained attention is unknown. The aim of this study was to determine the association between auditory and visual sustained attention and DD. A cross-sectional study was carried out on 400 women aged 20-50 years who attended sports clubs at Tehran Municipality. Sustained attention was evaluated on the basis of the Integrated Visual and Auditory Continuous Performance Test using Integrated Visual and Auditory software. A single 24-h dietary recall questionnaire was used for DD assessment. Dietary diversity scores (DDS) were determined using the FAO guidelines. The mean visual and auditory sustained attention scores were 40·2 (sd 35·2) and 42·5 (sd 38), respectively. The mean DDS was 4·7 (sd 1·5). After adjusting for age, education years, physical activity, energy intake and BMI, mean visual and auditory sustained attention showed a significant increase as the quartiles of DDS increased (P=0·001). In addition, the mean subscales of attention, including auditory consistency and vigilance, visual persistence, visual and auditory focus, speed, comprehension and full attention, increased significantly with increasing DDS. In conclusion, higher dietary diversity is related to better visual and auditory sustained attention.
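
    The FAO-style dietary diversity score described above is, at its core, a count of distinct food groups consumed in the recall period. A hedged sketch follows; the nine group names are the commonly cited FAO women's dietary diversity grouping, reproduced here for illustration only, and the FAO guideline remains the authoritative definition of groups and scoring.

```python
# Illustrative nine-group FAO-style classification (assumed, see lead-in).
FOOD_GROUPS = {
    "starchy_staples", "dark_green_leafy_vegetables",
    "other_vitamin_a_rich_fruits_and_vegetables", "other_fruits_and_vegetables",
    "organ_meat", "meat_and_fish", "eggs", "legumes_nuts_and_seeds",
    "milk_and_milk_products",
}

def dietary_diversity_score(recalled_items):
    """DDS = number of distinct food groups consumed in a 24-h recall."""
    return len({group for _, group in recalled_items if group in FOOD_GROUPS})

# Hypothetical one-day recall as (item, assigned group) pairs:
recall = [("rice", "starchy_staples"),
          ("spinach", "dark_green_leafy_vegetables"),
          ("yogurt", "milk_and_milk_products"),
          ("bread", "starchy_staples"),
          ("lentils", "legumes_nuts_and_seeds")]
dds = dietary_diversity_score(recall)   # 4 distinct groups consumed
```

    Repeated items within a group (rice and bread above) add nothing, which is why the mean DDS of 4·7 sits well below the maximum of 9.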

  13. Extensive Tonotopic Mapping across Auditory Cortex Is Recapitulated by Spectrally Directed Attention and Systematically Related to Cortical Myeloarchitecture.

    Science.gov (United States)

    Dick, Frederic K; Lehet, Matt I; Callaghan, Martina F; Keller, Tim A; Sereno, Martin I; Holt, Lori L

    2017-12-13

    Auditory selective attention is vital in natural soundscapes. But it is unclear how attentional focus on the primary dimension of auditory representation, acoustic frequency, might modulate basic auditory functional topography during active listening. In contrast to visual selective attention, which is supported by motor-mediated optimization of input across saccades and pupil dilation, the primate auditory system has fewer means of differentially sampling the world. This makes spectrally-directed endogenous attention a particularly crucial aspect of auditory attention. Using a novel functional paradigm combined with quantitative MRI, we establish in male and female listeners that human frequency-band-selective attention drives activation in both myeloarchitectonically estimated auditory core, and across the majority of tonotopically mapped nonprimary auditory cortex. The attentionally driven best-frequency maps show strong concordance with sensory-driven maps in the same subjects across much of the temporal plane, with poor concordance in areas outside traditional auditory cortex. There is significantly greater activation across most of auditory cortex when best frequency is attended, versus ignored; the same regions do not show this enhancement when attending to the least-preferred frequency band. Finally, the results demonstrate that there is spatial correspondence between the degree of myelination and the strength of the tonotopic signal across a number of regions in auditory cortex. Strong frequency preferences across tonotopically mapped auditory cortex spatially correlate with R1-estimated myeloarchitecture, indicating shared functional and anatomical organization that may underlie intrinsic auditory regionalization. SIGNIFICANCE STATEMENT Perception is an active process, especially sensitive to attentional state. Listeners direct auditory attention to track a violin's melody within an ensemble performance, or to follow a voice in a crowded cafe. Although …

  14. Adapting the Theory of Visual Attention (TVA) to model auditory attention

    DEFF Research Database (Denmark)

    Roberts, Katherine L.; Andersen, Tobias; Kyllingsbæk, Søren

    Mathematical and computational models have provided useful insights into normal and impaired visual attention, but less progress has been made in modelling auditory attention. We are developing a Theory of Auditory Attention (TAA), based on an influential visual model, the Theory of Visual Attention (TVA). We report that TVA provides a good fit to auditory data when the stimuli are closely matched to those used in visual studies. In the basic visual TVA task, participants view a brief display of letters and are asked to report either all of the letters (whole report) or a subset of letters (e.g., …). … TVA fits the auditory data, producing good estimates of the rate at which information is encoded (C), the minimum exposure duration required for processing to begin (t0), and the relative attentional weight to targets versus distractors (α). Future work will address the issue of target-distractor confusion, and extend …
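
    The TVA parameters named above (C, t0, α) combine in an exponential race model. The sketch below shows the standard TVA-style encoding probability for a single target; it omits the visual short-term memory capacity limit K of the full theory (a simplification of this illustration, not of TVA itself), and the numeric values in the comment are invented for demonstration.

```python
import math

def p_encoded(C, t0, tau, n_targets, n_distractors, alpha):
    """Probability that a given target is encoded within exposure tau (seconds).

    Exponential race with total processing capacity C (items/s): each item
    receives rate v = C * w / sum(w), where targets carry attentional
    weight 1 and distractors carry relative weight alpha. The VSTM
    capacity limit K of full TVA is omitted here for simplicity.
    """
    if tau <= t0:
        return 0.0                # no encoding before the threshold t0
    v = C / (n_targets + alpha * n_distractors)   # per-target processing rate
    return 1.0 - math.exp(-v * (tau - t0))
```

    For instance, with illustrative values C = 40 items/s, t0 = 20 ms and a 100 ms exposure of 4 targets and no distractors, each target's rate is 10/s and its encoding probability is 1 - exp(-0.8), about 0.55; raising α (stronger distractor weighting) lowers the target rate and hence the probability.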

  15. The spectrotemporal filter mechanism of auditory selective attention

    Science.gov (United States)

    Lakatos, Peter; Musacchia, Gabriella; O’Connell, Monica N.; Falchier, Arnaud Y.; Javitt, Daniel C.; Schroeder, Charles E.

    2013-01-01

    While we have convincing evidence that attention to auditory stimuli modulates neuronal responses at or before the level of primary auditory cortex (A1), the underlying physiological mechanisms are unknown. We found that attending to rhythmic auditory streams resulted in the entrainment of ongoing oscillatory activity reflecting rhythmic excitability fluctuations in A1. Strikingly, while the rhythm of the entrained oscillations in A1 neuronal ensembles reflected the temporal structure of the attended stream, the phase depended on the attended frequency content. Counter-phase entrainment across differently tuned A1 regions resulted in both the amplification and sharpening of responses at attended time points, in essence acting as a spectrotemporal filter mechanism. Our data suggest that selective attention generates a dynamically evolving model of attended auditory stimulus streams in the form of modulatory subthreshold oscillations across tonotopically organized neuronal ensembles in A1 that enhances the representation of attended stimuli. PMID:23439126

  16. Intentional preparation of auditory attention-switches: Explicit cueing and sequential switch-predictability.

    Science.gov (United States)

    Seibold, Julia C; Nolden, Sophie; Oberem, Josefa; Fels, Janina; Koch, Iring

    2018-06-01

    In an auditory attention-switching paradigm, participants heard two simultaneously spoken number-words, each presented to one ear, and decided whether the target number was smaller or larger than 5 by pressing a left or right key. An instructional cue in each trial indicated which feature had to be used to identify the target number (e.g., female voice). Auditory attention-switch costs were found when this feature changed compared to when it repeated in two consecutive trials. Earlier studies employing this paradigm showed mixed results when they examined whether such cued auditory attention-switches can be prepared actively during the cue-stimulus interval. This study systematically assessed which preconditions are necessary for the advance preparation of auditory attention-switches. Three experiments were conducted that controlled for cue-repetition benefits, modality switches between cue and stimuli, as well as for predictability of the switch-sequence. Only in the third experiment, in which predictability for an attention-switch was maximal due to a pre-instructed switch-sequence and predictable stimulus onsets, active switch-specific preparation was found. These results suggest that the cognitive system can prepare auditory attention-switches, and this preparation seems to be triggered primarily by the memorised switching-sequence and valid expectations about the time of target onset.
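
    The switch-cost measure described above is the mean response-time difference between trials whose cue differs from the previous trial (switches) and trials whose cue repeats. A minimal sketch; the trial format and RT values are hypothetical, with cue names following the female/male-voice example in the abstract:

```python
from statistics import mean

def switch_cost(trials):
    """Mean RT on attention-switch trials minus mean RT on repeat trials.

    `trials` is a list of (cued_feature, rt_ms) tuples; a trial counts as a
    "switch" when its cue differs from the previous trial's cue. The first
    trial is neither a switch nor a repeat and is excluded.
    """
    switch_rts, repeat_rts = [], []
    for prev, cur in zip(trials, trials[1:]):
        (switch_rts if cur[0] != prev[0] else repeat_rts).append(cur[1])
    return mean(switch_rts) - mean(repeat_rts)

# Hypothetical cue/RT sequence for illustration:
trials = [("female", 620), ("female", 580), ("male", 700),
          ("male", 590), ("female", 680)]
cost = switch_cost(trials)   # mean(700, 680) - mean(580, 590) = 105 ms
```

    A positive cost indicates slower responding after an attention switch; "preparation" effects are then read off as a reduction of this cost when the cue-stimulus interval allows advance reconfiguration.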

  17. Sustained selective attention to competing amplitude-modulations in human auditory cortex.

    Science.gov (United States)

    Riecke, Lars; Scharke, Wolfgang; Valente, Giancarlo; Gutschalk, Alexander

    2014-01-01

    Auditory selective attention plays an essential role for identifying sounds of interest in a scene, but the neural underpinnings are still incompletely understood. Recent findings demonstrate that neural activity that is time-locked to a particular amplitude-modulation (AM) is enhanced in the auditory cortex when the modulated stream of sounds is selectively attended to under sensory competition with other streams. However, the target sounds used in the previous studies differed not only in their AM, but also in other sound features, such as carrier frequency or location. Thus, it remains uncertain whether the observed enhancements reflect AM-selective attention. The present study aims at dissociating the effect of AM frequency on response enhancement in auditory cortex by using an ongoing auditory stimulus that contains two competing targets differing exclusively in their AM frequency. Electroencephalography results showed a sustained response enhancement for auditory attention compared to visual attention, but not for AM-selective attention (attended AM frequency vs. ignored AM frequency). In contrast, the response to the ignored AM frequency was enhanced, although a brief trend toward response enhancement occurred during the initial 15 s. Together with the previous findings, these observations indicate that selective enhancement of attended AMs in auditory cortex is adaptive under sustained AM-selective attention. This finding has implications for our understanding of cortical mechanisms for feature-based attentional gain control.

  18. Sustained Selective Attention to Competing Amplitude-Modulations in Human Auditory Cortex

    Science.gov (United States)

    Riecke, Lars; Scharke, Wolfgang; Valente, Giancarlo; Gutschalk, Alexander

    2014-01-01

    Auditory selective attention plays an essential role for identifying sounds of interest in a scene, but the neural underpinnings are still incompletely understood. Recent findings demonstrate that neural activity that is time-locked to a particular amplitude-modulation (AM) is enhanced in the auditory cortex when the modulated stream of sounds is selectively attended to under sensory competition with other streams. However, the target sounds used in the previous studies differed not only in their AM, but also in other sound features, such as carrier frequency or location. Thus, it remains uncertain whether the observed enhancements reflect AM-selective attention. The present study aims at dissociating the effect of AM frequency on response enhancement in auditory cortex by using an ongoing auditory stimulus that contains two competing targets differing exclusively in their AM frequency. Electroencephalography results showed a sustained response enhancement for auditory attention compared to visual attention, but not for AM-selective attention (attended AM frequency vs. ignored AM frequency). In contrast, the response to the ignored AM frequency was enhanced, although a brief trend toward response enhancement occurred during the initial 15 s. Together with the previous findings, these observations indicate that selective enhancement of attended AMs in auditory cortex is adaptive under sustained AM-selective attention. This finding has implications for our understanding of cortical mechanisms for feature-based attentional gain control. PMID:25259525

  19. Selective Attention to Auditory Memory Neurally Enhances Perceptual Precision.

    Science.gov (United States)

    Lim, Sung-Joo; Wöstmann, Malte; Obleser, Jonas

    2015-12-09

    Selective attention to a task-relevant stimulus facilitates encoding of that stimulus into a working memory representation. It is less clear whether selective attention also improves the precision of a stimulus already represented in memory. Here, we investigate the behavioral and neural dynamics of selective attention to representations in auditory working memory (i.e., auditory objects) using psychophysical modeling and model-based analysis of electroencephalographic signals. Human listeners performed a syllable pitch discrimination task where two syllables served as to-be-encoded auditory objects. Valid (vs neutral) retroactive cues were presented during retention to allow listeners to selectively attend to the to-be-probed auditory object in memory. Behaviorally, listeners represented auditory objects in memory more precisely (expressed by steeper slopes of a psychometric curve) and made faster perceptual decisions when valid compared to neutral retrocues were presented. Neurally, valid compared to neutral retrocues elicited a larger frontocentral sustained negativity in the evoked potential as well as enhanced parietal alpha/low-beta oscillatory power (9-18 Hz) during memory retention. Critically, individual magnitudes of alpha oscillatory power (7-11 Hz) modulation predicted the degree to which valid retrocues benefitted individuals' behavior. Our results indicate that selective attention to a specific object in auditory memory does benefit human performance not by simply reducing memory load, but by actively engaging complementary neural resources to sharpen the precision of the task-relevant object in memory. Can selective attention improve the representational precision with which objects are held in memory? And if so, what are the neural mechanisms that support such improvement? These issues have been rarely examined within the auditory modality, in which acoustic signals change and vanish on a milliseconds time scale. Introducing a new auditory memory …
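
    The "steeper slopes of a psychometric curve" above can be made concrete with a logistic psychometric function. This is a generic illustration: the functional form and parameter names are mine, not necessarily the authors' exact fitted model.

```python
import math

def p_higher(delta, mu=0.0, slope=1.0, lapse=0.0):
    """Probability of judging the probe pitch higher than the remembered one.

    Logistic psychometric curve: `mu` is the point of subjective equality,
    `slope` controls steepness (a steeper slope means a more precise memory
    representation), and `lapse` models stimulus-independent errors.
    """
    core = 1.0 / (1.0 + math.exp(-slope * (delta - mu)))
    return lapse / 2.0 + (1.0 - lapse) * core
```

    At the same pitch difference from the point of subjective equality, a steeper slope yields more consistent "higher"/"lower" judgments, which is how representational precision is read off the fitted curve.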

  20. Neural effects of cognitive control load on auditory selective attention.

    Science.gov (United States)

    Sabri, Merav; Humphries, Colin; Verber, Matthew; Liebenthal, Einat; Binder, Jeffrey R; Mangalathu, Jain; Desai, Anjali

    2014-08-01

    Whether and how working memory disrupts or alters auditory selective attention is unclear. We compared simultaneous event-related potentials (ERP) and functional magnetic resonance imaging (fMRI) responses associated with task-irrelevant sounds across high and low working memory load in a dichotic-listening paradigm. Participants performed n-back tasks (1-back, 2-back) in one ear (Attend ear) while ignoring task-irrelevant speech sounds in the other ear (Ignore ear). The effects of working memory load on selective attention were observed at 130-210 ms, with higher load resulting in greater irrelevant syllable-related activation in localizer-defined regions in auditory cortex. The interaction between memory load and presence of irrelevant information revealed stronger activations primarily in frontal and parietal areas due to the presence of irrelevant information under the higher memory load. Joint independent component analysis of ERP and fMRI data revealed that the ERP component in the N1 time-range is associated with activity in superior temporal gyrus and medial prefrontal cortex. These results demonstrate a dynamic relationship between working memory load and auditory selective attention, in agreement with the load model of attention and the idea of common neural resources for memory and attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
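
    For orientation, in an n-back task the listener responds whenever the current item matches the one presented n positions earlier. A minimal sketch (illustrative, not the study's stimulus code) for locating targets and scoring responses:

```python
def nback_targets(stream, n):
    """Indices whose item matches the item n positions back."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

def score_nback(stream, n, pressed):
    """Hits and false alarms, given the set of positions where the
    participant responded."""
    targets = set(nback_targets(stream, n))
    return len(targets & pressed), len(pressed - targets)
```

    For example, in the letter stream "ABABCC" the 2-back targets fall at positions 2 and 3, and the only 1-back target is position 5.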

  1. Modification of sudden onset auditory ERP by involuntary attention to visual stimuli.

    Science.gov (United States)

    Oray, Serkan; Lu, Zhong-Lin; Dawson, Michael E

    2002-03-01

    To investigate the cross-modal nature of the exogenous attention system, we studied how involuntary attention in the visual modality affects ERPs elicited by sudden onset of events in the auditory modality. Relatively loud auditory white noise bursts were presented to subjects with random and long inter-trial intervals. The noise bursts were either presented alone, or paired with a visual stimulus with a visual to auditory onset asynchrony of 120 ms. In a third condition, the visual stimuli were shown alone. All three conditions, auditory alone, visual alone, and paired visual/auditory, were randomly intermixed and presented with equal probabilities. Subjects were instructed to fixate on a point in front of them without task instructions concerning either the auditory or visual stimuli. ERPs were recorded from 28 scalp sites throughout every experimental session. Compared to ERPs in the auditory alone condition, pairing the auditory noise bursts with the visual stimulus reduced the amplitude of the auditory N100 component at Cz by 40% and the auditory P200/P300 component at Cz by 25%. No significant topographical change was observed in the scalp distributions of the N100 and P200/P300. Our results suggest that involuntary attention to visual stimuli suppresses early sensory (N100) as well as late cognitive (P200/P300) processing of sudden auditory events. The activation of the exogenous attention system by sudden auditory onset can be modified by involuntary visual attention in a cross-modal, passive prepulse inhibition paradigm.

  2. The Relationship between Types of Attention and Auditory Processing Skills: Reconsidering Auditory Processing Disorder Diagnosis

    Science.gov (United States)

    Stavrinos, Georgios; Iliadou, Vassiliki-Maria; Edwards, Lindsey; Sirimanna, Tony; Bamiou, Doris-Eva

    2018-01-01

    Measures of attention have been found to correlate with specific auditory processing tests in samples of children suspected of Auditory Processing Disorder (APD), but these relationships have not been adequately investigated. Despite evidence linking auditory attention and deficits/symptoms of APD, measures of attention are not routinely used in APD diagnostic protocols. The aim of the study was to examine the relationship between auditory and visual attention tests and auditory processing tests in children with APD and to assess whether a proposed diagnostic protocol for APD, including measures of attention, could provide useful information for APD management. A pilot study including 27 children, aged 7–11 years, referred for APD assessment was conducted. The validated Test of Everyday Attention for Children, with visual and auditory attention tasks, the Listening in Spatialized Noise - Sentences test, the Children's Communication Checklist questionnaire and tests from a standard APD diagnostic test battery were administered. Pearson's partial correlation analysis examining the relationship between these tests and Cochran's Q test analysis comparing proportions of diagnosis under each proposed battery were conducted. Divided auditory and divided auditory-visual attention strongly correlated with the dichotic digits test, r = 0.68, p attention battery identified as having Attention Deficits (ADs). The proposed APD battery excluding AD cases did not have a significantly different diagnosis proportion than the standard APD battery. Finally, the newly proposed diagnostic battery, identifying an inattentive subtype of APD, identified five children who would have otherwise been considered not to have ADs. The findings show that a subgroup of children with APD demonstrates underlying sustained and divided attention deficits. Attention deficits in children with APD appear to be centred around the auditory modality but further examination of types of attention in both

  3. Auditory attention enhances processing of positive and negative words in inferior and superior prefrontal cortex.

    Science.gov (United States)

    Wegrzyn, Martin; Herbert, Cornelia; Ethofer, Thomas; Flaisch, Tobias; Kissler, Johanna

    2017-11-01

    Visually presented emotional words are processed preferentially and effects of emotional content are similar to those of explicit attention deployment in that both amplify visual processing. However, auditory processing of emotional words is less well characterized and interactions between emotional content and task-induced attention have not been fully understood. Here, we investigate auditory processing of emotional words, focusing on how auditory attention to positive and negative words impacts their cerebral processing. A functional magnetic resonance imaging (fMRI) study manipulating word valence and attention allocation was performed. Participants heard negative, positive and neutral words to which they either listened passively or attended by counting negative or positive words, respectively. Regardless of valence, active processing compared to passive listening increased activity in primary auditory cortex, left intraparietal sulcus, and right superior frontal gyrus (SFG). The attended valence elicited stronger activity in left inferior frontal gyrus (IFG) and left SFG, in line with these regions' role in semantic retrieval and evaluative processing. No evidence for valence-specific attentional modulation in auditory regions or distinct valence-specific regional activations (i.e., negative > positive or positive > negative) was obtained. Thus, allocation of auditory attention to positive and negative words can substantially increase their processing in higher-order language and evaluative brain areas without modulating early stages of auditory processing. Inferior and superior frontal brain structures mediate interactions between emotional content, attention, and working memory when prosodically neutral speech is processed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Focal Suppression of Distractor Sounds by Selective Attention in Auditory Cortex.

    Science.gov (United States)

    Schwartz, Zachary P; David, Stephen V

    2018-01-01

    Auditory selective attention is required for parsing crowded acoustic environments, but cortical systems mediating the influence of behavioral state on auditory perception are not well characterized. Previous neurophysiological studies suggest that attention produces a general enhancement of neural responses to important target sounds versus irrelevant distractors. However, behavioral studies suggest that in the presence of masking noise, attention provides a focal suppression of distractors that compete with targets. Here, we compared effects of attention on cortical responses to masking versus non-masking distractors, controlling for effects of listening effort and general task engagement. We recorded single-unit activity from primary auditory cortex (A1) of ferrets during behavior and found that selective attention decreased responses to distractors masking targets in the same spectral band, compared with spectrally distinct distractors. This suppression enhanced neural target detection thresholds, suggesting that limited attention resources serve to focally suppress responses to distractors that interfere with target detection. Changing effort by manipulating target salience consistently modulated spontaneous but not evoked activity. Task engagement and changing effort tended to affect the same neurons, while attention affected an independent population, suggesting that distinct feedback circuits mediate effects of attention and effort in A1. © The Author 2017. Published by Oxford University Press.

  5. Gender-specific effects of prenatal and adolescent exposure to tobacco smoke on auditory and visual attention.

    Science.gov (United States)

    Jacobsen, Leslie K; Slotkin, Theodore A; Mencl, W Einar; Frost, Stephen J; Pugh, Kenneth R

    2007-12-01

    Prenatal exposure to active maternal tobacco smoking elevates risk of cognitive and auditory processing deficits, and of smoking in offspring. Recent preclinical work has demonstrated a sex-specific pattern of reduction in cortical cholinergic markers following prenatal, adolescent, or combined prenatal and adolescent exposure to nicotine, the primary psychoactive component of tobacco smoke. Given the importance of cortical cholinergic neurotransmission to attentional function, we examined auditory and visual selective and divided attention in 181 male and female adolescent smokers and nonsmokers with and without prenatal exposure to maternal smoking. Groups did not differ in age, educational attainment, symptoms of inattention, or years of parent education. A subset of 63 subjects also underwent functional magnetic resonance imaging while performing an auditory and visual selective and divided attention task. Among females, exposure to tobacco smoke during prenatal or adolescent development was associated with reductions in auditory and visual attention performance accuracy that were greatest in female smokers with prenatal exposure (combined exposure). Among males, combined exposure was associated with marked deficits in auditory attention, suggesting greater vulnerability of neurocircuitry supporting auditory attention to insult stemming from developmental exposure to tobacco smoke in males. Activation of brain regions that support auditory attention was greater in adolescents with prenatal or adolescent exposure to tobacco smoke relative to adolescents with neither prenatal nor adolescent exposure to tobacco smoke. These findings extend earlier preclinical work and suggest that, in humans, prenatal and adolescent exposure to nicotine exerts gender-specific deleterious effects on auditory and visual attention, with concomitant alterations in the efficiency of neurocircuitry supporting auditory attention.

  6. Distraction task rather than focal attention modulates gamma activity associated with auditory steady-state responses (ASSRs)

    DEFF Research Database (Denmark)

    Griskova-Bulanova, Inga; Ruksenas, Osvaldas; Dapsys, Kastytis

    2011-01-01

    To explore the modulation of the auditory steady-state response (ASSR) by experimental tasks differing in attentional focus and arousal level.

  7. Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks.

    Science.gov (United States)

    Moisala, Mona; Salmela, Viljami; Salo, Emma; Carlson, Synnöve; Vuontela, Virve; Salonen, Oili; Alho, Kimmo

    2015-01-01

    Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dual-tasking) did not recruit additional cortical regions, but resulted in increased activity in medial and lateral frontal regions which were also activated by the component tasks when performed separately. Areas involved in semantic language processing were revealed predominantly in the left lateral prefrontal cortex by contrasting incongruent with congruent sentences. These areas also showed significant activity increases during divided attention in relation to selective attention. In the sensory cortices, no crossmodal inhibition was observed during divided attention when compared with selective attention to one modality. Our results suggest that the observed performance decrements during dual-tasking are due to interference of the two tasks because they utilize the same part of the cortex. Moreover, semantic dual-tasking did not appear to recruit additional brain areas in comparison with single tasking, and no crossmodal inhibition was observed during intermodal divided attention.

  9. Brain activity associated with selective attention, divided attention and distraction.

    Science.gov (United States)

    Salo, Emma; Salmela, Viljami; Salmi, Juha; Numminen, Jussi; Alho, Kimmo

    2017-06-01

    Top-down controlled selective or divided attention to sounds and visual objects, as well as bottom-up triggered attention to auditory and visual distractors, has been widely investigated. However, no study has systematically compared brain activations related to all these types of attention. To this end, we used functional magnetic resonance imaging (fMRI) to measure brain activity in participants performing a tone pitch or a foveal grating orientation discrimination task, or both, distracted by novel sounds not sharing frequencies with the tones or by extrafoveal visual textures. To force focusing of attention to tones or gratings, or both, task difficulty was kept constantly high with an adaptive staircase method. A whole brain analysis of variance (ANOVA) revealed fronto-parietal attention networks for both selective auditory and visual attention. A subsequent conjunction analysis indicated partial overlaps of these networks. However, consistent with some previous studies, the present results suggest segregation of prefrontal areas involved in the control of auditory and visual attention. The ANOVA also suggested, and another conjunction analysis confirmed, an additional activity enhancement in the left middle frontal gyrus related to divided attention, supporting the role of this area in top-down integration of dual-task performance. As expected, distractors disrupted task performance. However, contrary to our expectations, activations specifically related to the distractors were found only in the auditory and visual cortices. This suggests gating of the distractors from further processing, perhaps due to strictly focused attention in the current demanding discrimination tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
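
    The adaptive staircase mentioned above can be sketched with a generic n-down/1-up rule (illustrative, not the authors' exact procedure; a 3-down/1-up rule converges near 79% correct):

```python
class Staircase:
    """n-down / 1-up adaptive staircase: the task becomes harder after
    n_down consecutive correct responses and easier after any error."""

    def __init__(self, start_level, step, n_down=3):
        self.level = start_level   # e.g., pitch difference or contrast
        self.step = step
        self.n_down = n_down
        self._run = 0              # consecutive-correct counter

    def update(self, correct):
        if correct:
            self._run += 1
            if self._run == self.n_down:
                self.level -= self.step   # harder
                self._run = 0
        else:
            self._run = 0
            self.level += self.step       # easier
        return self.level
```

    Driving the discrimination task with the returned level keeps each participant near a fixed accuracy, which is what equates difficulty across attention conditions.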

  10. Effects of selective attention on the electrophysiological representation of concurrent sounds in the human auditory cortex.

    Science.gov (United States)

    Bidet-Caulet, Aurélie; Fischer, Catherine; Besle, Julien; Aguera, Pierre-Emmanuel; Giard, Marie-Helene; Bertrand, Olivier

    2007-08-29

    In noisy environments, we use auditory selective attention to actively ignore distracting sounds and select relevant information, as during a cocktail party to follow one particular conversation. The present electrophysiological study aims at deciphering the spatiotemporal organization of the effect of selective attention on the representation of concurrent sounds in the human auditory cortex. Sound onset asynchrony was manipulated to induce the segregation of two concurrent auditory streams. Each stream consisted of amplitude modulated tones at different carrier and modulation frequencies. Electrophysiological recordings were performed in epileptic patients with pharmacologically resistant partial epilepsy, implanted with depth electrodes in the temporal cortex. Patients were presented with the stimuli while they either performed an auditory distracting task or actively selected one of the two concurrent streams. Selective attention was found to affect steady-state responses in the primary auditory cortex, and transient and sustained evoked responses in secondary auditory areas. The results provide new insights on the neural mechanisms of auditory selective attention: stream selection during sound rivalry would be facilitated not only by enhancing the neural representation of relevant sounds, but also by reducing the representation of irrelevant information in the auditory cortex. Finally, they suggest a specialization of the left hemisphere in the attentional selection of fine-grained acoustic information.

  11. Entrainment to an auditory signal: Is attention involved?

    NARCIS (Netherlands)

    Kunert, R.; Jongman, S.R.

    2017-01-01

    Many natural auditory signals, including music and language, change periodically. The effect of such auditory rhythms on the brain is unclear however. One widely held view, dynamic attending theory, proposes that the attentional system entrains to the rhythm and increases attention at moments of

  12. Selective attention reduces physiological noise in the external ear canals of humans. I: Auditory attention

    Science.gov (United States)

    Walsh, Kyle P.; Pasanen, Edward G.; McFadden, Dennis

    2014-01-01

    In this study, a nonlinear version of the stimulus-frequency OAE (SFOAE), called the nSFOAE, was used to measure cochlear responses from human subjects while they simultaneously performed behavioral tasks requiring, or not requiring, selective auditory attention. Appended to each stimulus presentation, and included in the calculation of each nSFOAE response, was a 30-ms silent period that was used to estimate the level of the inherent physiological noise in the ear canals of our subjects during each behavioral condition. Physiological-noise magnitudes were higher (noisier) for all subjects in the inattention task, and lower (quieter) in the selective auditory-attention tasks. These noise measures initially were made at the frequency of our nSFOAE probe tone (4.0 kHz), but the same attention effects also were observed across a wide range of frequencies. We attribute the observed differences in physiological-noise magnitudes between the inattention and attention conditions to different levels of efferent activation associated with the differing attentional demands of the behavioral tasks. One hypothesis is that when the attentional demand is relatively great, efferent activation is relatively high, and a decrease in the gain of the cochlear amplifier leads to lower-amplitude cochlear activity, and thus a smaller measure of noise from the ear. PMID:24732069
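
    The noise measure described above amounts to the energy recorded during the 30-ms silent window, evaluated at (or around) the 4.0-kHz probe frequency. A single-frequency power estimate of this kind can be sketched with the Goertzel recursion (illustrative only; the study's actual signal processing is not described here):

```python
import math

def goertzel_power(samples, fs, freq):
    """Squared magnitude of `samples` at `freq` Hz (sampling rate `fs` Hz),
    computed with the Goertzel algorithm."""
    n = len(samples)
    k = round(n * freq / fs)                      # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

    At a 32-kHz sampling rate, for instance, a 30-ms window is 960 samples and the 4-kHz component falls exactly on DFT bin 120.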

  13. Selective attention modulates human auditory brainstem responses: relative contributions of frequency and spatial cues.

    Directory of Open Access Journals (Sweden)

    Alexandre Lehmann

    Full Text Available Selective attention is the mechanism that allows focusing one's attention on a particular stimulus while filtering out a range of other stimuli, for instance, on a single conversation in a noisy room. Attending to one sound source rather than another changes activity in the human auditory cortex, but it is unclear whether attention to different acoustic features, such as voice pitch and speaker location, modulates subcortical activity. Studies using a dichotic listening paradigm indicated that auditory brainstem processing may be modulated by the direction of attention. We investigated whether endogenous selective attention to one of two speech signals affects amplitude and phase locking in auditory brainstem responses when the signals were either discriminable by frequency content alone, or by frequency content and spatial location. Frequency-following responses to the speech sounds were significantly modulated in both conditions. The modulation was specific to the task-relevant frequency band. The effect was stronger when both frequency and spatial information were available. Patterns of response were variable between participants, and were correlated with psychophysical discriminability of the stimuli, suggesting that the modulation was biologically relevant. Our results demonstrate that auditory brainstem responses are susceptible to efferent modulation related to behavioral goals. Furthermore they suggest that mechanisms of selective attention actively shape activity at early subcortical processing stages according to task relevance and based on frequency and spatial cues.

  14. Musical experience shapes top-down auditory mechanisms: evidence from masking and auditory attention performance.

    Science.gov (United States)

    Strait, Dana L; Kraus, Nina; Parbery-Clark, Alexandra; Ashley, Richard

    2010-03-01

    A growing body of research suggests that cognitive functions, such as attention and memory, drive perception by tuning sensory mechanisms to relevant acoustic features. Long-term musical experience also modulates lower-level auditory function, although the mechanisms by which this occurs remain uncertain. In order to tease apart the mechanisms that drive perceptual enhancements in musicians, we posed the question: do well-developed cognitive abilities fine-tune auditory perception in a top-down fashion? We administered a standardized battery of perceptual and cognitive tests to adult musicians and non-musicians, including tasks either more or less susceptible to cognitive control (e.g., backward versus simultaneous masking) and more or less dependent on auditory or visual processing (e.g., auditory versus visual attention). Outcomes indicate lower perceptual thresholds in musicians specifically for auditory tasks that relate to cognitive abilities, such as backward masking and auditory attention. These enhancements were observed in the absence of group differences for the simultaneous masking and visual attention tasks. Our results suggest that long-term musical practice strengthens cognitive functions and that these functions benefit auditory skills. Musical training bolsters higher-level mechanisms that, when impaired, relate to language and literacy deficits. Thus, musical training may serve to lessen the impact of these deficits by strengthening the corticofugal system for hearing. 2009 Elsevier B.V. All rights reserved.

  15. Attentional Capture by Deviant Sounds: A Noncontingent Form of Auditory Distraction?

    Science.gov (United States)

    Vachon, François; Labonté, Katherine; Marsh, John E.

    2017-01-01

    The occurrence of an unexpected, infrequent sound in an otherwise homogeneous auditory background tends to disrupt the ongoing cognitive task. This "deviation effect" is typically explained in terms of attentional capture whereby the deviant sound draws attention away from the focal activity, regardless of the nature of this activity.…

  16. Frequency-specific attentional modulation in human primary auditory cortex and midbrain

    NARCIS (Netherlands)

    Riecke, Lars; Peters, Judith C; Valente, Giancarlo; Poser, Benedikt A; Kemper, Valentin G; Formisano, Elia; Sorger, Bettina

    2018-01-01

    Paying selective attention to an audio frequency selectively enhances activity within primary auditory cortex (PAC) at the tonotopic site (frequency channel) representing that frequency. Animal PAC neurons achieve this 'frequency-specific attentional spotlight' by adapting their frequency tuning,

  17. Negative emotion provides cues for orienting auditory spatial attention

    Directory of Open Access Journals (Sweden)

    Erkin Asutay

    2015-05-01

    Full Text Available The auditory stimuli provide information about the objects and events around us. They can also carry biologically significant emotional information (such as unseen dangers and conspecific vocalizations), which provides cues for allocation of attention and mental resources. Here, we investigated whether task-irrelevant auditory emotional information can provide cues for orientation of auditory spatial attention. We employed a covert spatial orienting task: the dot-probe task. In each trial, two task-irrelevant auditory cues were simultaneously presented at two separate locations (left-right or front-back). Environmental sounds were selected to form emotional vs. neutral, emotional vs. emotional, and neutral vs. neutral cue pairs. The participants’ task was to detect the location of an acoustic target that was presented immediately after the task-irrelevant auditory cues. The target was presented at the same location as one of the auditory cues. The results indicated that participants were significantly faster to locate the target when it replaced the negative cue compared to when it replaced the neutral cue. The positive cues did not produce a clear attentional bias. Further, same-valence pairs (emotional-emotional or neutral-neutral) did not modulate reaction times due to a lack of spatial attention capture by one cue in the pair. Taken together, the results indicate that negative affect can provide cues for the orientation of spatial attention in the auditory domain.
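
    In dot-probe designs like this one, the attentional bias is typically quantified as the mean reaction-time difference between probes that replaced the neutral cue and probes that replaced the emotional cue. A minimal sketch (names illustrative, not the authors' code):

```python
def attentional_bias(rts_neutral_replaced, rts_emotional_replaced):
    """Positive values mean faster responses at the emotional cue's
    location, i.e., attention was drawn toward the emotional sound."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rts_neutral_replaced) - mean(rts_emotional_replaced)
```

    A reliably positive bias for negative-vs-neutral pairs, with no comparable effect for positive cues, is the pattern the abstract reports.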

  18. Modulatory Effects of Attention on Lateral Inhibition in the Human Auditory Cortex.

    Science.gov (United States)

    Engell, Alva; Junghöfer, Markus; Stein, Alwina; Lau, Pia; Wunderlich, Robert; Wollbrink, Andreas; Pantev, Christo

    2016-01-01

    Reduced neural processing of a tone is observed when it is presented after a sound whose spectral range closely frames the frequency of the tone. This observation might be explained by the mechanism of lateral inhibition (LI) due to inhibitory interneurons in the auditory system. So far, several characteristics of bottom-up influences on LI have been identified, while the influence of top-down processes such as directed attention on LI has not been investigated. Hence, the study at hand aims at investigating the modulatory effects of focused attention on LI in the human auditory cortex. In the magnetoencephalograph, we presented two types of masking sounds (white noise vs. white noise passed through a notch filter centered at a specific frequency), followed by a test tone with a frequency corresponding to the center-frequency of the notch filter. Simultaneously, subjects were presented with visual input on a screen. To modulate the focus of attention, subjects were instructed to concentrate either on the auditory input or the visual stimuli. More specifically, on one half of the trials, subjects were instructed to detect small deviations in loudness in the masking sounds, while on the other half of the trials subjects were asked to detect target stimuli on the screen. The results revealed a reduction in neural activation due to LI, which was larger during auditory compared to visual focused attention. Attentional modulations of LI were observed in two post-N1m time intervals. These findings underline the robustness of reduced neural activation due to LI in the auditory cortex and point towards the important role of attention in the modulation of this mechanism in more evaluative processing stages.

  19. Auditory and Visual Attention Performance in Children With ADHD: The Attentional Deficiency of ADHD Is Modality Specific.

    Science.gov (United States)

    Lin, Hung-Yu; Hsieh, Hsieh-Chun; Lee, Posen; Hong, Fu-Yuan; Chang, Wen-Dien; Liu, Kuo-Cheng

    2017-08-01

    This study explored auditory and visual attention in children with ADHD. In a randomized, two-period crossover design, 50 children with ADHD and 50 age- and sex-matched typically developing peers were measured with the Test of Variables of Attention (TOVA). The deficiency of visual attention is more serious than that of auditory attention in children with ADHD. On the auditory modality, only the deficit of attentional inconsistency is sufficient to explain most cases of ADHD; however, most of the children with ADHD suffered from deficits of sustained attention, response inhibition, and attentional inconsistency on the visual modality. Our results also showed that the deficit of attentional inconsistency is the most important indicator in diagnosing and intervening in ADHD when both auditory and visual modalities are considered. The findings provide strong evidence that the deficits of auditory attention are different from those of visual attention in children with ADHD.

  20. Stroke caused auditory attention deficits in children

    Directory of Open Access Journals (Sweden)

    Karla Maria Ibraim da Freiria Elias

    2013-01-01

    Full Text Available OBJECTIVE: To verify auditory selective attention in children with stroke. METHODS: Dichotic tests of binaural separation (non-verbal and consonant-vowel) and binaural integration (digits and the Staggered Spondaic Words Test - SSW) were applied in 13 children (7 boys), aged 7 to 16 years, with unilateral stroke confirmed by neurological examination and neuroimaging. RESULTS: Attention performance showed significant differences in comparison to the control group in both kinds of tests. In the non-verbal test, identification at the ear opposite the lesion was diminished in the free-recall stage and, in the following stages, a difficulty in directing attention was detected. In the consonant-vowel test, a modification in perceptual asymmetry and difficulty in focusing in the attended stages were found. In the digits and SSW tests, ipsilateral, contralateral and bilateral deficits were detected, depending on the characteristics of the lesions and the demand of the task. CONCLUSION: Stroke caused auditory attention deficits when dealing with simultaneous sources of auditory information.

  1. Music training relates to the development of neural mechanisms of selective auditory attention.

    Science.gov (United States)

    Strait, Dana L; Slater, Jessica; O'Connell, Samantha; Kraus, Nina

    2015-04-01

    Selective attention decreases trial-to-trial variability in cortical auditory-evoked activity. This effect increases over the course of maturation, potentially reflecting the gradual development of selective attention and inhibitory control. Work in adults indicates that music training may alter the development of this neural response characteristic, especially over brain regions associated with executive control: in adult musicians, attention decreases variability in auditory-evoked responses recorded over prefrontal cortex to a greater extent than in nonmusicians. We aimed to determine whether this musician-associated effect emerges during childhood, when selective attention and inhibitory control are under development. We compared cortical auditory-evoked variability to attended and ignored speech streams in musicians and nonmusicians across three age groups: preschoolers, school-aged children and young adults. Results reveal that childhood music training is associated with reduced auditory-evoked response variability recorded over prefrontal cortex during selective auditory attention in school-aged child and adult musicians. Preschoolers, on the other hand, demonstrate no impact of selective attention on cortical response variability and no musician distinctions. This finding is consistent with the gradual emergence of attention during this period and may suggest no pre-existing differences in this attention-related cortical metric between children who undergo music training and those who do not. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Modulatory Effects of Attention on Lateral Inhibition in the Human Auditory Cortex.

    Directory of Open Access Journals (Sweden)

    Alva Engell

    Full Text Available Reduced neural processing of a tone is observed when it is presented after a sound whose spectral range closely frames the frequency of the tone. This observation might be explained by the mechanism of lateral inhibition (LI) due to inhibitory interneurons in the auditory system. So far, several characteristics of bottom-up influences on LI have been identified, while the influence of top-down processes such as directed attention on LI has not been investigated. Hence, the study at hand aims at investigating the modulatory effects of focused attention on LI in the human auditory cortex. During magnetoencephalography (MEG) recording, we presented two types of masking sounds (white noise vs. white noise passed through a notch filter centered at a specific frequency), followed by a test tone with a frequency corresponding to the center frequency of the notch filter. Simultaneously, subjects were presented with visual input on a screen. To modulate the focus of attention, subjects were instructed to concentrate either on the auditory input or on the visual stimuli. More specifically, on one half of the trials, subjects were instructed to detect small deviations in loudness in the masking sounds, while on the other half of the trials subjects were asked to detect target stimuli on the screen. The results revealed a reduction in neural activation due to LI, which was larger during auditory than during visual focused attention. Attentional modulations of LI were observed in two post-N1m time intervals. These findings underline the robustness of reduced neural activation due to LI in the auditory cortex and point towards the important role of attention in the modulation of this mechanism at more evaluative processing stages.

  3. Attention-driven auditory cortex short-term plasticity helps segregate relevant sounds from noise.

    Science.gov (United States)

    Ahveninen, Jyrki; Hämäläinen, Matti; Jääskeläinen, Iiro P; Ahlfors, Seppo P; Huang, Samantha; Lin, Fa-Hsuan; Raij, Tommi; Sams, Mikko; Vasios, Christos E; Belliveau, John W

    2011-03-08

    How can we concentrate on relevant sounds in noisy environments? A "gain model" suggests that auditory attention simply amplifies relevant and suppresses irrelevant afferent inputs. However, it is unclear whether this suffices when attended and ignored features overlap to stimulate the same neuronal receptive fields. A "tuning model" suggests that, in addition to gain, attention modulates the feature selectivity of auditory neurons. We recorded magnetoencephalography, EEG, and functional MRI (fMRI) while subjects attended to tones delivered to one ear and ignored opposite-ear inputs. The attended ear was switched every 30 s to quantify how quickly the effects evolve. To produce overlapping inputs, the tones were presented either alone or during white-noise masking, with the noise notch-filtered ±1/6 octave around the tone center frequencies. Amplitude modulation (39 vs. 41 Hz in opposite ears) was applied for "frequency tagging" of attention effects on the maskers. Noise masking reduced early (50-150 ms; N1) auditory responses to unattended tones. In support of the tuning model, selective attention canceled out this attenuating effect but did not modulate the gain of 50-150 ms activity to nonmasked tones or of steady-state responses to the maskers themselves. These tuning effects originated in nonprimary auditory cortices, purportedly occupied by neurons that, without attention, have frequency tuning wider than ±1/6 octave. The attentional tuning evolved rapidly, during the first few seconds after attention switching, and correlated with behavioral discrimination performance. In conclusion, a simple gain model alone cannot explain auditory selective attention. In nonprimary auditory cortices, attention-driven short-term plasticity retunes neurons to segregate relevant sounds from noise.
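    The masking stimulus described in this record combines notch-filtered noise with amplitude-modulation "frequency tagging". A minimal sketch of how such a stimulus could be generated with numpy/scipy follows; the filter design and parameter values are illustrative assumptions, not the authors' exact procedure:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def notched_am_masker(tone_hz, fs=44100, dur=1.0, notch_octaves=1/6,
                          am_hz=39.0, seed=0):
        """White-noise masker with a +/- `notch_octaves` notch around `tone_hz`,
        amplitude-modulated at `am_hz` for frequency tagging (illustrative sketch)."""
        rng = np.random.default_rng(seed)
        noise = rng.standard_normal(int(fs * dur))
        # Notch edges at +/- 1/6 octave around the tone's center frequency
        lo, hi = tone_hz * 2 ** -notch_octaves, tone_hz * 2 ** notch_octaves
        sos = butter(4, [lo, hi], btype='bandstop', fs=fs, output='sos')
        notched = sosfiltfilt(sos, noise)
        # Sinusoidal amplitude modulation "tags" the masker at am_hz
        t = np.arange(notched.size) / fs
        am = 0.5 * (1.0 + np.sin(2.0 * np.pi * am_hz * t))
        return notched * am
    ```

    The opposite-ear masker would use `am_hz=41.0`, so that the steady-state responses to the two maskers can be separated in the frequency domain.
    
    
    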

  4. Cross-modal attention influences auditory contrast sensitivity: Decreasing visual load improves auditory thresholds for amplitude- and frequency-modulated sounds.

    Science.gov (United States)

    Ciaramitaro, Vivian M; Chow, Hiu Mei; Eglington, Luke G

    2017-03-01

    We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers had to attend to two consecutive intervals of sounds and report which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid sequential visual presentation (two consecutive intervals of streams of visual letters) and had to report which interval contained a particular color (low load, demanding fewer attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, hence improving auditory thresholds. Auditory detection thresholds were lower, that is, auditory sensitivity was improved, for both amplitude- and frequency-modulated sounds when observers engaged in the less demanding (compared to the more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality, cross-modal attention, can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms for visual and auditory attention.

  5. Auditory-Cortex Short-Term Plasticity Induced by Selective Attention

    Science.gov (United States)

    Jääskeläinen, Iiro P.; Ahveninen, Jyrki

    2014-01-01

    The ability to concentrate on relevant sounds in the acoustic environment is crucial for everyday function and communication. Converging lines of evidence suggest that transient functional changes in auditory-cortex neurons, "short-term plasticity", might explain this fundamental function. Under conditions of strongly focused attention, enhanced processing of attended sounds can take place at very early latencies (~50 ms from sound onset) in primary auditory cortex and possibly even earlier in subcortical structures. More robust selective-attention short-term plasticity is manifested as modulation of responses peaking at ~100 ms from sound onset in functionally specialized nonprimary auditory-cortical areas, by way of stimulus-specific reshaping of neuronal receptive fields that supports filtering of selectively attended sound features from task-irrelevant ones. Such effects have been shown to take hold within seconds of shifting the attentional focus. There are findings suggesting that the reshaping of neuronal receptive fields is even stronger at longer auditory-cortex response latencies (~300 ms from sound onset). These longer-latency short-term plasticity effects seem to build up more gradually, within tens of seconds after shifting the focus of attention. Importantly, some of the auditory-cortical short-term plasticity effects observed during selective attention predict enhancements in behaviorally measured sound discrimination performance. PMID:24551458

  6. Neuronal effects of nicotine during auditory selective attention.

    Science.gov (United States)

    Smucny, Jason; Olincy, Ann; Eichman, Lindsay S; Tregellas, Jason R

    2015-06-01

    Although the attention-enhancing effects of nicotine have been behaviorally and neurophysiologically well documented, its localized functional effects during selective attention are poorly understood. In this study, we examined the neuronal effects of nicotine during auditory selective attention in healthy human nonsmokers. We hypothesized that we would observe significant effects of nicotine in attention-associated brain areas, driven by nicotine-induced increases in activity as a function of increasing task demands. A single-blind, prospective, randomized crossover design was used to examine neuronal response associated with a go/no-go task after 7 mg nicotine or placebo patch administration in 20 individuals who underwent functional magnetic resonance imaging at 3T. The task design included two levels of difficulty (ordered vs. random stimuli) and two levels of auditory distraction (silence vs. noise). Significant treatment × difficulty × distraction interaction effects on neuronal response were observed in the hippocampus, ventral parietal cortex, and anterior cingulate. In contrast to our hypothesis, U-shaped and inverted-U-shaped dependencies were observed between the effects of nicotine on response and task demands, depending on the brain area. These results suggest that nicotine may differentially affect neuronal response depending on task conditions. These results have important theoretical implications for understanding how cholinergic tone may influence the neurobiology of selective attention.

  7. Selective and divided attention modulates auditory-vocal integration in the processing of pitch feedback errors.

    Science.gov (United States)

    Liu, Ying; Hu, Huijing; Jones, Jeffery A; Guo, Zhiqiang; Li, Weifeng; Chen, Xi; Liu, Peng; Liu, Hanjun

    2015-08-01

    Speakers rapidly adjust their ongoing vocal productions to compensate for errors they hear in their auditory feedback. It is currently unclear what role attention plays in these vocal compensations. This event-related potential (ERP) study examined the influence of selective and divided attention on the vocal and cortical responses to pitch errors heard in auditory feedback regarding ongoing vocalisations. During the production of a sustained vowel, participants briefly heard their vocal pitch shifted up two semitones while they actively attended to auditory or visual events (selective attention), or both auditory and visual events (divided attention), or were not told to attend to either modality (control condition). The behavioral results showed that attending to the pitch perturbations elicited larger vocal compensations than attending to the visual stimuli. Moreover, ERPs were likewise sensitive to the attentional manipulations: P2 responses to pitch perturbations were larger when participants attended to the auditory stimuli compared to when they attended to the visual stimuli, and compared to when they were not explicitly told to attend to either the visual or auditory stimuli. By contrast, dividing attention between the auditory and visual modalities caused suppressed P2 responses relative to all the other conditions and caused enhanced N1 responses relative to the control condition. These findings provide strong evidence for the influence of attention on the mechanisms underlying the auditory-vocal integration in the processing of pitch feedback errors. In addition, selective attention and divided attention appear to modulate the neurobehavioral processing of pitch feedback errors in different ways. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  8. Auditory and visual sustained attention in Down syndrome.

    Science.gov (United States)

    Faught, Gayle G; Conners, Frances A; Himmelberger, Zachary M

    2016-01-01

    Sustained attention (SA) is important to task performance and the development of higher functions. It emerges as a separable component of attention during the preschool years and shows incremental improvements during this stage of development. The current study investigated whether auditory and visual SA match developmental level or pose particular challenges for youth with Down syndrome (DS). Further, we sought to determine whether there were modality effects in SA that could predict those seen in short-term memory (STM). We compared youth with DS to typically developing youth matched for nonverbal mental age and receptive vocabulary. Groups completed auditory and visual sustained attention to response tests (SARTs) and STM tasks. Results indicated that the groups performed similarly on both SARTs, even across varying cognitive ability. Further, within groups, participants performed similarly on the auditory and visual SARTs, so SA could not predict modality effects in STM. However, SA did generally predict a significant portion of unique variance in the groups' STM. Ultimately, the results suggested that both auditory and visual SA match developmental level in DS. Further, SA generally predicts STM, though it does not necessarily predict the pattern of poor auditory relative to visual STM characteristic of DS. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Auditory attention: time of day and type of school

    Directory of Open Access Journals (Sweden)

    Picolini, Mirela Machado

    2010-06-01

    Full Text Available Introduction: Sustained auditory attention is crucial for the development of some communication and learning skills. Objective: To evaluate the effect of the time of day and the type of school attended on children's capacity for sustained auditory attention. Method: We performed a prospective study of 50 volunteer children of both sexes, aged 7 years, with normal hearing, no learning or behavioral problems, and no attention complaints. Participants underwent the Sustained Auditory Attention Ability Test (SAAAT). Performance was evaluated by the total error score and the decrement of vigilance. Statistical analysis used analysis of variance (ANOVA) with a significance level of 5% (p<0.05). Results: Relative to the test's norms for the age group evaluated, statistically significant differences were found for inattention errors (p=0.041, p=0.027) and the total error score (p=0.033, p=0.024) across assessment periods and school types, respectively. Conclusion: Children evaluated in the afternoon and children studying in public schools showed poorer sustained auditory attention.
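    The group comparison described above rests on an ANOVA at the 5% significance level. With scipy, such a comparison can be sketched as follows; the error counts below are fabricated placeholders, not the study's data:

    ```python
    from scipy.stats import f_oneway

    # Hypothetical inattention-error counts per child (illustrative, not the study's data)
    morning   = [3, 4, 2, 5, 3, 4]
    afternoon = [6, 7, 5, 8, 6, 7]

    # One-way ANOVA across assessment periods
    f_stat, p_value = f_oneway(morning, afternoon)
    alpha = 0.05
    significant = p_value < alpha
    ```

    With more than two groups (e.g., adding a second school type), the same `f_oneway` call simply takes additional sample arrays.
    
    
    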

  10. Superior pre-attentive auditory processing in musicians.

    Science.gov (United States)

    Koelsch, S; Schröger, E; Tervaniemi, M

    1999-04-26

    The present study focuses on influences of long-term experience on auditory processing, providing the first evidence for pre-attentively superior auditory processing in musicians. This was revealed by the brain's automatic change-detection response, which is reflected electrically as the mismatch negativity (MMN) and generated by the operation of sensory (echoic) memory, the earliest cognitive memory system. Major chords and single tones were presented to both professional violinists and non-musicians under ignore and attend conditions. Slightly impure chords, presented among perfect major chords, elicited a distinct MMN in professional musicians, but not in non-musicians. This demonstrates that musicians are superior to non-musicians at pre-attentively extracting information from musically relevant stimuli. Since effects of long-term experience on pre-attentive auditory processing have so far been reported only for language-specific phonemes, these results indicate that sensory memory mechanisms can be modulated by training at a more general level.

  11. Coupling between Theta Oscillations and Cognitive Control Network during Cross-Modal Visual and Auditory Attention: Supramodal vs Modality-Specific Mechanisms.

    Science.gov (United States)

    Wang, Wuyi; Viswanathan, Shivakumar; Lee, Taraz; Grafton, Scott T

    2016-01-01

    Cortical theta band oscillations (4-8 Hz) in EEG signals have been shown to be important for a variety of cognitive control operations in visual attention paradigms. However, the cortical sources of these signals, as defined by fMRI BOLD activity, and the extent to which theta oscillations play a role in multimodal attention remain unknown. Here we investigated the extent to which cross-modal visual and auditory attention impacts theta oscillations. Using a simultaneous EEG-fMRI paradigm, healthy human participants performed an attentional vigilance task with six cross-modal conditions using naturalistic stimuli. To assess supramodal mechanisms, modulation of theta oscillation amplitude for attention to either visual or auditory stimuli was correlated with BOLD activity by conjunction analysis. Theta amplitude correlated negatively with BOLD activity in cortical regions associated with the default mode network (DMN) and positively in ventral premotor areas. Modality-specific attention to visual stimuli was marked by a positive correlation of theta and BOLD activity in fronto-parietal areas that was not observed in the auditory condition. During auditory attention, a positive correlation of theta and BOLD activity was observed in auditory cortex and a negative correlation in visual cortex. The data support a supramodal interaction of theta activity with DMN function, and modality-specific processes within fronto-parietal networks related to top-down, theta-mediated cognitive control in cross-modal visual attention. In sensory cortices, by contrast, theta activity has opposing effects during cross-modal auditory attention.

  12. Self-Regulation of the Primary Auditory Cortex Attention Via Directed Attention Mediated By Real Time fMRI Neurofeedback

    Science.gov (United States)

    2017-05-05

    This record is a professional presentation approval memo (59 MDW/SGYU) for the paper "Self-regulation of the primary auditory cortex attention via directed attention mediated by real-time fMRI neurofeedback" (M. S. Sherwood et al.), which addresses reducing auditory cortex hyperactivity through self-regulation of the primary auditory cortex (A1) based on real-time functional magnetic resonance imaging neurofeedback.

  13. Psychometric properties of Persian version of the Sustained Auditory Attention Capacity Test in children with attention deficit-hyperactivity disorder.

    Science.gov (United States)

    Soltanparast, Sanaz; Jafari, Zahra; Sameni, Seyed Jalal; Salehi, Masoud

    2014-01-01

    The purpose of the present study was to evaluate the psychometric properties (validity and reliability) of the Persian version of the Sustained Auditory Attention Capacity Test in children with attention deficit hyperactivity disorder (ADHD). The Persian version of the test was constructed to assess sustained auditory attention using the method of Feniman and colleagues (2007). The test yields indicators of attentional deficit: inattention and impulsiveness errors, the total score, and an attention span reduction index. To determine validity and reliability, both the Rey Auditory Verbal Learning Test and the Persian version of the Sustained Auditory Attention Capacity Test (SAACT) were administered to 46 normal children and 41 children with ADHD, all right-handed, aged 7 to 11 years, of both genders. For convergent validity, a significant negative correlation was found between three parts of the Rey Auditory Verbal Learning Test (first trial, fifth trial, and immediate recall) and all indicators of the SAACT except attention span reduction. Comparing test scores between the normal and ADHD groups, discriminant validity analysis showed significant differences in all indicators of the test except attention span reduction. The Persian version of the Sustained Auditory Attention Capacity Test therefore has good validity and reliability, comparable to other established tests, and can be used to identify children with attention deficits and those suspected of having ADHD.

  14. Interhemispheric interaction expands attentional capacity in an auditory selective attention task.

    Science.gov (United States)

    Scalf, Paige E; Banich, Marie T; Erickson, Andrew B

    2009-04-01

    Previous work from our laboratory indicates that interhemispheric interaction (IHI) functionally increases the attentional capacity available to support performance on visual tasks (Banich in The asymmetrical brain, pp 261-302, 2003). Because manipulations of both computational complexity and selection demand alter the benefits of IHI to task performance, we argue that IHI may be a general strategy for meeting increases in attentional demand. Other researchers, however, have suggested that the apparent benefits of IHI to attentional capacity are an epiphenomenon of the organization of the visual system (Fecteau and Enns in Neuropsychologia 43:1412-1428, 2005; Marsolek et al. in Neuropsychologia 40:1983-1999, 2002). In the current experiment, we investigate whether IHI increases attentional capacity outside the visual system by manipulating the selection demands of an auditory temporal pattern-matching task. We find that IHI expands attentional capacity in the auditory system. This suggests that the benefits of requiring IHI derive from a functional increase in attentional capacity rather than the organization of a specific sensory modality.

  15. The auditory attention status in Iranian bilingual and monolingual people

    Directory of Open Access Journals (Sweden)

    Nayiere Mansoori

    2013-05-01

    Full Text Available Background and Aim: Bilingualism, one of the ongoing topics of psychology and linguistics, can influence speech processing. Among the several tests for assessing auditory processing, the dichotic digit test has been designed to study divided auditory attention. Our study compared divided auditory attention between Iranian bilingual and monolingual young adults. Methods: This cross-sectional study was conducted on 60 students, 30 Turkish-Persian bilinguals and 30 Persian monolinguals, aged 18 to 30 years, of both genders. The dichotic digit test was performed on young individuals with normal peripheral hearing and right-hand preference. Results: No significant difference was found between the dichotic digit test results of monolinguals and bilinguals (p=0.195), nor between the results of the right and left ears in the monolingual (p=0.460) and bilingual (p=0.054) groups. The mean score of women was significantly higher than that of men (p=0.031). Conclusion: There was no significant difference between bilinguals and monolinguals in divided auditory attention; it seems that acquisition of a second language at an early age has no noticeable effect on this type of auditory attention.
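    Dichotic digit tests like the one used here are typically scored as percent correct per ear, and the right-left difference (an ear-advantage index) is a common derived measure. A hypothetical scoring sketch; the trial representation and scoring convention below are assumptions, not the study's protocol:

    ```python
    def score_dichotic(trials):
        """Score a dichotic digits run (binaural integration, free recall).

        trials: list of (right_digit, left_digit, reported_digits) tuples, where
        reported_digits is the set of digits the listener reported on that trial.
        Returns (right_ear_pct, left_ear_pct, right_ear_advantage)."""
        n = len(trials)
        right_correct = sum(r in reported for r, l, reported in trials)
        left_correct = sum(l in reported for r, l, reported in trials)
        right_pct = 100.0 * right_correct / n
        left_pct = 100.0 * left_correct / n
        # Positive values indicate a right-ear advantage
        return right_pct, left_pct, right_pct - left_pct
    ```

    A per-ear comparison across groups (e.g., bilinguals vs. monolinguals) would then operate on these percent-correct scores.
    
    
    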

  16. Neural dynamics underlying attentional orienting to auditory representations in short-term memory.

    Science.gov (United States)

    Backer, Kristina C; Binns, Malcolm A; Alain, Claude

    2015-01-21

    Sounds are ephemeral. Thus, coherent auditory perception depends on "hearing" back in time: retrospectively attending that which was lost externally but preserved in short-term memory (STM). Current theories of auditory attention assume that sound features are integrated into a perceptual object, that multiple objects can coexist in STM, and that attention can be deployed to an object in STM. Recording electroencephalography from humans, we tested these assumptions, elucidating feature-general and feature-specific neural correlates of auditory attention to STM. Alpha/beta oscillations and frontal and posterior event-related potentials indexed feature-general top-down attentional control to one of several coexisting auditory representations in STM. Particularly, task performance during attentional orienting was correlated with alpha/low-beta desynchronization (i.e., power suppression). However, attention to one feature could occur without simultaneous processing of the second feature of the representation. Therefore, auditory attention to memory relies on both feature-specific and feature-general neural dynamics. Copyright © 2015 the authors.
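    Alpha/low-beta desynchronization of the kind correlated with task performance here is usually quantified as a percent change in band-limited power relative to a baseline window. A minimal sketch using the Hilbert envelope; the band edges and window times are illustrative assumptions, not this study's analysis parameters:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def band_power_change(eeg, fs, band=(8.0, 14.0),
                          baseline=(0.0, 0.5), window=(0.5, 1.5)):
        """Percent change in band-limited power during `window` relative to
        `baseline` (times in seconds from epoch start); negative values
        indicate desynchronization (power suppression)."""
        sos = butter(4, band, btype='bandpass', fs=fs, output='sos')
        # Instantaneous power = squared Hilbert envelope of the band-passed signal
        power = np.abs(hilbert(sosfiltfilt(sos, eeg))) ** 2
        t = np.arange(eeg.size) / fs
        base = power[(t >= baseline[0]) & (t < baseline[1])].mean()
        post = power[(t >= window[0]) & (t < window[1])].mean()
        return 100.0 * (post - base) / base
    ```

    In practice this would be computed per trial and per channel, then averaged and correlated with behavior.
    
    
    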

  17. Auditory and visual capture during focused visual attention

    NARCIS (Netherlands)

    Koelewijn, T.; Bronkhorst, A.W.; Theeuwes, J.

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies…

  18. Control of Auditory Attention in Children With Specific Language Impairment.

    Science.gov (United States)

    Victorino, Kristen R; Schwartz, Richard G

    2015-08-01

    Children with specific language impairment (SLI) appear to demonstrate deficits in attention and its control. Selective attention involves the cognitive control of attention directed toward a relevant stimulus and simultaneous inhibition of attention toward irrelevant stimuli. The current study examined attention control during a cross-modal word recognition task. Twenty participants with SLI (ages 9-12 years) and 20 age-matched peers with typical language development (TLD) listened to words through headphones and were instructed to attend to the words in 1 ear while ignoring the words in the other ear. They were simultaneously presented with pictures and asked to make a lexical decision about whether the pictures and auditory words were the same or different. Accuracy and reaction time were measured in 5 conditions, in which the stimulus in the unattended channel was manipulated. The groups performed with similar accuracy. Compared with their peers with TLD, children with SLI had slower reaction times overall and different within-group patterns of performance by condition. Children with TLD showed efficient inhibitory control in conditions that required active suppression of competing stimuli. Participants with SLI had difficulty exerting control over their auditory attention in all conditions, with particular difficulty inhibiting distractors of all types.

  19. Modelling auditory attention: Insights from the Theory of Visual Attention (TVA)

    DEFF Research Database (Denmark)

    Roberts, K. L.; Andersen, Tobias; Kyllingsbæk, Søren

    We report initial progress towards creating an auditory analogue of a mathematical model of visual attention: the 'Theory of Visual Attention' (TVA; Bundesen, 1990). TVA is one of the best established models of visual attention. It assumes that visual stimuli are initially processed in parallel, and that there is a 'race' for selection and representation in visual short-term memory (VSTM). In the basic TVA task, participants view a brief display of letters and are asked to report either all of the letters (whole report) or a subset of the letters (e.g., the red letters; partial report). Fitting the model… better modelled using a log-logistic function than an exponential function. A more challenging difference is that in the partial report task, there is more target-distractor confusion for auditory than visual stimuli. This failure of object-formation (prior to attentional object-selection) is not yet effectively…

  20. Auditory and Visual Capture during Focused Visual Attention

    Science.gov (United States)

    Koelewijn, Thomas; Bronkhorst, Adelbert; Theeuwes, Jan

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies have not differentiated between capture by onsets…

  1. The role of working memory in auditory selective attention.

    Science.gov (United States)

    Dalton, Polly; Santangelo, Valerio; Spence, Charles

    2009-11-01

    A growing body of research now demonstrates that working memory plays an important role in controlling the extent to which irrelevant visual distractors are processed during visual selective attention tasks (e.g., Lavie, Hirst, De Fockert, & Viding, 2004). Recently, it has been shown that the successful selection of tactile information also depends on the availability of working memory (Dalton, Lavie, & Spence, 2009). Here, we investigate whether working memory plays a role in auditory selective attention. Participants focused their attention on short continuous bursts of white noise (targets) while attempting to ignore pulsed bursts of noise (distractors). Distractor interference in this auditory task, as measured in terms of the difference in performance between congruent and incongruent distractor trials, increased significantly under high (vs. low) load in a concurrent working-memory task. These results provide the first evidence demonstrating a causal role for working memory in reducing interference by irrelevant auditory distractors.

  2. Effect of handedness on auditory attentional performance in ADHD students

    Directory of Open Access Journals (Sweden)

    Schmidt SL

    2017-12-01

    Full Text Available Sergio L Schmidt,1,2 Ana Lucia Novais Carvaho,3 Eunice N Simoes2 (1Department of Neurophysiology, State University of Rio de Janeiro; 2Neurology Department, Federal University of the State of Rio de Janeiro, Rio de Janeiro; 3Department of Psychology, Fluminense Federal University, Niteroi, Brazil). Abstract: The relationship between handedness and attentional performance is poorly understood. Continuous performance tests (CPTs) using visual stimuli are commonly used to assess subjects suffering from attention deficit hyperactivity disorder (ADHD). However, auditory CPTs are considered more useful than visual ones for evaluating classroom attentional problems. A previous study reported a significant effect of handedness on students' performance on a visual CPT. Here, we examined whether handedness would also affect CPT performance using only auditory stimuli. From an initial sample of 337 students, 11 matched pairs were selected. Repeated-measures ANOVAs showed a significant effect of handedness on attentional performance, exhibited even in the control group. Left-handers made more commission errors than right-handers. We interpret these results as indicating that the association between ADHD and handedness reflects that consistent left-handers are less lateralized and have decreased interhemispheric connections. The auditory attentional data suggest that left-handers have problems in the impulsivity/hyperactivity domain. In ADHD, clinical therapeutics and rehabilitation must take handedness into account, because consistent sinistrals are more impulsive than dextrals. Keywords: attention, ADHD, consistent left-handers, auditory attention, continuous performance test

  3. Cognitive Training Enhances Auditory Attention Efficiency in Older Adults

    Directory of Open Access Journals (Sweden)

    Jennifer L. O’Brien

    2017-10-01

Full Text Available Auditory cognitive training (ACT) improves attention in older adults; however, the underlying neurophysiological mechanisms are still unknown. The present study examined the effects of ACT on the P3b event-related potential reflecting attention allocation (amplitude) and speed of processing (latency) during stimulus categorization, and on the P1-N1-P2 complex reflecting perceptual processing (amplitude and latency). Participants completed an auditory oddball task before and after 10 weeks of ACT (n = 9) or a no-contact control period (n = 15). Parietal P3b amplitudes to oddball stimuli decreased at post-test in the trained group as compared to those in the control group, and frontal P3b amplitudes showed a similar trend, potentially reflecting more efficient attentional allocation after ACT. No advantages for the ACT group were evident for auditory perceptual processing or speed of processing in this small sample. Our results provide preliminary evidence that ACT may enhance the efficiency of attention allocation, which may account for the positive impact of ACT on the everyday functioning of older adults.

  4. The human auditory brainstem response to running speech reveals a subcortical mechanism for selective attention.

    Science.gov (United States)

    Forte, Antonio Elia; Etard, Octave; Reichenbach, Tobias

    2017-10-10

    Humans excel at selectively listening to a target speaker in background noise such as competing voices. While the encoding of speech in the auditory cortex is modulated by selective attention, it remains debated whether such modulation occurs already in subcortical auditory structures. Investigating the contribution of the human brainstem to attention has, in particular, been hindered by the tiny amplitude of the brainstem response. Its measurement normally requires a large number of repetitions of the same short sound stimuli, which may lead to a loss of attention and to neural adaptation. Here we develop a mathematical method to measure the auditory brainstem response to running speech, an acoustic stimulus that does not repeat and that has a high ecological validity. We employ this method to assess the brainstem's activity when a subject listens to one of two competing speakers, and show that the brainstem response is consistently modulated by attention.

  5. Pre-attentive, context-specific representation of fear memory in the auditory cortex of rat.

    Directory of Open Access Journals (Sweden)

    Akihiro Funamizu

Full Text Available Neural representation in the auditory cortex is rapidly modulated by both top-down attention and bottom-up stimulus properties, in order to improve perception in a given context. Learning-induced, pre-attentive, map plasticity has been also studied in the anesthetized cortex; however, little attention has been paid to rapid, context-dependent modulation. We hypothesize that context-specific learning leads to pre-attentively modulated, multiplex representation in the auditory cortex. Here, we investigate map plasticity in the auditory cortices of anesthetized rats conditioned in a context-dependent manner, such that a conditioned stimulus (CS) of a 20-kHz tone and an unconditioned stimulus (US) of a mild electrical shock were associated only under a noisy auditory context, but not in silence. After the conditioning, although no distinct plasticity was found in the tonotopic map, tone-evoked responses were more noise-resistive than pre-conditioning. Yet, the conditioned group showed a reduced spread of activation to each tone with noise, but not with silence, associated with a sharpening of frequency tuning. The encoding accuracy index of neurons showed that conditioning deteriorated the accuracy of tone-frequency representations in noisy condition at off-CS regions, but not at CS regions, suggesting that arbitrary tones around the frequency of the CS were more likely perceived as the CS in a specific context, where CS was associated with US. These results together demonstrate that learning-induced plasticity in the auditory cortex occurs in a context-dependent manner.

  6. Pre-attentive, context-specific representation of fear memory in the auditory cortex of rat.

    Science.gov (United States)

    Funamizu, Akihiro; Kanzaki, Ryohei; Takahashi, Hirokazu

    2013-01-01

    Neural representation in the auditory cortex is rapidly modulated by both top-down attention and bottom-up stimulus properties, in order to improve perception in a given context. Learning-induced, pre-attentive, map plasticity has been also studied in the anesthetized cortex; however, little attention has been paid to rapid, context-dependent modulation. We hypothesize that context-specific learning leads to pre-attentively modulated, multiplex representation in the auditory cortex. Here, we investigate map plasticity in the auditory cortices of anesthetized rats conditioned in a context-dependent manner, such that a conditioned stimulus (CS) of a 20-kHz tone and an unconditioned stimulus (US) of a mild electrical shock were associated only under a noisy auditory context, but not in silence. After the conditioning, although no distinct plasticity was found in the tonotopic map, tone-evoked responses were more noise-resistive than pre-conditioning. Yet, the conditioned group showed a reduced spread of activation to each tone with noise, but not with silence, associated with a sharpening of frequency tuning. The encoding accuracy index of neurons showed that conditioning deteriorated the accuracy of tone-frequency representations in noisy condition at off-CS regions, but not at CS regions, suggesting that arbitrary tones around the frequency of the CS were more likely perceived as the CS in a specific context, where CS was associated with US. These results together demonstrate that learning-induced plasticity in the auditory cortex occurs in a context-dependent manner.

  7. Auditory and visual sustained attention in children with speech sound disorder.

    Directory of Open Access Journals (Sweden)

    Cristina F B Murphy

Full Text Available Although research has demonstrated that children with specific language impairment (SLI) and reading disorder (RD) exhibit sustained attention deficits, no study has investigated sustained attention in children with speech sound disorder (SSD). Given the overlap of symptoms, such as phonological memory deficits, between these different language disorders (i.e., SLI, SSD and RD) and the relationships between working memory, attention and language processing, it is worthwhile to investigate whether deficits in sustained attention also occur in children with SSD. A total of 55 children (18 diagnosed with SSD (8.11 ± 1.231) and 37 typically developing children (8.76 ± 1.461)) were invited to participate in this study. Auditory and visual sustained-attention tasks were applied. Children with SSD performed worse on these tasks; they committed a greater number of auditory false alarms and exhibited a significant decline in performance over the course of the auditory detection task. The extent to which performance is related to auditory perceptual difficulties and probable working memory deficits is discussed. Further studies are needed to better understand the specific nature of these deficits and their clinical implications.

  8. Visual unimodal grouping mediates auditory attentional bias in visuo-spatial working memory.

    Science.gov (United States)

    Botta, Fabiano; Lupiáñez, Juan; Sanabria, Daniel

    2013-09-01

    Audiovisual links in spatial attention have been reported in many previous studies. However, the effectiveness of auditory spatial cues in biasing the information encoding into visuo-spatial working memory (VSWM) is still relatively unknown. In this study, we addressed this issue by combining a cuing paradigm with a change detection task in VSWM. Moreover, we manipulated the perceptual organization of the to-be-remembered visual stimuli. We hypothesized that the auditory effect on VSWM would depend on the perceptual association between the auditory cue and the visual probe. Results showed, for the first time, a significant auditory attentional bias in VSWM. However, the effect was observed only when the to-be-remembered visual stimuli were organized in two distinctive visual objects. We propose that these results shed new light on audio-visual crossmodal links in spatial attention suggesting that, apart from the spatio-temporal contingency, the likelihood of perceptual association between the auditory cue and the visual target can have a large impact on crossmodal attentional biases. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Auditory attention in childhood and adolescence: An event-related potential study of spatial selective attention to one of two simultaneous stories

    Science.gov (United States)

    Karns, Christina M.; Isbell, Elif; Giuliano, Ryan J.; Neville, Helen J.

    2015-01-01

    Auditory selective attention is a critical skill for goal-directed behavior, especially where noisy distractions may impede focusing attention. To better understand the developmental trajectory of auditory spatial selective attention in an acoustically complex environment, in the current study we measured auditory event-related potentials (ERPs) in human children across five age groups: 3–5 years; 10 years; 13 years; 16 years; and young adults using a naturalistic dichotic listening paradigm, characterizing the ERP morphology for nonlinguistic and linguistic auditory probes embedded in attended and unattended stories. We documented robust maturational changes in auditory evoked potentials that were specific to the types of probes. Furthermore, we found a remarkable interplay between age and attention-modulation of auditory evoked potentials in terms of morphology and latency from the early years of childhood through young adulthood. The results are consistent with the view that attention can operate across age groups by modulating the amplitude of maturing auditory early-latency evoked potentials or by invoking later endogenous attention processes. Development of these processes is not uniform for probes with different acoustic properties within our acoustically dense speech-based dichotic listening task. In light of the developmental differences we demonstrate, researchers conducting future attention studies of children and adolescents should be wary of combining analyses across diverse ages. PMID:26002721

  10. Auditory attention in childhood and adolescence: An event-related potential study of spatial selective attention to one of two simultaneous stories.

    Science.gov (United States)

    Karns, Christina M; Isbell, Elif; Giuliano, Ryan J; Neville, Helen J

    2015-06-01

Auditory selective attention is a critical skill for goal-directed behavior, especially where noisy distractions may impede focusing attention. To better understand the developmental trajectory of auditory spatial selective attention in an acoustically complex environment, in the current study we measured auditory event-related potentials (ERPs) across five age groups: 3-5 years; 10 years; 13 years; 16 years; and young adults. Using a naturalistic dichotic listening paradigm, we characterized the ERP morphology for nonlinguistic and linguistic auditory probes embedded in attended and unattended stories. We documented robust maturational changes in auditory evoked potentials that were specific to the types of probes. Furthermore, we found a remarkable interplay between age and attention-modulation of auditory evoked potentials in terms of morphology and latency from the early years of childhood through young adulthood. The results are consistent with the view that attention can operate across age groups by modulating the amplitude of maturing auditory early-latency evoked potentials or by invoking later endogenous attention processes. Development of these processes is not uniform for probes with different acoustic properties within our acoustically dense speech-based dichotic listening task. In light of the developmental differences we demonstrate, researchers conducting future attention studies of children and adolescents should be wary of combining analyses across diverse ages. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Electrophysiological evidence for altered visual, but not auditory, selective attention in adolescent cochlear implant users.

    Science.gov (United States)

    Harris, Jill; Kamke, Marc R

    2014-11-01

    Selective attention fundamentally alters sensory perception, but little is known about the functioning of attention in individuals who use a cochlear implant. This study aimed to investigate visual and auditory attention in adolescent cochlear implant users. Event related potentials were used to investigate the influence of attention on visual and auditory evoked potentials in six cochlear implant users and age-matched normally-hearing children. Participants were presented with streams of alternating visual and auditory stimuli in an oddball paradigm: each modality contained frequently presented 'standard' and infrequent 'deviant' stimuli. Across different blocks attention was directed to either the visual or auditory modality. For the visual stimuli attention boosted the early N1 potential, but this effect was larger for cochlear implant users. Attention was also associated with a later P3 component for the visual deviant stimulus, but there was no difference between groups in the later attention effects. For the auditory stimuli, attention was associated with a decrease in N1 latency as well as a robust P3 for the deviant tone. Importantly, there was no difference between groups in these auditory attention effects. The results suggest that basic mechanisms of auditory attention are largely normal in children who are proficient cochlear implant users, but that visual attention may be altered. Ultimately, a better understanding of how selective attention influences sensory perception in cochlear implant users will be important for optimising habilitation strategies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  12. Selective Attention to Visual Stimuli Using Auditory Distractors Is Altered in Alpha-9 Nicotinic Receptor Subunit Knock-Out Mice.

    Science.gov (United States)

    Terreros, Gonzalo; Jorratt, Pascal; Aedo, Cristian; Elgoyhen, Ana Belén; Delano, Paul H

    2016-07-06

During selective attention, subjects voluntarily focus their cognitive resources on a specific stimulus while ignoring others. Top-down filtering of peripheral sensory responses by higher structures of the brain has been proposed as one of the mechanisms responsible for selective attention. A prerequisite to accomplish top-down modulation of the activity of peripheral structures is the presence of corticofugal pathways. The mammalian auditory efferent system is a unique neural network that originates in the auditory cortex and projects to the cochlear receptor through the olivocochlear bundle, and it has been proposed to function as a top-down filter of peripheral auditory responses during attention to cross-modal stimuli. However, to date, there is no conclusive evidence of the involvement of olivocochlear neurons in selective attention paradigms. Here, we trained wild-type and α-9 nicotinic receptor subunit knock-out (KO) mice, which lack cholinergic transmission between medial olivocochlear neurons and outer hair cells, in a two-choice visual discrimination task and studied the behavioral consequences of adding different types of auditory distractors. In addition, we evaluated the effects of contralateral noise on auditory nerve responses as a measure of the individual strength of the olivocochlear reflex. We demonstrate that KO mice have a reduced olivocochlear reflex strength and perform poorly in a visual selective attention paradigm. These results confirm that an intact medial olivocochlear transmission aids in ignoring auditory distraction during selective attention to visual stimuli. The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through the olivocochlear system. It has been proposed to function as a top-down filter of peripheral auditory responses during attention to cross-modal stimuli. However, to date, there is no conclusive evidence of the involvement of olivocochlear neurons in selective attention paradigms.

  13. Attention, awareness, and the perception of auditory scenes

    Directory of Open Access Journals (Sweden)

    Joel S Snyder

    2012-02-01

Full Text Available Auditory perception and cognition entail both low-level and high-level processes, which are likely to interact with each other to create our rich conscious experience of soundscapes. Recent research that we review has revealed numerous influences of high-level factors, such as attention, intention, and prior experience, on conscious auditory perception. Recently, studies have shown that auditory scene analysis tasks can exhibit multistability in a manner very similar to ambiguous visual stimuli, presenting a unique opportunity to study neural correlates of auditory awareness and the extent to which mechanisms of perception are shared across sensory modalities. Research has also led to a growing number of techniques through which auditory perception can be manipulated and even completely suppressed. Such findings have important consequences for our understanding of the mechanisms of perception and should also allow scientists to precisely distinguish the influences of different higher-level factors.

  14. Identification of Auditory Object-Specific Attention from Single-Trial Electroencephalogram Signals via Entropy Measures and Machine Learning

    Directory of Open Access Journals (Sweden)

    Yun Lu

    2018-05-01

Full Text Available Existing research has revealed that auditory attention can be tracked from ongoing electroencephalography (EEG) signals. The aim of this novel study was to investigate the identification of people’s attention to a specific auditory object from single-trial EEG signals via entropy measures and machine learning. Approximate entropy (ApEn), sample entropy (SampEn), composite multiscale entropy (CmpMSE) and fuzzy entropy (FuzzyEn) were used to extract the informative features of EEG signals under three kinds of auditory object-specific attention (Rest, Auditory Object1 Attention (AOA1) and Auditory Object2 Attention (AOA2)). Linear discriminant analysis and the support vector machine (SVM) were used to construct two auditory attention classifiers. The statistical results of entropy measures indicated that there were significant differences in the values of ApEn, SampEn, CmpMSE and FuzzyEn between Rest, AOA1 and AOA2. For the SVM-based auditory attention classifier, the auditory object-specific attention of Rest, AOA1 and AOA2 could be identified from EEG signals using ApEn, SampEn, CmpMSE and FuzzyEn as features and the identification rates were significantly different from chance level. The optimal identification was achieved by the SVM-based auditory attention classifier using CmpMSE with the scale factor τ = 10. This study demonstrated a novel solution to identify the auditory object-specific attention from single-trial EEG signals without the need to access the auditory stimulus.
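The entropy features named in this record are standard measures and easy to sketch. Below is a minimal, illustrative pure-Python implementation of one of them, sample entropy (SampEn); the function name, parameter defaults (m = 2, tolerance r = 0.2 × SD) and the toy signals are my own choices for illustration, not taken from the study, which also used ApEn, CmpMSE and FuzzyEn.

```python
import math

def sample_entropy(signal, m=2, r=None):
    """SampEn: the negative log of the conditional probability that two
    subsequences matching for m points (within tolerance r) still match
    at m + 1 points. Self-matches are excluded."""
    n = len(signal)
    if r is None:
        mean = sum(signal) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in signal) / n)

    def matches(length):
        # count template pairs whose Chebyshev distance is within r
        templates = [signal[i:i + length] for i in range(n - length + 1)]
        return sum(
            1
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
        )

    b, a = matches(m), matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # too short or too regular to estimate
    return -math.log(a / b)
```

A regular signal (e.g. a sine wave) yields a lower SampEn than Gaussian noise of the same length; differences of this kind between attention conditions are what the classifier can exploit.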

  15. Auditory event-related responses to diphthongs in different attention conditions

    DEFF Research Database (Denmark)

    Morris, David Jackson; Steinmetzger, Kurt; Tøndering, John

    2016-01-01

The modulation of auditory event-related potentials (ERP) by attention generally results in larger amplitudes when stimuli are attended. We measured the P1-N1-P2 acoustic change complex elicited with synthetic overt (second formant, F2 = 1000 Hz) and subtle (F2 = 100 Hz) diphthongs, while subjects (i) attended to the auditory stimuli, (ii) ignored the auditory stimuli and watched a film, and (iii) diverted their attention to a visual discrimination task. Responses elicited by diphthongs where F2 values rose and fell were found to be different, and this precluded their combined analysis. Multivariate analysis of ERP components from the rising F2 changes showed main effects of attention on P2 amplitude and latency, and N1-P2 amplitude. P2 amplitude decreased by 40% between the attend and ignore conditions, and by 60% between the attend and divert conditions. The effect of diphthong magnitude...

  16. Attention Cueing and Activity Equally Reduce False Alarm Rate in Visual-Auditory Associative Learning through Improving Memory.

    Science.gov (United States)

    Nikouei Mahani, Mohammad-Ali; Haghgoo, Hojjat Allah; Azizi, Solmaz; Nili Ahmadabadi, Majid

    2016-01-01

In our daily life, we continually exploit already learned multisensory associations and form new ones when facing novel situations. Improving our associative learning results in higher cognitive capabilities. We experimentally and computationally studied the learning performance of healthy subjects in a visual-auditory sensory associative learning task across active learning, attention cueing learning, and passive learning modes. According to our results, the learning mode had no significant effect on learning association of congruent pairs. In addition, subjects' performance in learning congruent samples was not correlated with their vigilance score. Nevertheless, vigilance score was significantly correlated with the learning performance of the non-congruent pairs. Moreover, in the last block of the passive learning mode, subjects made significantly more mistakes in taking non-congruent pairs as associated and consciously reported lower confidence. These results indicate that attention and activity equally enhanced visual-auditory associative learning for non-congruent pairs, while the false alarm rate in the passive learning mode did not decrease after the second block. We investigated the cause of the higher false alarm rate in the passive learning mode by using a computational model, composed of a reinforcement learning module and a memory-decay module. The results suggest that the higher rate of memory decay is the source of making more mistakes and reporting lower confidence in non-congruent pairs in the passive learning mode.
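The abstract describes the model only as a reinforcement-learning module plus a memory-decay module; the authors' implementation is not given. The toy sketch below is therefore only a guess at the general mechanism: a delta-rule learner whose pair strengths decay toward a neutral prior between trials, so that a higher decay rate (standing in for the passive mode) keeps non-congruent pairs closer to "associated" and raises the false-alarm rate. All names and parameter values are hypothetical.

```python
import random

def false_alarm_rate(decay, trials=400, alpha=0.3, seed=0):
    """Toy associative learner. One strength per pair type, starting at a
    neutral 0.5; an 'associated' response is emitted with probability equal
    to the current strength. Feedback drives a delta-rule update toward the
    true label (congruent -> 1, non-congruent -> 0), and on every trial all
    strengths decay back toward the neutral prior. Returns the proportion
    of non-congruent trials in the final quarter judged 'associated'."""
    rng = random.Random(seed)
    strength = {"congruent": 0.5, "noncongruent": 0.5}
    fa, tested = 0, 0
    for t in range(trials):
        pair = rng.choice(("congruent", "noncongruent"))
        target = 1.0 if pair == "congruent" else 0.0
        says_associated = rng.random() < strength[pair]
        if pair == "noncongruent" and t >= 3 * trials // 4:
            tested += 1
            fa += says_associated
        strength[pair] += alpha * (target - strength[pair])  # delta-rule learning
        for k in strength:                                   # memory decay
            strength[k] += decay * (0.5 - strength[k])
    return fa / tested
```

With decay = 0 the non-congruent strength is driven toward 0 and false alarms all but vanish; a substantial decay rate keeps it near the prior, reproducing the qualitative pattern reported for the passive mode.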

  17. Attention-related modulation of auditory brainstem responses during contralateral noise exposure.

    Science.gov (United States)

    Ikeda, Kazunari; Sekiguchi, Takahiro; Hayashi, Akiko

    2008-10-29

    As determinants facilitating attention-related modulation of the auditory brainstem response (ABR), two experimental factors were examined: (i) auditory discrimination; and (ii) contralateral masking intensity. Tone pips at 80 dB sound pressure level were presented to the left ear via either single-tone exposures or oddball exposures, whereas white noise was delivered continuously to the right ear at variable intensities (none--80 dB sound pressure level). Participants each conducted two tasks during stimulation, either reading a book (ignoring task) or detecting target tones (attentive task). Task-related modulation within the ABR range was found only during oddball exposures at contralateral masking intensities greater than or equal to 60 dB. Attention-related modulation of ABR can thus be detected reliably during auditory discrimination under contralateral masking of sufficient intensity.

  18. Dynamic crossmodal links revealed by steady-state responses in auditory-visual divided attention.

    Science.gov (United States)

    de Jong, Ritske; Toffanin, Paolo; Harbers, Marten

    2010-01-01

    Frequency tagging has been often used to study intramodal attention but not intermodal attention. We used EEG and simultaneous frequency tagging of auditory and visual sources to study intermodal focused and divided attention in detection and discrimination performance. Divided-attention costs were smaller, but still significant, in detection than in discrimination. The auditory steady-state response (SSR) showed no effects of attention at frontocentral locations, but did so at occipital locations where it was evident only when attention was divided between audition and vision. Similarly, the visual SSR at occipital locations was substantially enhanced when attention was divided across modalities. Both effects were equally present in detection and discrimination. We suggest that both effects reflect a common cause: An attention-dependent influence of auditory information processing on early cortical stages of visual information processing, mediated by enhanced effective connectivity between the two modalities under conditions of divided attention. Copyright (c) 2009 Elsevier B.V. All rights reserved.
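Frequency tagging works because each tagged stream imprints a steady-state response at its own tagging frequency, whose amplitude can be read out with a single-bin Fourier measurement. The sketch below illustrates that readout on a synthetic signal; the tag frequencies (15 Hz and 40 Hz), amplitudes and noise level are arbitrary illustrative choices, not values from this study.

```python
import math
import random

def ssr_amplitude(signal, freq, fs):
    """Amplitude of the steady-state response at one tagging frequency:
    correlate the signal with sine and cosine at that frequency
    (a single-bin discrete Fourier transform)."""
    n = len(signal)
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return 2.0 * math.hypot(c, s) / n

# Synthetic 4-s recording at 250 Hz: a 15-Hz tagged component (amplitude 1.0),
# a 40-Hz tagged component (amplitude 0.4), plus additive Gaussian noise.
fs = 250
rng = random.Random(0)
eeg = [
    1.0 * math.sin(2 * math.pi * 15 * i / fs)
    + 0.4 * math.sin(2 * math.pi * 40 * i / fs)
    + rng.gauss(0, 0.5)
    for i in range(4 * fs)
]
```

Here `ssr_amplitude(eeg, 15, fs)` recovers a value near 1.0 and `ssr_amplitude(eeg, 40, fs)` a value near 0.4, so the relative SSR amplitude at each tag tracks the strength of the corresponding stream, which is the quantity compared across attention conditions.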

  19. Auditory and visual capture during focused visual attention

    OpenAIRE

    Koelewijn, T.; Bronkhorst, A.W.; Theeuwes, J.

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies have not differentiated between capture by onsets presented at a nontarget (invalid) location and possible performance benefits occurring when the target location is (validly) cued. In this study, th...

  20. Frequency-specific attentional modulation in human primary auditory cortex and midbrain.

    Science.gov (United States)

    Riecke, Lars; Peters, Judith C; Valente, Giancarlo; Poser, Benedikt A; Kemper, Valentin G; Formisano, Elia; Sorger, Bettina

    2018-07-01

    Paying selective attention to an audio frequency selectively enhances activity within primary auditory cortex (PAC) at the tonotopic site (frequency channel) representing that frequency. Animal PAC neurons achieve this 'frequency-specific attentional spotlight' by adapting their frequency tuning, yet comparable evidence in humans is scarce. Moreover, whether the spotlight operates in human midbrain is unknown. To address these issues, we studied the spectral tuning of frequency channels in human PAC and inferior colliculus (IC), using 7-T functional magnetic resonance imaging (FMRI) and frequency mapping, while participants focused on different frequency-specific sounds. We found that shifts in frequency-specific attention alter the response gain, but not tuning profile, of PAC frequency channels. The gain modulation was strongest in low-frequency channels and varied near-monotonically across the tonotopic axis, giving rise to the attentional spotlight. We observed less prominent, non-tonotopic spatial patterns of attentional modulation in IC. These results indicate that the frequency-specific attentional spotlight in human PAC as measured with FMRI arises primarily from tonotopic gain modulation, rather than adapted frequency tuning. Moreover, frequency-specific attentional modulation of afferent sound processing in human IC seems to be considerably weaker, suggesting that the spotlight diminishes toward this lower-order processing stage. Our study sheds light on how the human auditory pathway adapts to the different demands of selective hearing. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  1. EEG phase reset due to auditory attention: an inverse time-scale approach

    International Nuclear Information System (INIS)

    Low, Yin Fen; Strauss, Daniel J

    2009-01-01

We propose a novel tool to evaluate the electroencephalograph (EEG) phase reset due to auditory attention by utilizing an inverse analysis of the instantaneous phase for the first time. EEGs were acquired through auditory attention experiments with a maximum entropy stimulation paradigm. We examined single sweeps of auditory late response (ALR) with the complex continuous wavelet transform. The phase in the frequency band that is associated with auditory attention (6–10 Hz, termed the theta–alpha border) was reset to the mean phase of the averaged EEGs. The inverse transform was applied to reconstruct the phase-modified signal. We found significant enhancement of the N100 wave in the reconstructed signal. Analysis of the phase noise shows the effects of phase jittering on the generation of the N100 wave, implying that a preferred phase is necessary to generate the event-related potential (ERP). Power spectrum analysis shows a remarkable increase of evoked power but little change of total power after stabilizing the phase of EEGs. Furthermore, resetting the phase only at the theta–alpha border of the no-attention data to the mean phase of the attention data yields a result that resembles the attention data. These results show strong connections between the EEG and the ERP; in particular, we suggest that the presentation of an auditory stimulus triggers the phase reset process at the theta–alpha border, which leads to the emergence of the N100 wave. It is concluded that our study reinforces other studies on the importance of the EEG in ERP genesis.
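The phase-reset-and-reconstruct step can be made concrete with a toy version. The paper works with the complex continuous wavelet transform; the sketch below substitutes a plain DFT, which is not the authors' method but shows the same operation: keep every bin's magnitude, overwrite the phase of the bins inside the 6-10 Hz band with a fixed target phase, and invert the transform.

```python
import cmath
import math

def reset_band_phase(signal, fs, band=(6.0, 10.0), target_phase=0.0):
    """Overwrite the phase (keeping the magnitude) of all DFT bins whose
    frequency falls inside `band`, then reconstruct the time-domain signal
    with the inverse DFT. O(n^2), for illustration only."""
    n = len(signal)
    spectrum = [
        sum(x * cmath.exp(-2j * math.pi * k * i / n) for i, x in enumerate(signal))
        for k in range(n)
    ]
    for k in range(1, (n + 1) // 2):  # positive-frequency bins (Nyquist excluded)
        if band[0] <= k * fs / n <= band[1]:
            mag = abs(spectrum[k])
            spectrum[k] = mag * cmath.exp(1j * target_phase)
            spectrum[n - k] = mag * cmath.exp(-1j * target_phase)  # mirror bin: keep output real
    return [
        sum(spectrum[k] * cmath.exp(2j * math.pi * k * i / n) for k in range(n)).real / n
        for i in range(n)
    ]
```

An in-band component entering with an arbitrary phase leaves with exactly `target_phase` while its amplitude, and everything outside the band, is untouched; applied sweep by sweep, this is the kind of alignment that lets an N100-like deflection survive averaging.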

  2. EEG phase reset due to auditory attention: an inverse time-scale approach.

    Science.gov (United States)

    Low, Yin Fen; Strauss, Daniel J

    2009-08-01

We propose a novel tool to evaluate the electroencephalograph (EEG) phase reset due to auditory attention by utilizing an inverse analysis of the instantaneous phase for the first time. EEGs were acquired through auditory attention experiments with a maximum entropy stimulation paradigm. We examined single sweeps of auditory late response (ALR) with the complex continuous wavelet transform. The phase in the frequency band that is associated with auditory attention (6-10 Hz, termed the theta-alpha border) was reset to the mean phase of the averaged EEGs. The inverse transform was applied to reconstruct the phase-modified signal. We found significant enhancement of the N100 wave in the reconstructed signal. Analysis of the phase noise shows the effects of phase jittering on the generation of the N100 wave, implying that a preferred phase is necessary to generate the event-related potential (ERP). Power spectrum analysis shows a remarkable increase of evoked power but little change of total power after stabilizing the phase of EEGs. Furthermore, resetting the phase only at the theta-alpha border of the no-attention data to the mean phase of the attention data yields a result that resembles the attention data. These results show strong connections between the EEG and the ERP; in particular, we suggest that the presentation of an auditory stimulus triggers the phase reset process at the theta-alpha border, which leads to the emergence of the N100 wave. It is concluded that our study reinforces other studies on the importance of the EEG in ERP genesis.

  3. Examining age-related differences in auditory attention control using a task-switching procedure.

    Science.gov (United States)

    Lawo, Vera; Koch, Iring

    2014-03-01

    Using a novel task-switching variant of dichotic selective listening, we examined age-related differences in the ability to intentionally switch auditory attention between 2 speakers defined by their sex. In our task, young (M age = 23.2 years) and older adults (M age = 66.6 years) performed a numerical size categorization on spoken number words. The task-relevant speaker was indicated by a cue prior to auditory stimulus onset. The cuing interval was either short or long and varied randomly trial by trial. We found clear performance costs with instructed attention switches. These auditory attention switch costs decreased with prolonged cue-stimulus interval. Older adults were generally much slower (but not more error prone) than young adults, but switching-related effects did not differ across age groups. These data suggest that the ability to intentionally switch auditory attention in a selective listening task is not compromised in healthy aging. We discuss the role of modality-specific factors in age-related differences.

  4. Attention-dependent allocation of auditory processing resources as measured by mismatch negativity.

    Science.gov (United States)

    Dittmann-Balcar, A; Thienel, R; Schall, U

    1999-12-16

    Mismatch negativity (MMN) is a pre-attentive event-related potential measure of echoic memory. However, recent studies suggest attention-related modulation of MMN. This study investigates duration-elicited MMN in healthy subjects (n = 12) who were performing a visual discrimination task and, subsequently, an auditory discrimination task in a series of conditions of increasing task difficulty. MMN amplitude was found to be maximal at centro-frontal electrode sites without hemispheric differences. Comparison of the two attend conditions (visual vs. auditory) revealed larger MMN amplitudes at Fz in the visual task without differences across task difficulty. However, significantly smaller MMN in the most demanding auditory condition supports the notion of a limited processing capacity whose resources are modulated by attention in response to task requirements.
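
    The difference-wave computation underlying MMN measures like the one above can be sketched as follows; the 100-250 ms search window and single-channel (e.g. Fz) input are illustrative assumptions, not details taken from this study.

```python
import numpy as np

def mismatch_negativity(standard_trials, deviant_trials, fs, window=(0.10, 0.25)):
    """Deviant-minus-standard ERP difference wave and its MMN peak.

    *_trials: (n_trials, n_samples) single-channel epochs (e.g. Fz,
    an assumption), time-locked to stimulus onset. Returns the
    difference wave and the most negative value inside `window`
    (seconds), taken as the MMN amplitude.
    """
    diff = deviant_trials.mean(axis=0) - standard_trials.mean(axis=0)
    i0, i1 = (int(round(w * fs)) for w in window)
    return diff, diff[i0:i1].min()
```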

  5. Spatial auditory attention is modulated by tactile priming.

    Science.gov (United States)

    Menning, Hans; Ackermann, Hermann; Hertrich, Ingo; Mathiak, Klaus

    2005-07-01

    Previous studies have shown that cross-modal processing affects perception at a variety of neuronal levels. In this study, event-related brain responses were recorded via whole-head magnetoencephalography (MEG). Spatial auditory attention was directed via tactile pre-cues (primes) to one of four locations in the peripersonal space (left and right hand versus face). Auditory stimuli were white noise bursts, convolved with head-related transfer functions, which ensured spatial perception of the four locations. Tactile primes (200-300 ms prior to acoustic onset) were applied randomly to one of these locations. Attentional load was controlled by three different visual distraction tasks. The auditory P50m (about 50 ms after stimulus onset) showed a significant "proximity" effect (larger responses to face stimulation) as well as a "contralaterality" effect between side of stimulation and hemisphere. The tactile primes essentially reduced both the P50m and N100m components. However, facial tactile pre-stimulation yielded an enhanced ipsilateral N100m. These results show that earlier responses are mainly governed by exogenous stimulus properties, whereas cross-sensory interaction is spatially selective at a later (endogenous) processing stage.

  6. Evoked potential correlates of selective attention with multi-channel auditory inputs

    Science.gov (United States)

    Schwent, V. L.; Hillyard, S. A.

    1975-01-01

    Ten subjects were presented with random, rapid sequences of four auditory tones which were separated in pitch and apparent spatial position. The N1 component of the auditory vertex evoked potential (EP) measured relative to a baseline was observed to increase with attention. It was concluded that the N1 enhancement reflects a finely tuned selective attention to one stimulus channel among several concurrent, competing channels. This EP enhancement probably increases with increased information load on the subject.

  7. Auditory Selective Attention in Cerebral-Palsied Individuals.

    Science.gov (United States)

    Laraway, Lee Ann

    1985-01-01

    To examine differences between the auditory selective attention abilities of normal and cerebral-palsied individuals, 23 cerebral-palsied and 23 normal subjects (ages 5-21) were asked to repeat a series of 30 items in the presence of intermittent white noise. Results indicated that cerebral-palsied individuals perform significantly more poorly when the…

  8. Impact of Auditory Selective Attention on Verbal Short-Term Memory and Vocabulary Development

    Science.gov (United States)

    Majerus, Steve; Heiligenstein, Lucie; Gautherot, Nathalie; Poncelet, Martine; Van der Linden, Martial

    2009-01-01

    This study investigated the role of auditory selective attention capacities as a possible mediator of the well-established association between verbal short-term memory (STM) and vocabulary development. A total of 47 6- and 7-year-olds were administered verbal immediate serial recall and auditory attention tasks. Both task types probed processing…

  9. Tuning In to Sound: Frequency-Selective Attentional Filter in Human Primary Auditory Cortex

    Science.gov (United States)

    Da Costa, Sandra; van der Zwaag, Wietske; Miller, Lee M.; Clarke, Stephanie

    2013-01-01

    Cocktail parties, busy streets, and other noisy environments pose a difficult challenge to the auditory system: how to focus attention on selected sounds while ignoring others? Neurons of primary auditory cortex, many of which are sharply tuned to sound frequency, could help solve this problem by filtering selected sound information based on frequency-content. To investigate whether this occurs, we used high-resolution fMRI at 7 tesla to map the fine-scale frequency-tuning (1.5 mm isotropic resolution) of primary auditory areas A1 and R in six human participants. Then, in a selective attention experiment, participants heard low (250 Hz)- and high (4000 Hz)-frequency streams of tones presented at the same time (dual-stream) and were instructed to focus attention onto one stream versus the other, switching back and forth every 30 s. Attention to low-frequency tones enhanced neural responses within low-frequency-tuned voxels relative to high, and when attention switched the pattern quickly reversed. Thus, like a radio, human primary auditory cortex is able to tune into attended frequency channels and can switch channels on demand. PMID:23365225

  10. Switching auditory attention using spatial and non-spatial features recruits different cortical networks.

    Science.gov (United States)

    Larson, Eric; Lee, Adrian K C

    2014-01-01

    Switching attention between different stimuli of interest based on particular task demands is important in many everyday settings. In audition in particular, switching attention between different speakers of interest that are talking concurrently is often necessary for effective communication. Recently, it has been shown by multiple studies that auditory selective attention suppresses the representation of unwanted streams in auditory cortical areas in favor of the target stream of interest. However, the neural processing that guides this selective attention process is not well understood. Here we investigated the cortical mechanisms involved in switching attention based on two different types of auditory features. By combining magneto- and electro-encephalography (M-EEG) with an anatomical MRI constraint, we examined the cortical dynamics involved in switching auditory attention based on either spatial or pitch features. We designed a paradigm where listeners were cued in the beginning of each trial to switch or maintain attention halfway through the presentation of concurrent target and masker streams. By allowing listeners time to switch during a gap in the continuous target and masker stimuli, we were able to isolate the mechanisms involved in endogenous, top-down attention switching. Our results show a double dissociation between the involvement of right temporoparietal junction (RTPJ) and the left inferior parietal supramarginal part (LIPSP) in tasks requiring listeners to switch attention based on space and pitch features, respectively, suggesting that switching attention based on these features involves at least partially separate processes or behavioral strategies. © 2013 Elsevier Inc. All rights reserved.

  11. Priming T2 in a Visual and Auditory Attentional Blink Task

    NARCIS (Netherlands)

    Burg, E. van der; Olivers, C.N.L.; Bronkhorst, A.W.; Theeuwes, J.

    2008-01-01

    Participants performed an attentional blink (AB) task including digits as targets and letters as distractors within the visual and auditory domains. Prior to the rapid serial visual presentation, a visual or auditory prime was presented in the form of a digit that was identical to the second target

  12. Influence of memory, attention, IQ and age on auditory temporal processing tests: preliminary study

    OpenAIRE

    Murphy, Cristina Ferraz Borges; Zachi, Elaine Cristina; Roque, Daniela Tsubota; Ventura, Dora Selma Fix; Schochat, Eliane

    2014-01-01

    PURPOSE: To investigate the existence of correlations between the performance of children in auditory temporal tests (Frequency Pattern and Gaps in Noise - GIN) and IQ, attention, memory and age measurements. METHOD: Fifteen typically developing individuals between the ages of 7 and 12 years with normal hearing participated in the study. Auditory temporal processing tests (GIN and Frequency Pattern), as well as a Memory test (Digit Span), Attention tests (auditory and visual modality) and ...

  13. Brain activity during auditory and visual phonological, spatial and simple discrimination tasks.

    Science.gov (United States)

    Salo, Emma; Rinne, Teemu; Salonen, Oili; Alho, Kimmo

    2013-02-16

    We used functional magnetic resonance imaging to measure human brain activity during tasks demanding selective attention to auditory or visual stimuli delivered in concurrent streams. Auditory stimuli were syllables spoken by different voices and occurring in central or peripheral space. Visual stimuli were centrally or more peripherally presented letters in darker or lighter fonts. The participants performed a phonological, spatial or "simple" (speaker-gender or font-shade) discrimination task in either modality. Within each modality, we expected a clear distinction between brain activations related to nonspatial and spatial processing, as reported in previous studies. However, within each modality, different tasks activated largely overlapping areas in modality-specific (auditory and visual) cortices, as well as in the parietal and frontal brain regions. These overlaps may be due to effects of attention common for all three tasks within each modality or interaction of processing task-relevant features and varying task-irrelevant features in the attended-modality stimuli. Nevertheless, brain activations caused by auditory and visual phonological tasks overlapped in the left mid-lateral prefrontal cortex, while those caused by the auditory and visual spatial tasks overlapped in the inferior parietal cortex. These overlapping activations reveal areas of multimodal phonological and spatial processing. There was also some evidence for intermodal attention-related interaction. Most importantly, activity in the superior temporal sulcus elicited by unattended speech sounds was attenuated during the visual phonological task in comparison with the other visual tasks. This effect might reflect suppression of the processing of irrelevant speech that would presumably distract from the phonological task involving the letters. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. A common source of attention for auditory and visual tracking.

    Science.gov (United States)

    Fougnie, Daryl; Cockhren, Jurnell; Marois, René

    2018-05-01

    Tasks that require tracking visual information reveal the severe limitations of our capacity to attend to multiple objects that vary in time and space. Although these limitations have been extensively characterized in the visual domain, very little is known about tracking information in other sensory domains. Does tracking auditory information exhibit characteristics similar to those of tracking visual information, and to what extent do these two tracking tasks draw on the same attention resources? We addressed these questions by asking participants to perform either single or dual tracking tasks from the same (visual-visual) or different (visual-auditory) perceptual modalities, with the difficulty of the tracking tasks being manipulated across trials. The results revealed that performing two concurrent tracking tasks, whether they were in the same or different modalities, affected tracking performance as compared to performing each task alone (concurrence costs). Moreover, increasing task difficulty also led to increased costs in both the single-task and dual-task conditions (load-dependent costs). The comparison of concurrence costs between visual-visual and visual-auditory dual-task performance revealed slightly greater interference when two visual tracking tasks were paired. Interestingly, however, increasing task difficulty led to equivalent costs for visual-visual and visual-auditory pairings. We concluded that visual and auditory tracking draw largely, though not exclusively, on common central attentional resources.

  15. Irrelevant Auditory and Visual Events Induce a Visual Attentional Blink

    NARCIS (Netherlands)

    Van der Burg, Erik; Nieuwenstein, Mark R.; Theeuwes, Jan; Olivers, Christian N. L.

    2013-01-01

    In the present study we investigated whether a task-irrelevant distractor can induce a visual attentional blink pattern. Participants were asked to detect only a visual target letter (A, B, or C) and to ignore the preceding auditory, visual, or audiovisual distractor. An attentional blink was

  16. Auditory-Motor Control of Vocal Production during Divided Attention: Behavioral and ERP Correlates.

    Science.gov (United States)

    Liu, Ying; Fan, Hao; Li, Jingting; Jones, Jeffery A; Liu, Peng; Zhang, Baofeng; Liu, Hanjun

    2018-01-01

    When people hear unexpected perturbations in auditory feedback, they produce rapid compensatory adjustments of their vocal behavior. Recent evidence has shown enhanced vocal compensations and cortical event-related potentials (ERPs) in response to attended pitch feedback perturbations, suggesting that this reflex-like behavior is influenced by selective attention. Less is known, however, about auditory-motor integration for voice control during divided attention. The present cross-modal study investigated the behavioral and ERP correlates of auditory feedback control of vocal pitch production during divided attention. During the production of sustained vowels, 32 young adults were instructed to simultaneously attend to both pitch feedback perturbations they heard and flashing red lights they saw. The presentation rate of the visual stimuli was varied to produce a low, intermediate, and high attentional load. The behavioral results showed that the low-load condition elicited significantly smaller vocal compensations for pitch perturbations than the intermediate-load and high-load conditions. As well, the cortical processing of vocal pitch feedback was also modulated as a function of divided attention. When compared to the low-load and intermediate-load conditions, the high-load condition elicited significantly larger N1 responses and smaller P2 responses to pitch perturbations. These findings provide the first neurobehavioral evidence that divided attention can modulate auditory feedback control of vocal pitch production.

  17. Gender differences in pre-attentive change detection for visual but not auditory stimuli.

    Science.gov (United States)

    Yang, Xiuxian; Yu, Yunmiao; Chen, Lu; Sun, Hailian; Qiao, Zhengxue; Qiu, Xiaohui; Zhang, Congpei; Wang, Lin; Zhu, Xiongzhao; He, Jincai; Zhao, Lun; Yang, Yanjie

    2016-01-01

    Despite ongoing debate about gender differences in pre-attentive processing, little is known about gender effects on change detection for auditory and visual stimuli. We explored gender differences in change detection while processing duration information in auditory and visual modalities. We investigated pre-attentive processing of duration information using a deviant-standard reverse oddball paradigm (50 ms/150 ms) for auditory and visual mismatch negativity (aMMN and vMMN) in males and females (n=21/group). In the auditory modality, decrement and increment aMMN were observed at 150-250 ms after the stimulus onset, and there was no significant gender effect on MMN amplitudes in temporal or fronto-central areas. In contrast, in the visual modality, only increment vMMN was observed at 180-260 ms after the onset of stimulus, and it was higher in males than in females. No gender effect was found in change detection for auditory stimuli, but change detection was facilitated for visual stimuli in males. Gender effects should be considered in clinical studies of pre-attention for visual stimuli. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Modulation of auditory spatial attention by visual emotional cues: differential effects of attentional engagement and disengagement for pleasant and unpleasant cues.

    Science.gov (United States)

    Harrison, Neil R; Woodhouse, Rob

    2016-05-01

    Previous research has demonstrated that threatening, compared to neutral pictures, can bias attention towards non-emotional auditory targets. Here we investigated which subcomponents of attention contributed to the influence of emotional visual stimuli on auditory spatial attention. Participants indicated the location of an auditory target, after brief (250 ms) presentation of a spatially non-predictive peripheral visual cue. Responses to targets were faster at the location of the preceding visual cue, compared to at the opposite location (cue validity effect). The cue validity effect was larger for targets following pleasant and unpleasant cues compared to neutral cues, for right-sided targets. For unpleasant cues, the crossmodal cue validity effect was driven by delayed attentional disengagement, and for pleasant cues, it was driven by enhanced engagement. We conclude that both pleasant and unpleasant visual cues influence the distribution of attention across modalities and that the associated attentional mechanisms depend on the valence of the visual cue.

  19. Visual attention modulates brain activation to angry voices.

    Science.gov (United States)

    Mothes-Lasch, Martin; Mentzel, Hans-Joachim; Miltner, Wolfgang H R; Straube, Thomas

    2011-06-29

    In accordance with influential models proposing prioritized processing of threat, previous studies have shown automatic brain responses to angry prosody in the amygdala and the auditory cortex under auditory distraction conditions. However, it is unknown whether the automatic processing of angry prosody is also observed during cross-modal distraction. The current fMRI study investigated brain responses to angry versus neutral prosodic stimuli during visual distraction. During scanning, participants were exposed to angry or neutral prosodic stimuli while visual symbols were displayed simultaneously. By means of task requirements, participants either attended to the voices or to the visual stimuli. While the auditory task revealed pronounced activation in the auditory cortex and amygdala to angry versus neutral prosody, this effect was absent during the visual task. Thus, our results show a limitation of the automaticity of the activation of the amygdala and auditory cortex to angry prosody. The activation of these areas to threat-related voices depends on modality-specific attention.

  20. Auditory selective attention in adolescents with major depression: An event-related potential study.

    Science.gov (United States)

    Greimel, E; Trinkl, M; Bartling, J; Bakos, S; Grossheinrich, N; Schulte-Körne, G

    2015-02-01

    Major depression (MD) is associated with deficits in selective attention. Previous studies in adults with MD using event-related potentials (ERPs) reported abnormalities in the neurophysiological correlates of auditory selective attention. However, it is yet unclear whether these findings can be generalized to MD in adolescence. Thus, the aim of the present ERP study was to explore the neural mechanisms of auditory selective attention in adolescents with MD. 24 male and female unmedicated adolescents with MD and 21 control subjects were included in the study. ERPs were collected during an auditory oddball paradigm. Depressive adolescents tended to show a longer N100 latency to target and non-target tones. Moreover, MD subjects showed a prolonged latency of the P200 component to targets. Across groups, longer P200 latency was associated with a decreased tendency of disinhibited behavior as assessed by a behavioral questionnaire. To be able to draw more precise conclusions about differences between the neural bases of selective attention in adolescents vs. adults with MD, future studies should include both age groups and apply the same experimental setting across all subjects. The study provides strong support for abnormalities in the neurophysiological bases of selective attention in adolescents with MD at early stages of auditory information processing. Absent group differences in later ERP components reflecting voluntary attentional processes stand in contrast to results reported in adults with MD and may suggest that adolescents with MD possess mechanisms to compensate for abnormalities in the early stages of selective attention. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Auditory Attentional Capture: Effects of Singleton Distractor Sounds

    Science.gov (United States)

    Dalton, Polly; Lavie, Nilli

    2004-01-01

    The phenomenon of attentional capture by a unique yet irrelevant singleton distractor has typically been studied in visual search. In this article, the authors examine whether a similar phenomenon occurs in the auditory domain. Participants searched sequences of sounds for targets defined by frequency, intensity, or duration. The presence of a…

  2. Contingent capture of involuntary visual attention interferes with detection of auditory stimuli.

    Science.gov (United States)

    Kamke, Marc R; Harris, Jill

    2014-01-01

    The involuntary capture of attention by salient visual stimuli can be influenced by the behavioral goals of an observer. For example, when searching for a target item, irrelevant items that possess the target-defining characteristic capture attention more strongly than items not possessing that feature. Such contingent capture involves a shift of spatial attention toward the item with the target-defining characteristic. It is not clear, however, if the associated decrements in performance for detecting the target item are entirely due to involuntary orienting of spatial attention. To investigate whether contingent capture also involves a non-spatial interference, adult observers were presented with streams of visual and auditory stimuli and were tasked with simultaneously monitoring for targets in each modality. Visual and auditory targets could be preceded by a lateralized visual distractor that either did, or did not, possess the target-defining feature (a specific color). In agreement with the contingent capture hypothesis, target-colored distractors interfered with visual detection performance (response time and accuracy) more than distractors that did not possess the target color. Importantly, the same pattern of results was obtained for the auditory task: visual target-colored distractors interfered with sound detection. The decrement in auditory performance following a target-colored distractor suggests that contingent capture involves a source of processing interference in addition to that caused by a spatial shift of attention. Specifically, we argue that distractors possessing the target-defining characteristic enter a capacity-limited, serial stage of neural processing, which delays detection of subsequently presented stimuli regardless of the sensory modality.

  4. Feature-Selective Attention Adaptively Shifts Noise Correlations in Primary Auditory Cortex.

    Science.gov (United States)

    Downer, Joshua D; Rapone, Brittany; Verhein, Jessica; O'Connor, Kevin N; Sutter, Mitchell L

    2017-05-24

    Sensory environments often contain an overwhelming amount of information, with both relevant and irrelevant information competing for neural resources. Feature attention mediates this competition by selecting the sensory features needed to form a coherent percept. How attention affects the activity of populations of neurons to support this process is poorly understood because population coding is typically studied through simulations in which one sensory feature is encoded without competition. Therefore, to study the effects of feature attention on population-based neural coding, investigations must be extended to include stimuli with both relevant and irrelevant features. We measured noise correlations (r_noise) within small neural populations in primary auditory cortex while rhesus macaques performed a novel feature-selective attention task. We found that the effect of feature-selective attention on r_noise depended not only on the population tuning to the attended feature, but also on the tuning to the distractor feature. To attempt to explain how these observed effects might support enhanced perceptual performance, we propose an extension of a simple and influential model in which shifts in r_noise can simultaneously enhance the representation of the attended feature while suppressing the distractor. These findings present a novel mechanism by which attention modulates neural populations to support sensory processing in cluttered environments. SIGNIFICANCE STATEMENT Although feature-selective attention constitutes one of the building blocks of listening in natural environments, its neural bases remain obscure. To address this, we developed a novel auditory feature-selective attention task and measured noise correlations (r_noise) in rhesus macaque A1 during task performance. Unlike previous studies showing that the effect of attention on r_noise depends on population tuning to the attended feature, we show that the effect of attention depends on the tuning
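
    The noise-correlation (r_noise) measure at the center of this record can be sketched as follows. This follows the standard definition (Pearson correlation of trial-to-trial responses after removing stimulus-driven variation); the study's exact preprocessing may differ.

```python
import numpy as np

def noise_correlation(counts_a, counts_b, stim_ids):
    """Trial-by-trial noise correlation (r_noise) between two neurons.

    counts_a, counts_b: per-trial spike counts for the pair;
    stim_ids: stimulus label for each trial. Z-scoring within each
    stimulus removes signal (stimulus-driven) covariation, leaving the
    shared trial-to-trial variability that r_noise quantifies.
    """
    a = np.asarray(counts_a, dtype=float).copy()
    b = np.asarray(counts_b, dtype=float).copy()
    stim_ids = np.asarray(stim_ids)
    for s in np.unique(stim_ids):
        m = stim_ids == s
        a[m] = (a[m] - a[m].mean()) / a[m].std()
        b[m] = (b[m] - b[m].mean()) / b[m].std()
    return np.corrcoef(a, b)[0, 1]
```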

  6. Human Pupillary Dilation Response to Deviant Auditory Stimuli: Effects of Stimulus Properties and Voluntary Attention.

    Science.gov (United States)

    Liao, Hsin-I; Yoneya, Makoto; Kidani, Shunsuke; Kashino, Makio; Furukawa, Shigeto

    2016-01-01

    A unique sound that deviates from a repetitive background sound induces signature neural responses, such as mismatch negativity and novelty P3 response in electro-encephalography studies. Here we show that a deviant auditory stimulus induces a human pupillary dilation response (PDR) that is sensitive to the stimulus properties, irrespective of whether attention is directed to the sounds or not. In an auditory oddball sequence, we used white noise and 2000-Hz tones as oddballs against repeated 1000-Hz tones. Participants' pupillary responses were recorded while they listened to the auditory oddball sequence. In Experiment 1, they were not involved in any task. Results show that pupils dilated to the noise oddballs for approximately 4 s, but no such PDR was found for the 2000-Hz tone oddballs. In Experiment 2, two types of visual oddballs were presented synchronously with the auditory oddballs. Participants discriminated the auditory or visual oddballs while trying to ignore stimuli from the other modality. The purpose of this manipulation was to direct attention to or away from the auditory sequence. In Experiment 3, the visual oddballs and the auditory oddballs were always presented asynchronously to prevent residuals of attention on to-be-ignored oddballs due to the concurrence with the attended oddballs. Results show that pupils dilated to both the noise and 2000-Hz tone oddballs in all conditions. Most importantly, PDRs to noise were larger than those to the 2000-Hz tone oddballs regardless of the attention condition in both experiments. The overall results suggest that the stimulus-dependent factor of the PDR appears to be independent of attention.

  7. Attention-driven auditory cortex short-term plasticity helps segregate relevant sounds from noise

    OpenAIRE

    Ahveninen, Jyrki; Hämäläinen, Matti; Jääskeläinen, Iiro P.; Ahlfors, Seppo P.; Huang, Samantha; Lin, Fa-Hsuan; Raij, Tommi; Sams, Mikko; Vasios, Christos E.; Belliveau, John W.

    2011-01-01

    How can we concentrate on relevant sounds in noisy environments? A “gain model” suggests that auditory attention simply amplifies relevant and suppresses irrelevant afferent inputs. However, it is unclear whether this suffices when attended and ignored features overlap to stimulate the same neuronal receptive fields. A “tuning model” suggests that, in addition to gain, attention modulates feature selectivity of auditory neurons. We recorded magnetoencephalography, EEG, and functional MRI (fMR...

  8. Broken Expectations: Violation of Expectancies, Not Novelty, Captures Auditory Attention

    Science.gov (United States)

    Vachon, Francois; Hughes, Robert W.; Jones, Dylan M.

    2012-01-01

    The role of memory in behavioral distraction by auditory attentional capture was investigated: We examined whether capture is a product of the novelty of the capturing event (i.e., the absence of a recent memory for the event) or its violation of learned expectancies on the basis of a memory for an event structure. Attentional capture--indicated…

  9. Dynamic crossmodal links revealed by steady-state responses in auditory-visual divided attention

    NARCIS (Netherlands)

    de Jong, Ritske; Toffanin, Paolo; Harbers, Marten; Martens, Sander

    Frequency tagging has often been used to study intramodal attention but not intermodal attention. We used EEG and simultaneous frequency tagging of auditory and visual sources to study intermodal focused and divided attention in detection and discrimination performance. Divided-attention costs were

  10. Size and synchronization of auditory cortex promotes musical, literacy, and attentional skills in children.

    Science.gov (United States)

    Seither-Preisler, Annemarie; Parncutt, Richard; Schneider, Peter

    2014-08-13

    Playing a musical instrument is associated with numerous neural processes that continuously modify the human brain and may facilitate characteristic auditory skills. In a longitudinal study, we investigated the auditory and neural plasticity of musical learning in 111 young children (aged 7-9 y) as a function of the intensity of instrumental practice and musical aptitude. Because of the frequent co-occurrence of central auditory processing disorders and attentional deficits, we also tested 21 children with attention deficit (hyperactivity) disorder [AD(H)D]. Magnetic resonance imaging and magnetoencephalography revealed enlarged Heschl's gyri and enhanced right-left hemispheric synchronization of the primary evoked response (P1) to harmonic complex sounds in children who spent more time practicing a musical instrument. The anatomical characteristics were positively correlated with frequency discrimination, reading, and spelling skills. Conversely, AD(H)D children showed reduced volumes of Heschl's gyri and enhanced volumes of the plana temporalia that were associated with a distinct bilateral P1 asynchrony. This may indicate a risk for central auditory processing disorders that are often associated with attentional and literacy problems. The longitudinal comparisons revealed a very high stability of auditory cortex morphology and gray matter volumes, suggesting that the combined anatomical and functional parameters are neural markers of musicality and attention deficits. Educational and clinical implications are considered. Copyright © 2014 the authors.

  11. Vestibular Stimulation and Auditory Perception in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Azin Salamati

    2014-09-01

    Objectives: Rehabilitation strategies play a pivotal role in relieving inappropriate behaviors and improving children's performance at school. Concentration and visual and auditory comprehension in children are crucial to effective learning and have drawn interest from researchers and clinicians. Vestibular function deficits usually cause heightened alertness and vigilance, problems in maintaining focus and paying selective attention, and altered precision and attention to stimuli. The aim of this study was to investigate the correlation between vestibular stimulation and auditory perception in children with attention deficit hyperactivity disorder. Methods: In total, 30 children aged 7 to 12 years with attention deficit hyperactivity disorder participated in this study. They were assessed based on the criteria of the diagnostic and statistical manual of mental disorders. After obtaining guardian and parental consent, they were enrolled and randomly assigned, matched on age, to intervention and control groups. The integrated visual and auditory continuous performance test was administered as a pre-test. Those in the intervention group received vestibular stimulation during the therapy sessions, twice a week for 10 weeks. At the end, the test was administered to both groups as a post-test. Results: Pre- and post-test scores were measured, and the differences between means were compared for the two groups. Statistical analysis found a significant between-group difference in auditory comprehension improvement. Discussion: The findings suggest that vestibular training is a reliable and powerful treatment option for attention deficit hyperactivity disorder, especially alongside other training, meaning that stimulating the sense of balance highlights the importance of the interaction between inhibition and cognition.

  12. Switching in the Cocktail Party: Exploring Intentional Control of Auditory Selective Attention

    Science.gov (United States)

    Koch, Iring; Lawo, Vera; Fels, Janina; Vorlander, Michael

    2011-01-01

    Using a novel variant of dichotic selective listening, we examined the control of auditory selective attention. In our task, subjects had to respond selectively to one of two simultaneously presented auditory stimuli (number words), always spoken by a female and a male speaker, by performing a numerical size categorization. The gender of the…

  13. Impact of Spatial and Verbal Short-Term Memory Load on Auditory Spatial Attention Gradients.

    Science.gov (United States)

    Golob, Edward J; Winston, Jenna; Mock, Jeffrey R

    2017-01-01

    Short-term memory load can impair attentional control, but prior work shows that the extent of the effect ranges from being very general to very specific. One factor for the mixed results may be reliance on point estimates of memory load effects on attention. Here we used auditory attention gradients as an analog measure to map-out the impact of short-term memory load over space. Verbal or spatial information was maintained during an auditory spatial attention task and compared to no-load. Stimuli were presented from five virtual locations in the frontal azimuth plane, and subjects focused on the midline. Reaction times progressively increased for lateral stimuli, indicating an attention gradient. Spatial load further slowed responses at lateral locations, particularly in the left hemispace, but had little effect at midline. Verbal memory load had no (Experiment 1), or a minimal (Experiment 2) influence on reaction times. Spatial and verbal load increased switch costs between memory encoding and attention tasks relative to the no load condition. The findings show that short-term memory influences the distribution of auditory attention over space; and that the specific pattern depends on the type of information in short-term memory.

  15. Comparative Evaluation of Auditory Attention in 7 to 9 Year Old Learning Disabled Students

    Directory of Open Access Journals (Sweden)

    Fereshteh Amiriani

    2011-06-01

    Background and Aim: Learning disability is a term referring to a group of disorders manifested as listening, reading, writing, or mathematical problems. These children mostly have attention difficulties in the classroom that lead to many learning problems. In this study, we aimed to compare the auditory attention of 7- to 9-year-old children with learning disability to that of an age-matched normal group without learning disability. Methods: Twenty-seven male 7- to 9-year-old students with learning disability and 27 age- and sex-matched normal controls were selected by non-probability simple sampling. To evaluate auditory selective and divided attention, Farsi versions of the speech-in-noise and dichotic digits tests were used, respectively. Results: Comparison of the mean speech-in-noise scores for both ears of the 7- and 8-year-old students in the two groups indicated no significant difference (p>0.05). Mean scores of the 9-year-old controls were significantly higher than those of the cases only in the right ear (p=0.033). However, no significant difference was observed between mean dichotic digits scores for the right ear of 9-year-old students with and without learning disability (p>0.05). Moreover, mean scores of the 7- and 8-year-old students with learning disability were lower than those of their normal peers in the left ear (p>0.05). Conclusion: Selective auditory attention is not affected at an optimal signal-to-noise ratio, while divided attention seems to be affected by maturational delay of the auditory system or by central auditory system disorders.

  16. Can you hear me now? Musical training shapes functional brain networks for selective auditory attention and hearing speech in noise

    Directory of Open Access Journals (Sweden)

    Dana L Strait

    2011-06-01

    Even in the quietest of rooms, our senses are perpetually inundated by a barrage of sounds, requiring the auditory system to adapt to a variety of listening conditions in order to extract signals of interest (e.g., one speaker's voice amidst others). Brain networks that promote selective attention are thought to sharpen the neural encoding of a target signal, suppressing competing sounds and enhancing perceptual performance. Here, we ask: does musical training benefit cortical mechanisms that underlie selective attention to speech? To answer this question, we assessed the impact of selective auditory attention on cortical auditory-evoked response variability in musicians and nonmusicians. Outcomes indicate strengthened brain networks for selective auditory attention in musicians, in that musicians but not nonmusicians demonstrate decreased prefrontal response variability with auditory attention. Results are interpreted in the context of previous work from our laboratory documenting perceptual and subcortical advantages in musicians for the hearing and neural encoding of speech in background noise. Musicians' neural proficiency for selectively engaging and sustaining auditory attention to language indicates a potential benefit of music for auditory training. Given the importance of auditory attention for the development of language-related skills, musical training may aid in the prevention, habilitation and remediation of children with a wide range of attention-based language and learning impairments.

  17. Auditory Attention and Comprehension During a Simulated Night Shift: Effects of Task Characteristics.

    Science.gov (United States)

    Pilcher, June J; Jennings, Kristen S; Phillips, Ginger E; McCubbin, James A

    2016-11-01

    The current study investigated performance on a dual auditory task during a simulated night shift. Night shifts and sleep deprivation negatively affect performance on vigilance-based tasks, but less is known about the effects on complex tasks. Because language processing is necessary for successful work performance, it is important to understand how it is affected by night work and sleep deprivation. Sixty-two participants completed a simulated night shift resulting in 28 hr of total sleep deprivation. Performance on a vigilance task and a dual auditory language task was examined across four testing sessions. The results indicate that working at night negatively impacts vigilance, auditory attention, and comprehension. The effects on the auditory task varied based on the content of the auditory material. When the material was interesting and easy, the participants performed better. Night work had a greater negative effect when the auditory material was less interesting and more difficult. These findings support research that vigilance decreases during the night. The results suggest that auditory comprehension suffers when individuals are required to work at night. Maintaining attention and controlling effort especially on passages that are less interesting or more difficult could improve performance during night shifts. The results from the current study apply to many work environments where decision making is necessary in response to complex auditory information. Better predicting the effects of night work on language processing is important for developing improved means of coping with shiftwork. © 2016, Human Factors and Ergonomics Society.

  18. Music-induced positive mood broadens the scope of auditory attention.

    Science.gov (United States)

    Putkinen, Vesa; Makkonen, Tommi; Eerola, Tuomas

    2017-07-01

    Previous studies indicate that positive mood broadens the scope of visual attention, which can manifest as heightened distractibility. We used event-related potentials (ERP) to investigate whether music-induced positive mood has comparable effects on selective attention in the auditory domain. Subjects listened to experimenter-selected happy, neutral or sad instrumental music and afterwards participated in a dichotic listening task. Distractor sounds in the unattended channel elicited responses related to early sound encoding (N1/MMN) and bottom-up attention capture (P3a) while target sounds in the attended channel elicited a response related to top-down-controlled processing of task-relevant stimuli (P3b). For the subjects in a happy mood, the N1/MMN responses to the distractor sounds were enlarged while the P3b elicited by the target sounds was diminished. Behaviorally, these subjects tended to show heightened error rates on target trials following the distractor sounds. Thus, the ERP and behavioral results indicate that the subjects in a happy mood allocated their attentional resources more diffusely across the attended and the to-be-ignored channels. Therefore, the current study extends previous research on the effects of mood on visual attention and indicates that even unfamiliar instrumental music can broaden the scope of auditory attention via its effects on mood. © The Author (2017). Published by Oxford University Press.

  19. Interaction of streaming and attention in human auditory cortex.

    Science.gov (United States)

    Gutschalk, Alexander; Rupp, André; Dykstra, Andrew R

    2015-01-01

    Serially presented tones are sometimes segregated into two perceptually distinct streams. An ongoing debate is whether this basic streaming phenomenon reflects automatic processes or requires attention focused to the stimuli. Here, we examined the influence of focused attention on streaming-related activity in human auditory cortex using magnetoencephalography (MEG). Listeners were presented with a dichotic paradigm in which left-ear stimuli consisted of canonical streaming stimuli (ABA_ or ABAA) and right-ear stimuli consisted of a classical oddball paradigm. In phase one, listeners were instructed to attend the right-ear oddball sequence and detect rare deviants. In phase two, they were instructed to attend the left ear streaming stimulus and report whether they heard one or two streams. The frequency difference (ΔF) of the sequences was set such that the smallest and largest ΔF conditions generally induced one- and two-stream percepts, respectively. Two intermediate ΔF conditions were chosen to elicit bistable percepts (i.e., either one or two streams). Attention enhanced the peak-to-peak amplitude of the P1-N1 complex, but only for ambiguous ΔF conditions, consistent with the notion that automatic mechanisms for streaming tightly interact with attention and that the latter is of particular importance for ambiguous sound sequences.

  20. A Brief Period of Postnatal Visual Deprivation Alters the Balance between Auditory and Visual Attention.

    Science.gov (United States)

    de Heering, Adélaïde; Dormal, Giulia; Pelland, Maxime; Lewis, Terri; Maurer, Daphne; Collignon, Olivier

    2016-11-21

    Is a short and transient period of visual deprivation early in life sufficient to induce lifelong changes in how we attend to, and integrate, simple visual and auditory information [1, 2]? This question is of crucial importance given the recent demonstration in both animals and humans that a period of blindness early in life permanently affects the brain networks dedicated to visual, auditory, and multisensory processing [1-16]. To address this issue, we compared a group of adults who had been treated for congenital bilateral cataracts during early infancy with a group of normally sighted controls on a task requiring simple detection of lateralized visual and auditory targets, presented alone or in combination. Redundancy gains obtained from the audiovisual conditions were similar between groups and surpassed the reaction time distribution predicted by Miller's race model. However, in comparison to controls, cataract-reversal patients were faster at processing simple auditory targets and showed differences in how they shifted attention across modalities. Specifically, they were faster at switching attention from visual to auditory inputs than in the reverse situation, while an opposite pattern was observed for controls. Overall, these results reveal that the absence of visual input during the first months of life does not prevent the development of audiovisual integration but enhances the salience of simple auditory inputs, leading to a different crossmodal distribution of attentional resources between auditory and visual stimuli. Copyright © 2016 Elsevier Ltd. All rights reserved.
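    The race-model comparison mentioned above tests Miller's inequality, G_AV(t) ≤ G_A(t) + G_V(t), against the empirical cumulative RT distributions. A hedged sketch of that test; the quantile grid and function name are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, quantiles=np.arange(0.05, 1.0, 0.05)):
    """Maximum violation of Miller's race model inequality.

    rt_a, rt_v : reaction times for unimodal auditory and visual targets
    rt_av      : reaction times for redundant audiovisual targets
    Returns the largest value of G_AV(t) - min(G_A(t) + G_V(t), 1); a positive
    result means the redundant-target CDF exceeds the race-model bound,
    i.e., evidence for multisensory integration.
    """
    ts = np.quantile(rt_av, quantiles)  # evaluate at redundant-condition quantiles
    ecdf = lambda sample, t: np.mean(np.asarray(sample)[:, None] <= t, axis=0)
    g_av = ecdf(rt_av, ts)
    bound = np.minimum(ecdf(rt_a, ts) + ecdf(rt_v, ts), 1.0)
    return float(np.max(g_av - bound))
```

    A redundancy gain that "surpasses the race model", as both groups showed, corresponds to a positive violation here.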

  1. Do informal musical activities shape auditory skill development in preschool-age children?

    Science.gov (United States)

    Putkinen, Vesa; Saarikivi, Katri; Tervaniemi, Mari

    2013-08-29

    The influence of formal musical training on auditory cognition has been well established. For the majority of children, however, musical experience does not primarily consist of adult-guided training on a musical instrument. Instead, young children mostly engage in everyday musical activities such as singing and musical play. Here, we review recent electrophysiological and behavioral studies carried out in our laboratory and elsewhere which have begun to map how developing auditory skills are shaped by such informal musical activities both at home and in playschool-type settings. Although more research is still needed, the evidence emerging from these studies suggests that, in addition to formal musical training, informal musical activities can also influence the maturation of auditory discrimination and attention in preschool-aged children.

  2. Neural Correlates of Selective Attention With Hearing Aid Use Followed by ReadMyQuips Auditory Training Program.

    Science.gov (United States)

    Rao, Aparna; Rishiq, Dania; Yu, Luodi; Zhang, Yang; Abrams, Harvey

    The objectives of this study were to investigate the effects of hearing aid use and the effectiveness of ReadMyQuips (RMQ), an auditory training program, on speech perception performance and auditory selective attention using electrophysiological measures. RMQ is an audiovisual training program designed to improve speech perception in everyday noisy listening environments. Participants were adults with mild to moderate hearing loss who were first-time hearing aid users. After 4 weeks of hearing aid use, the experimental group completed RMQ training in 4 weeks, and the control group received listening practice on audiobooks during the same period. Cortical late event-related potentials (ERPs) and the Hearing in Noise Test (HINT) were administered at prefitting, pretraining, and post-training to assess effects of hearing aid use and RMQ training. An oddball paradigm allowed tracking of changes in P3a and P3b ERPs to distractors and targets, respectively. Behavioral measures were also obtained while ERPs were recorded from participants. After 4 weeks of hearing aid use but before auditory training, HINT results did not show a statistically significant change, but there was a significant P3a reduction. This reduction in P3a was correlated with improvement in d prime (d') in the selective attention task. Increased P3b amplitudes were also correlated with improvement in d' in the selective attention task. After training, this correlation between P3b and d' remained in the experimental group, but not in the control group. Similarly, HINT testing showed improved speech perception post training only in the experimental group. The criterion calculated in the auditory selective attention task showed a reduction only in the experimental group after training. ERP measures in the auditory selective attention task did not show any changes related to training. 
Hearing aid use was associated with a decrement in involuntary attention switch to distractors in the auditory selective
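    The d' and criterion measures reported above come from standard signal detection theory. A minimal stdlib sketch, assuming hit and false-alarm rates have already been computed; the function name is illustrative.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Signal-detection sensitivity (d') and criterion (c).

    hit_rate, fa_rate: proportions strictly between 0 and 1
    (in practice, extreme rates are adjusted before applying the z-transform).
    """
    z = NormalDist().inv_cdf          # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion
```

    A post-training reduction in the criterion, as reported for the experimental group, would show up here as c moving toward zero or below.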

  3. Brain correlates of the orientation of auditory spatial attention onto speaker location in a "cocktail-party" situation.

    Science.gov (United States)

    Lewald, Jörg; Hanenberg, Christina; Getzmann, Stephan

    2016-10-01

    Successful speech perception in complex auditory scenes with multiple competing speakers requires spatial segregation of auditory streams into perceptually distinct and coherent auditory objects and focusing of attention toward the speaker of interest. Here, we focused on the neural basis of this remarkable capacity of the human auditory system and investigated the spatiotemporal sequence of neural activity within the cortical network engaged in solving the "cocktail-party" problem. Twenty-eight subjects localized a target word in the presence of three competing sound sources. The analysis of the ERPs revealed an anterior contralateral subcomponent of the N2 (N2ac), computed as the difference waveform for targets to the left minus targets to the right. The N2ac peaked at about 500 ms after stimulus onset, and its amplitude was correlated with better localization performance. Cortical source localization for the contrast of left versus right targets at the time of the N2ac revealed a maximum in the region around left superior frontal sulcus and frontal eye field, both of which are known to be involved in processing of auditory spatial information. In addition, a posterior-contralateral late positive subcomponent (LPCpc) occurred at a latency of about 700 ms. Both these subcomponents are potential correlates of allocation of spatial attention to the target under cocktail-party conditions. © 2016 Society for Psychophysiological Research.

  4. A Persian version of the sustained auditory attention capacity test and its results in normal children

    Directory of Open Access Journals (Sweden)

    Sanaz Soltanparast

    2013-03-01

    Background and Aim: Sustained attention refers to the ability to maintain attention on target stimuli over a sustained period of time. This study was conducted to develop a Persian version of the sustained auditory attention capacity test and to study its results in normal children. Methods: To develop the Persian version of the sustained auditory attention capacity test, speech stimuli were used, as in the original version. The stimuli consisted of one hundred monosyllabic words, formed from 20 random repetitions of the words of a 21-word monosyllabic list, randomly grouped together. The test was carried out at a comfortable hearing level using binaural, diotic presentation on 46 normal children 7 to 11 years of age of both genders. Results: There was a significant relationship between age and both the mean impulsiveness error score (p=0.004) and the total test score (p=0.005). No significant relationship was found between age and either the mean inattention error score or the attention reduction span index. Gender did not have a significant effect on any index of the test. Conclusion: The results of this test in a group of normal-hearing children confirmed its ability to measure sustained auditory attention capacity through speech stimuli.
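    The inattention and impulsiveness error scores reported above follow the usual continuous-performance logic: missed targets count as inattention errors, responses to non-targets as impulsiveness errors. A minimal sketch of that scoring rule, inferred from the test description rather than taken from the published scoring manual.

```python
def attention_error_scores(is_target, responded):
    """Count error types in a sustained auditory attention task.

    is_target : per-trial flags marking the target words
    responded : per-trial flags marking whether the child responded

    Inattention errors are missed targets; impulsiveness errors are
    responses to non-target words (rule assumed from the test description).
    """
    inattention = sum(t and not r for t, r in zip(is_target, responded))
    impulsiveness = sum(r and not t for t, r in zip(is_target, responded))
    return inattention, impulsiveness
```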

  5. Bottom-up influences of voice continuity in focusing selective auditory attention.

    Science.gov (United States)

    Bressler, Scott; Masud, Salwa; Bharadwaj, Hari; Shinn-Cunningham, Barbara

    2014-01-01

    Selective auditory attention causes a relative enhancement of the neural representation of important information and suppression of the neural representation of distracting sound, which enables a listener to analyze and interpret information of interest. Some studies suggest that in both vision and in audition, the "unit" on which attention operates is an object: an estimate of the information coming from a particular external source out in the world. In this view, which object ends up in the attentional foreground depends on the interplay of top-down, volitional attention and stimulus-driven, involuntary attention. Here, we test the idea that auditory attention is object based by exploring whether continuity of a non-spatial feature (talker identity, a feature that helps acoustic elements bind into one perceptual object) also influences selective attention performance. In Experiment 1, we show that perceptual continuity of target talker voice helps listeners report a sequence of spoken target digits embedded in competing reversed digits spoken by different talkers. In Experiment 2, we provide evidence that this benefit of voice continuity is obligatory and automatic, as if voice continuity biases listeners by making it easier to focus on a subsequent target digit when it is perceptually linked to what was already in the attentional foreground. Our results support the idea that feature continuity enhances streaming automatically, thereby influencing the dynamic processes that allow listeners to successfully attend to objects through time in the cacophony that assails our ears in many everyday settings.

  6. The influence of an auditory-memory attention-demanding task on postural control in blind persons.

    Science.gov (United States)

    Melzer, Itshak; Damry, Elad; Landau, Anat; Yagev, Ronit

    2011-05-01

    In order to evaluate the effect of an auditory-memory attention-demanding task on balance control, nine blind adults were compared to nine age-gender-matched sighted controls. This issue is particularly relevant for the blind population in which functional assessment of postural control has to be revealed through "real life" motor and cognitive function. The study aimed to explore whether an auditory-memory attention-demanding cognitive task would influence postural control in blind persons and compare this with blindfolded sighted persons. Subjects were instructed to minimize body sway during narrow base upright standing on a single force platform under two conditions: 1) standing still (single task); 2) as in 1) while performing an auditory-memory attention-demanding cognitive task (dual task). Subjects in both groups were required to stand blindfolded with their eyes closed. Center of Pressure displacement data were collected and analyzed using summary statistics and stabilogram-diffusion analysis. Blind and sighted subjects had similar postural sway in eyes closed condition. However, for dual compared to single task, sighted subjects show significant decrease in postural sway while blind subjects did not. The auditory-memory attention-demanding cognitive task had no interference effect on balance control on blind subjects. It seems that sighted individuals used auditory cues to compensate for momentary loss of vision, whereas blind subjects did not. This may suggest that blind and sighted people use different sensorimotor strategies to achieve stability. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Influence of memory, attention, IQ and age on auditory temporal processing tests: preliminary study.

    Science.gov (United States)

    Murphy, Cristina Ferraz Borges; Zachi, Elaine Cristina; Roque, Daniela Tsubota; Ventura, Dora Selma Fix; Schochat, Eliane

    2014-01-01

    To investigate the existence of correlations between children's performance on auditory temporal tests (Frequency Pattern and Gaps in Noise--GIN) and IQ, attention, memory and age measurements. Fifteen typically developing individuals between the ages of 7 and 12 years with normal hearing participated in the study. Auditory temporal processing tests (GIN and Frequency Pattern) were applied, along with a memory test (Digit Span), attention tests (auditory and visual modalities) and an intelligence test (Raven's Progressive Matrices). A significant positive correlation was found between the Frequency Pattern test and age, which was considered good (p<0.01, 75.6%). There were no significant correlations between the GIN test and the variables tested. Auditory temporal skills seem to be influenced by different factors: while performance on the temporal ordering skill seems to be influenced by maturational processes, performance on temporal resolution was not influenced by any of the aspects investigated.

  8. Auditory spatial attention to speech and complex non-speech sounds in children with autism spectrum disorder.

    Science.gov (United States)

    Soskey, Laura N; Allen, Paul D; Bennetto, Loisa

    2017-08-01

    One of the earliest observable impairments in autism spectrum disorder (ASD) is a failure to orient to speech and other social stimuli. Auditory spatial attention, a key component of orienting to sounds in the environment, has been shown to be impaired in adults with ASD. Additionally, specific deficits in orienting to social sounds could be related to increased acoustic complexity of speech. We aimed to characterize auditory spatial attention in children with ASD and neurotypical controls, and to determine the effect of auditory stimulus complexity on spatial attention. In a spatial attention task, target and distractor sounds were played randomly in rapid succession from speakers in a free-field array. Participants attended to a central or peripheral location, and were instructed to respond to target sounds at the attended location while ignoring nearby sounds. Stimulus-specific blocks evaluated spatial attention for simple non-speech tones, speech sounds (vowels), and complex non-speech sounds matched to vowels on key acoustic properties. Children with ASD had significantly more diffuse auditory spatial attention than neurotypical children when attending front, indicated by increased responding to sounds at adjacent non-target locations. No significant differences in spatial attention emerged based on stimulus complexity. Additionally, in the ASD group, more diffuse spatial attention was associated with more severe ASD symptoms but not with general inattention symptoms. Spatial attention deficits have important implications for understanding social orienting deficits and atypical attentional processes that contribute to core deficits of ASD. Autism Res 2017, 10: 1405-1416. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.

  9. Aberrant interference of auditory negative words on attention in patients with schizophrenia.

    Directory of Open Access Journals (Sweden)

    Norichika Iwashiro

    Full Text Available Previous research suggests that deficits in attention-emotion interaction are implicated in schizophrenia symptoms. Although disruption in auditory processing is crucial in the pathophysiology of schizophrenia, deficits in the interaction between emotional processing of auditorily presented language stimuli and auditory attention have not yet been clarified. To address this issue, the current study used a dichotic listening task to examine 22 patients with schizophrenia and 24 age-, sex-, parental socioeconomic background-, handedness-, dexterous ear-, and intelligence quotient-matched healthy controls. The participants completed a word recognition task on the attended side in which a word with emotionally valenced content (negative/positive/neutral) was presented to one ear and a different neutral word was presented to the other ear. Participants selectively attended to either ear. In the control subjects, presentation of negative but not positive word stimuli provoked a significantly prolonged reaction time compared with presentation of neutral word stimuli. This interference effect for negative words existed whether or not subjects directed attention to the negative words. This interference effect was significantly smaller in the patients with schizophrenia than in the healthy controls. Furthermore, the smaller interference effect was significantly correlated with severe positive symptoms and delusional behavior in the patients with schizophrenia. The present findings suggest that aberrant interaction between semantic processing of negative emotional content and auditory attention plays a role in the production of positive symptoms in schizophrenia.

  10. Comparison of auditory and visual oddball fMRI in schizophrenia.

    Science.gov (United States)

    Collier, Azurii K; Wolf, Daniel H; Valdez, Jeffrey N; Turetsky, Bruce I; Elliott, Mark A; Gur, Raquel E; Gur, Ruben C

    2014-09-01

    Individuals with schizophrenia often suffer from attentional deficits, both in focusing on task-relevant targets and in inhibiting responses to distractors. Schizophrenia also has a differential impact on attention depending on modality: auditory or visual. However, it remains unclear how abnormal activation of attentional circuitry differs between auditory and visual modalities, as these two modalities have not been directly compared in the same individuals with schizophrenia. We utilized event-related functional magnetic resonance imaging (fMRI) to compare patterns of brain activation during an auditory and visual oddball task in order to identify modality-specific attentional impairment. Healthy controls (n=22) and patients with schizophrenia (n=20) completed auditory and visual oddball tasks in separate sessions. For responses to targets, the auditory modality yielded greater activation than the visual modality (A-V) in auditory cortex, insula, and parietal operculum, but visual activation was greater than auditory (V-A) in visual cortex. For responses to novels, A-V differences were found in auditory cortex, insula, and supramarginal gyrus; and V-A differences in the visual cortex, inferior temporal gyrus, and superior parietal lobule. Group differences in modality-specific activation were found only for novel stimuli; controls showed larger A-V differences than patients in prefrontal cortex and the putamen. Furthermore, for patients, greater severity of negative symptoms was associated with greater divergence of A-V novel activation in the visual cortex. Our results demonstrate that patients have more pronounced activation abnormalities in auditory compared to visual attention, and link modality specific abnormalities to negative symptom severity. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. A Comparison of Selective Auditory Attention Abilities in Open-Space Versus Closed Classroom Students.

    Science.gov (United States)

    Reinertsen, Gloria M.

    A study compared performances on a test of selective auditory attention between students educated in open-space versus closed classroom environments. An open-space classroom environment was defined as having no walls separating it from hallways or other classrooms. It was hypothesized that the incidence of auditory figure-ground (ability to focus…

  12. Attention, memory, and auditory processing in 10- to 15-year-old children with listening difficulties.

    Science.gov (United States)

    Sharma, Mridula; Dhamani, Imran; Leung, Johahn; Carlile, Simon

    2014-12-01

    The aim of this study was to examine attention, memory, and auditory processing in children with reported listening difficulty in noise (LDN) despite having clinically normal hearing. Twenty-one children with LDN and 15 children with no listening concerns (controls) participated. The clinically normed auditory processing tests included the Frequency/Pitch Pattern Test (FPT; Musiek, 2002), the Dichotic Digits Test (Musiek, 1983), the Listening in Spatialized Noise-Sentences (LiSN-S) test (Dillon, Cameron, Glyde, Wilson, & Tomlin, 2012), gap detection in noise (Baker, Jayewardene, Sayle, & Saeed, 2008), and masking level difference (MLD; Wilson, Moncrieff, Townsend, & Pillion, 2003). Also included were research-based psychoacoustic tasks, such as auditory stream segregation, localization, sinusoidal amplitude modulation (SAM), and fine structure perception. All were also evaluated on attention and memory test batteries. The LDN group was significantly slower switching their auditory attention and had poorer inhibitory control. Additionally, the group mean results showed significantly poorer performance on FPT, MLD, 4-Hz SAM, and memory tests. Close inspection of the individual data revealed that only 5 participants (out of 21) in the LDN group showed significantly poor performance on FPT compared with clinical norms. Further testing revealed the frequency discrimination of these 5 children to be significantly impaired. Thus, the LDN group showed deficits in attention switching and inhibitory control, whereas only a subset of these participants demonstrated an additional frequency resolution deficit.

  13. Action video games improve reading abilities and visual-to-auditory attentional shifting in English-speaking children with dyslexia.

    Science.gov (United States)

    Franceschini, Sandro; Trevisan, Piergiorgio; Ronconi, Luca; Bertoni, Sara; Colmar, Susan; Double, Kit; Facoetti, Andrea; Gori, Simone

    2017-07-19

    Dyslexia is characterized by difficulties in learning to read and there is some evidence that action video games (AVG), without any direct phonological or orthographic stimulation, improve reading efficiency in Italian children with dyslexia. However, the cognitive mechanism underlying this improvement and the extent to which the benefits of AVG training would generalize to deep English orthography, remain two critical questions. During reading acquisition, children have to integrate written letters with speech sounds, rapidly shifting their attention from visual to auditory modality. In our study, we tested reading skills and phonological working memory, visuo-spatial attention, auditory, visual and audio-visual stimuli localization, and cross-sensory attentional shifting in two matched groups of English-speaking children with dyslexia before and after they played AVG or non-action video games. The speed of words recognition and phonological decoding increased after playing AVG, but not non-action video games. Furthermore, focused visuo-spatial attention and visual-to-auditory attentional shifting also improved only after AVG training. This unconventional reading remediation program also increased phonological short-term memory and phoneme blending skills. Our report shows that an enhancement of visuo-spatial attention and phonological working memory, and an acceleration of visual-to-auditory attentional shifting can directly translate into better reading in English-speaking children with dyslexia.

  14. Role of the right inferior parietal cortex in auditory selective attention: An rTMS study.

    Science.gov (United States)

    Bareham, Corinne A; Georgieva, Stanimira D; Kamke, Marc R; Lloyd, David; Bekinschtein, Tristan A; Mattingley, Jason B

    2018-02-01

    Selective attention is the process of directing limited capacity resources to behaviourally relevant stimuli while ignoring competing stimuli that are currently irrelevant. Studies in healthy human participants and in individuals with focal brain lesions have suggested that the right parietal cortex is crucial for resolving competition for attention. Following right-hemisphere damage, for example, patients may have difficulty reporting a brief, left-sided stimulus if it occurs with a competitor on the right, even though the same left stimulus is reported normally when it occurs alone. Such "extinction" of contralesional stimuli has been documented for all the major sense modalities, but it remains unclear whether its occurrence reflects involvement of one or more specific subregions of the temporo-parietal cortex. Here we employed repetitive transcranial magnetic stimulation (rTMS) over the right hemisphere to examine the effect of disruption of two candidate regions - the supramarginal gyrus (SMG) and the superior temporal gyrus (STG) - on auditory selective attention. Eighteen neurologically normal, right-handed participants performed an auditory task, in which they had to detect target digits presented within simultaneous dichotic streams of spoken distractor letters in the left and right channels, both before and after 20 min of 1 Hz rTMS over the SMG, STG or a somatosensory control site (S1). Across blocks, participants were asked to report on auditory streams in the left, right, or both channels, which yielded focused and divided attention conditions. Performance was unchanged for the two focused attention conditions, regardless of stimulation site, but was selectively impaired for contralateral left-sided targets in the divided attention condition following stimulation of the right SMG, but not the STG or S1. Our findings suggest a causal role for the right inferior parietal cortex in auditory selective attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Attending to auditory memory.

    Science.gov (United States)

    Zimmermann, Jacqueline F; Moscovitch, Morris; Alain, Claude

    2016-06-01

    Attention to memory describes the process of attending to memory traces when the object is no longer present. It has been studied primarily for representations of visual stimuli with only few studies examining attention to sound object representations in short-term memory. Here, we review the interplay of attention and auditory memory with an emphasis on 1) attending to auditory memory in the absence of related external stimuli (i.e., reflective attention) and 2) effects of existing memory on guiding attention. Attention to auditory memory is discussed in the context of change deafness, and we argue that failures to detect changes in our auditory environments are most likely the result of a faulty comparison system of incoming and stored information. Also, objects are the primary building blocks of auditory attention, but attention can also be directed to individual features (e.g., pitch). We review short-term and long-term memory guided modulation of attention based on characteristic features, location, and/or semantic properties of auditory objects, and propose that auditory attention to memory pathways emerge after sensory memory. A neural model for auditory attention to memory is developed, which comprises two separate pathways in the parietal cortex, one involved in attention to higher-order features and the other involved in attention to sensory information. This article is part of a Special Issue entitled SI: Auditory working memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Auditory and Visual Working Memory Functioning in College Students with Attention-Deficit/Hyperactivity Disorder and/or Learning Disabilities.

    Science.gov (United States)

    Liebel, Spencer W; Nelson, Jason M

    2017-12-01

    We investigated auditory and visual working memory functioning in college students with attention-deficit/hyperactivity disorder, learning disabilities, and clinical controls. We examined the role attention-deficit/hyperactivity disorder subtype status played in working memory functioning. The unique influence that both domains of working memory have on reading and math abilities was investigated. A sample of 268 individuals seeking postsecondary education comprised the four groups of the present study: 110 had an attention-deficit/hyperactivity disorder diagnosis only, 72 had a learning disability diagnosis only, 35 had comorbid attention-deficit/hyperactivity disorder and learning disability diagnoses, and 60 individuals without either of these disorders comprised a clinical control group. Participants underwent a comprehensive neuropsychological evaluation, and licensed psychologists employed a multi-informant, multi-method approach in obtaining diagnoses. In the attention-deficit/hyperactivity disorder only group, there was no difference between auditory and visual working memory functioning, t(100) = -1.57, p = .12. In the learning disability group, however, auditory working memory functioning was significantly weaker compared with visual working memory, t(71) = -6.19, p < .001. In the attention-deficit/hyperactivity disorder only group, there were no auditory or visual working memory functioning differences between participants with either a predominantly inattentive type or a combined type diagnosis. Visual working memory did not incrementally contribute to the prediction of academic achievement skills. Individuals with attention-deficit/hyperactivity disorder did not demonstrate significant working memory differences compared with clinical controls. Individuals with a learning disability demonstrated weaker auditory working memory than individuals in either the attention-deficit/hyperactivity or clinical control groups. © The Author 2017. Published by Oxford University Press.

  17. Age-dependent impairment of auditory processing under spatially focused and divided attention: an electrophysiological study.

    Science.gov (United States)

    Wild-Wall, Nele; Falkenstein, Michael

    2010-01-01

    By using event-related potentials (ERPs) the present study examines if age-related differences in preparation and processing especially emerge during divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues as was reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to fulfill task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely as was reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues especially under divided attention and latent difficulties to suppress irrelevant information.

  18. Impaired Facilitatory Mechanisms of Auditory Attention After Damage of the Lateral Prefrontal Cortex

    OpenAIRE

    Bidet-Caulet, Aurélie; Buchanan, Kelly G.; Viswanath, Humsini; Black, Jessica; Scabini, Donatella; Bonnet-Brilhault, Frédérique; Knight, Robert T.

    2014-01-01

    There is growing evidence that auditory selective attention operates via distinct facilitatory and inhibitory mechanisms enabling selective enhancement and suppression of sound processing, respectively. The lateral prefrontal cortex (LPFC) plays a crucial role in the top-down control of selective attention. However, whether the LPFC controls facilitatory, inhibitory, or both attentional mechanisms is unclear. Facilitatory and inhibitory mechanisms were assessed, in patients with LPFC damage, ...

  19. Improvement of auditory hallucinations and reduction of primary auditory area's activation following TMS

    International Nuclear Information System (INIS)

    Giesel, Frederik L.; Mehndiratta, Amit; Hempel, Albrecht; Hempel, Eckhard; Kress, Kai R.; Essig, Marco; Schröder, Johannes

    2012-01-01

    Background: In the present case study, improvement of auditory hallucinations following transcranial magnetic stimulation (TMS) therapy was investigated with respect to activation changes of the auditory cortices. Methods: Using functional magnetic resonance imaging (fMRI), activation of the auditory cortices was assessed prior to and after a 4-week TMS series of the left superior temporal gyrus in a schizophrenic patient with medication-resistant auditory hallucinations. Results: Hallucinations decreased slightly after the third and profoundly after the fourth week of TMS. Activation in the primary auditory area decreased, whereas activation in the operculum and insula remained stable. Conclusions: Combination of TMS and repetitive fMRI is promising to elucidate the physiological changes induced by TMS.

  20. Bottom-up influences of voice continuity in focusing selective auditory attention

    OpenAIRE

    Bressler, Scott; Masud, Salwa; Bharadwaj, Hari; Shinn-Cunningham, Barbara

    2014-01-01

    Selective auditory attention causes a relative enhancement of the neural representation of important information and suppression of the neural representation of distracting sound, which enables a listener to analyze and interpret information of interest. Some studies suggest that in both vision and in audition, the “unit” on which attention operates is an object: an estimate of the information coming from a particular external source out in the world. In this view, which object ends up in the...

  1. Pre-Attentive Auditory Processing of Lexicality

    Science.gov (United States)

    Jacobsen, Thomas; Horvath, Janos; Schroger, Erich; Lattner, Sonja; Widmann, Andreas; Winkler, Istvan

    2004-01-01

    The effects of lexicality on auditory change detection based on auditory sensory memory representations were investigated by presenting oddball sequences of repeatedly presented stimuli, while participants ignored the auditory stimuli. In a cross-linguistic study of Hungarian and German participants, stimulus sequences were composed of words that…

  2. Examining Age-Related Differences in Auditory Attention Control Using a Task-Switching Procedure

    OpenAIRE

    Vera Lawo; Iring Koch

    2014-01-01

    Objectives. Using a novel task-switching variant of dichotic selective listening, we examined age-related differences in the ability to intentionally switch auditory attention between 2 speakers defined by their sex.

  3. Incorporating modern neuroscience findings to improve brain-computer interfaces: tracking auditory attention.

    Science.gov (United States)

    Wronkiewicz, Mark; Larson, Eric; Lee, Adrian Kc

    2016-10-01

    Brain-computer interface (BCI) technology allows users to generate actions based solely on their brain signals. However, current non-invasive BCIs generally classify brain activity recorded from surface electroencephalography (EEG) electrodes, which can hinder the application of findings from modern neuroscience research. In this study, we use source imaging, a neuroimaging technique that projects EEG signals onto the surface of the brain, in a BCI classification framework. This allowed us to incorporate prior research from functional neuroimaging to target activity from a cortical region involved in auditory attention. Classifiers trained to detect attention switches performed better with source imaging projections than with EEG sensor signals. Within source imaging, including subject-specific anatomical MRI information (instead of using a generic head model) further improved classification performance. This source-based strategy also reduced accuracy variability across three dimensionality reduction techniques, a major design choice in most BCIs. Our work shows that source imaging provides clear quantitative and qualitative advantages to BCIs and highlights the value of incorporating modern neuroscience knowledge and methods into BCI systems.

  4. A new test of attention in listening (TAIL) predicts auditory performance.

    Science.gov (United States)

    Zhang, Yu-Xuan; Barry, Johanna G; Moore, David R; Amitay, Sygal

    2012-01-01

    Attention modulates auditory perception, but there are currently no simple tests that specifically quantify this modulation. To fill the gap, we developed a new, easy-to-use test of attention in listening (TAIL) based on reaction time. On each trial, two clearly audible tones were presented sequentially, either at the same or different ears. The frequency of the tones was also either the same or different (by at least two critical bands). When the task required same/different frequency judgments, presentation at the same ear significantly speeded responses and reduced errors. A same/different ear (location) judgment was likewise facilitated by keeping tone frequency constant. Perception was thus influenced by involuntary orienting of attention along the task-irrelevant dimension. When information in the two stimulus dimensions was congruent (same-frequency same-ear, or different-frequency different-ear), responses were faster and more accurate than when they were incongruent (same-frequency different-ear, or different-frequency same-ear), suggesting the involvement of executive control to resolve conflicts. In total, the TAIL yielded five independent outcome measures: (1) baseline reaction time, indicating information processing efficiency, (2) involuntary orienting of attention to frequency and (3) location, and (4) conflict resolution for frequency and (5) location. Processing efficiency and conflict resolution accounted for up to 45% of individual variances in the low- and high-threshold variants of three psychoacoustic tasks assessing temporal and spectral processing. Involuntary orienting of attention to the irrelevant dimension did not correlate with perceptual performance on these tasks. Given that TAIL measures are unlikely to be limited by perceptual sensitivity, we suggest that the correlations reflect modulation of perceptual performance by attention. The TAIL thus has the power to identify and separate contributions of different components of attention.
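The five TAIL outcome measures are plain reaction-time contrasts, so they are easy to reproduce. The sketch below is illustrative only: the trial encoding and the function name are assumptions for this example, not taken from the published test.

```python
import statistics

def tail_measures(trials):
    """Derive TAIL-style scores for the frequency-judgment task.

    Each (hypothetical) trial is a tuple (same_ear, same_freq, rt_ms),
    recording whether the two tones matched on each dimension and the
    reaction time for a correct response.
    """
    def mean_rt(keep):
        return statistics.mean(rt for ear, freq, rt in trials if keep(ear, freq))

    # (1) Baseline reaction time: overall information-processing efficiency.
    baseline = mean_rt(lambda ear, freq: True)
    # (2) Involuntary orienting: cost of a change on the irrelevant (ear) dimension.
    orienting = mean_rt(lambda ear, freq: not ear) - mean_rt(lambda ear, freq: ear)
    # (4) Conflict resolution: incongruent minus congruent trials.
    conflict = (mean_rt(lambda ear, freq: ear != freq)
                - mean_rt(lambda ear, freq: ear == freq))
    return baseline, orienting, conflict
```

The location-judgment variant would swap the roles of the two dimensions, yielding the analogous measures (3) and (5).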

  5. Active auditory experience in infancy promotes brain plasticity in Theta and Gamma oscillations

    Directory of Open Access Journals (Sweden)

    Gabriella Musacchia

    2017-08-01

    Full Text Available Language acquisition in infants is driven by on-going neural plasticity that is acutely sensitive to environmental acoustic cues. Recent studies showed that attention-based experience with non-linguistic, temporally-modulated auditory stimuli sharpens cortical responses. A previous ERP study from this laboratory showed that interactive auditory experience via behavior-based feedback (AEx), over a 6-week period from 4- to 7-months-of-age, confers a processing advantage compared to passive auditory exposure (PEx) or maturation alone (Naïve Control, NC). Here, we provide a follow-up investigation of the underlying neural oscillatory patterns in these three groups. In AEx infants, Standard stimuli with invariant frequency (STD) elicited greater Theta-band (4–6 Hz) activity in Right Auditory Cortex (RAC), as compared to NC infants, and Deviant stimuli with rapid frequency change (DEV) elicited larger responses in Left Auditory Cortex (LAC). PEx and NC counterparts showed less-mature bilateral patterns. AEx infants also displayed stronger Gamma (33–37 Hz) activity in the LAC during DEV discrimination, compared to NCs, while NC and PEx groups demonstrated bilateral activity in this band, if at all. This suggests that interactive acoustic experience with non-linguistic stimuli can promote a distinct, robust and precise cortical pattern during rapid auditory processing, perhaps reflecting mechanisms that support fine-tuning of early acoustic mapping.

  6. Long-term memory biases auditory spatial attention.

    Science.gov (United States)

    Zimmermann, Jacqueline F; Moscovitch, Morris; Alain, Claude

    2017-10-01

    Long-term memory (LTM) has been shown to bias attention to a previously learned visual target location. Here, we examined whether memory-predicted spatial location can facilitate the detection of a faint pure tone target embedded in real world audio clips (e.g., soundtrack of a restaurant). During an initial familiarization task, participants heard audio clips, some of which included a lateralized target (p = 50%). On each trial participants indicated whether the target was presented from the left, right, or was absent. Following a 1 hr retention interval, participants were presented with the same audio clips, which now all included a target. In Experiment 1, participants showed memory-based gains in response time and d'. Experiment 2 showed that temporal expectations modulate attention, with greater memory-guided attention effects on performance when temporal context was reinstated from learning (i.e., when timing of the target within audio clips was not changed from initially learned timing). Experiment 3 showed that while conscious recall of target locations was modulated by exposure to target-context associations during learning (i.e., better recall with higher number of learning blocks), the influence of LTM associations on spatial attention was not reduced (i.e., number of learning blocks did not affect memory-guided attention). Both Experiments 2 and 3 showed gains in performance related to target-context associations, even for associations that were not explicitly remembered. Together, these findings indicate that memory for audio clips is acquired quickly and is surprisingly robust; both implicit and explicit LTM for the location of a faint target tone modulated auditory spatial attention. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
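The d' gains reported above are the standard signal-detection sensitivity index, d' = Z(hit rate) - Z(false-alarm rate). A minimal sketch follows; the log-linear correction shown is one common convention, not necessarily the one used in this study.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = Z(hit rate) - Z(false-alarm rate).

    A log-linear correction (0.5 added to each cell) keeps perfect or
    zero rates from producing infinite z-scores.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return z(hit_rate) - z(fa_rate)
```

For example, 45 hits and 5 false alarms out of 50 signal and 50 noise trials give a d' of roughly 2.5, while equal hit and false-alarm rates give 0.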

  7. A right-ear bias of auditory selective attention is evident in alpha oscillations.

    Science.gov (United States)

    Payne, Lisa; Rogers, Chad S; Wingfield, Arthur; Sekuler, Robert

    2017-04-01

    Auditory selective attention makes it possible to pick out one speech stream that is embedded in a multispeaker environment. We adapted a cued dichotic listening task to examine suppression of a speech stream lateralized to the nonattended ear, and to evaluate the effects of attention on the right ear's well-known advantage in the perception of linguistic stimuli. After being cued to attend to input from either their left or right ear, participants heard two different four-word streams presented simultaneously to the separate ears. Following each dichotic presentation, participants judged whether a spoken probe word had been in the attended ear's stream. We used EEG signals to track participants' spatial lateralization of auditory attention, which is marked by interhemispheric differences in EEG alpha (8-14 Hz) power. A right-ear advantage (REA) was evident in faster response times and greater sensitivity in distinguishing attended from unattended words. Consistent with the REA, we found strongest parietal and right frontotemporal alpha modulation during the attend-right condition. These findings provide evidence for a link between selective attention and the REA during directed dichotic listening. © 2016 Society for Psychophysiological Research.
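Interhemispheric alpha (8-14 Hz) modulation of the kind described above is typically quantified as a lateralization index over band-limited power. This is a rough, hypothetical sketch: a bare FFT periodogram stands in for whatever spectral estimator the study actually used, and the function names are invented for illustration.

```python
import numpy as np

def alpha_power(signal, fs, band=(8.0, 14.0)):
    """Mean periodogram power of `signal` within the alpha band (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def lateralization_index(left_chan, right_chan, fs):
    """(right - left) / (right + left) alpha power; positive values mean
    relatively more alpha over the right-hemisphere channel."""
    r = alpha_power(right_chan, fs)
    l = alpha_power(left_chan, fs)
    return (r - l) / (r + l)
```

With a 10 Hz sinusoid at twice the amplitude (four times the power) on the right channel, the index comes out at (4 - 1) / (4 + 1) = 0.6.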

  8. [Some electrophysiological and hemodynamic characteristics of auditory selective attention in norm and schizophrenia].

    Science.gov (United States)

    Lebedeva, I S; Akhadov, T A; Petriaĭkin, A V; Kaleda, V G; Barkhatova, A N; Golubev, S A; Rumiantseva, E E; Vdovenko, A M; Fufaeva, E A; Semenova, N A

    2011-01-01

    Six patients in the state of remission after a first episode of juvenile schizophrenia and seven sex- and age-matched mentally healthy subjects were examined by fMRI and ERP methods. The auditory oddball paradigm was applied. Differences in P300 parameters did not reach the level of significance; however, a significantly higher hemodynamic response to target stimuli was found in patients bilaterally in the supramarginal gyrus and in the right medial frontal gyrus, which points to pathology of these brain areas in supporting auditory selective attention.

  9. The right planum temporale is involved in stimulus-driven, auditory attention--evidence from transcranial magnetic stimulation.

    Directory of Open Access Journals (Sweden)

    Marco Hirnstein

    Full Text Available It is well known that the planum temporale (PT) area in the posterior temporal lobe carries out spectro-temporal analysis of auditory stimuli, which is crucial for speech, for example. There are suggestions that the PT is also involved in auditory attention, specifically in the discrimination and selection of stimuli from the left and right ear. However, direct evidence is missing so far. To examine the role of the PT in auditory attention we asked fourteen participants to complete the Bergen Dichotic Listening Test. In this test two different consonant-vowel syllables (e.g., "ba" and "da") are presented simultaneously, one to each ear, and participants are asked to verbally report the syllable they heard best or most clearly. Thus attentional selection of a syllable is stimulus-driven. Each participant completed the test three times: after their left and right PT (located with anatomical brain scans) had been stimulated with repetitive transcranial magnetic stimulation (rTMS), which transiently interferes with normal brain functioning in the stimulated sites, and after sham stimulation, where participants were led to believe they had been stimulated but no rTMS was applied (control). After sham stimulation the typical right ear advantage emerged, that is, participants reported relatively more right than left ear syllables, reflecting a left-hemispheric dominance for language. rTMS over the right but not left PT significantly reduced the right ear advantage. This was the result of participants reporting more left and fewer right ear syllables after right PT stimulation, suggesting there was a leftward shift in stimulus selection. Taken together, our findings point to a new function of the PT in addition to auditory perception: particularly the right PT is involved in stimulus selection and (stimulus-driven) auditory attention.

  10. Auditory Selective Attention: an introduction and evidence for distinct facilitation and inhibition mechanisms

    OpenAIRE

    Mikyska, Constanze Elisabeth Anna

    2012-01-01

    Objective Auditory selective attention is a complex brain function that is still not completely understood. The classic example is the so-called “cocktail party effect” (Cherry, 1953), which describes the impressive ability to focus one’s attention on a single voice from a multitude of voices. This means that particular stimuli in the environment are enhanced in contrast to other ones of lower priority that are ignored. To be able to understand how attention can influence the perception and p...

  11. Early auditory evoked potential is modulated by selective attention and related to individual differences in visual working memory capacity.

    Science.gov (United States)

    Giuliano, Ryan J; Karns, Christina M; Neville, Helen J; Hillyard, Steven A

    2014-12-01

    A growing body of research suggests that the predictive power of working memory (WM) capacity for measures of intellectual aptitude is due to the ability to control attention and select relevant information. Crucially, attentional mechanisms implicated in controlling access to WM are assumed to be domain-general, yet reports of enhanced attentional abilities in individuals with larger WM capacities are primarily within the visual domain. Here, we directly test the link between WM capacity and early attentional gating across sensory domains, hypothesizing that measures of visual WM capacity should predict an individual's capacity to allocate auditory selective attention. To address this question, auditory ERPs were recorded in a linguistic dichotic listening task, and individual differences in ERP modulations by attention were correlated with estimates of WM capacity obtained in a separate visual change detection task. Auditory selective attention enhanced ERP amplitudes at an early latency (ca. 70-90 msec), with larger P1 components elicited by linguistic probes embedded in an attended narrative. Moreover, this effect was associated with greater individual estimates of visual WM capacity. These findings support the view that domain-general attentional control mechanisms underlie the wide variation of WM capacity across individuals.
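The abstract does not specify how visual WM capacity was estimated from the change-detection task; the conventional estimator in this literature is Cowan's K, sketched below (the formula is standard, but its use here is an assumption and the example numbers are illustrative).

```python
def cowans_k(set_size: int, hit_rate: float, false_alarm_rate: float) -> float:
    """Cowan's K estimate of visual working-memory capacity.

    Standard read-out for single-probe change detection:
    K = set size * (hit rate - false-alarm rate).
    """
    return set_size * (hit_rate - false_alarm_rate)

# Illustrative: set size 4 with 85% hits and 15% false alarms.
print(round(cowans_k(4, 0.85, 0.15), 2))  # prints 2.8
```

In a study like this one, such per-subject K values would then be correlated with each subject's attended-minus-unattended P1 amplitude difference.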

  12. [Thalamus and Attention].

    Science.gov (United States)

    Tokoro, Kazuhiko; Sato, Hironobu; Yamamoto, Mayumi; Nagai, Yoshiko

    2015-12-01

    Attention is the process by which information is selected; the thalamus plays an important role in the selective attention to visual and auditory information. Selective attention can be a conscious effort; however, it occurs subconsciously as well. The lateral geniculate body (LGB) filters visual information before it reaches the cortex (bottom-up attention). The thalamic reticular nucleus (TRN) provides a strong inhibitory input to both the LGB and the pulvinar. This regulation involves focusing a spotlight on important information, as well as inhibiting unnecessary background information. Behavioral contexts more strongly modulate activity of the TRN and pulvinar, influencing feedforward and feedback information transmission between the frontal, temporal, parietal, and occipital cortical areas (top-down attention). The medial geniculate body (MGB) filters auditory information, and the TRN inhibits the MGB. Attentional modulation occurring in the auditory pathway among the cochlea, cochlear nucleus, superior olivary complex, and inferior colliculus is more important than that of the MGB and TRN. We also discuss the attentional consequences of thalamic hemorrhage.

  13. Attention deficits revealed by passive auditory change detection for pure tones and lexical tones in ADHD children.

    Science.gov (United States)

    Yang, Ming-Tao; Hsu, Chun-Hsien; Yeh, Pei-Wen; Lee, Wang-Tso; Liang, Jao-Shwann; Fu, Wen-Mei; Lee, Chia-Ying

    2015-01-01

    Inattention (IA) has been a major problem in children with attention deficit/hyperactivity disorder (ADHD), accounting for their behavioral and cognitive dysfunctions. However, there are at least three processing steps underlying attentional control for auditory change detection, namely pre-attentive change detection, involuntary attention orienting, and attention reorienting for further evaluation. This study aimed to examine whether children with ADHD would show deficits in any of these subcomponents by using mismatch negativity (MMN), P3a, and late discriminative negativity (LDN) as event-related potential (ERP) markers, under the passive auditory oddball paradigm. Two types of stimuli-pure tones and Mandarin lexical tones-were used to examine if the deficits were general across linguistic and non-linguistic domains. Participants included 15 native Mandarin-speaking children with ADHD and 16 age-matched controls (across groups, age ranged between 6 and 15 years). Two passive auditory oddball paradigms (lexical tones and pure tones) were applied. The pure tone oddball paradigm included a standard stimulus (1000 Hz, 80%) and two deviant stimuli (1015 and 1090 Hz, 10% each). The Mandarin lexical tone oddball paradigm's standard stimulus was /yi3/ (80%) and two deviant stimuli were /yi1/ and /yi2/ (10% each). The results showed no MMN difference, but did show attenuated P3a and enhanced LDN to the large deviants for both pure and lexical tone changes in the ADHD group. Correlation analysis showed that children with higher ADHD tendency, as indexed by parents' and teachers' ratings of ADHD symptoms, showed less positive P3a amplitudes when responding to large lexical tone deviants. Thus, children with ADHD showed impaired auditory change detection for both pure tones and lexical tones in both involuntary attention switching and attention reorienting for further evaluation. These ERP markers may therefore be used for the evaluation of anti-ADHD drugs that aim to
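The paradigm above fully specifies the stimulus probabilities (standard 80%, two deviants 10% each). A minimal sketch of generating such a trial sequence; the rule that two deviants never occur back-to-back is a common oddball convention assumed here, not stated in the abstract:

```python
import random

def oddball_sequence(n_trials: int, standard: str, deviants: list,
                     p_deviant: float = 0.10, seed: int = 1) -> list:
    """Pseudorandom passive-oddball trial sequence.

    Each deviant type is drawn with probability p_deviant per trial.
    Assumed convention (not from the abstract): a deviant is always
    followed by at least one standard.
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        prev_was_deviant = bool(seq) and seq[-1] != standard
        if not prev_was_deviant and rng.random() < p_deviant * len(deviants):
            seq.append(rng.choice(deviants))
        else:
            seq.append(standard)
    return seq

seq = oddball_sequence(1000, "1000Hz", ["1015Hz", "1090Hz"])
# Fraction of standards: near the nominal 0.8 (the spacing rule raises it slightly).
print(round(seq.count("1000Hz") / len(seq), 2))
```

The same generator would serve for the lexical-tone paradigm with `"/yi3/"` as the standard and `["/yi1/", "/yi2/"]` as deviants.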

  14. Attention deficits revealed by passive auditory change detection for pure tones and lexical tones in ADHD children

    Directory of Open Access Journals (Sweden)

    Ming-Tao eYang

    2015-08-01

    Full Text Available Inattention has been a major problem in children with attention deficit/hyperactivity disorder (ADHD, accounting for their behavioral and cognitive dysfunctions. However, there are at least three processing steps underlying attentional control for auditory change detection, namely pre-attentive change detection, involuntary attention orienting, and attention reorienting for further evaluation. This study aimed to examine whether children with ADHD would show deficits in any of these subcomponents by using mismatch negativity (MMN, P3a, and late discriminative negativity (LDN as event-related potential (ERP markers, under the passive auditory oddball paradigm. Two types of stimuli - pure tones and Mandarin lexical tones - were used to examine if the deficits were general across linguistic and non-linguistic domains. Participants included 15 native Mandarin-speaking children with ADHD and 16 age-matched controls (across groups, age ranged between 6 and 15 years. Two passive auditory oddball paradigms (lexical tones and pure tones were applied. Pure tone paradigm included standard stimuli (1000 Hz, 80% and two deviant stimuli (1015 Hz and 1090 Hz, 10% each. The Mandarin lexical tone paradigm’s standard stimuli was /yi3/ (80% and two deviant stimuli were /yi1/ and /yi2/ (10% each. The results showed no MMN difference, but did show attenuated P3a and enhanced LDN to the large deviants for both pure and lexical tone changes in the ADHD group. Correlation analysis showed that children with higher ADHD tendency, as indexed by parents’ and teachers’ rating on ADHD symptoms, showed less positive P3a amplitudes when responding to large lexical tone deviants. Thus, children with ADHD showed impaired auditory change detection for both pure tones and lexical tones in both involuntary attention switching, and attention reorienting for further evaluation. These ERP markers may therefore be used for evaluation of anti-ADHD drugs that aim to alleviate these

  15. Attention effects at auditory periphery derived from human scalp potentials: displacement measure of potentials.

    Science.gov (United States)

    Ikeda, Kazunari; Hayashi, Akiko; Sekiguchi, Takahiro; Era, Shukichi

    2006-10-01

    In humans, electrophysiological measures such as the auditory brainstem response (ABR) have difficulty identifying the attention effect at the auditory periphery, whereas the centrifugal effect has been detected by measuring otoacoustic emissions. This research developed a measure responsive to the shift of human scalp potentials within a brief post-stimulus period (13 ms), termed the displacement percentage, and applied it in an experiment to retrieve the peripheral attention effect. In the present experimental paradigm, tone pips were presented to the left ear while the other ear was masked with white noise. Twelve participants each completed two conditions, either ignoring or attending to the tone pips. Relative to averaged scalp potentials in the ignoring condition, a shift of the potentials was found within the early component range during the attentive condition, and the displacement percentage revealed a significant magnitude difference between the two conditions. These results suggest that, using a measure representing the potential shift itself, the peripheral effect of attention can be detected from human scalp potentials.

  16. Increased psychophysiological parameters of attention in non-psychotic individuals with auditory verbal hallucinations

    DEFF Research Database (Denmark)

    van Lutterveld, Remko; Oranje, Bob; Abramovic, Lucija

    2010-01-01

    with an auditory oddball paradigm in 18 non-psychotic individuals with AVH and 18 controls. RESULTS: P300 amplitude was increased in the AVH group as compared to controls, reflecting superior effortful attention. A trend in the same direction was found for processing negativity. No significant differences were...... found for mismatch negativity. CONCLUSION: Contrary to our expectations, non-psychotic individuals with AVH show increased rather than decreased psychophysiological measures of effortful attention compared to healthy controls, refuting a pivotal role of decreased effortful attention...

  17. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. 
These results support the hypothesis that behavioral abilities amongst listeners with NHTs can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.
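The subcortical coding-strength measure described above compares EFR amplitude under full vs. partial masking. One plausible formalization, sketched under assumptions: the sine/cosine-projection read-out at the modulation frequency and all function names are illustrative, not the authors' actual pipeline.

```python
import math

def efr_amplitude(response, fs, mod_freq):
    """Amplitude of an envelope-following response at the modulation
    frequency, via projection onto sine and cosine components
    (an illustrative read-out, not the authors' method)."""
    n = len(response)
    s = sum(x * math.sin(2 * math.pi * mod_freq * i / fs)
            for i, x in enumerate(response))
    c = sum(x * math.cos(2 * math.pi * mod_freq * i / fs)
            for i, x in enumerate(response))
    return 2.0 * math.hypot(s, c) / n

def coding_strength(partially_masked, fully_masked, fs, mod_freq):
    """Ratio of fully- to partially-masked EFR amplitude: values near 1.0
    mean little masking-induced degradation (strong subcortical coding)."""
    return (efr_amplitude(fully_masked, fs, mod_freq)
            / efr_amplitude(partially_masked, fs, mod_freq))

# Synthetic check: the "fully masked" response is a scaled-down copy.
fs, f = 1000.0, 100.0
partial = [math.sin(2 * math.pi * f * i / fs) for i in range(1000)]
fully = [0.4 * x for x in partial]
print(round(coding_strength(partial, fully, fs, f), 2))  # prints 0.4
```

Per-subject values of such a ratio are what would be correlated with behavioral performance in Experiments I and II.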

  18. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Full Text Available Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. 
These results support the hypothesis that behavioral abilities amongst listeners with normal hearing thresholds can arise due to both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.

  19. Spatial selective auditory attention in the presence of reverberant energy: individual differences in normal-hearing listeners.

    Science.gov (United States)

    Ruggles, Dorea; Shinn-Cunningham, Barbara

    2011-06-01

    Listeners can selectively attend to a desired target by directing attention to known target source features, such as location or pitch. Reverberation, however, reduces the reliability of the cues that allow a target source to be segregated and selected from a sound mixture. Given this, it is likely that reverberant energy interferes with selective auditory attention. Anecdotal reports suggest that the ability to focus spatial auditory attention degrades even with early aging, yet there is little evidence that middle-aged listeners have behavioral deficits on tasks requiring selective auditory attention. The current study was designed to look for individual differences in selective attention ability and to see if any such differences correlate with age. Normal-hearing adults, ranging in age from 18 to 55 years, were asked to report a stream of digits located directly ahead in a simulated rectangular room. Simultaneous, competing masker digit streams were simulated at locations 15° left and right of center. The level of reverberation was varied to alter task difficulty by interfering with localization cues (increasing localization blur). Overall, performance was best in the anechoic condition and worst in the high-reverberation condition. Listeners nearly always reported a digit from one of the three competing streams, showing that reverberation did not render the digits unintelligible. Importantly, inter-subject differences were extremely large. These differences, however, were not significantly correlated with age, memory span, or hearing status. These results show that listeners with audiometrically normal pure tone thresholds differ in their ability to selectively attend to a desired source, a task important in everyday communication. Further work is necessary to determine if these differences arise from differences in peripheral auditory function or in more central function.

  20. Early Auditory Evoked Potential Is Modulated by Selective Attention and Related to Individual Differences in Visual Working Memory Capacity

    Science.gov (United States)

    Giuliano, Ryan J.; Karns, Christina M.; Neville, Helen J.; Hillyard, Steven A.

    2015-01-01

    A growing body of research suggests that the predictive power of working memory (WM) capacity for measures of intellectual aptitude is due to the ability to control attention and select relevant information. Crucially, attentional mechanisms implicated in controlling access to WM are assumed to be domain-general, yet reports of enhanced attentional abilities in individuals with larger WM capacities are primarily within the visual domain. Here, we directly test the link between WM capacity and early attentional gating across sensory domains, hypothesizing that measures of visual WM capacity should predict an individual’s capacity to allocate auditory selective attention. To address this question, auditory ERPs were recorded in a linguistic dichotic listening task, and individual differences in ERP modulations by attention were correlated with estimates of WM capacity obtained in a separate visual change detection task. Auditory selective attention enhanced ERP amplitudes at an early latency (ca. 70–90 msec), with larger P1 components elicited by linguistic probes embedded in an attended narrative. Moreover, this effect was associated with greater individual estimates of visual WM capacity. These findings support the view that domain-general attentional control mechanisms underlie the wide variation of WM capacity across individuals. PMID:25000526

  1. Functional sex differences in human primary auditory cortex

    International Nuclear Information System (INIS)

    Ruytjens, Liesbet; Georgiadis, Janniko R.; Holstege, Gert; Wit, Hero P.; Albers, Frans W.J.; Willemsen, Antoon T.M.

    2007-01-01

    We used PET to study cortical activation during auditory stimulation and found sex differences in the human primary auditory cortex (PAC). Regional cerebral blood flow (rCBF) was measured in 10 male and 10 female volunteers while listening to sounds (music or white noise) and during a baseline (no auditory stimulation). We found a sex difference in activation of the left and right PAC when comparing music to noise. The PAC was more activated by music than by noise in both men and women. But this difference between the two stimuli was significantly higher in men than in women. To investigate whether this difference could be attributed to either music or noise, we compared both stimuli with the baseline and revealed that noise gave a significantly higher activation in the female PAC than in the male PAC. Moreover, the male group showed a deactivation in the right prefrontal cortex when comparing noise to the baseline, which was not present in the female group. Interestingly, the auditory and prefrontal regions are anatomically and functionally linked and the prefrontal cortex is known to be engaged in auditory tasks that involve sustained or selective auditory attention. Thus we hypothesize that differences in attention result in a different deactivation of the right prefrontal cortex, which in turn modulates the activation of the PAC and thus explains the sex differences found in the activation of the PAC. Our results suggest that sex is an important factor in auditory brain studies. (orig.)

  2. Functional sex differences in human primary auditory cortex

    Energy Technology Data Exchange (ETDEWEB)

    Ruytjens, Liesbet [University Medical Center Groningen, Department of Otorhinolaryngology, Groningen (Netherlands); University Medical Center Utrecht, Department Otorhinolaryngology, P.O. Box 85500, Utrecht (Netherlands); Georgiadis, Janniko R. [University of Groningen, University Medical Center Groningen, Department of Anatomy and Embryology, Groningen (Netherlands); Holstege, Gert [University of Groningen, University Medical Center Groningen, Center for Uroneurology, Groningen (Netherlands); Wit, Hero P. [University Medical Center Groningen, Department of Otorhinolaryngology, Groningen (Netherlands); Albers, Frans W.J. [University Medical Center Utrecht, Department Otorhinolaryngology, P.O. Box 85500, Utrecht (Netherlands); Willemsen, Antoon T.M. [University Medical Center Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands)

    2007-12-15

    We used PET to study cortical activation during auditory stimulation and found sex differences in the human primary auditory cortex (PAC). Regional cerebral blood flow (rCBF) was measured in 10 male and 10 female volunteers while listening to sounds (music or white noise) and during a baseline (no auditory stimulation). We found a sex difference in activation of the left and right PAC when comparing music to noise. The PAC was more activated by music than by noise in both men and women. But this difference between the two stimuli was significantly higher in men than in women. To investigate whether this difference could be attributed to either music or noise, we compared both stimuli with the baseline and revealed that noise gave a significantly higher activation in the female PAC than in the male PAC. Moreover, the male group showed a deactivation in the right prefrontal cortex when comparing noise to the baseline, which was not present in the female group. Interestingly, the auditory and prefrontal regions are anatomically and functionally linked and the prefrontal cortex is known to be engaged in auditory tasks that involve sustained or selective auditory attention. Thus we hypothesize that differences in attention result in a different deactivation of the right prefrontal cortex, which in turn modulates the activation of the PAC and thus explains the sex differences found in the activation of the PAC. Our results suggest that sex is an important factor in auditory brain studies. (orig.)

  3. Neural responses to complex auditory rhythms: the role of attending

    Directory of Open Access Journals (Sweden)

    Heather L Chapin

    2010-12-01

    Full Text Available The aim of this study was to explore the role of attention in pulse and meter perception using complex rhythms. We used a selective attention paradigm in which participants attended to either a complex auditory rhythm or a visually presented word list. Performance on a reproduction task was used to gauge whether participants were attending to the appropriate stimulus. We hypothesized that attention to complex rhythms – which contain no energy at the pulse frequency – would lead to activations in motor areas involved in pulse perception. Moreover, because multiple repetitions of a complex rhythm are needed to perceive a pulse, activations in pulse related areas would be seen only after sufficient time had elapsed for pulse perception to develop. Selective attention was also expected to modulate activity in sensory areas specific to the modality. We found that selective attention to rhythms led to increased BOLD responses in basal ganglia, and basal ganglia activity was observed only after the rhythms had cycled enough times for a stable pulse percept to develop. These observations suggest that attention is needed to recruit motor activations associated with the perception of pulse in complex rhythms. Moreover, attention to the auditory stimulus enhanced activity in an attentional sensory network including primary auditory, insula, anterior cingulate, and prefrontal cortex, and suppressed activity in sensory areas associated with attending to the visual stimulus.

  4. Perceptual consequences of disrupted auditory nerve activity.

    Science.gov (United States)

    Zeng, Fan-Gang; Kong, Ying-Yee; Michalewski, Henry J; Starr, Arnold

    2005-06-01

    Perceptual consequences of disrupted auditory nerve activity were systematically studied in 21 subjects who had been clinically diagnosed with auditory neuropathy (AN), a recently defined disorder characterized by normal outer hair cell function but disrupted auditory nerve function. Neurological and electrophysical evidence suggests that disrupted auditory nerve activity is due to desynchronized or reduced neural activity or both. Psychophysical measures showed that the disrupted neural activity has minimal effects on intensity-related perception, such as loudness discrimination, pitch discrimination at high frequencies, and sound localization using interaural level differences. In contrast, the disrupted neural activity significantly impairs timing related perception, such as pitch discrimination at low frequencies, temporal integration, gap detection, temporal modulation detection, backward and forward masking, signal detection in noise, binaural beats, and sound localization using interaural time differences. These perceptual consequences are the opposite of what is typically observed in cochlear-impaired subjects who have impaired intensity perception but relatively normal temporal processing after taking their impaired intensity perception into account. These differences in perceptual consequences between auditory neuropathy and cochlear damage suggest the use of different neural codes in auditory perception: a suboptimal spike count code for intensity processing, a synchronized spike code for temporal processing, and a duplex code for frequency processing. We also proposed two underlying physiological models based on desynchronized and reduced discharge in the auditory nerve to successfully account for the observed neurological and behavioral data. These methods and measures cannot differentiate between these two AN models, but future studies using electric stimulation of the auditory nerve via a cochlear implant might. 
These results not only show the unique

  5. Reduced auditory efferent activity in childhood selective mutism.

    Science.gov (United States)

    Bar-Haim, Yair; Henkin, Yael; Ari-Even-Roth, Daphne; Tetin-Schneider, Simona; Hildesheimer, Minka; Muchnik, Chava

    2004-06-01

    Selective mutism is a psychiatric disorder of childhood characterized by a consistent inability to speak in specific situations despite the ability to speak normally in others. The objective of this study was to test whether auditory efferent activity, which may have a direct bearing on speaking behavior, is compromised in selectively mute children. Participants were 16 children with selective mutism and 16 normally developing control children matched for age and gender. All children were tested for pure-tone audiometry, speech reception thresholds, speech discrimination, middle-ear acoustic reflex thresholds and decay function, transient evoked otoacoustic emission, suppression of transient evoked otoacoustic emission, and auditory brainstem response. Compared with control children, selectively mute children displayed specific deficiencies in auditory efferent activity. These aberrations in efferent activity appear alongside normal pure-tone and speech audiometry and normal brainstem transmission as indicated by auditory brainstem response latencies. The diminished auditory efferent activity detected in some children with selective mutism may result in desensitization of their auditory pathways by self-vocalization and in reduced control of masking and distortion of incoming speech sounds. These children may gradually learn to restrict vocalization to the minimum possible in contexts that require complex auditory processing.

  6. Auditory Stream Segregation Improves Infants' Selective Attention to Target Tones Amid Distracters

    Science.gov (United States)

    Smith, Nicholas A.; Trainor, Laurel J.

    2011-01-01

    This study examined the role of auditory stream segregation in the selective attention to target tones in infancy. Using a task adapted from Bregman and Rudnicky's 1975 study and implemented in a conditioned head-turn procedure, infant and adult listeners had to discriminate the temporal order of 2,200 and 2,400 Hz target tones presented alone,…

  7. Effects of attention on dichotic listening: an 15O-PET study

    DEFF Research Database (Denmark)

    Hugdahl, K; Law, I; Kyllingsbæk, Søren

    2000-01-01

    The present study investigated the effect of attention on brain activation in a dichotic listening situation. Dichotic listening is a technique to study laterality effects in the auditory sensory modality. Two different stimuli were presented simultaneously, one in each ear. Twelve subjects...... areas of Broca and Wernicke. The musical instrument stimuli mainly activated areas in visual association cortex, cerebellum, and the hippocampus. An interpretation of the findings is that attention has a facilitating effect for auditory processing, causing reduced activation in the primary auditory...... cortex when attention is explicitly recruited. The observed activations in the parietal lobe during the focused attention conditions could be part of a modality non-specific "attentional network"....

  8. Selective attention and the auditory vertex potential. 1: Effects of stimulus delivery rate

    Science.gov (United States)

    Schwent, V. L.; Hillyard, S. A.; Galambos, R.

    1975-01-01

    Enhancement of the auditory vertex potentials with selective attention to dichotically presented tone pips was found to be critically sensitive to the range of inter-stimulus intervals used. Only at the shortest intervals was a clear-cut enhancement of the vertex potential observed for stimuli delivered to the attended ear.

  9. Self-supervised, mobile-application based cognitive training of auditory attention: A behavioral and fMRI evaluation

    Directory of Open Access Journals (Sweden)

    Josef J. Bless

    2014-07-01

    Full Text Available Emerging evidence of the validity of collecting data in natural settings using smartphone applications has opened new possibilities for psychological assessment, treatment, and research. In this study we explored the feasibility and effectiveness of using a mobile application for self-supervised training of auditory attention. In addition, we investigated the neural underpinnings of the training procedure with functional magnetic resonance imaging (fMRI), as well as possible transfer effects to untrained cognitive interference tasks. Subjects in the training group performed the training task on an iPod touch twice a day (morning/evening) for three weeks; subjects in the control group received no training, but were tested at the same time interval as the training group. Behavioral responses were measured before and after the training period in both groups, together with measures of task-related neural activation by fMRI. The results showed the expected performance increase after training, which corresponded to activation decreases in brain regions associated with selective auditory processing (left posterior temporal gyrus) and executive functions (right middle frontal gyrus), indicating more efficient processing in task-related neural networks after training. Our study suggests that cognitive training delivered via mobile applications is feasible and improves the ability to focus attention, with corresponding effects on neural plasticity. Future research should focus on the clinical benefits of mobile cognitive training. Limitations of the study, including reduced experimental control and lack of transfer effects, are discussed.

  10. Development of Attentional Control of Verbal Auditory Perception from Middle to Late Childhood: Comparisons to Healthy Aging

    Science.gov (United States)

    Passow, Susanne; Müller, Maike; Westerhausen, René; Hugdahl, Kenneth; Wartenburger, Isabell; Heekeren, Hauke R.; Lindenberger, Ulman; Li, Shu-Chen

    2013-01-01

    Multitalker situations confront listeners with a plethora of competing auditory inputs, and hence require selective attention to relevant information, especially when the perceptual saliency of distracting inputs is high. This study augmented the classical forced-attention dichotic listening paradigm by adding an interaural intensity manipulation…

  11. Is the effect of tinnitus on auditory steady-state response amplitude mediated by attention?

    Directory of Open Access Journals (Sweden)

    Eugen eDiesch

    2012-05-01

    Full Text Available Objectives: The amplitude of the auditory steady-state response (ASSR) is enhanced in tinnitus. As ASSR amplitude is also enhanced by attention, the effect of tinnitus on ASSR amplitude could be interpreted as an effect of attention mediated by tinnitus. Because attention effects on the N1 are significantly larger than those on the ASSR, if the effect of tinnitus on ASSR amplitude were due to attention, there should be similar amplitude enhancement effects in tinnitus for the N1 component of the auditory evoked response. Methods: MEG recordings of auditory evoked responses that had previously been examined for the ASSR (Diesch et al., 2010) were analysed with respect to the N1m component. Like the ASSR previously, the N1m was analysed in the source domain (source space projection). Stimuli were amplitude-modulated tones with one of three carrier frequencies: the tinnitus frequency (or, in controls, a surrogate frequency 1½ octaves above the audiometric edge frequency), the audiometric edge frequency, and a frequency below the audiometric edge. Results: In the earlier ASSR study (Diesch et al., 2010), the ASSR amplitude in tinnitus patients, but not in controls, was significantly larger in the (surrogate) tinnitus condition than in the edge condition. In the present study, both tinnitus patients and healthy controls show an N1m-amplitude profile identical to that of ASSR amplitudes in healthy controls. N1m amplitudes elicited by tonal frequencies located at the audiometric edge and at the (surrogate) tinnitus frequency are smaller than N1m amplitudes elicited by sub-edge tones and do not differ from each other. Conclusions: There is no N1-amplitude enhancement effect in tinnitus. The enhancement effect of tinnitus on ASSR amplitude cannot be accounted for in terms of attention induced by tinnitus.

  12. Effects of scanner acoustic noise on intrinsic brain activity during auditory stimulation.

    Science.gov (United States)

    Yakunina, Natalia; Kang, Eun Kyoung; Kim, Tae Su; Min, Ji-Hoon; Kim, Sam Soo; Nam, Eui-Cheol

    2015-10-01

    Although the effects of scanner background noise (SBN) during functional magnetic resonance imaging (fMRI) have been extensively investigated for the brain regions involved in auditory processing, its impact on other types of intrinsic brain activity has largely been neglected. The present study evaluated the influence of SBN on a number of intrinsic connectivity networks (ICNs) during auditory stimulation by comparing the results obtained using sparse temporal acquisition (STA) with those using continuous acquisition (CA). Fourteen healthy subjects were presented with classical music pieces in a block paradigm during two sessions of STA and CA. A volume-matched CA dataset (CAm) was generated by subsampling the CA dataset to temporally match it with the STA data. Independent component analysis was performed on the concatenated STA-CAm datasets, and voxel data, time courses, power spectra, and functional connectivity were compared. The ICA revealed 19 ICNs; the auditory, default mode, salience, and frontoparietal networks showed greater activity in the STA. The spectral peaks in 17 networks corresponded to the stimulation cycles in the STA, while only five networks displayed this correspondence in the CA. The dorsal default mode and salience networks exhibited stronger correlations with the stimulus waveform in the STA. SBN appeared to influence not only the areas of auditory response but also the majority of other ICNs, including attention and sensory networks. Therefore, SBN should be regarded as a serious nuisance factor during fMRI studies investigating intrinsic brain activity under external stimulation or task loads.
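    The volume-matching step described above (subsampling the continuous-acquisition series to the sparse-acquisition timing) can be sketched as follows; the array shapes, timings, and nearest-neighbour matching rule are illustrative assumptions, not the authors' actual pipeline:

    ```python
    import numpy as np

    def match_sparse_acquisition(ca_data, ca_times, sta_times):
        """Subsample a continuous-acquisition (CA) fMRI series so its volumes
        align in time with a sparse temporal acquisition (STA) series.

        ca_data  : array of shape (n_ca_volumes, n_voxels)
        ca_times : acquisition time (s) of each CA volume
        sta_times: acquisition time (s) of each STA volume
        Returns the CA volumes nearest in time to each STA volume (the CAm set).
        """
        ca_times = np.asarray(ca_times)
        idx = [int(np.argmin(np.abs(ca_times - t))) for t in sta_times]
        return ca_data[idx]

    # Toy example: CA sampled every 2 s, STA every 8 s.
    ca_times = np.arange(0, 64, 2.0)                     # 32 CA volumes
    sta_times = np.arange(0, 64, 8.0)                    # 8 STA volumes
    ca_data = np.arange(32)[:, None] * np.ones((1, 3))   # fake (volume, voxel) data
    cam = match_sparse_acquisition(ca_data, ca_times, sta_times)
    print(cam.shape)  # (8, 3)
    ```

    With the CAm set temporally matched to the STA set, the two can be concatenated for a joint independent component analysis, as the study describes.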

  13. Effects of scanner acoustic noise on intrinsic brain activity during auditory stimulation

    Energy Technology Data Exchange (ETDEWEB)

    Yakunina, Natalia [Kangwon National University, Institute of Medical Science, School of Medicine, Chuncheon (Korea, Republic of); Kangwon National University Hospital, Neuroscience Research Institute, Chuncheon (Korea, Republic of); Kang, Eun Kyoung [Kangwon National University Hospital, Department of Rehabilitation Medicine, Chuncheon (Korea, Republic of); Kim, Tae Su [Kangwon National University Hospital, Department of Otolaryngology, Chuncheon (Korea, Republic of); Kangwon National University, School of Medicine, Department of Otolaryngology, Chuncheon (Korea, Republic of); Min, Ji-Hoon [University of Michigan, Department of Biopsychology, Cognition, and Neuroscience, Ann Arbor, MI (United States); Kim, Sam Soo [Kangwon National University Hospital, Neuroscience Research Institute, Chuncheon (Korea, Republic of); Kangwon National University, School of Medicine, Department of Radiology, Chuncheon (Korea, Republic of); Nam, Eui-Cheol [Kangwon National University Hospital, Neuroscience Research Institute, Chuncheon (Korea, Republic of); Kangwon National University, School of Medicine, Department of Otolaryngology, Chuncheon (Korea, Republic of)

    2015-10-15

    Although the effects of scanner background noise (SBN) during functional magnetic resonance imaging (fMRI) have been extensively investigated for the brain regions involved in auditory processing, its impact on other types of intrinsic brain activity has largely been neglected. The present study evaluated the influence of SBN on a number of intrinsic connectivity networks (ICNs) during auditory stimulation by comparing the results obtained using sparse temporal acquisition (STA) with those using continuous acquisition (CA). Fourteen healthy subjects were presented with classical music pieces in a block paradigm during two sessions of STA and CA. A volume-matched CA dataset (CAm) was generated by subsampling the CA dataset to temporally match it with the STA data. Independent component analysis was performed on the concatenated STA-CAm datasets, and voxel data, time courses, power spectra, and functional connectivity were compared. The ICA revealed 19 ICNs; the auditory, default mode, salience, and frontoparietal networks showed greater activity in the STA. The spectral peaks in 17 networks corresponded to the stimulation cycles in the STA, while only five networks displayed this correspondence in the CA. The dorsal default mode and salience networks exhibited stronger correlations with the stimulus waveform in the STA. SBN appeared to influence not only the areas of auditory response but also the majority of other ICNs, including attention and sensory networks. Therefore, SBN should be regarded as a serious nuisance factor during fMRI studies investigating intrinsic brain activity under external stimulation or task loads. (orig.)

  14. Effects of scanner acoustic noise on intrinsic brain activity during auditory stimulation

    International Nuclear Information System (INIS)

    Yakunina, Natalia; Kang, Eun Kyoung; Kim, Tae Su; Min, Ji-Hoon; Kim, Sam Soo; Nam, Eui-Cheol

    2015-01-01

    Although the effects of scanner background noise (SBN) during functional magnetic resonance imaging (fMRI) have been extensively investigated for the brain regions involved in auditory processing, its impact on other types of intrinsic brain activity has largely been neglected. The present study evaluated the influence of SBN on a number of intrinsic connectivity networks (ICNs) during auditory stimulation by comparing the results obtained using sparse temporal acquisition (STA) with those using continuous acquisition (CA). Fourteen healthy subjects were presented with classical music pieces in a block paradigm during two sessions of STA and CA. A volume-matched CA dataset (CAm) was generated by subsampling the CA dataset to temporally match it with the STA data. Independent component analysis was performed on the concatenated STA-CAm datasets, and voxel data, time courses, power spectra, and functional connectivity were compared. The ICA revealed 19 ICNs; the auditory, default mode, salience, and frontoparietal networks showed greater activity in the STA. The spectral peaks in 17 networks corresponded to the stimulation cycles in the STA, while only five networks displayed this correspondence in the CA. The dorsal default mode and salience networks exhibited stronger correlations with the stimulus waveform in the STA. SBN appeared to influence not only the areas of auditory response but also the majority of other ICNs, including attention and sensory networks. Therefore, SBN should be regarded as a serious nuisance factor during fMRI studies investigating intrinsic brain activity under external stimulation or task loads. (orig.)

  15. The absence of an auditory-visual attentional blink is not due to echoic memory.

    Science.gov (United States)

    Van der Burg, Erik; Olivers, Christian N; Bronkhorst, Adelbert W; Koelewijn, Thomas; Theeuwes, Jan

    2007-10-01

    The second of two targets is often missed when presented shortly after the first target, a phenomenon referred to as the attentional blink (AB). Whereas the AB is a robust phenomenon within sensory modalities, the evidence for cross-modal ABs is rather mixed. Here, we test the possibility that the absence of an auditory-visual AB for visual letter recognition when streams of tones are used is due to the efficient use of echoic memory, allowing for the postponement of auditory processing. However, forcing participants to immediately process the auditory target, either by presenting interfering sounds during retrieval or by making the first target directly relevant for a speeded response to the second target, did not result in a return of a cross-modal AB. The findings argue against echoic memory as an explanation for efficient cross-modal processing. Instead, we hypothesized that a cross-modal AB may be observed when the different modalities use common representations, such as semantic representations. In support of this, a deficit for visual letter recognition returned when the auditory task required a distinction between spoken digits and letters.

  16. Short-term plasticity in auditory cognition.

    Science.gov (United States)

    Jääskeläinen, Iiro P; Ahveninen, Jyrki; Belliveau, John W; Raij, Tommi; Sams, Mikko

    2007-12-01

    Converging lines of evidence suggest that auditory system short-term plasticity can enable several perceptual and cognitive functions that have been previously considered as relatively distinct phenomena. Here we review recent findings suggesting that auditory stimulation, auditory selective attention and cross-modal effects of visual stimulation each cause transient excitatory and (surround) inhibitory modulations in the auditory cortex. These modulations might adaptively tune hierarchically organized sound feature maps of the auditory cortex (e.g. tonotopy), thus filtering relevant sounds during rapidly changing environmental and task demands. This could support auditory sensory memory, pre-attentive detection of sound novelty, enhanced perception during selective attention, influence of visual processing on auditory perception and longer-term plastic changes associated with perceptual learning.

  17. Atypical auditory refractory periods in children from lower socio-economic status backgrounds: ERP evidence for a role of selective attention.

    Science.gov (United States)

    Stevens, Courtney; Paulsen, David; Yasen, Alia; Neville, Helen

    2015-02-01

    Previous neuroimaging studies indicate that lower socio-economic status (SES) is associated with reduced effects of selective attention on auditory processing. Here, we investigated whether lower SES is also associated with differences in a stimulus-driven aspect of auditory processing: the neural refractory period, or reduced amplitude response at faster rates of stimulus presentation. Thirty-two children aged 3 to 8 years participated, and were divided into two SES groups based on maternal education. Event-related brain potentials were recorded to probe stimuli presented at interstimulus intervals (ISIs) of 200, 500, or 1000 ms. These probes were superimposed on story narratives when attended and ignored, permitting a simultaneous experimental manipulation of selective attention. Results indicated that group differences in refractory periods differed as a function of attention condition. Children from higher SES backgrounds showed full neural recovery by 500 ms for attended stimuli, but required at least 1000 ms for unattended stimuli. In contrast, children from lower SES backgrounds showed similar refractory effects to attended and unattended stimuli, with full neural recovery by 500 ms. Thus, in higher SES children only, one functional consequence of selective attention is attenuation of the response to unattended stimuli, particularly at rapid ISIs, altering basic properties of the auditory refractory period. Together, these data indicate that differences in selective attention impact basic aspects of auditory processing in children from lower SES backgrounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Top-down modulation of the auditory steady-state response in a task-switch paradigm

    Directory of Open Access Journals (Sweden)

    Nadia Müller

    2009-02-01

    Full Text Available Auditory selective attention is an important mechanism for top-down selection from the vast amount of auditory information our perceptual system is exposed to. In the present study, the impact of attention on auditory steady-state responses (aSSR), previously shown to be generated in primary auditory regions, was investigated. This issue is still a matter of debate, and recent findings point to a complex pattern of attentional effects on the aSSR. The present study aimed at shedding light on the involvement of ipsilateral and contralateral activations to the attended sound, taking into account hemispheric differences and a possible dependency on modulation frequency. To this end, a dichotic listening experiment was designed using amplitude-modulated tones that were presented to the left and right ear simultaneously. Participants had to detect target tones in a cued ear while their brain activity was assessed using MEG. This revealed a modulation of the aSSR by attention that was, interestingly, restricted to the left hemisphere and to 20 Hz responses: contralateral activations were enhanced, while ipsilateral activations were reduced. Thus, our findings support and extend recent reports, showing that auditory attention can influence the aSSR, but only under specific circumstances and in a complex pattern of ipsilateral and contralateral effects.
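    As a toy illustration of such stimuli and the steady-state readout, the sketch below generates an amplitude-modulated tone and recovers the modulation frequency from a spectrum. All parameter values (sampling rate, carrier frequency, the 20 Hz modulation rate) are assumptions for illustration, not the study's exact stimuli:

    ```python
    import numpy as np

    fs = 1000.0                       # sampling rate (Hz), illustrative
    t = np.arange(0, 2.0, 1.0 / fs)   # 2 s of signal
    carrier_hz, mod_hz = 250.0, 20.0  # carrier and modulation frequencies

    # Amplitude-modulated tone: a carrier multiplied by a slow envelope.
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))
    am_tone = envelope * np.sin(2 * np.pi * carrier_hz * t)

    # A steady-state response phase-locks to the modulation frequency, so its
    # power spectrum peaks at mod_hz. Demonstrate on the (mean-removed) envelope.
    spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(envelope.size, d=1.0 / fs)
    peak_hz = freqs[np.argmax(spectrum)]
    print(peak_hz)  # 20.0
    ```

    In an MEG analysis, the same spectral readout would be applied to source-projected brain signals rather than the stimulus, with attention expected to change the amplitude of the peak at the modulation frequency.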

  19. Difference in Perseverative Errors during a Visual Attention Task with Auditory Distractors in Alpha-9 Nicotinic Receptor Subunit Wild Type and Knock-Out Mice.

    Science.gov (United States)

    Jorratt, Pascal; Delano, Paul H; Delgado, Carolina; Dagnino-Subiabre, Alexies; Terreros, Gonzalo

    2017-01-01

    The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through olivocochlear (OC) neurons. Medial OC neurons make cholinergic synapses with outer hair cells (OHCs) through nicotinic receptors constituted by α9 and α10 subunits. One of the physiological functions of the α9 nicotinic receptor subunit (α9-nAChR) is the suppression of auditory distractors during selective attention to visual stimuli. In a recent study we demonstrated that the behavioral performance of alpha-9 nicotinic receptor knock-out (KO) mice is altered during selective attention to visual stimuli with auditory distractors, since they made fewer correct responses and more omissions than wild type (WT) mice. As the inhibition of behavioral responses to irrelevant stimuli is an important mechanism of selective attention, behavioral errors are relevant measures that can reflect altered inhibitory control. Errors produced during a cued attention task can be classified as premature, target, and perseverative errors. Perseverative responses can be considered an inability to inhibit the repetition of an action already planned, while premature responses can be considered an index of the ability to wait or withhold an action. Here, we studied premature, target, and perseverative errors during a visual attention task with auditory distractors in WT and KO mice. We found that α9-KO mice made fewer perseverative errors, with longer latencies, than WT mice in the presence of auditory distractors. In addition, although we found no significant difference in the number of target errors between genotypes, KO mice made more short-latency target errors than WT mice during the presentation of auditory distractors. The fewer perseverative errors made by α9-KO mice could be explained by a reduced motivation for reward and an increased impulsivity during decision making with auditory distraction in KO mice.

  20. Selective attention to phonology dynamically modulates initial encoding of auditory words within the left hemisphere.

    Science.gov (United States)

    Yoncheva, Yuliya; Maurer, Urs; Zevin, Jason D; McCandliss, Bruce D

    2014-08-15

    Selective attention to phonology, i.e., the ability to attend to sub-syllabic units within spoken words, is a critical precursor to literacy acquisition. Recent functional magnetic resonance imaging evidence has demonstrated that a left-lateralized network of frontal, temporal, and posterior language regions, including the visual word form area, supports this skill. The current event-related potential (ERP) study investigated the temporal dynamics of selective attention to phonology during spoken word perception. We tested the hypothesis that selective attention to phonology dynamically modulates stimulus encoding by recruiting left-lateralized processes specifically while the information critical for performance is unfolding. Selective attention to phonology was captured by manipulating listening goals: skilled adult readers attended to either rhyme or melody within auditory stimulus pairs. Each pair superimposed rhyming and melodic information ensuring identical sensory stimulation. Selective attention to phonology produced distinct early and late topographic ERP effects during stimulus encoding. Data-driven source localization analyses revealed that selective attention to phonology led to significantly greater recruitment of left-lateralized posterior and extensive temporal regions, which was notably concurrent with the rhyme-relevant information within the word. Furthermore, selective attention effects were specific to auditory stimulus encoding and not observed in response to cues, arguing against the notion that they reflect sustained task setting. Collectively, these results demonstrate that selective attention to phonology dynamically engages a left-lateralized network during the critical time-period of perception for achieving phonological analysis goals. These findings suggest a key role for selective attention in on-line phonological computations. Furthermore, these findings motivate future research on the role that neural mechanisms of attention may

  1. Enhancing Auditory Selective Attention Using a Visually Guided Hearing Aid

    Science.gov (United States)

    2017-01-01

    Purpose Listeners with hearing loss, as well as many listeners with clinically normal hearing, often experience great difficulty segregating talkers in a multiple-talker sound field and selectively attending to the desired “target” talker while ignoring the speech from unwanted “masker” talkers and other sources of sound. This listening situation forms the classic “cocktail party problem” described by Cherry (1953) that has received a great deal of study over the past few decades. In this article, a new approach to improving sound source segregation and enhancing auditory selective attention is described. The conceptual design, current implementation, and results obtained to date are reviewed and discussed in this article. Method This approach, embodied in a prototype “visually guided hearing aid” (VGHA) currently used for research, employs acoustic beamforming steered by eye gaze as a means for improving the ability of listeners to segregate and attend to one sound source in the presence of competing sound sources. Results The results from several studies demonstrate that listeners with normal hearing are able to use an attention-based “spatial filter” operating primarily on binaural cues to selectively attend to one source among competing spatially distributed sources. Furthermore, listeners with sensorineural hearing loss generally are less able to use this spatial filter as effectively as are listeners with normal hearing especially in conditions high in “informational masking.” The VGHA enhances auditory spatial attention for speech-on-speech masking and improves signal-to-noise ratio for conditions high in “energetic masking.” Visual steering of the beamformer supports the coordinated actions of vision and audition in selective attention and facilitates following sound source transitions in complex listening situations. Conclusions Both listeners with normal hearing and with sensorineural hearing loss may benefit from the acoustic
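    The acoustic beamforming at the heart of such a device can be illustrated with a minimal delay-and-sum sketch for a two-microphone linear array. This assumes far-field, free-field propagation and a fixed steering angle; the VGHA's actual beamformer and its eye-gaze steering are more sophisticated:

    ```python
    import numpy as np

    def delay_and_sum(signals, mic_x, steer_deg, fs, c=343.0):
        """Delay-and-sum beamformer for a linear microphone array.
        Steering at 0 degrees (broadside) applies no delay; off-broadside
        angles advance each channel by its geometric propagation delay.

        signals : array (n_mics, n_samples)
        mic_x   : microphone x-coordinates (m) along the array axis
        """
        delays = np.asarray(mic_x) * np.sin(np.deg2rad(steer_deg)) / c
        shifts = np.round(delays * fs).astype(int)
        out = np.zeros(signals.shape[1])
        for sig, s in zip(signals, shifts):
            out += np.roll(sig, -s)  # time-align each channel, then sum
        return out / signals.shape[0]

    # Toy scene: a 1 kHz tone from broadside reaches both mics in phase.
    fs = 16000.0
    t = np.arange(1600) / fs
    tone = np.sin(2 * np.pi * 1000.0 * t)
    signals = np.vstack([tone, tone])            # mics at x = 0.0 m and 0.17 m
    on_target = delay_and_sum(signals, [0.0, 0.17], 0.0, fs)    # steered at source
    off_target = delay_and_sum(signals, [0.0, 0.17], 90.0, fs)  # steered away
    print(np.max(np.abs(on_target)) > np.max(np.abs(off_target)))  # True
    ```

    Steering toward the source sums the channels coherently, while steering away misaligns them and attenuates the output; in the VGHA, the steering direction is supplied continuously by the listener's eye gaze.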

  2. Difference in Perseverative Errors during a Visual Attention Task with Auditory Distractors in Alpha-9 Nicotinic Receptor Subunit Wild Type and Knock-Out Mice

    Directory of Open Access Journals (Sweden)

    Pascal Jorratt

    2017-11-01

    Full Text Available The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through olivocochlear (OC) neurons. Medial OC neurons make cholinergic synapses with outer hair cells (OHCs) through nicotinic receptors constituted by α9 and α10 subunits. One of the physiological functions of the α9 nicotinic receptor subunit (α9-nAChR) is the suppression of auditory distractors during selective attention to visual stimuli. In a recent study we demonstrated that the behavioral performance of alpha-9 nicotinic receptor knock-out (KO) mice is altered during selective attention to visual stimuli with auditory distractors, since they made fewer correct responses and more omissions than wild type (WT) mice. As the inhibition of behavioral responses to irrelevant stimuli is an important mechanism of selective attention, behavioral errors are relevant measures that can reflect altered inhibitory control. Errors produced during a cued attention task can be classified as premature, target, and perseverative errors. Perseverative responses can be considered an inability to inhibit the repetition of an action already planned, while premature responses can be considered an index of the ability to wait or withhold an action. Here, we studied premature, target, and perseverative errors during a visual attention task with auditory distractors in WT and KO mice. We found that α9-KO mice made fewer perseverative errors, with longer latencies, than WT mice in the presence of auditory distractors. In addition, although we found no significant difference in the number of target errors between genotypes, KO mice made more short-latency target errors than WT mice during the presentation of auditory distractors. The fewer perseverative errors made by α9-KO mice could be explained by a reduced motivation for reward and an increased impulsivity during decision making with auditory distraction in KO mice.

  3. Characterizing the roles of alpha and theta oscillations in multisensory attention.

    Science.gov (United States)

    Keller, Arielle S; Payne, Lisa; Sekuler, Robert

    2017-05-01

    Cortical alpha oscillations (8-13 Hz) appear to play a role in suppressing distractions when just one sensory modality is being attended, but do they also contribute when attention is distributed over multiple sensory modalities? For an answer, we examined cortical oscillations in human subjects who were dividing attention between auditory and visual sequences. In Experiment 1, subjects performed an oddball task with auditory, visual, or simultaneous audiovisual sequences in separate blocks, while the electroencephalogram was recorded using high-density scalp electrodes. Alpha oscillations were present continuously over posterior regions while subjects were attending to auditory sequences. This supports the idea that the brain suppresses processing of visual input in order to advantage auditory processing. During a divided-attention audiovisual condition, an oddball (a rare, unusual stimulus) occurred in either the auditory or the visual domain, requiring that attention be divided between the two modalities. Fronto-central theta band (4-7 Hz) activity was strongest in this audiovisual condition, when subjects monitored auditory and visual sequences simultaneously. Theta oscillations have been associated with both attention and with short-term memory. Experiment 2 sought to distinguish these possible roles of fronto-central theta activity during multisensory divided attention. Using a modified version of the oddball task from Experiment 1, Experiment 2 showed that differences in theta power among conditions were independent of short-term memory load. Ruling out theta's association with short-term memory, we conclude that fronto-central theta activity is likely a marker of multisensory divided attention. Copyright © 2017 Elsevier Ltd. All rights reserved.
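    Band-limited power of the kind analysed here (alpha, 8-13 Hz; theta, 4-7 Hz) can be estimated with a simple periodogram sketch. The synthetic "EEG" trace and all parameters below are illustrative; real pipelines typically use Welch's method or time-frequency decompositions:

    ```python
    import numpy as np

    def band_power(signal, fs, f_lo, f_hi):
        """Mean power of `signal` within [f_lo, f_hi] Hz from a simple
        periodogram (squared FFT magnitude)."""
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
        band = (freqs >= f_lo) & (freqs <= f_hi)
        return psd[band].mean()

    # Toy EEG-like trace: a 10 Hz (alpha-band) rhythm plus weak noise.
    rng = np.random.default_rng(0)
    fs = 250.0
    t = np.arange(0, 4.0, 1.0 / fs)
    eeg = np.sin(2 * np.pi * 10.0 * t) + 0.1 * rng.standard_normal(t.size)

    alpha = band_power(eeg, fs, 8.0, 13.0)  # dominated by the 10 Hz rhythm
    theta = band_power(eeg, fs, 4.0, 7.0)   # noise only in this toy trace
    print(alpha > theta)  # True
    ```

    Comparing such band-power estimates across task conditions (single-modality vs. divided attention) is the basic contrast behind the alpha and theta effects reported above.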

  4. Working memory capacity and visual-verbal cognitive load modulate auditory-sensory gating in the brainstem: toward a unified view of attention.

    Science.gov (United States)

    Sörqvist, Patrik; Stenfelt, Stefan; Rönnberg, Jerker

    2012-11-01

    Two fundamental research questions have driven attention research in the past: One concerns whether selection of relevant information among competing, irrelevant, information takes place at an early or at a late processing stage; the other concerns whether the capacity of attention is limited by a central, domain-general pool of resources or by independent, modality-specific pools. In this article, we contribute to these debates by showing that the auditory-evoked brainstem response (an early stage of auditory processing) to task-irrelevant sound decreases as a function of central working memory load (manipulated with a visual-verbal version of the n-back task). Furthermore, individual differences in central/domain-general working memory capacity modulated the magnitude of the auditory-evoked brainstem response, but only in the high working memory load condition. The results support a unified view of attention whereby the capacity of a late/central mechanism (working memory) modulates early precortical sensory processing.
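    The n-back manipulation of working memory load used here can be sketched as a simple target-marking rule; the letter sequence below is a made-up example, not the study's stimuli:

    ```python
    def nback_targets(stimuli, n):
        """Return the positions that repeat the stimulus from n steps back,
        i.e. the 'target' trials of an n-back working-memory task. Larger n
        means more items must be held and updated, i.e. higher load."""
        return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

    letters = list("BKBQKQQK")
    print(nback_targets(letters, 1))  # [6]     (the second Q repeats its predecessor)
    print(nback_targets(letters, 2))  # [2, 5]  (B..B and Q..Q two steps apart)
    ```

    In the study, the independent variable is n (the load), while the dependent measure is the auditory-evoked brainstem response to task-irrelevant sound recorded concurrently.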

  5. Selective attention to phonology dynamically modulates initial encoding of auditory words within the left hemisphere

    Science.gov (United States)

    Yoncheva, Yuliya; Maurer, Urs; Zevin, Jason; McCandliss, Bruce

    2015-01-01

    Selective attention to phonology, i.e., the ability to attend to sub-syllabic units within spoken words, is a critical precursor to literacy acquisition. Recent functional magnetic resonance imaging evidence has demonstrated that a left-lateralized network of frontal, temporal, and posterior language regions, including the visual word form area, supports this skill. The current event-related potential (ERP) study investigated the temporal dynamics of selective attention to phonology during spoken word perception. We tested the hypothesis that selective attention to phonology dynamically modulates stimulus encoding by recruiting left-lateralized processes specifically while the information critical for performance is unfolding. Selective attention to phonology was captured by manipulating listening goals: skilled adult readers attended to either rhyme or melody within auditory stimulus pairs. Each pair superimposed rhyming and melodic information ensuring identical sensory stimulation. Selective attention to phonology produced distinct early and late topographic ERP effects during stimulus encoding. Data-driven source localization analyses revealed that selective attention to phonology led to significantly greater recruitment of left-lateralized posterior and extensive temporal regions, which was notably concurrent with the rhyme-relevant information within the word. Furthermore, selective attention effects were specific to auditory stimulus encoding and not observed in response to cues, arguing against the notion that they reflect sustained task setting. Collectively, these results demonstrate that selective attention to phonology dynamically engages a left-lateralized network during the critical time-period of perception for achieving phonological analysis goals. These findings support the key role of selective attention to phonology in the development of literacy and motivate future research on the neural bases of the interaction between phonological

  6. Frequency-Selective Attention in Auditory Scenes Recruits Frequency Representations Throughout Human Superior Temporal Cortex.

    Science.gov (United States)

    Riecke, Lars; Peters, Judith C; Valente, Giancarlo; Kemper, Valentin G; Formisano, Elia; Sorger, Bettina

    2017-05-01

    A sound of interest may be tracked amid other salient sounds by focusing attention on its characteristic features including its frequency. Functional magnetic resonance imaging findings have indicated that frequency representations in human primary auditory cortex (AC) contribute to this feat. However, attentional modulations were examined at relatively low spatial and spectral resolutions, and frequency-selective contributions outside the primary AC could not be established. To address these issues, we compared blood oxygenation level-dependent (BOLD) responses in the superior temporal cortex of human listeners while they identified single frequencies versus listened selectively for various frequencies within a multifrequency scene. Using best-frequency mapping, we observed that the detailed spatial layout of attention-induced BOLD response enhancements in primary AC follows the tonotopy of stimulus-driven frequency representations, analogous to the "spotlight" of attention enhancing visuospatial representations in retinotopic visual cortex. Moreover, using an algorithm trained to discriminate stimulus-driven frequency representations, we could successfully decode the focus of frequency-selective attention from listeners' BOLD response patterns in nonprimary AC. Our results indicate that the human brain facilitates selective listening to a frequency of interest in a scene by reinforcing the fine-grained activity pattern throughout the entire superior temporal cortex that would be evoked if that frequency was present alone. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
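    The decoding step (classifying which frequency was attended from BOLD response patterns) can be illustrated with a minimal nearest-template correlation decoder. The "templates" and "trial" data here are synthetic stand-ins, not the study's trained algorithm or data:

    ```python
    import numpy as np

    def decode_attended(pattern, templates):
        """Assign a 1-D voxel pattern to the stimulus-driven template it
        correlates with most strongly (a minimal template-matching decoder)."""
        keys = list(templates)
        scores = [np.corrcoef(pattern, templates[k])[0, 1] for k in keys]
        return keys[int(np.argmax(scores))]

    rng = np.random.default_rng(1)
    # Stimulus-driven templates: one voxel pattern per presented frequency.
    templates = {"low": rng.standard_normal(50), "high": rng.standard_normal(50)}
    # An "attend high" trial: the high-frequency template plus measurement noise.
    trial = templates["high"] + 0.3 * rng.standard_normal(50)
    print(decode_attended(trial, templates))  # high
    ```

    The design choice mirrors the study's logic: if selective attention reinforces the same fine-grained pattern that the frequency would evoke alone, then templates built from stimulus-driven responses should suffice to read out the attentional focus.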

  7. Auditory Integration Training

    Directory of Open Access Journals (Sweden)

    Zahra Jafari

    2002-07-01

    Full Text Available Auditory integration training (AIT) is a hearing enhancement training process for sensory input anomalies found in individuals with autism, attention deficit hyperactivity disorder, dyslexia, hyperactivity, learning disability, language impairments, pervasive developmental disorder, central auditory processing disorder, attention deficit disorder, depression, and hyperacute hearing. AIT, recently introduced in the United States, has received much notice of late following the release of The Sound of a Miracle, by Annabel Stehli. In her book, Mrs. Stehli describes before-and-after auditory integration training experiences with her daughter, who was diagnosed at age four as having autism.

  8. Auditory cortical function during verbal episodic memory encoding in Alzheimer's disease.

    Science.gov (United States)

    Dhanjal, Novraj S; Warren, Jane E; Patel, Maneesh C; Wise, Richard J S

    2013-02-01

    Episodic memory encoding of a verbal message depends upon initial registration, which requires sustained auditory attention followed by deep semantic processing of the message. Motivated by previous data demonstrating modulation of auditory cortical activity during sustained attention to auditory stimuli, we investigated the response of the human auditory cortex during encoding of sentences to episodic memory. Subsequently, we investigated this response in patients with mild cognitive impairment (MCI) and probable Alzheimer's disease (pAD). Using functional magnetic resonance imaging, 31 healthy participants were studied. The response in 18 MCI and 18 pAD patients was then determined and compared to that of 18 matched healthy controls. Subjects heard factual sentences, and subsequent retrieval performance indicated successful registration and episodic encoding. In healthy subjects, suppression of auditory cortical responses was related to greater success in encoding heard sentences, and this suppression was also associated with greater activity in the semantic system. In contrast, auditory cortical suppression was reduced in patients with MCI and absent in pAD. Administration of a central cholinesterase inhibitor (ChI) partially restored the suppression in patients with pAD, and this was associated with an improvement in verbal memory. Verbal episodic memory impairment in AD is associated with altered auditory cortical function, reversible with a ChI. Although these results may indicate the direct influence of pathology in auditory cortex, they are also likely to indicate a partially reversible impairment of feedback from neocortical systems responsible for sustained attention and semantic processing. Copyright © 2012 American Neurological Association.

  9. Beyond the real world: attention debates in auditory mismatch negativity.

    Science.gov (United States)

    Chung, Kyungmi; Park, Jin Young

    2018-04-11

    The aim of this study was to address the potential for the auditory mismatch negativity (aMMN) to be used in applied event-related potential (ERP) studies by determining whether the aMMN is an attention-dependent ERP component and whether it is differently modulated across visual tasks or virtual reality (VR) stimuli with different visual properties and visual complexity levels. A total of 80 participants, aged 19-36 years, were assigned to either a reading-task (21 men and 19 women) or a VR-task (22 men and 18 women) group. The two visual-task groups of healthy young adults were matched in age, sex, and handedness. All participants were instructed to focus only on the given visual tasks and to ignore auditory change detection. While participants in the reading-task group read text slides, those in the VR-task group viewed three 360° VR videos in a random order and rated how visually complex the given virtual environment was immediately after each VR video ended. Although a partially significant difference in perceived visual complexity was found for the brightness of the virtual environments, neither of the visual properties (distance and brightness) significantly modulated aMMN amplitudes. A further analysis compared the aMMN amplitudes elicited by a typical MMN task and by an applied VR task. No significant difference in aMMN amplitudes was found across the two groups who completed visual tasks with different visual-task demands. In conclusion, the aMMN is a reliable ERP marker of preattentive cognitive processing for auditory deviance detection.

  10. The effects of interstimulus interval on event-related indices of attention: an auditory selective attention test of perceptual load theory.

    Science.gov (United States)

    Gomes, Hilary; Barrett, Sophia; Duff, Martin; Barnhardt, Jack; Ritter, Walter

    2008-03-01

    We examined the impact of perceptual load by manipulating interstimulus interval (ISI) in two auditory selective attention studies that varied in the difficulty of the target discrimination. In the paradigm, channels were separated by frequency and target/deviant tones were softer in intensity. Three ISI conditions were presented: fast (300 ms), medium (600 ms), and slow (900 ms). Behavioral (accuracy and RT) and electrophysiological measures (Nd, P3b) were observed. In both studies, participants evidenced poorer accuracy during the fast ISI condition than during the slow condition, suggesting that ISI impacted task difficulty. However, none of the three measures of processing examined, Nd amplitude, P3b amplitude elicited by unattended deviant stimuli, or false alarms to unattended deviants, were impacted by ISI in the manner predicted by perceptual load theory. The prediction based on perceptual load theory, that there would be more processing of irrelevant stimuli under conditions of low as compared to high perceptual load, was not supported in these auditory studies. Task difficulty/perceptual load impacts the processing of irrelevant stimuli in the auditory modality differently than predicted by perceptual load theory, and perhaps differently than in the visual modality.

  11. Hippocampal volume and auditory attention on a verbal memory task with adult survivors of pediatric brain tumor.

    Science.gov (United States)

    Jayakar, Reema; King, Tricia Z; Morris, Robin; Na, Sabrina

    2015-03-01

    We examined the nature of verbal memory deficits and the possible hippocampal underpinnings in long-term adult survivors of childhood brain tumor. 35 survivors (M = 24.10 ± 4.93 years at testing; 54% female), on average 15 years post-diagnosis, and 59 typically developing adults (M = 22.40 ± 4.35 years, 54% female) participated. Automated FMRIB Software Library (FSL) tools were used to measure hippocampal, putamen, and whole brain volumes. The California Verbal Learning Test-Second Edition (CVLT-II) was used to assess verbal memory. Hippocampal, F(1, 91) = 4.06, ηp² = .04; putamen, F(1, 91) = 11.18, ηp² = .11; and whole brain, F(1, 92) = 18.51, ηp² = .17, volumes were significantly lower for survivors than controls. Verbal memory indices of auditory attention list span (Trial 1: F(1, 92) = 12.70, η² = .12) and final list learning (Trial 5: F(1, 92) = 6.01, η² = .06) were also significantly lower for survivors. Hippocampal volume was associated with auditory attention list span, but with none of the other CVLT-II indices. Secondary analyses for the effect of treatment factors are presented. Volumetric differences between survivors and controls exist for the whole brain and for subcortical structures on average 15 years post-diagnosis. Treatment factors seem to have a unique effect on subcortical structures. Memory differences between survivors and controls are largely contingent upon auditory attention list span. Only hippocampal volume is associated with the auditory attention list span component of verbal memory. These findings are particularly robust for survivors treated with radiation. PsycINFO Database Record (c) 2015 APA, all rights reserved.
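    The partial eta squared effect sizes quoted in this record follow directly from the reported F statistics and degrees of freedom via the standard identity ηp² = (F · df1) / (F · df1 + df2), which makes them easy to check:

    ```python
    def partial_eta_squared(F, df1, df2):
        """Partial eta squared from an F statistic and its degrees of freedom."""
        return F * df1 / (F * df1 + df2)

    # Group-comparison values reported in the abstract above:
    print(round(partial_eta_squared(4.06, 1, 91), 2))   # 0.04 (hippocampal volume)
    print(round(partial_eta_squared(11.18, 1, 91), 2))  # 0.11 (putamen volume)
    print(round(partial_eta_squared(18.51, 1, 92), 2))  # 0.17 (whole brain volume)
    ```

    The same identity reproduces the memory-index effect sizes (Trial 1: .12, Trial 5: .06), so the reported statistics are internally consistent.
    
    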

  12. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  13. Emotionally negative pictures increase attention to a subsequent auditory stimulus.

    Science.gov (United States)

    Tartar, Jaime L; de Almeida, Kristen; McIntosh, Roger C; Rosselli, Monica; Nash, Allan J

    2012-01-01

    Emotionally negative stimuli serve as a mechanism of biological preparedness to enhance attention. We hypothesized that emotionally negative stimuli would also serve as motivational priming to increase attention resources for subsequent stimuli. To that end, we tested 11 participants in a dual sensory modality task, wherein emotionally negative pictures were contrasted with emotionally neutral pictures and each picture was followed 600 ms later by a tone in an auditory oddball paradigm. Each trial began with a picture displayed for 200 ms; half of the trials began with an emotionally negative picture and half of the trials began with an emotionally neutral picture; 600 ms following picture presentation, the participants heard either an oddball tone or a standard tone. At the end of each trial (picture followed by tone), the participants categorized, with a button press, the picture and tone combination. As expected, and consistent with previous studies, we found an enhanced visual late positive potential (latency range=300-700 ms) to the negative picture stimuli. We further found that compared to neutral pictures, negative pictures resulted in early attention and orienting effects to subsequent tones (measured through an enhanced N1 and N2) and sustained attention effects only to the subsequent oddball tones (measured through late processing negativity, latency range=400-700 ms). Number pad responses to both the picture and tone category showed the shortest response latencies and greatest percentage of correct picture-tone categorization on the negative picture followed by oddball tone trials. Consistent with previous work on natural selective attention, our results support the idea that emotional stimuli can alter attention resource allocation. This finding has broad implications for human attention and performance as it specifically shows the conditions in which an emotionally negative stimulus can result in extended stimulus evaluation. Copyright © 2011
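    The ERP component measures in a study like this come from averaging many time-locked EEG epochs per condition so that noise cancels, then taking the mean amplitude within a component window (e.g. the N1 around 100 ms). A minimal sketch on synthetic single-channel data; the sampling rate, window, and amplitudes here are illustrative, not the study's:

    ```python
    import numpy as np

    fs = 500  # sampling rate in Hz (illustrative)
    t = np.arange(-0.1, 0.8, 1 / fs)  # epoch: -100 ms to 800 ms around tone onset
    rng = np.random.default_rng(1)

    def simulate_epoch(n1_amp):
        """One EEG epoch: an N1-like negative deflection near 100 ms plus noise."""
        n1 = -n1_amp * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))
        return n1 + rng.normal(0, 2.0, t.size)

    # Average many epochs per condition; the time-locked ERP survives averaging.
    after_neutral = np.mean([simulate_epoch(3.0) for _ in range(100)], axis=0)
    after_negative = np.mean([simulate_epoch(5.0) for _ in range(100)], axis=0)

    # Mean amplitude in the N1 window (80-140 ms), a typical component measure.
    win = (t >= 0.08) & (t <= 0.14)
    n1_neutral = after_neutral[win].mean()
    n1_negative = after_negative[win].mean()
    print(n1_negative < n1_neutral)  # more negative N1 after negative pictures
    ```

    An "enhanced N1" in ERP terminology means a more negative mean amplitude in this window, which is what the comparison at the end expresses.
    
    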

  14. Selective attention reduces physiological noise in the external ear canals of humans. II: Visual attention

    Science.gov (United States)

    Walsh, Kyle P.; Pasanen, Edward G.; McFadden, Dennis

    2014-01-01

    Human subjects performed in several behavioral conditions requiring, or not requiring, selective attention to visual stimuli. Specifically, the attentional task was to recognize strings of digits that had been presented visually. A nonlinear version of the stimulus-frequency otoacoustic emission (SFOAE), called the nSFOAE, was collected during the visual presentation of the digits. The segment of the physiological response discussed here occurred during brief silent periods immediately following the SFOAE-evoking stimuli. For all subjects tested, the physiological-noise magnitudes were substantially weaker (less noisy) during the tasks requiring the most visual attention. Effect sizes for the differences were >2.0. Our interpretation is that cortico-olivo influences adjusted the magnitude of efferent activation during the SFOAE-evoking stimulation depending upon the attention task in effect, and then that magnitude of efferent activation persisted throughout the silent period where it also modulated the physiological noise present. Because the results were highly similar to those obtained when the behavioral conditions involved auditory attention, similar mechanisms appear to operate both across modalities and within modalities. Supplementary measurements revealed that the efferent activation was spectrally global, as it was for auditory attention. PMID:24732070

  15. An analysis of nonlinear dynamics underlying neural activity related to auditory induction in the rat auditory cortex.

    Science.gov (United States)

    Noto, M; Nishikawa, J; Tateno, T

    2016-03-24

    A sound interrupted by silence is perceived as discontinuous. However, when high-intensity noise is inserted during the silence, the missing sound may be perceptually restored and be heard as uninterrupted. This illusory phenomenon is called auditory induction. Recent electrophysiological studies have revealed that auditory induction is associated with the primary auditory cortex (A1). Although experimental evidence has been accumulating, the neural mechanisms underlying auditory induction in A1 neurons are poorly understood. To elucidate this, we used both experimental and computational approaches. First, using an optical imaging method, we characterized population responses across auditory cortical fields to sound and identified five subfields in rats. Next, we examined neural population activity related to auditory induction with high temporal and spatial resolution in the rat auditory cortex (AC), including the A1 and several other AC subfields. Our imaging results showed that tone-burst stimuli interrupted by a silent gap elicited early phasic responses to the first tone and similar or smaller responses to the second tone following the gap. In contrast, tone stimuli interrupted by broadband noise (BN), considered to cause auditory induction, considerably suppressed or eliminated responses to the tone following the noise. Additionally, tone-burst stimuli that were interrupted by notched noise centered at the tone frequency, which is considered to decrease the strength of auditory induction, partially restored the second responses from the suppression caused by BN. To phenomenologically mimic the neural population activity in the A1 and thus investigate the mechanisms underlying auditory induction, we constructed a computational model from the periphery through the AC, including a nonlinear dynamical system. The computational model successively reproduced some of the above-mentioned experimental results. Therefore, our results suggest that a nonlinear, self

  16. Predictive coding of visual-auditory and motor-auditory events: An electrophysiological study.

    Science.gov (United States)

    Stekelenburg, Jeroen J; Vroomen, Jean

    2015-11-11

    The amplitude of auditory components of the event-related potential (ERP) is attenuated when sounds are self-generated compared to externally generated sounds. This effect has been ascribed to internal forward models predicting the sensory consequences of one's own motor actions. Auditory potentials are also attenuated when a sound is accompanied by a video of anticipatory visual motion that reliably predicts the sound. Here, we investigated whether the neural underpinnings of prediction of upcoming auditory stimuli are similar for motor-auditory (MA) and visual-auditory (VA) events using a stimulus omission paradigm. In the MA condition, a finger tap triggered the sound of a handclap whereas in the VA condition the same sound was accompanied by a video showing the handclap. In both conditions, the auditory stimulus was omitted in either 50% or 12% of the trials. These auditory omissions induced early and mid-latency ERP components (oN1 and oN2, presumably reflecting prediction and prediction error), and subsequent higher-order error evaluation processes. The oN1 and oN2 of MA and VA were alike in amplitude, topography, and neural sources despite the fact that the origin of the prediction stems from different brain areas (motor versus visual cortex). This suggests that MA and VA predictions activate a sensory template of the sound in auditory cortex. This article is part of a Special Issue entitled SI: Prediction and Attention. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Visual cortex and auditory cortex activation in early binocularly blind macaques: A BOLD-fMRI study using auditory stimuli.

    Science.gov (United States)

    Wang, Rong; Wu, Lingjie; Tang, Zuohua; Sun, Xinghuai; Feng, Xiaoyuan; Tang, Weijun; Qian, Wen; Wang, Jie; Jin, Lixin; Zhong, Yufeng; Xiao, Zebin

    2017-04-15

    Cross-modal plasticity within the visual and auditory cortices of early binocularly blind macaques is not well studied. In this study, four healthy neonatal macaques were assigned to group A (control group) or group B (binocularly blind group). Sixteen months later, blood oxygenation level-dependent functional imaging (BOLD-fMRI) was conducted to examine the activation in the visual and auditory cortices of each macaque during testing with pure tones as auditory stimuli. The changes in the BOLD response in the visual and auditory cortices of all macaques were compared with immunofluorescence staining findings. Compared with group A, greater BOLD activity was observed in the bilateral visual cortices of group B, and this effect was particularly obvious in the right visual cortex. In addition, more activated volumes were found in the bilateral auditory cortices of group B than of group A, especially in the right auditory cortex. These findings were consistent with the presence of more c-Fos-positive cells in the bilateral visual and auditory cortices of group B compared with group A. Thus, the visual cortices of binocularly blind macaques can be reorganized to process auditory stimuli after visual deprivation, and this effect is more obvious in the right than in the left visual cortex. These results indicate the establishment of cross-modal plasticity within the visual and auditory cortices. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Review: Auditory Integration Training

    Directory of Open Access Journals (Sweden)

    Zahra Ja'fari

    2003-01-01

    Full Text Available Auditory integration training (AIT) is a hearing enhancement training process for sensory input anomalies found in individuals with autism, attention deficit hyperactive disorder, dyslexia, hyperactivity, learning disability, language impairments, pervasive developmental disorder, central auditory processing disorder, attention deficit disorder, depression, and hyperacute hearing. AIT, recently introduced in the United States, has received much notice of late following the release of The Sound of a Miracle, by Annabel Stehli. In her book, Mrs. Stehli describes before and after auditory integration training experiences with her daughter, who was diagnosed at age four as having autism.

  19. Visual-induced expectations modulate auditory cortical responses

    Directory of Open Access Journals (Sweden)

    Virginie van Wassenhove

    2015-02-01

    Full Text Available Active sensing has important consequences on multisensory processing (Schroeder et al., 2010). Here, we asked whether, in the absence of saccades, the position of the eyes and the timing of transient colour changes of visual stimuli could selectively affect the excitability of auditory cortex by predicting the where and the when of a sound, respectively. Human participants were recorded with magnetoencephalography (MEG) while maintaining the position of their eyes on the left, right, or centre of the screen. Participants counted colour changes of the fixation cross while neglecting sounds, which could be presented to the left, right, or both ears. First, clear alpha power increases were observed in auditory cortices, consistent with participants' attention being directed to visual inputs. Second, colour changes elicited robust modulations of auditory cortex responses ("when" prediction) seen as ramping activity, early alpha phase-locked responses, and enhanced high-gamma band responses in the side contralateral to sound presentation. Third, no modulations of auditory evoked or oscillatory activity were found to be specific to eye position. Altogether, our results suggest that visual transience can automatically elicit a prediction of when a sound will occur by changing the excitability of auditory cortices irrespective of the attended modality, eye position, or spatial congruency of auditory and visual events. By contrast, auditory cortical responses were not significantly affected by eye position, suggesting that "where" predictions may require active sensing or saccadic reset to modulate auditory cortex responses, notably in the absence of spatial orientation to sounds.

  20. The effect of sleep deprivation on BOLD activity elicited by a divided attention task.

    Science.gov (United States)

    Jackson, Melinda L; Hughes, Matthew E; Croft, Rodney J; Howard, Mark E; Crewther, David; Kennedy, Gerard A; Owens, Katherine; Pierce, Rob J; O'Donoghue, Fergal J; Johnston, Patrick

    2011-06-01

    Sleep loss, widespread in today's society and associated with a number of clinical conditions, has a detrimental effect on a variety of cognitive domains including attention. This study examined the sequelae of sleep deprivation upon BOLD fMRI activation during divided attention. Twelve healthy males completed two randomized sessions; one after 27 h of sleep deprivation and one after a normal night of sleep. During each session, BOLD fMRI was measured while subjects completed a cross-modal divided attention task (visual and auditory). After normal sleep, increased BOLD activation was observed bilaterally in the superior frontal gyrus and the inferior parietal lobe during divided attention performance. Subjects reported feeling significantly more sleepy in the sleep deprivation session, and there was a trend towards poorer divided attention task performance. Sleep deprivation led to a down regulation of activation in the left superior frontal gyrus, possibly reflecting an attenuation of top-down control mechanisms on the attentional system. These findings have implications for understanding the neural correlates of divided attention and the neurofunctional changes that occur in individuals who are sleep deprived.

  1. Preconditioning of Spatial and Auditory Cues: Roles of the Hippocampus, Frontal Cortex, and Cue-Directed Attention

    Directory of Open Access Journals (Sweden)

    Andrew C. Talk

    2016-12-01

    Full Text Available Loss of function of the hippocampus or frontal cortex is associated with reduced performance on memory tasks, in which subjects are incidentally exposed to cues at specific places in the environment and are subsequently asked to recollect the location at which the cue was experienced. Here, we examined the roles of the rodent hippocampus and frontal cortex in cue-directed attention during encoding of memory for the location of a single incidentally experienced cue. During a spatial sensory preconditioning task, rats explored an elevated platform while an auditory cue was incidentally presented at one corner. The opposite corner acted as an unpaired control location. The rats demonstrated recollection of location by avoiding the paired corner after the auditory cue was in turn paired with shock. Damage to either the dorsal hippocampus or the frontal cortex impaired this memory ability. However, we also found that hippocampal lesions enhanced attention directed towards the cue during the encoding phase, while frontal cortical lesions reduced cue-directed attention. These results suggest that the deficit in spatial sensory preconditioning caused by frontal cortical damage may be mediated by inattention to the location of cues during the latent encoding phase, while deficits following hippocampal damage must be related to other mechanisms such as generation of neural plasticity.

  2. Preconditioning of Spatial and Auditory Cues: Roles of the Hippocampus, Frontal Cortex, and Cue-Directed Attention

    Science.gov (United States)

    Talk, Andrew C.; Grasby, Katrina L.; Rawson, Tim; Ebejer, Jane L.

    2016-01-01

    Loss of function of the hippocampus or frontal cortex is associated with reduced performance on memory tasks, in which subjects are incidentally exposed to cues at specific places in the environment and are subsequently asked to recollect the location at which the cue was experienced. Here, we examined the roles of the rodent hippocampus and frontal cortex in cue-directed attention during encoding of memory for the location of a single incidentally experienced cue. During a spatial sensory preconditioning task, rats explored an elevated platform while an auditory cue was incidentally presented at one corner. The opposite corner acted as an unpaired control location. The rats demonstrated recollection of location by avoiding the paired corner after the auditory cue was in turn paired with shock. Damage to either the dorsal hippocampus or the frontal cortex impaired this memory ability. However, we also found that hippocampal lesions enhanced attention directed towards the cue during the encoding phase, while frontal cortical lesions reduced cue-directed attention. These results suggest that the deficit in spatial sensory preconditioning caused by frontal cortical damage may be mediated by inattention to the location of cues during the latent encoding phase, while deficits following hippocampal damage must be related to other mechanisms such as generation of neural plasticity. PMID:27999366

  3. Modality-specificity of Selective Attention Networks

    OpenAIRE

    Stewart, Hannah J.; Amitay, Sygal

    2015-01-01

    Objective: To establish the modality specificity and generality of selective attention networks. Method: Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution.

  4. [Attention characteristics of children with different clinical subtypes of attention deficit hyperactivity disorder].

    Science.gov (United States)

    Liu, Wen-Long; Zhao, Xu; Tan, Jian-Hui; Wang, Juan

    2014-09-01

    To explore the attention characteristics of children with different clinical subtypes of attention deficit hyperactivity disorder (ADHD) and to provide a basis for clinical intervention. A total of 345 children diagnosed with ADHD were selected and the subtypes were identified. Attention assessment was performed by the intermediate visual and auditory continuous performance test at diagnosis, and the visual and auditory attention characteristics were compared between children with different subtypes. A total of 122 normal children were recruited in the control group and their attention characteristics were compared with those of children with ADHD. The scores of full scale attention quotient (AQ) and full scale response control quotient (RCQ) of children with all three subtypes of ADHD were significantly lower than those of normal children, with further differences observed relative to the hyperactive/impulsive subtype. The attention function of children with ADHD is worse than that of normal children, and the impairment of visual attention function is more severe than that of auditory attention function. The degree of functional impairment of visual or auditory attention shows no significant differences between the three subtypes of ADHD.

  5. Modulation of electrocortical brain activity by attention in individuals with and without tinnitus.

    Science.gov (United States)

    Paul, Brandon T; Bruce, Ian C; Bosnyak, Daniel J; Thompson, David C; Roberts, Larry E

    2014-01-01

    Age and hearing-level matched tinnitus and control groups were presented with a 40 Hz AM sound using a carrier frequency of either 5 kHz (in the tinnitus frequency region of the tinnitus subjects) or 500 Hz (below this region). On attended blocks subjects pressed a button after each sound indicating whether a single 40 Hz AM pulse of variable increased amplitude (target, probability 0.67) had or had not occurred. On passive blocks subjects rested and ignored the sounds. The amplitude of the 40 Hz auditory steady-state response (ASSR) localizing to primary auditory cortex (A1) increased with attention in control groups probed at 500 Hz and 5 kHz and in the tinnitus group probed at 500 Hz, but not in the tinnitus group probed at 5 kHz (128 channel EEG). N1 amplitude (this response localizing to nonprimary cortex, A2) increased with attention at both sound frequencies in controls but at neither frequency in tinnitus. We suggest that tinnitus-related neural activity occurring in the 5 kHz but not the 500 Hz region of tonotopic A1 disrupted attentional modulation of the 5 kHz ASSR in tinnitus subjects, while tinnitus-related activity in A1 distributing nontonotopically in A2 impaired modulation of N1 at both sound frequencies.
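    The 40 Hz auditory steady-state response (ASSR) amplitude used in studies like this one is typically read off the amplitude spectrum of the recorded signal at exactly the modulation frequency. A minimal sketch on synthetic data; the sampling rate, duration, and amplitudes are illustrative, not taken from the study:

    ```python
    import numpy as np

    fs = 1000                        # sampling rate in Hz (illustrative)
    t = np.arange(0, 2.0, 1 / fs)    # 2 s of synthetic "EEG"
    rng = np.random.default_rng(2)

    # Steady-state response: a 40 Hz component of unit amplitude buried in noise.
    eeg = 1.0 * np.sin(2 * np.pi * 40 * t) + rng.normal(0, 1.0, t.size)

    # Amplitude spectrum; with a 2 s window, frequency bins fall on 0.5 Hz steps,
    # so 40 Hz lands exactly on a bin and there is no spectral leakage.
    spectrum = np.abs(np.fft.rfft(eeg)) * 2 / t.size
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    assr_amplitude = spectrum[freqs == 40.0][0]
    print(assr_amplitude)  # close to the simulated unit amplitude
    ```

    Attention effects such as those reported above would appear as a change in this 40 Hz bin amplitude between attended and passive blocks.
    
    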

  6. Nicotine, auditory sensory memory and attention in a human ketamine model of schizophrenia: moderating influence of a hallucinatory trait

    Directory of Open Access Journals (Sweden)

    Verner Knott

    2012-09-01

    Full Text Available Background: The procognitive actions of the nicotinic acetylcholine receptor (nAChR) agonist nicotine are believed, in part, to motivate the excessive cigarette smoking in schizophrenia, a disorder associated with deficits in multiple cognitive domains, including low-level auditory sensory processes and higher-order attention-dependent operations. Objectives: As N-methyl-D-aspartate receptor (NMDAR) hypofunction has been shown to contribute to these cognitive impairments, the primary aims of this healthy volunteer study were: (a) to shed light on the separate and interactive roles of nAChR and NMDAR systems in the modulation of auditory sensory memory (and sustained attention), as indexed by the auditory event-related brain potential (ERP) mismatch negativity (MMN), and (b) to examine how these effects are moderated by a predisposition to auditory hallucinations/delusions (HD). Methods: In a randomized, double-blind, placebo-controlled design involving a low intravenous dose of ketamine (.04 mg/kg) and a 4 mg dose of nicotine gum, MMN and performance on a rapid visual information processing (RVIP) task of sustained attention were examined in 24 healthy controls psychometrically stratified as being lower (L-HD, n = 12) or higher (H-HD) for HD propensity. Results: Ketamine significantly slowed MMN, and reduced MMN in H-HD, with amplitude attenuation being blocked by the co-administration of nicotine. Nicotine significantly enhanced response speed (reaction time) and accuracy (increased % hits and d′) and reduced false alarms on the RVIP, with improved performance accuracy being prevented when nicotine was administered with ketamine. Both % hits and d′, as well as reaction time, were poorer in H-HD (vs. L-HD), and while hit rate and d′ were increased by nicotine in H-HD, reaction time was slowed by ketamine in L-HD. Conclusions: Nicotine alleviated ketamine-induced sensory memory impairments and improved attention, particularly in individuals prone to HD.

  7. Bottom-up driven involuntary auditory evoked field change: constant sound sequencing amplifies but does not sharpen neural activity.

    Science.gov (United States)

    Okamoto, Hidehiko; Stracke, Henning; Lagemann, Lothar; Pantev, Christo

    2010-01-01

    The capability of involuntarily tracking certain sound signals during the simultaneous presence of noise is essential in human daily life. Previous studies have demonstrated that top-down auditory focused attention can enhance excitatory and inhibitory neural activity, resulting in sharpening of frequency tuning of auditory neurons. In the present study, we investigated bottom-up driven involuntary neural processing of sound signals in noisy environments by means of magnetoencephalography. We contrasted two sound signal sequencing conditions: "constant sequencing" versus "random sequencing." Based on a pool of 16 different frequencies, either identical (constant sequencing) or pseudorandomly chosen (random sequencing) test frequencies were presented blockwise together with band-eliminated noises to nonattending subjects. The results demonstrated that the auditory evoked fields elicited in the constant sequencing condition were significantly enhanced compared with the random sequencing condition. However, the enhancement was not significantly different between different band-eliminated noise conditions. Thus the present study confirms that by constant sound signal sequencing under nonattentive listening the neural activity in human auditory cortex can be enhanced, but not sharpened. Our results indicate that bottom-up driven involuntary neural processing may mainly amplify excitatory neural networks, but may not effectively enhance inhibitory neural circuits.

  8. Frequency-specific modulation of population-level frequency tuning in human auditory cortex

    Directory of Open Access Journals (Sweden)

    Roberts Larry E

    2009-01-01

    Full Text Available Abstract Background Under natural circumstances, attention plays an important role in extracting relevant auditory signals from simultaneously present, irrelevant noises. Excitatory and inhibitory neural activity, enhanced by attentional processes, seems to sharpen frequency tuning, contributing to improved auditory performance especially in noisy environments. In the present study, we investigated auditory magnetic fields in humans that were evoked by pure tones embedded in band-eliminated noises during two different stimulus sequencing conditions (constant vs. random) under auditory focused attention by means of magnetoencephalography (MEG). Results Identical auditory stimuli were used in both conditions but presented in a different order, thereby manipulating the neural processing and the auditory performance of the listeners. Constant stimulus sequencing blocks were characterized by the simultaneous presentation of pure tones of identical frequency with band-eliminated noises, whereas random sequencing blocks were characterized by the simultaneous presentation of pure tones of random frequencies and band-eliminated noises. We demonstrated that auditory evoked neural responses were larger in the constant sequencing compared to the random sequencing condition, particularly when the simultaneously presented noises contained narrow stop-bands. Conclusion The present study confirmed that population-level frequency tuning in human auditory cortex can be sharpened in a frequency-specific manner. This frequency-specific sharpening may contribute to improved auditory performance during detection and processing of relevant sound inputs characterized by specific frequency distributions in noisy environments.

  9. Persistent neural activity in auditory cortex is related to auditory working memory in humans and nonhuman primates.

    Science.gov (United States)

    Huang, Ying; Matysiak, Artur; Heil, Peter; König, Reinhard; Brosch, Michael

    2016-07-20

    Working memory is the cognitive capacity of short-term storage of information for goal-directed behaviors. Where and how this capacity is implemented in the brain are unresolved questions. We show that auditory cortex stores information by persistent changes of neural activity. We separated activity related to working memory from activity related to other mental processes by having humans and monkeys perform different tasks with varying working memory demands on the same sound sequences. Working memory was reflected in the spiking activity of individual neurons in auditory cortex and in the activity of neuronal populations, that is, in local field potentials and magnetic fields. Our results provide direct support for the idea that temporary storage of information recruits the same brain areas that also process the information. Because similar activity was observed in the two species, the cellular bases of some auditory working memory processes in humans can be studied in monkeys.

  10. Modality-specificity of Selective Attention Networks.

    Science.gov (United States)

    Stewart, Hannah J; Amitay, Sygal

    2015-01-01

    To establish the modality specificity and generality of selective attention networks. Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess underlying attention constructs. The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled "general attention." The third component was labeled "auditory attention," as it only contained measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled "spatial orienting" and "spatial conflict," respectively; they comprised orienting and conflict resolution measures from the vANT, aANT, and TAiL attend-location task, all tasks based upon spatial judgments (e.g., the direction of a target arrow or sound location). These results do not support our a priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selectively attending to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality-specific.
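
    The exploratory factor analysis mentioned above reduces a battery of correlated measures to a few latent components. A rough sketch of the classic first extraction step (component loadings from the eigendecomposition of the measures' correlation matrix) on synthetic scores; the study's actual extraction and rotation settings are not specified here, so this is illustrative only:

    ```python
    import numpy as np

    def principal_components(scores, n_components):
        """Extract component loadings from the correlation matrix of the
        measures: eigenvectors scaled by the square roots of their
        eigenvalues (the first step of an exploratory factor analysis)."""
        corr = np.corrcoef(scores, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(corr)          # ascending order
        idx = np.argsort(eigvals)[::-1][:n_components]   # largest first
        loadings = eigvecs[:, idx] * np.sqrt(eigvals[idx])
        return eigvals[np.argsort(eigvals)[::-1]], loadings

    # Synthetic data: 48 participants, 4 measures driven by 2 latent factors
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((48, 2))
    true_loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.1, 0.9], [0.0, 0.8]])
    scores = latent @ true_loadings.T + 0.3 * rng.standard_normal((48, 4))

    eigvals, loadings = principal_components(scores, n_components=2)
    print(loadings.shape)  # → (4, 2): one loading per measure per component
    ```

    Measures generated from the same latent factor end up loading on the same extracted component, which is the logic behind interpreting components such as "spatial orienting" from the pattern of loadings.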

  11. A Characterization of Visual, Semantic and Auditory Memory in Children with Combination-Type Attention Deficit, Primarily Inattentive, and a Control Group

    Science.gov (United States)

    Ramirez, Luz Angela; Arenas, Angela Maria; Henao, Gloria Cecilia

    2005-01-01

    Introduction: This investigation describes and compares characteristics of visual, semantic and auditory memory in a group of children diagnosed with combined-type attention deficit with hyperactivity, attention deficit predominating, and a control group. Method: 107 boys and girls were selected, from 7 to 11 years of age, all residents in the…

  12. Bilingualism increases neural response consistency and attentional control: evidence for sensory and cognitive coupling.

    Science.gov (United States)

    Krizman, Jennifer; Skoe, Erika; Marian, Viorica; Kraus, Nina

    2014-01-01

    Auditory processing is presumed to be influenced by cognitive processes, including attentional control, in a top-down manner. In bilinguals, activation of both languages during daily communication hones inhibitory skills, which subsequently bolster attentional control. We hypothesize that the heightened attentional demands of bilingual communication strengthens connections between cognitive (i.e., attentional control) and auditory processing, leading to greater across-trial consistency in the auditory evoked response (i.e., neural consistency) in bilinguals. To assess this, we collected passively-elicited auditory evoked responses to the syllable [da] in adolescent Spanish-English bilinguals and English monolinguals and separately obtained measures of attentional control and language ability. Bilinguals demonstrated enhanced attentional control and more consistent brainstem and cortical responses. In bilinguals, but not monolinguals, brainstem consistency tracked with language proficiency and attentional control. We interpret these enhancements in neural consistency as the outcome of strengthened attentional control that emerged from experience communicating in two languages. Copyright © 2013 Elsevier Inc. All rights reserved.
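
    Across-trial neural consistency of the kind described here is commonly quantified by correlating the averaged waveforms of random halves of the trials; the authors' exact metric may differ, so the following is a hedged sketch on synthetic evoked responses:

    ```python
    import numpy as np

    def response_consistency(trials, n_splits=100, rng=None):
        """Across-trial response consistency: the average Pearson correlation
        between the mean waveforms of two random halves of the trials.
        `trials` is an (n_trials, n_samples) array of single-trial responses."""
        rng = np.random.default_rng(rng)
        n = trials.shape[0]
        r_values = []
        for _ in range(n_splits):
            order = rng.permutation(n)
            half_a = trials[order[: n // 2]].mean(axis=0)
            half_b = trials[order[n // 2 :]].mean(axis=0)
            r_values.append(np.corrcoef(half_a, half_b)[0, 1])
        return float(np.mean(r_values))

    # Synthetic demo: a common evoked waveform plus trial-to-trial noise
    rng = np.random.default_rng(0)
    t = np.linspace(0, 0.05, 500)                      # 50 ms epoch
    signal = np.sin(2 * np.pi * 100 * t)               # 100 Hz evoked component
    low_noise = signal + 0.5 * rng.standard_normal((60, t.size))
    high_noise = signal + 3.0 * rng.standard_normal((60, t.size))
    print(response_consistency(low_noise, rng=1) > response_consistency(high_noise, rng=1))
    ```

    A value near 1 indicates the brain produced nearly the same response on every trial; noisier, less repeatable responses pull the split-half correlation toward 0.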

  13. Developmental Dyslexia: Exploring How Much Phonological and Visual Attention Span Disorders Are Linked to Simultaneous Auditory Processing Deficits

    Science.gov (United States)

    Lallier, Marie; Donnadieu, Sophie; Valdois, Sylviane

    2013-01-01

    The simultaneous auditory processing skills of 17 dyslexic children and 17 skilled readers were measured using a dichotic listening task. Results showed that the dyslexic children exhibited difficulties reporting syllabic material when presented simultaneously. As a measure of simultaneous visual processing, visual attention span skills were…

  14. Global dynamics of selective attention and its lapses in primary auditory cortex.

    Science.gov (United States)

    Lakatos, Peter; Barczak, Annamaria; Neymotin, Samuel A; McGinnis, Tammy; Ross, Deborah; Javitt, Daniel C; O'Connell, Monica Noelle

    2016-12-01

    Previous research demonstrated that while selectively attending to relevant aspects of the external world, the brain extracts pertinent information by aligning its neuronal oscillations to key time points of stimuli or their sampling by sensory organs. This alignment mechanism is termed oscillatory entrainment. We investigated the global, long-timescale dynamics of this mechanism in the primary auditory cortex of nonhuman primates, and hypothesized that lapses of entrainment would correspond to lapses of attention. By examining electrophysiological and behavioral measures, we observed that besides the lack of entrainment by external stimuli, attentional lapses were also characterized by high-amplitude alpha oscillations, with alpha frequency structuring of neuronal ensemble and single-unit operations. Entrainment and alpha-oscillation-dominated periods were strongly anticorrelated and fluctuated rhythmically at an ultra-slow rate. Our results indicate that these two distinct brain states represent externally versus internally oriented computational resources engaged by large-scale task-positive and task-negative functional networks.

  15. Temporal Resolution and Active Auditory Discrimination Skill in Vocal Musicians

    Directory of Open Access Journals (Sweden)

    Kumar, Prawin

    2015-12-01

    Full Text Available Introduction Enhanced auditory perception in musicians is likely to result from auditory perceptual learning during several years of training and practice. Many studies have focused on biological processing of auditory stimuli among musicians. However, there is a lack of literature on temporal resolution and active auditory discrimination skills in vocal musicians. Objective The aim of the present study is to assess temporal resolution and active auditory discrimination skill in vocal musicians. Method The study participants included 15 vocal musicians with a minimum professional experience of 5 years of music exposure, within the age range of 20 to 30 years old, as the experimental group, while 15 age-matched non-musicians served as the control group. We used duration discrimination using pure tones, pulse-train duration discrimination, and gap detection threshold tasks to assess temporal processing skills in both groups. Similarly, we assessed active auditory discrimination skill in both groups using the Differential Limen of Frequency (DLF). All tasks were done using MATLAB software on a personal computer at 40 dB SL with a maximum likelihood procedure. The collected data were analyzed using SPSS (version 17.0). Result Descriptive statistics showed better thresholds for vocal musicians compared with non-musicians for all tasks. Further, an independent t-test showed that vocal musicians performed significantly better than non-musicians on duration discrimination using pure tones, pulse-train duration discrimination, gap detection threshold, and differential limen of frequency. Conclusion The present study showed enhanced temporal resolution ability and better (lower) active discrimination thresholds in vocal musicians in comparison to non-musicians.
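
    The maximum likelihood procedure mentioned above adaptively places each trial at the threshold value that is currently most likely given the listener's responses. The sketch below is a generic, simplified stand-in for the authors' MATLAB implementation: the logistic psychometric function, its slope, and the candidate grid are all assumptions made for illustration.

    ```python
    import math
    import random

    def ml_threshold(trial_response, candidate_thresholds, n_trials=30, slope=1.0):
        """Generic maximum-likelihood adaptive procedure: after each trial,
        the next stimulus level is the candidate threshold with the highest
        likelihood given all responses so far."""
        log_lik = {c: 0.0 for c in candidate_thresholds}

        def p_correct(level, threshold):
            # Logistic psychometric function with a 50% guess rate (2AFC)
            return 0.5 + 0.5 / (1.0 + math.exp(-slope * (level - threshold)))

        level = max(candidate_thresholds)  # start well above threshold
        for _ in range(n_trials):
            correct = trial_response(level)
            for c in log_lik:
                p = p_correct(level, c)
                log_lik[c] += math.log(p if correct else 1.0 - p)
            level = max(log_lik, key=log_lik.get)  # next level = current ML estimate
        return level

    # Simulated listener whose true threshold is 5 (arbitrary units)
    random.seed(3)
    def listener(level, true_threshold=5.0):
        return random.random() < 0.5 + 0.5 / (1.0 + math.exp(-(level - true_threshold)))

    estimate = ml_threshold(listener, candidate_thresholds=range(0, 21), n_trials=60)
    print(estimate)
    ```

    Because testing concentrates near the evolving estimate, such procedures reach a stable threshold in far fewer trials than the method of constant stimuli.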

  16. Cortical gamma activity during auditory tone omission provides evidence for the involvement of oscillatory activity in top-down processing.

    Science.gov (United States)

    Gurtubay, I G; Alegre, M; Valencia, M; Artieda, J

    2006-11-01

    Perception is an active process in which our brains use top-down influences to modulate afferent information. To determine whether this modulation might be based on oscillatory activity, we asked seven subjects to detect a silence that appeared randomly in a rhythmic auditory sequence, counting the number of omissions ("count" task), or responding to each omission with a right index finger extension ("move" task). Despite the absence of physical stimuli, these tasks induced a 'non-phase-locked' gamma oscillation in temporal-parietal areas, providing evidence of intrinsically generated oscillatory activity during top-down processing. This oscillation is probably related to the local neural activation that takes place during the process of stimulus detection, involving the functional comparison between the tones and the absence of stimuli as well as the auditory echoic memory processes. The amplitude of the gamma oscillations was reduced with the repetition of the tasks. Moreover, it correlated positively with the number of correctly detected omissions and negatively with the reaction time. These findings indicate that these oscillations, like others described, may be modulated by attentional processes. In summary, our findings support the active and adaptive concept of brain function that has emerged over recent years, suggesting that the match of sensory information with memory contents generates gamma oscillations.

  17. Modulation of Electrocortical Brain Activity by Attention in Individuals with and without Tinnitus

    Directory of Open Access Journals (Sweden)

    Brandon T. Paul

    2014-01-01

    Full Text Available Age and hearing-level matched tinnitus and control groups were presented with a 40 Hz AM sound using a carrier frequency of either 5 kHz (in the tinnitus frequency region of the tinnitus subjects) or 500 Hz (below this region). On attended blocks subjects pressed a button after each sound indicating whether a single 40 Hz AM pulse of variable increased amplitude (target, probability 0.67) had or had not occurred. On passive blocks subjects rested and ignored the sounds. The amplitude of the 40 Hz auditory steady-state response (ASSR) localizing to primary auditory cortex (A1) increased with attention in control groups probed at 500 Hz and 5 kHz and in the tinnitus group probed at 500 Hz, but not in the tinnitus group probed at 5 kHz (128-channel EEG). N1 amplitude (this response localizing to nonprimary cortex, A2) increased with attention at both sound frequencies in controls but at neither frequency in tinnitus. We suggest that tinnitus-related neural activity occurring in the 5 kHz but not the 500 Hz region of tonotopic A1 disrupted attentional modulation of the 5 kHz ASSR in tinnitus subjects, while tinnitus-related activity in A1 distributing nontonotopically in A2 impaired modulation of N1 at both sound frequencies.

  18. A psychophysiological evaluation of the perceived urgency of auditory warning signals

    Science.gov (United States)

    Burt, J. L.; Bartolome, D. S.; Burdette, D. W.; Comstock, J. R. Jr

    1995-01-01

    One significant concern that pilots have about cockpit auditory warnings is that the signals presently used lack a sense of priority. The relationship between auditory warning sound parameters and perceived urgency is, therefore, an important topic of enquiry in aviation psychology. The present investigation examined the relationship among subjective assessments of urgency, reaction time, and brainwave activity with three auditory warning signals. Subjects performed a tracking task involving automated and manual conditions, and were presented with auditory warnings having various levels of perceived and situational urgency. Subjective assessments revealed that subjects were able to rank warnings on an urgency scale, but rankings were altered after warnings were mapped to a situational urgency scale. Reaction times differed between automated and manual tracking task conditions, and physiological data showed attentional differences in response to perceived and situational warning urgency levels. This study shows that the use of physiological measures sensitive to attention and arousal, in conjunction with behavioural and subjective measures, may lead to the design of auditory warnings that produce a sense of urgency in an operator that matches the urgency of the situation.

  19. Modality-specificity of selective attention networks

    Directory of Open Access Journals (Sweden)

    Hannah Jamieson Stewart

    2015-11-01

    Full Text Available Objective: To establish the modality specificity and generality of selective attention networks. Method: Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess underlying attention constructs. Results: The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled ‘general attention’. The third component was labeled ‘auditory attention’, as it only contained measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled ‘spatial orienting’ and ‘spatial conflict’, respectively – they comprised orienting and conflict resolution measures from the vANT, aANT and TAiL attend-location task – all tasks based upon spatial judgments (e.g., the direction of a target arrow or sound location). Conclusions: These results do not support our a priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selectively attending to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality-specific.

  20. Validation of auditory detection response task method for assessing the attentional effects of cognitive load.

    Science.gov (United States)

    Stojmenova, Kristina; Sodnik, Jaka

    2018-07-04

    There are 3 standardized versions of the Detection Response Task (DRT), 2 using visual stimuli (remote DRT and head-mounted DRT) and one using tactile stimuli. In this article, we present a study that proposes and validates a type of auditory signal to be used as DRT stimulus and evaluate the proposed auditory version of this method by comparing it with the standardized visual and tactile versions. This was a within-subject design study performed in a driving simulator with 24 participants. Each participant performed eight 2-min-long driving sessions in which they had to perform 3 different tasks: driving, responding to DRT stimuli, and performing a cognitive task (n-back task). Presence of additional cognitive load and type of DRT stimuli were defined as independent variables. DRT response times and hit rates, n-back task performance, and pupil size were observed as dependent variables. Significant changes in pupil size for trials with a cognitive task compared to trials without showed that cognitive load was induced properly. Each DRT version showed a significant increase in response times and a decrease in hit rates for trials with a secondary cognitive task compared to trials without. The auditory and tactile versions yielded similar results, and both showed significantly larger differences in response times and hit rates than the visual version. There were no significant differences in performance rate between trials without DRT stimuli and trials with them, or among the trials with different DRT stimulus modalities. The results from this study show that the auditory DRT version, using the signal implementation suggested in this article, is sensitive to the effects of cognitive load on driver's attention and is significantly better than the remote visual and tactile version for auditory-vocal cognitive (n-back) secondary tasks.

  1. Diminished auditory sensory gating during active auditory verbal hallucinations.

    Science.gov (United States)

    Thoma, Robert J; Meier, Andrew; Houck, Jon; Clark, Vincent P; Lewine, Jeffrey D; Turner, Jessica; Calhoun, Vince; Stephen, Julia

    2017-10-01

    Auditory sensory gating, assessed in a paired-click paradigm, indicates the extent to which incoming stimuli are filtered, or "gated", in auditory cortex. Gating is typically computed as the ratio of the peak amplitude of the event related potential (ERP) to a second click (S2) divided by the peak amplitude of the ERP to a first click (S1). Higher gating ratios are purportedly indicative of incomplete suppression of S2 and considered to represent sensory processing dysfunction. In schizophrenia, hallucination severity is positively correlated with gating ratios, and it was hypothesized that a failure of sensory control processes early in auditory sensation (gating) may represent a larger system failure within the auditory data stream, resulting in auditory verbal hallucinations (AVH). EEG data were collected while patients (N=12) with treatment-resistant AVH pressed a button to indicate the beginning (AVH-on) and end (AVH-off) of each AVH during a paired click protocol. For each participant, separate gating ratios were computed for the P50, N100, and P200 components for each of the AVH-off and AVH-on states. AVH trait severity was assessed using the Psychotic Symptoms Rating Scales AVH Total score (PSYRATS). The results of a mixed model ANOVA revealed an overall effect for AVH state, such that gating ratios were significantly higher during the AVH-on state than during AVH-off for all three components. PSYRATS score was significantly and negatively correlated with N100 gating ratio only in the AVH-off state. These findings link onset of AVH with a failure of an empirically-defined auditory inhibition system, auditory sensory gating, and pave the way for a sensory gating model of AVH. Copyright © 2017 Elsevier B.V. All rights reserved.
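
    The gating ratio described above is simply the S2 peak amplitude divided by the S1 peak amplitude within a component search window. A small sketch with hypothetical averaged ERPs (waveforms and window invented for illustration):

    ```python
    import numpy as np

    def gating_ratio(erp_s1, erp_s2, window):
        """Sensory gating ratio: peak amplitude of the response to the second
        click (S2) divided by the peak to the first click (S1), within a
        component search window given as (start, stop) sample indices.
        Lower ratios indicate stronger suppression (better gating)."""
        lo, hi = window
        peak_s1 = np.max(np.abs(erp_s1[lo:hi]))
        peak_s2 = np.max(np.abs(erp_s2[lo:hi]))
        return peak_s2 / peak_s1

    # Hypothetical averaged ERPs (arbitrary units): S2 suppressed relative to S1
    t = np.linspace(0, 0.2, 200)                      # 0-200 ms epoch
    s1 = 4.0 * np.exp(-((t - 0.05) ** 2) / 2e-4)      # P50-like peak near 50 ms
    s2 = 1.5 * np.exp(-((t - 0.05) ** 2) / 2e-4)      # suppressed repeat
    print(round(gating_ratio(s1, s2, window=(30, 80)), 3))  # → 0.375
    ```

    A ratio near 0 would mean the second click was almost fully gated out, while a ratio near (or above) 1, as reported during the AVH-on state, reflects a failure to suppress the repeated stimulus.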

  2. Central auditory processing outcome after stroke in children

    Directory of Open Access Journals (Sweden)

    Karla M. I. Freiria Elias

    2014-09-01

    Full Text Available Objective To investigate central auditory processing in children with unilateral stroke and to verify whether the hemisphere affected by the lesion influenced auditory competence. Method 23 children (13 male) between 7 and 16 years old were evaluated through speech-in-noise tests (auditory closure), the dichotic digit test and staggered spondaic word test (selective attention), and pitch pattern and duration pattern sequence tests (temporal processing), and their results were compared with control children. Auditory competence was established according to the performance in auditory analysis ability. Results The groups performed similarly in auditory closure ability, while the stroke group showed pronounced deficits in selective attention and temporal processing abilities. Most children with stroke showed an impaired auditory ability in a moderate degree. Conclusion Children with stroke showed deficits in auditory processing, and the degree of impairment was not related to the hemisphere affected by the lesion.

  3. Activation of auditory white matter tracts as revealed by functional magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Tae, Woo Suk [Kangwon National University, Neuroscience Research Institute, School of Medicine, Chuncheon (Korea, Republic of); Yakunina, Natalia; Nam, Eui-Cheol [Kangwon National University, Neuroscience Research Institute, School of Medicine, Chuncheon (Korea, Republic of); Kangwon National University, Department of Otolaryngology, School of Medicine, Chuncheon, Kangwon-do (Korea, Republic of); Kim, Tae Su [Kangwon National University Hospital, Department of Otolaryngology, Chuncheon (Korea, Republic of); Kim, Sam Soo [Kangwon National University, Neuroscience Research Institute, School of Medicine, Chuncheon (Korea, Republic of); Kangwon National University, Department of Radiology, School of Medicine, Chuncheon (Korea, Republic of)

    2014-07-15

    The ability of functional magnetic resonance imaging (fMRI) to detect activation in brain white matter (WM) is controversial. In particular, studies on the functional activation of WM tracts in the central auditory system are scarce. We utilized fMRI to assess and characterize the entire auditory WM pathway under robust experimental conditions involving the acquisition of a large number of functional volumes, the application of broadband auditory stimuli of high intensity, and the use of sparse temporal sampling to avoid scanner noise effects and increase signal-to-noise ratio. Nineteen healthy volunteers were subjected to broadband white noise in a block paradigm; each run had four sound-on/off alternations and was repeated nine times for each subject. Sparse sampling (TR = 8 s) was used. In addition to traditional gray matter (GM) auditory center activation, WM activation was detected in the isthmus and midbody of the corpus callosum (CC), tapetum, auditory radiation, lateral lemniscus, and decussation of the superior cerebellar peduncles. At the individual level, 13 of 19 subjects (68 %) had CC activation. Callosal WM exhibited a temporal delay of approximately 8 s in response to the stimulation compared with GM. These findings suggest that direct evaluation of the entire functional network of the central auditory system may be possible using fMRI, which may aid in understanding the neurophysiological basis of the central auditory system and in developing treatment strategies for various central auditory disorders. (orig.)

  4. Activation of auditory white matter tracts as revealed by functional magnetic resonance imaging

    International Nuclear Information System (INIS)

    Tae, Woo Suk; Yakunina, Natalia; Nam, Eui-Cheol; Kim, Tae Su; Kim, Sam Soo

    2014-01-01

    The ability of functional magnetic resonance imaging (fMRI) to detect activation in brain white matter (WM) is controversial. In particular, studies on the functional activation of WM tracts in the central auditory system are scarce. We utilized fMRI to assess and characterize the entire auditory WM pathway under robust experimental conditions involving the acquisition of a large number of functional volumes, the application of broadband auditory stimuli of high intensity, and the use of sparse temporal sampling to avoid scanner noise effects and increase signal-to-noise ratio. Nineteen healthy volunteers were subjected to broadband white noise in a block paradigm; each run had four sound-on/off alternations and was repeated nine times for each subject. Sparse sampling (TR = 8 s) was used. In addition to traditional gray matter (GM) auditory center activation, WM activation was detected in the isthmus and midbody of the corpus callosum (CC), tapetum, auditory radiation, lateral lemniscus, and decussation of the superior cerebellar peduncles. At the individual level, 13 of 19 subjects (68 %) had CC activation. Callosal WM exhibited a temporal delay of approximately 8 s in response to the stimulation compared with GM. These findings suggest that direct evaluation of the entire functional network of the central auditory system may be possible using fMRI, which may aid in understanding the neurophysiological basis of the central auditory system and in developing treatment strategies for various central auditory disorders. (orig.)

  5. Towards a Cognitive Model of Distraction by Auditory Novelty: The Role of Involuntary Attention Capture and Semantic Processing

    Science.gov (United States)

    Parmentier, Fabrice B. R.

    2008-01-01

    Unexpected auditory stimuli are potent distractors, able to break through selective attention and disrupt performance in an unrelated visual task. This study examined the processing fate of novel sounds by examining the extent to which their semantic content is analyzed and whether the outcome of this processing can impact on subsequent behavior.…

  6. White Matter Integrity Dissociates Verbal Memory and Auditory Attention Span in Emerging Adults with Congenital Heart Disease.

    Science.gov (United States)

    Brewster, Ryan C; King, Tricia Z; Burns, Thomas G; Drossner, David M; Mahle, William T

    2015-01-01

    White matter disruptions have been identified in individuals with congenital heart disease (CHD). However, no specific theory-driven relationships between microstructural white matter disruptions and cognition have been established in CHD. We conducted a two-part study. First, we identified significant differences in fractional anisotropy (FA) of emerging adults with CHD using Tract-Based Spatial Statistics (TBSS). TBSS analyses between 22 participants with CHD and 18 demographically similar controls identified five regions of normal-appearing white matter with significantly lower FA in CHD, and two with higher FA. Next, two regions of lower FA in CHD were selected to examine theory-driven differential relationships with cognition: voxels along the left uncinate fasciculus (UF; a tract theorized to contribute to verbal memory) and voxels along the right middle cerebellar peduncle (MCP; a tract previously linked to attention). In CHD, a significant positive correlation between UF FA and memory was found, r(20)=.42, p=.049 (uncorrected). There was no correlation between UF and auditory attention span. A positive correlation between MCP FA and auditory attention span was found, r(20)=.47, p=.027 (uncorrected). There was no correlation between MCP and memory. In controls, no significant relationships were identified. These results are consistent with previous literature demonstrating lower FA in younger CHD samples, and provide novel evidence for disrupted white matter integrity in emerging adults with CHD. Furthermore, a correlational double dissociation established distinct white matter circuitry (UF and MCP) and differential cognitive correlates (memory and attention span, respectively) in young adults with CHD.

  7. The impact of visual gaze direction on auditory object tracking

    OpenAIRE

    Pomper, U.; Chait, M.

    2017-01-01

    Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention wh...

  8. Attention shifts the language network reflecting paradigm presentation

    Directory of Open Access Journals (Sweden)

    Kathrin eKollndorfer

    2013-11-01

    Full Text Available Objectives: Functional magnetic resonance imaging (fMRI) is a reliable and non-invasive method with which to localize language function in pre-surgical planning. In clinical practice, visual stimulus presentation is often difficult or impossible, due to the patient’s restricted language or attention abilities. Therefore, our aim was to investigate modality-specific differences in visual and auditory stimulus presentation. Methods: Ten healthy subjects participated in an fMRI study comprising two experiments with visual and auditory stimulus presentation. In both experiments, two language paradigms (one for language comprehension and one for language production) used in clinical practice were investigated. In addition to standard data analysis by means of the general linear model (GLM), independent component analysis (ICA) was performed to achieve more detailed information on language processing networks. Results: GLM analysis revealed modality-specific brain activation for both language paradigms for the contrast visual > auditory in the area of the intraparietal sulcus and the hippocampus, two areas related to attention and working memory. Using group ICA, a language network was detected for both paradigms independent of stimulus presentation modality. The investigation of language lateralization revealed no significant variations. Visually presented stimuli further activated an attention-shift network, which could not be identified for the auditorily presented language. Conclusion: The results of this study indicate that visually presented language stimuli additionally activate an attention-shift network. These findings will provide important information for pre-surgical planning in order to preserve reading abilities after brain surgery, significantly improving surgical outcomes. Our findings suggest that the presentation modality for language paradigms should be adapted to the individual indication.

  9. Rapid effects of hearing song on catecholaminergic activity in the songbird auditory pathway.

    Directory of Open Access Journals (Sweden)

    Lisa L Matragrano

Full Text Available Catecholaminergic (CA) neurons innervate sensory areas and affect the processing of sensory signals. For example, in birds, CA fibers innervate the auditory pathway at each level, including the midbrain, thalamus, and forebrain. We have shown previously that in female European starlings, CA activity in the auditory forebrain can be enhanced by exposure to attractive male song for one week. It is not known, however, whether hearing song can initiate that activity more rapidly. Here, we exposed estrogen-primed, female white-throated sparrows to conspecific male song and looked for evidence of rapid synthesis of catecholamines in auditory areas. In one hemisphere of the brain, we used immunohistochemistry to detect the phosphorylation of tyrosine hydroxylase (TH), a rate-limiting enzyme in the CA synthetic pathway. We found that immunoreactivity for TH phosphorylated at serine 40 increased dramatically in the auditory forebrain, but not the auditory thalamus and midbrain, after 15 min of song exposure. In the other hemisphere, we used high-pressure liquid chromatography to measure catecholamines and their metabolites. We found that two dopamine metabolites, dihydroxyphenylacetic acid and homovanillic acid, increased in the auditory forebrain but not the auditory midbrain after 30 min of exposure to conspecific song. Our results are consistent with the hypothesis that exposure to a behaviorally relevant auditory stimulus rapidly induces CA activity, which may play a role in auditory responses.

  10. Did You Listen to the Beat? Auditory Steady-State Responses in the Human Electroencephalogram at 4 and 7 Hz Modulation Rates Reflect Selective Attention.

    Science.gov (United States)

    Jaeger, Manuela; Bleichner, Martin G; Bauer, Anna-Katharina R; Mirkovic, Bojana; Debener, Stefan

    2018-02-27

The acoustic envelope of human speech correlates with the syllabic rate (4-8 Hz) and carries important information for intelligibility, which is typically compromised in multi-talker, noisy environments. In order to better understand the dynamics of selective auditory attention to low-frequency modulated sound sources, we conducted a two-stream auditory steady-state response (ASSR) selective attention electroencephalogram (EEG) study. The two streams consisted of 4 and 7 Hz amplitude- and frequency-modulated sounds presented from the left and right sides. One of the two streams had to be attended while the other had to be ignored. The attended stream always contained a target, allowing for behavioral confirmation of the attention manipulation. EEG ASSR power analysis revealed a significant increase in 7 Hz power in the attend compared to the ignore conditions. There was no significant difference in 4 Hz power when the 4 Hz stream had to be attended compared to when it had to be ignored. This lack of 4 Hz attention modulation could be explained by a distracting effect of a third frequency at 3 Hz (the beat frequency) perceivable when the 4 and 7 Hz streams are presented simultaneously. Taken together, our results show that low-frequency modulations at syllabic rate are modulated by selective spatial attention. Whether attention effects act as enhancement of the attended stream or suppression of the to-be-ignored stream may depend on how well auditory streams can be segregated.
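The 3 Hz beat invoked above falls directly out of any nonlinear stage applied to the combined 4 and 7 Hz modulations, since 2·sin(2π·4t)·sin(2π·7t) = cos(2π·3t) − cos(2π·11t). A minimal numpy sketch of this difference-frequency effect (sampling rate, window length, and the square-law stage are illustrative assumptions, not details from the study):

```python
import numpy as np

fs = 250.0                          # sampling rate in Hz (illustrative)
t = np.arange(0, 8.0, 1.0 / fs)    # 8 s window -> bins every 0.125 Hz

# Summed 4 Hz and 7 Hz modulators, standing in for the envelopes of the
# two simultaneously presented streams.
m = np.sin(2 * np.pi * 4 * t) + np.sin(2 * np.pi * 7 * t)

# Any square-law (or rectifying) stage, e.g. in neural transduction,
# creates the difference frequency:
#   2*sin(2*pi*4*t)*sin(2*pi*7*t) = cos(2*pi*3*t) - cos(2*pi*11*t)
nl = m ** 2

spec = np.abs(np.fft.rfft(nl)) / nl.size
freqs = np.fft.rfftfreq(nl.size, 1.0 / fs)

def power_at(f):
    # Spectral magnitude at the bin nearest f.
    return spec[np.argmin(np.abs(freqs - f))]

for f in (3.0, 5.0, 8.0, 11.0):
    print(f, power_at(f))   # clear peaks at 3, 8 and 11 Hz; none at 5 Hz
```

Because neither 4 nor 7 Hz appears in the squared signal alone, the 3 Hz component is purely a product of the nonlinearity, which is one way a perceivable "third stream" could arise.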

  11. Predictive Power of Attention and Reading Readiness Variables on Auditory Reasoning and Processing Skills of Six-Year-Old Children

    Science.gov (United States)

    Erbay, Filiz

    2013-01-01

The aim of the present research was to describe the relation of six-year-old children's attention and reading readiness skills (general knowledge, word comprehension, sentences, and matching) to their auditory reasoning and processing skills. This was a quantitative study based on the scanning model. The research sample consisted of 204 kindergarten…

  12. Selective Attention and Multisensory Integration: Multiple Phases of Effects on the Evoked Brain Activity

    NARCIS (Netherlands)

    Talsma, D.; Woldorff, Marty G.

    2005-01-01

We used event-related potentials (ERPs) to evaluate the role of attention in the integration of visual and auditory features of multisensory objects. This was done by contrasting the ERPs to multisensory stimuli (AV) with the sum of the ERPs to the corresponding auditory-only (A) and visual-only (V) stimuli.

  13. Different patterns of auditory cortex activation revealed by functional magnetic resonance imaging

    International Nuclear Information System (INIS)

    Formisano, E.; Pepino, A.; Bracale, M.; Di Salle, F.; Lanfermann, H.; Zanella, F.E.

    1998-01-01

In the last few years, functional Magnetic Resonance Imaging (fMRI) has been widely accepted as an effective tool for mapping brain activities in both the sensorimotor and the cognitive fields. The present work aims to assess the possibility of using fMRI methods to study the cortical response to different acoustic stimuli. Furthermore, we refer to recent data collected at Frankfurt University on the cortical pattern of auditory hallucinations. Healthy subjects showed broad bilateral activation, mostly located in the transverse gyrus of Heschl. The analysis of the cortical activation induced by different stimuli has pointed out a remarkable difference in the spatial and temporal features of the auditory cortex response to pulsed tones and pure tones. The activated areas during episodes of auditory hallucinations match the location of primary auditory cortex as defined in control measurements with the same patients and in the experiments on healthy subjects. (authors)

  14. Auditory interfaces: The human perceiver

    Science.gov (United States)

    Colburn, H. Steven

    1991-01-01

    A brief introduction to the basic auditory abilities of the human perceiver with particular attention toward issues that may be important for the design of auditory interfaces is presented. The importance of appropriate auditory inputs to observers with normal hearing is probably related to the role of hearing as an omnidirectional, early warning system and to its role as the primary vehicle for communication of strong personal feelings.

  15. The effects of distraction and a brief intervention on auditory and visual-spatial working memory in college students with attention deficit hyperactivity disorder.

    Science.gov (United States)

    Lineweaver, Tara T; Kercood, Suneeta; O'Keeffe, Nicole B; O'Brien, Kathleen M; Massey, Eric J; Campbell, Samantha J; Pierce, Jenna N

    2012-01-01

Two studies addressed how young adult college students with attention deficit hyperactivity disorder (ADHD) (n = 44) compare to their nonaffected peers (n = 42) on tests of auditory and visual-spatial working memory (WM), how vulnerable they are to auditory and visual distractions, and how they are affected by a simple intervention. Students with ADHD demonstrated worse auditory WM than did controls. A near-significant trend indicated that auditory distractions interfered with the visual WM of both groups and that, whereas controls were also vulnerable to visual distractions, visual distractions improved visual WM in the ADHD group. The intervention was ineffective. Limited correlations emerged between self-reported ADHD symptoms and objective test performances; students with ADHD who perceived themselves as more symptomatic often had better WM and were less vulnerable to distractions than their ADHD peers.

  16. Different patterns of auditory cortex activation revealed by functional magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

Formisano, E; Pepino, A; Bracale, M [Department of Electronic Engineering, Biomedical Unit, Università di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy)]; Di Salle, F [Department of Biomorphological and Functional Sciences, Radiological Unit, Università di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy)]; Lanfermann, H; Zanella, F E [Department of Neuroradiology, J.W. Goethe Universität, Frankfurt/M. (Germany)]

    1999-12-31

In the last few years, functional Magnetic Resonance Imaging (fMRI) has been widely accepted as an effective tool for mapping brain activities in both the sensorimotor and the cognitive fields. The present work aims to assess the possibility of using fMRI methods to study the cortical response to different acoustic stimuli. Furthermore, we refer to recent data collected at Frankfurt University on the cortical pattern of auditory hallucinations. Healthy subjects showed broad bilateral activation, mostly located in the transverse gyrus of Heschl. The analysis of the cortical activation induced by different stimuli has pointed out a remarkable difference in the spatial and temporal features of the auditory cortex response to pulsed tones and pure tones. The activated areas during episodes of auditory hallucinations match the location of primary auditory cortex as defined in control measurements with the same patients and in the experiments on healthy subjects. (authors) 17 refs., 4 figs.

  17. An online brain-computer interface based on shifting attention to concurrent streams of auditory stimuli

    Science.gov (United States)

    Hill, N. J.; Schölkopf, B.

    2012-04-01

    We report on the development and online testing of an electroencephalogram-based brain-computer interface (BCI) that aims to be usable by completely paralysed users—for whom visual or motor-system-based BCIs may not be suitable, and among whom reports of successful BCI use have so far been very rare. The current approach exploits covert shifts of attention to auditory stimuli in a dichotic-listening stimulus design. To compare the efficacy of event-related potentials (ERPs) and steady-state auditory evoked potentials (SSAEPs), the stimuli were designed such that they elicited both ERPs and SSAEPs simultaneously. Trial-by-trial feedback was provided online, based on subjects' modulation of N1 and P3 ERP components measured during single 5 s stimulation intervals. All 13 healthy subjects were able to use the BCI, with performance in a binary left/right choice task ranging from 75% to 96% correct across subjects (mean 85%). BCI classification was based on the contrast between stimuli in the attended stream and stimuli in the unattended stream, making use of every stimulus, rather than contrasting frequent standard and rare ‘oddball’ stimuli. SSAEPs were assessed offline: for all subjects, spectral components at the two exactly known modulation frequencies allowed discrimination of pre-stimulus from stimulus intervals, and of left-only stimuli from right-only stimuli when one side of the dichotic stimulus pair was muted. However, attention modulation of SSAEPs was not sufficient for single-trial BCI communication, even when the subject's attention was clearly focused well enough to allow classification of the same trials via ERPs. ERPs clearly provided a superior basis for BCI. The ERP results are a promising step towards the development of a simple-to-use, reliable yes/no communication system for users in the most severely paralysed states, as well as potential attention-monitoring and -training applications outside the context of assistive technology.
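The classification scheme described here, contrasting responses to attended-stream vs. unattended-stream stimuli while using every stimulus, can be caricatured with synthetic data. In this toy sketch the per-stream "feature" stands in for mean amplitude in a P3-like window; the effect size, epoch count, and noise level are invented for illustration and are not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: attended-stream epochs carry a larger late
# (P3-like) amplitude than unattended ones, buried in trial noise.
ATTENDED_BOOST = 2.0   # mean amplitude gain for attended epochs (assumed)
N_EPOCHS = 8           # stimuli per stream within one 5 s trial (assumed)

def stream_feature(attended):
    # Mean "P3-window" amplitude across all of a stream's epochs.
    amps = (ATTENDED_BOOST if attended else 0.0) + rng.normal(0, 1, N_EPOCHS)
    return amps.mean()

def classify_trial(attend_left):
    # Binary left/right decision: pick the stream with the larger feature.
    left = stream_feature(attend_left)
    right = stream_feature(not attend_left)
    return (left > right) == attend_left   # True if the decision was correct

accuracy = np.mean([classify_trial(attend_left=(i % 2 == 0))
                    for i in range(50)])
print(accuracy)
```

Averaging over every stimulus in the stream, rather than waiting for rare oddballs, is what lets such a contrast work within a single short trial.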

  18. An online brain-computer interface based on shifting attention to concurrent streams of auditory stimuli

    Science.gov (United States)

    Hill, N J; Schölkopf, B

    2012-01-01

    We report on the development and online testing of an EEG-based brain-computer interface (BCI) that aims to be usable by completely paralysed users—for whom visual or motor-system-based BCIs may not be suitable, and among whom reports of successful BCI use have so far been very rare. The current approach exploits covert shifts of attention to auditory stimuli in a dichotic-listening stimulus design. To compare the efficacy of event-related potentials (ERPs) and steady-state auditory evoked potentials (SSAEPs), the stimuli were designed such that they elicited both ERPs and SSAEPs simultaneously. Trial-by-trial feedback was provided online, based on subjects’ modulation of N1 and P3 ERP components measured during single 5-second stimulation intervals. All 13 healthy subjects were able to use the BCI, with performance in a binary left/right choice task ranging from 75% to 96% correct across subjects (mean 85%). BCI classification was based on the contrast between stimuli in the attended stream and stimuli in the unattended stream, making use of every stimulus, rather than contrasting frequent standard and rare “oddball” stimuli. SSAEPs were assessed offline: for all subjects, spectral components at the two exactly-known modulation frequencies allowed discrimination of pre-stimulus from stimulus intervals, and of left-only stimuli from right-only stimuli when one side of the dichotic stimulus pair was muted. However, attention-modulation of SSAEPs was not sufficient for single-trial BCI communication, even when the subject’s attention was clearly focused well enough to allow classification of the same trials via ERPs. ERPs clearly provided a superior basis for BCI. The ERP results are a promising step towards the development of a simple-to-use, reliable yes/no communication system for users in the most severely paralysed states, as well as potential attention-monitoring and -training applications outside the context of assistive technology. PMID:22333135

  19. Differences in Speech Recognition Between Children with Attention Deficits and Typically Developed Children Disappear When Exposed to 65 dB of Auditory Noise.

    Science.gov (United States)

    Söderlund, Göran B W; Jobs, Elisabeth Nilsson

    2016-01-01

The most common neuropsychiatric condition in children is attention deficit hyperactivity disorder (ADHD), affecting ∼6-9% of the population. ADHD is distinguished by inattention and hyperactive, impulsive behaviors as well as poor performance in various cognitive tasks, often leading to failures at school. Sensory and perceptual dysfunctions have also been noticed. Prior research has mainly focused on limitations in executive functioning, where differences are often explained by deficits in pre-frontal cortex activation. Less notice has been given to sensory perception and subcortical functioning in ADHD. Recent research has shown that children with an ADHD diagnosis have a deviant auditory brain stem response compared to healthy controls. The aim of the present study was to investigate whether the speech recognition threshold differs between attentive children and children with ADHD symptoms in two environmental sound conditions, with and without external noise. Previous research has shown that children with attention deficits can benefit from white noise exposure during cognitive tasks, and here we investigate whether this noise benefit is present during an auditory perceptual task. For this purpose we used a modified Hagerman's speech recognition test in which children with and without attention deficits performed a binaural speech recognition task to assess the speech recognition threshold in no-noise and noise (65 dB) conditions. Results showed that the inattentive group displayed a higher speech recognition threshold than typically developed children and that the difference in speech recognition threshold disappeared when exposed to noise at suprathreshold level. From this we conclude that inattention can partly be explained by sensory perceptual limitations that can possibly be ameliorated through noise exposure.

  20. Differences in Speech Recognition Between Children with Attention Deficits and Typically Developed Children Disappear when Exposed to 65 dB of Auditory Noise

    Directory of Open Access Journals (Sweden)

    Göran B W Söderlund

    2016-01-01

Full Text Available The most common neuropsychiatric condition in children is attention deficit hyperactivity disorder (ADHD), affecting approximately 6-9% of the population. ADHD is distinguished by inattention and hyperactive, impulsive behaviors as well as poor performance in various cognitive tasks, often leading to failures at school. Sensory and perceptual dysfunctions have also been noticed. Prior research has mainly focused on limitations in executive functioning, where differences are often explained by deficits in pre-frontal cortex activation. Less notice has been given to sensory perception and subcortical functioning in ADHD. Recent research has shown that children with an ADHD diagnosis have a deviant auditory brain stem response compared to healthy controls. The aim of the present study was to investigate whether the speech recognition threshold differs between attentive children and children with ADHD symptoms in two environmental sound conditions, with and without external noise. Previous research has shown that children with attention deficits can benefit from white noise exposure during cognitive tasks, and here we investigate whether this noise benefit is present during an auditory perceptual task. For this purpose we used a modified Hagerman’s speech recognition test in which children with and without attention deficits performed a binaural speech recognition task to assess the speech recognition threshold in no-noise and noise (65 dB) conditions. Results showed that the inattentive group displayed a higher speech recognition threshold than typically developed children (TDC) and that the difference in speech recognition threshold disappeared when exposed to noise at suprathreshold level. From this we conclude that inattention can partly be explained by sensory perceptual limitations that can possibly be ameliorated through noise exposure.
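Speech recognition thresholds of the kind measured here are commonly estimated adaptively: a 1-up/1-down staircase converges on the signal-to-noise ratio yielding 50% correct responses. The sketch below simulates such a track against a hypothetical logistic listener; the SRT, slope, step size, and trial count are assumptions for illustration, not parameters from the Hagerman test:

```python
import math
import random

random.seed(1)

def p_correct(snr_db, srt_db=-6.0, slope=1.0):
    # Hypothetical logistic psychometric function: 50% correct at srt_db.
    return 1.0 / (1.0 + math.exp(-slope * (snr_db - srt_db)))

def staircase(n_trials=300, snr=0.0, step=2.0):
    # 1-up/1-down track: lower the SNR after a correct response, raise it
    # after an error; the track then oscillates around the 50% point.
    reversals, last_dir = [], 0
    for _ in range(n_trials):
        correct = random.random() < p_correct(snr)
        direction = -1 if correct else 1
        if last_dir != 0 and direction != last_dir:
            reversals.append(snr)      # record SNR at each reversal
        last_dir = direction
        snr += direction * step
    # Average the last reversals as the threshold estimate.
    return sum(reversals[-10:]) / len(reversals[-10:])

estimate = staircase()
print(estimate)   # close to the assumed SRT of -6 dB
```

A higher estimate for one group than another on the same adaptive procedure is what "a higher speech recognition threshold" amounts to operationally.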

  1. Peripheral hearing loss reduces the ability of children to direct selective attention during multi-talker listening.

    Science.gov (United States)

    Holmes, Emma; Kitterick, Padraig T; Summerfield, A Quentin

    2017-07-01

    Restoring normal hearing requires knowledge of how peripheral and central auditory processes are affected by hearing loss. Previous research has focussed primarily on peripheral changes following sensorineural hearing loss, whereas consequences for central auditory processing have received less attention. We examined the ability of hearing-impaired children to direct auditory attention to a voice of interest (based on the talker's spatial location or gender) in the presence of a common form of background noise: the voices of competing talkers (i.e. during multi-talker, or "Cocktail Party" listening). We measured brain activity using electro-encephalography (EEG) when children prepared to direct attention to the spatial location or gender of an upcoming target talker who spoke in a mixture of three talkers. Compared to normally-hearing children, hearing-impaired children showed significantly less evidence of preparatory brain activity when required to direct spatial attention. This finding is consistent with the idea that hearing-impaired children have a reduced ability to prepare spatial attention for an upcoming talker. Moreover, preparatory brain activity was not restored when hearing-impaired children listened with their acoustic hearing aids. An implication of these findings is that steps to improve auditory attention alongside acoustic hearing aids may be required to improve the ability of hearing-impaired children to understand speech in the presence of competing talkers. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Aging increases distraction by auditory oddballs in visual, but not auditory tasks.

    Science.gov (United States)

    Leiva, Alicia; Parmentier, Fabrice B R; Andrés, Pilar

    2015-05-01

Aging is typically considered to bring a reduction in the ability to resist distraction by task-irrelevant stimuli. Yet recent work suggests that this conclusion must be qualified and that the effect of aging is mitigated by whether irrelevant and target stimuli emanate from the same modality or from distinct ones. Some studies suggest that aging especially increases distraction within a modality, while others suggest distraction is greater across modalities. Here we report the first study to measure the effect of aging on deviance distraction in cross-modal (auditory-visual) and uni-modal (auditory-auditory) oddball tasks. Young and older adults were asked to judge the parity of target digits (auditory or visual in distinct blocks of trials), each preceded by a task-irrelevant sound: the same tone on most trials (the standard sound) or, on rare and unpredictable trials, a burst of white noise (the deviant sound). Deviant sounds yielded distraction (longer response times relative to standard sounds) in both tasks and age groups. However, an age-related increase in distraction was observed in the cross-modal task and not in the uni-modal task. We argue that aging might affect processes involved in the switching of attention across modalities and speculate that this may be due to a slowing of this type of attentional shift or a reduction in the cognitive control required to re-orient attention toward the target's modality.

  3. Auditory learning through active engagement with sound: Biological impact of community music lessons in at-risk children

    Directory of Open Access Journals (Sweden)

    Nina eKraus

    2014-11-01

    Full Text Available The young nervous system is primed for sensory learning, facilitating the acquisition of language and communication skills. Social and linguistic impoverishment can limit these learning opportunities, eventually leading to language-related challenges such as poor reading. Music training offers a promising auditory learning strategy by directing attention to meaningful acoustic elements in the soundscape. In light of evidence that music training improves auditory skills and their neural substrates, there are increasing efforts to enact community-based programs to provide music instruction to at-risk children. Harmony Project is a community foundation that has provided free music instruction to over 1,000 children from Los Angeles gang-reduction zones over the past decade. We conducted an independent evaluation of biological effects of participating in Harmony Project by following a cohort of children for one year. Here we focus on a comparison between students who actively engaged with sound through instrumental music training vs. students who took music appreciation classes. All children began with an introductory music appreciation class, but midway through the year half of the children transitioned to an instrumental training class. After the year of training, the children who actively engaged with sound through instrumental music training had faster and more robust neural processing of speech than the children who stayed in the music appreciation class, observed in neural responses to a speech sound /d/. The neurophysiological measures found to be enhanced in the instrumentally trained children have been previously linked to reading ability, suggesting a gain in neural processes important for literacy stemming from active auditory learning. 
These findings speak to the potential of active engagement with sound (i.e., music-making) to engender experience-dependent neuroplasticity and may inform the development of strategies for auditory learning.

  4. Auditory learning through active engagement with sound: biological impact of community music lessons in at-risk children.

    Science.gov (United States)

    Kraus, Nina; Slater, Jessica; Thompson, Elaine C; Hornickel, Jane; Strait, Dana L; Nicol, Trent; White-Schwoch, Travis

    2014-01-01

    The young nervous system is primed for sensory learning, facilitating the acquisition of language and communication skills. Social and linguistic impoverishment can limit these learning opportunities, eventually leading to language-related challenges such as poor reading. Music training offers a promising auditory learning strategy by directing attention to meaningful acoustic elements of the soundscape. In light of evidence that music training improves auditory skills and their neural substrates, there are increasing efforts to enact community-based programs to provide music instruction to at-risk children. Harmony Project is a community foundation that has provided free music instruction to over 1000 children from Los Angeles gang-reduction zones over the past decade. We conducted an independent evaluation of biological effects of participating in Harmony Project by following a cohort of children for 1 year. Here we focus on a comparison between students who actively engaged with sound through instrumental music training vs. students who took music appreciation classes. All children began with an introductory music appreciation class, but midway through the year half of the children transitioned to the instrumental training. After the year of training, the children who actively engaged with sound through instrumental music training had faster and more robust neural processing of speech than the children who stayed in the music appreciation class, observed in neural responses to a speech sound /d/. The neurophysiological measures found to be enhanced in the instrumentally-trained children have been previously linked to reading ability, suggesting a gain in neural processes important for literacy stemming from active auditory learning. 
Despite intrinsic constraints on our study imposed by a community setting, these findings speak to the potential of active engagement with sound (i.e., music-making) to engender experience-dependent neuroplasticity and may inform the development of strategies for auditory learning.

  5. Transmodal comparison of auditory, motor, and visual post-processing with and without intentional short-term memory maintenance.

    Science.gov (United States)

    Bender, Stephan; Behringer, Stephanie; Freitag, Christine M; Resch, Franz; Weisbrod, Matthias

    2010-12-01

To elucidate the contributions of modality-dependent post-processing in auditory, motor and visual cortical areas to short-term memory. We compared late negative waves (N700) during the post-processing of single lateralized stimuli which were separated by long intertrial intervals across the auditory, motor and visual modalities. Tasks either required or competed with attention to the post-processing of preceding events, i.e. active short-term memory maintenance. N700 indicated that cortical post-processing outlasted short movements, as well as short auditory or visual stimuli, by over half a second without intentional short-term memory maintenance. Modality-specific topographies pointed towards sensory (respectively motor) generators with comparable time-courses across the different modalities. Lateralization and amplitude of auditory/motor/visual N700 were enhanced by active short-term memory maintenance compared to attention to current perceptions or passive stimulation. The memory-related N700 increase followed the characteristic time-course and modality-specific topography of the N700 without intentional memory maintenance. Memory-maintenance-related lateralized negative potentials may be related to a less lateralized, modality-dependent post-processing N700 component which also occurs without intentional memory maintenance (automatic memory trace or effortless attraction of attention). Encoding to short-term memory may involve controlled attention to modality-dependent post-processing. Similar short-term memory processes may exist in the auditory, motor and visual systems. Copyright © 2010 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  6. Sensorineural hearing loss degrades behavioral and physiological measures of human spatial selective auditory attention

    Science.gov (United States)

    Dai, Lengshi; Best, Virginia; Shinn-Cunningham, Barbara G.

    2018-01-01

    Listeners with sensorineural hearing loss often have trouble understanding speech amid other voices. While poor spatial hearing is often implicated, direct evidence is weak; moreover, studies suggest that reduced audibility and degraded spectrotemporal coding may explain such problems. We hypothesized that poor spatial acuity leads to difficulty deploying selective attention, which normally filters out distracting sounds. In listeners with normal hearing, selective attention causes changes in the neural responses evoked by competing sounds, which can be used to quantify the effectiveness of attentional control. Here, we used behavior and electroencephalography to explore whether control of selective auditory attention is degraded in hearing-impaired (HI) listeners. Normal-hearing (NH) and HI listeners identified a simple melody presented simultaneously with two competing melodies, each simulated from different lateral angles. We quantified performance and attentional modulation of cortical responses evoked by these competing streams. Compared with NH listeners, HI listeners had poorer sensitivity to spatial cues, performed more poorly on the selective attention task, and showed less robust attentional modulation of cortical responses. Moreover, across NH and HI individuals, these measures were correlated. While both groups showed cortical suppression of distracting streams, this modulation was weaker in HI listeners, especially when attending to a target at midline, surrounded by competing streams. These findings suggest that hearing loss interferes with the ability to filter out sound sources based on location, contributing to communication difficulties in social situations. These findings also have implications for technologies aiming to use neural signals to guide hearing aid processing. PMID:29555752

  7. The impact of visual gaze direction on auditory object tracking.

    Science.gov (United States)

    Pomper, Ulrich; Chait, Maria

    2017-07-05

Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g. when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention while participants detected targets presented from one of three loudspeakers. We observed increased response times when gaze was directed away from the locus of auditory attention. Further, we found an increase in occipital alpha-band power contralateral to the direction of gaze, indicative of a suppression of distracting input. Finally, this condition also led to stronger central theta-band power, which correlated with the observed effect in response times, indicative of differences in top-down processing. Our data suggest that a misalignment between gaze and auditory attention both reduces behavioural performance and modulates the underlying neural processes. The involvement of central theta-band and occipital alpha-band effects is in line with compensatory neural mechanisms such as increased cognitive control and the suppression of task-irrelevant inputs.
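Band-limited power measures like the occipital alpha effect reported here are typically computed by band-pass filtering and taking the analytic-signal (Hilbert) envelope. A toy scipy/numpy sketch on synthetic data; the 10 Hz component, the condition split at 5 s, and the noise level are all fabricated for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0                         # sampling rate in Hz (illustrative)
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Toy occipital trace: 10 Hz alpha whose amplitude is larger in the
# second half of the recording (standing in for a "gaze averted"
# condition), plus broadband noise.
amp = np.where(t < 5.0, 0.5, 2.0)
eeg = amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, t.size)

# Band-pass 8-12 Hz (frequencies normalized to Nyquist), then take the
# Hilbert envelope as an instantaneous alpha-power estimate.
b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha = filtfilt(b, a, eeg)
envelope = np.abs(hilbert(alpha))

first = envelope[t < 5.0].mean()
second = envelope[t >= 5.0].mean()
print(first, second)   # stronger alpha envelope in the second half
```

Contrasting such envelopes between conditions (or hemispheres, for a contralateral effect) is the usual route to statements like "increased occipital alpha-band power contralateral to the direction of gaze".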

  8. Abnormalities in auditory efferent activities in children with selective mutism.

    Science.gov (United States)

    Muchnik, Chava; Ari-Even Roth, Daphne; Hildesheimer, Minka; Arie, Miri; Bar-Haim, Yair; Henkin, Yael

    2013-01-01

    Two efferent feedback pathways to the auditory periphery may play a role in monitoring self-vocalization: the middle-ear acoustic reflex (MEAR) and the medial olivocochlear bundle (MOCB) reflex. Since most studies regarding the role of auditory efferent activity during self-vocalization were conducted in animals, human data are scarce. The working premise of the current study was that selective mutism (SM), a rare psychiatric disorder characterized by consistent failure to speak in specific social situations despite the ability to speak normally in other situations, may serve as a human model for studying the potential involvement of auditory efferent activity during self-vocalization. For this purpose, auditory efferent function was assessed in a group of 31 children with SM and compared to that of a group of 31 normally developing control children (mean age 8.9 and 8.8 years, respectively). All children exhibited normal hearing thresholds and type A tympanograms. MEAR and MOCB functions were evaluated by means of acoustic reflex thresholds and decay functions and the suppression of transient-evoked otoacoustic emissions, respectively. Auditory afferent function was tested by means of auditory brainstem responses (ABR). Results indicated a significantly higher proportion of children with abnormal MEAR and MOCB function in the SM group (58.6 and 38%, respectively) compared to controls (9.7 and 8%, respectively). The prevalence of abnormal MEAR and/or MOCB function was significantly higher in the SM group (71%) compared to controls (16%). Intact afferent function manifested in normal absolute and interpeak latencies of ABR components in all children. The finding of aberrant efferent auditory function in a large proportion of children with SM provides further support for the notion that MEAR and MOCB may play a significant role in the process of self-vocalization. © 2013 S. Karger AG, Basel.

  9. Auditory processing during deep propofol sedation and recovery from unconsciousness.

    Science.gov (United States)

    Koelsch, Stefan; Heinke, Wolfgang; Sammler, Daniela; Olthoff, Derk

    2006-08-01

    Using evoked potentials, this study investigated the effects of deep propofol sedation, and of recovery from unconsciousness, on the processing of auditory information with stimuli suited to elicit a physical MMN and a (music-syntactic) ERAN. Levels of sedation were assessed using the Bispectral Index (BIS) and the Modified Observer's Assessment of Alertness and Sedation Scale (MOAAS). EEG measurements were performed during wakefulness, deep propofol sedation (MOAAS 2-3, mean BIS=68), and a recovery period. Between deep sedation and the recovery period, the infusion rate of propofol was increased to achieve unconsciousness (MOAAS 0-1, mean BIS=35); EEG measurements of the recovery period were performed after subjects regained consciousness. During deep sedation, the physical MMN was markedly reduced, but still significant. No ERAN was observed at this level. A clear P3a was elicited during deep sedation by those deviants that had been task-relevant during the awake state. As soon as subjects regained consciousness during the recovery period, a normal MMN was elicited. By contrast, the P3a was absent in the recovery period, and the P3b was markedly reduced. The results indicate that auditory sensory memory (as indexed by the physical MMN) is still active, although strongly reduced, during deep sedation (MOAAS 2-3). The presence of the P3a indicates that attention-related processes still operate at this level. Processes of syntactic analysis appear to be abolished during deep sedation. After propofol-induced anesthesia, auditory sensory memory appears to operate normally as soon as subjects regain consciousness, whereas the attention-related processes indexed by the P3a and P3b are markedly impaired. These results inform about the effects of sedative drugs on auditory and attention-related mechanisms. The findings are important because these mechanisms are prerequisites for auditory awareness, auditory learning and memory, and language perception during anesthesia.

  10. Sustained Attention in Auditory and Visual Monitoring Tasks: Evaluation of the Administration of a Rest Break or Exogenous Vibrotactile Signals.

    Science.gov (United States)

    Arrabito, G Robert; Ho, Geoffrey; Aghaei, Behzad; Burns, Catherine; Hou, Ming

    2015-12-01

    Performance and mental workload were assessed when a rest break or exogenous vibrotactile signals were administered during auditory and visual monitoring tasks. Sustained attention is mentally demanding, and techniques are required to improve observer performance in vigilance tasks. Participants (N = 150) monitored an auditory or a visual display for changes in signal duration in a 40-min watch, during which they were administered a rest break or exogenous vibrotactile signals. Detection accuracy was significantly greater in the auditory than in the visual modality. A short rest break restored detection accuracy in both sensory modalities following a deterioration in performance. Participants experienced significantly lower mental workload when monitoring auditory than visual signals, and a rest break significantly reduced mental workload in both sensory modalities. Exogenous vibrotactile signals had no beneficial effect on performance or mental workload. A rest break can restore performance in auditory and visual vigilance tasks. Although sensory differences in vigilance tasks have been studied, this study is the first to investigate the effects of a rest-break countermeasure in both auditory and visual vigilance tasks, and the first to explore the effects of a rest break on the perceived mental workload of such tasks. Further research is warranted to determine the exact characteristics of effective exogenous vibrotactile signals in vigilance tasks. Potential applications of this research include procedures for decreasing the temporal decline in observer performance and the high mental workload imposed by vigilance tasks. © 2015, Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence.
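    Detection accuracy in vigilance studies of this kind is commonly summarized with the signal-detection sensitivity index d', computed from hit and false-alarm rates. A minimal sketch of that computation using only Python's standard library; the trial counts are illustrative assumptions, not the study's data:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (adding 0.5 to each cell) keeps the
    z-scores finite when a rate would otherwise be 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical 40-min watch: 30 signal trials, 270 non-signal events.
print(round(d_prime(hits=24, misses=6, false_alarms=14, correct_rejections=256), 2))
```

    Higher d' reflects better discrimination of signals from non-signals independently of response bias, which is why it is preferred over raw hit rate in vigilance research.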

  11. Influence of auditory spatial attention on cross-modal semantic priming effect: evidence from N400 effect.

    Science.gov (United States)

    Wang, Hongyan; Zhang, Gaoyan; Liu, Baolin

    2017-01-01

    Semantic priming is an important research topic in cognitive neuroscience. Previous studies have shown that the uni-modal semantic priming effect can be modulated by attention; however, the influence of attention on cross-modal semantic priming is unclear. To investigate this issue, the present study combined a cross-modal semantic priming paradigm with an auditory spatial attention paradigm, presenting visual pictures as the prime stimuli and semantically related or unrelated sounds as the target stimuli. Event-related potential results showed that when the target sound was attended to, the N400 effect was evoked. The N400 effect was also observed when the target sound was not attended to, demonstrating that the cross-modal semantic priming effect persists even when the target stimulus is not attended. Further analyses revealed that the N400 effect evoked by the unattended sound was significantly smaller than the effect evoked by the attended sound. This contrast provides new evidence that the cross-modal semantic priming effect can be modulated by attention.
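    The N400 effect reported in such studies is conventionally quantified as a difference wave: the trial-averaged ERP to unrelated targets minus that to related targets, measured in a window around 400 ms. The sketch below computes such a difference wave from synthetic epochs; all amplitudes, trial counts, and window limits are illustrative assumptions, not the study's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                            # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.8, 1 / fs)    # epoch from -100 ms to 800 ms

def simulate_epochs(n_trials, n400_amp):
    """Synthetic single-trial EEG: Gaussian noise plus a negative
    deflection peaking near 400 ms (larger for unrelated targets)."""
    component = -n400_amp * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return component + rng.normal(0.0, 2.0, size=(n_trials, t.size))

related = simulate_epochs(60, n400_amp=1.0)     # small N400
unrelated = simulate_epochs(60, n400_amp=4.0)   # large N400

# The ERP is the average across trials; the N400 effect is the
# unrelated-minus-related difference wave.
difference_wave = unrelated.mean(axis=0) - related.mean(axis=0)
window = (t >= 0.3) & (t <= 0.5)
print(f"mean amplitude 300-500 ms: {difference_wave[window].mean():.2f} (a.u.)")
```

    Averaging across trials suppresses the noise while preserving the stimulus-locked component, which is why the semantic effect emerges in the difference wave even though single trials are dominated by noise.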

  12. Auditory measures of selective and divided attention in young and older adults using single-talker competition.

    Science.gov (United States)

    Humes, Larry E; Lee, Jae Hee; Coughlin, Maureen P

    2006-11-01

    In this study, two experiments were conducted on auditory selective and divided attention in which the listening task involved the identification of words in sentences spoken by one talker while a second talker produced a very similar competing sentence. Ten young normal-hearing (YNH) and 13 elderly hearing-impaired (EHI) listeners participated in each experiment. The type of attention cue used was the main difference between experiments. Across both experiments, several consistent trends were observed. First, in eight of the nine divided-attention tasks across both experiments, the EHI subjects performed significantly worse than the YNH subjects. By comparison, significant differences in performance between age groups were observed on only three of the nine selective-attention tasks. Finally, there were consistent individual differences in performance across both experiments. Correlational analyses performed on the data from the 13 older adults suggested that the individual differences in performance were associated with individual differences in memory (digit span). Among the older adults, neither differences in age nor differences in hearing loss contributed to the individual differences observed in either experiment.

  13. Biomimetic Sonar for Electrical Activation of the Auditory Pathway

    Directory of Open Access Journals (Sweden)

    D. Menniti

    2017-01-01

    Full Text Available Relying on the mechanism of the bat's echolocation system, a bioinspired electronic device has been developed to investigate the cortical activity of mammals in response to auditory sensory stimuli. By means of implanted electrodes, acoustic information about the external environment, generated by a biomimetic system and converted into electrical signals, was delivered to anatomically selected structures of the auditory pathway. Electrocorticographic recordings showed that the cerebral activity response is highly dependent on the information carried by the ultrasounds and is frequency-locked with the signal repetition rate. Frequency analysis reveals that delta and beta rhythm content increases, suggesting that sensory information is successfully transferred and integrated. In addition, principal component analysis highlights that all the stimuli generate patterns of neural activity which can be clearly classified. The results show that the brain response is modulated by echo signal features, suggesting that spatial information sent by the biomimetic sonar is efficiently interpreted and encoded by the auditory system. Consequently, these results offer a new perspective on artificial environmental perception, which could be used to develop new techniques for treating pathological conditions or influencing our perception of the surroundings.
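    The reported increase in delta and beta rhythm content is the kind of result obtained from a band-power analysis of the electrocorticographic signal. A sketch of a basic FFT-based band-power computation on a simulated trace; the band limits are conventional EEG definitions and the signal is synthetic, not the study's data:

```python
import numpy as np

fs = 500                           # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)       # 10 s of synthetic signal
# Simulated trace: a 2 Hz (delta-range) and a 20 Hz (beta-range)
# sinusoid plus Gaussian noise.
sig = 3.0 * np.sin(2 * np.pi * 2 * t) + 1.0 * np.sin(2 * np.pi * 20 * t)
sig = sig + np.random.default_rng(1).normal(0.0, 0.5, t.size)

def band_power(signal, fs, f_lo, f_hi):
    """Summed periodogram power in the band [f_lo, f_hi) Hz."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

delta = band_power(sig, fs, 0.5, 4.0)    # conventional delta band
beta = band_power(sig, fs, 13.0, 30.0)   # conventional beta band
print(delta > beta)
```

    In practice Welch's averaged-periodogram method is usually preferred over a single FFT for noisy physiological data, but the band-masking step is the same.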

  14. Effectiveness of auditory and tactile crossmodal cues in a dual-task visual and auditory scenario.

    Science.gov (United States)

    Hopkins, Kevin; Kass, Steven J; Blalock, Lisa Durrance; Brill, J Christopher

    2017-05-01

    In this study, we examined how spatially informative auditory and tactile cues affected participants' performance on a visual search task while they simultaneously performed a secondary auditory task. Visual search task performance was assessed via reaction time and accuracy. Tactile and auditory cues provided the approximate location of the visual target within the search display. The inclusion of tactile and auditory cues improved performance in comparison to the no-cue baseline conditions. In comparison to the no-cue conditions, both tactile and auditory cues resulted in faster response times in the visual search only (single task) and visual-auditory (dual-task) conditions. However, the effectiveness of auditory and tactile cueing for visual task accuracy was shown to be dependent on task-type condition. Crossmodal cueing remains a viable strategy for improving task performance without increasing attentional load within a singular sensory modality. Practitioner Summary: Crossmodal cueing with dual-task performance has not been widely explored, yet has practical applications. We examined the effects of auditory and tactile crossmodal cues on visual search performance, with and without a secondary auditory task. Tactile cues aided visual search accuracy when also engaged in a secondary auditory task, whereas auditory cues did not.

  15. Comorbidity of Auditory Processing, Language, and Reading Disorders

    Science.gov (United States)

    Sharma, Mridula; Purdy, Suzanne C.; Kelly, Andrea S.

    2009-01-01

    Purpose: The authors assessed comorbidity of auditory processing disorder (APD), language impairment (LI), and reading disorder (RD) in school-age children. Method: Children (N = 68) with suspected APD and nonverbal IQ standard scores of 80 or more were assessed using auditory, language, reading, attention, and memory measures. Auditory processing…

  16. Assessment of children with suspected auditory processing disorder: a factor analysis study.

    Science.gov (United States)

    Ahmmed, Ansar U; Ahmmed, Afsara A; Bath, Julie R; Ferguson, Melanie A; Plack, Christopher J; Moore, David R

    2014-01-01

    To identify the factors that may underlie the deficits in children with listening difficulties, despite normal pure-tone audiograms. These children may have auditory processing disorder (APD), but there is no universally agreed consensus as to what constitutes APD. The authors therefore refer to these children as children with suspected APD (susAPD) and aim to clarify the role of attention, cognition, memory, sensorimotor processing speed, speech, and nonspeech auditory processing in susAPD. It was expected that a factor analysis would show how nonauditory and supramodal factors relate to auditory behavioral measures in such children with susAPD. This would facilitate greater understanding of the nature of listening difficulties, thus further helping with characterizing APD and designing multimodal test batteries to diagnose APD. Factor analysis of outcomes from 110 children (68 male, 42 female; aged 6 to 11 years) with susAPD on a widely used clinical test battery (SCAN-C) and a research test battery (MRC Institute of Hearing Research Multi-center Auditory Processing "IMAP"), that have age-based normative data. The IMAP included backward masking, simultaneous masking, frequency discrimination, nonverbal intelligence, working memory, reading, alerting attention and motor reaction times to auditory and visual stimuli. SCAN-C included monaural low-redundancy speech (auditory closure and speech in noise) and dichotic listening tests (competing words and competing sentences) that assess divided auditory attention and hence executive attention. Three factors were extracted: "general auditory processing," "working memory and executive attention," and "processing speed and alerting attention." Frequency discrimination, backward masking, simultaneous masking, and monaural low-redundancy speech tests represented the "general auditory processing" factor. Dichotic listening and the IMAP cognitive tests (apart from nonverbal intelligence) were represented in the "working
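    The three-factor solution described here comes from an exploratory factor analysis of the children's test scores. As a rough illustration of how factors are extracted, the sketch below applies a principal-component-style eigendecomposition of the correlation matrix to synthetic data built to contain three latent factors; this is a simplified stand-in for the factor analysis such studies typically use, and every variable name and loading value is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic scores for 110 children on 8 hypothetical measures, built
# so that groups of measures share latent factors (mirroring the
# three-factor structure reported for the test batteries).
n = 110
latent = rng.normal(size=(n, 3))            # 3 latent factors
loadings_true = np.zeros((8, 3))
loadings_true[0:3, 0] = 0.8   # "general auditory processing" measures
loadings_true[3:6, 1] = 0.8   # "working memory / executive attention"
loadings_true[6:8, 2] = 0.8   # "processing speed / alerting attention"
scores = latent @ loadings_true.T + rng.normal(0.0, 0.5, size=(n, 8))

# Principal-axis-style extraction: eigendecompose the correlation
# matrix and keep the 3 largest components as factor loadings.
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)      # ascending eigenvalues
order = np.argsort(eigvals)[::-1][:3]        # indices of 3 largest
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])

print(loadings.shape)   # one loading per measure per factor
```

    The recovered loading matrix groups the measures the same way the generating factors did; real analyses would follow this with rotation (e.g., varimax) to aid interpretation.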

  17. Pilot feasibility study of binaural auditory beats for reducing symptoms of inattention in children and adolescents with attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Kennel, Susan; Taylor, Ann Gill; Lyon, Debra; Bourguignon, Cheryl

    2010-02-01

    The purpose of this pilot study was to explore the potential for the use of binaural auditory beat stimulation to reduce the symptom of inattention in children and adolescents with attention-deficit/hyperactivity disorder. This pilot study had a randomized, double-blind, placebo-controlled design. Twenty participants were randomly assigned to listen to either an audio program on compact disk that contained binaural auditory beats or a sham audio program that did not have binaural beats for 20 minutes, three times a week for 3 weeks. The Children's Color Trails Test, the Color Trails Test, the Test of Variables of Attention (TOVA), and the Homework Problem Checklist were used to measure changes in inattention pre- and postintervention. Repeated measures analysis of variance was used to analyze pre- and postintervention scores on the Color Trails Tests, Homework Problem Checklist, and the TOVA. The effect of time was significant on the Color Trails Test. However, there were no significant group differences on the Color Trails Test or the TOVA scores postintervention. Parents reported that the study participants had fewer homework problems postintervention. The results from this study indicate that binaural auditory beat stimulation did not significantly reduce the symptom of inattention in the experimental group. However, parents and adolescents stated that homework problems due to inattention improved during the 3-week study. Parents and participants stated that the modality was easy to use and helpful. Therefore, this modality should be studied over a longer time frame in a larger sample to further evaluate its effectiveness in reducing the symptom of inattention in those diagnosed with attention-deficit/hyperactivity disorder. Copyright 2010 Elsevier Inc. All rights reserved.
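    A binaural beat arises when each ear receives a pure tone at a slightly different frequency; the listener perceives a beat at the difference frequency. A minimal sketch of how such a two-channel stimulus can be synthesized; the 400/410 Hz carrier pair is illustrative, not the frequencies used in the study's audio program:

```python
import math

def binaural_beat(f_left, f_right, fs=44100, seconds=1.0):
    """Return (left, right) sample lists, one pure sine tone per ear.

    The perceived beat frequency is abs(f_right - f_left); the tones
    must be delivered dichotically (via headphones) for the beat to
    arise centrally rather than acoustically.
    """
    n = int(fs * seconds)
    left = [math.sin(2 * math.pi * f_left * i / fs) for i in range(n)]
    right = [math.sin(2 * math.pi * f_right * i / fs) for i in range(n)]
    return left, right

left, right = binaural_beat(400.0, 410.0)   # 10 Hz beat
print(len(left), len(right))
```

    Writing the two lists to the left and right channels of a stereo WAV file (e.g., with the standard-library `wave` module) would produce a playable stimulus.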

  18. Using auditory pre-information to solve the cocktail-party problem: electrophysiological evidence for age-specific differences.

    Science.gov (United States)

    Getzmann, Stephan; Lewald, Jörg; Falkenstein, Michael

    2014-01-01

    Speech understanding in complex and dynamic listening environments requires (a) auditory scene analysis, namely auditory object formation and segregation, and (b) allocation of the attentional focus to the talker of interest. There is evidence that pre-information is actively used to facilitate these two aspects of the so-called "cocktail-party" problem. Here, a simulated multi-talker scenario was combined with electroencephalography to study scene analysis and allocation of attention in young and middle-aged adults. Sequences of short words (combinations of brief company names and stock-price values) from four talkers at different locations were simultaneously presented, and the detection of target names and the discrimination between critical target values were assessed. Immediately prior to speech sequences, auditory pre-information was provided via cues that either prepared auditory scene analysis or attentional focusing, or non-specific pre-information was given. While performance was generally better in younger than older participants, both age groups benefited from auditory pre-information. The analysis of the cue-related event-related potentials revealed age-specific differences in the use of pre-cues: Younger adults showed a pronounced N2 component, suggesting early inhibition of concurrent speech stimuli; older adults exhibited a stronger late P3 component, suggesting increased resource allocation to process the pre-information. In sum, the results argue for an age-specific utilization of auditory pre-information to improve listening in complex dynamic auditory environments.

  19. Using auditory pre-information to solve the cocktail-party problem: electrophysiological evidence for age-specific differences

    Directory of Open Access Journals (Sweden)

    Stephan eGetzmann

    2014-12-01

    Full Text Available Speech understanding in complex and dynamic listening environments requires (a) auditory scene analysis, namely auditory object formation and segregation, and (b) allocation of the attentional focus to the talker of interest. There is evidence that pre-information is actively used to facilitate these two aspects of the so-called cocktail-party problem. Here, a simulated multi-talker scenario was combined with electroencephalography to study scene analysis and allocation of attention in young and middle-aged adults. Sequences of short words (combinations of brief company names and stock-price values) from four talkers at different locations were simultaneously presented, and the detection of target names and the discrimination between critical target values were assessed. Immediately prior to speech sequences, auditory pre-information was provided via cues that either prepared auditory scene analysis or attentional focusing, or non-specific pre-information was given. While performance was generally better in younger than older participants, both age groups benefited from auditory pre-information. The analysis of the cue-related event-related potentials revealed age-specific differences in the use of pre-cues: Younger adults showed a pronounced N2 component, suggesting early inhibition of concurrent speech stimuli; older adults exhibited a stronger late P3 component, suggesting increased resource allocation to process the pre-information. In sum, the results argue for an age-specific utilization of auditory pre-information to improve listening in complex dynamic auditory environments.

  20. Intentional switching in auditory selective attention: Exploring age-related effects in a spatial setup requiring speech perception.

    Science.gov (United States)

    Oberem, Josefa; Koch, Iring; Fels, Janina

    2017-06-01

    Using a binaural-listening paradigm, age-related differences in the ability to intentionally switch auditory selective attention between two speakers, defined by their spatial location, were examined. To this end, 40 normal-hearing participants were tested (20 young, mean age 24.8 years; 20 older, mean age 67.8 years). The spatial reproduction of the stimuli was provided by headphones using head-related transfer functions of an artificial head. Spoken number words of two speakers were presented simultaneously to participants from two out of eight locations on the horizontal plane. Guided by a visual cue indicating the spatial location of the target speaker, the participants were asked to categorize the target's number word as smaller vs. greater than five while ignoring the distractor's speech. Results showed significantly higher reaction times and error rates for older participants. The relative influence of a spatial switch of the target speaker (switch or repetition of the speaker's direction in space) was identical across age groups. Congruency effects (stimuli spoken by target and distractor may evoke the same answer or different answers) were increased for older participants and depended on the target's position. The results suggest that the ability to intentionally switch auditory attention to a newly cued location was unimpaired, whereas it was generally harder for older participants to suppress processing of the distractor's speech. Copyright © 2017 Elsevier B.V. All rights reserved.
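    Head-related-transfer-function reproduction as used here encodes a source's direction chiefly in interaural time and level differences. A greatly simplified sketch that spatializes a mono signal with only an interaural time difference, using Woodworth's spherical-head approximation; the head radius and all other constants are textbook values, not measurements of the artificial head used in the study:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, at ~20 degrees C
HEAD_RADIUS = 0.0875     # m, spherical-head approximation

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head ITD: a * (theta + sin(theta)) / c."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS * (theta + math.sin(theta)) / SPEED_OF_SOUND

def spatialize(mono, azimuth_deg, fs=44100):
    """Delay the far ear by the ITD (positive azimuth = source on the right)."""
    delay = int(round(abs(itd_seconds(azimuth_deg)) * fs))
    delayed = [0.0] * delay + mono[: len(mono) - delay]
    if azimuth_deg >= 0:
        return delayed, mono   # left ear is farther, so it is delayed
    return mono, delayed

mono = [math.sin(2 * math.pi * 440 * i / 44100) for i in range(4410)]
left, right = spatialize(mono, 90.0)      # source at 90 degrees right
print(round(itd_seconds(90.0) * 1e6))     # ITD in microseconds
```

    Real HRTF rendering convolves the signal with direction-specific filter pairs, capturing level differences and spectral pinna cues that a pure delay omits; this sketch shows only the timing cue.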

  1. Attention to memory: orienting attention to sound object representations.

    Science.gov (United States)

    Backer, Kristina C; Alain, Claude

    2014-01-01

    Despite a growing acceptance that attention and memory interact, and that attention can be focused on an active internal mental representation (i.e., reflective attention), there has been a paucity of work focusing on reflective attention to 'sound objects' (i.e., mental representations of actual sound sources in the environment). Further research on the dynamic interactions between auditory attention and memory, as well as its degree of neuroplasticity, is important for understanding how sound objects are represented, maintained, and accessed in the brain. This knowledge can then guide the development of training programs to help individuals with attention and memory problems. This review article focuses on attention to memory with an emphasis on behavioral and neuroimaging studies that have begun to explore the mechanisms that mediate reflective attentional orienting in vision and more recently, in audition. Reflective attention refers to situations in which attention is oriented toward internal representations rather than focused on external stimuli. We propose four general principles underlying attention to short-term memory. Furthermore, we suggest that mechanisms involved in orienting attention to visual object representations may also apply for orienting attention to sound object representations.

  2. Psychopathic traits associated with abnormal hemodynamic activity in salience and default mode networks during auditory oddball task.

    Science.gov (United States)

    Anderson, Nathaniel E; Maurer, J Michael; Steele, Vaughn R; Kiehl, Kent A

    2018-06-01

    Psychopathy is a personality disorder accompanied by abnormalities in emotional processing and attention. Recent theoretical applications of network-based models of cognition have been used to explain the diverse range of abnormalities apparent in psychopathy. Still, the physiological basis for these abnormalities is not well understood. A significant body of work has examined psychopathy-related abnormalities in simple attention-based tasks, but these studies have largely been performed using electrocortical measures, such as event-related potentials (ERPs), and they often have been carried out among individuals with low levels of psychopathic traits. In this study, we examined neural activity using functional magnetic resonance imaging (fMRI) during a simple auditory target detection (oddball) task among 168 incarcerated adult males, with psychopathic traits assessed via the Hare Psychopathy Checklist-Revised (PCL-R). Event-related contrasts demonstrated that the largest psychopathy-related effects were apparent between the frequent standard stimulus condition and a task-off, implicit baseline. Negative correlations with the interpersonal-affective dimension (Factor 1) of the PCL-R were apparent in regions comprising the default mode and salience networks. These findings support models of psychopathy describing impaired integration across functional networks. They additionally corroborate reports that have implicated failures of efficient transition between default mode and task-positive networks. Finally, they demonstrate a neurophysiological basis for the abnormal mobilization of attention and reduced engagement with stimuli that have little motivational significance among those with high psychopathic traits.

  3. Neuronal activity in primate auditory cortex during the performance of audiovisual tasks.

    Science.gov (United States)

    Brosch, Michael; Selezneva, Elena; Scheich, Henning

    2015-03-01

    This study aimed at a deeper understanding of which cognitive and motivational aspects of tasks affect auditory cortical activity. To this end we trained two macaque monkeys to perform two different tasks on the same audiovisual stimulus and to do this with two different sizes of water rewards. The monkeys had to touch a bar after a tone had been turned on together with an LED, and to hold the bar until either the tone (auditory task) or the LED (visual task) was turned off. In 399 multiunits recorded from core fields of auditory cortex we confirmed that during task engagement neurons responded to auditory and non-auditory stimuli that were task-relevant, such as light and water. We also confirmed that firing rates slowly increased or decreased for several seconds during various phases of the tasks. Responses to non-auditory stimuli and slow firing changes were observed during both the auditory and the visual task, with some differences between them. There was also a weak task-dependent modulation of the responses to auditory stimuli. In contrast to these cognitive aspects, motivational aspects of the tasks were not reflected in the firing, except during delivery of the water reward. In conclusion, the present study supports our previous proposal that there are two response types in the auditory cortex that represent the timing and the type of auditory and non-auditory elements of auditory tasks, as well as the associations between elements. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  4. Effect of attentional load on audiovisual speech perception: Evidence from ERPs

    Directory of Open Access Journals (Sweden)

    Agnès eAlsius

    2014-07-01

    Full Text Available Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects while they were presented with auditory, visual and audiovisual spoken syllables. Audiovisual stimuli consisted of incongruent auditory and visual components known to elicit a McGurk effect, i.e. a visually driven alteration in the auditory speech percept. In a Dual task condition, participants were asked to identify spoken syllables whilst monitoring a rapid visual stream of pictures for targets, i.e., they had to divide their attention. In a Single task condition, participants identified the syllables without any other tasks, i.e., they were asked to ignore the pictures and focus their attention fully on the spoken syllables. The McGurk effect was weaker in the Dual task than in the Single task condition, indicating an effect of attentional load on audiovisual speech perception. Early auditory ERP components, N1 and P2, peaked earlier to audiovisual stimuli than to auditory stimuli when attention was fully focused on syllables, indicating neurophysiological audiovisual interaction. This latency decrement was reduced when attention was loaded, suggesting that attention influences early neural processing of audiovisual speech. We conclude that reduced attention weakens the interaction between vision and audition in speech.

  5. Effect of attentional load on audiovisual speech perception: evidence from ERPs.

    Science.gov (United States)

    Alsius, Agnès; Möttönen, Riikka; Sams, Mikko E; Soto-Faraco, Salvador; Tiippana, Kaisa

    2014-01-01

    Seeing articulatory movements influences perception of auditory speech. This is often reflected in a shortened latency of auditory event-related potentials (ERPs) generated in the auditory cortex. The present study addressed whether this early neural correlate of audiovisual interaction is modulated by attention. We recorded ERPs in 15 subjects while they were presented with auditory, visual, and audiovisual spoken syllables. Audiovisual stimuli consisted of incongruent auditory and visual components known to elicit a McGurk effect, i.e., a visually driven alteration in the auditory speech percept. In a Dual task condition, participants were asked to identify spoken syllables whilst monitoring a rapid visual stream of pictures for targets, i.e., they had to divide their attention. In a Single task condition, participants identified the syllables without any other tasks, i.e., they were asked to ignore the pictures and focus their attention fully on the spoken syllables. The McGurk effect was weaker in the Dual task than in the Single task condition, indicating an effect of attentional load on audiovisual speech perception. Early auditory ERP components, N1 and P2, peaked earlier to audiovisual stimuli than to auditory stimuli when attention was fully focused on syllables, indicating neurophysiological audiovisual interaction. This latency decrement was reduced when attention was loaded, suggesting that attention influences early neural processing of audiovisual speech. We conclude that reduced attention weakens the interaction between vision and audition in speech.

  6. Unimodal and crossmodal gradients of spatial attention

    DEFF Research Database (Denmark)

    Föcker, J.; Hötting, K.; Gondan, Matthias

    2010-01-01

    Behavioral and event-related potential (ERP) studies have shown that spatial attention is gradually distributed around the center of the attentional focus. The present study compared uni- and crossmodal gradients of spatial attention to investigate whether the orienting of auditory and visual...... spatial attention is based on modality specific or supramodal representations of space. Auditory and visual stimuli were presented from five speaker locations positioned in the right hemifield. Participants had to attend to the innermost or outmost right position in order to detect either visual...... or auditory deviant stimuli. Detection rates and event-related potentials (ERPs) indicated that spatial attention is distributed as a gradient. Unimodal spatial ERP gradients correlated with the spatial resolution of the modality. Crossmodal spatial gradients were always broader than the corresponding...

  7. Modulating the Focus of Attention for Spoken Words at Encoding Affects Frontoparietal Activation for Incidental Verbal Memory

    Directory of Open Access Journals (Sweden)

    Thomas A. Christensen

    2012-01-01

    Full Text Available Attention is crucial for encoding information into memory, and current dual-process models seek to explain the roles of attention in both recollection memory and incidental-perceptual memory processes. The present study combined an incidental memory paradigm with event-related functional MRI to examine the effect of attention at encoding on the subsequent neural activation associated with unintended perceptual memory for spoken words. At encoding, we systematically varied attention levels as listeners heard a list of single English nouns. We then presented these words again in the context of a recognition task and assessed the effect of modulating attention at encoding on the BOLD responses to words that were either attended strongly, weakly, or not heard previously. MRI revealed activity in right-lateralized inferior parietal and prefrontal regions, and positive BOLD signals varied with the relative level of attention present at encoding. Temporal analysis of hemodynamic responses further showed that the time course of BOLD activity was modulated differentially by unintentionally encoded words compared to novel items. Our findings largely support current models of memory consolidation and retrieval, but they also provide fresh evidence for hemispheric differences and functional subdivisions in right frontoparietal attention networks that help shape auditory episodic recall.

  8. Modulating the focus of attention for spoken words at encoding affects frontoparietal activation for incidental verbal memory.

    Science.gov (United States)

    Christensen, Thomas A; Almryde, Kyle R; Fidler, Lesley J; Lockwood, Julie L; Antonucci, Sharon M; Plante, Elena

    2012-01-01

    Attention is crucial for encoding information into memory, and current dual-process models seek to explain the roles of attention in both recollection memory and incidental-perceptual memory processes. The present study combined an incidental memory paradigm with event-related functional MRI to examine the effect of attention at encoding on the subsequent neural activation associated with unintended perceptual memory for spoken words. At encoding, we systematically varied attention levels as listeners heard a list of single English nouns. We then presented these words again in the context of a recognition task and assessed the effect of modulating attention at encoding on the BOLD responses to words that were either attended strongly, weakly, or not heard previously. MRI revealed activity in right-lateralized inferior parietal and prefrontal regions, and positive BOLD signals varied with the relative level of attention present at encoding. Temporal analysis of hemodynamic responses further showed that the time course of BOLD activity was modulated differentially by unintentionally encoded words compared to novel items. Our findings largely support current models of memory consolidation and retrieval, but they also provide fresh evidence for hemispheric differences and functional subdivisions in right frontoparietal attention networks that help shape auditory episodic recall.

  9. Auditory and audio-visual processing in patients with cochlear, auditory brainstem, and auditory midbrain implants: An EEG study.

    Science.gov (United States)

    Schierholz, Irina; Finke, Mareike; Kral, Andrej; Büchner, Andreas; Rach, Stefan; Lenarz, Thomas; Dengler, Reinhard; Sandmann, Pascale

    2017-04-01

    There is substantial variability in speech recognition ability across patients with cochlear implants (CIs), auditory brainstem implants (ABIs), and auditory midbrain implants (AMIs). To better understand how this variability is related to central processing differences, the current electroencephalography (EEG) study compared hearing abilities and auditory-cortex activation in patients with electrical stimulation at different sites of the auditory pathway. Three different groups of patients with auditory implants (Hannover Medical School; ABI: n = 6, CI: n = 6; AMI: n = 2) performed a speeded response task and a speech recognition test with auditory, visual, and audio-visual stimuli. Behavioral performance and cortical processing of auditory and audio-visual stimuli were compared between groups. ABI and AMI patients showed prolonged response times to auditory and audio-visual stimuli compared with normal-hearing (NH) listeners and CI patients. This was confirmed by prolonged N1 latencies and reduced N1 amplitudes in ABI and AMI patients. However, patients with central auditory implants showed a remarkable gain in performance when visual and auditory input was combined, in both speech and non-speech conditions, which was reflected by a strong visual modulation of auditory-cortex activation in these individuals. In sum, the results suggest that the behavioral improvement for audio-visual conditions in central auditory implant patients is based on enhanced audio-visual interactions in the auditory cortex. These findings may provide important implications for the optimization of electrical stimulation and rehabilitation strategies in patients with central auditory prostheses. Hum Brain Mapp 38:2206-2225, 2017. © 2017 Wiley Periodicals, Inc.

  10. The changes in the relation of auditory and visual input activity between hemispheres analyzed by cartographic EEG in a child with hyperactivity syndrome

    Directory of Open Access Journals (Sweden)

    Radičević Zoran

    2015-01-01

    Full Text Available The paper discusses the changes in the relations of visual and auditory inputs between the hemispheres in a child with hyperactive syndrome, and the effects of these changes, which may lead to better attention engagement in auditory and visual information analysis. The method included the use of cartographic EEG and a clinical procedure in a 10-year-old boy with hyperactive syndrome and attention deficit disorder, who showed theta dysfunction in the standard EEG. Cartographic EEG was recorded on a Nihon Kohden EEG-1200K Neurofax apparatus in a longitudinal bipolar electrode montage using the 10/20 International electrode positioning system. Impedance was maintained below 5 kΩ, with no more than 1 kΩ difference between electrodes. The low-cut filter was set at 0.53 Hz and the high-cut filter at 35 Hz. Recording was performed during a quiet period and during stimulation procedures with a speech and language basis. After the relative failure of an initial Neurofeedback (NFB) treatment, cartographic EEG indicated higher theta load and increased alpha 2 and beta 1 activity. NFB treatment was then applied for six months in such a way that when the boy was reading, visual input to the left hemisphere was enhanced and auditory input to the right hemisphere was reduced. Repeated EEG mapping analysis showed significant improvement, both in the EEG findings and in the attention, behavioural and learning disorders. The paper discusses some aspects of learning, attention and behaviour in relation to changes in the standard EEG, especially in the cartographic EEG and NFB findings.
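The 0.53-35 Hz band-pass setting reported in this record can be approximated digitally; below is a minimal sketch, assuming a Butterworth response and a hypothetical 250 Hz sampling rate (neither the filter family nor the rate is stated in the record):

```python
import numpy as np
from scipy import signal

def bandpass_eeg(data, fs, low=0.53, high=35.0, order=4):
    """Zero-phase Butterworth band-pass matching the record's
    0.53-35 Hz filter settings (digital approximation)."""
    sos = signal.butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, data, axis=-1)

fs = 250.0                      # assumed sampling rate (not stated in the record)
t = np.arange(0, 10, 1 / fs)
# Toy trace: a 6 Hz theta component plus 50 Hz line noise
raw = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
clean = bandpass_eeg(raw, fs)   # theta survives, line noise is attenuated
```

Zero-phase filtering (`sosfiltfilt`) avoids the latency distortion a causal filter would introduce, which matters when ERP or band-power timing is compared across conditions.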

  11. Development of the auditory system

    Science.gov (United States)

    Litovsky, Ruth

    2015-01-01

    Auditory development involves changes in the peripheral and central nervous system along the auditory pathways, and these occur naturally, and in response to stimulation. Human development occurs along a trajectory that can last decades, and is studied using behavioral psychophysics, as well as physiologic measurements with neural imaging. The auditory system constructs a perceptual space that takes information from objects and groups, segregates sounds, and provides meaning and access to communication tools such as language. Auditory signals are processed in a series of analysis stages, from peripheral to central. Coding of information has been studied for features of sound, including frequency, intensity, loudness, and location, in quiet and in the presence of maskers. In the latter case, the ability of the auditory system to perform an analysis of the scene becomes highly relevant. While some basic abilities are well developed at birth, there is a clear prolonged maturation of auditory development well into the teenage years. Maturation involves auditory pathways. However, non-auditory changes (attention, memory, cognition) play an important role in auditory development. The ability of the auditory system to adapt in response to novel stimuli is a key feature of development throughout the nervous system, known as neural plasticity. PMID:25726262

  12. Widespread auditory deficits in tune deafness.

    Science.gov (United States)

    Jones, Jennifer L; Zalewski, Christopher; Brewer, Carmen; Lucker, Jay; Drayna, Dennis

    2009-02-01

    The goal of this study was to investigate auditory function in individuals with deficits in musical pitch perception. We hypothesized that such individuals have deficits in nonspeech areas of auditory processing. We screened 865 randomly selected individuals to identify those who scored poorly on the Distorted Tunes test (DTT), a measure of musical pitch recognition ability. Those who scored poorly were given a comprehensive audiologic examination, and those with hearing loss or other confounding audiologic factors were excluded from further testing. Thirty-five individuals with tune deafness constituted the experimental group. Thirty-four individuals with normal hearing and normal DTT scores, matched for age, gender, handedness, and education, and without overt or reported psychiatric disorders made up the normal control group. Individual and group performance for pure-tone frequency discrimination at 1000 Hz was determined by measuring the difference limen for frequency (DLF). Auditory processing abilities were assessed using tests of pitch pattern recognition, duration pattern recognition, and auditory gap detection. In addition, we evaluated both attention and short- and long-term memory as variables that might influence performance on our experimental measures. Differences between groups were evaluated statistically using Wilcoxon nonparametric tests and t-tests as appropriate. The DLF at 1000 Hz in the group with tune deafness was significantly larger than that of the normal control group. However, approximately one-third of participants with tune deafness had DLFs within the range of performance observed in the control group. Many individuals with tune deafness also displayed a high degree of variability in their intertrial frequency discrimination performance that could not be explained by deficits in memory or attention. Pitch and duration pattern discrimination and auditory gap-detection ability were significantly poorer in the group with tune deafness ...
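The record does not state how the difference limen for frequency (DLF) was tracked; a common choice is a 2-down/1-up adaptive staircase, sketched here with an idealized, noise-free listener (the tracking rule, step factor, and listener model are all assumptions for illustration):

```python
def simulate_dlf(true_dlf_hz=8.0, start_delta_hz=50.0, factor=0.8, n_reversals=8):
    """Hypothetical 2-down/1-up staircase for the DLF at 1000 Hz.
    Halts after n_reversals direction reversals and averages the last six."""
    delta = start_delta_hz
    run = 0
    direction = 0            # -1 = track moving down, +1 = moving up
    reversals = []
    while len(reversals) < n_reversals:
        correct = delta >= true_dlf_hz    # idealized deterministic listener
        if correct:
            run += 1
            if run < 2:
                continue                  # need two correct in a row to step down
            run, new_dir = 0, -1
            delta *= factor
        else:
            run, new_dir = 0, +1
            delta /= factor
        if direction and new_dir != direction:
            reversals.append(delta)
        direction = new_dir
    tail = reversals[-6:]
    return sum(tail) / len(tail)          # threshold estimate in Hz

estimate = simulate_dlf()
```

A 2-down/1-up rule converges near the 70.7%-correct point of the psychometric function; with a real listener the `correct` line would come from trial responses rather than a fixed comparison.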

  13. The role of visual spatial attention in audiovisual speech perception

    DEFF Research Database (Denmark)

    Andersen, Tobias; Tiippana, K.; Laarni, J.

    2009-01-01

    Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect, in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive but ... from each of the faces and from the voice on the auditory speech percept. We found that directing visual spatial attention towards a face increased the influence of that face on auditory perception. However, the influence of the voice on auditory perception did not change, suggesting that audiovisual integration did not change. Visual spatial attention was also able to select between the faces when lip reading. This suggests that visual spatial attention acts at the level of visual speech perception prior to audiovisual integration and that the effect propagates through audiovisual integration ...

  14. Cross-modal activation of auditory regions during visuo-spatial working memory in early deafness.

    Science.gov (United States)

    Ding, Hao; Qin, Wen; Liang, Meng; Ming, Dong; Wan, Baikun; Li, Qiang; Yu, Chunshui

    2015-09-01

    Early deafness can reshape deprived auditory regions to enable the processing of signals from the remaining intact sensory modalities. Cross-modal activation has been observed in auditory regions during non-auditory tasks in early deaf subjects. In hearing subjects, visual working memory can evoke activation of the visual cortex, which further contributes to behavioural performance. In early deaf subjects, however, whether and how auditory regions participate in visual working memory remains unclear. We hypothesized that auditory regions may be involved in visual working memory processing and activation of auditory regions may contribute to the superior behavioural performance of early deaf subjects. In this study, 41 early deaf subjects (22 females and 19 males, age range: 20-26 years, age of onset of deafness ...) ... memory task than did the hearing controls. Compared with hearing controls, deaf subjects exhibited increased activation in the superior temporal gyrus bilaterally during the recognition stage. This increased activation amplitude predicted faster and more accurate working memory performance in deaf subjects. Deaf subjects also had increased activation in the superior temporal gyrus bilaterally during the maintenance stage and in the right superior temporal gyrus during the encoding stage. These increased activation amplitudes also predicted faster reaction times on the spatial working memory task in deaf subjects. These findings suggest that cross-modal plasticity occurs in auditory association areas in early deaf subjects. These areas are involved in visuo-spatial working memory. Furthermore, amplitudes of cross-modal activation during the maintenance stage were positively correlated with the age of onset of hearing aid use and were negatively correlated with the percentage of lifetime hearing aid use in deaf subjects. These findings suggest that earlier and longer hearing aid use may inhibit cross-modal reorganization in early deaf subjects. Granger ...

  15. Parents' perception of the auditory attention skills of their child with cleft lip and palate: a retrospective study

    Directory of Open Access Journals (Sweden)

    Mondelli, Maria Fernanda Capoani Garcia

    2012-01-01

    Full Text Available Introduction: Cognitive and neurophysiological mechanisms are necessary to process and decode acoustic stimuli. Auditory processing is influenced by higher-level cognitive factors such as memory, attention and learning. The sensory deprivation caused by conductive hearing loss, which is frequent in the population with cleft lip and palate, can affect many cognitive functions, among them attention, and can also harm school, linguistic and interpersonal performance. Objective: To verify the perception of parents of children with cleft lip and palate regarding their children's auditory attention. Method: Retrospective study of infants with any type of cleft lip and palate, without any associated genetic syndrome, whose parents answered a questionnaire about auditory attention skills. Results: 44 children were male and 26 female; 35.71% of the answers were affirmative for hearing loss and 71.43% for otologic infections. Conclusion: Most of the interviewed parents pointed to at least one of the attention-related behaviors contained in the questionnaire, indicating that the presence of cleft lip and palate can be related to difficulties in auditory attention.

  16. The impact of auditory working memory training on the fronto-parietal working memory network.

    Science.gov (United States)

    Schneiders, Julia A; Opitz, Bertram; Tang, Huijun; Deng, Yuan; Xie, Chaoxiang; Li, Hong; Mecklinger, Axel

    2012-01-01

    Working memory training has been widely used to investigate working memory processes. We have shown previously that visual working memory benefits only from intra-modal visual but not from across-modal auditory working memory training. In the present functional magnetic resonance imaging study we examined whether auditory working memory processes can also be trained specifically and which training-induced activation changes accompany these effects. It was investigated whether working memory training with strongly distinct auditory materials transfers exclusively to an auditory (intra-modal) working memory task or whether it generalizes to an (across-modal) visual working memory task. We used adaptive n-back training with tonal sequences and a passive control condition. The memory training led to a reliable training gain. Transfer effects were found for the (intra-modal) auditory but not for the (across-modal) visual transfer task. Training-induced activation decreases in the auditory transfer task were found in two regions in the right inferior frontal gyrus. These effects confirm our previous findings in the visual modality and extend intra-modal effects in the prefrontal cortex to the auditory modality. As the right inferior frontal gyrus is frequently found in maintaining modality-specific auditory information, these results might reflect increased neural efficiency in auditory working memory processes. Furthermore, task-unspecific (amodal) activation decreases in the visual and auditory transfer task were found in the right inferior parietal lobule and the superior portion of the right middle frontal gyrus, reflecting lower demand on general attentional control processes. These data are in good agreement with amodal activation decreases within the same brain regions on a visual transfer task reported previously.

  17. The impact of auditory working memory training on the fronto-parietal working memory network

    Science.gov (United States)

    Schneiders, Julia A.; Opitz, Bertram; Tang, Huijun; Deng, Yuan; Xie, Chaoxiang; Li, Hong; Mecklinger, Axel

    2012-01-01

    Working memory training has been widely used to investigate working memory processes. We have shown previously that visual working memory benefits only from intra-modal visual but not from across-modal auditory working memory training. In the present functional magnetic resonance imaging study we examined whether auditory working memory processes can also be trained specifically and which training-induced activation changes accompany these effects. It was investigated whether working memory training with strongly distinct auditory materials transfers exclusively to an auditory (intra-modal) working memory task or whether it generalizes to an (across-modal) visual working memory task. We used adaptive n-back training with tonal sequences and a passive control condition. The memory training led to a reliable training gain. Transfer effects were found for the (intra-modal) auditory but not for the (across-modal) visual transfer task. Training-induced activation decreases in the auditory transfer task were found in two regions in the right inferior frontal gyrus. These effects confirm our previous findings in the visual modality and extend intra-modal effects in the prefrontal cortex to the auditory modality. As the right inferior frontal gyrus is frequently found in maintaining modality-specific auditory information, these results might reflect increased neural efficiency in auditory working memory processes. Furthermore, task-unspecific (amodal) activation decreases in the visual and auditory transfer task were found in the right inferior parietal lobule and the superior portion of the right middle frontal gyrus, reflecting lower demand on general attentional control processes. These data are in good agreement with amodal activation decreases within the same brain regions on a visual transfer task reported previously. PMID:22701418

  18. Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.

    Science.gov (United States)

    Stone, Scott A; Tata, Matthew S

    2017-01-01

    Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of a stimulus. Having access to related, coincident visual and auditory information can help with spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and augmenting them into localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible.
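The DAVIS camera's logarithmic brightness-change events can be emulated frame by frame; a toy sketch follows (the threshold value, scene, and frame-based emulation are illustrative assumptions, not the authors' pipeline, since a real event camera reports asynchronous per-pixel events rather than frames):

```python
import numpy as np

def log_change_events(prev, curr, thresh=0.15):
    """Frame-based emulation of a DAVIS-style event stream: emit an event
    wherever log-brightness changes by more than `thresh` (illustrative)."""
    d = np.log(curr + 1e-6) - np.log(prev + 1e-6)
    ys, xs = np.nonzero(np.abs(d) > thresh)
    return xs, ys, np.sign(d[ys, xs])    # event x, y coordinates and polarity

# Toy scene: a bright vertical bar moving left-to-right on a dark background
frames = []
for shift in range(3):
    img = np.full((8, 16), 0.05)
    img[:, 4 + shift: 6 + shift] = 1.0
    frames.append(img)

centroids = []
for prev, curr in zip(frames, frames[1:]):
    xs, ys, pol = log_change_events(prev, curr)
    centroids.append(xs.mean())          # mean event x-position per frame pair
# A rising centroid sequence indicates rightward motion
```

Tracking the centroid of events over time is one simple way to recover the direction of motion that participants judged from the sonified output.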

  19. Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.

    Directory of Open Access Journals (Sweden)

    Scott A Stone

    Full Text Available Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of a stimulus. Having access to related, coincident visual and auditory information can help with spatial tasks such as localization. However, not all visual information has analogous auditory percepts, such as viewing a computer monitor. Here, we describe a system capable of detecting salient visual events and augmenting them into localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes of brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene, as well as to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow for the simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future successes are probable, as neuromorphic devices are likely to become faster and smaller, making this system much more feasible.

  20. The Influence of Selective and Divided Attention on Audiovisual Integration in Children.

    Science.gov (United States)

    Yang, Weiping; Ren, Yanna; Yang, Dan Ou; Yuan, Xue; Wu, Jinglong

    2016-01-24

    This article aims to investigate whether there is a difference in audiovisual integration in school-aged children (aged 6 to 13 years; mean age = 9.9 years) between the selective attention condition and the divided attention condition. We designed a visual and/or auditory detection task that included three blocks (divided attention, visual-selective attention, and auditory-selective attention). The results showed that the response to bimodal audiovisual stimuli was faster than to unimodal auditory or visual stimuli under both the divided attention and auditory-selective attention conditions. However, in the visual-selective attention condition, no significant difference in response speed was found between the unimodal visual and bimodal audiovisual stimuli. Moreover, audiovisual behavioral facilitation effects were compared between divided attention and selective attention (auditory or visual attention). In doing so, we found that audiovisual behavioral facilitation differed significantly between divided attention and selective attention. The results indicated that audiovisual integration was stronger in the divided attention condition than in the selective attention condition in children. Our findings objectively support the notion that attention can modulate audiovisual integration in school-aged children. Our study might offer a new perspective for identifying children with conditions that are associated with sustained attention deficit, such as attention-deficit hyperactivity disorder. © The Author(s) 2016.

  1. The Impact of Auditory Working Memory Training on the Fronto-Parietal Working Memory Network

    Directory of Open Access Journals (Sweden)

    Julia eSchneiders

    2012-06-01

    Full Text Available Working memory training has been widely used to investigate working memory processes. We have shown previously that visual working memory benefits only from intra-modal visual but not from across-modal auditory working memory training. In the present functional magnetic resonance imaging study we examined whether auditory working memory processes can also be trained specifically and which training-induced activation changes accompany these effects. It was investigated whether working memory training with strongly distinct auditory materials transfers exclusively to an auditory (intra-modal) working memory task or whether it generalizes to an (across-modal) visual working memory task. We used an adaptive n-back training with tonal sequences and a passive control condition. The memory training led to a reliable training gain. Transfer effects were found for the (intra-modal) auditory but not for the (across-modal) visual 2-back task. Training-induced activation changes in the auditory 2-back task were found in two regions in the right inferior frontal gyrus. These effects confirm our previous findings in the visual modality and extend intra-modal effects to the auditory modality. As the right inferior frontal gyrus is frequently found to maintain modality-specific auditory information, these results might reflect increased neural efficiency in auditory working memory processes. In this respect, these effects are analogous to the activation decreases in the right middle frontal gyrus for the visual modality in our previous study. Furthermore, task-unspecific (across-modal) activation decreases in the visual and auditory 2-back task were found in the right inferior parietal lobule and the superior portion of the right middle frontal gyrus, reflecting lower demands on general attentional control processes. These data are in good agreement with across-modal activation decreases within the same brain regions on a visual 2-back task reported previously.
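The tonal n-back blocks described in this record can be sketched as a trial-sequence generator; the tone set, match probability, and block length below are illustrative assumptions (the paper specifies only "strongly distinct auditory materials" and an adaptive 2-back design):

```python
import random

TONES_HZ = (440.0, 494.0, 523.0, 587.0, 659.0)   # illustrative tone set

def make_nback_block(n=2, length=20, p_match=0.3, tones=TONES_HZ, seed=7):
    """Hypothetical generator for one tonal n-back block: each trial is a
    tone frequency, and a trial is a target when it equals the tone heard
    n positions earlier."""
    rng = random.Random(seed)
    seq = [rng.choice(tones) for _ in range(n)]   # lead-in trials, never targets
    targets = []
    for i in range(n, length):
        if rng.random() < p_match:
            seq.append(seq[i - n])                # planted n-back match
        else:
            seq.append(rng.choice([f for f in tones if f != seq[i - n]]))
        targets.append(seq[i] == seq[i - n])
    return seq, targets

seq, targets = make_nback_block()
```

In an adaptive design, `n` would be raised after high-accuracy blocks and lowered after poor ones, which is what makes the training gain reliable across participants.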

  2. Auditory working memory load impairs visual ventral stream processing: toward a unified model of attentional load.

    Science.gov (United States)

    Klemen, Jane; Büchel, Christian; Bühler, Mira; Menz, Mareike M; Rose, Michael

    2010-03-01

    Attentional interference between tasks performed in parallel is known to have strong and often undesired effects. As yet, however, the mechanisms by which interference operates remain elusive. A better knowledge of these processes may facilitate our understanding of the effects of attention on human performance and the debilitating consequences that disruptions to attention can have. According to the load theory of cognitive control, processing of task-irrelevant stimuli is increased by attending in parallel to a relevant task with high cognitive demands. This is due to the relevant task engaging cognitive control resources that are, hence, unavailable to inhibit the processing of task-irrelevant stimuli. However, it has also been demonstrated that a variety of types of load (perceptual and emotional) can result in a reduction of the processing of task-irrelevant stimuli, suggesting a uniform effect of increased load irrespective of the type of load. In the present study, we concurrently presented a relevant auditory matching task [n-back working memory (WM)] of low or high cognitive load (1-back or 2-back WM) and task-irrelevant images at one of three object visibility levels (0%, 50%, or 100%). fMRI activation during the processing of the task-irrelevant visual stimuli was measured in the lateral occipital cortex and found to be reduced under high, compared to low, WM load. In combination with previous findings, this result is suggestive of a more generalized load theory, whereby cognitive load, as well as other types of load (e.g., perceptual), can result in a reduction of the processing of task-irrelevant stimuli, in line with a uniform effect of increased load irrespective of the type of load.

  3. Neurophysiological evidence for context-dependent encoding of sensory input in human auditory cortex.

    Science.gov (United States)

    Sussman, Elyse; Steinschneider, Mitchell

    2006-02-23

    Attention biases the way in which sound information is stored in auditory memory. Little is known, however, about the contribution of stimulus-driven processes in forming and storing coherent sound events. An electrophysiological index of cortical auditory change detection (mismatch negativity [MMN]) was used to assess whether sensory memory representations could be biased toward one organization over another (one or two auditory streams) without attentional control. Results revealed that sound representations held in sensory memory biased the organization of subsequent auditory input. The results demonstrate that context-dependent sound representations modulate stimulus-dependent neural encoding at early stages of auditory cortical processing.

  4. Active listening: task-dependent plasticity of spectrotemporal receptive fields in primary auditory cortex.

    Science.gov (United States)

    Fritz, Jonathan; Elhilali, Mounya; Shamma, Shihab

    2005-08-01

    Listening is an active process in which attentive focus on salient acoustic features in auditory tasks can influence the receptive field properties of cortical neurons. Recent studies showing rapid task-related changes in neuronal spectrotemporal receptive fields (STRFs) in the primary auditory cortex of the behaving ferret are reviewed in the context of current research on cortical plasticity. Ferrets were trained on spectral tasks, including tone detection and two-tone discrimination, and on temporal tasks, including gap detection and click-rate discrimination. STRF changes could be measured on-line during task performance and occurred within minutes of task onset. During spectral tasks, there were specific spectral changes (enhanced response to the tonal target frequency in tone detection and discrimination, suppressed response to the tonal reference frequency in tone discrimination). Only in the temporal tasks, however, was the STRF changed along the temporal dimension, through sharpened temporal dynamics. In ferrets trained on multiple tasks, distinctive and task-specific STRF changes could be observed in the same cortical neurons in successive behavioral sessions. These results suggest that rapid task-related plasticity is an ongoing process that occurs at the network and single-unit level as the animal switches between different tasks and dynamically adapts cortical STRFs in response to changing acoustic demands.

  5. Functional Imaging of Human Vestibular Cortex Activity Elicited by Skull Tap and Auditory Tone Burst

    Science.gov (United States)

    Noohi, Fatemeh; Kinnaird, Catherine; Wood, Scott; Bloomberg, Jacob; Mulavara, Ajitkumar; Seidler, Rachael

    2014-01-01

    The aim of the current study was to characterize brain activation in response to two modes of vestibular stimulation: skull tap and auditory tone burst. The auditory tone burst has been used in previous studies to elicit saccular Vestibular Evoked Myogenic Potentials (VEMPs) (Colebatch & Halmagyi 1992; Colebatch et al. 1994). Some researchers have reported that air-conducted skull tap elicits both saccular and utricular VEMPs, while being faster and less irritating for the subjects (Curthoys et al. 2009; Wackym et al., 2012). However, it is not clear whether the skull tap and auditory tone burst elicit the same pattern of cortical activity. Both forms of stimulation target the otolith response, which provides a measurement of vestibular function independent of the semicircular canals. This is of high importance for studying vestibular disorders related to otolith deficits. Previous imaging studies have documented activity in the anterior and posterior insula, superior temporal gyrus, inferior parietal lobule, pre- and postcentral gyri, inferior frontal gyrus, and the anterior cingulate cortex in response to different modes of vestibular stimulation (Bottini et al., 1994; Dieterich et al., 2003; Emri et al., 2003; Schlindwein et al., 2008; Janzen et al., 2008). Here we hypothesized that the skull tap elicits a similar pattern of cortical activity to the auditory tone burst. Subjects put on a set of MR-compatible skull tappers and headphones inside the 3T GE scanner while lying in the supine position with eyes closed. All subjects received both forms of stimulation; however, the order of stimulation with auditory tone burst and air-conducted skull tap was counterbalanced across subjects. Pneumatically powered skull tappers were placed bilaterally on the cheekbones. The vibration of the cheekbone was transmitted to the vestibular cortex, resulting in a vestibular response (Halmagyi et al., 1995). Auditory tone bursts were also delivered for comparison. To validate ...

  6. Mobile EEG on the bike: disentangling attentional and physical contributions to auditory attention tasks

    Science.gov (United States)

    Zink, Rob; Hunyadi, Borbála; Van Huffel, Sabine; De Vos, Maarten

    2016-08-01

    Objective. In the past few years there has been a growing interest in studying brain functioning in natural, real-life situations. Mobile EEG makes it possible to study the brain in real, unconstrained environments, but it faces the intrinsic challenge that observed changes in brain activity may reflect either the increased cognitive demands of a complex natural environment or the physical involvement itself, and the two are difficult to disentangle. In this work we aim to disentangle the influence of cognitive demands and distractions that arise from such outdoor unconstrained recordings. Approach. We evaluate the ERP and single-trial characteristics of a three-class auditory oddball paradigm recorded in outdoor scenarios while pedaling on a fixed bike or biking freely around. In addition, we carefully evaluate the trial-specific motion artifacts through independent gyroscope measurements and control for muscle artifacts. Main results. A decrease in P300 amplitude was observed in the free biking condition as compared to the fixed bike conditions. Above-chance P300 single-trial classification in highly dynamic real-life environments while biking outdoors was achieved. Certain significant artifact patterns were identified in the free biking condition, but neither these nor the increase in movement (as derived from continuous gyroscope measurements) can fully explain the differences in classification accuracy and P300 waveform. The increased cognitive load in real-life scenarios is shown to play a major role in the observed differences. Significance. Our findings suggest that auditory oddball results measured in natural real-life scenarios are influenced mainly by increased cognitive load due to being in an unconstrained environment.
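    The P300 results above rest on a standard preprocessing step: cutting the continuous EEG into fixed windows around each stimulus onset, baseline-correcting, and averaging. A minimal sketch of that step (synthetic single-channel data; the sampling rate, window, and signal are made up for illustration, and the paper's actual pipeline additionally handles artifacts and single-trial classification) might look like:

```python
# Minimal ERP epoching-and-averaging sketch (illustrative only).
from statistics import mean

def epoch(signal, onsets, fs, tmin, tmax):
    """Cut windows [tmin, tmax) seconds around each onset; tmin must be < 0
    so a pre-stimulus baseline exists. Returns baseline-corrected epochs."""
    pre, post = int(tmin * fs), int(tmax * fs)
    epochs = []
    for t in onsets:
        start, stop = t + pre, t + post
        if 0 <= start and stop <= len(signal):
            win = signal[start:stop]
            baseline = mean(win[:-pre])          # mean of the pre-stimulus samples
            epochs.append([v - baseline for v in win])
    return epochs

def erp(epochs):
    """Average across epochs, sample by sample."""
    return [mean(vals) for vals in zip(*epochs)]

# Toy example: a 'P300-like' deflection 0.3 s after each onset, fs = 100 Hz.
fs = 100
signal = [0.0] * 1000
onsets = [100, 400, 700]
for t in onsets:
    signal[t + 30] = 5.0                          # spike at +300 ms
eps = epoch(signal, onsets, fs, tmin=-0.1, tmax=0.5)
avg = erp(eps)
print(len(eps), max(avg))                         # prints: 3 5.0
```

The peak of `avg` falls at sample index 40, i.e. 300 ms post-onset given the 100 ms pre-stimulus window, mirroring where a P300 component would be measured.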

  7. Level of Intrauterine Cocaine Exposure and Neuropsychological Test Scores in Preadolescence: Subtle Effects on Auditory Attention and Narrative Memory

    Science.gov (United States)

    Beeghly, Marjorie; Rose-Jacobs, Ruth; Martin, Brett M.; Cabral, Howard J.; Heeren, Timothy C.; Frank, Deborah A.

    2014-01-01

    Neuropsychological processes such as attention and memory contribute to children's higher-level cognitive and language functioning and predict academic achievement. The goal of this analysis was to evaluate whether level of intrauterine cocaine exposure (IUCE) alters multiple aspects of preadolescents' neuropsychological functioning assessed using a single age-referenced instrument, the NEPSY: A Developmental Neuropsychological Assessment (NEPSY) [71], after controlling for relevant covariates. Participants included 137 term 9.5-year-old children from low-income urban backgrounds (51% male, 90% African American/Caribbean) from an ongoing prospective longitudinal study. Level of IUCE was assessed in the newborn period using infant meconium and maternal report. 52% of the children had IUCE (65% with lighter IUCE, and 35% with heavier IUCE), and 48% were unexposed. Infants with Fetal Alcohol Syndrome, HIV seropositivity, or intrauterine exposure to illicit substances other than cocaine and marijuana were excluded. At the 9.5-year follow-up visit, trained examiners masked to IUCE and background variables evaluated children's neuropsychological functioning using the NEPSY. The association between level of IUCE and NEPSY outcomes was evaluated in a series of linear regressions controlling for intrauterine exposure to other substances and relevant child, caregiver, and demographic variables. Results indicated that level of IUCE was associated with lower scores on the Auditory Attention and Narrative Memory tasks, both of which require auditory information processing and sustained attention for successful performance. However, results did not follow the expected ordinal, dose-dependent pattern. Children's neuropsychological test scores were also altered by a variety of other biological and psychosocial factors. PMID:24978115

  8. Further Evidence of Auditory Extinction in Aphasia

    Science.gov (United States)

    Marshall, Rebecca Shisler; Basilakos, Alexandra; Love-Myers, Kim

    2013-01-01

    Purpose: Preliminary research (Shisler, 2005) suggests that auditory extinction in individuals with aphasia (IWA) may be connected to binding and attention. In this study, the authors expanded on previous findings on auditory extinction to determine the source of extinction deficits in IWA. Method: Seventeen IWA (M_age = 53.19 years)…

  9. Active listening impairs visual perception and selectivity: an ERP study of auditory dual-task costs on visual attention.

    Science.gov (United States)

    Gherri, Elena; Eimer, Martin

    2011-04-01

    The ability to drive safely is disrupted by cell phone conversations, and this has been attributed to a diversion of attention from the visual environment. We employed behavioral and ERP measures to study whether the attentive processing of spoken messages is, in itself, sufficient to produce visual-attentional deficits. Participants searched for visual targets defined by a unique feature (Experiment 1) or feature conjunction (Experiment 2), and simultaneously listened to narrated text passages that had to be recalled later (encoding condition), or heard backward-played speech sounds that could be ignored (control condition). Responses to targets were slower in the encoding condition, and ERPs revealed that the visual processing of search arrays and the attentional selection of target stimuli were less efficient in the encoding relative to the control condition. Results demonstrate that the attentional processing of visual information is impaired when concurrent spoken messages are encoded and maintained, in line with cross-modal links in selective attention, but inconsistent with the view that attentional resources are modality-specific. The distraction of visual attention by active listening could contribute to the adverse effects of cell phone use on driving performance.

  10. Neuronal Effects of Auditory Distraction on Visual Attention

    Science.gov (United States)

    Smucny, Jason; Rojas, Donald C.; Eichman, Lindsay C.; Tregellas, Jason R.

    2013-01-01

    Selective attention in the presence of distraction is a key aspect of healthy cognition. The underlying neurobiological processes have not, however, been functionally well characterized. In the present study, we used functional magnetic resonance imaging to determine how ecologically relevant distracting noise affects cortical activity in 27…

  11. Auditory risk assessment of college music students in jazz band-based instructional activity

    Directory of Open Access Journals (Sweden)

    Kamakshi V Gopal

    2013-01-01

    Full Text Available It is well known that musicians are at risk for music-induced hearing loss; however, systematic evaluation of music exposure and its effects on the auditory system remains difficult. The purpose of the study was to determine whether college students in jazz band-based instructional activity are exposed to loud classroom noise and consequently exhibit acute but significant changes in basic auditory measures compared with non-music students in regular classroom sessions. For this we (1) measured and compared personal exposure levels of college students (n = 14) participating in a routine 50-min jazz ensemble-based instructional activity (experimental) to personal exposure levels of non-music students (n = 11) participating in a 50-min regular classroom activity (control), and (2) measured and compared pre- to post-exposure auditory changes associated with these two types of classroom exposures. Results showed that the Leq (equivalent continuous noise level) generated during the 50-min jazz ensemble-based instructional activity ranged from 95 dBA to 105.8 dBA with a mean of 99.5 ± 2.5 dBA. In the regular classroom, the Leq ranged from 46.4 dBA to 67.4 dBA with a mean of 49.9 ± 10.6 dBA. Additionally, significant differences were observed in pre- to post-exposure auditory measures between the two groups. The experimental group showed a significant temporary threshold shift bilaterally at 4000 Hz (P < 0.05) and a significant decrease in the amplitude of the transient-evoked otoacoustic emission response in both ears (P < 0.05) after exposure to the jazz ensemble-based instructional activity. No significant changes were found in the control group between pre- and post-exposure measures. This study quantified the noise exposure in jazz band-based practice sessions and its effects on basic auditory measures. Temporary, yet significant, auditory changes place music students at risk for hearing loss compared with their non-music cohorts.
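    The Leq reported above is an energy average, not an arithmetic mean of decibel readings: Leq = 10 · log10(mean(10^(L_i/10))) over the short-interval A-weighted levels L_i. A minimal sketch (the interval counts below are hypothetical, chosen only to echo the reported 95–105.8 dBA range) shows why loud moments dominate the result:

```python
# Equivalent continuous sound level (Leq) from short-interval SPL readings.
# Leq = 10 * log10( mean( 10^(L_i / 10) ) )  -- an energy average, so it is
# NOT the arithmetic mean of the dB values.
import math

def leq(levels_dba):
    energies = [10 ** (l / 10) for l in levels_dba]
    return 10 * math.log10(sum(energies) / len(energies))

# A constant 99.5 dBA session averages to 99.5 dBA...
print(round(leq([99.5] * 50), 1))                 # prints: 99.5
# ...but alternating 95 and 105 dBA intervals lands well above the 100 dBA
# midpoint, because the 105 dBA intervals carry ten times the energy:
print(round(leq([95.0] * 25 + [105.0] * 25), 1))  # prints: 102.4
```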

  12. Auditory hallucinations: A review of the ERC "VOICE" project.

    Science.gov (United States)

    Hugdahl, Kenneth

    2015-06-22

    In this invited review I provide a selective overview of recent research on brain mechanisms and cognitive processes involved in auditory hallucinations. The review is focused on research carried out in the "VOICE" ERC Advanced Grant Project, funded by the European Research Council, but I also review and discuss the literature in general. Auditory hallucinations are suggested to be perceptual phenomena, with a neuronal origin in the speech perception areas in the temporal lobe. The phenomenology of auditory hallucinations is conceptualized along three domains, or dimensions; a perceptual dimension, experienced as someone speaking to the patient; a cognitive dimension, experienced as an inability to inhibit, or ignore the voices, and an emotional dimension, experienced as the "voices" having primarily a negative, or sinister, emotional tone. I will review cognitive, imaging, and neurochemistry data related to these dimensions, primarily the first two. The reviewed data are summarized in a model that sees auditory hallucinations as initiated from temporal lobe neuronal hyper-activation that draws attentional focus inward, and which is not inhibited due to frontal lobe hypo-activation. It is further suggested that this is maintained through abnormal glutamate and possibly gamma-amino-butyric-acid transmitter mediation, which could point towards new pathways for pharmacological treatment. A final section discusses new methods of acquiring quantitative data on the phenomenology and subjective experience of auditory hallucination that goes beyond standard interview questionnaires, by suggesting an iPhone/iPod app.

  13. Amplitude-modulated stimuli reveal auditory-visual interactions in brain activity and brain connectivity

    Directory of Open Access Journals (Sweden)

    Mark eLaing

    2015-10-01

    Full Text Available The temporal congruence between auditory and visual signals coming from the same source can be a powerful means by which the brain integrates information from different senses. To investigate how the brain uses temporal information to integrate auditory and visual information from continuous yet unfamiliar stimuli, we used amplitude-modulated tones and size-modulated shapes with which we could manipulate the temporal congruence between the sensory signals. These signals were independently modulated at a slow or a fast rate. Participants were presented with auditory-only, visual-only, or auditory-visual (AV) trials in the scanner. On AV trials, the auditory and visual signals could have the same (AV congruent) or different (AV incongruent) modulation rates. Using psychophysiological interaction analyses, we found that auditory regions showed increased functional connectivity predominantly with frontal regions for AV incongruent relative to AV congruent stimuli. We further found that superior temporal regions, shown previously to integrate auditory and visual signals, showed increased connectivity with frontal and parietal regions for the same contrast. Our findings provide evidence that both activity in a network of brain regions and their connectivity are important for auditory-visual integration, and help to bridge the gap between transient and familiar AV stimuli used in previous studies.

  14. Cross-modal selective attention: on the difficulty of ignoring sounds at the locus of visual attention.

    Science.gov (United States)

    Spence, C; Ranson, J; Driver, J

    2000-02-01

    In three experiments, we investigated whether the ease with which distracting sounds can be ignored depends on their distance from fixation and from attended visual events. In the first experiment, participants shadowed an auditory stream of words presented behind their heads, while simultaneously fixating visual lip-read information consistent with the relevant auditory stream, or meaningless "chewing" lip movements. An irrelevant auditory stream of words, which participants had to ignore, was presented either from the same side as the fixated visual stream or from the opposite side. Selective shadowing was less accurate in the former condition, implying that distracting sounds are harder to ignore when fixated. Furthermore, the impairment when fixating toward distractor sounds was greater when speaking lips were fixated than when chewing lips were fixated, suggesting that people find it particularly difficult to ignore sounds at locations that are actively attended for visual lipreading rather than merely passively fixated. Experiments 2 and 3 tested whether these results are specific to cross-modal links in speech perception by replacing the visual lip movements with a rapidly changing stream of meaningless visual shapes. The auditory task was again shadowing, but the active visual task was now monitoring for a specific visual shape at one location. A decrement in shadowing was again observed when participants passively fixated toward the irrelevant auditory stream. This decrement was larger when participants performed a difficult active visual task there versus fixating, but not for a less demanding visual task versus fixation. The implications for cross-modal links in spatial attention are discussed.

  15. Selective attention in normal and impaired hearing.

    Science.gov (United States)

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  16. Functional Imaging of Human Vestibular Cortex Activity Elicited by Skull Tap and Auditory Tone Burst

    Science.gov (United States)

    Noohi, F.; Kinnaird, C.; Wood, S.; Bloomberg, J.; Mulavara, A.; Seidler, R.

    2016-01-01

    The current study characterizes brain activation in response to two modes of vestibular stimulation: skull tap and auditory tone burst. The auditory tone burst has been used in previous studies to elicit either the vestibulo-spinal reflex (saccular-mediated cervical Vestibular Evoked Myogenic Potential (cVEMP)) or the ocular muscle response (utricle-mediated ocular VEMP (oVEMP)). Some researchers have reported that air-conducted skull tap elicits both saccular- and utricle-mediated VEMPs, while being faster and less irritating for the subjects. However, it is not clear whether the skull tap and auditory tone burst elicit the same pattern of cortical activity. Both forms of stimulation target the otolith response, which provides a measurement of vestibular function independent from the semicircular canals. This is of high importance for studying otolith-specific deficits, including the gait and balance problems that astronauts experience upon returning to Earth. Previous imaging studies have documented activity in the anterior and posterior insula, superior temporal gyrus, inferior parietal lobule, inferior frontal gyrus, and the anterior cingulate cortex in response to different modes of vestibular stimulation. Here we hypothesized that skull taps elicit patterns of cortical activity similar to those of auditory tone bursts and of previous vestibular imaging studies. Subjects wore bilateral MR-compatible skull tappers and headphones inside the 3T GE scanner while lying in the supine position with eyes closed. Subjects received both forms of stimulation in a counterbalanced fashion. Pneumatically powered skull tappers were placed bilaterally on the cheekbones. The vibration of the cheekbone was transmitted to the vestibular system, resulting in the vestibular cortical response. Auditory tone bursts were also delivered for comparison. To validate our stimulation method, we measured the ocular VEMP outside of the scanner.
This measurement showed that both skull tap and auditory

  17. Activations in temporal areas using visual and auditory naming stimuli: A language fMRI study in temporal lobe epilepsy.

    Science.gov (United States)

    Gonzálvez, Gloria G; Trimmel, Karin; Haag, Anja; van Graan, Louis A; Koepp, Matthias J; Thompson, Pamela J; Duncan, John S

    2016-12-01

    Verbal fluency functional MRI (fMRI) is used for predicting language deficits after anterior temporal lobe resection (ATLR) for temporal lobe epilepsy (TLE), but primarily engages frontal lobe areas. In this observational study we investigated fMRI paradigms using visual and auditory stimuli, which predominantly involve language areas resected during ATLR. Twenty-three controls and 33 patients (20 left (LTLE), 13 right (RTLE)) were assessed using three fMRI paradigms: verbal fluency; auditory naming with a contrast of auditory reversed speech; picture naming with a contrast of scrambled pictures and blurred faces. Group analysis showed bilateral temporal activations for auditory naming and picture naming. Correcting for auditory and visual input (by subtracting activations resulting from auditory reversed speech and blurred pictures/scrambled faces, respectively) resulted in left-lateralised activations for patients and controls, which was more pronounced for LTLE compared to RTLE patients. Individual subject activations at a threshold of T>2.5, extent >10 voxels, showed that verbal fluency activated predominantly the left inferior frontal gyrus (IFG) in 90% of LTLE, 92% of RTLE, and 65% of controls, compared to right IFG activations in only 15% of LTLE and RTLE and 26% of controls. Middle temporal (MTG) or superior temporal gyrus (STG) activations were seen on the left in 30% of LTLE, 23% of RTLE, and 52% of controls, and on the right in 15% of LTLE, 15% of RTLE, and 35% of controls. Auditory naming activated temporal areas more frequently than did verbal fluency (LTLE: 93%/73%; RTLE: 92%/58%; controls: 82%/70% (left/right)). Controlling for auditory input resulted in predominantly left-sided temporal activations. Picture naming resulted in temporal lobe activations less frequently than did auditory naming (LTLE 65%/55%; RTLE 53%/46%; controls 52%/35% (left/right)). Controlling for visual input had left-lateralising effects. Auditory and picture naming activated

  18. Large-scale synchronized activity during vocal deviance detection in the zebra finch auditory forebrain.

    Science.gov (United States)

    Beckers, Gabriël J L; Gahr, Manfred

    2012-08-01

    Auditory systems bias responses to sounds that are unexpected on the basis of recent stimulus history, a phenomenon that has been widely studied using sequences of unmodulated tones (mismatch negativity; stimulus-specific adaptation). Such a paradigm, however, does not directly reflect problems that neural systems normally solve for adaptive behavior. We recorded multiunit responses in the caudomedial auditory forebrain of anesthetized zebra finches (Taeniopygia guttata) at 32 sites simultaneously, to contact calls that recur probabilistically at a rate that is used in communication. Neurons in secondary, but not primary, auditory areas respond preferentially to calls when they are unexpected (deviant) compared with the same calls when they are expected (standard). This response bias is predominantly due to sites more often not responding to standard events than to deviant events. When two call stimuli alternate between standard and deviant roles, most sites exhibit a response bias to deviant events of both stimuli. This suggests that biases are not based on a use-dependent decrease in response strength but involve a more complex mechanism that is sensitive to auditory deviance per se. Furthermore, between many secondary sites, responses are tightly synchronized, a phenomenon that is driven by internal neuronal interactions rather than by the timing of stimulus acoustic features. We hypothesize that this deviance-sensitive, internally synchronized network of neurons is involved in the involuntary capturing of attention by unexpected and behaviorally potentially relevant events in natural auditory scenes.

  19. The role of temporal coherence in auditory stream segregation

    DEFF Research Database (Denmark)

    Christiansen, Simon Krogholt

    The ability to perceptually segregate concurrent sound sources and focus one’s attention on a single source at a time is essential for the ability to use acoustic information. While perceptual experiments have determined a range of acoustic cues that help facilitate auditory stream segregation…, it is not clear how the auditory system realizes the task. This thesis presents a study of the mechanisms involved in auditory stream segregation. Through a combination of psychoacoustic experiments, designed to characterize the influence of acoustic cues on auditory stream formation, and computational models… of auditory processing, the role of auditory preprocessing and temporal coherence in auditory stream formation was evaluated. The computational model presented in this study assumes that auditory stream segregation occurs when sounds stimulate non-overlapping neural populations in a temporally incoherent…

  20. Auditory evoked fields to vocalization during passive listening and active generation in adults who stutter.

    Science.gov (United States)

    Beal, Deryk S; Cheyne, Douglas O; Gracco, Vincent L; Quraan, Maher A; Taylor, Margot J; De Nil, Luc F

    2010-10-01

    We used magnetoencephalography to investigate auditory evoked responses to speech vocalizations and non-speech tones in adults who do and do not stutter. Neuromagnetic field patterns were recorded as participants listened to a 1 kHz tone, playback of their own productions of the vowel /i/ and vowel-initial words, and actively generated the vowel /i/ and vowel-initial words. Activation of the auditory cortex at approximately 50 and 100 ms was observed during all tasks. A reduction in the peak amplitudes of the M50 and M100 components was observed during the active generation versus passive listening tasks dependent on the stimuli. Adults who stutter did not differ in the amount of speech-induced auditory suppression relative to fluent speakers. Adults who stutter had shorter M100 latencies for the actively generated speaking tasks in the right hemisphere relative to the left hemisphere but the fluent speakers showed similar latencies across hemispheres. During passive listening tasks, adults who stutter had longer M50 and M100 latencies than fluent speakers. The results suggest that there are timing, rather than amplitude, differences in auditory processing during speech in adults who stutter and are discussed in relation to hypotheses of auditory-motor integration breakdown in stuttering. Copyright 2010 Elsevier Inc. All rights reserved.

  1. Attention failures versus misplaced diligence: separating attention lapses from speed-accuracy trade-offs.

    Science.gov (United States)

    Seli, Paul; Cheyne, James Allan; Smilek, Daniel

    2012-03-01

    In two studies of a GO-NOGO task assessing sustained attention, we examined the effects of (1) altering speed-accuracy trade-offs through instructions (emphasizing both speed and accuracy or accuracy only) and (2) auditory alerts distributed throughout the task. Instructions emphasizing accuracy reduced errors and changed the distribution of GO trial RTs. Additionally, correlations between errors and increasing RTs produced a U-function; excessively fast and slow RTs accounted for much of the variance of errors. Contrary to previous reports, alerts increased errors and RT variability. The results suggest that (1) standard instructions for sustained attention tasks, emphasizing speed and accuracy equally, produce errors arising from attempts to conform to the misleading requirement for speed, which become conflated with attention-lapse produced errors and (2) auditory alerts have complex, and sometimes deleterious, effects on attention. We argue that instructions emphasizing accuracy provide a more precise assessment of attention lapses in sustained attention tasks. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Auditory Processing Assessment in Children with Attention Deficit Hyperactivity Disorder: An Open Study Examining Methylphenidate Effects.

    Science.gov (United States)

    Lanzetta-Valdo, Bianca Pinheiro; Oliveira, Giselle Alves de; Ferreira, Jane Tagarro Correa; Palacios, Ester Miyuki Nakamura

    2017-01-01

    Introduction: Children with Attention Deficit Hyperactivity Disorder (ADHD) can present with Auditory Processing (AP) Disorder. Objective: The study examined AP in ADHD children compared with non-ADHD children, and before and after 3 and 6 months of methylphenidate (MPH) treatment in the ADHD children. Methods: Drug-naive children diagnosed with ADHD combined subtype, aged 7 to 11 years, recruited from public and private outpatient services or public and private schools, and age- and gender-matched non-ADHD children, participated in an open, non-randomized study from February 2013 to December 2013. They were submitted to a behavioral battery of AP tests comprising Speech with White Noise (SN), Dichotic Digits (DD), and Pitch Pattern Sequence (PPS) and were compared with non-ADHD children. They were followed for 3 and 6 months of MPH treatment (0.5 mg/kg/day). Results: ADHD children presented a larger number of errors in the DD (p < 0.01), and fewer correct responses in the PPS (p < 0.0001) and SN (p < 0.05) tests, when compared with non-ADHD children. Treatment with MPH, especially over 6 months, significantly decreased the mean errors in the DD (p < 0.01) and increased the correct responses in the PPS (p < 0.001) and SN (p < 0.01) tests when compared with performance before MPH treatment. Conclusions: ADHD children show inefficient AP on a selected behavioral auditory battery, suggesting impairments in auditory closure, binaural integration, and temporal ordering. Treatment with MPH gradually improved these deficiencies and completely reversed them, with performance reaching that of non-ADHD children at 6 months of treatment.

  3. Assessing attentional systems in children with Attention Deficit Hyperactivity Disorder.

    Science.gov (United States)

    Casagrande, Maria; Martella, Diana; Ruggiero, Maria Cleonice; Maccari, Lisa; Paloscia, Claudio; Rosa, Caterina; Pasini, Augusto

    2012-01-01

    The aim of this study was to evaluate the efficiency and interactions of attentional systems in children with Attention Deficit Hyperactivity Disorder (ADHD) by considering the effects of reinforcement and auditory warning on each component of attention. Thirty-six drug-naïve children (18 children with ADHD/18 typically developing children) performed two revised versions of the Attentional Network Test, which assess the efficiency of alerting, orienting, and executive systems. In feedback trials, children received feedback about their accuracy, whereas in the no-feedback trials, feedback was not given. In both conditions, children with ADHD performed more slowly than did typically developing children. They also showed impairments in the ability to disengage attention and in executive functioning, which improved when alertness was increased by administering the auditory warning. The performance of the attentional networks appeared to be modulated by the absence or the presence of reinforcement. We suggest that the observed executive system deficit in children with ADHD could depend on their low level of arousal rather than being an independent disorder. © The Author 2011. Published by Oxford University Press. All rights reserved.

  4. Level of intrauterine cocaine exposure and neuropsychological test scores in preadolescence: subtle effects on auditory attention and narrative memory.

    Science.gov (United States)

    Beeghly, Marjorie; Rose-Jacobs, Ruth; Martin, Brett M; Cabral, Howard J; Heeren, Timothy C; Frank, Deborah A

    2014-01-01

    Neuropsychological processes such as attention and memory contribute to children's higher-level cognitive and language functioning and predict academic achievement. The goal of this analysis was to evaluate whether level of intrauterine cocaine exposure (IUCE) alters multiple aspects of preadolescents' neuropsychological functioning assessed using a single age-referenced instrument, the NEPSY: A Developmental Neuropsychological Assessment (NEPSY) (Korkman et al., 1998), after controlling for relevant covariates. Participants included 137 term 9.5-year-old children from low-income urban backgrounds (51% male, 90% African American/Caribbean) from an ongoing prospective longitudinal study. Level of IUCE was assessed in the newborn period using infant meconium and maternal report. 52% of the children had IUCE (65% with lighter IUCE, and 35% with heavier IUCE), and 48% were unexposed. Infants with Fetal Alcohol Syndrome, HIV seropositivity, or intrauterine exposure to illicit substances other than cocaine and marijuana were excluded. At the 9.5-year follow-up visit, trained examiners masked to IUCE and background variables evaluated children's neuropsychological functioning using the NEPSY. The association between level of IUCE and NEPSY outcomes was evaluated in a series of linear regressions controlling for intrauterine exposure to other substances and relevant child, caregiver, and demographic variables. Results indicated that level of IUCE was associated with lower scores on the Auditory Attention and Narrative Memory tasks, both of which require auditory information processing and sustained attention for successful performance. However, results did not follow the expected ordinal, dose-dependent pattern. Children's neuropsychological test scores were also altered by a variety of other biological and psychosocial factors. Copyright © 2014 Elsevier Inc. All rights reserved.
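    The adjusted analysis described above regresses each NEPSY outcome on exposure level while holding covariates fixed; the exposure coefficient is then the association "controlling for" those variables. A minimal ordinary-least-squares sketch (variable names and toy data are invented; a real analysis would use a statistics package such as statsmodels) might look like:

```python
# OLS sketch of "outcome ~ exposure + covariate": solve (X'X) b = X'y by
# Gaussian elimination with partial pivoting. Rows of X are observations;
# the first column is the intercept.
def ols(X, y):
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]                        # X'X
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]  # X'y
    for col in range(k):                           # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * p for a, p in zip(A[r], A[col])]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in reversed(range(k)):                   # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Toy noise-free data: score = 100 - 2*exposure + 0.5*age.
rows = [[1.0, e, a] for e in (0.0, 1.0, 2.0) for a in (8.0, 9.5, 11.0)]
y = [100 - 2 * r[1] + 0.5 * r[2] for r in rows]
print([round(v, 3) for v in ols(rows, y)])         # prints: [100.0, -2.0, 0.5]
```

With noise-free data the fit recovers the generating coefficients exactly; the middle coefficient (-2.0) plays the role of the exposure effect adjusted for the covariate.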

  5. Contributions from eye movement potentials to stimulus preceding negativity during anticipation of auditory stimulation

    DEFF Research Database (Denmark)

    Engdahl, Lis; Bjerre, Vicky K; Christoffersen, Gert R J

    2007-01-01

    Cognitive anticipation of a stimulus has been associated with an ERP called "stimulus preceding negativity" (SPN). A new auditory delay task without stimulus-related motor activity demonstrated a prefrontal SPN, present during attentive anticipation of sounds with closed eyes, but absent during d...

  6. Listening to polyphonic music recruits domain-general attention and working memory circuits.

    Science.gov (United States)

    Janata, Petr; Tillmann, Barbara; Bharucha, Jamshed J

    2002-06-01

    Polyphonic music combines multiple auditory streams to create complex auditory scenes, thus providing a tool for investigating the neural mechanisms that orient attention in natural auditory contexts. Across two fMRI experiments, we varied stimuli and task demands in order to identify the cortical areas that are activated during attentive listening to real music. In individual experiments and in a conjunction analysis of the two experiments, we found bilateral blood oxygen level dependent (BOLD) signal increases in temporal (the superior temporal gyrus), parietal (the intraparietal sulcus), and frontal (the precentral sulcus, the inferior frontal sulcus and gyrus, and the frontal operculum) areas during selective and global listening, as compared with passive rest without musical stimulation. Direct comparisons of the listening conditions showed significant differences between attending to single timbres (instruments) and attending across multiple instruments, although the patterns that were observed depended on the relative demands of the tasks being compared. The overall pattern of BOLD signal increases indicated that attentive listening to music recruits neural circuits underlying multiple forms of working memory, attention, semantic processing, target detection, and motor imagery. Thus, attentive listening to music appears to be enabled by areas that serve general functions, rather than by music-specific cortical modules.

  7. Age effects on preattentive and early attentive auditory processing of redundant stimuli: is sensory gating affected by physiological aging?

    Science.gov (United States)

    Gmehlin, Dennis; Kreisel, Stefan H; Bachmann, Silke; Weisbrod, Matthias; Thomas, Christine

    2011-10-01

    The frontal hypothesis of aging predicts an age-related decline in cognitive functions requiring inhibitory or attentional regulation. In Alzheimer's disease, preattentive gating out of redundant information is impaired. Our study aimed to examine changes associated with physiological aging in both pre- and early attentive inhibition of recurrent acoustic information. Using a passive double-click paradigm, we recorded mid-latency (P30-P50) and late-latency (N100 and P200) evoked potentials in healthy young (26 ± 5 years) and healthy elderly subjects (72 ± 5 years). Physiological aging did not affect auditory gating in amplitude measures. Both age groups exhibited clear inhibition in preattentive P50 and attention-modulated (N100) components, whereas P30 was not attenuated. Irrespective of age, the magnitude of inhibition differed significantly, being most pronounced for N100 gating. Inhibition of redundant information seems to be preserved with physiological aging. Early attentive N100 gating showed the maximum effect. Further studies are warranted to evaluate sensory gating as a suitable biomarker of underlying neurodegenerative disease.

  8. Acute stress alters auditory selective attention in humans independent of HPA: a study of evoked potentials.

    Directory of Open Access Journals (Sweden)

    Ludger Elling

    Full Text Available BACKGROUND: Acute stress is a stereotypical, but multimodal response to a present or imminent challenge overcharging an organism. Among the different branches of this multimodal response, the consequences of glucocorticoid secretion have been extensively investigated, mostly in connection with long-term memory (LTM. However, stress responses comprise other endocrine signaling and altered neuronal activity wholly independent of pituitary regulation. To date, knowledge of the impact of such "paracorticoidal" stress responses on higher cognitive functions is scarce. We investigated the impact of an ecological stressor on the ability to direct selective attention using event-related potentials in humans. Based on research in rodents, we assumed that a stress-induced imbalance of catecholaminergic transmission would impair this ability. METHODOLOGY/PRINCIPAL FINDINGS: The stressor consisted of a single cold pressor test. Auditory negative difference (Nd and mismatch negativity (MMN were recorded in a tonal dichotic listening task. A time series of such tasks confirmed an increased distractibility occurring 4-7 minutes after onset of the stressor as reflected by an attenuated Nd. Salivary cortisol began to rise 8-11 minutes after onset when no further modulations in the event-related potentials (ERP occurred, thus precluding a causal relationship. This effect may be attributed to a stress-induced activation of mesofrontal dopaminergic projections. It may also be attributed to an activation of noradrenergic projections. Known characteristics of the modulation of ERP by different stress-related ligands were used for further disambiguation of causality. The conjuncture of an attenuated Nd and an increased MMN might be interpreted as indicating a dopaminergic influence. The selective effect on the late portion of the Nd provides another tentative clue for this. 
CONCLUSIONS/SIGNIFICANCE: Prior studies have deliberately tracked the adrenocortical influence

  9. Localized brain activation related to the strength of auditory learning in a parrot.

    Directory of Open Access Journals (Sweden)

    Hiroko Eda-Fujiwara

Parrots and songbirds learn their vocalizations from a conspecific tutor, much like human infants acquire spoken language. Parrots can learn human words and it has been suggested that they can use them to communicate with humans. The caudomedial pallium in the parrot brain is homologous with that of songbirds, and analogous to the human auditory association cortex, involved in speech processing. Here we investigated neuronal activation, measured as expression of the protein product of the immediate early gene ZENK, in relation to auditory learning in the budgerigar (Melopsittacus undulatus), a parrot. Budgerigar males successfully learned to discriminate two Japanese words spoken by another male conspecific. Re-exposure to the two discriminanda led to increased neuronal activation in the caudomedial pallium, but not in the hippocampus, compared to untrained birds that were exposed to the same words, or were not exposed to words. Neuronal activation in the caudomedial pallium of the experimental birds was correlated significantly and positively with the percentage of correct responses in the discrimination task. These results suggest that in a parrot, the caudomedial pallium is involved in auditory learning. Thus, in parrots, songbirds and humans, analogous brain regions may contain the neural substrate for auditory learning and memory.

  10. Amplitude-modulated stimuli reveal auditory-visual interactions in brain activity and brain connectivity.

    Science.gov (United States)

    Laing, Mark; Rees, Adrian; Vuong, Quoc C

    2015-01-01

    The temporal congruence between auditory and visual signals coming from the same source can be a powerful means by which the brain integrates information from different senses. To investigate how the brain uses temporal information to integrate auditory and visual information from continuous yet unfamiliar stimuli, we used amplitude-modulated tones and size-modulated shapes with which we could manipulate the temporal congruence between the sensory signals. These signals were independently modulated at a slow or a fast rate. Participants were presented with auditory-only, visual-only, or auditory-visual (AV) trials in the fMRI scanner. On AV trials, the auditory and visual signal could have the same (AV congruent) or different modulation rates (AV incongruent). Using psychophysiological interaction analyses, we found that auditory regions showed increased functional connectivity predominantly with frontal regions for AV incongruent relative to AV congruent stimuli. We further found that superior temporal regions, shown previously to integrate auditory and visual signals, showed increased connectivity with frontal and parietal regions for the same contrast. Our findings provide evidence that both activity in a network of brain regions and their connectivity are important for AV integration, and help to bridge the gap between transient and familiar AV stimuli used in previous studies.
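The psychophysiological interaction (PPI) analysis mentioned above tests whether the coupling between a seed region and the rest of the brain changes with task condition. A minimal sketch of the core idea, with illustrative variable names and simplified (non-deconvolved) signals rather than the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 200

# Physiological variable: BOLD time course from an auditory seed region.
seed = rng.standard_normal(n_scans)

# Psychological variable: +1 for AV-incongruent blocks, -1 for AV-congruent.
task = np.where(np.arange(n_scans) % 40 < 20, 1.0, -1.0)

# PPI regressor: elementwise product of the (mean-centered) seed signal
# and the task contrast.
ppi = (seed - seed.mean()) * task

# Design matrix: intercept, task, seed, and the interaction term. A voxel
# loads on `ppi` when its coupling with the seed differs by condition.
X = np.column_stack([np.ones(n_scans), task, seed, ppi])

# Simulate a target voxel whose connectivity with the seed is stronger in
# the incongruent condition, then recover the effect with least squares.
y = 0.5 * seed + 0.8 * ppi + rng.standard_normal(n_scans) * 0.1
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # interaction beta (last entry) recovers ~0.8
```

In real fMRI analyses the interaction is formed at the neural level after hemodynamic deconvolution and the regressors are convolved with a hemodynamic response function; the sketch keeps only the logic of the interaction term.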

  11. Patterns of language and auditory dysfunction in 6-year-old children with epilepsy.

    Science.gov (United States)

    Selassie, Gunilla Rejnö-Habte; Olsson, Ingrid; Jennische, Margareta

    2009-01-01

    In a previous study we reported difficulty with expressive language and visuoperceptual ability in preschool children with epilepsy and otherwise normal development. The present study analysed speech and language dysfunction for each individual in relation to epilepsy variables, ear preference, and intelligence in these children and described their auditory function. Twenty 6-year-old children with epilepsy (14 females, 6 males; mean age 6:5 y, range 6 y-6 y 11 mo) and 30 reference children without epilepsy (18 females, 12 males; mean age 6:5 y, range 6 y-6 y 11 mo) were assessed for language and auditory ability. Low scores for the children with epilepsy were analysed with respect to speech-language domains, type of epilepsy, site of epileptiform activity, intelligence, and language laterality. Auditory attention, perception, discrimination, and ear preference were measured with a dichotic listening test, and group comparisons were performed. Children with left-sided partial epilepsy had extensive language dysfunction. Most children with partial epilepsy had phonological dysfunction. Language dysfunction was also found in children with generalized and unclassified epilepsies. The children with epilepsy performed significantly worse than the reference children in auditory attention, perception of vowels and discrimination of consonants for the right ear and had more left ear advantage for vowels, indicating undeveloped language laterality.

  12. Adaptation in the auditory system: an overview

    Directory of Open Access Journals (Sweden)

    David ePérez-González

    2014-02-01

The early stages of the auditory system need to preserve the timing information of sounds in order to extract the basic features of acoustic stimuli. At the same time, different processes of neuronal adaptation occur at several levels to further process the auditory information. For instance, auditory nerve fiber responses already experience adaptation of their firing rates, a type of response that can be found in many other auditory nuclei and may be useful for emphasizing the onset of the stimuli. However, it is at higher levels in the auditory hierarchy where more sophisticated types of neuronal processing take place. One example is stimulus-specific adaptation, in which neurons adapt to frequent, repetitive stimuli but maintain their responsiveness to stimuli with different physical characteristics, a distinct kind of processing that may play a role in change and deviance detection. In the auditory cortex, adaptation takes more elaborate forms and contributes to the processing of complex sequences, auditory scene analysis, and attention. Here we review the multiple types of adaptation that occur in the auditory system, which are part of the pool of resources that the neurons employ to process the auditory scene, and are critical to a proper understanding of the neuronal mechanisms that govern auditory perception.
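The stimulus-specific adaptation described above can be captured by a toy model in which each frequency channel carries its own adaptation state, so a rare deviant escapes the adaptation built up by a frequent standard. All parameters below are illustrative, not fitted to any dataset:

```python
import numpy as np

def run_oddball(sequence, tau=5.0, depression=0.4):
    """Toy stimulus-specific adaptation: per-frequency adaptation states
    that deepen with each presentation and recover exponentially."""
    adapt = {}                      # adaptation state per frequency (Hz)
    responses = []
    for freq in sequence:
        a = adapt.get(freq, 0.0)
        r = 1.0 - a                 # response shrinks with adaptation
        responses.append(r)
        # all channels recover toward zero; the stimulated one adapts further
        adapt = {f: v * np.exp(-1.0 / tau) for f, v in adapt.items()}
        adapt[freq] = adapt.get(freq, 0.0) + depression * r
    return np.array(responses)

# 90% standards at 1 kHz with occasional 2 kHz deviants
seq = [1000] * 9 + [2000] + [1000] * 9 + [2000]
resp = run_oddball(seq)
print(resp.round(2))
# The deviants (positions 9 and 19) evoke larger responses than the
# adapted standards immediately preceding them.
```

The model reproduces the qualitative signature of SSA: standard responses decay toward a steady state while the deviant, arriving in an unadapted channel, responds near full strength.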

  13. The modality effect of ego depletion: Auditory task modality reduces ego depletion.

    Science.gov (United States)

    Li, Qiong; Wang, Zhenhong

    2016-08-01

An initial act of self-control that impairs subsequent acts of self-control is called ego depletion. The ego depletion phenomenon has been observed consistently. The modality effect refers to the effect of the presentation modality on the processing of stimuli. The modality effect has also been found robustly in a large body of research. However, no study to date has examined the modality effects of ego depletion. This issue was addressed in the current study. In Experiment 1, after all participants completed a handgrip task, participants in one group completed a visual attention regulation task while those in the other group completed an auditory attention regulation task; all participants then completed the handgrip task again. The ego depletion phenomenon was observed in both the visual and the auditory attention regulation task. Moreover, participants who completed the visual task performed worse on the handgrip task than participants who completed the auditory task, indicating greater ego depletion in the visual task condition. In Experiment 2, participants completed an initial task that either did or did not deplete self-control resources, and then they completed a second visual or auditory attention control task. The results indicated that depleted participants performed better on the auditory attention control task than the visual attention control task. These findings suggest that altering task modality may reduce ego depletion. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  14. Auditory short-term memory activation during score reading.

    Science.gov (United States)

    Simoens, Veerle L; Tervaniemi, Mari

    2013-01-01

Performing music on the basis of reading a score requires reading ahead of what is being played in order to anticipate the necessary actions to produce the notes. Score reading thus not only involves the decoding of a visual score and the comparison to the auditory feedback, but also short-term storage of the musical information due to the delay of the auditory feedback during reading ahead. This study investigates the mechanisms of encoding of musical information in short-term memory during such a complex procedure. There were three parts in this study. First, professional musicians participated in an electroencephalographic (EEG) experiment to study the slow wave potentials during a time interval of short-term memory storage in a situation that requires cross-modal translation and short-term storage of visual material to be compared with delayed auditory material, as is the case in music score reading. This delayed visual-to-auditory matching task was compared with delayed visual-visual and auditory-auditory matching tasks in terms of EEG topography and voltage amplitudes. Second, an additional behavioural experiment was performed to determine which type of distractor would be the most interfering with the score reading-like task. Third, the self-reported strategies of the participants were also analyzed. All three parts of this study converge on the same conclusion: during music score reading, the musician most likely first translates the visual score into an auditory cue, probably starting around 700 or 1300 ms, ready for storage and delayed comparison with the auditory feedback.

  15. Dividing time: Concurrent timing of auditory and visual events by young and elderly adults

    OpenAIRE

    McAuley, J. Devin; Miller, Jonathan P.; Wang, Mo; Pang, Kevin C. H.

    2010-01-01

This article examines age differences in individuals' ability to produce the durations of learned auditory and visual target events either in isolation (focused attention) or concurrently (divided attention). Young adults produced learned target durations equally well in focused and divided attention conditions. Older adults in contrast showed an age-related increase in timing variability in divided attention conditions that tended to be more pronounced for visual targets than for auditory ta...

  16. Beneficial auditory and cognitive effects of auditory brainstem implantation in children.

    Science.gov (United States)

    Colletti, Liliana

    2007-09-01

This preliminary study demonstrates the development of hearing ability and shows that there is a significant improvement in some cognitive parameters related to selective visual/spatial attention and to fluid or multisensory reasoning, in children fitted with auditory brainstem implantation (ABI). The improvement in cognitive parameters is due to several factors, among which there is certainly, as demonstrated in the literature on cochlear implants (CIs), the activation of the auditory sensory canal, which was previously absent. The findings of the present study indicate that children with cochlear or cochlear nerve abnormalities with associated cognitive deficits should not be excluded from ABI implantation. The indications for ABI have been extended over the last 10 years to adults with non-tumoral (NT) cochlear or cochlear nerve abnormalities that cannot benefit from CI. We demonstrated that the ABI with surface electrodes may provide sufficient stimulation of the central auditory system in adults for open set speech recognition. These favourable results motivated us to extend ABI indications to children with profound hearing loss who were not candidates for a CI. This study investigated the performances of young deaf children undergoing ABI, in terms of their auditory perceptual development and their non-verbal cognitive abilities. In our department from 2000 to 2006, 24 children aged 14 months to 16 years received an ABI for different tumour and non-tumour diseases. Two children had NF2 tumours. Eighteen children had bilateral cochlear nerve aplasia. In this group, nine children had associated cochlear malformations, two had unilateral facial nerve agenesia and two had combined microtia, aural atresia and middle ear malformations. Four of these children had previously been fitted elsewhere with a CI with no auditory results. One child had bilateral incomplete cochlear partition (type II); one child, who had previously been fitted unsuccessfully elsewhere

  17. Concentrated pitch discrimination modulates auditory brainstem responses during contralateral noise exposure.

    Science.gov (United States)

    Ikeda, Kazunari; Sekiguchi, Takahiro; Hayashi, Akiko

    2010-03-31

This study examined the notion that auditory discrimination is a requisite for attention-related modulation of the auditory brainstem response (ABR) during contralateral noise exposure. While the right ear was exposed continuously to white noise at an intensity of 60-80 dB sound pressure level, tone pips at 80 dB sound pressure level were delivered to the left ear through either single-stimulus or oddball procedures. Participants performed a reading task (ignore condition) or counted target tones (attend condition) during stimulation. The oddball but not the single-stimulus procedure elicited task-related modulations in both early (ABR) and late (processing negativity) event-related potentials simultaneously. The elicitation of the attention-related ABR modulation during contralateral noise exposure is thus considered to require auditory discrimination and appears to be corticofugal in nature.

  18. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Directory of Open Access Journals (Sweden)

    Veema Lodhia

    2014-02-01

Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN) in the auditory event-related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG) was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically-developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400). These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  19. Auditory verbal hallucinations predominantly activate the right inferior frontal area

    NARCIS (Netherlands)

    Sommer, Iris E. C.; Diederen, Kelly M. J.; Blom, Jan-Dirk; Willems, Anne; Kushan, Leila; Slotema, Karin; Boks, Marco P. M.; Daalman, Kirstin; Hoek, Hans W.; Neggers, Sebastiaan F. W.; Kahn, Rene S.

    2008-01-01

    The pathophysiology of auditory verbal hallucinations (AVH) is largely unknown. Several functional imaging studies have measured cerebral activation during these hallucinations, but sample sizes were relatively small (one to eight subjects) and findings inconsistent. In this study cerebral

  20. Do informal musical activities shape auditory skill development in preschool-age children?

    OpenAIRE

    Putkinen, Vesa; Saarikivi, Katri; Tervaniemi, Mari

    2013-01-01

    The influence of formal musical training on auditory cognition has been well established. For the majority of children, however, musical experience does not primarily consist of adult-guided training on a musical instrument. Instead, young children mostly engage in everyday musical activities such as singing and musical play. Here, we review recent electrophysiological and behavioral studies carried out in our laboratory and elsewhere which have begun to map how developing auditory skills are...

  1. What determines auditory distraction? On the roles of local auditory changes and expectation violations.

    Directory of Open Access Journals (Sweden)

    Jan P Röer

Both the acoustic variability of a distractor sequence and the degree to which it violates expectations are important determinants of auditory distraction. In four experiments we examined the relative contribution of local auditory changes on the one hand and expectation violations on the other hand in the disruption of serial recall by irrelevant sound. We present evidence for a greater disruption by auditory sequences ending in unexpected steady state distractor repetitions compared to auditory sequences with expected changing state endings even though the former contained fewer local changes. This effect was demonstrated with piano melodies (Experiment 1) and speech distractors (Experiment 2). Furthermore, it was replicated when the expectation violation occurred after the encoding of the target items (Experiment 3), indicating that the items' maintenance in short-term memory was disrupted by attentional capture and not their encoding. This seems to be primarily due to the violation of a model of the specific auditory distractor sequences because the effect vanishes and even reverses when the experiment provides no opportunity to build up a specific neural model about the distractor sequence (Experiment 4). Nevertheless, the violation of abstract long-term knowledge about auditory regularities seems to cause a small and transient capture effect: Disruption decreased markedly over the course of the experiments indicating that participants habituated to the unexpected distractor repetitions across trials. The overall pattern of results adds to the growing literature that the degree to which auditory distractors violate situation-specific expectations is a more important determinant of auditory distraction than the degree to which a distractor sequence contains local auditory changes.

  2. Early Stages of Melody Processing: Stimulus-Sequence and Task-Dependent Neuronal Activity in Monkey Auditory Cortical Fields A1 and R

    Science.gov (United States)

    Yin, Pingbo; Mishkin, Mortimer; Sutter, Mitchell; Fritz, Jonathan B.

    2008-01-01

    To explore the effects of acoustic and behavioral context on neuronal responses in the core of auditory cortex (fields A1 and R), two monkeys were trained on a go/no-go discrimination task in which they learned to respond selectively to a four-note target (S+) melody and withhold response to a variety of other nontarget (S−) sounds. We analyzed evoked activity from 683 units in A1/R of the trained monkeys during task performance and from 125 units in A1/R of two naive monkeys. We characterized two broad classes of neural activity that were modulated by task performance. Class I consisted of tone-sequence–sensitive enhancement and suppression responses. Enhanced or suppressed responses to specific tonal components of the S+ melody were frequently observed in trained monkeys, but enhanced responses were rarely seen in naive monkeys. Both facilitatory and suppressive responses in the trained monkeys showed a temporal pattern different from that observed in naive monkeys. Class II consisted of nonacoustic activity, characterized by a task-related component that correlated with bar release, the behavioral response leading to reward. We observed a significantly higher percentage of both Class I and Class II neurons in field R than in A1. Class I responses may help encode a long-term representation of the behaviorally salient target melody. Class II activity may reflect a variety of nonacoustic influences, such as attention, reward expectancy, somatosensory inputs, and/or motor set and may help link auditory perception and behavioral response. Both types of neuronal activity are likely to contribute to the performance of the auditory task. PMID:18842950

  3. What we expect is not always what we get: evidence for both the direction-of-change and the specific-stimulus hypotheses of auditory attentional capture.

    Science.gov (United States)

    Nöstl, Anatole; Marsh, John E; Sörqvist, Patrik

    2014-01-01

    Participants were requested to respond to a sequence of visual targets while listening to a well-known lullaby. One of the notes in the lullaby was occasionally exchanged with a pattern deviant. Experiment 1 found that deviants capture attention as a function of the pitch difference between the deviant and the replaced/expected tone. However, when the pitch difference between the expected tone and the deviant tone is held constant, a violation to the direction-of-pitch change across tones can also capture attention (Experiment 2). Moreover, in more complex auditory environments, wherein it is difficult to build a coherent neural model of the sound environment from which expectations are formed, deviations can capture attention but it appears to matter less whether this is a violation from a specific stimulus or a violation of the current direction-of-change (Experiment 3). The results support the expectation violation account of auditory distraction and suggest that there are at least two different expectations that can be violated: One appears to be bound to a specific stimulus and the other would seem to be bound to a more global cross-stimulus rule such as the direction-of-change based on a sequence of preceding sound events. Factors like base-rate probability of tones within the sound environment might become the driving mechanism of attentional capture--rather than violated expectations--in complex sound environments.

  4. Click-Evoked Auditory Efferent Activity: Rate and Level Effects.

    Science.gov (United States)

    Boothalingam, Sriram; Kurke, Julianne; Dhar, Sumitrajit

    2018-05-07

    There currently are no standardized protocols to evaluate auditory efferent function in humans. Typical tests use broadband noise to activate the efferents, but only test the contralateral efferent pathway, risk activating the middle ear muscle reflex (MEMR), and are laborious for clinical use. In an attempt to develop a clinical test of bilateral auditory efferent function, we have designed a method that uses clicks to evoke efferent activity, obtain click-evoked otoacoustic emissions (CEOAEs), and monitor MEMR. This allows for near-simultaneous estimation of cochlear and efferent function. In the present study, we manipulated click level (60, 70, and 80 dB peak-equivalent sound pressure level [peSPL]) and rate (40, 50, and 62.5 Hz) to identify an optimal rate-level combination that evokes measurable efferent modulation of CEOAEs. Our findings (n = 58) demonstrate that almost all click levels and rates used caused significant inhibition of CEOAEs, with a significant interaction between level and rate effects. Predictably, bilateral activation produced greater inhibition compared to stimulating the efferents only in the ipsilateral or contralateral ear. In examining the click rate-level effects during bilateral activation in greater detail, we observed a 1-dB inhibition of CEOAE level for each 10-dB increase in click level, with rate held constant at 62.5 Hz. Similarly, a 10-Hz increase in rate produced a 0.74-dB reduction in CEOAE level, with click level held constant at 80 dB peSPL. The effect size (Cohen's d) was small for either monaural condition and medium for bilateral, faster-rate, and higher-level conditions. We were also able to reliably extract CEOAEs from efferent eliciting clicks. We conclude that clicks can indeed be profitably employed to simultaneously evaluate cochlear health using CEOAEs as well as their efferent modulation. Furthermore, using bilateral clicks allows the evaluation of both the crossed and uncrossed elements of the auditory
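The rate and level slopes reported above (about 1 dB of CEOAE inhibition per 10 dB of click level at a fixed 62.5 Hz rate, and about 0.74 dB per 10 Hz of rate at a fixed 80 dB peSPL) can be combined into a back-of-envelope linear predictor. Treating the two slopes as independent and linear across the tested range is a simplification of my own, not the authors' model; the study reports a significant level-by-rate interaction:

```python
# Illustrative linear extrapolation of the bilateral efferent effects:
# slopes taken from the abstract, independence and linearity assumed.
LEVEL_SLOPE_DB_PER_DB = 1.0 / 10.0    # dB inhibition per dB of click level
RATE_SLOPE_DB_PER_HZ = 0.74 / 10.0    # dB inhibition per Hz of click rate

def extra_inhibition(level_db, rate_hz, ref_level=60.0, ref_rate=40.0):
    """Predicted additional CEOAE inhibition (dB) relative to the lowest
    tested level/rate combination, assuming independent linear slopes."""
    return (LEVEL_SLOPE_DB_PER_DB * (level_db - ref_level)
            + RATE_SLOPE_DB_PER_HZ * (rate_hz - ref_rate))

# Moving from 60 dB peSPL / 40 Hz to 80 dB peSPL / 62.5 Hz predicts
# roughly 2.0 dB (level) + 1.7 dB (rate) of additional inhibition.
print(round(extra_inhibition(80, 62.5), 2))
```

Such a rough predictor is only useful for sanity-checking effect sizes across the tested conditions; the interaction reported in the study means the true surface is not additive.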

  5. For Better or Worse: The Effect of Prismatic Adaptation on Auditory Neglect

    Directory of Open Access Journals (Sweden)

    Isabel Tissieres

    2017-01-01

Patients with auditory neglect attend less to auditory stimuli on their left and/or make systematic directional errors when indicating sound positions. Rightward prismatic adaptation (R-PA) has repeatedly been shown to alleviate symptoms of visuospatial neglect and, in one study, to partially restore the spatial bias in dichotic listening. It is currently unknown whether R-PA affects only this ear-related symptom or also other aspects of auditory neglect. We have investigated the effect of R-PA on left ear extinction in dichotic listening, space-related inattention assessed by diotic listening, and directional errors in auditory localization in patients with auditory neglect. The most striking effect of R-PA was the alleviation of left ear extinction in dichotic listening, which occurred in half of the patients with initial deficit. In contrast to nonresponders, their lesions spared the right dorsal attentional system and posterior temporal cortex. The beneficial effect of R-PA on an ear-related performance contrasted with detrimental effects on diotic listening and auditory localization. The former can be parsimoniously explained by the SHD-VAS model (shift in hemispheric dominance within the ventral attentional system; Clarke and Crottaz-Herbette 2016), which is based on the R-PA-induced shift of the right-dominant ventral attentional system to the left hemisphere. The negative effects in space-related tasks may be due to the complex nature of auditory space encoding at a cortical level.

  6. A Brain System for Auditory Working Memory.

    Science.gov (United States)

    Kumar, Sukhbinder; Joseph, Sabine; Gander, Phillip E; Barascud, Nicolas; Halpern, Andrea R; Griffiths, Timothy D

    2016-04-20

    The brain basis for auditory working memory, the process of actively maintaining sounds in memory over short periods of time, is controversial. Using functional magnetic resonance imaging in human participants, we demonstrate that the maintenance of single tones in memory is associated with activation in auditory cortex. In addition, sustained activation was observed in hippocampus and inferior frontal gyrus. Multivoxel pattern analysis showed that patterns of activity in auditory cortex and left inferior frontal gyrus distinguished the tone that was maintained in memory. Functional connectivity during maintenance was demonstrated between auditory cortex and both the hippocampus and inferior frontal cortex. The data support a system for auditory working memory based on the maintenance of sound-specific representations in auditory cortex by projections from higher-order areas, including the hippocampus and frontal cortex. In this work, we demonstrate a system for maintaining sound in working memory based on activity in auditory cortex, hippocampus, and frontal cortex, and functional connectivity among them. Specifically, our work makes three advances from the previous work. First, we robustly demonstrate hippocampal involvement in all phases of auditory working memory (encoding, maintenance, and retrieval): the role of hippocampus in working memory is controversial. Second, using a pattern classification technique, we show that activity in the auditory cortex and inferior frontal gyrus is specific to the maintained tones in working memory. Third, we show long-range connectivity of auditory cortex to hippocampus and frontal cortex, which may be responsible for keeping such representations active during working memory maintenance. Copyright © 2016 Kumar et al.

  7. Manipulation of Auditory Inputs as Rehabilitation Therapy for Maladaptive Auditory Cortical Reorganization

    Directory of Open Access Journals (Sweden)

    Hidehiko Okamoto

    2018-01-01

Full Text Available Neurophysiological and neuroimaging data suggest that the brains of not only children but also adults are reorganized based on sensory inputs and behaviors. Plastic changes in the brain are generally beneficial; however, maladaptive cortical reorganization in the auditory cortex may lead to hearing disorders such as tinnitus and hyperacusis. Recent studies have attempted to noninvasively visualize pathological neural activity in the living human brain and to reverse maladaptive cortical reorganization by the suitable manipulation of auditory inputs in order to alleviate detrimental auditory symptoms. The effects of the manipulation of auditory inputs on the maladaptively reorganized brain are reviewed herein. The findings indicate that rehabilitation therapy based on the manipulation of auditory inputs is an effective and safe approach for hearing disorders. The appropriate manipulation of sensory inputs, guided by the visualization of pathological brain activities using recent neuroimaging techniques, may contribute to the establishment of new clinical applications for affected individuals.

  8. A habilidade de atenção auditiva sustentada em crianças com fissura labiopalatina e transtorno fonológico Sustained auditory attention ability in children with cleft lip and palate and phonological disorders

    Directory of Open Access Journals (Sweden)

    Tâmyne Ferreira Duarte de Moraes

    2011-12-01

Full Text Available PURPOSE: To verify the sustained auditory attention ability of children with cleft lip and palate and phonological disorder, comparing their performance with that of children with cleft lip and palate without phonological disorder. METHODS: Seventeen children aged 6 to 11 years, with repaired unilateral complete cleft lip and palate and no auditory complaints or hearing alterations, were divided into two groups: GI (with phonological disorder) and GII (without phonological disorder). Audiometry and tympanometry were performed to detect hearing alterations. Phonological assessment used the following instruments: Teste de Linguagem Infantil and Consciência Fonológica: Instrumento de Avaliação Sequencial. Sustained auditory attention was evaluated with the Sustained Auditory Attention Ability Test. RESULTS: Of the seven children with phonological disorder (41%), two (29%) showed altered results on the Sustained Auditory Attention Ability Test. Results on this test did not differ between children with cleft lip and palate with and without phonological disorder. CONCLUSION: The sustained auditory attention ability of children with cleft lip and palate and phonological disorder does not differ from that of children with cleft lip and palate without phonological disorder.

  9. Temporal envelope processing in the human auditory cortex: response and interconnections of auditory cortical areas.

    Science.gov (United States)

    Gourévitch, Boris; Le Bouquin Jeannès, Régine; Faucon, Gérard; Liégeois-Chauvel, Catherine

    2008-03-01

Temporal envelope processing in the human auditory cortex plays an important role in language analysis. In this paper, depth recordings of local field potentials in response to amplitude-modulated white noises were used to design maps of activation in primary, secondary and associative auditory areas and to study the propagation of cortical activity between them. The comparison of activations between auditory areas was based on a signal-to-noise ratio associated with the response to amplitude modulation (AM). The functional connectivity between cortical areas was quantified by directed coherence (DCOH) applied to auditory evoked potentials. This study shows the following reproducible results in twenty subjects: (1) The primary auditory cortex (PAC), the secondary cortices (secondary auditory cortex (SAC) and planum temporale (PT)), the insular gyrus, Brodmann area (BA) 22 and the posterior part of the T1 gyrus (T1Post) respond to AM in both hemispheres. (2) A stronger response to AM was observed in SAC and T1Post of the left hemisphere independent of the modulation frequency (MF), and in the left BA 22 for MFs of 8 and 16 Hz, compared to the right hemisphere. (3) The activation and propagation features emphasized at least four different types of temporal processing. (4) A sequential activation of the PAC, SAC and BA 22 areas was clearly visible at all MFs, while other auditory areas may be more involved in parallel processing of a stream originating from the primary auditory area, which thus acts as a distribution hub. These results suggest that different psychological information is carried by the temporal envelope of sounds relative to the rate of amplitude modulation.
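Stimuli of the kind used in such studies (amplitude-modulated white noise) are straightforward to synthesize. A minimal sketch, assuming a 16 kHz sample rate and 100% modulation depth (both arbitrary choices, not taken from the paper):

```python
import math
import random

def am_noise(mod_freq_hz, dur_s=1.0, fs=16000, depth=1.0, seed=0):
    """White noise multiplied by a raised-sine envelope at mod_freq_hz.
    depth=1.0 corresponds to 100% amplitude modulation."""
    rng = random.Random(seed)
    samples = []
    for i in range(int(dur_s * fs)):
        t = i / fs
        envelope = 1.0 + depth * math.sin(2 * math.pi * mod_freq_hz * t)
        samples.append(envelope * rng.gauss(0.0, 1.0))
    return samples

stim = am_noise(mod_freq_hz=8)  # 8 Hz AM, one of the MFs studied here
```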

  10. Arousal and attention re-orienting in autism spectrum disorders: evidence from auditory event-related potentials

    Directory of Open Access Journals (Sweden)

    Elena V Orekhova

    2014-02-01

Full Text Available The extended phenotype of autism spectrum disorders (ASD) includes a combination of arousal regulation problems, sensory modulation difficulties, and an attention re-orienting deficit. Slow and inefficient re-orienting to stimuli that appear outside of the attended sensory stream is thought to be especially detrimental for social functioning. Event-related potentials (ERPs) and magnetic fields (ERFs) may help to reveal which processing stages underlying the brain response to unattended but salient sensory events are affected in individuals with ASD. Previous research focusing on two sequential stages of the brain response (automatic detection of physical changes in the auditory stream, indexed by mismatch negativity (MMN), and evaluation of stimulus novelty, indexed by the P3a component) found in individuals with ASD either increased, decreased, or normal processing of deviance and novelty. This review examines these apparently conflicting results, notes gaps in previous findings, and suggests a potentially unifying hypothesis relating the dampened responses to unattended sensory events to a deficit in a rapid arousal process. Specifically, 'sensory gating' studies focused on pre-attentive arousal consistently demonstrated that the brain response to unattended and temporally novel sound in ASD is already affected at around 100 ms after stimulus onset. We hypothesize that abnormalities in nicotinic cholinergic arousal pathways, previously reported in individuals with ASD, may contribute to these ERP/ERF aberrations and result in an attention re-orienting deficit. Such cholinergic dysfunction may be present in individuals with ASD early in life and can influence both sensory processing and attention re-orienting behavior. Identification of early neurophysiological biomarkers for cholinergic deficit would help to detect infants at risk who could potentially benefit from particular types of therapies or interventions.

  11. Auditory risk assessment of college music students in jazz band-based instructional activity.

    Science.gov (United States)

    Gopal, Kamakshi V; Chesky, Kris; Beschoner, Elizabeth A; Nelson, Paul D; Stewart, Bradley J

    2013-01-01

It is well-known that musicians are at risk for music-induced hearing loss; however, systematic evaluation of music exposure and its effects on the auditory system remains difficult. The purpose of the study was to determine whether college students in jazz band-based instructional activity are exposed to loud classroom noise and consequently exhibit acute but significant changes in basic auditory measures compared to non-music students in regular classroom sessions. For this we (1) measured and compared personal exposure levels of college students (n = 14) participating in a routine 50-min jazz ensemble-based instructional activity (experimental) to personal exposure levels of non-music students (n = 11) participating in a 50-min regular classroom activity (control), and (2) measured and compared pre- to post-exposure auditory changes associated with these two types of classroom exposures. Results showed that the Leq (equivalent continuous noise level) generated during the 50-min jazz ensemble-based instructional activity ranged from 95 dBA to 105.8 dBA with a mean of 99.5 ± 2.5 dBA. In the regular classroom, the Leq ranged from 46.4 dBA to 67.4 dBA with a mean of 49.9 ± 10.6 dBA. Additionally, significant differences were observed in pre- to post-exposure auditory measures between the two groups. The experimental group showed a significant temporary threshold shift bilaterally at 4000 Hz (P music students place them at risk for hearing loss compared to their non-music cohorts.
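The Leq values above average sound levels on an energy basis, so the loudest moments dominate. As a reminder of the standard computation (general acoustics, not specific to this study), Leq over equal-duration level samples is 10·log10 of the mean of 10^(Li/10):

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level (dB) of equal-duration samples,
    computed by energy averaging."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

# A session split evenly between 95 and 105 dBA halves is dominated
# by the louder half (result is ~102.4 dB, not the arithmetic mean 100):
combined = leq([95.0, 105.0])
```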

  12. Gender effect on pre-attentive change detection in major depressive disorder patients revealed by auditory MMN.

    Science.gov (United States)

    Qiao, Zhengxue; Yang, Aiying; Qiu, Xiaohui; Yang, Xiuxian; Zhang, Congpei; Zhu, Xiongzhao; He, Jincai; Wang, Lin; Bai, Bing; Sun, Hailian; Zhao, Lun; Yang, Yanjie

    2015-10-30

    Gender differences in rates of major depressive disorder (MDD) are well established, but gender differences in cognitive function have been little studied. Auditory mismatch negativity (MMN) was used to investigate gender differences in pre-attentive information processing in first episode MDD. In the deviant-standard reverse oddball paradigm, duration auditory MMN was obtained in 30 patients (15 males) and 30 age-/education-matched controls. Over frontal-central areas, mean amplitude of increment MMN (to a 150-ms deviant tone) was smaller in female than male patients; there was no sex difference in decrement MMN (to a 50-ms deviant tone). Neither increment nor decrement MMN differed between female and male patients over temporal areas. Frontal-central MMN and temporal MMN did not differ between male and female controls in any condition. Over frontal-central areas, mean amplitude of increment MMN was smaller in female patients than female controls; there was no difference in decrement MMN. Neither increment nor decrement MMN differed between female patients and female controls over temporal areas. Frontal-central MMN and temporal MMN did not differ between male patients and male controls. Mean amplitude of increment MMN in female patients did not correlate with symptoms, suggesting this sex-specific deficit is a trait- not a state-dependent phenomenon. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. The auditory oddball paradigm revised to improve bedside detection of consciousness in behaviorally unresponsive patients.

    Science.gov (United States)

    Morlet, Dominique; Ruby, Perrine; André-Obadia, Nathalie; Fischer, Catherine

    2017-11-01

Active paradigms requiring subjects to engage in a mental task on request have been developed to detect consciousness in behaviorally unresponsive patients. Using auditory ERPs, the active condition consists of orienting the patient's attention toward oddball stimuli. In comparison with passive listening, a larger P300 in the active condition identifies voluntary processes. However, the contrast between these two conditions is usually too weak to be detected at the individual level. To improve test sensitivity, we propose as a control condition to actively divert the subject's attention from the auditory stimuli with a mental imagery task that has been demonstrated to be within the grasp of the targeted patients: navigating in one's home. Twenty healthy subjects were presented with a two-tone oddball paradigm in the following three conditions: (a) passive listening, (b) mental imagery, and (c) silent counting of deviant stimuli. Mental imagery proved more efficient than passive listening at lessening the P300 response to deviant tones as compared with the active counting condition. An effect of attention manipulation (oriented vs. diverted) was observed in 19/20 subjects, of whom 18 showed the expected P300 effect and 1 showed an effect restricted to the N2 component. The only subject showing no effect also proved to have engaged insufficiently in the tasks. Our study demonstrated the efficiency of diverting attention using mental imagery to improve the sensitivity of the active oddball paradigm. Using recorded instructions and requiring a small number of electrodes, the test was designed to be conveniently and economically used at the patient's bedside. © 2017 Society for Psychophysiological Research.

  14. Auditory Pattern Memory and Group Signal Detection

    National Research Council Canada - National Science Library

    Sorkin, Robert

    1997-01-01

    .... The experiments with temporally-coded auditory patterns showed how listeners' attention is influenced by the position and the amount of information carried by different segments of the pattern...

  15. Supramodal Executive Control of Attention

    Directory of Open Access Journals (Sweden)

Alfredo Spagna

    2015-02-01

Full Text Available The human attentional system can be subdivided into three functional networks of alerting, orienting, and executive control. Although these networks have been extensively studied in the visuospatial modality, whether the same mechanisms are deployed across different sensory modalities remains unclear. In this study we used the attention network test in the visuospatial modality, in addition to two auditory variants with spatial and frequency manipulations, to examine cross-modal correlations between network functions. Results showed that across the visual and auditory tasks the effects of executive control, but not those of alerting and orienting, were significantly correlated. These findings suggest that while alerting and orienting functions rely more upon modality-specific processes, the executive control of attention coordinates complex behavior via supramodal mechanisms.
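The attention network test derives its three network effects as reaction-time contrasts between cue and flanker conditions. A sketch of the conventional scoring (the contrast definitions are the standard ANT ones; the example RT values below are invented):

```python
def ant_scores(rt_no_cue, rt_double_cue, rt_center_cue, rt_spatial_cue,
               rt_incongruent, rt_congruent):
    """Mean-RT contrasts (ms) for the three attentional networks:
    alerting  = no-cue RT minus double-cue RT,
    orienting = center-cue RT minus spatial-cue RT,
    executive = incongruent-flanker RT minus congruent-flanker RT."""
    return {
        "alerting": rt_no_cue - rt_double_cue,
        "orienting": rt_center_cue - rt_spatial_cue,
        "executive": rt_incongruent - rt_congruent,
    }

# Hypothetical subject: larger executive effect than alerting/orienting
scores = ant_scores(560, 520, 540, 500, 610, 530)
```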

  16. Evidence for cue-independent spatial representation in the human auditory cortex during active listening.

    Science.gov (United States)

    Higgins, Nathan C; McLaughlin, Susan A; Rinne, Teemu; Stecker, G Christopher

    2017-09-05

Few auditory functions are as important or as universal as the capacity for auditory spatial awareness (e.g., sound localization). That ability relies on sensitivity to acoustical cues, particularly interaural time and level differences (ITD and ILD), that correlate with sound-source locations. Under nonspatial listening conditions, cortical sensitivity to ITD and ILD takes the form of broad contralaterally dominated response functions. It is unknown, however, whether that sensitivity reflects representations of the specific physical cues or a higher-order representation of auditory space (i.e., integrated cue processing), nor is it known whether responses to spatial cues are modulated by active spatial listening. To investigate, sensitivity to parametrically varied ITD or ILD cues was measured using fMRI during spatial and nonspatial listening tasks. Task type varied across blocks where targets were presented in one of three dimensions: auditory location, pitch, or visual brightness. Task effects were localized primarily to lateral posterior superior temporal gyrus (pSTG) and modulated binaural-cue response functions differently in the two hemispheres. Active spatial listening (location tasks) enhanced both contralateral and ipsilateral responses in the right hemisphere but maintained or enhanced contralateral dominance in the left hemisphere. Two observations suggest integrated processing of ITD and ILD. First, overlapping regions in medial pSTG exhibited significant sensitivity to both cues. Second, successful classification of multivoxel patterns was observed for both cue types and, critically, for cross-cue classification. Together, these results suggest a higher-order representation of auditory space in the human auditory cortex that at least partly integrates the specific underlying cues.

  17. Acute physical exercise affected processing efficiency in an auditory attention task more than processing effectiveness.

    Science.gov (United States)

    Dutke, Stephan; Jaitner, Thomas; Berse, Timo; Barenberg, Jonathan

    2014-02-01

    Research on effects of acute physical exercise on performance in a concurrent cognitive task has generated equivocal evidence. Processing efficiency theory predicts that concurrent physical exercise can increase resource requirements for sustaining cognitive performance even when the level of performance is unaffected. This hypothesis was tested in a dual-task experiment. Sixty young adults worked on a primary auditory attention task and a secondary interval production task while cycling on a bicycle ergometer. Physical load (cycling) and cognitive load of the primary task were manipulated. Neither physical nor cognitive load affected primary task performance, but both factors interacted on secondary task performance. Sustaining primary task performance under increased physical and/or cognitive load increased resource consumption as indicated by decreased secondary task performance. Results demonstrated that physical exercise effects on cognition might be underestimated when only single task performance is the focus.

  18. Functional studies of the human auditory cortex, auditory memory and musical hallucinations

    International Nuclear Information System (INIS)

    Goycoolea, Marcos; Mena, Ismael; Neubauer, Sonia

    2004-01-01

Objectives. 1. To determine which areas of the cerebral cortex are activated by stimulating the left ear with pure tones, and what type of stimulation (e.g., excitatory or inhibitory) occurs in these different areas. 2. To use this information as an initial step toward developing a normal functional database for future studies. 3. To try to determine whether there is a biological substrate to the process of recalling previous auditory perceptions and, if possible, suggest a locus for auditory memory. Method. Brain perfusion single photon emission computerized tomography (SPECT) evaluation was conducted: (1-2) using auditory stimulation with pure tones in 4 volunteers with normal hearing; (3) in a patient with bilateral profound hearing loss who had auditory perception of previous musical experiences, injected with Tc99m HMPAO while she was having the sensation of hearing a well-known melody. Results. Both in the patient with auditory hallucinations and in the normal controls stimulated with pure tones, there was a statistically significant increase in perfusion in Brodmann's area 39, more intense on the right side (right to left p < 0.05). With lesser intensity there was activation in the adjacent area 40, and there was also intense activation in the executive frontal cortex areas 6, 8, 9, and 10 of Brodmann. There was also activation of area 7 of Brodmann, an audio-visual association area, more marked on the right side in the patient and the normal stimulated controls. In the subcortical structures there was marked activation in the patient with hallucinations in both lentiform nuclei, thalamus and caudate nuclei, again more intense in the right hemisphere (5, 4.7 and 4.2 SD above the mean, respectively, and 5, 3.3, and 3 SD above the normal mean in the left hemisphere, respectively). Similar findings were observed in normal controls. Conclusions.
After auditory stimulation with pure tones in the left ear of normal female volunteers, there is bilateral activation of area 39

  19. The roles of superficial amygdala and auditory cortex in music-evoked fear and joy.

    Science.gov (United States)

    Koelsch, Stefan; Skouras, Stavros; Fritz, Thomas; Herrera, Perfecto; Bonhage, Corinna; Küssner, Mats B; Jacobs, Arthur M

    2013-11-01

    This study investigates neural correlates of music-evoked fear and joy with fMRI. Studies on neural correlates of music-evoked fear are scant, and there are only a few studies on neural correlates of joy in general. Eighteen individuals listened to excerpts of fear-evoking, joy-evoking, as well as neutral music and rated their own emotional state in terms of valence, arousal, fear, and joy. Results show that BOLD signal intensity increased during joy, and decreased during fear (compared to the neutral condition) in bilateral auditory cortex (AC) and bilateral superficial amygdala (SF). In the right primary somatosensory cortex (area 3b) BOLD signals increased during exposure to fear-evoking music. While emotion-specific activity in AC increased with increasing duration of each trial, SF responded phasically in the beginning of the stimulus, and then SF activity declined. Psychophysiological Interaction (PPI) analysis revealed extensive emotion-specific functional connectivity of AC with insula, cingulate cortex, as well as with visual, and parietal attentional structures. These findings show that the auditory cortex functions as a central hub of an affective-attentional network that is more extensive than previously believed. PPI analyses also showed functional connectivity of SF with AC during the joy condition, taken to reflect that SF is sensitive to social signals with positive valence. During fear music, SF showed functional connectivity with visual cortex and area 7 of the superior parietal lobule, taken to reflect increased visual alertness and an involuntary shift of attention during the perception of auditory signals of danger. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Attention and multisensory integration of emotions in schizophrenia

    Directory of Open Access Journals (Sweden)

Mikhail Zvyagintsev

    2013-10-01

    Full Text Available The impairment of multisensory integration in schizophrenia is often explained by deficits of attentional selection. Emotion perception, however, does not always depend on attention because affective stimuli can capture attention automatically. In our study, we specify the role of attention in the multisensory perception of emotional stimuli in schizophrenia. We evaluated attention by interference between conflicting auditory and visual information in two multisensory paradigms in patients with schizophrenia and healthy participants. In the first paradigm, interference occurred between physical features of the dynamic auditory and visual stimuli. In the second paradigm, interference occurred between the emotional content of the auditory and visual stimuli, namely fearful and sad emotions. In patients with schizophrenia, the interference effect was observed in both paradigms. In contrast, in healthy participants, the interference occurred in the emotional paradigm only. These findings indicate that the information leakage between different modalities in patients with schizophrenia occurs at the perceptual level, which is intact in healthy participants. However, healthy participants can have problems with the separation of fearful and sad emotions similar to those of patients with schizophrenia.

  1. What we expect is not always what we get: evidence for both the direction-of-change and the specific-stimulus hypotheses of auditory attentional capture.

    Directory of Open Access Journals (Sweden)

    Anatole Nöstl

Full Text Available Participants were requested to respond to a sequence of visual targets while listening to a well-known lullaby. One of the notes in the lullaby was occasionally exchanged with a pattern deviant. Experiment 1 found that deviants capture attention as a function of the pitch difference between the deviant and the replaced/expected tone. However, when the pitch difference between the expected tone and the deviant tone is held constant, a violation of the direction-of-pitch change across tones can also capture attention (Experiment 2). Moreover, in more complex auditory environments, wherein it is difficult to build a coherent neural model of the sound environment from which expectations are formed, deviations can capture attention, but it appears to matter less whether this is a violation from a specific stimulus or a violation of the current direction-of-change (Experiment 3). The results support the expectation-violation account of auditory distraction and suggest that there are at least two different expectations that can be violated: one appears to be bound to a specific stimulus and the other to a more global cross-stimulus rule, such as the direction-of-change based on a sequence of preceding sound events. Factors like the base-rate probability of tones within the sound environment may become the driving mechanism of attentional capture, rather than violated expectations, in complex sound environments.

  2. Dividing time: concurrent timing of auditory and visual events by young and elderly adults.

    Science.gov (United States)

    McAuley, J Devin; Miller, Jonathan P; Wang, Mo; Pang, Kevin C H

    2010-07-01

This article examines age differences in individuals' ability to produce the durations of learned auditory and visual target events either in isolation (focused attention) or concurrently (divided attention). Young adults produced learned target durations equally well in focused and divided attention conditions. Older adults, in contrast, showed an age-related increase in timing variability in divided attention conditions that tended to be more pronounced for visual targets than for auditory targets. Age-related impairments were associated with a decrease in working memory span; moreover, the relationship between working memory and timing performance was largest for visual targets in divided attention conditions.

  3. Binaural auditory beats affect long-term memory.

    Science.gov (United States)

    Garcia-Argibay, Miguel; Santed, Miguel A; Reales, José M

    2017-12-08

    The presentation of two pure tones to each ear separately with a slight difference in their frequency results in the perception of a single tone that fluctuates in amplitude at a frequency that equals the difference of interaural frequencies. This perceptual phenomenon is known as binaural auditory beats, and it is thought to entrain electrocortical activity and enhance cognition functions such as attention and memory. The aim of this study was to determine the effect of binaural auditory beats on long-term memory. Participants (n = 32) were kept blind to the goal of the study and performed both the free recall and recognition tasks after being exposed to binaural auditory beats, either in the beta (20 Hz) or theta (5 Hz) frequency bands and white noise as a control condition. Exposure to beta-frequency binaural beats yielded a greater proportion of correctly recalled words and a higher sensitivity index d' in recognition tasks, while theta-frequency binaural-beat presentation lessened the number of correctly remembered words and the sensitivity index. On the other hand, we could not find differences in the conditional probability for recall given recognition between beta and theta frequencies and white noise, suggesting that the observed changes in recognition were due to the recollection component. These findings indicate that the presentation of binaural auditory beats can affect long-term memory both positively and negatively, depending on the frequency used.
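The perceived beat frequency described above is simply the interaural frequency difference. A minimal sketch of how such stimuli are typically constructed (the carrier frequency and sample rate below are arbitrary illustrative choices, not taken from the study):

```python
import math

def binaural_beat(carrier_hz, beat_hz, dur_s=1.0, fs=8000):
    """Return (left, right) pure tones whose frequencies differ by beat_hz;
    presented dichotically, they yield a perceived beat at beat_hz."""
    n = int(dur_s * fs)
    left = [math.sin(2 * math.pi * carrier_hz * i / fs) for i in range(n)]
    right = [math.sin(2 * math.pi * (carrier_hz + beat_hz) * i / fs)
             for i in range(n)]
    return left, right

# Beta condition: 20 Hz interaural difference (carrier chosen arbitrarily)
left, right = binaural_beat(240, 20)
```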

  4. Lateralization of functional magnetic resonance imaging (fMRI) activation in the auditory pathway of patients with lateralized tinnitus

    Energy Technology Data Exchange (ETDEWEB)

    Smits, Marion [Erasmus MC - University Medical Center Rotterdam, Department of Radiology, Hs 224, Rotterdam (Netherlands); Kovacs, Silvia; Peeters, Ronald R; Hecke, Paul van; Sunaert, Stefan [University Hospitals of the Catholic University Leuven, Department of Radiology, Leuven (Belgium); Ridder, Dirk de [University of Antwerp, Department of Neurosurgery, Edegem (Belgium)

    2007-08-15

Tinnitus is hypothesized to be an auditory phantom phenomenon resulting from spontaneous neuronal activity somewhere along the auditory pathway. We performed fMRI of the entire auditory pathway, including the inferior colliculus (IC), the medial geniculate body (MGB) and the auditory cortex (AC), in 42 patients with tinnitus and 10 healthy volunteers to assess lateralization of fMRI activation. Subjects were scanned on a 3T MRI scanner. A T2*-weighted EPI silent gap sequence was used during the stimulation paradigm, which consisted of a blocked design of 12 epochs in which music, presented binaurally through headphones, was switched on and off for periods of 50 s. Using SPM2 software, single-subject and group statistical parametric maps were calculated. Lateralization of activation was assessed qualitatively and quantitatively. Tinnitus was lateralized in 35 patients (83%, 13 right-sided and 22 left-sided). Significant signal change (P corrected < 0.05) was found bilaterally in the primary and secondary AC, the IC and the MGB. Signal change was symmetrical in patients with bilateral tinnitus. In patients with lateralized tinnitus, fMRI activation was lateralized towards the side of perceived tinnitus in the primary AC and IC in patients with right-sided tinnitus, and in the MGB in patients with left-sided tinnitus. In healthy volunteers, activation in the primary AC was left-lateralized. Our paradigm adequately visualized the auditory pathways in tinnitus patients. In lateralized tinnitus fMRI activation was also lateralized, supporting the hypothesis that tinnitus is an auditory phantom phenomenon. (orig.)

  5. Lateralization of functional magnetic resonance imaging (fMRI) activation in the auditory pathway of patients with lateralized tinnitus

    International Nuclear Information System (INIS)

    Smits, Marion; Kovacs, Silvia; Peeters, Ronald R.; Hecke, Paul van; Sunaert, Stefan; Ridder, Dirk de

    2007-01-01

Tinnitus is hypothesized to be an auditory phantom phenomenon resulting from spontaneous neuronal activity somewhere along the auditory pathway. We performed fMRI of the entire auditory pathway, including the inferior colliculus (IC), the medial geniculate body (MGB) and the auditory cortex (AC), in 42 patients with tinnitus and 10 healthy volunteers to assess lateralization of fMRI activation. Subjects were scanned on a 3T MRI scanner. A T2*-weighted EPI silent gap sequence was used during the stimulation paradigm, which consisted of a blocked design of 12 epochs in which music was presented binaurally through headphones and switched on and off for periods of 50 s. Using SPM2 software, single-subject and group statistical parametric maps were calculated. Lateralization of activation was assessed qualitatively and quantitatively. Tinnitus was lateralized in 35 patients (83%; 13 right-sided and 22 left-sided). Significant signal change (P(corrected) < 0.05) was found bilaterally in the primary and secondary AC, the IC and the MGB. Signal change was symmetrical in patients with bilateral tinnitus. In patients with lateralized tinnitus, fMRI activation was lateralized towards the side of perceived tinnitus in the primary AC and IC in patients with right-sided tinnitus, and in the MGB in patients with left-sided tinnitus. In healthy volunteers, activation in the primary AC was left-lateralized. Our paradigm adequately visualized the auditory pathways in tinnitus patients. In lateralized tinnitus, fMRI activation was also lateralized, supporting the hypothesis that tinnitus is an auditory phantom phenomenon. (orig.)

  6. Auditory distraction and serial memory: The avoidable and the ineluctable

    Directory of Open Access Journals (Sweden)

    Dylan M Jones

    2010-01-01

One mental activity that is very vulnerable to auditory distraction is serial recall. This review of the contemporary findings relating to serial recall charts the key determinants of distraction. It is evident that there is one form of distraction that is a joint product of the cognitive characteristics of the task and of the obligatory cognitive processing of the sound. For sequences of sound, distraction appears to be an ineluctable product of similarity-of-process, specifically, the serial order processing of the visually presented items and the serial order coding that is the by-product of the streaming of the sound. However, recently emerging work shows that the distraction from a single sound (one deviating from a prevailing sequence) results in attentional capture and is qualitatively distinct from that of a sequence in being restricted in its action to encoding, not to rehearsal of list members. Capture is also sensitive to the sensory task load, suggesting that it is subject to top-down control and therefore avoidable. These two forms of distraction - conflict of process and attentional capture - may be two consequences of auditory perceptual organization processes that serve to strike the optimal balance between attentional selectivity and distractibility.

  7. Visually Evoked Visual-Auditory Changes Associated with Auditory Performance in Children with Cochlear Implants

    Directory of Open Access Journals (Sweden)

    Maojin Liang

    2017-10-01

Activation of the auditory cortex by visual stimuli has been reported in deaf children. In cochlear implant (CI) patients, a residual, more intense cortical activation in the frontotemporal areas in response to photo stimuli was found to be positively associated with poor auditory performance. Our study aimed to investigate the mechanism by which visual processing in CI users activates the auditory-associated cortex during the period after cochlear implantation, as well as its relation to CI outcomes. Twenty prelingually deaf children with CI were recruited. Ten children were good CI performers (GCP) and ten were poor (PCP). Ten age- and sex-matched normal-hearing children were recruited as controls, and visual evoked potentials (VEPs) were recorded. The characteristics of the right frontotemporal N1 component were analyzed. In the prelingually deaf children, higher N1 amplitude was observed compared to normal controls. The GCP group showed significant decreases in N1 amplitude, and source analysis showed that the most significant decrease in brain activity occurred in the primary visual cortex (PVC), with a downward trend in primary auditory cortex (PAC) activity; neither change occurred in the PCP group. Meanwhile, higher PVC activation (compared to controls) before CI use (0M) and a significant decrease in source energy after CI use were found to be related to good CI outcomes. In the GCP group, source energy decreased in the visual-auditory cortex with CI use. However, no significant cerebral hemispheric dominance was found. We propose that intra- or cross-modal reorganization and higher PVC activation in prelingually deaf children may reflect a stronger potential for cortical plasticity. Brain activity evolution appears to be related to CI auditory outcomes.

  8. Visually Evoked Visual-Auditory Changes Associated with Auditory Performance in Children with Cochlear Implants.

    Science.gov (United States)

    Liang, Maojin; Zhang, Junpeng; Liu, Jiahao; Chen, Yuebo; Cai, Yuexin; Wang, Xianjun; Wang, Junbo; Zhang, Xueyuan; Chen, Suijun; Li, Xianghui; Chen, Ling; Zheng, Yiqing

    2017-01-01

Activation of the auditory cortex by visual stimuli has been reported in deaf children. In cochlear implant (CI) patients, a residual, more intense cortical activation in the frontotemporal areas in response to photo stimuli was found to be positively associated with poor auditory performance. Our study aimed to investigate the mechanism by which visual processing in CI users activates the auditory-associated cortex during the period after cochlear implantation, as well as its relation to CI outcomes. Twenty prelingually deaf children with CI were recruited. Ten children were good CI performers (GCP) and ten were poor (PCP). Ten age- and sex-matched normal-hearing children were recruited as controls, and visual evoked potentials (VEPs) were recorded. The characteristics of the right frontotemporal N1 component were analyzed. In the prelingually deaf children, higher N1 amplitude was observed compared to normal controls. The GCP group showed significant decreases in N1 amplitude, and source analysis showed that the most significant decrease in brain activity occurred in the primary visual cortex (PVC), with a downward trend in primary auditory cortex (PAC) activity; neither change occurred in the PCP group. Meanwhile, higher PVC activation (compared to controls) before CI use (0M) and a significant decrease in source energy after CI use were found to be related to good CI outcomes. In the GCP group, source energy decreased in the visual-auditory cortex with CI use. However, no significant cerebral hemispheric dominance was found. We propose that intra- or cross-modal reorganization and higher PVC activation in prelingually deaf children may reflect a stronger potential for cortical plasticity. Brain activity evolution appears to be related to CI auditory outcomes.

  9. Neural biomarkers for dyslexia, ADHD and ADD in the auditory cortex of children

    OpenAIRE

    Bettina Serrallach; Christine Gross; Valdis Bernhofs; Dorte Engelmann; Jan Benner; Jan Benner; Nadine Gündert; Maria Blatow; Martina Wengenroth; Angelika Seitz; Monika Brunner; Stefan Seither; Stefan Seither; Richard Parncutt; Peter Schneider

    2016-01-01

Dyslexia, attention deficit hyperactivity disorder (ADHD), and attention deficit disorder (ADD) show distinct clinical profiles that may include auditory and language-related impairments. Currently, an objective brain-based diagnosis of these developmental disorders is still unavailable. We investigated the neuro-auditory systems of dyslexic, ADHD, ADD, and age-matched control children (N=147) using neuroimaging, magnetoencephalography and psychoacoustics. All disorder subgroups exhibited an ...

  10. The role of modality : Auditory and visual distractors in Stroop interference

    NARCIS (Netherlands)

    Elliott, Emily M.; Morey, Candice C.; Morey, Richard D.; Eaves, Sharon D.; Shelton, Jill Talley; Lutfi-Proctor, Danielle A.

    2014-01-01

    As a commonly used measure of selective attention, it is important to understand the factors contributing to interference in the Stroop task. The current research examined distracting stimuli in the auditory and visual modalities to determine whether the use of auditory distractors would create

  11. Stress improves selective attention towards emotionally neutral left ear stimuli.

    Science.gov (United States)

    Hoskin, Robert; Hunter, M D; Woodruff, P W R

    2014-09-01

    Research concerning the impact of psychological stress on visual selective attention has produced mixed results. The current paper describes two experiments which utilise a novel auditory oddball paradigm to test the impact of psychological stress on auditory selective attention. Participants had to report the location of emotionally-neutral auditory stimuli, while ignoring task-irrelevant changes in their content. The results of the first experiment, in which speech stimuli were presented, suggested that stress improves the ability to selectively attend to left, but not right ear stimuli. When this experiment was repeated using tonal stimuli the same result was evident, but only for female participants. Females were also found to experience greater levels of distraction in general across the two experiments. These findings support the goal-shielding theory which suggests that stress improves selective attention by reducing the attentional resources available to process task-irrelevant information. The study also demonstrates, for the first time, that this goal-shielding effect extends to auditory perception. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Selective attention and the auditory vertex potential. I - Effects of stimulus delivery rate. II - Effects of signal intensity and masking noise

    Science.gov (United States)

    Schwent, V. L.; Hillyard, S. A.; Galambos, R.

    1976-01-01

The effects of varying the rate of delivery of dichotic tone pip stimuli on selective attention measured by evoked-potential amplitudes and signal detectability scores were studied. The subjects attended to one channel (ear) of tones, ignored the other, and pressed a button whenever occasional targets - tones of a slightly higher pitch - were detected in the attended ear. Under separate conditions, randomized interstimulus intervals were short, medium, and long. Another study compared the effects of attention on the N1 component of the auditory evoked potential for tone pips presented alone and when white noise was added to make the tones barely above detectability threshold in a three-channel listening task. Major conclusions are that (1) N1 is enlarged to stimuli in an attended channel only in the short interstimulus interval condition (averaging 350 msec), (2) N1 and P3 are related to different modes of selective attention, and (3) attention selectivity in the multichannel listening task is greater when tones are faint and/or difficult to detect.
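The signal detectability scores referred to above are conventionally computed as d' = z(hit rate) - z(false-alarm rate). A minimal sketch using only the standard library follows; the trial counts and the simple clamping correction for extreme rates are invented for illustration, not taken from the study.

```python
# Hedged sketch of a d-prime (signal detectability) calculation from
# target-detection counts in an attended channel.

from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate), with rates clamped away
    from 0 and 1 to keep the inverse normal CDF finite."""
    z = NormalDist().inv_cdf
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    clamp = lambda p: min(max(p, 0.01), 0.99)  # crude extreme-rate correction
    return z(clamp(hit_rate)) - z(clamp(fa_rate))

# Invented counts: 80% hits, 10% false alarms
print(round(d_prime(80, 20, 10, 90), 2))  # 2.12
```

Higher d' in the attended channel relative to the ignored channel is what indexes attention selectivity in such multichannel listening tasks.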

  13. Spontaneous high-gamma band activity reflects functional organization of auditory cortex in the awake macaque.

    Science.gov (United States)

    Fukushima, Makoto; Saunders, Richard C; Leopold, David A; Mishkin, Mortimer; Averbeck, Bruno B

    2012-06-07

    In the absence of sensory stimuli, spontaneous activity in the brain has been shown to exhibit organization at multiple spatiotemporal scales. In the macaque auditory cortex, responses to acoustic stimuli are tonotopically organized within multiple, adjacent frequency maps aligned in a caudorostral direction on the supratemporal plane (STP) of the lateral sulcus. Here, we used chronic microelectrocorticography to investigate the correspondence between sensory maps and spontaneous neural fluctuations in the auditory cortex. We first mapped tonotopic organization across 96 electrodes spanning approximately two centimeters along the primary and higher auditory cortex. In separate sessions, we then observed that spontaneous activity at the same sites exhibited spatial covariation that reflected the tonotopic map of the STP. This observation demonstrates a close relationship between functional organization and spontaneous neural activity in the sensory cortex of the awake monkey. Copyright © 2012 Elsevier Inc. All rights reserved.
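The core analysis logic described above, testing whether spontaneous inter-electrode correlations mirror the tonotopic map, can be sketched on simulated data. Everything below (electrode count, the neighbor-smoothing model of spontaneous covariation, the toy best frequencies) is an assumption for illustration, not the authors' actual pipeline.

```python
# Hedged sketch: correlate inter-electrode correlations of simulated
# spontaneous activity with similarity in tonotopic best frequency.

import numpy as np

rng = np.random.default_rng(0)
n_electrodes, n_samples = 16, 2000

# Toy tonotopic gradient: best frequency increases along the array (kHz)
best_freq = np.linspace(1.0, 16.0, n_electrodes)

# Simulate spontaneous activity whose correlations fall off with map
# distance, by smoothing independent noise across neighboring electrodes
latent = rng.standard_normal((n_electrodes, n_samples))
smooth = np.array([latent[max(0, i - 1):i + 2].mean(axis=0)
                   for i in range(n_electrodes)])

corr = np.corrcoef(smooth)                       # electrode-by-electrode
iu = np.triu_indices(n_electrodes, k=1)          # unique electrode pairs
freq_similarity = -np.abs(best_freq[:, None] - best_freq[None, :])

# Positive r: electrodes with similar best frequencies covary more strongly
r = np.corrcoef(corr[iu], freq_similarity[iu])[0, 1]
print(float(r) > 0)  # True
```

In the real analysis the correlations would come from band-limited high-gamma power and the best frequencies from the stimulus-driven tonotopic mapping, but the comparison step has this shape.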

  14. Temporal integration of sequential auditory events: silent period in sound pattern activates human planum temporale.

    Science.gov (United States)

    Mustovic, Henrietta; Scheffler, Klaus; Di Salle, Francesco; Esposito, Fabrizio; Neuhoff, John G; Hennig, Jürgen; Seifritz, Erich

    2003-09-01

    Temporal integration is a fundamental process that the brain carries out to construct coherent percepts from serial sensory events. This process critically depends on the formation of memory traces reconciling past with present events and is particularly important in the auditory domain where sensory information is received both serially and in parallel. It has been suggested that buffers for transient auditory memory traces reside in the auditory cortex. However, previous studies investigating "echoic memory" did not distinguish between brain response to novel auditory stimulus characteristics on the level of basic sound processing and a higher level involving matching of present with stored information. Here we used functional magnetic resonance imaging in combination with a regular pattern of sounds repeated every 100 ms and deviant interspersed stimuli of 100-ms duration, which were either brief presentations of louder sounds or brief periods of silence, to probe the formation of auditory memory traces. To avoid interaction with scanner noise, the auditory stimulation sequence was implemented into the image acquisition scheme. Compared to increased loudness events, silent periods produced specific neural activation in the right planum temporale and temporoparietal junction. Our findings suggest that this area posterior to the auditory cortex plays a critical role in integrating sequential auditory events and is involved in the formation of short-term auditory memory traces. This function of the planum temporale appears to be fundamental in the segregation of simultaneous sound sources.

  15. Auditory processing in autism spectrum disorder

    DEFF Research Database (Denmark)

    Vlaskamp, Chantal; Oranje, Bob; Madsen, Gitte Falcher

    2017-01-01

    Children with autism spectrum disorders (ASD) often show changes in (automatic) auditory processing. Electrophysiology provides a method to study auditory processing, by investigating event-related potentials such as mismatch negativity (MMN) and P3a-amplitude. However, findings on MMN in autism...... a hyper-responsivity at the attentional level. In addition, as similar MMN deficits are found in schizophrenia, these MMN results may explain some of the frequently reported increased risk of children with ASD to develop schizophrenia later in life. Autism Res 2017, 10: 1857–1865....

  16. Selective memory retrieval of auditory what and auditory where involves the ventrolateral prefrontal cortex.

    Science.gov (United States)

    Kostopoulos, Penelope; Petrides, Michael

    2016-02-16

    There is evidence from the visual, verbal, and tactile memory domains that the midventrolateral prefrontal cortex plays a critical role in the top-down modulation of activity within posterior cortical areas for the selective retrieval of specific aspects of a memorized experience, a functional process often referred to as active controlled retrieval. In the present functional neuroimaging study, we explore the neural bases of active retrieval for auditory nonverbal information, about which almost nothing is known. Human participants were scanned with functional magnetic resonance imaging (fMRI) in a task in which they were presented with short melodies from different locations in a simulated virtual acoustic environment within the scanner and were then instructed to retrieve selectively either the particular melody presented or its location. There were significant activity increases specifically within the midventrolateral prefrontal region during the selective retrieval of nonverbal auditory information. During the selective retrieval of information from auditory memory, the right midventrolateral prefrontal region increased its interaction with the auditory temporal region and the inferior parietal lobule in the right hemisphere. These findings provide evidence that the midventrolateral prefrontal cortical region interacts with specific posterior cortical areas in the human cerebral cortex for the selective retrieval of object and location features of an auditory memory experience.

  17. Auditory midbrain processing is differentially modulated by auditory and visual cortices: An auditory fMRI study.

    Science.gov (United States)

    Gao, Patrick P; Zhang, Jevin W; Fan, Shu-Juan; Sanes, Dan H; Wu, Ed X

    2015-12-01

    gain modulation is mediated primarily through direct projections and they point to future investigations of the differential roles of the direct and indirect projections in corticofugal modulation. In summary, our imaging findings demonstrate the large-scale descending influences, from both the auditory and visual cortices, on sound processing in different IC subdivisions. They can guide future studies on the coordinated activity across multiple regions of the auditory network, and its dysfunctions. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Automatic phoneme category selectivity in the dorsal auditory stream.

    Science.gov (United States)

    Chevillet, Mark A; Jiang, Xiong; Rauschecker, Josef P; Riesenhuber, Maximilian

    2013-03-20

    Debates about motor theories of speech perception have recently been reignited by a burst of reports implicating premotor cortex (PMC) in speech perception. Often, however, these debates conflate perceptual and decision processes. Evidence that PMC activity correlates with task difficulty and subject performance suggests that PMC might be recruited, in certain cases, to facilitate category judgments about speech sounds (rather than speech perception, which involves decoding of sounds). However, it remains unclear whether PMC does, indeed, exhibit neural selectivity that is relevant for speech decisions. Further, it is unknown whether PMC activity in such cases reflects input via the dorsal or ventral auditory pathway, and whether PMC processing of speech is automatic or task-dependent. In a novel modified categorization paradigm, we presented human subjects with paired speech sounds from a phonetic continuum but diverted their attention from phoneme category using a challenging dichotic listening task. Using fMRI rapid adaptation to probe neural selectivity, we observed acoustic-phonetic selectivity in left anterior and left posterior auditory cortical regions. Conversely, we observed phoneme-category selectivity in left PMC that correlated with explicit phoneme-categorization performance measured after scanning, suggesting that PMC recruitment can account for performance on phoneme-categorization tasks. Structural equation modeling revealed connectivity from posterior, but not anterior, auditory cortex to PMC, suggesting a dorsal route for auditory input to PMC. Our results provide evidence for an account of speech processing in which the dorsal stream mediates automatic sensorimotor integration of speech and may be recruited to support speech decision tasks.

  19. Minimal effects of visual memory training on the auditory performance of adult cochlear implant users

    Science.gov (United States)

    Oba, Sandra I.; Galvin, John J.; Fu, Qian-Jie

    2014-01-01

    Auditory training has been shown to significantly improve cochlear implant (CI) users’ speech and music perception. However, it is unclear whether post-training gains in performance were due to improved auditory perception or to generally improved attention, memory and/or cognitive processing. In this study, speech and music perception, as well as auditory and visual memory were assessed in ten CI users before, during, and after training with a non-auditory task. A visual digit span (VDS) task was used for training, in which subjects recalled sequences of digits presented visually. After the VDS training, VDS performance significantly improved. However, there were no significant improvements for most auditory outcome measures (auditory digit span, phoneme recognition, sentence recognition in noise, digit recognition in noise), except for small (but significant) improvements in vocal emotion recognition and melodic contour identification. Post-training gains were much smaller with the non-auditory VDS training than observed in previous auditory training studies with CI users. The results suggest that post-training gains observed in previous studies were not solely attributable to improved attention or memory, and were more likely due to improved auditory perception. The results also suggest that CI users may require targeted auditory training to improve speech and music perception. PMID:23516087
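The scoring logic of a digit span task like the VDS used for training can be sketched as follows; the simple stop-at-first-failure procedure, function names, and simulated participant are assumptions, not the study's actual adaptive protocol or timing.

```python
# Hedged sketch of digit span scoring: span = longest digit sequence the
# participant recalls correctly in order, lengths increasing until failure.

import random

def run_vds(recall_fn, start_len=3, max_len=10, seed=1):
    """Present increasing digit sequences; return the longest length
    recalled correctly in order (0 if even the first fails)."""
    rng = random.Random(seed)
    span = 0
    for length in range(start_len, max_len + 1):
        digits = [rng.randint(0, 9) for _ in range(length)]
        if recall_fn(digits) != digits:   # stop at the first failed recall
            break
        span = length
    return span

# Simulated participant who reliably recalls sequences up to 6 digits
perfect_to_six = lambda seq: seq if len(seq) <= 6 else []
print(run_vds(perfect_to_six))  # 6
```

The auditory digit span outcome measure differs only in presentation modality, which is why the visual/auditory contrast isolates modality-specific versus general memory gains.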

  20. The perception of prosody and associated auditory cues in early-implanted children: the role of auditory working memory and musical activities.

    Science.gov (United States)

    Torppa, Ritva; Faulkner, Andrew; Huotilainen, Minna; Järvikivi, Juhani; Lipsanen, Jari; Laasonen, Marja; Vainio, Martti

    2014-03-01

The aim was to study prosodic perception in early-implanted children in relation to auditory discrimination, auditory working memory, and exposure to music. Word and sentence stress perception, discrimination of fundamental frequency (F0), intensity and duration, and forward digit span were measured twice over approximately 16 months. Musical activities were assessed by questionnaire. Participants were twenty-one early-implanted and age-matched normal-hearing (NH) children (4-13 years). Children with cochlear implants (CIs) exposed to music performed better than others in stress perception and F0 discrimination. Only this subgroup of implanted children improved with age in word stress perception and intensity discrimination, and improved over time in digit span. Prosodic perception, F0 discrimination and forward digit span in implanted children exposed to music were equivalent to the NH group, but other implanted children performed more poorly. For children with CIs, word stress perception was linked to digit span and intensity discrimination; sentence stress perception was additionally linked to F0 discrimination. Prosodic perception in children with CIs is linked to auditory working memory and aspects of auditory discrimination. Engagement in music was linked to better performance across a range of measures, suggesting that music is a valuable tool in the rehabilitation of implanted children.

  1. Biological impact of music and software-based auditory training

    Science.gov (United States)

    Kraus, Nina

    2012-01-01

Auditory-based communication skills are developed at a young age and are maintained throughout our lives. However, some individuals – both young and old – encounter difficulties in achieving or maintaining communication proficiency. Biological signals arising from hearing sounds relate to real-life communication skills such as listening to speech in noisy environments and reading, pointing to an intersection between hearing and cognition. Musical experience, amplification, and software-based training can improve these biological signals. These findings of biological plasticity, in a variety of subject populations, relate to attention and auditory memory, and represent an integrated auditory system influenced by both sensation and cognition. Learning outcomes: The reader will (1) understand that the auditory system is malleable to experience and training, (2) learn the ingredients necessary for auditory learning to successfully be applied to communication, (3) learn that the auditory brainstem response to complex sounds (cABR) is a window into the integrated auditory system, and (4) see examples of how cABR can be used to track the outcome of experience and training. PMID:22789822

  2. Auditory short-term memory in the primate auditory cortex.

    Science.gov (United States)

    Scott, Brian H; Mishkin, Mortimer

    2016-06-01

Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active 'working memory' bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive short-term memory (pSTM), is tractable to study in nonhuman primates, whose brain architecture and behavioral repertoire are comparable to our own. This review discusses recent advances in the behavioral and neurophysiological study of auditory memory with a focus on single-unit recordings from macaque monkeys performing delayed-match-to-sample (DMS) tasks. Monkeys appear to employ pSTM to solve these tasks, as evidenced by the impact of interfering stimuli on memory performance. In several regards, pSTM in monkeys resembles pitch memory in humans, and may engage similar neural mechanisms. Neural correlates of DMS performance have been observed throughout the auditory and prefrontal cortex, defining a network of areas supporting auditory STM with parallels to that supporting visual STM. These correlates include persistent neural firing, or a suppression of firing, during the delay period of the memory task, as well as suppression or (less commonly) enhancement of sensory responses when a sound is repeated as a 'match' stimulus. Auditory STM is supported by a distributed temporo-frontal network in which sensitivity to stimulus history is an intrinsic feature of auditory processing. This article is part of a Special Issue entitled SI: Auditory working memory. Published by Elsevier B.V.
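The interference effect described above, where intervening stimuli degrade a passive, capacity-limited trace in a DMS task, can be illustrated with a toy model. The trace capacity, trial structure, and function names below are assumptions chosen to make the mechanism concrete, not a model fitted to the monkey data.

```python
# Hedged toy model of a delayed-match-to-sample (DMS) trial: the "passive
# trace" holds only the most recent stimuli, so distractors overwrite the
# sample and a match can no longer be recognized.

from collections import deque

def dms_trial(sample, distractors, test, trace_capacity=2):
    """Return True if the test stimulus matches the sample AND the sample
    still survives in a capacity-limited passive store at test time."""
    trace = deque(maxlen=trace_capacity)   # passive short-term store
    trace.append(sample)
    for d in distractors:                  # interference displaces old items
        trace.append(d)
    return (test == sample) and (sample in trace)

print(dms_trial("A", [], "A"))             # True: no interference
print(dms_trial("A", ["B", "C"], "A"))     # False: trace overwritten
```

This is the qualitative signature the review cites: performance drops as interfering stimuli accumulate, which is what distinguishes a passive trace from actively rehearsed working memory.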

  3. Failure of the extended contingent attentional capture account in multimodal settings

    Directory of Open Access Journals (Sweden)

    Rob H.J. Van der Lubbe

    2006-01-01

Sudden changes in our environment, like sound bursts or light flashes, are thought to automatically attract our attention, thereby affecting responses to subsequent targets, although an alternative view (the contingent attentional capture account) holds that stimuli only capture our attention when they match target features. In the current study, we examined whether an extended version of the latter view can explain exogenous cuing effects on speed and accuracy of performance to targets (uncued-cued) in multimodal settings, in which auditory and visual stimuli co-occur. To this end, we determined whether observed effects of visual and auditory cues, which were always intermixed, depend on top-down settings in "pure" blocks, in which only one target modality occurred, as compared to "mixed" blocks, in which targets were either visual or auditory. Results revealed that unimodal and crossmodal cuing effects depend on top-down settings. However, our findings were not in accordance with predictions derived from the extended contingent attentional capture account. Specifically, visual cues showed comparable effects for visual targets in pure and mixed blocks, but also a comparable effect for auditory targets in pure blocks and, most surprisingly, an opposite effect in mixed blocks. The latter result suggests that visual stimuli may distract attention from the auditory modality when the modality of the forthcoming target is unknown. The results additionally revealed that the Simon effect, the influence of correspondence or not between stimulus and response side, is modulated by exogenous cues in unimodal settings, but not in crossmodal settings. These findings accord with the view that attention plays an important role in the Simon effect and additionally question the directness of links between maps of visual and auditory space.

  4. Neural biomarkers for dyslexia, ADHD and ADD in the auditory cortex of children

    Directory of Open Access Journals (Sweden)

    Bettina Serrallach

    2016-07-01

Dyslexia, attention deficit hyperactivity disorder (ADHD), and attention deficit disorder (ADD) show distinct clinical profiles that may include auditory and language-related impairments. Currently, an objective brain-based diagnosis of these developmental disorders is still unavailable. We investigated the neuro-auditory systems of dyslexic, ADHD, ADD, and age-matched control children (N=147) using neuroimaging, magnetoencephalography and psychoacoustics. All disorder subgroups exhibited an oversized left planum temporale and an abnormal interhemispheric asynchrony (10-40 ms) of the primary auditory evoked P1-response. Considering right auditory cortex morphology, bilateral P1 source waveform shapes, and auditory performance, the three disorder subgroups could be reliably differentiated with outstanding accuracies of 89-98%. We therefore for the first time provide differential biomarkers for a brain-based diagnosis of dyslexia, ADHD, and ADD. The method allowed not only a clear discrimination between two subtypes of attentional disorders (ADHD and ADD), a topic controversially discussed for decades in the scientific community, but also revealed the potential for objectively identifying comorbid cases. Notably, in children playing a musical instrument, after three and a half years of training the observed interhemispheric asynchronies were reduced by about 2/3, suggesting a strong beneficial influence of musical experience on brain development. These findings might have far-reaching implications for both research and practice and enable a profound understanding of the brain-related etiology, diagnosis, and musically based therapy of common auditory-related developmental disorders and learning disabilities.
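The interhemispheric P1 asynchrony reported above is, in essence, the latency difference between the left- and right-hemisphere P1 peaks. A hedged sketch on synthetic waveforms follows; the Gaussian waveform shape, sampling rate, and the specific latencies are invented for illustration.

```python
# Hedged sketch: interhemispheric asynchrony = |left P1 latency - right P1
# latency|, with peak latency taken as the time of the waveform maximum.

import math

def p1_latency_ms(waveform, sample_rate_hz=1000):
    """Latency of the P1 peak (waveform maximum) in milliseconds."""
    peak_index = max(range(len(waveform)), key=waveform.__getitem__)
    return 1000.0 * peak_index / sample_rate_hz

def gaussian_p1(peak_ms, n=300, width_ms=15.0):
    """Toy P1 waveform: a Gaussian bump at peak_ms, sampled at 1 kHz."""
    return [math.exp(-((t - peak_ms) ** 2) / (2 * width_ms ** 2))
            for t in range(n)]

# Invented hemispheric latencies of 62 ms and 90 ms
left, right = gaussian_p1(peak_ms=62), gaussian_p1(peak_ms=90)
asynchrony = abs(p1_latency_ms(left) - p1_latency_ms(right))
print(asynchrony)  # 28.0 -> within the reported 10-40 ms abnormal range
```

On real MEG source waveforms the peak would be estimated within a P1 search window after noise filtering, but the asynchrony measure itself reduces to this latency difference.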

  5. Minimal effects of visual memory training on auditory performance of adult cochlear implant users.

    Science.gov (United States)

    Oba, Sandra I; Galvin, John J; Fu, Qian-Jie

    2013-01-01

    Auditory training has been shown to significantly improve cochlear implant (CI) users' speech and music perception. However, it is unclear whether posttraining gains in performance were due to improved auditory perception or to generally improved attention, memory, and/or cognitive processing. In this study, speech and music perception, as well as auditory and visual memory, were assessed in 10 CI users before, during, and after training with a nonauditory task. A visual digit span (VDS) task was used for training, in which subjects recalled sequences of digits presented visually. After the VDS training, VDS performance significantly improved. However, there were no significant improvements for most auditory outcome measures (auditory digit span, phoneme recognition, sentence recognition in noise, digit recognition in noise), except for small (but significant) improvements in vocal emotion recognition and melodic contour identification. Posttraining gains were much smaller with the nonauditory VDS training than observed in previous auditory training studies with CI users. The results suggest that posttraining gains observed in previous studies were not solely attributable to improved attention or memory and were more likely due to improved auditory perception. The results also suggest that CI users may require targeted auditory training to improve speech and music perception.

  6. Behavioral semantics of learning and crossmodal processing in auditory cortex: the semantic processor concept.

    Science.gov (United States)

    Scheich, Henning; Brechmann, André; Brosch, Michael; Budinger, Eike; Ohl, Frank W; Selezneva, Elena; Stark, Holger; Tischmeyer, Wolfgang; Wetzel, Wolfram

    2011-01-01

    Two phenomena of auditory cortex activity have recently attracted attention, namely that the primary field can show different types of learning-related changes of sound representation and that during learning even this early auditory cortex is under strong multimodal influence. Based on neuronal recordings in animal auditory cortex during instrumental tasks, in this review we put forward the hypothesis that these two phenomena serve to derive the task-specific meaning of sounds by associative learning. To understand the implications of this tenet, it is helpful to realize how a behavioral meaning is usually derived for novel environmental sounds. For this purpose, associations with other sensory, e.g. visual, information are mandatory to develop a connection between a sound and its behaviorally relevant cause and/or the context of sound occurrence. This makes it plausible that in instrumental tasks various non-auditory sensory and procedural contingencies of sound generation become co-represented by neuronal firing in auditory cortex. Information related to reward or to the avoidance of discomfort during task learning, which is essentially non-auditory, is also co-represented. The reinforcement influence points to the dopaminergic internal reward system, the local role of which for memory consolidation in auditory cortex is well-established. Thus, during a trial of task performance, the neuronal responses to the sounds are embedded in a sequence of representations of such non-auditory information. The embedded auditory responses show task-related modulations falling into types that correspond to three basic logical classifications that may be performed with a perceptual item, from simple detection to discrimination and categorization. This hierarchy of classifications determines the semantic "same-different" relationships among sounds. Different cognitive classifications appear to be a consequence of the learning task and lead to a recruitment of

  7. Multichannel auditory search: toward understanding control processes in polychotic auditory listening.

    Science.gov (United States)

    Lee, M D

    2001-01-01

    Two experiments are presented that serve as a framework for exploring auditory information processing. The framework is referred to as polychotic listening or auditory search, and it requires a listener to scan multiple simultaneous auditory streams for the appearance of a target word (the name of a letter such as A or M). Participants' ability to scan between two and six simultaneous auditory streams of letter and digit names for the name of a target letter was examined using six loudspeakers. The main independent variable was auditory load, or the number of active audio streams on a given trial. The primary dependent variables were target localization accuracy and reaction time. Results showed that as load increased, performance decreased. The performance decrease was evident in reaction time, accuracy, and sensitivity measures. The second study required participants to practice the same task for 10 sessions, for a total of 1800 trials. Results indicated that even with extensive practice, performance was still affected by auditory load. The present results are compared with findings in the visual search literature. The implications for the use of multiple auditory displays are discussed. Potential applications include cockpit and automobile warning displays, virtual reality systems, and training systems.

  8. Odors Bias Time Perception in Visual and Auditory Modalities.

    Science.gov (United States)

    Yue, Zhenzhu; Gao, Tianyu; Chen, Lihan; Wu, Jiashuang

    2016-01-01

    Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors, such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in visual and auditory modalities. Participants either saw a visual dot or heard a continuous tone for 1000 or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants reproduced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for both visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short-interval condition, but shorter durations in the long-interval condition. The effect sizes were larger for the auditory modality than for the visual modality. Moreover, by comparing performance across the initial and final blocks of the experiment, we found that odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, with a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, and these were constrained by sensory modality, the valence of the emotional events, and the target durations. Biases in time perception could be accounted for by a framework of

  9. Binaural auditory beats affect vigilance performance and mood.

    Science.gov (United States)

    Lane, J D; Kasian, S J; Owens, J E; Marsh, G R

    1998-01-01

    When two tones of slightly different frequency are presented separately to the left and right ears the listener perceives a single tone that varies in amplitude at a frequency equal to the frequency difference between the two tones, a perceptual phenomenon known as the binaural auditory beat. Anecdotal reports suggest that binaural auditory beats within the electroencephalograph frequency range can entrain EEG activity and may affect states of consciousness, although few scientific studies have been published. This study compared the effects of binaural auditory beats in the EEG beta and EEG theta/delta frequency ranges on mood and on performance of a vigilance task to investigate their effects on subjective and objective measures of arousal. Participants (n = 29) performed a 30-min visual vigilance task on three different days while listening to pink noise containing simple tones or binaural beats either in the beta range (16 and 24 Hz) or the theta/delta range (1.5 and 4 Hz). However, participants were kept blind to the presence of binaural beats to control expectation effects. Presentation of beta-frequency binaural beats yielded more correct target detections and fewer false alarms than presentation of theta/delta frequency binaural beats. In addition, the beta-frequency beats were associated with less negative mood. Results suggest that the presentation of binaural auditory beats can affect psychomotor performance and mood. This technology may have applications for the control of attention and arousal and the enhancement of human performance.
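
    The dichotic construction described above is easy to reproduce: two pure tones whose frequencies differ by the desired beat rate are routed to separate ears, and the perceived amplitude fluctuation equals that difference. The sketch below is illustrative only; the 400 Hz carrier is an assumption, not a parameter from the study (which specifies only the 16/24 Hz beta and 1.5/4 Hz theta/delta beat rates):

```python
import numpy as np

def binaural_beat(carrier_hz, beat_hz, duration_s, fs=44100):
    """Generate left/right channels whose frequency difference equals
    beat_hz; the listener perceives a single tone whose amplitude
    fluctuates at beat_hz."""
    t = np.arange(int(duration_s * fs)) / fs
    left = np.sin(2 * np.pi * carrier_hz * t)
    right = np.sin(2 * np.pi * (carrier_hz + beat_hz) * t)
    return left, right

# 4 Hz beat (the study's theta/delta range) on an illustrative 400 Hz carrier
left, right = binaural_beat(400, 4, duration_s=1.0)

# The acoustic identity behind the percept:
#   sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2),
# i.e. a mean-frequency (402 Hz) tone amplitude-modulated at the 4 Hz difference.
t = np.arange(len(left)) / 44100
envelope_model = 2 * np.sin(2 * np.pi * 402 * t) * np.cos(np.pi * 4 * t)
```

    Note that in the actual binaural presentation the two tones never mix acoustically; the beat arises neurally, which is what distinguishes it from the monaural beating shown by the identity.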

  10. Generalization of Auditory Sensory and Cognitive Learning in Typically Developing Children.

    Directory of Open Access Journals (Sweden)

    Cristina F B Murphy

    Full Text Available Despite the well-established involvement of both sensory ("bottom-up") and cognitive ("top-down") processes in literacy, the extent to which auditory or cognitive (memory or attention) learning transfers to phonological and reading skills remains unclear. Most research has demonstrated learning of the trained task or even learning transfer to a closely related task. However, few studies have reported "far-transfer" to a different domain, such as the improvement of phonological and reading skills following auditory or cognitive training. This study assessed the effectiveness of auditory, memory or attention training on far-transfer measures involving phonological and reading skills in typically developing children. Mid-transfer was also assessed through untrained auditory, attention and memory tasks. Sixty 5- to 8-year-old children with normal hearing were quasi-randomly assigned to one of five training groups: attention group (AG), memory group (MG), auditory sensory group (SG), placebo group (PG; drawing, painting), and a control, untrained group (CG). Compliance, mid-transfer and far-transfer measures were evaluated before and after training. All trained groups received 12 x 45-min training sessions over 12 weeks. The CG did not receive any intervention. All trained groups, especially older children, exhibited significant learning of the trained task. On pre- to post-training measures (test-retest), most groups exhibited improvements on most tasks. There was significant mid-transfer for a visual digit span task, with highest span in the MG, relative to other groups. These results show that both sensory and cognitive (memory or attention) training can lead to learning in the trained task and to mid-transfer learning on a task (visual digit span) within the same domain as the trained tasks. However, learning did not transfer to measures of language (reading and phonological awareness), as the PG and CG improved as much as the other trained groups. 
Further

  11. Generalization of Auditory Sensory and Cognitive Learning in Typically Developing Children.

    Science.gov (United States)

    Murphy, Cristina F B; Moore, David R; Schochat, Eliane

    2015-01-01

    Despite the well-established involvement of both sensory ("bottom-up") and cognitive ("top-down") processes in literacy, the extent to which auditory or cognitive (memory or attention) learning transfers to phonological and reading skills remains unclear. Most research has demonstrated learning of the trained task or even learning transfer to a closely related task. However, few studies have reported "far-transfer" to a different domain, such as the improvement of phonological and reading skills following auditory or cognitive training. This study assessed the effectiveness of auditory, memory or attention training on far-transfer measures involving phonological and reading skills in typically developing children. Mid-transfer was also assessed through untrained auditory, attention and memory tasks. Sixty 5- to 8-year-old children with normal hearing were quasi-randomly assigned to one of five training groups: attention group (AG), memory group (MG), auditory sensory group (SG), placebo group (PG; drawing, painting), and a control, untrained group (CG). Compliance, mid-transfer and far-transfer measures were evaluated before and after training. All trained groups received 12 x 45-min training sessions over 12 weeks. The CG did not receive any intervention. All trained groups, especially older children, exhibited significant learning of the trained task. On pre- to post-training measures (test-retest), most groups exhibited improvements on most tasks. There was significant mid-transfer for a visual digit span task, with highest span in the MG, relative to other groups. These results show that both sensory and cognitive (memory or attention) training can lead to learning in the trained task and to mid-transfer learning on a task (visual digit span) within the same domain as the trained tasks. However, learning did not transfer to measures of language (reading and phonological awareness), as the PG and CG improved as much as the other trained groups. 
Further research

  12. Attention and prediction in human audition: a lesson from cognitive psychophysiology

    Science.gov (United States)

    Schröger, Erich; Marzecová, Anna; SanMiguel, Iria

    2015-01-01

    Attention is a hypothetical mechanism in the service of perception that facilitates the processing of relevant information and inhibits the processing of irrelevant information. Prediction is a hypothetical mechanism in the service of perception that considers prior information when interpreting the sensorial input. Although both (attention and prediction) aid perception, they are rarely considered together. Auditory attention typically yields enhanced brain activity, whereas auditory prediction often results in attenuated brain responses. However, when strongly predicted sounds are omitted, brain responses to silence resemble those elicited by sounds. Studies jointly investigating attention and prediction revealed that these different mechanisms may interact, e.g. attention may magnify the processing differences between predicted and unpredicted sounds. Following the predictive coding theory, we suggest that prediction relates to predictions sent down from predictive models housed in higher levels of the processing hierarchy to lower levels and attention refers to gain modulation of the prediction error signal sent up to the higher level. As predictions encode contents and confidence in the sensory data, and as gain can be modulated by the intention of the listener and by the predictability of the input, various possibilities for interactions between attention and prediction can be unfolded. From this perspective, the traditional distinction between bottom-up/exogenous and top-down/endogenous driven attention can be revisited and the classic concepts of attentional gain and attentional trace can be integrated. PMID:25728182
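
    The division of labor proposed here (descending predictions, ascending prediction errors whose gain attention modulates) can be illustrated with a toy update loop. This is our own minimal sketch of the general predictive-coding idea, not a model from the paper; the gain and learning-rate values are arbitrary:

```python
def track_input(inputs, gain, lr=0.1):
    """Toy predictive-coding loop: a higher-level prediction mu is
    revised by the prediction error (input - mu), with an attentional
    gain applied to the ascending error signal."""
    mu = 0.0
    for x in inputs:
        error = x - mu           # ascending prediction-error signal
        mu += lr * gain * error  # descending prediction is updated
    return mu

stimulus = [1.0] * 20  # a repeated, hence predictable, sound feature

attended = track_input(stimulus, gain=1.0)    # high attentional gain
unattended = track_input(stimulus, gain=0.2)  # low attentional gain
```

    With higher gain on the error signal, the internal prediction converges on the input faster, which captures (in caricature) why attention can magnify the processing differences between predicted and unpredicted sounds.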

  13. Behavioral and EEG evidence for auditory memory suppression

    Directory of Open Access Journals (Sweden)

    Maya Elizabeth Cano

    2016-03-01

    Full Text Available The neural basis of motivated forgetting using the Think/No-Think (TNT) paradigm is receiving increased attention with a particular focus on the mechanisms that enable memory suppression. However, most TNT studies have been limited to the visual domain. To assess whether and to what extent direct memory suppression extends across sensory modalities, we examined behavioral and electroencephalographic (EEG) effects of auditory Think/No-Think in healthy young adults by adapting the TNT paradigm to the auditory modality. Behaviorally, suppression of memory strength was indexed by prolonged response times during the retrieval of subsequently remembered No-Think words. We examined task-related EEG activity of both attempted memory retrieval and inhibition of a previously learned target word during the presentation of its paired associate. Event-related EEG responses revealed two main findings: (1) a centralized Think > No-Think positivity during auditory word presentation (from approximately 0-500 ms), and (2) a sustained Think positivity over parietal electrodes beginning at approximately 600 ms, reflecting the memory retrieval effect, which was significantly reduced for No-Think words. In addition, word-locked theta (4-8 Hz) power was initially greater for No-Think compared to Think during auditory word presentation over fronto-central electrodes. This was followed by a posterior theta increase indexing successful memory retrieval in the Think condition. The observed event-related potential pattern and theta power analysis are similar to those reported in visual Think/No-Think studies and support a modality non-specific mechanism for memory inhibition. The EEG data also provide evidence supporting differing roles and time courses of frontal and parietal regions in the flexible control of auditory memory.

  14. Behavioral and EEG Evidence for Auditory Memory Suppression.

    Science.gov (United States)

    Cano, Maya E; Knight, Robert T

    2016-01-01

    The neural basis of motivated forgetting using the Think/No-Think (TNT) paradigm is receiving increased attention with a particular focus on the mechanisms that enable memory suppression. However, most TNT studies have been limited to the visual domain. To assess whether and to what extent direct memory suppression extends across sensory modalities, we examined behavioral and electroencephalographic (EEG) effects of auditory TNT in healthy young adults by adapting the TNT paradigm to the auditory modality. Behaviorally, suppression of memory strength was indexed by prolonged response time (RTs) during the retrieval of subsequently remembered No-Think words. We examined task-related EEG activity of both attempted memory retrieval and inhibition of a previously learned target word during the presentation of its paired associate. Event-related EEG responses revealed two main findings: (1) a centralized Think > No-Think positivity during auditory word presentation (from approximately 0-500 ms); and (2) a sustained Think positivity over parietal electrodes beginning at approximately 600 ms reflecting the memory retrieval effect which was significantly reduced for No-Think words. In addition, word-locked theta (4-8 Hz) power was initially greater for No-Think compared to Think during auditory word presentation over fronto-central electrodes. This was followed by a posterior theta increase indexing successful memory retrieval in the Think condition. The observed event-related potential pattern and theta power analysis are similar to that reported in visual TNT studies and support a modality non-specific mechanism for memory inhibition. The EEG data also provide evidence supporting differing roles and time courses of frontal and parietal regions in the flexible control of auditory memory.

  15. A spatial approach of magnitude-squared coherence applied to selective attention detection.

    Science.gov (United States)

    Bonato Felix, Leonardo; de Souza Ranaudo, Fernando; D'affonseca Netto, Aluizio; Ferreira Leite Miranda de Sá, Antonio Mauricio

    2014-05-30

    Auditory selective attention is the human ability to focus actively on one sound stimulus while ignoring all others. This ability can be exploited, for example, in behavioral studies and brain-machine interfaces. In this work we developed an objective method, called Spatial Coherence, to detect the side to which a subject is directing attention. This method takes into consideration the magnitude-squared coherence and the topographic distribution of responses among electroencephalogram electrodes. The individuals were stimulated binaurally with amplitude-modulated tones and were instructed to focus attention on only one of the stimuli. The results indicate a contralateral modulation of the auditory steady-state response (ASSR) in the attention condition and are in agreement with prior studies. Furthermore, the best combination of electrodes led to a hit rate of 82% at 5.03 commands per minute. Using a similar paradigm, a recent work achieved a maximum hit rate of 84.33%, but with a greater classification time (20 s, i.e. 3 commands per minute). Spatial Coherence thus appears to be a useful technique for detecting the focus of auditory selective attention. Copyright © 2014 Elsevier B.V. All rights reserved.
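
    Magnitude-squared coherence itself is a standard quantity, C_xy(f) = |P_xy(f)|^2 / (P_xx(f) P_yy(f)), bounded between 0 and 1 per frequency. The sketch below computes it for two synthetic channels sharing a 40 Hz component; the sampling rate, modulation frequency, and noise level are illustrative assumptions, not the study's recording parameters:

```python
import numpy as np
from scipy.signal import coherence

fs = 1000                 # illustrative sampling rate (Hz)
t = np.arange(10 * fs) / fs
rng = np.random.default_rng(0)

shared = np.sin(2 * np.pi * 40 * t)             # common 40 Hz drive (ASSR-like)
x = shared + 0.5 * rng.standard_normal(t.size)  # synthetic "electrode" 1
y = shared + 0.5 * rng.standard_normal(t.size)  # synthetic "electrode" 2

# Welch-averaged magnitude-squared coherence across frequencies
f, Cxy = coherence(x, y, fs=fs, nperseg=256)
idx = np.argmin(np.abs(f - 40))  # frequency bin nearest the shared drive
```

    Coherence peaks near 40 Hz, where the two channels share a phase-locked component, and stays near zero elsewhere, where only independent noise remains; a spatial variant of this per-electrode-pair quantity is what the method above classifies.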

  16. Electrophysiological Evidence of Developmental Changes in the Duration of Auditory Sensory Memory.

    Science.gov (United States)

    Gomes, Hilary; And Others

    1999-01-01

    Investigated developmental change in duration of auditory sensory memory for tonal frequency by measuring mismatch negativity, an electrophysiological component of the auditory event-related potential that is relatively insensitive to attention and does not require a behavioral response. Findings among children and adults suggest that there are…

  17. Auditory-motor learning influences auditory memory for music.

    Science.gov (United States)

    Brown, Rachel M; Palmer, Caroline

    2012-05-01

    In two experiments, we investigated how auditory-motor learning influences performers' memory for music. Skilled pianists learned novel melodies in four conditions: auditory only (listening), motor only (performing without sound), strongly coupled auditory-motor (normal performance), and weakly coupled auditory-motor (performing along with auditory recordings). Pianists' recognition of the learned melodies was better following auditory-only or auditory-motor (weakly coupled and strongly coupled) learning than following motor-only learning, and better following strongly coupled auditory-motor learning than following auditory-only learning. Auditory and motor imagery abilities modulated the learning effects: Pianists with high auditory imagery scores had better recognition following motor-only learning, suggesting that auditory imagery compensated for missing auditory feedback at the learning stage. Experiment 2 replicated the findings of Experiment 1 with melodies that contained greater variation in acoustic features. Melodies that were slower and less variable in tempo and intensity were remembered better following weakly coupled auditory-motor learning. These findings suggest that motor learning can aid performers' auditory recognition of music beyond auditory learning alone, and that motor learning is influenced by individual abilities in mental imagery and by variation in acoustic features.

  18. Rejection Positivity Predicts Trial-to-Trial Reaction Times in an Auditory Selective Attention Task: A Computational Analysis of Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Sufen Chen

    2014-08-01

    Full Text Available A series of computer simulations using variants of a formal model of attention (Melara & Algom, 2003) probed the role of rejection positivity (RP), a slow-wave electroencephalographic (EEG) component, in the inhibitory control of distraction. Behavioral and EEG data were recorded as participants performed auditory selective attention tasks. Simulations that modulated processes of distractor inhibition accounted well for reaction-time (RT) performance, whereas those that modulated target excitation did not. A model that incorporated RP from actual EEG recordings in estimating distractor inhibition was superior in predicting changes in RT as a function of distractor salience across conditions. A model that additionally incorporated momentary fluctuations in EEG as the source of trial-to-trial variation in performance precisely predicted individual RTs within each condition. The results lend support to the linking proposition that RP controls the speed of responding to targets through the inhibitory control of distractors.

  19. Auditory short-term memory in the primate auditory cortex

    OpenAIRE

    Scott, Brian H.; Mishkin, Mortimer

    2015-01-01

    Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active "working memory" bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive sho...

  20. Selective increase of auditory cortico-striatal coherence during auditory-cued Go/NoGo discrimination learning.

    Directory of Open Access Journals (Sweden)

    Andreas L. Schulz

    2016-01-01

    Full Text Available Goal-directed behavior and associated learning processes are tightly linked to neuronal activity in the ventral striatum. Mechanisms that integrate task-relevant sensory information into striatal processing during decision making and learning are implicitly assumed in current reinforcement models, yet they remain poorly understood. To identify the functional activation of cortico-striatal subpopulations of connections during auditory discrimination learning, we trained Mongolian gerbils in a two-way active avoidance task in a shuttle box to discriminate between falling and rising frequency-modulated tones with identical spectral properties. We assessed functional coupling by analyzing the field-field coherence between the auditory cortex and the ventral striatum of animals performing the task. During the course of training, we observed a selective increase of functional coupling during Go-stimulus presentations. These results suggest that the auditory cortex functionally interacts with the ventral striatum during auditory learning and that the strengthening of these functional connections is selectively goal-directed.

  1. Functional Mapping of the Human Auditory Cortex: fMRI Investigation of a Patient with Auditory Agnosia from Trauma to the Inferior Colliculus.

    Science.gov (United States)

    Poliva, Oren; Bestelmeyer, Patricia E G; Hall, Michelle; Bultitude, Janet H; Koller, Kristin; Rafal, Robert D

    2015-09-01

    To use functional magnetic resonance imaging to map the auditory cortical fields that are activated, or nonreactive, to sounds in patient M.L., who has auditory agnosia caused by trauma to the inferior colliculi. The patient cannot recognize speech or environmental sounds. Her discrimination is greatly facilitated by context and visibility of the speaker's facial movements, and under forced-choice testing. Her auditory temporal resolution is severely compromised. Her discrimination is more impaired for words differing in voice onset time than place of articulation. Words presented to her right ear are extinguished with dichotic presentation; auditory stimuli in the right hemifield are mislocalized to the left. We used functional magnetic resonance imaging to examine cortical activations to different categories of meaningful sounds embedded in a block design. Sounds activated the caudal sub-area of M.L.'s primary auditory cortex (hA1) bilaterally and her right posterior superior temporal gyrus (auditory dorsal stream), but not the rostral sub-area (hR) of her primary auditory cortex or the anterior superior temporal gyrus in either hemisphere (auditory ventral stream). Auditory agnosia reflects dysfunction of the auditory ventral stream. The ventral and dorsal auditory streams are already segregated as early as the primary auditory cortex, with the ventral stream projecting from hR and the dorsal stream from hA1. M.L.'s leftward localization bias, preserved audiovisual integration, and phoneme perception are explained by preserved processing in her right auditory dorsal stream.

  2. Probing the lifetimes of auditory novelty detection processes.

    Science.gov (United States)

    Pegado, Felipe; Bekinschtein, Tristan; Chausson, Nicolas; Dehaene, Stanislas; Cohen, Laurent; Naccache, Lionel

    2010-08-01

    Auditory novelty detection can be fractionated into multiple cognitive processes associated with their respective neurophysiological signatures. In the present study we used high-density scalp event-related potentials (ERPs) during an active version of the auditory oddball paradigm to explore the lifetimes of these processes by varying the stimulus onset asynchrony (SOA). We observed that early MMN (90-160 ms) decreased when the SOA increased, confirming the evanescence of this echoic memory system. Subsequent neural events including late MMN (160-220 ms) and P3a/P3b components of the P3 complex (240-500 ms) did not decay with SOA, but showed a systematic delay effect supporting a two-stage model of accumulation of evidence. On the basis of these observations, we propose a distinction within the MMN complex of two distinct events: (1) an early, pre-attentive and fast-decaying MMN associated with generators located within superior temporal gyri (STG) and frontal cortex, and (2) a late MMN more resistant to SOA, corresponding to the activation of a distributed cortical network including fronto-parietal regions. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  3. Reliability-Weighted Integration of Audiovisual Signals Can Be Modulated by Top-down Attention

    Science.gov (United States)

    Noppeney, Uta

    2018-01-01

    Abstract Behaviorally, it is well established that human observers integrate signals near-optimally weighted in proportion to their reliabilities as predicted by maximum likelihood estimation. Yet, despite abundant behavioral evidence, it is unclear how the human brain accomplishes this feat. In a spatial ventriloquist paradigm, participants were presented with auditory, visual, and audiovisual signals and reported the location of the auditory or the visual signal. Combining psychophysics, multivariate functional MRI (fMRI) decoding, and models of maximum likelihood estimation (MLE), we characterized the computational operations underlying audiovisual integration at distinct cortical levels. We estimated observers’ behavioral weights by fitting psychometric functions to participants’ localization responses. Likewise, we estimated the neural weights by fitting neurometric functions to spatial locations decoded from regional fMRI activation patterns. Our results demonstrate that low-level auditory and visual areas encode predominantly the spatial location of the signal component of a region’s preferred auditory (or visual) modality. By contrast, intraparietal sulcus forms spatial representations by integrating auditory and visual signals weighted by their reliabilities. Critically, the neural and behavioral weights and the variance of the spatial representations depended not only on the sensory reliabilities as predicted by the MLE model but also on participants’ modality-specific attention and report (i.e., visual vs. auditory). These results suggest that audiovisual integration is not exclusively determined by bottom-up sensory reliabilities. Instead, modality-specific attention and report can flexibly modulate how intraparietal sulcus integrates sensory signals into spatial representations to guide behavioral responses (e.g., localization and orienting). PMID:29527567
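
    The maximum-likelihood rule referenced in this abstract is compact enough to state directly: each cue is weighted by its relative reliability (inverse variance), and the fused estimate has lower variance than either cue alone. A sketch with illustrative noise levels (the sigma values are hypothetical, not fitted values from the study):

```python
def mle_fuse(x_a, sigma_a, x_v, sigma_v):
    """Reliability-weighted (maximum-likelihood) cue combination for
    one auditory and one visual location estimate."""
    r_a, r_v = 1 / sigma_a**2, 1 / sigma_v**2    # reliabilities
    w_a, w_v = r_a / (r_a + r_v), r_v / (r_a + r_v)
    fused = w_a * x_a + w_v * x_v                # weighted average
    fused_var = 1 / (r_a + r_v)                  # below either cue's variance
    return fused, fused_var

# Illustrative ventriloquist situation: a blurry auditory estimate
# (sigma = 2 deg) and a sharp visual one (sigma = 0.5 deg).
loc, var = mle_fuse(x_a=10.0, sigma_a=2.0, x_v=2.0, sigma_v=0.5)
```

    Here vision carries 16/17 of the weight, so the fused location lands near the visual signal, which is exactly the capture effect the ventriloquist paradigm exploits; the study's point is that attention and report can shift these weights beyond what reliability alone predicts.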

  4. The effects of caffeine and directed attention on acoustic startle habituation.

    Science.gov (United States)

    Schicatano, E J; Blumenthal, T D

    1998-01-01

    The present experiment tested the effects of caffeine on acoustic startle habituation during different attention tasks in which subjects either (a) attended to the acoustic startle stimulus (auditory attention; n = 9), (b) attended to a visual search task during presentation of acoustic startle stimuli (visual attention; n = 10), or (c) were given no specific instructions during acoustic startle testing (no attention; n = 9). Startle eyeblink responses were measured after subjects received either caffeine (1 mg/kg) or placebo. Caffeine significantly delayed response habituation in the no-attention group and in the auditory attention group, but had no effect on habituation in the visual attention group. These data show that startle habituation can occur with minimal attention being directed to the acoustic startle stimulus, and that visual attention cancels the effects of caffeine on startle habituation.

  5. Auditory Figure-Ground Segregation is Impaired by High Visual Load

    OpenAIRE

    Lavie, Nilli; Chait, Maria; Molloy, Katharine

    2017-01-01

    Figure-ground segregation is fundamental to listening in complex acoustic environments. An ongoing debate pertains to whether segregation requires attention or is 'automatic' and pre-attentive. In this magnetoencephalography (MEG) study we tested a prediction derived from Load Theory of attention (1) that segregation requires attention, but can benefit from the automatic allocation of any 'leftover' capacity under low load. Complex auditory scenes were modelled with Stochastic Figure Ground s...

  6. Separable sustained and selective attention factors are apparent in 5-year-old children

    DEFF Research Database (Denmark)

    Underbjerg, Mette; George, Melanie S; Thorsen, Poul

    2013-01-01

    In adults and older children, evidence consistent with relative separation between selective and sustained attention, superimposed upon generally positive inter-test correlations, has been reported. Here we examine whether this pattern is detectable in 5-year-old children from the healthy...... and auditory stimuli were good. In a factor analysis, the two TEA-Ch(J) selective attention tasks (one visual, one auditory) loaded onto a common factor and diverged from the two sustained attention tasks (one auditory, one motor), which shared a common loading on the second factor. This pattern, which...... suggests that the tests are indeed sensitive to underlying attentional capacities, was supported by the relationships between the TEA-Ch(J) factors and Test of Everyday Attention for Children subtests in the older children in the sample. It is possible to gain convincing performance-based estimates...

  7. Selective attention to sound location or pitch studied with fMRI.

    Science.gov (United States)

    Degerman, Alexander; Rinne, Teemu; Salmi, Juha; Salonen, Oili; Alho, Kimmo

    2006-03-10

    We used 3-T functional magnetic resonance imaging to compare the brain mechanisms underlying selective attention to sound location and pitch. In different tasks, the subjects (N = 10) attended to a designated sound location or pitch or to pictures presented on the screen. In the Attend Location conditions, the sound location varied randomly (left or right), while the pitch was kept constant (high or low). In the Attend Pitch conditions, sounds of randomly varying pitch (high or low) were presented at a constant location (left or right). Both attention to location and attention to pitch produced enhanced activity (in comparison with activation caused by the same sounds when attention was focused on the pictures) in widespread areas of the superior temporal cortex. Attention to either sound feature also activated prefrontal and inferior parietal cortical regions. These activations were stronger during attention to location than during attention to pitch. Attention to location but not to pitch produced a significant increase of activation in the premotor/supplementary motor cortices of both hemispheres and in the right prefrontal cortex, while no area showed activity specifically related to attention to pitch. The present results suggest some differences in the attentional selection of sounds on the basis of their location and pitch consistent with the suggested auditory "what" and "where" processing streams.

  8. Auditory and visual spatial impression: Recent studies of three auditoria

    Science.gov (United States)

    Nguyen, Andy; Cabrera, Densil

    2004-10-01

    Auditory spatial impression is widely studied for its contribution to auditorium acoustical quality. By contrast, visual spatial impression in auditoria has received relatively little attention in formal studies. This paper reports results from a series of experiments investigating the auditory and visual spatial impression of concert auditoria. For auditory stimuli, a fragment of an anechoic recording of orchestral music was convolved with calibrated binaural impulse responses, which had been made with the dummy head microphone at a wide range of positions in three auditoria and the sound source on the stage. For visual stimuli, greyscale photographs were used, taken at the same positions in the three auditoria, with a visual target on the stage. Subjective experiments were conducted with auditory stimuli alone, visual stimuli alone, and visual and auditory stimuli combined. In these experiments, subjects rated apparent source width, listener envelopment, intimacy and source distance (auditory stimuli), and spaciousness, envelopment, stage dominance, intimacy and target distance (visual stimuli). Results show target distance to be of primary importance in auditory and visual spatial impression, thereby providing a basis for covariance between some attributes of auditory and visual spatial impression. Nevertheless, some attributes of spatial impression diverge between the senses.

  9. The auditory cortex hosts network nodes influential for emotion processing: An fMRI study on music-evoked fear and joy.

    Science.gov (United States)

    Koelsch, Stefan; Skouras, Stavros; Lohmann, Gabriele

    2018-01-01

    Sound is a potent elicitor of emotions. Auditory core, belt and parabelt regions have anatomical connections to a large array of limbic and paralimbic structures which are involved in the generation of affective activity. However, little is known about the functional role of auditory cortical regions in emotion processing. Using functional magnetic resonance imaging and music stimuli that evoke joy or fear, our study reveals that anterior and posterior regions of auditory association cortex have emotion-characteristic functional connectivity with limbic/paralimbic (insula, cingulate cortex, and striatum), somatosensory, visual, motor-related, and attentional structures. We found that these regions have remarkably high emotion-characteristic eigenvector centrality, revealing that they have influential positions within emotion-processing brain networks with "small-world" properties. By contrast, primary auditory fields showed surprisingly strong emotion-characteristic functional connectivity with intra-auditory regions. Our findings demonstrate that the auditory cortex hosts regions that are influential within networks underlying the affective processing of auditory information. We anticipate our results to incite research specifying the role of the auditory cortex-and sensory systems in general-in emotion processing, beyond the traditional view that sensory cortices have merely perceptual functions.
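    Eigenvector centrality, the network measure used in this study, scores each node by the leading eigenvector of the connectivity matrix, so a node is influential when it connects to other influential nodes. A minimal sketch on a hypothetical symmetric connectivity matrix (not the study's data):

```python
import numpy as np

def eigenvector_centrality(W):
    """Eigenvector centrality of a symmetric, non-negative connectivity
    matrix W: the (absolute) leading eigenvector, so each node's score is
    proportional to the summed scores of its neighbors."""
    vals, vecs = np.linalg.eigh(W)
    leading = vecs[:, np.argmax(vals)]  # eigenvector of the largest eigenvalue
    return np.abs(leading)

# Toy example: a star network in which node 0 connects to all others;
# node 0 should receive the highest centrality.
W = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
c = eigenvector_centrality(W)
```

    For fMRI connectivity, W would be a region-by-region functional connectivity matrix; the hub-like auditory association regions reported above are those with high scores in such a vector.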

  10. Auditory memory function in expert chess players.

    Science.gov (United States)

    Fattahi, Fariba; Geshani, Ahmad; Jafari, Zahra; Jalaie, Shohreh; Salman Mahini, Mona

    2015-01-01

    Chess is a game that involves many aspects of high-level cognition such as memory, attention, focus and problem solving. Long-term practice of chess can improve cognitive performance and behavioral skills. Auditory memory, as a kind of memory, can be influenced by strengthening processes following long-term chess playing, like other behavioral skills, because of common processing pathways in the brain. The purpose of this study was to evaluate the auditory memory function of expert chess players using the Persian version of the dichotic auditory-verbal memory test. The Persian version of the dichotic auditory-verbal memory test was performed for 30 expert chess players aged 20-35 years and 30 non-chess players who were matched by different conditions; the participants in both groups were randomly selected. The performance of the two groups was compared by independent-samples t-test using SPSS version 21. The mean score of the dichotic auditory-verbal memory test between the two groups, expert chess players and non-chess players, revealed a significant difference (p ≤ 0.001). The difference between the ear scores for expert chess players (p = 0.023) and non-chess players (p = 0.013) was significant. Gender had no effect on the test results. Auditory memory function in expert chess players was significantly better compared to non-chess players. It seems that increased auditory memory function is related to strengthened cognitive performance due to playing chess for a long time.

  11. Auditory prediction during speaking and listening.

    Science.gov (United States)

    Sato, Marc; Shiller, Douglas M

    2018-02-02

    In the present EEG study, the role of auditory prediction in speech was explored through the comparison of auditory cortical responses during active speaking and passive listening to the same acoustic speech signals. Two manipulations of sensory prediction accuracy were used during the speaking task: (1) a real-time change in vowel F1 feedback (reducing prediction accuracy relative to unaltered feedback) and (2) presenting a stable auditory target rather than a visual cue to speak (enhancing auditory prediction accuracy during baseline productions, and potentially enhancing the perturbing effect of altered feedback). While subjects compensated for the F1 manipulation, no difference between the auditory-cue and visual-cue conditions was found. Under visually-cued conditions, reduced N1/P2 amplitude was observed during speaking vs. listening, reflecting a motor-to-sensory prediction. In addition, a significant correlation was observed between the magnitude of the behavioral compensatory F1 response and the magnitude of this speaking-induced suppression (SIS) for P2 during the altered auditory feedback phase, where a stronger compensatory decrease in F1 was associated with a stronger SIS effect. Finally, under the auditory-cued condition, an auditory repetition-suppression effect was observed in N1/P2 amplitude during the listening task but not active speaking, suggesting that auditory predictive processes during speaking and passive listening are functionally distinct.

  12. Temporal correlation between auditory neurons and the hippocampal theta rhythm induced by novel stimulations in awake guinea pigs.

    Science.gov (United States)

    Liberman, Tamara; Velluti, Ricardo A; Pedemonte, Marisa

    2009-11-17

    The hippocampal theta rhythm is associated with the processing of sensory systems such as touch, smell, vision and hearing, as well as with motor activity, the modulation of autonomic processes such as cardiac rhythm, and learning and memory processes. The discovery of temporal correlation (phase locking) between the theta rhythm and both visual and auditory neuronal activity has led us to postulate the participation of such rhythm in the temporal processing of sensory information. In addition, changes in attention can modify both the theta rhythm and the auditory and visual sensory activity. The present report tested the hypothesis that the temporal correlation between auditory neuronal discharges in the inferior colliculus central nucleus (ICc) and the hippocampal theta rhythm could be enhanced by changes in sensory stimulation. We presented chronically implanted guinea pigs with auditory stimuli that varied over time, and recorded the auditory response during wakefulness. It was observed that the stimulation shifts were capable of producing the temporal phase correlations between the theta rhythm and the ICc unit firing, and they differed depending on the stimulus change performed. Such correlations disappeared approximately 6 s after the change presentation. Furthermore, the power of the hippocampal theta rhythm increased in half of the cases presented with a stimulation change. Based on these data, we propose that the degree of correlation between the unitary activity and the hippocampal theta rhythm varies with, and therefore may signal, stimulus novelty.

  13. Music lessons improve auditory perceptual and cognitive performance in deaf children.

    Science.gov (United States)

    Rochette, Françoise; Moussard, Aline; Bigand, Emmanuel

    2014-01-01

    Despite advanced technologies in auditory rehabilitation of profound deafness, deaf children often exhibit delayed cognitive and linguistic development and auditory training remains a crucial element of their education. In the present cross-sectional study, we assess whether music would be a relevant tool for deaf children rehabilitation. In normal-hearing children, music lessons have been shown to improve cognitive and linguistic-related abilities, such as phonetic discrimination and reading. We compared auditory perception, auditory cognition, and phonetic discrimination between 14 profoundly deaf children who completed weekly music lessons for a period of 1.5-4 years and 14 deaf children who did not receive musical instruction. Children were assessed on perceptual and cognitive auditory tasks using environmental sounds: discrimination, identification, auditory scene analysis, auditory working memory. Transfer to the linguistic domain was tested with a phonetic discrimination task. Musically trained children showed better performance in auditory scene analysis, auditory working memory and phonetic discrimination tasks, and multiple regressions showed that success on these tasks was at least partly driven by music lessons. We propose that musical education contributes to development of general processes such as auditory attention and perception, which, in turn, facilitate auditory-related cognitive and linguistic processes.

  14. Music lessons improve auditory perceptual and cognitive performance in deaf children

    Directory of Open Access Journals (Sweden)

    Françoise Rochette

    2014-07-01

    Despite advanced technologies in auditory rehabilitation of profound deafness, deaf children often exhibit delayed cognitive and linguistic development and auditory training remains a crucial element of their education. In the present cross-sectional study, we assess whether music would be a relevant tool for deaf children rehabilitation. In normal-hearing children, music lessons have been shown to improve cognitive and linguistic-related abilities, such as phonetic discrimination and reading. We compared auditory perception, auditory cognition, and phonetic discrimination between 14 profoundly deaf children who completed weekly music lessons for a period of 1.5 to 4 years and 14 deaf children who did not receive musical instruction. Children were assessed on perceptual and cognitive auditory tasks using environmental sounds: discrimination, identification, auditory scene analysis, auditory working memory. Transfer to the linguistic domain was tested with a phonetic discrimination task. Musically-trained children showed better performance in auditory scene analysis, auditory working memory and phonetic discrimination tasks, and multiple regressions showed that success on these tasks was at least partly driven by music lessons. We propose that musical education contributes to development of general processes such as auditory attention and perception, which, in turn, facilitate auditory-related cognitive and linguistic processes.

  15. Auditory motion-specific mechanisms in the primate brain.

    Directory of Open Access Journals (Sweden)

    Colline Poirier

    2017-05-01

    This work examined the mechanisms underlying auditory motion processing in the auditory cortex of awake monkeys using functional magnetic resonance imaging (fMRI). We tested to what extent auditory motion analysis can be explained by the linear combination of static spatial mechanisms, spectrotemporal processes, and their interaction. We found that the posterior auditory cortex, including A1 and the surrounding caudal belt and parabelt, is involved in auditory motion analysis. Static spatial and spectrotemporal processes were able to fully explain motion-induced activation in most parts of the auditory cortex, including A1, but not in circumscribed regions of the posterior belt and parabelt cortex. We show that in these regions motion-specific processes contribute to the activation, providing the first demonstration that auditory motion is not simply deduced from changes in static spatial location. These results demonstrate that parallel mechanisms for motion and static spatial analysis coexist within the auditory dorsal stream.

  16. Functionally Specific Oscillatory Activity Correlates between Visual and Auditory Cortex in the Blind

    Science.gov (United States)

    Schepers, Inga M.; Hipp, Joerg F.; Schneider, Till R.; Röder, Brigitte; Engel, Andreas K.

    2012-01-01

    Many studies have shown that the visual cortex of blind humans is activated in non-visual tasks. However, the electrophysiological signals underlying this cross-modal plasticity are largely unknown. Here, we characterize the neuronal population activity in the visual and auditory cortex of congenitally blind humans and sighted controls in a…

  17. Differential relationship of recent self-reported stress and acute anxiety with divided attention performance.

    Science.gov (United States)

    Petrac, D C; Bedwell, J S; Renk, K; Orem, D M; Sims, V

    2009-07-01

    There have been relatively few studies on the relationship between recent perceived environmental stress and cognitive performance, and the existing studies do not control for state anxiety during cognitive testing. The current study addressed this need by examining recent self-reported environmental stress and divided attention performance while controlling for state anxiety. Fifty-four university undergraduates who self-reported a wide range of perceived recent stress (10-item Perceived Stress Scale) completed both single and dual (simultaneous auditory and visual stimuli) continuous performance tests. Partial correlation analysis showed a statistically significant positive correlation between perceived stress and auditory omission errors in the dual condition, after controlling for state anxiety and auditory omission errors in the single condition (r = 0.41). This suggests that increased environmental stress relates to decreased divided attention performance in auditory vigilance. In contrast, an increase in state anxiety (controlling for perceived stress) was related to a decrease in auditory omission errors in the dual condition (r = -0.37), which suggests that state anxiety may improve divided attention performance. Results suggest that further examination of the neurobiological consequences of environmental stress on divided attention and other executive functioning tasks is needed.
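    The partial correlations reported above (a correlation between two measures with a covariate held constant) can be computed by residualizing both measures on the covariate and correlating the residuals. A minimal sketch, with all variable names illustrative:

```python
import numpy as np

def partial_corr(x, y, cov):
    """Correlation between x and y after linearly regressing the covariate
    `cov` out of both (equivalent to a first-order partial correlation)."""
    Z = np.column_stack([np.ones_like(cov), cov])  # design matrix with intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Toy demonstration: x and y are both driven by the covariate, so their raw
# correlation is high, but the partial correlation is near zero.
rng = np.random.default_rng(0)
cov = rng.normal(size=500)
x = cov + 0.1 * rng.normal(size=500)
y = cov + 0.1 * rng.normal(size=500)
raw = np.corrcoef(x, y)[0, 1]
pc = partial_corr(x, y, cov)
```

    This is why controlling for state anxiety matters in the study above: a raw stress-performance correlation could simply reflect a shared third variable.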

  18. Hand proximity facilitates spatial discrimination of auditory tones

    Directory of Open Access Journals (Sweden)

    Philip Tseng

    2014-06-01

    The effect of hand proximity on vision and visual attention has been well documented. In this study we tested whether such effect(s) would also be present in the auditory modality. With hands placed either near or away from the audio sources, participants performed an auditory-spatial discrimination (Exp 1: left or right side), pitch discrimination (Exp 2: high, med, or low tone), and spatial-plus-pitch (Exp 3: left or right; high, med, or low) discrimination task. In Exp 1, when hands were away from the audio source, participants consistently responded faster with their right hand regardless of stimulus location. This right-hand advantage, however, disappeared in the hands-near condition because of a significant improvement in the left hand's reaction time. No effect of hand proximity was found in Exp 2 or 3, where a choice reaction time task requiring pitch discrimination was used. Together, these results suggest that the effect of hand proximity is not exclusive to vision alone, but is also present in audition, though in a much weaker form. Most important, these findings provide evidence from auditory attention that supports the multimodal account originally raised by Reed et al. in 2006.

  19. The selective processing of emotional visual stimuli while detecting auditory targets: an ERP analysis.

    Science.gov (United States)

    Schupp, Harald T; Stockburger, Jessica; Bublatzky, Florian; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2008-09-16

    Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapid and continuous stream of pleasant, neutral, and unpleasant pictures in one experimental condition, processing demands of a concurrent auditory target discrimination task were systematically varied in three further experimental conditions. Participants successfully performed the auditory task as revealed by behavioral performance and selected event-related potential components. Replicating previous results, emotional pictures were associated with a larger posterior negativity compared to neutral pictures. Of main interest, increasing demands of the auditory task did not modulate the selective processing of emotional visual stimuli. With regard to the locus of interference, selective emotion processing as indexed by the EPN does not seem to reflect shared processing resources of visual and auditory modality.

  20. The interplay of attention and emotion: top-down attention modulates amygdala activation in psychopathy.

    Science.gov (United States)

    Larson, Christine L; Baskin-Sommers, Arielle R; Stout, Daniel M; Balderston, Nicholas L; Curtin, John J; Schultz, Douglas H; Kiehl, Kent A; Newman, Joseph P

    2013-12-01

    Psychopathic behavior has long been attributed to a fundamental deficit in fear that arises from impaired amygdala function. Growing evidence has demonstrated that fear-potentiated startle (FPS) and other psychopathy-related deficits are moderated by focus of attention, but to date, no work on adult psychopathy has examined attentional modulation of the amygdala or concomitant recruitment of relevant attention-related circuitry. Consistent with previous FPS findings, here we report that psychopathy-related differences in amygdala activation appear and disappear as a function of goal-directed attention. Specifically, decreased amygdala activity was observed in psychopathic offenders only when attention was engaged in an alternative goal-relevant task prior to presenting threat-relevant information. Under this condition, psychopaths also exhibited greater activation in selective-attention regions of the lateral prefrontal cortex (LPFC) than did nonpsychopaths, and this increased LPFC activation mediated psychopathy's association with decreased amygdala activation. In contrast, when explicitly attending to threat, amygdala activation did not differ in psychopaths and nonpsychopaths. This pattern of amygdala activation highlights the potential role of LPFC in mediating the failure of psychopathic individuals to process fear and other important information when it is peripheral to the primary focus of goal-directed attention.

  1. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities.

    Science.gov (United States)

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes were shown to rely on the interplay between large-scale neural networks. However, brain networks involved with the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among brain networks.
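    Granger-causal influence of the kind reported in this study is conventionally estimated by asking whether one signal's past improves prediction of another signal's present beyond that signal's own past. A minimal lag-1 sketch in Python (illustrative only; the study's actual fMRI pipeline is not specified here beyond "Granger causality"):

```python
import numpy as np

def granger_gain(x, y, lag=1):
    """Ratio of residual variances (restricted / full) when predicting x
    from its own past alone vs. its own past plus y's past.
    Values clearly above 1 suggest that y Granger-causes x."""
    X_own = np.column_stack([np.ones(len(x) - lag), x[:-lag]])
    X_full = np.column_stack([X_own, y[:-lag]])
    target = x[lag:]
    r_restricted = target - X_own @ np.linalg.lstsq(X_own, target, rcond=None)[0]
    r_full = target - X_full @ np.linalg.lstsq(X_full, target, rcond=None)[0]
    return r_restricted.var() / r_full.var()

# Toy demonstration: x is driven by y's past, but not vice versa.
rng = np.random.default_rng(1)
n = 2000
y = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.5 * rng.normal()
```

    In the study above, the analogous asymmetry (prefrontal activity predicting parietal activity but not the reverse) is what licenses the claim of top-down causal influence.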

  2. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities

    Directory of Open Access Journals (Sweden)

    Valerio Santangelo

    2018-02-01

    Higher-order cognitive processes were shown to rely on the interplay between large-scale neural networks. However, brain networks involved with the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlighted a dissociation among brain networks.

  3. Effects of alcohol on attention orienting and dual-task performance during simulated driving: an event-related potential study.

    Science.gov (United States)

    Wester, Anne E; Verster, Joris C; Volkerts, Edmund R; Böcker, Koen B E; Kenemans, J Leon

    2010-09-01

    Driving is a complex task and is susceptible to inattention and distraction. Moreover, alcohol has a detrimental effect on driving performance, possibly due to alcohol-induced attention deficits. The aim of the present study was to assess the effects of alcohol on simulated driving performance and attention orienting and allocation, as assessed by event-related potentials (ERPs). Thirty-two participants completed two test runs in the Divided Attention Steering Simulator (DASS) with blood alcohol concentrations (BACs) of 0.00%, 0.02%, 0.05%, 0.08% and 0.10%. Sixteen participants performed the second DASS test run with a passive auditory oddball to assess alcohol effects on involuntary attention shifting. Sixteen other participants performed the second DASS test run with an active auditory oddball to assess alcohol effects on dual-task performance and active attention allocation. Dose-dependent impairments were found for reaction times, the number of misses and steering error, even more so in dual-task conditions, especially in the active oddball group. ERP amplitudes to novel irrelevant events were also attenuated in a dose-dependent manner. The P3b amplitude to deviant target stimuli decreased with blood alcohol concentration only in the dual-task condition. It is concluded that alcohol increases distractibility and interference from secondary task stimuli, as well as reduces attentional capacity and dual-task integrality.

  4. Acquisition, Analyses and Interpretation of fMRI Data: A Study on the Effective Connectivity in Human Primary Auditory Cortices

    International Nuclear Information System (INIS)

    Ahmad Nazlim Yusoff; Mazlyfarina Mohamad; Khairiah Abdul Hamid

    2011-01-01

    A study on the effective connectivity characteristics in auditory cortices was conducted on five healthy Malay male subjects aged 20 to 40 years using functional magnetic resonance imaging (fMRI), statistical parametric mapping (SPM5) and dynamic causal modelling (DCM). A silent imaging paradigm was used to reduce the scanner sound artefacts on functional images. The subjects were instructed to pay attention to the white noise stimulus given binaurally at an intensity level 70 dB above the hearing level for normal people. Functional specialisation was studied using Matlab-based SPM5 software by means of fixed effects (FFX), random effects (RFX) and conjunction analyses. Individual analyses on all subjects indicate asymmetrical bilateral activation between the left and right auditory cortices in Brodmann areas (BA) 22, 41 and 42, involving the primary and secondary auditory cortices. The three auditory areas in the right and left auditory cortices are selected for the determination of the effective connectivity by constructing 9 network models. The effective connectivity is determined on four out of five subjects, with the exception of one subject whose BA22 coordinates were located too far from the BA22 coordinates obtained from group analysis. DCM results showed the existence of effective connectivity between the three selected auditory areas in both auditory cortices. In the right auditory cortex, BA42 is identified as input centre with unidirectional parallel effective connectivities of BA42→BA41 and BA42→BA22. However, for the left auditory cortex, the input is BA41 with unidirectional parallel effective connectivities of BA41→BA42 and BA41→BA22. The connectivity between the activated auditory areas suggests the existence of a signal pathway in the auditory cortices even when the subject is listening to noise. (author)

  5. Comparison on driving fatigue related hemodynamics activated by auditory and visual stimulus

    Science.gov (United States)

    Deng, Zishan; Gao, Yuan; Li, Ting

    2018-02-01

    As one of the main causes of traffic accidents, driving fatigue deserves researchers' attention, and its detection and monitoring during long-term driving require new techniques. Since functional near-infrared spectroscopy (fNIRS) can be applied to detect cerebral hemodynamic responses, we can promisingly expect its application in fatigue level detection. Here, we performed three different kinds of experiments on a driver and recorded his cerebral hemodynamic responses when driving for long hours utilizing our device based on fNIRS. Each experiment lasted for 7 hours, and one of the three specific experimental tests, detecting the driver's response to sounds, traffic lights and direction signs respectively, was done every hour. The results showed that visual stimuli were easier to cause fatigue than auditory stimuli, and that among visual stimuli, traffic light scenes caused fatigue more readily than direction signs in the first few hours. We also found that fatigue-related hemodynamics caused by auditory stimuli increased fastest, then traffic light scenes, and direction sign scenes slowest. Our study successfully compared audio, visual color, and visual character stimuli in their sensitivity to cause driving fatigue, which is meaningful for driving safety management.

  6. Neuronal Correlates of Auditory Streaming in Monkey Auditory Cortex for Tone Sequences without Spectral Differences

    Directory of Open Access Journals (Sweden)

    Stanislava Knyazeva

    2018-01-01

    Full Text Available This study finds a neuronal correlate of auditory perceptual streaming in the primary auditory cortex for sequences of tone complexes that have the same amplitude spectrum but a different phase spectrum. Our finding is based on microelectrode recordings of multiunit activity from 270 cortical sites in three awake macaque monkeys. The monkeys were presented with repeated sequences of a tone triplet that consisted of an A tone, a B tone, another A tone and then a pause. The A and B tones were composed of unresolved harmonics formed by adding the harmonics in cosine phase, in alternating phase, or in random phase. A previous psychophysical study on humans revealed that when the A and B tones are similar, humans integrate them into a single auditory stream; when the A and B tones are dissimilar, humans segregate them into separate auditory streams. We found that the similarity of neuronal rate responses to the triplets was highest when all A and B tones had cosine phase. Similarity was intermediate when the A tones had cosine phase and the B tones had alternating phase. Similarity was lowest when the A tones had cosine phase and the B tones had random phase. The present study corroborates and extends previous reports, showing similar correspondences between neuronal activity in the primary auditory cortex and auditory streaming of sound sequences. It also is consistent with Fishman’s population separation model of auditory streaming.

  7. Neuronal Correlates of Auditory Streaming in Monkey Auditory Cortex for Tone Sequences without Spectral Differences.

    Science.gov (United States)

    Knyazeva, Stanislava; Selezneva, Elena; Gorkin, Alexander; Aggelopoulos, Nikolaos C; Brosch, Michael

    2018-01-01

    This study finds a neuronal correlate of auditory perceptual streaming in the primary auditory cortex for sequences of tone complexes that have the same amplitude spectrum but a different phase spectrum. Our finding is based on microelectrode recordings of multiunit activity from 270 cortical sites in three awake macaque monkeys. The monkeys were presented with repeated sequences of a tone triplet that consisted of an A tone, a B tone, another A tone and then a pause. The A and B tones were composed of unresolved harmonics formed by adding the harmonics in cosine phase, in alternating phase, or in random phase. A previous psychophysical study on humans revealed that when the A and B tones are similar, humans integrate them into a single auditory stream; when the A and B tones are dissimilar, humans segregate them into separate auditory streams. We found that the similarity of neuronal rate responses to the triplets was highest when all A and B tones had cosine phase. Similarity was intermediate when the A tones had cosine phase and the B tones had alternating phase. Similarity was lowest when the A tones had cosine phase and the B tones had random phase. The present study corroborates and extends previous reports, showing similar correspondences between neuronal activity in the primary auditory cortex and auditory streaming of sound sequences. It also is consistent with Fishman's population separation model of auditory streaming.
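The similarity measure described, comparing neuronal rate responses to the triplets across phase conditions, can be illustrated with a Pearson correlation between response vectors. The firing rates below are hypothetical, chosen only to reproduce the reported ordering (cosine > alternating > random), and do not represent the study's recordings:

```python
# Illustrative sketch: similarity of multiunit rate responses quantified as
# Pearson correlation between hypothetical firing-rate vectors (spikes/s per bin).
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

resp_A_cos = [10.0, 22.0, 15.0, 30.0, 12.0, 25.0]   # A tones, cosine phase
resp_B_cos = [10.5, 21.7, 15.2, 29.6, 12.3, 24.8]   # B tones, cosine: near-identical
resp_B_alt = [12.0, 18.0, 18.0, 24.0, 15.0, 20.0]   # B tones, alternating: partly similar
resp_B_rand = [25.0, 11.0, 28.0, 10.0, 27.0, 13.0]  # B tones, random: dissimilar

sims = {
    "cosine": pearson(resp_A_cos, resp_B_cos),
    "alternating": pearson(resp_A_cos, resp_B_alt),
    "random": pearson(resp_A_cos, resp_B_rand),
}
print(sims)
# Reported ordering: highest similarity for cosine, lowest for random phase.
assert sims["cosine"] > sims["alternating"] > sims["random"]
```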

  8. Noise exposure and oxidative balance in auditory and extra-auditory structures in adult and developing animals. Pharmacological approaches aimed to minimize its effects.

    Science.gov (United States)

    Molina, S J; Miceli, M; Guelman, L R

    2016-07-01

    Noise coming from urban traffic, household appliances or discotheques may be as hazardous to the health of exposed people as occupational noise, because it can likewise cause hearing loss, changes in the hormonal, cardiovascular and immune systems, and behavioral alterations. Noise can also affect sleep, work performance and productivity, as well as communication skills. Moreover, exposure to noise can trigger an oxidative imbalance between reactive oxygen species (ROS) and the activity of antioxidant enzymes in different structures, which can contribute to tissue damage. In this review we systematized the information from reports concerning the effects of noise on cell oxidative balance in different tissues, focusing on auditory and non-auditory structures. We paid specific attention to in vivo studies, including results obtained in adult and developing subjects. Finally, we discussed the pharmacological strategies tested by different authors to minimize the damaging effects of noise on living beings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Tinnitus intensity dependent gamma oscillations of the contralateral auditory cortex.

    Directory of Open Access Journals (Sweden)

    Elsa van der Loo

    Full Text Available BACKGROUND: Non-pulsatile tinnitus is considered a subjective auditory phantom phenomenon present in 10 to 15% of the population. Tinnitus as a phantom phenomenon is related to hyperactivity and reorganization of the auditory cortex. Magnetoencephalography studies demonstrate a correlation between gamma band activity in the contralateral auditory cortex and the presence of tinnitus. The present study aims to investigate the relation between objective gamma-band activity in the contralateral auditory cortex and subjective tinnitus loudness scores. METHODS AND FINDINGS: In unilateral tinnitus patients (N = 15; 10 right, 5 left), source analysis of resting-state electroencephalographic gamma band oscillations shows a strong positive correlation with Visual Analogue Scale loudness scores in the contralateral auditory cortex (max r = 0.73, p<0.05). CONCLUSION: Auditory phantom percepts thus show similar sound-level-dependent activation of the contralateral auditory cortex as observed in normal audition. In view of recent consciousness models and tinnitus network models, these results suggest that tinnitus loudness is coded by gamma band activity in the contralateral auditory cortex but might not, by itself, be responsible for tinnitus perception.
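A correlation of this kind, source-level gamma power against VAS loudness ratings, is computed directly per patient sample. The sketch below uses entirely hypothetical values (the study's data are not reproduced here) to show the calculation:

```python
# Illustrative sketch (hypothetical values): correlating source-localized
# gamma-band power in contralateral auditory cortex with VAS loudness scores.
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

gamma_power = [0.8, 1.1, 1.5, 1.9, 2.4, 2.6, 3.1, 3.3]    # arbitrary units
vas_loudness = [2.0, 3.0, 3.5, 5.0, 5.5, 7.0, 7.5, 9.0]   # 0-10 VAS scale

r = pearson(gamma_power, vas_loudness)
print(f"r = {r:.2f}")  # strong positive correlation with these toy values
```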

  10. Preattentive representation of feature conjunctions for concurrent spatially distributed auditory objects.

    Science.gov (United States)

    Takegata, Rika; Brattico, Elvira; Tervaniemi, Mari; Varyagina, Olga; Näätänen, Risto; Winkler, István

    2005-09-01

    The role of attention in conjoining features of an object has been a topic of much debate. Studies using the mismatch negativity (MMN), an index of detecting acoustic deviance, suggested that the conjunctions of auditory features are preattentively represented in the brain. These studies, however, used sequentially presented sounds and thus are not directly comparable with visual studies of feature integration. Therefore, the current study presented an array of spatially distributed sounds to determine whether the auditory features of concurrent sounds are correctly conjoined without focal attention directed to the sounds. Two types of sounds differing from each other in timbre and pitch were repeatedly presented together while subjects were engaged in a visual n-back working-memory task and ignored the sounds. Occasional reversals of the frequent pitch-timbre combinations elicited MMNs of a very similar amplitude and latency irrespective of the task load. This result suggested preattentive integration of auditory features. However, performance in a subsequent target-search task with the same stimuli indicated the occurrence of illusory conjunctions. The discrepancy between the results obtained with and without focal attention suggests that illusory conjunctions may occur during voluntary access to the preattentively encoded object representations.

  11. Attentional control activation relates to working memory in attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Burgess, Gregory C; Depue, Brendan E; Ruzic, Luka; Willcutt, Erik G; Du, Yiping P; Banich, Marie T

    2010-04-01

    Attentional control difficulties in individuals with attention-deficit/hyperactivity disorder (ADHD) might reflect poor working memory (WM) ability, especially because WM ability and attentional control rely on similar brain regions. The current study examined whether WM ability might explain group differences in brain activation between adults with ADHD and normal control subjects during attentional demand. Participants were 20 adults with ADHD combined subtype with no comorbid psychiatric or learning disorders and 23 control subjects similar in age, IQ, and gender. The WM measures were obtained from the Wechsler Adult Intelligence Scale-III and Wechsler Memory Scale-Revised. Brain activation was assessed with functional magnetic resonance imaging (fMRI) while performing a Color-Word Stroop task. Group differences in WM ability explained a portion of the activation in left dorsolateral prefrontal cortex (DLPFC), which has been related to the creation and maintenance of an attentional set for task-relevant information. In addition, greater WM ability predicted increased activation of brain regions related to stimulus-driven attention and response selection processes in the ADHD group but not in the control group. The inability to maintain an appropriate task set in young adults with combined type ADHD, associated with decreased activity in left DLPFC, might in part be due to poor WM ability. Furthermore, in individuals with ADHD, higher WM ability might relate to increased recruitment of stimulus-driven attention and response selection processes, perhaps as a compensatory strategy. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  12. A Review of Auditory Prediction and Its Potential Role in Tinnitus Perception.

    Science.gov (United States)

    Durai, Mithila; O'Keeffe, Mary G; Searchfield, Grant D

    2018-06-01

    The precise mechanisms underlying tinnitus perception and distress are still not fully understood. A recent proposition is that auditory prediction errors and related memory representations may play a role in driving tinnitus perception, and it is of interest to explore this further. The aim was to obtain a comprehensive narrative synthesis of current research on auditory prediction and its potential role in tinnitus perception and severity. A narrative review methodological framework was followed. The key words Prediction Auditory, Memory Prediction Auditory, Tinnitus AND Memory, and Tinnitus AND Prediction were extensively searched in the Article Title, Abstract, and Keywords fields of four databases: PubMed, Scopus, SpringerLink, and PsychINFO. All study types from 2000 to the end of 2016 were selected, with the following exclusion criteria applied: minimum age of participants; article not available in English. Reference lists of articles were reviewed to identify any further relevant studies. Articles were short-listed based on title relevance. After reading the abstracts, and with consensus between coauthors, a total of 114 studies were selected for charting data. The hierarchical predictive coding model, based on the Bayesian brain hypothesis, attentional modulation, and top-down feedback, serves as the fundamental framework in the current literature for how auditory prediction may occur. Predictions are integral to speech and music processing, as well as to sequential processing and the identification of auditory objects during auditory streaming. Although deviant responses are observable from middle-latency time ranges, the mismatch negativity (MMN) waveform is the most commonly studied electrophysiological index of auditory irregularity detection. However, limitations may apply when interpreting findings because of the debatable origin of the MMN and its restricted ability to model real-life, more complex auditory phenomena. Cortical oscillatory band activity may act as

  13. Odors bias time perception in visual and auditory modalities

    Directory of Open Access Journals (Sweden)

    Zhenzhu eYue

    2016-04-01

    Full Text Available Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in the visual and auditory modalities. Participants either saw a visual dot or heard a continuous tone for 1000 ms or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants produced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for the visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short-interval condition, but shorter durations in the long-interval condition; the effect sizes were larger for the auditory modality than for the visual modality. Moreover, by comparing performance across the initial and final blocks of the experiment, we found that odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, with a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, constrained by sensory modality, the valence of the emotional events, and the target durations.
Biases in time perception could be accounted for by a

  14. Thalamic and parietal brain morphology predicts auditory category learning.

    Science.gov (United States)

    Scharinger, Mathias; Henry, Molly J; Erb, Julia; Meyer, Lars; Obleser, Jonas

    2014-01-01

    Auditory categorization is a vital skill involving the attribution of meaning to acoustic events, engaging domain-specific (i.e., auditory) as well as domain-general (e.g., executive) brain networks. A listener's ability to categorize novel acoustic stimuli should therefore depend on both, with the domain-general network being particularly relevant for adaptively changing listening strategies and directing attention to relevant acoustic cues. Here we assessed adaptive listening behavior, using complex acoustic stimuli with an initially salient (but later degraded) spectral cue and a secondary, duration cue that remained nondegraded. We employed voxel-based morphometry (VBM) to identify cortical and subcortical brain structures whose individual neuroanatomy predicted task performance and the ability to optimally switch to making use of temporal cues after spectral degradation. Behavioral listening strategies were assessed by logistic regression and revealed mainly strategy switches in the expected direction, with considerable individual differences. Gray-matter probability in the left inferior parietal lobule (BA 40) and left precentral gyrus was predictive of "optimal" strategy switch, while gray-matter probability in thalamic areas, comprising the medial geniculate body, co-varied with overall performance. Taken together, our findings suggest that successful auditory categorization relies on domain-specific neural circuits in the ascending auditory pathway, while adaptive listening behavior depends more on brain structure in parietal cortex, enabling the (re)direction of attention to salient stimulus properties. © 2013 Published by Elsevier Ltd.

  15. The role of the auditory brainstem in processing musically-relevant pitch

    Directory of Open Access Journals (Sweden)

    Gavin M. Bidelman

    2013-05-01

    Full Text Available Neuroimaging work has shed light on the cerebral architecture involved in processing the melodic and harmonic aspects of music. Here, recent evidence is reviewed illustrating that subcortical auditory structures contribute to the early formation and processing of musically-relevant pitch. Electrophysiological recordings from the human brainstem and population responses from the auditory nerve reveal that nascent features of tonal music (e.g., consonance/dissonance, pitch salience, harmonic sonority) are evident at early, subcortical levels of the auditory pathway. The salience and harmonicity of brainstem activity are strongly correlated with listeners’ perceptual preferences and perceived consonance for the tonal relationships of music. Moreover, the hierarchical ordering of pitch intervals/chords described in Western music practice, and their perceptual consonance, is well predicted by the salience with which pitch combinations are encoded in subcortical auditory structures. While the neural correlates of consonance can be tuned and exaggerated by musical training, they persist even in the absence of musicianship or long-term enculturation. As such, it is posited that the structural foundations of musical pitch might result from innate processing performed by the central auditory system. A neurobiological predisposition for consonant, pleasant-sounding pitch relationships may be one reason why these pitch combinations have been favored by composers and listeners for centuries. It is suggested that important perceptual dimensions of music emerge well before the auditory signal reaches the cerebral cortex and prior to attentional engagement. While cortical mechanisms are no doubt critical to the perception, production, and enjoyment of music, the contribution of subcortical structures implicates a more integrated, hierarchically organized network underlying music processing within the brain.

  16. Happiness increases distraction by auditory deviant stimuli.

    Science.gov (United States)

    Pacheco-Unguetti, Antonia Pilar; Parmentier, Fabrice B R

    2016-08-01

    Rare and unexpected changes (deviants) in an otherwise repeated stream of task-irrelevant auditory distractors (standards) capture attention and impair behavioural performance in an ongoing visual task. Recent evidence indicates that this effect is increased by sadness in a task involving neutral stimuli. We tested the hypothesis that such effect may not be limited to negative emotions but reflect a general depletion of attentional resources by examining whether a positive emotion (happiness) would increase deviance distraction too. Prior to performing an auditory-visual oddball task, happiness or a neutral mood was induced in participants by means of the exposure to music and the recollection of an autobiographical event. Results from the oddball task showed significantly larger deviance distraction following the induction of happiness. Interestingly, the small amount of distraction typically observed on the standard trial following a deviant trial (post-deviance distraction) was not increased by happiness. We speculate that happiness might interfere with the disengagement of attention from the deviant sound back towards the target stimulus (through the depletion of cognitive resources and/or mind wandering) but help subsequent cognitive control to recover from distraction. © 2015 The British Psychological Society.
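Deviance distraction and post-deviance distraction, as described above, are typically quantified as response-time costs relative to standard trials in the ongoing visual task. A minimal sketch with hypothetical reaction times (not the study's data):

```python
# Illustrative sketch (hypothetical RTs, in ms): distraction effects as
# mean response-time costs relative to standard-sound trials.
def mean(xs):
    return sum(xs) / len(xs)

rt_standard = [420, 430, 415, 425]      # standard-sound trials
rt_deviant = [470, 465, 480, 455]       # trials with a deviant sound
rt_post_deviant = [435, 440, 428, 433]  # standard trial right after a deviant

deviance_distraction = mean(rt_deviant) - mean(rt_standard)
post_deviance_distraction = mean(rt_post_deviant) - mean(rt_standard)
# The deviant-trial cost is large; the post-deviant cost is typically small.
print(deviance_distraction, post_deviance_distraction)
```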

  17. The impact of auditory white noise on semantic priming.

    Science.gov (United States)

    Angwin, Anthony J; Wilson, Wayne J; Copland, David A; Barry, Robert J; Myatt, Grace; Arnott, Wendy L

    2018-04-10

    It has been proposed that white noise can improve cognitive performance for some individuals, particularly those with lower attention, and that this effect may be mediated by dopaminergic circuitry. Given existing evidence that semantic priming is modulated by dopamine, this study investigated whether white noise can facilitate semantic priming. Seventy-eight adults completed an auditory semantic priming task with and without white noise, at either a short or long inter-stimulus interval (ISI). Measures of both direct and indirect semantic priming were examined. Analysis of the results revealed significant direct and indirect priming effects at each ISI in both noise and silence; however, noise significantly reduced the magnitude of indirect priming. Analyses of subgroups with higher versus lower attention revealed a reduction in indirect priming in noise relative to silence for participants with lower executive and orienting attention. These findings suggest that white noise focuses automatic spreading activation, which may be driven by modulation of dopaminergic circuitry. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Musical minds: attentional blink reveals modality-specific restrictions.

    Directory of Open Access Journals (Sweden)

    Sander Martens

    Full Text Available Formal musical training is known to have positive effects on attentional and executive functioning, processing speed, and working memory. Consequently, one may expect to find differences in the dynamics of temporal attention between musicians and non-musicians. Here we address the question whether that is indeed the case, and whether any beneficial effects of musical training on temporal attention are modality specific or generalize across sensory modalities. When two targets are presented in close temporal succession, most people fail to report the second target, a phenomenon known as the attentional blink (AB). We measured and compared AB magnitude for musicians and non-musicians using auditory or visually presented letters and digits. Relative to non-musicians, the auditory AB was both attenuated and delayed in musicians, whereas the visual AB was larger. Non-musicians with a large auditory AB tended to show a large visual AB. However, neither a positive nor negative correlation was found in musicians, suggesting that at least in musicians, attentional restrictions within each modality are completely separate. AB magnitude within one modality can generalize to another modality, but this turns out not to be the case for every individual. Formal musical training seems to have a domain-general, but modality-specific beneficial effect on selective attention. The results fit with the idea that a major source of attentional restriction as reflected in the AB lies in modality-specific, independent sensory systems rather than a central amodal system. The findings demonstrate that individual differences in AB magnitude can provide important information about the modular structure of human cognition.

  19. Musical minds: attentional blink reveals modality-specific restrictions.

    Science.gov (United States)

    Martens, Sander; Wierda, Stefan M; Dun, Mathijs; de Vries, Michal; Smid, Henderikus G O M

    2015-01-01

    Formal musical training is known to have positive effects on attentional and executive functioning, processing speed, and working memory. Consequently, one may expect to find differences in the dynamics of temporal attention between musicians and non-musicians. Here we address the question whether that is indeed the case, and whether any beneficial effects of musical training on temporal attention are modality specific or generalize across sensory modalities. When two targets are presented in close temporal succession, most people fail to report the second target, a phenomenon known as the attentional blink (AB). We measured and compared AB magnitude for musicians and non-musicians using auditory or visually presented letters and digits. Relative to non-musicians, the auditory AB was both attenuated and delayed in musicians, whereas the visual AB was larger. Non-musicians with a large auditory AB tended to show a large visual AB. However, neither a positive nor negative correlation was found in musicians, suggesting that at least in musicians, attentional restrictions within each modality are completely separate. AB magnitude within one modality can generalize to another modality, but this turns out not to be the case for every individual. Formal musical training seems to have a domain-general, but modality-specific beneficial effect on selective attention. The results fit with the idea that a major source of attentional restriction as reflected in the AB lies in modality-specific, independent sensory systems rather than a central amodal system. The findings demonstrate that individual differences in AB magnitude can provide important information about the modular structure of human cognition.
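AB magnitude, as compared between groups here, is commonly quantified as the drop in T2-report accuracy (given a correct T1 report) at short lags relative to a long-lag baseline. The sketch below uses hypothetical per-lag accuracies only to illustrate the computation:

```python
# Illustrative sketch (hypothetical data): AB magnitude as the drop in
# conditional T2 accuracy at short lags relative to the longest-lag baseline.
def ab_magnitude(acc_by_lag):
    """AB magnitude = longest-lag baseline accuracy minus minimum accuracy."""
    baseline = acc_by_lag[max(acc_by_lag)]  # use the longest lag as baseline
    return baseline - min(acc_by_lag.values())

# Hypothetical T2|T1 accuracy per lag position for two groups:
musicians = {2: 0.70, 3: 0.78, 5: 0.88, 8: 0.90}
non_musicians = {2: 0.45, 3: 0.60, 5: 0.80, 8: 0.88}

print(round(ab_magnitude(musicians), 2))      # attenuated (smaller) auditory AB
print(round(ab_magnitude(non_musicians), 2))  # larger AB
```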

  20. Nonspatial intermodal selective attention is mediated by sensory brain areas: Evidence from event-related potentials

    NARCIS (Netherlands)

    Talsma, D.; Kok, Albert

    2001-01-01

    The present study focuses on the question of whether inter- and intramodal forms of attention are reflected in activation of the same or different brain areas. ERPs were recorded while subjects were presented with a random sequence of visual and auditory stimuli. They were instructed to attend to

  1. Interaction of language, auditory and memory brain networks in auditory verbal hallucinations.

    Science.gov (United States)

    Ćurčić-Blake, Branislava; Ford, Judith M; Hubl, Daniela; Orlov, Natasza D; Sommer, Iris E; Waters, Flavie; Allen, Paul; Jardri, Renaud; Woodruff, Peter W; David, Olivier; Mulert, Christoph; Woodward, Todd S; Aleman, André

    2017-01-01

    Auditory verbal hallucinations (AVH) occur in psychotic disorders, but also as a symptom of other conditions and even in healthy people. Several current theories on the origin of AVH converge, with neuroimaging studies suggesting that the language, auditory and memory/limbic networks are of particular relevance. However, reconciliation of these theories with experimental evidence is missing. We review 50 studies investigating functional (EEG and fMRI) and anatomic (diffusion tensor imaging) connectivity in these networks, and explore the evidence supporting abnormal connectivity in these networks associated with AVH. We distinguish between functional connectivity during an actual hallucination experience (symptom capture) and functional connectivity during either the resting state or a task comparing individuals who hallucinate with those who do not (symptom association studies). Symptom capture studies clearly reveal a pattern of increased coupling among the auditory, language and striatal regions. Anatomical and symptom association functional studies suggest that the interhemispheric connectivity between posterior auditory regions may depend on the phase of illness, with increases in non-psychotic individuals and first-episode patients and decreases in chronic patients. Leading hypotheses involving concepts such as unstable memories, source monitoring, top-down attention, and hybrid models of hallucinations are supported in part by the published connectivity data, although several caveats and inconsistencies remain. Specifically, possible changes in fronto-temporal connectivity are still under debate. Precise hypotheses concerning the directionality of connections deduced from current theoretical approaches should be tested using experimental approaches that allow for discrimination of competing hypotheses. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Kölliker’s Organ and the Development of Spontaneous Activity in the Auditory System: Implications for Hearing Dysfunction

    Directory of Open Access Journals (Sweden)

    M. W. Nishani Dayaratne

    2014-01-01

    Full Text Available Prior to the “onset of hearing,” developing cochlear inner hair cells (IHCs) and primary auditory neurons undergo experience-independent activity, which is thought to be important in retaining and refining neural connections in the absence of sound. One of the major hypotheses regarding the origin of such activity involves a group of columnar epithelial supporting cells forming Kölliker’s organ, which is only present during this critical period of auditory development. There is strong evidence for a purinergic signalling mechanism underlying such activity. ATP released through connexin hemichannels may activate P2 purinergic receptors in both Kölliker’s organ and the adjacent IHCs, leading to the generation of electrical activity throughout the auditory system. However, recent work has suggested an alternative origin, by demonstrating the ability of IHCs to generate this spontaneous activity without activation by ATP. Regardless, developmental abnormalities of Kölliker’s organ may lead to congenital hearing loss, considering that mutations in ion channels (hemichannels, gap junctions, and calcium channels) involved in Kölliker’s organ activity share strong links with such types of deafness.

  3. Attention and Motivated Response to Simulated Male Advertisement Call Activates Forebrain Dopaminergic and Social Decision-Making Network Nuclei in Female Midshipman Fish.

    Science.gov (United States)

    Forlano, Paul M; Licorish, Roshney R; Ghahramani, Zachary N; Timothy, Miky; Ferrari, Melissa; Palmer, William C; Sisneros, Joseph A

    2017-10-01

    Little is known regarding the coordination of audition with decision-making and subsequent motor responses that initiate social behavior, including mate localization during courtship. Using the midshipman fish model, we tested the hypothesis that the time spent by females attending and responding to the advertisement call is correlated with the activation of a specific subset of catecholaminergic (CA) and social decision-making network (SDM) nuclei underlying auditory-driven sexual motivation. In addition, we quantified the relationship of neural activation between CA and SDM nuclei in all responders, with the goal of providing a map of functional connectivity of the circuitry underlying a motivated state responsive to acoustic cues during mate localization. In order to make a baseline qualitative comparison of this functional brain map to unmotivated females, we made a similar correlative comparison of brain activation in females who were unresponsive to the advertisement call playback. Our results support an important role for dopaminergic neurons in the periventricular posterior tuberculum and ventral thalamus, putative A11 and A13 tetrapod homologues, respectively, as well as the posterior parvocellular preoptic area and dorsomedial telencephalon (laterobasal amygdala homologue), in auditory attention and appetitive sexual behavior in fishes. These findings may also offer insights into the function of these highly conserved nuclei in the context of auditory-driven reproductive social behavior across vertebrates. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  4. Ebselen attenuates cisplatin-induced ROS generation through Nrf2 activation in auditory cells.

    Science.gov (United States)

    Kim, Se-Jin; Park, Channy; Han, A Lum; Youn, Myung-Ja; Lee, Jeong-Han; Kim, Yunha; Kim, Eun-Sook; Kim, Hyung-Jin; Kim, Jin-Kyung; Lee, Ho-Kyun; Chung, Sang-Young; So, Hongseob; Park, Raekil

    2009-05-01

    Ebselen, an organoselenium compound that acts as a glutathione peroxidase mimetic, has been demonstrated to possess antioxidant and anti-inflammatory activities. However, the molecular mechanism underlying this effect is not fully understood in auditory cells. The purpose of the present study was to investigate the protective effect of ebselen against cisplatin-induced toxicity in HEI-OC1 auditory cells, organotypic cultures of cochlear explants from two-day postnatal rats (P2), and adult Balb/C mice. Pretreatment with ebselen ameliorated apoptotic death induced by cisplatin in HEI-OC1 cells and organotypic cultures of the organ of Corti. Ebselen pretreatment also significantly suppressed cisplatin-induced increases in intracellular reactive oxygen species (ROS), intracellular reactive nitrogen species (RNS) and lipid peroxidation levels. Ebselen dose-dependently increased the expression of an antioxidant response element (ARE)-luciferase reporter in HEI-OC1 cells through the translocation of Nrf2 into the nucleus. Furthermore, we found that pretreatment with ebselen significantly restored Nrf2 function and ameliorated the cytotoxicity of cisplatin in cells transfected with either a pcDNA3.1 (control) or a DN-Nrf2 (dominant-negative) plasmid. We also observed that Nrf2 activation by ebselen increased the expression of phase II antioxidant genes, including heme oxygenase (HO-1), NAD(P)H:quinone oxidoreductase, and gamma-glutamylcysteine synthetase (gamma-GCS). Treatment with ebselen resulted in increased expression of HO-1 and intranuclear Nrf2 in hair cells of organotypically cultured cochleae. After intraperitoneal injection of cisplatin, the auditory brainstem response (ABR) threshold was measured on the 8th day in Balb/C mice. A marked ABR threshold shift occurred in mice injected with cisplatin (16 mg/kg, n=5; click and 8-kHz stimuli), whereas the threshold shift in mice co-treated with ebselen was not significantly changed. These results suggest that ebselen activates the Nrf2-ARE signaling pathway

  5. Sonic morphology: Aesthetic dimensional auditory spatial awareness

    Science.gov (United States)

    Whitehouse, Martha M.

    The sound and ceramic sculpture installation, " Skirting the Edge: Experiences in Sound & Form," is an integration of art and science demonstrating the concept of sonic morphology. "Sonic morphology" is herein defined as aesthetic three-dimensional auditory spatial awareness. The exhibition explicates my empirical phenomenal observations that sound has a three-dimensional form. Composed of ceramic sculptures that allude to different social and physical situations, coupled with sound compositions that enhance and create a three-dimensional auditory and visual aesthetic experience (see accompanying DVD), the exhibition supports the research question, "What is the relationship between sound and form?" Precisely how people aurally experience three-dimensional space involves an integration of spatial properties, auditory perception, individual history, and cultural mores. People also utilize environmental sound events as a guide in social situations and in remembering their personal history, as well as a guide in moving through space. Aesthetically, sound affects the fascination, meaning, and attention one has within a particular space. Sonic morphology brings art forms such as a movie, video, sound composition, and musical performance into the cognitive scope by generating meaning from the link between the visual and auditory senses. This research examined sonic morphology as an extension of musique concrete, sound as object, originating in Pierre Schaeffer's work in the 1940s. Pointing, as John Cage did, to the corporeal three-dimensional experience of "all sound," I composed works that took their total form only through the perceiver-participant's participation in the exhibition. While contemporary artist Alvin Lucier creates artworks that draw attention to making sound visible, "Skirting the Edge" engages the perceiver-participant visually and aurally, leading to recognition of sonic morphology.

  6. Attentional reorienting triggers spatial asymmetries in a search task with cross-modal spatial cueing.

    Directory of Open Access Journals (Sweden)

    Rebecca E Paladini

    Full Text Available Cross-modal spatial cueing can affect performance in a visual search task. For example, search performance improves if a visual target and an auditory cue originate from the same spatial location, and it deteriorates if they originate from different locations. Moreover, it has recently been postulated that multisensory settings, i.e., experimental settings in which critical stimuli are concurrently presented in different sensory modalities (e.g., visual and auditory), may trigger asymmetries in visuospatial attention, with facilitation observed for visual stimuli presented in the right compared to the left visual space. However, it remains unclear whether auditory cueing of attention differentially affects search performance in the left and right hemifields in audio-visual search tasks. The present study investigated whether spatial asymmetries occur in a search task with cross-modal spatial cueing. Participants completed a visual search task that contained no auditory cues (i.e., a unimodal visual condition) as well as spatially congruent, spatially incongruent, and spatially non-informative auditory cues. To further assess participants' accuracy in localising the auditory cues, a unimodal auditory spatial localisation task was also administered. The results demonstrated no left/right asymmetries in the unimodal visual search condition. Both an additional incongruent and a spatially non-informative auditory cue resulted in lateral asymmetries, with search times increased for targets presented in the left compared to the right hemifield. No such spatial asymmetry was observed in the congruent condition; however, participants' performance in the congruent condition was modulated by their tone localisation accuracy. The findings of the present study demonstrate that spatial asymmetries in multisensory processing depend on the validity of the cross-modal cues, and occur under specific attentional conditions, i.e., when

  7. Short-Term Memory for Space and Time Flexibly Recruit Complementary Sensory-Biased Frontal Lobe Attention Networks.

    Science.gov (United States)

    Michalka, Samantha W; Kong, Lingqiang; Rosen, Maya L; Shinn-Cunningham, Barbara G; Somers, David C

    2015-08-19

    The frontal lobes control wide-ranging cognitive functions; however, functional subdivisions of human frontal cortex are only coarsely mapped. Here, functional magnetic resonance imaging reveals two distinct visual-biased attention regions in lateral frontal cortex, superior precentral sulcus (sPCS) and inferior precentral sulcus (iPCS), anatomically interdigitated with two auditory-biased attention regions, transverse gyrus intersecting precentral sulcus (tgPCS) and caudal inferior frontal sulcus (cIFS). Intrinsic functional connectivity analysis demonstrates that sPCS and iPCS fall within a broad visual-attention network, while tgPCS and cIFS fall within a broad auditory-attention network. Interestingly, we observe that spatial and temporal short-term memory (STM), respectively, recruit visual and auditory attention networks in the frontal lobe, independent of sensory modality. These findings not only demonstrate that both sensory modality and information domain influence frontal lobe functional organization, they also demonstrate that spatial processing co-localizes with visual processing and that temporal processing co-localizes with auditory processing in lateral frontal cortex. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Increased BOLD Signals Elicited by High Gamma Auditory Stimulation of the Left Auditory Cortex in Acute State Schizophrenia

    Directory of Open Access Journals (Sweden)

    Hironori Kuga, M.D.

    2016-10-01

    We acquired BOLD responses elicited by click trains of 20-, 30-, 40- and 80-Hz frequencies from 15 patients with acute episode schizophrenia (AESZ), 14 symptom-severity-matched patients with non-acute episode schizophrenia (NASZ), and 24 healthy controls (HC), assessed via a standard general linear-model-based analysis. The AESZ group showed significantly increased auditory steady-state response (ASSR) BOLD signals to 80-Hz stimuli in the left auditory cortex compared with the HC and NASZ groups. In addition, enhanced 80-Hz ASSR-BOLD signals were associated with more severe auditory hallucination experiences in AESZ participants. The present results indicate that neural overactivation occurs during 80-Hz auditory stimulation of the left auditory cortex in individuals with acute-state schizophrenia. Given the possible association between abnormal gamma activity and increased glutamate levels, our data may reflect glutamate toxicity in the auditory cortex in the acute state of schizophrenia, which might lead to progressive changes in the left transverse temporal gyrus.
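
    For illustration only (this is not the study's stimulus code), a periodic click train of the kind used to evoke auditory steady-state responses can be sketched in a few lines; the sampling rate and click duration below are assumed values:

    ```python
    import numpy as np

    def click_train(rate_hz, duration_s=1.0, fs=44100, click_ms=0.5):
        """Generate a periodic click train for steady-state stimulation.

        rate_hz: click repetition rate, e.g. 20, 30, 40, or 80 Hz.
        fs and click_ms are assumed, not taken from the study.
        """
        n = int(duration_s * fs)
        train = np.zeros(n)
        click_len = max(1, int(click_ms / 1000 * fs))  # samples per click
        period = int(fs / rate_hz)                     # samples between click onsets
        for start in range(0, n - click_len, period):
            train[start:start + click_len] = 1.0
        return train

    stim_80 = click_train(80)  # 80-Hz train, the condition of main interest above
    ```

    The same function generates the 20-, 30- and 40-Hz conditions by changing `rate_hz`.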

  10. Nonspatial intermodal selective attention is mediated by sensory brain areas: Evidence from event-related potential.

    NARCIS (Netherlands)

    Talsma, D.; Kok, A.

    2001-01-01

    Focuses on the question of whether inter- and intramodal forms of attention are reflected in activation of the same or different brain areas. ERPs were recorded while Ss (aged 18-41 yrs) were presented with a random sequence of visual and auditory stimuli. They were instructed to attend to nonspatial

  12. Segregation and integration of auditory streams when listening to multi-part music.

    Science.gov (United States)

    Ragert, Marie; Fairhurst, Merle T; Keller, Peter E

    2014-01-01

    In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative role and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to make an assessment of the leader-follower relationship. We show that perceptually the relationship between parts is biased towards the conventional structural hierarchy in western music in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of both cognitive load, as shown through difficulty ratings and the interaction of the temporal and the structural relationship factors. Neurally, we see that the temporal relationship between parts, as one important cue for stream segregation, revealed distinct neural activity in the planum temporale. By contrast, integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus. These results support the hypothesis that the planum temporale and IPS are key structures underlying the mechanisms of segregation and integration of auditory streams

  13. Changes in regional cerebral blood flow during auditory cognitive tasks

    International Nuclear Information System (INIS)

    Ohyama, Masashi; Kitamura, Shin; Terashi, Akiro; Senda, Michio.

    1993-01-01

    In order to investigate the relation between auditory cognitive function and regional brain activation, we measured changes in regional cerebral blood flow (CBF) using positron emission tomography (PET) during the 'odd-ball' paradigm in ten normal healthy volunteers. The subjects underwent 3 tasks, twice each, while evoked potentials were recorded. In these tasks, the auditory stimulus was a series of pure tones delivered every 1.5 sec binaurally at 75 dB through earphones. Task A: the stimulus was a series of 1000-Hz tones only, and the subject was instructed simply to listen. Task B: the stimulus was a series of 1000-Hz tones only, and the subject was instructed to push a button on detecting a tone. Task C: the stimulus was a series of tones with a frequency of 1000 Hz (non-target) in 80% and 2000 Hz (target) in 20% of trials, in random order, and the subject was instructed to push the button on detecting a target tone. The event-related potential (P300) was observed in task C (Pz: 334.3±19.6 msec). During each task, CBF was measured using PET with an i.v. injection of 1.5 GBq of O-15 water. The changes in CBF associated with auditory cognition were evaluated as the difference between the CBF images in tasks C and B. Localized increases were observed in the anterior cingulate cortex (in all subjects), the bilateral auditory association cortex, the prefrontal cortex and the parietal cortex; the latter three areas showed large individual variation in the location of foci. These results suggest a role for those cortical areas in auditory cognition. The anterior cingulate was most activated (15.0±2.24% of global CBF); this region was not activated in the task B minus task A comparison. The anterior cingulate is part of Papez's circuit, which is related to memory and other higher cortical functions. These results suggest that this area may play an important role in cognition as well as in attention. (author)
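
    The task C stimulus schedule (80% non-target 1000-Hz tones, 20% target 2000-Hz tones, one tone every 1.5 s) can be sketched as follows; this is a hypothetical reconstruction, not the authors' code:

    ```python
    import random

    def oddball_sequence(n_trials=100, p_target=0.2, standard_hz=1000,
                         target_hz=2000, isi_s=1.5, seed=0):
        """Randomized tone list for an auditory odd-ball task:
        exact 80/20 split of non-targets and targets, one tone per ISI.
        Trial count and seed are illustrative choices."""
        n_targets = int(round(n_trials * p_target))
        tones = [target_hz] * n_targets + [standard_hz] * (n_trials - n_targets)
        random.Random(seed).shuffle(tones)
        # Pair each tone with its onset time in seconds
        return [(i * isi_s, f) for i, f in enumerate(tones)]
    ```

    Task A and task B reduce to the degenerate case `p_target=0`, differing only in the response instruction.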

  14. Behavioral relevance of gamma-band activity for short-term memory-based auditory decision-making.

    Science.gov (United States)

    Kaiser, Jochen; Heidegger, Tonio; Lutzenberger, Werner

    2008-06-01

    Oscillatory activity in the gamma-band range has been established as a correlate of cognitive processes, including perception, attention and memory. Only a few studies, however, have provided evidence for an association between gamma-band activity (GBA) and measures of behavioral performance. Here we focused on the comparison between sample and test stimuli S1 and S2 during an auditory spatial short-term memory task. Applying statistical probability mapping to magnetoencephalographic recordings from 28 human subjects, we identified GBA components distinguishing nonidentical from identical S1-S2 pairs. This activity was found at frequencies between 65 and 90 Hz and was localized over posterior cortical regions contralateral to the hemifield in which the stimuli were presented. The 10 best task performers showed higher amplitudes of this GBA component than the 10 worst performers. This group difference was most pronounced between about 150 and 300 ms after stimulus onset. Apparently the decision about whether test stimuli matched the stored representation of previously presented sample sounds relied partly on the oscillatory activation of networks representing differences between both stimuli. This result could be replicated by reanalyzing the combined data from two previous studies assessing short-term memory for sound duration and sound lateralization, respectively. Similarly to our main study, GBA amplitudes to nonmatching vs. matching S1-S2 pairs were higher in good performers than poor performers. The present findings demonstrate the behavioral relevance of GBA.
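
    As a rough sketch of the kind of measure involved, power in a 65-90 Hz gamma band can be estimated from a single-trial signal with a simple FFT periodogram; the study itself used statistical probability mapping of MEG recordings, which this minimal example does not reproduce:

    ```python
    import numpy as np

    def band_power(signal, fs, lo, hi):
        """Mean periodogram power within [lo, hi] Hz."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        mask = (freqs >= lo) & (freqs <= hi)
        return psd[mask].mean()

    # Synthetic 75-Hz oscillation, 1 s at an assumed 1 kHz sampling rate
    fs = 1000
    t = np.arange(fs) / fs
    gamma_sig = np.sin(2 * np.pi * 75 * t)
    ```

    For `gamma_sig`, power in the 65-90 Hz band dominates power in, say, a 10-35 Hz band, which is the kind of contrast underlying a GBA amplitude comparison.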

  15. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps.

    Science.gov (United States)

    Sood, Mariam R; Sereno, Martin I

    2016-08-01

    Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor-preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface-based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory-motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory-motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M-I. Hum Brain Mapp 37:2784-2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  16. Attention and Working Memory in Adolescents with Autism Spectrum Disorder: A Functional MRI Study.

    Science.gov (United States)

    Rahko, Jukka S; Vuontela, Virve A; Carlson, Synnöve; Nikkinen, Juha; Hurtig, Tuula M; Kuusikko-Gauffin, Sanna; Mattila, Marja-Leena; Jussila, Katja K; Remes, Jukka J; Jansson-Verkasalo, Eira M; Aronen, Eeva T; Pauls, David L; Ebeling, Hanna E; Tervonen, Osmo; Moilanen, Irma K; Kiviniemi, Vesa J

    2016-06-01

    The present study examined attention and memory load-dependent differences in the brain activation and deactivation patterns between adolescents with autism spectrum disorders (ASDs) and typically developing (TD) controls using functional magnetic resonance imaging. Attentional (0-back) and working memory (WM; 2-back) processing and load differences (0 vs. 2-back) were analysed. WM-related areas activated and default mode network deactivated normally in ASDs as a function of task load. ASDs performed the attentional 0-back task similarly to TD controls but showed increased deactivation in cerebellum and right temporal cortical areas and weaker activation in other cerebellar areas. Increasing task load resulted in multiple responses in ASDs compared to TD and in inadequate modulation of brain activity in right insula, primary somatosensory, motor and auditory cortices. The changes during attentional task may reflect compensatory mechanisms enabling normal behavioral performance. The inadequate memory load-dependent modulation of activity suggests diminished compensatory potential in ASD.

  17. Readout from iconic memory and selective spatial attention involve similar neural processes.

    Science.gov (United States)

    Ruff, Christian C; Kristjánsson, Arni; Driver, Jon

    2007-10-01

    Iconic memory and spatial attention are often considered separately, but they may have functional similarities. Here we provide functional magnetic resonance imaging evidence for some common underlying neural effects. Subjects judged three visual stimuli in one hemifield of a bilateral array comprising six stimuli. The relevant hemifield for partial report was indicated by an auditory cue, administered either before the visual array (precue, spatial attention) or shortly after the array (postcue, iconic memory). Pre- and postcues led to similar activity modulations in lateral occipital cortex contralateral to the cued side. This finding indicates that readout from iconic memory can have some neural effects similar to those of spatial attention. We also found common bilateral activation of a fronto-parietal network for postcue and precue trials. These neuroimaging data suggest that some common neural mechanisms underlie selective spatial attention and readout from iconic memory. Some differences were also found; compared with precues, postcues led to higher activity in the right middle frontal gyrus.

  18. Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses.

    Science.gov (United States)

    Molloy, Katharine; Griffiths, Timothy D; Chait, Maria; Lavie, Nilli

    2015-12-09

    Due to capacity limits on perception, conditions of high perceptual load lead to reduced processing of unattended stimuli (Lavie et al., 2014). Accumulating work demonstrates the effects of visual perceptual load on visual cortex responses, but the effects on auditory processing remain poorly understood. Here we establish the neural mechanisms underlying "inattentional deafness"--the failure to perceive auditory stimuli under high visual perceptual load. Participants performed a visual search task of low (target dissimilar to nontarget items) or high (target similar to nontarget items) load. On a random subset (50%) of trials, irrelevant tones were presented concurrently with the visual stimuli. Brain activity was recorded with magnetoencephalography, and time-locked responses to the visual search array and to the incidental presence of unattended tones were assessed. High, compared to low, perceptual load led to increased early visual evoked responses (within 100 ms from onset). This was accompanied by reduced early (∼ 100 ms from tone onset) auditory evoked activity in superior temporal sulcus and posterior middle temporal gyrus. A later suppression of the P3 "awareness" response to the tones was also observed under high load. A behavioral experiment revealed reduced tone detection sensitivity under high visual load, indicating that the reduction in neural responses was indeed associated with reduced awareness of the sounds. These findings support a neural account of shared audiovisual resources, which, when depleted under load, leads to failures of sensory perception and awareness. The present work clarifies the neural underpinning of inattentional deafness under high visual load. The findings of near-simultaneous load effects on both visual and auditory evoked responses suggest shared audiovisual processing capacity. Temporary depletion of shared capacity in perceptually demanding visual tasks leads to a momentary reduction in sensory processing of auditory
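
    The behavioral experiment's "tone detection sensitivity" is conventionally quantified as d'. A minimal sketch, assuming a standard log-linear correction to keep hit and false-alarm rates away from 0 and 1 (the paper does not specify its exact procedure):

    ```python
    from statistics import NormalDist

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Detection sensitivity d' from raw trial counts.

        Uses a log-linear correction (add 0.5 to each count's numerator,
        1 to its denominator) so rates of exactly 0 or 1 cannot occur.
        """
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        z = NormalDist().inv_cdf  # inverse standard-normal CDF
        return z(hit_rate) - z(fa_rate)
    ```

    A drop in d' under high visual load, with criterion held constant, is what distinguishes genuinely reduced sensory awareness from a mere shift in response bias.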

  19. Competition and convergence between auditory and cross-modal visual inputs to primary auditory cortical areas

    Science.gov (United States)

    Mao, Yu-Ting; Hua, Tian-Miao

    2011-01-01

    Sensory neocortex is capable of considerable plasticity after sensory deprivation or damage to input pathways, especially early in development. Although plasticity can often be restorative, sometimes novel, ectopic inputs invade the affected cortical area. Invading inputs from other sensory modalities may compromise the original function or even take over, imposing a new function and preventing recovery. Using ferrets whose retinal axons were rerouted into auditory thalamus at birth, we were able to examine the effect of varying the degree of ectopic, cross-modal input on reorganization of developing auditory cortex. In particular, we assayed whether the invading visual inputs and the existing auditory inputs competed for or shared postsynaptic targets and whether the convergence of input modalities would induce multisensory processing. We demonstrate that although the cross-modal inputs create new visual neurons in auditory cortex, some auditory processing remains. The degree of damage to auditory input to the medial geniculate nucleus was directly related to the proportion of visual neurons in auditory cortex, suggesting that the visual and residual auditory inputs compete for cortical territory. Visual neurons were not segregated from auditory neurons but shared target space even on individual target cells, substantially increasing the proportion of multisensory neurons. Thus spatial convergence of visual and auditory input modalities may be sufficient to expand multisensory representations. Together these findings argue that early, patterned visual activity does not drive segregation of visual and auditory afferents and suggest that auditory function might be compromised by converging visual inputs. These results indicate possible ways in which multisensory cortical areas may form during development and evolution. They also suggest that rehabilitative strategies designed to promote recovery of function after sensory deprivation or damage need to take into

  20. Contralateral white noise selectively changes left human auditory cortex activity in a lexical decision task.

    Science.gov (United States)

    Behne, Nicole; Wendt, Beate; Scheich, Henning; Brechmann, André

    2006-04-01

    In a previous study, we hypothesized that the approach of presenting information-bearing stimuli to one ear and noise to the other ear may be a general strategy to determine hemispheric specialization in auditory cortex (AC). In that study, we confirmed the dominant role of the right AC in directional categorization of frequency modulations by showing that fMRI activation of right but not left AC was sharply emphasized when masking noise was presented to the contralateral ear. Here, we tested this hypothesis using a lexical decision task supposed to be mainly processed in the left hemisphere. Subjects had to distinguish between pseudowords and natural words presented monaurally to the left or right ear either with or without white noise to the other ear. According to our hypothesis, we expected a strong effect of contralateral noise on fMRI activity in left AC. For the control conditions without noise, we found that activation in both auditory cortices was stronger on contralateral than on ipsilateral word stimulation consistent with a more influential contralateral than ipsilateral auditory pathway. Additional presentation of contralateral noise did not significantly change activation in right AC, whereas it led to a significant increase of activation in left AC compared with the condition without noise. This is consistent with a left hemispheric specialization for lexical decisions. Thus our results support the hypothesis that activation by ipsilateral information-bearing stimuli is upregulated mainly in the hemisphere specialized for a given task when noise is presented to the more influential contralateral ear.

  1. Perceptual Plasticity for Auditory Object Recognition

    Science.gov (United States)

    Heald, Shannon L. M.; Van Hedger, Stephen C.; Nusbaum, Howard C.

    2017-01-01

    In our auditory environment, we rarely experience the exact acoustic waveform twice. This is especially true for communicative signals that have meaning for listeners. In speech and music, the acoustic signal changes as a function of the talker (or instrument), speaking (or playing) rate, and room acoustics, to name a few factors. Yet, despite this acoustic variability, we are able to recognize a sentence or melody as the same across various kinds of acoustic inputs and determine meaning based on listening goals, expectations, context, and experience. The recognition process relates acoustic signals to prior experience despite variability in signal-relevant and signal-irrelevant acoustic properties, some of which could be considered as “noise” in service of a recognition goal. However, some acoustic variability, if systematic, is lawful and can be exploited by listeners to aid in recognition. Perceivable changes in systematic variability can herald a need for listeners to reorganize perception and reorient their attention to more immediately signal-relevant cues. This view is not incorporated currently in many extant theories of auditory perception, which traditionally reduce psychological or neural representations of perceptual objects and the processes that act on them to static entities. While this reduction is likely done for the sake of empirical tractability, such a reduction may seriously distort the perceptual process to be modeled. We argue that perceptual representations, as well as the processes underlying perception, are dynamically determined by an interaction between the uncertainty of the auditory signal and constraints of context. This suggests that the process of auditory recognition is highly context-dependent in that the identity of a given auditory object may be intrinsically tied to its preceding context. To argue for the flexible neural and psychological updating of sound-to-meaning mappings across speech and music, we draw upon examples

  2. Missing a trick: Auditory load modulates conscious awareness in audition.

    Science.gov (United States)

    Fairnie, Jake; Moore, Brian C J; Remington, Anna

    2016-07-01

    In the visual domain there is considerable evidence supporting the Load Theory of Attention and Cognitive Control, which holds that conscious perception of background stimuli depends on the level of perceptual load involved in a primary task. However, literature on the applicability of this theory to the auditory domain is limited and, in many cases, inconsistent. Here we present a novel "auditory search task" that allows systematic investigation of the impact of auditory load on auditory conscious perception. An array of simultaneous, spatially separated sounds was presented to participants. On half the trials, a critical stimulus was presented concurrently with the array. Participants were asked to detect which of 2 possible targets was present in the array (primary task), and whether the critical stimulus was present or absent (secondary task). Increasing the auditory load of the primary task (raising the number of sounds in the array) consistently reduced the ability to detect the critical stimulus. This indicates that, at least in certain situations, load theory applies in the auditory domain. The implications of this finding are discussed both with respect to our understanding of typical audition and for populations with altered auditory processing. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
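
    The trial structure described above (array size manipulating auditory load, critical stimulus present on half of trials) can be sketched as follows; the trial counts, set sizes, and counterbalancing scheme are illustrative assumptions, not the authors' design parameters:

    ```python
    import random

    def make_trials(n_trials=80, set_sizes=(2, 4, 6), seed=1):
        """Trial list for a hypothetical auditory search task.

        Each trial pairs an array size (the load manipulation) with
        presence/absence of the critical stimulus on exactly half
        of the trials, then shuffles the order.
        """
        rng = random.Random(seed)
        trials = [{"set_size": rng.choice(set_sizes),
                   "critical_present": i % 2 == 0}
                  for i in range(n_trials)]
        rng.shuffle(trials)
        return trials
    ```

    Comparing critical-stimulus detection rates across `set_size` levels is then the load contrast of interest.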

  3. Integration and segregation in auditory scene analysis

    Science.gov (United States)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements into perceptual units takes place on the already-formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  4. Absence of both auditory evoked potentials and auditory percepts dependent on timing cues.

    Science.gov (United States)

    Starr, A; McPherson, D; Patterson, J; Don, M; Luxford, W; Shannon, R; Sininger, Y; Tonakawa, L; Waring, M

    1991-06-01

    An 11-yr-old girl had an absence of sensory components of auditory evoked potentials (brainstem, middle and long-latency) to click and tone burst stimuli that she could clearly hear. Psychoacoustic tests revealed a marked impairment of those auditory perceptions dependent on temporal cues, that is, lateralization of binaural clicks, change of binaural masked threshold with changes in signal phase, binaural beats, detection of paired monaural clicks, monaural detection of a silent gap in a sound, and monaural threshold elevation for short duration tones. In contrast, auditory functions reflecting intensity or frequency discriminations (difference limens) were only minimally impaired. Pure tone audiometry showed a moderate (50 dB) bilateral hearing loss with a disproportionate severe loss of word intelligibility. Those auditory evoked potentials that were preserved included (1) cochlear microphonics reflecting hair cell activity; (2) cortical sustained potentials reflecting processing of slowly changing signals; and (3) long-latency cognitive components (P300, processing negativity) reflecting endogenous auditory cognitive processes. Both the evoked potential and perceptual deficits are attributed to changes in temporal encoding of acoustic signals perhaps occurring at the synapse between hair cell and eighth nerve dendrites. The results from this patient are discussed in relation to previously published cases with absent auditory evoked potentials and preserved hearing.

  5. Rhythmic Haptic Stimuli Improve Short-Term Attention.

    Science.gov (United States)

    Zhang, Shusheng; Wang, Dangxiao; Afzal, Naqash; Zhang, Yuru; Wu, Ruilin

    2016-01-01

    Brainwave entrainment using rhythmic visual and/or auditory stimulation has shown its efficacy in modulating neural activities and cognitive ability. In the present study, we aimed to investigate whether rhythmic haptic stimulation could enhance short-term attention. An experiment with a sensorimotor rhythm (SMR) increasing protocol was performed in which participants were presented with a 15-Hz sinusoidal vibrotactile stimulus on their palm. The Test of Variables of Attention (T.O.V.A.) was performed before and after the stimulating session. Electroencephalography (EEG) was recorded across the stimulating session and the two attention test sessions. SMR band power manifested a significant increase after stimulation. Results of the T.O.V.A. tests indicated an improvement in the attention of participants who had received the stimulation compared to the control group who had not. The D prime score of the T.O.V.A. revealed that participants performed better in perceptual sensitivity and sustained attention compared to their baseline performance before the stimulating session. These findings highlight the potential value of using haptics-based brainwave entrainment for cognitive training.
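
    The D prime (d′) score reported from the T.O.V.A. is, in signal-detection terms, the difference between the z-transformed hit rate and false-alarm rate. A minimal sketch of that computation using only the Python standard library (the rates below are made-up, not values from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Sensitivity index: d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical observer: 90% hits, 20% false alarms
print(round(d_prime(0.90, 0.20), 3))  # → 2.123
```

    In practice, rates of exactly 0 or 1 are first nudged away from the extremes, since the inverse CDF is undefined there.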

  6. Music-induced cortical plasticity and lateral inhibition in the human auditory cortex as foundations for tonal tinnitus treatment.

    Science.gov (United States)

    Pantev, Christo; Okamoto, Hidehiko; Teismann, Henning

    2012-01-01

    Over the past 15 years, we have studied plasticity in the human auditory cortex by means of magnetoencephalography (MEG). Two main topics nurtured our curiosity: the effects of musical training on plasticity in the auditory system, and the effects of lateral inhibition. One of our plasticity studies found that listening to notched music for 3 h inhibited the neuronal activity in the auditory cortex that corresponded to the center-frequency of the notch, suggesting suppression of neural activity by lateral inhibition. Subsequent research on this topic found that suppression was notably dependent upon the notch width employed, that the lower notch-edge induced stronger attenuation of neural activity than the higher notch-edge, and that focused auditory attention strengthened the inhibitory networks. Crucially, the overall effects of lateral inhibition on human auditory cortical activity were stronger than the habituation effects. Based on these results we developed a novel treatment strategy for tonal tinnitus: tailor-made notched music training (TMNMT). By notching the music energy spectrum around the individual tinnitus frequency, we intended to attract lateral inhibition to auditory neurons involved in tinnitus perception. So far, the training strategy has been evaluated in two studies. The results of the initial long-term controlled study (12 months) supported the validity of the treatment concept: subjective tinnitus loudness and annoyance were significantly reduced after TMNMT, but not when notching spared the tinnitus frequencies. Correspondingly, tinnitus-related auditory evoked fields (AEFs) were significantly reduced after training. The subsequent short-term (5 days) training study indicated that training was more effective for tinnitus frequencies ≤ 8 kHz than for tinnitus frequencies > 8 kHz, and that training should be employed over the long term in order to induce more persistent effects. Further development and evaluation of TMNMT therapy
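
    The "notching" described here removes a band of spectral energy around the individual tinnitus frequency before the music is played back. A naive frequency-domain sketch of that idea (illustrative only: the frequencies and notch width are made up, and this is not the actual TMNMT signal chain):

```python
import numpy as np

def notch(signal: np.ndarray, fs: float,
          center_hz: float, width_hz: float) -> np.ndarray:
    """Zero all spectral energy within width_hz around center_hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    spectrum[np.abs(freqs - center_hz) <= width_hz / 2] = 0.0
    return np.fft.irfft(spectrum, n=signal.size)

# One second of a 4 kHz + 6 kHz mixture; notch out the 6 kHz component
fs = 44100
t = np.arange(fs) / fs
mix = np.sin(2 * np.pi * 4000 * t) + np.sin(2 * np.pi * 6000 * t)
cleaned = notch(mix, fs, center_hz=6000, width_hz=1000)
```

    A real implementation would apply a proper band-stop filter to streaming audio; the FFT round trip here just makes the spectral effect explicit.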

  7. Spiking in auditory cortex following thalamic stimulation is dominated by cortical network activity

    Science.gov (United States)

    Krause, Bryan M.; Raz, Aeyal; Uhlrich, Daniel J.; Smith, Philip H.; Banks, Matthew I.

    2014-01-01

    The state of the sensory cortical network can have a profound impact on neural responses and perception. In rodent auditory cortex, sensory responses are reported to occur in the context of network events, similar to brief UP states, that produce “packets” of spikes and are associated with synchronized synaptic input (Bathellier et al., 2012; Hromadka et al., 2013; Luczak et al., 2013). However, traditional models based on data from visual and somatosensory cortex predict that ascending sensory thalamocortical (TC) pathways sequentially activate cells in layers 4 (L4), L2/3, and L5. The relationship between these two spatio-temporal activity patterns is unclear. Here, we used calcium imaging and electrophysiological recordings in murine auditory TC brain slices to investigate the laminar response pattern to stimulation of TC afferents. We show that although monosynaptically driven spiking in response to TC afferents occurs, the vast majority of spikes fired following TC stimulation occurs during brief UP states and outside the context of the L4>L2/3>L5 activation sequence. Specifically, monosynaptic subthreshold TC responses with similar latencies were observed throughout layers 2–6, presumably via synapses onto dendritic processes located in L3 and L4. However, monosynaptic spiking was rare, and occurred primarily in L4 and L5 non-pyramidal cells. By contrast, during brief, TC-induced UP states, spiking was dense and occurred primarily in pyramidal cells. These network events always involved infragranular layers, whereas involvement of supragranular layers was variable. During UP states, spike latencies were comparable between infragranular and supragranular cells. These data are consistent with a model in which activation of auditory cortex, especially supragranular layers, depends on internally generated network events that represent a non-linear amplification process, are initiated by infragranular cells and tightly regulated by feed-forward inhibitory

  8. Cross-modal processing in auditory and visual working memory.

    Science.gov (United States)

    Suchan, Boris; Linnewerth, Britta; Köster, Odo; Daum, Irene; Schmid, Gebhard

    2006-02-01

    This study aimed to further explore processing of auditory and visual stimuli in working memory. Smith and Jonides (1997) [Smith, E.E., Jonides, J., 1997. Working memory: A view from neuroimaging. Cogn. Psychol. 33, 5-42] described a modified working memory model in which visual input is automatically transformed into a phonological code. To study this process, auditory and the corresponding visual stimuli were presented in a variant of the 2-back task which involved changes from the auditory to the visual modality and vice versa. Brain activation patterns underlying visual and auditory processing as well as transformation mechanisms were analyzed. Results yielded a significant activation in the left primary auditory cortex associated with transformation of visual into auditory information which reflects the matching and recoding of a stored item and its modality. This finding yields empirical evidence for a transformation of visual input into a phonological code, with the auditory cortex as the neural correlate of the recoding process in working memory.

  9. Comparing Auditory Noise Treatment with Stimulant Medication on Cognitive Task Performance in Children with Attention Deficit Hyperactivity Disorder: Results from a Pilot Study.

    Science.gov (United States)

    Söderlund, Göran B W; Björk, Christer; Gustafsson, Peik

    2016-01-01

    Recent research has shown that acoustic white noise (80 dB) can improve task performance in people with attention deficits and/or Attention Deficit Hyperactivity Disorder (ADHD). This is attributed to the phenomenon of stochastic resonance, in which a certain amount of noise can improve performance in a brain that is not working at its optimum. We compare here the effect of noise exposure with the effect of stimulant medication on cognitive task performance in ADHD. The aim of the present study was to compare the effects of auditory noise exposure with stimulant medication for ADHD children on a cognitive test battery. A group of typically developed children (TDC) took the same tests as a comparison. Twenty children with ADHD of combined or inattentive subtypes and twenty TDC matched for age and gender performed three different tests (word recall, spanboard and n-back task) during exposure to white noise (80 dB) and in a silent condition. The ADHD children were tested with and without central stimulant medication. In the spanboard and word recall tasks, but not in the 2-back task, white noise exposure led to significant improvements for both non-medicated and medicated ADHD children. No significant effects of medication were found on any of the three tasks. This pilot study shows that exposure to white noise resulted in a task improvement that was larger than the one with stimulant medication, thus opening up the possibility of using auditory noise as an alternative, non-pharmacological treatment of cognitive ADHD symptoms.
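
    The stochastic resonance account invoked above holds that a moderate amount of added noise can push an otherwise subthreshold signal over a detection threshold, whereas without noise the signal goes undetected. A toy threshold-detector sketch of this effect (all numbers are illustrative, not parameters from the study):

```python
import random

def detection_rate(signal: float, threshold: float = 1.0,
                   noise_sd: float = 0.0, trials: int = 2000,
                   seed: int = 42) -> float:
    """Fraction of trials on which signal + Gaussian noise crosses threshold."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials)
               if signal + rng.gauss(0.0, noise_sd) >= threshold)
    return hits / trials

# A 0.8-unit signal never crosses a 1.0 threshold without noise...
print(detection_rate(0.8, noise_sd=0.0))  # → 0.0
# ...but moderate noise carries it over threshold on a sizable fraction
# of trials (the "resonance" benefit; far more noise would eventually
# swamp the signal with false alarms).
print(detection_rate(0.8, noise_sd=0.3))
```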

  10. Comparing Auditory Noise Treatment with Stimulant Medication on Cognitive Task Performance in Children with Attention Deficit Hyperactivity Disorder: Results from a Pilot Study

    Directory of Open Access Journals (Sweden)

    Göran B W Söderlund

    2016-09-01

    Full Text Available Background: Recent research has shown that acoustic white noise (80 dB) can improve task performance in people with attention deficits and/or Attention Deficit Hyperactivity Disorder (ADHD). This is attributed to the phenomenon of stochastic resonance, in which a certain amount of noise can improve performance in a brain that is not working at its optimum. We compare here the effect of noise exposure with the effect of stimulant medication on cognitive task performance in ADHD. The aim of the present study was to compare the effects of auditory noise exposure with stimulant medication for ADHD children on a cognitive test battery. A group of typically developed children (TDC) took the same tests as a comparison. Methods: Twenty children with ADHD of combined or inattentive subtypes and twenty typically developed children matched for age and gender performed three different tests (word recall, spanboard and n-back task) during exposure to white noise (80 dB) and in a silent condition. The ADHD children were tested with and without central stimulant medication. Results: In the spanboard and word recall tasks, but not in the 2-back task, white noise exposure led to significant improvements for both non-medicated and medicated ADHD children. No significant effects of medication were found on any of the three tasks. Conclusion: This pilot study shows that exposure to white noise resulted in a task improvement that was larger than the one with stimulant medication, thus opening up the possibility of using auditory noise as an alternative, non-pharmacological treatment of cognitive ADHD symptoms.

  11. Nonspatial intermodal selective attention is mediated by sensory brain areas: Evidence from event-related potentials.

    NARCIS (Netherlands)

    Talsma, D.; Kok, A.

    2001-01-01

    Focuses on the question of whether inter-and intramodal forms of attention are reflected in activation of the same or different brain areas. ERPs were recorded while Ss (aged 18-41 yrs) were presented a random sequence of visual and auditory stimuli. They were instructed to attend to nonspatial

  12. Bilateral Changes of Spontaneous Activity Within the Central Auditory Pathway Upon Chronic Unilateral Intracochlear Electrical Stimulation.

    Science.gov (United States)

    Basta, Dietmar; Götze, Romy; Gröschel, Moritz; Jansen, Sebastian; Janke, Oliver; Tzschentke, Barbara; Boyle, Patrick; Ernst, Arne

    2015-12-01

    In recent years, cochlear implants have been applied successfully for the treatment of unilateral hearing loss, with quite surprising benefit. One reason for this successful treatment, including the relief from tinnitus, could be the normalization of spontaneous activity in the central auditory pathway because of the electrical stimulation. The present study, therefore, investigated at a cellular level the effect of a unilateral chronic intracochlear stimulation on key structures of the central auditory pathway. Normal-hearing guinea pigs were mechanically single-side deafened by inserting a standard HiFocus1j electrode array (on a HiRes 90k cochlear implant) into the first turn of the cochlea. Four to five electrode contacts could be used for the stimulation. Six weeks after surgery, the speech processor (Auria) was fitted, based on tNRI values, and mounted on the animal's back. The two experimental groups were stimulated 16 hours per day for 90 days, using a HiRes strategy based on different stimulation rates (low rate (275 pps/ch), high rate (5000 pps/ch)). The results were compared with those of unilaterally deafened controls (implanted but not stimulated), as well as between the treatment groups. All animals experienced a standardized free field auditory environment. The low-rate group showed a significantly lower average spontaneous activity bilaterally in the dorsal cochlear nucleus and the medial geniculate body than the controls. However, there was no difference in the inferior colliculus and the primary auditory cortex. Spontaneous activity of the high-rate group was also reduced bilaterally in the dorsal cochlear nucleus and in the primary auditory cortex. No differences could be observed between the high-rate group and the controls in the contra-lateral inferior colliculus and medial geniculate body. The high-rate group showed bilaterally a higher activity in the CN and the MGB compared with the low-rate group, whereas in the IC and in the

  13. Visual selective attention in amnestic mild cognitive impairment.

    Science.gov (United States)

    McLaughlin, Paula M; Anderson, Nicole D; Rich, Jill B; Chertkow, Howard; Murtha, Susan J E

    2014-11-01

    Subtle deficits in visual selective attention have been found in amnestic mild cognitive impairment (aMCI). However, few studies have explored performance on visual search paradigms or the Simon task, which are known to be sensitive to disease severity in Alzheimer's patients. Furthermore, there is limited research investigating how deficiencies can be ameliorated with exogenous support (auditory cues). Sixteen individuals with aMCI and 14 control participants completed 3 experimental tasks that varied in demand and cue availability: visual search-alerting, visual search-orienting, and Simon task. Visual selective attention was influenced by aMCI, auditory cues, and task characteristics. Visual search abilities were relatively consistent across groups. The aMCI participants were impaired on the Simon task when working memory was required, but conflict resolution was similar to controls. Spatially informative orienting cues improved response times, whereas spatially neutral alerting cues did not influence performance. Finally, spatially informative auditory cues benefited the aMCI group more than controls in the visual search task, specifically at the largest array size where orienting demands were greatest. These findings suggest that individuals with aMCI have working memory deficits and subtle deficiencies in orienting attention and rely on exogenous information to guide attention. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs

    Science.gov (United States)

    Mousikou, Petroula; Mahajan, Yatin; de Lissa, Peter; Thie, Johnson; McArthur, Genevieve

    2013-01-01

    Background. Auditory event-related potentials (ERPs) have proved useful in investigating the role of auditory processing in cognitive disorders such as developmental dyslexia, specific language impairment (SLI), attention deficit hyperactivity disorder (ADHD), schizophrenia, and autism. However, laboratory recordings of auditory ERPs can be lengthy, uncomfortable, or threatening for some participants – particularly children. Recently, a commercial gaming electroencephalography (EEG) system has been developed that is portable, inexpensive, and easy to set up. In this study we tested if auditory ERPs measured using a gaming EEG system (Emotiv EPOC®, www.emotiv.com) were equivalent to those measured by a widely-used, laboratory-based, research EEG system (Neuroscan). Methods. We simultaneously recorded EEGs with the research and gaming EEG systems, whilst presenting 21 adults with 566 standard (1000 Hz) and 100 deviant (1200 Hz) tones under passive (non-attended) and active (attended) conditions. The onset of each tone was marked in the EEGs using a parallel port pulse (Neuroscan) or a stimulus-generated electrical pulse injected into the O1 and O2 channels (Emotiv EPOC®). These markers were used to calculate research and gaming EEG system late auditory ERPs (P1, N1, P2, N2, and P3 peaks) and the mismatch negativity (MMN) in active and passive listening conditions for each participant. Results. Analyses were restricted to frontal sites as these are most commonly reported in auditory ERP research. Intra-class correlations (ICCs) indicated that the morphology of the research and gaming EEG system late auditory ERP waveforms were similar across all participants, but that the research and gaming EEG system MMN waveforms were only similar for participants with non-noisy MMN waveforms (N = 11 out of 21). Peak amplitude and latency measures revealed no significant differences between the size or the timing of the auditory P1, N1, P2, N2, P3, and MMN peaks. Conclusions

  15. Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs

    Directory of Open Access Journals (Sweden)

    Nicholas A. Badcock

    2013-02-01

    Full Text Available Background. Auditory event-related potentials (ERPs) have proved useful in investigating the role of auditory processing in cognitive disorders such as developmental dyslexia, specific language impairment (SLI), attention deficit hyperactivity disorder (ADHD), schizophrenia, and autism. However, laboratory recordings of auditory ERPs can be lengthy, uncomfortable, or threatening for some participants – particularly children. Recently, a commercial gaming electroencephalography (EEG) system has been developed that is portable, inexpensive, and easy to set up. In this study we tested if auditory ERPs measured using a gaming EEG system (Emotiv EPOC®, www.emotiv.com) were equivalent to those measured by a widely-used, laboratory-based, research EEG system (Neuroscan). Methods. We simultaneously recorded EEGs with the research and gaming EEG systems, whilst presenting 21 adults with 566 standard (1000 Hz) and 100 deviant (1200 Hz) tones under passive (non-attended) and active (attended) conditions. The onset of each tone was marked in the EEGs using a parallel port pulse (Neuroscan) or a stimulus-generated electrical pulse injected into the O1 and O2 channels (Emotiv EPOC®). These markers were used to calculate research and gaming EEG system late auditory ERPs (P1, N1, P2, N2, and P3 peaks) and the mismatch negativity (MMN) in active and passive listening conditions for each participant. Results. Analyses were restricted to frontal sites as these are most commonly reported in auditory ERP research. Intra-class correlations (ICCs) indicated that the morphology of the research and gaming EEG system late auditory ERP waveforms were similar across all participants, but that the research and gaming EEG system MMN waveforms were only similar for participants with non-noisy MMN waveforms (N = 11 out of 21). Peak amplitude and latency measures revealed no significant differences between the size or the timing of the auditory P1, N1, P2, N2, P3, and MMN peaks

  16. Attention to affective audio-visual information: Comparison between musicians and non-musicians

    NARCIS (Netherlands)

    Weijkamp, J.; Sadakata, M.

    2017-01-01

    Individuals with more musical training repeatedly demonstrate enhanced auditory perception abilities. The current study examined how these enhanced auditory skills interact with attention to affective audio-visual stimuli. A total of 16 participants with more than 5 years of musical training

  17. The neural correlates of coloured music: a functional MRI investigation of auditory-visual synaesthesia.

    Science.gov (United States)

    Neufeld, J; Sinke, C; Dillo, W; Emrich, H M; Szycik, G R; Dima, D; Bleich, S; Zedler, M

    2012-01-01

    In auditory-visual synaesthesia, all kinds of sound can induce additional visual experiences. To identify the brain regions mainly involved in this form of synaesthesia, functional magnetic resonance imaging (fMRI) has been used during non-linguistic sound perception (chords and pure tones) in synaesthetes and non-synaesthetes. Synaesthetes showed increased activation in the left inferior parietal cortex (IPC), an area involved in multimodal integration, feature binding and attention guidance. No significant group-differences could be detected in area V4, which is known to be related to colour vision and form processing. The results support the idea of the parietal cortex acting as sensory nexus area in auditory-visual synaesthesia, and as a common neural correlate for different types of synaesthesia. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Data on the effect of conductive hearing loss on auditory and visual cortex activity revealed by intrinsic signal imaging.

    Science.gov (United States)

    Teichert, Manuel; Bolz, Jürgen

    2017-10-01

    This data article provides additional data related to the research article entitled "Simultaneous intrinsic signal imaging of auditory and visual cortex reveals profound effects of acute hearing loss on visual processing" (Teichert and Bolz, 2017) [1]. The primary auditory and visual cortex (A1 and V1) of adult male C57BL/6J mice (P120-P240) were mapped simultaneously using intrinsic signal imaging (Kalatsky and Stryker, 2003) [2]. A1 and V1 activity evoked by combined auditory and visual stimulation were measured before and after conductive hearing loss (CHL) induced by bilateral malleus removal. We provide data showing that A1 responsiveness evoked by sounds of different sound pressure levels (SPL) decreased after CHL whereas visually evoked V1 activity increased after this intervention. In addition, we also provide imaging data on percentage of V1 activity increases after CHL compared to pre-CHL.

  19. Activity as a Mediator Between Users and Their Auditory Environment in an Urban Pocket Park: A Case Study of Parc du Portugal (Montreal, Canada)

    NARCIS (Netherlands)

    Bild, E.; Steele, D.; Pfeffer, K.; Bertolini, L.; Guastavino, C.; Aletta, F.; Xao, J.

    2018-01-01

    Sound is receiving increasing attention in urban planning and design due to its effects on human health and quality of life. Soundscape researchers have sought ecologically valid measures to describe and explain the complex relationship between people and their auditory environments, largely

  20. Blast-Induced Tinnitus and Elevated Central Auditory and Limbic Activity in Rats: A Manganese-Enhanced MRI and Behavioral Study.

    Science.gov (United States)

    Ouyang, Jessica; Pace, Edward; Lepczyk, Laura; Kaufman, Michael; Zhang, Jessica; Perrine, Shane A; Zhang, Jinsheng

    2017-07-07

    Blast-induced tinnitus is the number one service-connected disability that currently affects military personnel and veterans. To elucidate its underlying mechanisms, we subjected 13 Sprague Dawley adult rats to unilateral 14 psi blast exposure to induce tinnitus and measured auditory and limbic brain activity using manganese-enhanced MRI (MEMRI). Tinnitus was evaluated with a gap detection acoustic startle reflex paradigm, while hearing status was assessed with prepulse inhibition (PPI) and auditory brainstem responses (ABRs). Both anxiety and cognitive functioning were assessed using elevated plus maze and Morris water maze, respectively. Five weeks after blast exposure, 8 of the 13 blasted rats exhibited chronic tinnitus. While acoustic PPI remained intact and ABR thresholds recovered, the ABR wave P1-N1 amplitude reduction persisted in all blast-exposed rats. No differences in spatial cognition were observed, but blasted rats as a whole exhibited increased anxiety. MEMRI data revealed a bilateral increase in activity along the auditory pathway and in certain limbic regions of rats with tinnitus compared to age-matched controls. Taken together, our data suggest that while blast-induced tinnitus may play a role in auditory and limbic hyperactivity, the non-auditory effects of blast and potential traumatic brain injury may also exert an effect.

  1. Time-resolved neuroimaging of visual short term memory consolidation by post-perceptual attention shifts.

    Science.gov (United States)

    Hecht, Marcus; Thiemann, Ulf; Freitag, Christine M; Bender, Stephan

    2016-01-15

    Post-perceptual cues can enhance visual short term memory encoding even after the offset of the visual stimulus. However, both the mechanisms by which the sensory stimulus characteristics are buffered as well as the mechanisms by which post-perceptual selective attention enhances short term memory encoding remain unclear. We analyzed late post-perceptual event-related potentials (ERPs) in visual change detection tasks (100 ms stimulus duration) by high-resolution ERP analysis to elucidate these mechanisms. The effects of early and late auditory post-cues (300 ms or 850 ms after visual stimulus onset) as well as the effects of a visual interference stimulus were examined in 27 healthy right-handed adults. Focusing attention with post-perceptual cues at both latencies significantly improved memory performance, i.e. sensory stimulus characteristics were available for up to 850 ms after stimulus presentation. Passive watching of the visual stimuli without auditory cue presentation evoked a slow negative wave (N700) over occipito-temporal visual areas. N700 was strongly reduced by a visual interference stimulus which impeded memory maintenance. In contrast, contralateral delay activity (CDA) still developed in this condition after the application of auditory post-cues and was thereby dissociated from N700. CDA and N700 seem to represent two different processes involved in short term memory encoding. While N700 could reflect visual post processing by automatic attention attraction, CDA may reflect the top-down process of searching selectively for the required information through post-perceptual attention. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Auditory agnosia.

    Science.gov (United States)

    Slevc, L Robert; Shell, Alison R

    2015-01-01

    Auditory agnosia refers to impairments in sound perception and identification despite intact hearing, cognitive functioning, and language abilities (reading, writing, and speaking). Auditory agnosia can be general, affecting all types of sound perception, or can be (relatively) specific to a particular domain. Verbal auditory agnosia (also known as (pure) word deafness) refers to deficits specific to speech processing, environmental sound agnosia refers to difficulties confined to non-speech environmental sounds, and amusia refers to deficits confined to music. These deficits can be apperceptive, affecting basic perceptual processes, or associative, affecting the relation of a perceived auditory object to its meaning. This chapter discusses what is known about the behavioral symptoms and lesion correlates of these different types of auditory agnosia (focusing especially on verbal auditory agnosia), evidence for the role of a rapid temporal processing deficit in some aspects of auditory agnosia, and the few attempts to treat the perceptual deficits associated with auditory agnosia. A clear picture of auditory agnosia has been slow to emerge, hampered by the considerable heterogeneity in behavioral deficits, associated brain damage, and variable assessments across cases. Despite this lack of clarity, these striking deficits in complex sound processing continue to inform our understanding of auditory perception and cognition. © 2015 Elsevier B.V. All rights reserved.

  3. Spatial Attention and Audiovisual Interactions in Apparent Motion

    Science.gov (United States)

    Sanabria, Daniel; Soto-Faraco, Salvador; Spence, Charles

    2007-01-01

    In this study, the authors combined the cross-modal dynamic capture task (involving the horizontal apparent movement of visual and auditory stimuli) with spatial cuing in the vertical dimension to investigate the role of spatial attention in cross-modal interactions during motion perception. Spatial attention was manipulated endogenously, either…

  4. Influences of multiple memory systems on auditory mental image acuity.

    Science.gov (United States)

    Navarro Cebrian, Ana; Janata, Petr

    2010-05-01

    The influence of different memory systems and associated attentional processes on the acuity of auditory images, formed for the purpose of making intonation judgments, was examined across three experiments using three different task types (cued-attention, imagery, and two-tone discrimination). In experiment 1 the influence of implicit long-term memory for musical scale structure was manipulated by varying the scale degree (leading tone versus tonic) of the probe note about which a judgment had to be made. In experiments 2 and 3 the ability of short-term absolute pitch knowledge to develop was manipulated by presenting blocks of trials in the same key or in seven different keys. The acuity of auditory images depended on all of these manipulations. Within individual listeners, thresholds in the two-tone discrimination and cued-attention conditions were closely related. In many listeners, cued-attention thresholds were similar to thresholds in the imagery condition, and depended on the amount of training individual listeners had in playing a musical instrument. The results indicate that mental images formed at a sensory/cognitive interface for the purpose of making perceptual decisions are highly malleable.

  5. Brain networks underlying mental imagery of auditory and visual information.

    Science.gov (United States)

    Zvyagintsev, Mikhail; Clemens, Benjamin; Chechko, Natalya; Mathiak, Krystyna A; Sack, Alexander T; Mathiak, Klaus

    2013-05-01

    Mental imagery is a complex cognitive process that resembles the experience of perceiving an object when that object is not physically present to the senses. It has been shown that, depending on the sensory nature of the object, mental imagery also involves corresponding sensory neural mechanisms. However, it remains unclear which areas of the brain subserve supramodal imagery processes that are independent of the object modality, and which brain areas are involved in modality-specific imagery processes. Here, we conducted a functional magnetic resonance imaging study to reveal supramodal and modality-specific networks of mental imagery for auditory and visual information. A common supramodal brain network independent of imagery modality, two separate modality-specific networks for imagery of auditory and visual information, and a common deactivation network were identified. The supramodal network included brain areas related to attention, memory retrieval, motor preparation, and semantic processing, as well as areas considered to be part of the default-mode network and multisensory integration areas. The modality-specific networks comprised brain areas involved in processing the respective modality-specific sensory information. Interestingly, we found that imagery of auditory information led to a relative deactivation within the modality-specific areas for visual imagery, and vice versa. In addition, mental imagery of both auditory and visual information widely suppressed the activity of primary sensory and motor areas, i.e., the deactivation network. These findings have important implications for understanding the mechanisms involved in the generation of mental imagery. © 2013 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  6. Interference by Process, Not Content, Determines Semantic Auditory Distraction

    Science.gov (United States)

    Marsh, John E.; Hughes, Robert W.; Jones, Dylan M.

    2009-01-01

    Distraction by irrelevant background sound of visually-based cognitive tasks illustrates the vulnerability of attentional selectivity across modalities. Four experiments centred on auditory distraction during tests of memory for visually-presented semantic information. Meaningful irrelevant speech disrupted the free recall of semantic…

  7. Nonverbal spatially selective attention in 4- and 5-year-old children.

    Science.gov (United States)

    Sanders, Lisa D; Zobel, Benjamin H

    2012-07-01

    Under some conditions 4- and 5-year-old children can differentially process sounds from attended and unattended locations. In fact, the latency of spatially selective attention effects on auditory processing as measured with event-related potentials (ERPs) is quite similar in young children and adults. However, it is not clear if developmental differences in the polarity, distribution, and duration of attention effects are best attributed to acoustic characteristics, availability of non-spatial attention cues, task demands, or domain. In the current study adults and children were instructed to attend to one of two simultaneously presented soundscapes (e.g., city sounds or night sounds) to detect targets (e.g., car horn or owl hoot) in the attended channel only. Probes presented from the same location as the attended soundscape elicited a larger negativity by 80 ms after onset in both adults and children. This initial negative difference (Nd) was followed by a larger positivity for attended probes in adults and another negativity for attended probes in children. The results indicate that the neural systems by which attention modulates early auditory processing are available for young children even when presented with nonverbal sounds. They also suggest important interactions between attention, acoustic characteristics, and maturity on auditory evoked potentials. Copyright © 2012 Elsevier Ltd. All rights reserved.
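    The attention effect described here (the Nd) is obtained as a difference wave: the average ERP to probes at the attended location minus the average ERP to probes at the unattended location. A minimal sketch of that computation, assuming probe-locked epochs are already available as NumPy arrays (all names and the simulated effect size are illustrative, not taken from the study):

```python
import numpy as np

def difference_wave(attended_epochs, unattended_epochs):
    """Nd difference wave: mean attended ERP minus mean unattended ERP.

    Each input is an (n_trials, n_samples) array of probe-locked epochs
    from one electrode; the result is an (n_samples,) waveform.
    """
    return attended_epochs.mean(axis=0) - unattended_epochs.mean(axis=0)

# Toy demo: simulate a sustained -1 uV attention effect from 80 ms onward.
fs, n_trials, n_samples = 250, 100, 250            # 1-s epochs at 250 Hz
rng = np.random.default_rng(0)
onset = int(0.080 * fs)                            # 80 ms -> sample 20
unattended = rng.normal(0.0, 2.0, (n_trials, n_samples))
attended = rng.normal(0.0, 2.0, (n_trials, n_samples))
attended[:, onset:] -= 1.0                         # sustained negativity
nd = difference_wave(attended, unattended)         # negative after `onset`
```

Averaging across trials before subtracting suppresses trial-by-trial noise, which is why the small per-trial effect becomes visible in the difference wave.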

  8. Dose-dependent suppression by ethanol of transient auditory 40-Hz response.

    Science.gov (United States)

    Jääskeläinen, I P; Hirvonen, J; Saher, M; Pekkonen, E; Sillanaukee, P; Näätänen, R; Tiitinen, H

    2000-02-01

    Acute alcohol (ethanol) challenge is known to induce various cognitive disturbances, yet the neural basis of the effect is poorly understood. Auditory transient evoked gamma-band (40-Hz) oscillatory responses have been suggested to be associated with various perceptual and cognitive functions in humans; however, alcohol effects on auditory 40-Hz responses have not been investigated to date. The objective of the study was to test the dose-related impact of alcohol on auditory transient evoked 40-Hz responses during a selective-attention task. Ten healthy social drinkers ingested, in four separate sessions, 0.00, 0.25, 0.50, or 0.75 g/kg of 10% (v/v) alcohol solution. The order of the sessions was randomized and a double-blind procedure was employed. During a selective-attention task, 300-Hz standard and 330-Hz deviant tones were presented to the left ear, and 1000-Hz standards and 1100-Hz deviants to the right ear (P=0.425 for each standard, P=0.075 for each deviant). The subjects attended to a designated ear and were to detect the deviants therein while ignoring tones in the other ear. The auditory transient evoked 40-Hz responses elicited by both the attended and unattended standard tones were significantly suppressed by the 0.50 and 0.75 g/kg alcohol doses. Alcohol thus suppresses auditory transient evoked 40-Hz oscillations even at moderate blood alcohol concentrations. Given the putative role of gamma-band oscillations in cognition, this finding could be associated with certain alcohol-induced cognitive deficits.
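    The stimulus probabilities reported above (two standards at P=0.425 each, two deviants at P=0.075 each, one standard/deviant pair per ear) fully specify the dichotic oddball sequence. A minimal sketch of how such a trial sequence could be drawn (function and variable names are illustrative, not from the paper):

```python
import random

# Tone inventory from the study: (ear, frequency in Hz, role, probability).
TONES = [
    ("left",  300,  "standard", 0.425),
    ("left",  330,  "deviant",  0.075),
    ("right", 1000, "standard", 0.425),
    ("right", 1100, "deviant",  0.075),
]

def make_sequence(n_trials, seed=0):
    """Draw a randomized tone sequence matching the published probabilities."""
    rng = random.Random(seed)
    tones, weights = zip(*[(t[:3], t[3]) for t in TONES])
    return rng.choices(tones, weights=weights, k=n_trials)

sequence = make_sequence(1000)
n_deviants = sum(1 for _, _, role in sequence if role == "deviant")
```

Over 1000 trials, roughly 15% of tones are deviants, matching the 0.075 + 0.075 deviant probability mass.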

  9. Active Listening Delays Attentional Disengagement and Saccadic Eye Movements.

    Science.gov (United States)

    Lester, Benjamin D; Vecera, Shaun P

    2017-05-23

    Successful goal-directed visual behavior depends on efficient disengagement of attention. Attention must be withdrawn from its current focus before being redeployed to a new object or internal process. Previous research has demonstrated that occupying cognitive processes with a secondary cellular phone conversation impairs attentional functioning and driving behavior. For example, attentional processing is significantly impacted by concurrent cell phone use, resulting in decreased explicit memory for on-road information. Here, we examined the impact of a critical component of cell-phone use, active listening, on the effectiveness of attentional disengagement. In the gap task, a saccadic manipulation of attentional disengagement, we measured saccade latencies while participants performed a secondary active listening task. Saccadic latencies significantly increased under an active listening load only when attention needed to be disengaged, indicating that active listening delays a disengagement operation. Simple dual-task interference did not account for the observed results. Rather, active cognitive engagement is required for measurable disengagement slowing to be observed. These results have implications for investigations of attention, gaze behavior, and distracted driving. Secondary tasks such as active listening or cell-phone conversations can have wide-ranging impacts on cognitive functioning, potentially impairing relatively elementary operations of attentional function, including disengagement.

  10. The cerebral functional location in normal subjects with Chinese classical national music auditory stimulus

    International Nuclear Information System (INIS)

    Sun Da; Xu Wei; Zhan Hongwei; Liu Hongbiao

    2004-01-01

    Purpose: To detect the cerebral functional location in normal subjects with a Chinese classical national music auditory stimulus. Methods: Ten healthy young students from the medical college of Zhejiang University (22-24 years old; 5 male, 5 female) participated. First, they underwent 99mTc-ECD brain imaging during a resting state using a dual-detector gamma camera with fan-beam collimators. After 2-4 days they were asked to listen to a piece of Chinese classical national music, played on the erhu and guzheng, for 20 minutes. They were also asked to pay special attention to the name of the music, the instruments on which it was played, and the imagery evoked by the music. 99mTc-ECD was administered in the first 3 minutes while they listened to the music, and brain imaging was performed 30-60 minutes after the tracer was administered. Results: Compared with the resting state, while listening to the Chinese classical national music and attending to its imagery, the right midtemporal region was activated in 6 cases, the left midtemporal in 2, the right superior temporal in 2, the left superior temporal in 6, and the right inferior temporal in 2. Among these, both temporal lobes were activated in 6 cases, the right temporal lobe in 3, and the left temporal lobe in 1. Interestingly, the inferior and/or medial frontal lobes were activated in all 10 subjects, and the activity was markedly higher in frontal than in temporal regions. The frontal activation was bilateral in 9 subjects and right-sided only in 1; the right superior frontal lobe was activated in 2 cases. The occipital lobes were activated in 4 subjects (bilaterally in 3, right-sided in 1). These 4 subjects reported after listening that they had imagined the natural landscape and imagery evoked by the music as it played. Other regions that were activated included the parietal lobes (right and left in 1 case each), the cingulate gyrus (in 2 cases), and left

  11. Pre-attentive modulation of brain responses to tones in coloured-hearing synesthetes

    Directory of Open Access Journals (Sweden)

    Jäncke Lutz

    2012-12-01

    Background: Coloured-hearing (CH) synesthesia is a perceptual phenomenon in which an acoustic stimulus (the inducer) initiates a concurrent colour perception (the concurrent). Individuals with CH synesthesia "see" colours when hearing tones, words, or music, a phenomenon suggesting a close relationship between auditory and visual representations. To date, it is still unknown whether the perception of colours is associated with a modulation of brain functions in the inducing brain area, namely the auditory-related cortex and associated brain areas. In addition, there is an on-going debate as to whether attention to the inducer is necessarily required for eliciting a visual concurrent, or whether the latter can emerge in a pre-attentive fashion. Results: By using the EEG technique in the context of a pre-attentive mismatch negativity (MMN) paradigm, we show that the binding of tones and colours in CH synesthetes is associated with increased MMN amplitudes in response to deviant tones supposed to induce novel concurrent colour perceptions. Most notably, the increased MMN amplitudes we revealed in the CH synesthetes were associated with stronger intracerebral current densities originating from the auditory cortex, parietal cortex, and ventral visual areas. Conclusions: The automatic binding of tones and colours in CH synesthetes is accompanied by an early pre-attentive process recruiting the auditory cortex, inferior and superior parietal lobules, as well as ventral occipital areas.

  12. What's that sound? Matches with auditory long-term memory induce gamma activity in human EEG.

    Science.gov (United States)

    Lenz, Daniel; Schadow, Jeanette; Thaerig, Stefanie; Busch, Niko A; Herrmann, Christoph S

    2007-04-01

    In recent years, the cognitive functions of human gamma-band activity (30-100 Hz) have increasingly come into scientific focus. Not only have bottom-up influences on 40-Hz activity been observed; top-down processes also seem to modulate responses in this frequency band. Among the various functions that have been related to gamma activity, a pivotal role has been assigned to memory processes. Visual experiments suggested that gamma activity is involved in matching visual input to memory representations. Based on these findings, we hypothesized that such memory-related modulations of gamma activity exist in the auditory modality as well. Thus, we chose environmental sounds for which subjects already had a long-term memory (LTM) representation and compared them to unknown, but physically similar, sounds. Twenty-one subjects had to classify sounds as 'recognized' or 'unrecognized' while EEG was recorded. Our data show significantly stronger induced gamma-band activity for recognized sounds in the time window between 300 and 500 ms after stimulus onset, with a central topography. The results suggest that induced gamma-band activity reflects matches between sounds and their representations in auditory LTM.
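    Induced gamma in such studies is typically quantified with single-trial time-frequency analysis; as a simplified stand-in, the spectral power of an epoch within the 30-100 Hz band can be estimated with a plain FFT. A minimal sketch (the function name and toy signals are illustrative, not the paper's pipeline):

```python
import numpy as np

def band_power(epoch, fs, band=(30.0, 100.0)):
    """Mean spectral power of a 1-D epoch within a frequency band."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(epoch)) ** 2 / epoch.size
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].mean()

# Toy check: a 40 Hz sinusoid carries far more gamma-band power
# than a 10 Hz (alpha-range) sinusoid of the same amplitude.
fs = 500
t = np.arange(0, 1, 1 / fs)                 # 1-s epoch
gamma_epoch = np.sin(2 * np.pi * 40 * t)
alpha_epoch = np.sin(2 * np.pi * 10 * t)
```

With a 1-s epoch the FFT bins fall on integer frequencies, so the 40 Hz component lands exactly in the 30-100 Hz band while the 10 Hz component contributes essentially nothing to it.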

  13. Spatiotemporal Relationships among Audiovisual Stimuli Modulate Auditory Facilitation of Visual Target Discrimination.

    Science.gov (United States)

    Li, Qi; Yang, Huamin; Sun, Fang; Wu, Jinglong

    2015-03-01

    Sensory information is multimodal; through audiovisual interaction, task-irrelevant auditory stimuli tend to speed response times and increase visual perception accuracy. However, mechanisms underlying these performance enhancements have remained unclear. We hypothesize that task-irrelevant auditory stimuli might provide reliable temporal and spatial cues for visual target discrimination and behavioral response enhancement. Using signal detection theory, the present study investigated the effects of spatiotemporal relationships on auditory facilitation of visual target discrimination. Three experiments were conducted where an auditory stimulus maintained reliable temporal and/or spatial relationships with visual target stimuli. Results showed that perception sensitivity (d') to visual target stimuli was enhanced only when a task-irrelevant auditory stimulus maintained reliable spatiotemporal relationships with a visual target stimulus. When only reliable spatial or temporal information was contained, perception sensitivity was not enhanced. These results suggest that reliable spatiotemporal relationships between visual and auditory signals are required for audiovisual integration during a visual discrimination task, most likely due to a spread of attention. These results also indicate that auditory facilitation of visual target discrimination follows from late-stage cognitive processes rather than early stage sensory processes. © 2015 SAGE Publications.
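    The sensitivity index d' used above comes from signal detection theory: d' = z(hit rate) - z(false-alarm rate). A minimal sketch using only the standard library (the log-linear correction is a common convention, not necessarily the one the authors used):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) avoids infinite
    z-scores when an observed rate is exactly 0 or 1.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return z(hit_rate) - z(fa_rate)
```

Chance performance (equal hit and false-alarm rates) yields d' = 0, and d' grows as hits rise and false alarms fall, which is why it separates perceptual sensitivity from response bias.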

  14. Attention Modulates TMS-Locked Alpha Oscillations in the Visual Cortex.

    Science.gov (United States)

    Herring, Jim D; Thut, Gregor; Jensen, Ole; Bergmann, Til O

    2015-10-28

    Cortical oscillations, such as 8-12 Hz alpha-band activity, are thought to subserve gating of information processing in the human brain. While most of the supporting evidence is correlational, causal evidence comes from attempts to externally drive ("entrain") these oscillations by transcranial magnetic stimulation (TMS). Indeed, the frequency profile of TMS-evoked potentials (TEPs) closely resembles that of oscillations spontaneously emerging in the same brain region. However, it is unclear whether TMS-locked and spontaneous oscillations are produced by the same neuronal mechanisms. If so, they should react in a similar manner to top-down modulation by endogenous attention. To test this prediction, we assessed the alpha-like EEG response to TMS of the visual cortex during periods of high and low visual attention while participants attended to either the visual or auditory modality in a cross-modal attention task. We observed a TMS-locked local oscillatory alpha response lasting several cycles after TMS (but not after sham stimulation). Importantly, TMS-locked alpha power was suppressed during deployment of visual relative to auditory attention, mirroring spontaneous alpha amplitudes. In addition, the early N40 TEP component, located at the stimulation site, was amplified by visual attention. The extent of attentional modulation for both TMS-locked alpha power and N40 amplitude did depend, with opposite sign, on the individual ability to modulate spontaneous alpha power at the stimulation site. We therefore argue that TMS-locked and spontaneous oscillations are of common neurophysiological origin, whereas the N40 TEP component may serve as an index of current cortical excitability at the time of stimulation. Copyright © 2015 Herring et al.

  15. Auditory event-related potentials in children with benign epilepsy with centro-temporal spikes.

    Science.gov (United States)

    Tomé, David; Sampaio, Mafalda; Mendes-Ribeiro, José; Barbosa, Fernando; Marques-Teixeira, João

    2014-12-01

    Benign focal epilepsy in childhood with centro-temporal spikes (BECTS) is one of the most common forms of idiopathic epilepsy, with onset from age 3 to 14 years. Although the prognosis for children with BECTS is excellent, some studies have revealed neuropsychological deficits in many domains, including language. Auditory event-related potentials (AERPs) reflect activation of different neuronal populations and are suggested to contribute to the evaluation of auditory discrimination (N1), attention allocation and phonological categorization (N2), and echoic memory (mismatch negativity, MMN). The scarce existing literature on this theme motivated the present study, which aims to investigate and document AERP changes in a group of children with BECTS. AERPs were recorded, during the day, to pure and vocal tones and in a conventional auditory oddball paradigm in five children with BECTS (aged 8-12; mean=10 years; male=5) and in six gender- and age-matched controls. Results revealed higher AERP amplitudes in the group of children with BECTS, with a slight latency delay most pronounced at fronto-central electrodes. Children with BECTS may have abnormal central auditory processing, reflected by electrophysiological measures such as AERPs. Moreover, AERPs seem a good tool to detect and reliably reveal cortical excitability in children with typical BECTS. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Validation of the Emotiv EPOC(®) EEG gaming system for measuring research quality auditory ERPs.

    Science.gov (United States)

    Badcock, Nicholas A; Mousikou, Petroula; Mahajan, Yatin; de Lissa, Peter; Thie, Johnson; McArthur, Genevieve

    2013-01-01

    Background. Auditory event-related potentials (ERPs) have proved useful in investigating the role of auditory processing in cognitive disorders such as developmental dyslexia, specific language impairment (SLI), attention deficit hyperactivity disorder (ADHD), schizophrenia, and autism. However, laboratory recordings of auditory ERPs can be lengthy, uncomfortable, or threatening for some participants, particularly children. Recently, a commercial gaming electroencephalography (EEG) system has been developed that is portable, inexpensive, and easy to set up. In this study we tested whether auditory ERPs measured using a gaming EEG system (Emotiv EPOC(®), www.emotiv.com) were equivalent to those measured by a widely-used, laboratory-based, research EEG system (Neuroscan). Methods. We simultaneously recorded EEGs with the research and gaming EEG systems while presenting 21 adults with 566 standard (1000 Hz) and 100 deviant (1200 Hz) tones under passive (non-attended) and active (attended) conditions. The onset of each tone was marked in the EEGs using a parallel port pulse (Neuroscan) or a stimulus-generated electrical pulse injected into the O1 and O2 channels (Emotiv EPOC(®)). These markers were used to calculate research and gaming EEG system late auditory ERPs (P1, N1, P2, N2, and P3 peaks) and the mismatch negativity (MMN) in active and passive listening conditions for each participant. Results. Analyses were restricted to frontal sites as these are most commonly reported in auditory ERP research. Intra-class correlations (ICCs) indicated that the morphology of the research and gaming EEG system late auditory ERP waveforms was similar across all participants, but that the research and gaming EEG system MMN waveforms were only similar for participants with non-noisy MMN waveforms (N = 11 out of 21). Peak amplitude and latency measures revealed no significant differences between the size or the timing of the auditory P1, N1, P2, N2, P3, and MMN peaks.

  17. Synchronization and phonological skills: precise auditory timing hypothesis (PATH)

    Directory of Open Access Journals (Sweden)

    Adam eTierney

    2014-11-01

    Phonological skills are enhanced by music training, but the mechanisms enabling this cross-domain enhancement remain unknown. To explain this cross-domain transfer, we propose a precise auditory timing hypothesis (PATH) whereby entrainment practice is the core mechanism underlying enhanced phonological abilities in musicians. Both rhythmic synchronization and language skills such as consonant discrimination, detection of word and phrase boundaries, and conversational turn-taking rely on the perception of extremely fine-grained timing details in sound. Auditory-motor timing is an acoustic feature which meets all five of the pre-conditions necessary for cross-domain enhancement to occur (Patel, 2011, 2012, 2014). There is overlap between the neural networks that process timing in the context of both music and language. Entrainment to music demands more precise timing sensitivity than does language processing. Moreover, auditory-motor timing integration captures the emotion of the trainee, is repeatedly practiced, and demands focused attention. The precise auditory timing hypothesis predicts that musical training emphasizing entrainment will be particularly effective in enhancing phonological skills.

  18. A Neural Circuit for Auditory Dominance over Visual Perception.

    Science.gov (United States)

    Song, You-Hyang; Kim, Jae-Hyun; Jeong, Hye-Won; Choi, Ilsong; Jeong, Daun; Kim, Kwansoo; Lee, Seung-Hee

    2017-02-22

    When conflicts occur during integration of visual and auditory information, one modality often dominates the other, but the underlying neural circuit mechanism remains unclear. Using auditory-visual discrimination tasks for head-fixed mice, we found that audition dominates vision in a process mediated by interaction between inputs from the primary visual (VC) and auditory (AC) cortices in the posterior parietal cortex (PTLp). Co-activation of the VC and AC suppresses VC-induced PTLp responses, leaving AC-induced responses. Furthermore, parvalbumin-positive (PV+) interneurons in the PTLp mainly receive AC inputs, and muscimol inactivation of the PTLp or optogenetic inhibition of its PV+ neurons abolishes auditory dominance in the resolution of cross-modal sensory conflicts without affecting either sensory perception. Conversely, optogenetic activation of PV+ neurons in the PTLp enhances the auditory dominance. Thus, our results demonstrate that AC input-specific feedforward inhibition of VC inputs in the PTLp is responsible for the auditory dominance during cross-modal integration. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. The role of auditory cortices in the retrieval of single-trial auditory-visual object memories.

    Science.gov (United States)

    Matusz, Pawel J; Thelen, Antonia; Amrein, Sarah; Geiser, Eveline; Anken, Jacques; Murray, Micah M

    2015-03-01

    Single-trial encounters with multisensory stimuli affect both memory performance and early-latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single-trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event-related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d') were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single-trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long-term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  20. Congenital Deafness Reduces, But Does Not Eliminate Auditory Responsiveness in Cat Extrastriate Visual Cortex.

    Science.gov (United States)

    Land, Rüdiger; Radecke, Jan-Ole; Kral, Andrej

    2018-04-01

    Congenital deafness not only affects the development of the auditory cortex, but also the interrelation between the visual and auditory system. For example, congenital deafness leads to visual modulation of the deaf auditory cortex in the form of cross-modal plasticity. Here we asked, whether congenital deafness additionally affects auditory modulation in the visual cortex. We demonstrate that auditory activity, which is normally present in the lateral suprasylvian visual areas in normal hearing cats, can also be elicited by electrical activation of the auditory system with cochlear implants. We then show that in adult congenitally deaf cats auditory activity in this region was reduced when tested with cochlear implant stimulation. However, the change in this area was small and auditory activity was not completely abolished despite years of congenital deafness. The results document that congenital deafness leads not only to changes in the auditory cortex but also affects auditory modulation of visual areas. However, the results further show a persistence of fundamental cortical sensory functional organization despite congenital deafness. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. A novel 9-class auditory ERP paradigm driving a predictive text entry system

    Directory of Open Access Journals (Sweden)

    Johannes eHöhne

    2011-08-01

    Brain-Computer Interfaces (BCIs) based on Event-Related Potentials (ERPs) strive to offer communication pathways that are independent of muscle activity. While most visual ERP-based BCI paradigms require good control of the user's gaze direction, auditory BCI paradigms overcome this restriction. The present work proposes a novel approach using Auditory Evoked Potentials (AEPs) for the example of a multiclass text-spelling application. To control the ERP speller, BCI users focus their attention on two-dimensional auditory stimuli that vary in both pitch (high/medium/low) and direction (left/middle/right) and that are presented via headphones. The resulting nine different control signals are exploited to drive a predictive text entry system. It enables the user to spell a letter by a single 9-class decision plus two additional decisions to confirm a spelled word. This paradigm, called PASS2D, was investigated in an online study with twelve healthy participants. Users spelled at more than 0.8 characters per minute on average (3.4 bits per minute), which makes PASS2D a competitive method. It could enrich the toolbox of existing ERP paradigms for BCI end users such as late-stage ALS patients.
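    The bits-per-minute figure above is an information transfer rate. The standard way to compute such rates for an N-class BCI is the Wolpaw formula; the abstract does not state the exact computation used, so the following is an illustrative sketch of that convention:

```python
from math import log2

def bits_per_selection(n_classes, accuracy):
    """Wolpaw information transfer rate per selection, in bits.

    B = log2(N) + P*log2(P) + (1 - P)*log2((1 - P) / (N - 1)),
    clamped to 0 at or below chance and to log2(N) at perfect accuracy.
    """
    if accuracy <= 1.0 / n_classes:
        return 0.0
    if accuracy >= 1.0:
        return log2(n_classes)
    return (log2(n_classes)
            + accuracy * log2(accuracy)
            + (1 - accuracy) * log2((1 - accuracy) / (n_classes - 1)))

def bits_per_minute(n_classes, accuracy, selections_per_minute):
    """Scale the per-selection rate by the selection speed."""
    return bits_per_selection(n_classes, accuracy) * selections_per_minute
```

For a 9-class speller, a perfect selection carries log2(9) ≈ 3.17 bits, so a rate of 3.4 bits per minute corresponds to roughly one reliable 9-class decision per minute.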

  2. Translation and adaptation of functional auditory performance indicators (FAPI)

    Directory of Open Access Journals (Sweden)

    Karina Ferreira

    2011-12-01

    Work with deaf children has gained new attention since the expectation and goal of therapy have expanded to language development and subsequent language learning. Many clinical tests were developed to evaluate speech-sound perception in young children, in response to the need for accurate assessment of the hearing skills that develop with the use of individual hearing aids or cochlear implants. These tests also allow evaluation of the rehabilitation program. However, few of these tests are available in Portuguese. Evaluation with the Functional Auditory Performance Indicators (FAPI) generates a profile of a child's functional auditory skills, which lists auditory skills in an integrated and hierarchical order. It has seven hierarchical categories: sound awareness, meaningful sound, auditory feedback, sound-source localization, auditory discrimination, short-term auditory memory, and linguistic auditory processing. FAPI evaluation allows the therapist to map the child's hearing performance profile, determine targets for increasing hearing abilities, and develop an effective therapeutic plan. Objective: Since the FAPI is an American test, the inventory was adapted for application in the Brazilian population. Material and Methods: The translation followed the steps of translation and back-translation, and reproducibility was evaluated. Four translated versions (two original and two back-translated) were compared, and revisions were made to ensure language adaptation and grammatical and idiomatic equivalence. Results: The inventory was duly translated and adapted. Conclusion: Further studies on the application of the translated FAPI are necessary to make the test practicable in Brazilian clinical use.

  3. Effect of background music on auditory-verbal memory performance

    Directory of Open Access Journals (Sweden)

    Sona Matloubi

    2014-12-01

    Full Text Available Background and Aim: Music exists in all cultures; many scientists are seeking to understand how music affects cognitive development, such as comprehension, memory, and reading skills. More recently, a considerable number of neuroscience studies on music have been conducted. This study aimed to investigate the effects of null and positive background music, in comparison with silence, on auditory-verbal memory performance. Methods: Forty young adults (male and female) with normal hearing, aged between 18 and 26 years, participated in this comparative-analysis study. An auditory and speech evaluation was conducted in order to investigate the effects of background music on working memory. Subsequently, the Rey auditory-verbal learning test was performed under three conditions: silence, positive music, and null music. Results: The mean score of the Rey auditory-verbal learning test in the silence condition was higher than in the positive music condition (p=0.003) and the null music condition (p=0.01). The test results did not reveal any gender differences. Conclusion: It seems that the presence of competing music (positive and null) and the orientation of auditory attention have negative effects on the performance of verbal working memory, possibly owing to the interference of music with verbal information processing in the brain.

  4. Auditory Reserve and the Legacy of Auditory Experience

    Directory of Open Access Journals (Sweden)

    Erika Skoe

    2014-11-01

    Full Text Available Musical training during childhood has been linked to more robust encoding of sound later in life. We take this as evidence for an auditory reserve: a mechanism by which individuals capitalize on earlier life experiences to promote auditory processing. We assert that early auditory experiences guide how the reserve develops and is maintained over the lifetime. Experiences that occur after childhood, or that are limited in nature, are theorized to affect the reserve, although their influence on sensory processing may be less long-lasting and may fade over time if not repeated. This auditory reserve may help to explain differences in how individuals cope with auditory impoverishment or loss of sensorineural function.

  5. The effect of phasic auditory alerting on visual perception

    DEFF Research Database (Denmark)

    Petersen, Anders; Petersen, Annemarie Hilkjær; Bundesen, Claus

    2017-01-01

    /no-alerting design with a pure accuracy-based single-letter recognition task. Computational modeling based on Bundesen’s Theory of Visual Attention was used to examine the effect of phasic alertness on visual processing speed and threshold of conscious perception. Results show that phasic auditory alertness affects...

  6. Investigating attentional processes in depressive-like domestic horses (Equus caballus).

    Science.gov (United States)

    Rochais, C; Henry, S; Fureix, C; Hausberger, M

    2016-03-01

    Some captive/domestic animals respond to confinement by becoming inactive and unresponsive to external stimuli. Human inactivity is one of the behavioural markers of clinical depression, a mental disorder diagnosed by the co-occurrence of symptoms including deficits in selective attention. Some riding horses display 'withdrawn' states of inactivity and low responsiveness to stimuli that resemble the reduced environmental engagement of some depressed patients. We hypothesized that 'withdrawn' horses experience a depressive-like state and evaluated their level of attention by confronting them with auditory stimuli. Five novel auditory stimuli were broadcast to 27 horses, including 12 'withdrawn' horses, over 5 days. The horses' reactions and durations of attention were recorded. On the first day, non-withdrawn horses reacted more, and their attention lasted longer, than withdrawn horses; over the following days, however, the non-withdrawn horses' durations of attention decreased while those of withdrawn horses remained stable. These results suggest that withdrawn horses' selective attention is altered, adding to the already evidenced common features between this equine state and human depression. Copyright © 2016. Published by Elsevier B.V.

  7. Preferential processing of tactile events under conditions of divided attention: Effects of divided attention on reaction time

    OpenAIRE

    Hanson, James V. M.; Whitaker, David; Heron, James

    2009-01-01

    Differences in transduction and transmission latencies of visual, auditory and tactile events cause corresponding differences in simple reaction time. As reaction time is usually measured in unimodal blocks, it is unclear whether such latency differences also apply when observers monitor multiple sensory channels. We investigate this by comparing reaction time when attention is focussed on a single modality, and when attention is divided between multiple modalities. Results show that tactile ...

  8. Functional mapping of the primate auditory system.

    Science.gov (United States)

    Poremba, Amy; Saunders, Richard C; Crane, Alison M; Cook, Michelle; Sokoloff, Louis; Mishkin, Mortimer

    2003-01-24

    Cerebral auditory areas were delineated in the awake, passively listening, rhesus monkey by comparing the rates of glucose utilization in an intact hemisphere and in an acoustically isolated contralateral hemisphere of the same animal. The auditory system defined in this way occupied large portions of cerebral tissue, an extent probably second only to that of the visual system. Cortically, the activated areas included the entire superior temporal gyrus and large portions of the parietal, prefrontal, and limbic lobes. Several auditory areas overlapped with previously identified visual areas, suggesting that the auditory system, like the visual system, contains separate pathways for processing stimulus quality, location, and motion.

  9. Tinnitus alters resting state functional connectivity (RSFC) in human auditory and non-auditory brain regions as measured by functional near-infrared spectroscopy (fNIRS).

    Science.gov (United States)

    San Juan, Juan; Hu, Xiao-Su; Issa, Mohamad; Bisconti, Silvia; Kovelman, Ioulia; Kileny, Paul; Basura, Gregory

    2017-01-01

    Tinnitus, or phantom sound perception, leads to increased spontaneous neural firing rates and enhanced synchrony in central auditory circuits in animal models. To date, these putative physiologic correlates of tinnitus have not been well translated to the brain of the human tinnitus sufferer. Using functional near-infrared spectroscopy (fNIRS), we recently showed that tinnitus in humans leads to maintained hemodynamic activity in auditory and adjacent, non-auditory cortices. Here we used fNIRS technology to investigate changes in resting state functional connectivity between human auditory and non-auditory brain regions in normal-hearing subjects with bilateral subjective tinnitus and in controls, before and after auditory stimulation. Hemodynamic activity was monitored over the region of interest (primary auditory cortex) and the non-region of interest (adjacent non-auditory cortices), and functional brain connectivity was measured during a 60-second baseline period of silence before and after a passive auditory challenge consisting of alternating pure tones (750 and 8000 Hz), broadband noise, and silence. Functional connectivity was measured between all channel pairs. Prior to stimulation, connectivity of the region of interest to the temporal and fronto-temporal region was decreased in tinnitus participants compared to controls. Overall, connectivity in tinnitus was differentially altered compared to controls following sound stimulation. Enhanced connectivity was seen in both auditory and non-auditory regions in the tinnitus brain, while controls showed a decrease in connectivity following sound stimulation. In tinnitus, the strength of connectivity was increased between auditory cortex and the fronto-temporal, fronto-parietal, temporal, occipito-temporal, and occipital cortices. Together, these data suggest that central auditory and non-auditory brain regions are modified in tinnitus and that resting functional connectivity measured by fNIRS technology may contribute to conscious phantom sound perception.
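    As a rough illustration of the channel-pairwise analysis described above, resting-state functional connectivity is often estimated as the matrix of Pearson correlations between channel time series. The sketch below is a generic Python/NumPy example, not the authors' fNIRS pipeline; the channel count, sample length, and the `connectivity_matrix` helper are illustrative assumptions.

```python
import numpy as np

def connectivity_matrix(signals):
    """Pairwise Pearson correlations between channel time series.

    signals: array of shape (n_channels, n_samples), e.g. hemoglobin
    traces from a resting baseline. Returns an (n_channels,
    n_channels) correlation matrix; entry (i, j) estimates the
    coupling between channels i and j.
    """
    return np.corrcoef(signals)

# Toy data: 4 hypothetical channels sharing a common slow drive.
rng = np.random.default_rng(0)
shared = rng.standard_normal(600)
chans = np.stack([shared + 0.5 * rng.standard_normal(600) for _ in range(4)])
conn = connectivity_matrix(chans)
# The diagonal is 1 by construction; off-diagonal entries are the
# between-channel connectivity estimates.
```

    Comparing such matrices computed before and after a stimulation block, and between groups, is one simple way to quantify differentially altered connectivity.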

  11. Age-related differences in auditory evoked potentials as a function of task modulation during speech-nonspeech processing.

    Science.gov (United States)

    Rufener, Katharina Simone; Liem, Franziskus; Meyer, Martin

    2014-01-01

    Healthy aging is typically associated with impairment of various cognitive abilities such as memory, selective attention, or executive functions. Less well recognized is that language functions in general, and speech processing in particular, also seem to be affected by age. This impairment is caused partly by pathologies of the peripheral auditory nervous system and central auditory decline, and partly by cognitive decay. This cross-sectional electroencephalography (EEG) study investigates temporally early electrophysiological correlates of auditory selective attention in young (20-32 years) and older (60-74 years) healthy adults. In two independent tasks, we systematically modulated the subjects' focus of attention by presenting words and pseudowords as targets and white-noise stimuli as distractors. Behavioral data showed no difference in task accuracy between the two age samples, irrespective of the modulation of attention. However, our work is the first to show that the N1 and P2 components evoked by speech and nonspeech stimuli are specifically modulated in older and young adults depending on the subjects' focus of attention. This finding is particularly interesting in that the age-related differences in AEPs may reflect levels of processing that are not mirrored by the behavioral measurements.

  12. Influence of attention alternation on movement-related cortical potentials in healthy individuals and stroke patients

    DEFF Research Database (Denmark)

    Aliakbaryhosseinabadi, Susan; Kostic, Vladimir; Pavlovic, Aleksandra

    2017-01-01

    Objective In this study, we analyzed the influence of artificially imposed attention variations, using the auditory oddball paradigm, on the cortical activity associated with motor preparation/execution. Methods EEG signals from Cz and its surrounding channels were recorded during three sets of ankle...... and accuracy deteriorated with attention diversion. Conclusion Attention diversion has a significant influence on MRCP features and detection parameters, although these changes were counteracted by the application of the Laplacian method. Significance Brain–computer interfaces for neuromodulation that use...... the MRCP as the control signal are robust to changes in attention. However, attention must be monitored since it plays a key role in plasticity induction. Here we demonstrate that this can be achieved using the single channel Cz....

  13. Prestimulus subsequent memory effects for auditory and visual events.

    Science.gov (United States)

    Otten, Leun J; Quayle, Angela H; Puvaneswaran, Bhamini

    2010-06-01

    It has been assumed that the effective encoding of information into memory primarily depends on neural activity elicited when an event is initially encountered. Recently, it has been shown that memory formation also relies on neural activity just before an event. The precise role of such activity in memory is currently unknown. Here, we address whether prestimulus activity affects the encoding of auditory and visual events, is set up on a trial-by-trial basis, and varies as a function of the type of recognition judgment an item later receives. Electrical brain activity was recorded from the scalps of 24 healthy young adults while they made semantic judgments on randomly intermixed series of visual and auditory words. Each word was preceded by a cue signaling the modality of the upcoming word. Auditory words were preceded by auditory cues and visual words by visual cues. A recognition memory test with remember/know judgments followed after a delay of about 45 min. As observed previously, a negative-going, frontally distributed modulation just before visual word onset predicted later recollection of the word. Crucially, the same effect was found for auditory words and observed on stay as well as switch trials. These findings emphasize the flexibility and general role of prestimulus activity in memory formation, and support a functional interpretation of the activity in terms of semantic preparation. At least with an unpredictable trial sequence, the activity is set up anew on each trial.
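    The subsequent-memory logic described above, which sorts prestimulus epochs by the later memory outcome, can be sketched as a simple amplitude contrast. This is an illustrative Python/NumPy sketch under assumed array shapes, not the authors' EEG pipeline; the `prestimulus_sme` helper and the simulated negative-going shift are hypothetical.

```python
import numpy as np

def prestimulus_sme(epochs, remembered):
    """Subsequent-memory contrast on prestimulus EEG epochs.

    epochs: (n_trials, n_samples) segments ending at word onset.
    remembered: boolean per trial, True if later recollected.
    Returns the later-remembered minus later-forgotten mean waveform.
    """
    epochs = np.asarray(epochs, dtype=float)
    remembered = np.asarray(remembered, dtype=bool)
    return epochs[remembered].mean(axis=0) - epochs[~remembered].mean(axis=0)

# Toy data: 100 trials x 250 prestimulus samples; later-remembered
# trials carry a simulated negative-going shift, mimicking the
# frontal modulation reported before recollected words.
rng = np.random.default_rng(1)
epochs = rng.standard_normal((100, 250))
later_remembered = np.arange(100) < 40
epochs[later_remembered] -= 1.0
diff = prestimulus_sme(epochs, later_remembered)
# diff is negative on average for this simulated effect.
```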

  14. The angular gyrus and visuospatial attention in decision-making under risk.

    Science.gov (United States)

    Studer, Bettina; Cen, Danlu; Walsh, Vincent

    2014-12-01

    Recent neuroimaging studies on decision-making under risk indicate that the angular gyrus (AG) is sensitive to the probability and variance of outcomes during choice. A separate body of research has established the AG as a key area in visual attention. The current study used repetitive transcranial magnetic stimulation (rTMS) in healthy volunteers to test whether the causal contribution of the AG to decision-making is independent of or linked to the guidance of visuospatial attention. A within-subject design compared decision-making on a laboratory gambling task under three conditions: following rTMS to the AG, following rTMS to the premotor cortex (PMC, as an active control condition), and without TMS. The task presented two different trial types, 'visual' and 'auditory' trials, which entailed a high versus minimal demand for visuospatial attention, respectively. Our results showed a systematic effect of rTMS to the AG upon decision-making behavior in visual trials. Without TMS and following rTMS to the control region, decision latencies reflected the odds of winning; this relationship was disrupted by rTMS to the AG. In contrast, no significant effects of rTMS to the AG (or to the PMC) upon choice behavior in auditory trials were found. Thus, rTMS to the AG affected decision-making only in the task condition requiring visuospatial attention. The current findings suggest that the AG contributes to decision-making by guiding attention to relevant information about reward and punishment in the visual environment. Copyright © 2014. Published by Elsevier Inc.

  15. Children with speech sound disorder: Comparing a non-linguistic auditory approach with a phonological intervention approach to improve phonological skills

    Directory of Open Access Journals (Sweden)

    Cristina eMurphy

    2015-02-01

    Full Text Available This study aimed to compare the effects of a non-linguistic auditory intervention approach with a phonological intervention approach on the phonological skills of children with speech sound disorder. A total of 17 children with speech sound disorder, aged 7-12 years, were randomly allocated to either the non-linguistic auditory temporal intervention group (n = 10, average age 7.7 ± 1.2) or the phonological intervention group (n = 7, average age 8.6 ± 1.2). The intervention outcomes included auditory-sensory measures (auditory temporal processing skills) and cognitive measures (attention, short-term memory, speech production, and phonological awareness skills). The auditory approach focused on non-linguistic auditory training (e.g., backward masking and frequency discrimination), whereas the phonological approach focused on speech sound training (e.g., phonological organisation and awareness). Both interventions consisted of twelve 45-minute sessions delivered twice per week, for a total of nine hours. Intra-group analysis demonstrated that the auditory intervention group showed significant gains in both auditory and cognitive measures, whereas no significant gain was observed in the phonological intervention group. No significant improvement in phonological skills was observed in either group. Inter-group analysis demonstrated significant differences in improvement following training between the two groups, with a more pronounced gain for the non-linguistic auditory temporal intervention in one of the visual attention measures and in both auditory measures. Therefore, both analyses suggest that although the non-linguistic auditory intervention approach appeared to be the more effective of the two, it was not sufficient to promote the enhancement of phonological skills.

  16. Effects of visual working memory on brain information processing of irrelevant auditory stimuli.

    Directory of Open Access Journals (Sweden)

    Jiagui Qu

    Full Text Available Selective attention has traditionally been viewed as a sensory processing modulator that promotes cognitive processing efficiency by favoring relevant stimuli while inhibiting irrelevant stimuli. However, the cross-modal processing of irrelevant information during working memory (WM) has rarely been investigated. In this study, the modulation of irrelevant auditory information by the brain during a visual WM task was investigated. The N100 auditory evoked potential (N100-AEP) following an auditory click was used to evaluate selective attention to the auditory stimulus during WM processing and at rest. N100-AEP amplitudes were found to be significantly affected in the left-prefrontal, mid-prefrontal, right-prefrontal, left-frontal, and mid-frontal regions while performing a high WM load task. In contrast, no significant differences were found between N100-AEP amplitudes in WM states and rest states under a low WM load task in all recorded brain regions. Furthermore, no differences were found between the time latencies of N100-AEP troughs in WM states and rest states while performing either the high or low WM load task. These findings suggest that the prefrontal cortex (PFC) may integrate information from different sensory channels to protect perceptual integrity during cognitive processing.

  18. Distractor Effect of Auditory Rhythms on Self-Paced Tapping in Chimpanzees and Humans.

    Science.gov (United States)

    Hattori, Yuko; Tomonaga, Masaki; Matsuzawa, Tetsuro

    2015-01-01

    Humans tend to spontaneously align their movements in response to visual (e.g., swinging pendulum) and auditory rhythms (e.g., hearing music while walking). Particularly in the case of the response to auditory rhythms, neuroscientific research has indicated that motor resources are also recruited while perceiving an auditory rhythm (or regular pulse), suggesting a tight link between the auditory and motor systems in the human brain. However, the evolutionary origin of spontaneous responses to auditory rhythms is unclear. Here, we report that chimpanzees and humans show a similar distractor effect in perceiving isochronous rhythms during rhythmic movement. We used isochronous auditory rhythms as distractor stimuli during self-paced alternate tapping of two keys of an electronic keyboard by humans and chimpanzees. When the tempo was similar to their spontaneous motor tempo, tapping onset was influenced by intermittent entrainment to auditory rhythms. Although this effect itself is not an advanced rhythmic ability such as dancing or singing, our results suggest that, to some extent, the biological foundation for spontaneous responses to auditory rhythms was already deeply rooted in the common ancestor of chimpanzees and humans, 6 million years ago. This also suggests the possibility of a common attentional mechanism, as proposed by the dynamic attending theory, underlying the effect of perceiving external rhythms on motor movement.
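    The intermittent entrainment reported above is commonly quantified by the phase of each tap relative to the isochronous distractor rhythm. Below is a generic Python/NumPy sketch of that circular-statistics measure, not the authors' analysis; the tap times, the 500 ms period, and the `relative_phase` helper are illustrative assumptions.

```python
import numpy as np

def relative_phase(tap_times, period):
    """Phase of each tap within an isochronous pulse, in [-pi, pi).

    tap_times: tap onsets in seconds; period: inter-onset interval of
    the auditory rhythm. Phases clustered near 0 indicate taps locked
    to the rhythm.
    """
    phase = 2 * np.pi * (np.asarray(tap_times, dtype=float) % period) / period
    return (phase + np.pi) % (2 * np.pi) - np.pi

# Toy data: 20 taps drifting slightly around a 500 ms pulse.
rng = np.random.default_rng(2)
taps = np.arange(20) * 0.5 + 0.01 * rng.standard_normal(20)
phases = relative_phase(taps, 0.5)
# Mean resultant length: near 1 when taps entrain to the rhythm,
# near 0 when tapping is unrelated to it.
locking = np.abs(np.mean(np.exp(1j * phases)))
```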

  19. Auditory Tones and Foot-Shock Recapitulate Spontaneous Sub-Threshold Activity in Basolateral Amygdala Principal Neurons and Interneurons.

    Directory of Open Access Journals (Sweden)

    François Windels

    Full Text Available In quiescent states such as anesthesia and slow-wave sleep, cortical networks show slow rhythmic synchronized activity. In sensory cortices this rhythmic activity shows a stereotypical pattern that is recapitulated by stimulation of the appropriate sensory modality. The amygdala receives sensory input from a variety of sources, and in anesthetized animals, neurons in the basolateral amygdala (BLA) show slow rhythmic synchronized activity. Extracellular field potential recordings show that these oscillations are synchronized with sensory cortex and the thalamus, with both the thalamus and cortex leading the BLA. Using whole-cell recording in vivo, we show that the membrane potential of principal neurons spontaneously oscillates between up- and down-states. Foot-shock and auditory stimulation delivered during down-states evoke an up-state that fully recapitulates those occurring spontaneously. These results suggest that neurons in the BLA receive convergent input from networks of cortical neurons with slow oscillatory activity and that somatosensory and auditory stimulation can trigger activity in these same networks.

  20. Precise auditory-vocal mirroring in neurons for learned vocal communication.

    Science.gov (United States)

    Prather, J F; Peters, S; Nowicki, S; Mooney, R

    2008-01-17

    Brain mechanisms for communication must establish a correspondence between sensory and motor codes used to represent the signal. One idea is that this correspondence is established at the level of single neurons that are active when the individual performs a particular gesture or observes a similar gesture performed by another individual. Although neurons that display a precise auditory-vocal correspondence could facilitate vocal communication, they have yet to be identified. Here we report that a certain class of neurons in the swamp sparrow forebrain displays a precise auditory-vocal correspondence. We show that these neurons respond in a temporally precise fashion to auditory presentation of certain note sequences in this songbird's repertoire and to similar note sequences in other birds' songs. These neurons display nearly identical patterns of activity when the bird sings the same sequence, and disrupting auditory feedback does not alter this singing-related activity, indicating it is motor in nature. Furthermore, these neurons innervate striatal structures important for song learning, raising the possibility that singing-related activity in these cells is compared to auditory feedback to guide vocal learning.