WorldWideScience

Sample records for auditory attention cues

  1. Negative emotion provides cues for orienting auditory spatial attention

    Directory of Open Access Journals (Sweden)

    Erkin eAsutay

    2015-05-01

Full Text Available Auditory stimuli provide information about the objects and events around us. They can also carry biologically significant emotional information (such as unseen dangers and conspecific vocalizations), which provides cues for the allocation of attention and mental resources. Here, we investigated whether task-irrelevant auditory emotional information can provide cues for orienting auditory spatial attention. We employed a covert spatial orienting task: the dot-probe task. In each trial, two task-irrelevant auditory cues were presented simultaneously at two separate locations (left-right or front-back). Environmental sounds were selected to form emotional vs. neutral, emotional vs. emotional, and neutral vs. neutral cue pairs. The participants’ task was to detect the location of an acoustic target presented immediately after the task-irrelevant auditory cues. The target was presented at the same location as one of the auditory cues. The results indicated that participants were significantly faster to locate the target when it replaced the negative cue than when it replaced the neutral cue. Positive cues did not produce a clear attentional bias. Further, same-valence pairs (emotional-emotional or neutral-neutral) did not modulate reaction times, as neither cue in the pair captured spatial attention. Taken together, the results indicate that negative affect can provide cues for the orientation of spatial attention in the auditory domain.

  2. Modulation of auditory spatial attention by visual emotional cues: differential effects of attentional engagement and disengagement for pleasant and unpleasant cues.

    Science.gov (United States)

    Harrison, Neil R; Woodhouse, Rob

    2016-05-01

Previous research has demonstrated that threatening pictures, compared to neutral ones, can bias attention towards non-emotional auditory targets. Here we investigated which subcomponents of attention contributed to the influence of emotional visual stimuli on auditory spatial attention. Participants indicated the location of an auditory target after brief (250 ms) presentation of a spatially non-predictive peripheral visual cue. Responses to targets were faster at the location of the preceding visual cue than at the opposite location (cue validity effect). The cue validity effect was larger for targets following pleasant and unpleasant cues compared to neutral cues, for right-sided targets. For unpleasant cues, the crossmodal cue validity effect was driven by delayed attentional disengagement, and for pleasant cues it was driven by enhanced engagement. We conclude that both pleasant and unpleasant visual cues influence the distribution of attention across modalities and that the associated attentional mechanisms depend on the valence of the visual cue.

  3. Intentional preparation of auditory attention-switches: Explicit cueing and sequential switch-predictability.

    Science.gov (United States)

    Seibold, Julia C; Nolden, Sophie; Oberem, Josefa; Fels, Janina; Koch, Iring

    2018-06-01

    In an auditory attention-switching paradigm, participants heard two simultaneously spoken number-words, each presented to one ear, and decided whether the target number was smaller or larger than 5 by pressing a left or right key. An instructional cue in each trial indicated which feature had to be used to identify the target number (e.g., female voice). Auditory attention-switch costs were found when this feature changed compared to when it repeated in two consecutive trials. Earlier studies employing this paradigm showed mixed results when they examined whether such cued auditory attention-switches can be prepared actively during the cue-stimulus interval. This study systematically assessed which preconditions are necessary for the advance preparation of auditory attention-switches. Three experiments were conducted that controlled for cue-repetition benefits, modality switches between cue and stimuli, as well as for predictability of the switch-sequence. Only in the third experiment, in which predictability for an attention-switch was maximal due to a pre-instructed switch-sequence and predictable stimulus onsets, active switch-specific preparation was found. These results suggest that the cognitive system can prepare auditory attention-switches, and this preparation seems to be triggered primarily by the memorised switching-sequence and valid expectations about the time of target onset.

  4. Preconditioning of Spatial and Auditory Cues: Roles of the Hippocampus, Frontal Cortex, and Cue-Directed Attention

    Directory of Open Access Journals (Sweden)

    Andrew C. Talk

    2016-12-01

    Full Text Available Loss of function of the hippocampus or frontal cortex is associated with reduced performance on memory tasks, in which subjects are incidentally exposed to cues at specific places in the environment and are subsequently asked to recollect the location at which the cue was experienced. Here, we examined the roles of the rodent hippocampus and frontal cortex in cue-directed attention during encoding of memory for the location of a single incidentally experienced cue. During a spatial sensory preconditioning task, rats explored an elevated platform while an auditory cue was incidentally presented at one corner. The opposite corner acted as an unpaired control location. The rats demonstrated recollection of location by avoiding the paired corner after the auditory cue was in turn paired with shock. Damage to either the dorsal hippocampus or the frontal cortex impaired this memory ability. However, we also found that hippocampal lesions enhanced attention directed towards the cue during the encoding phase, while frontal cortical lesions reduced cue-directed attention. These results suggest that the deficit in spatial sensory preconditioning caused by frontal cortical damage may be mediated by inattention to the location of cues during the latent encoding phase, while deficits following hippocampal damage must be related to other mechanisms such as generation of neural plasticity.

  5. Preconditioning of Spatial and Auditory Cues: Roles of the Hippocampus, Frontal Cortex, and Cue-Directed Attention

    Science.gov (United States)

    Talk, Andrew C.; Grasby, Katrina L.; Rawson, Tim; Ebejer, Jane L.

    2016-01-01

    Loss of function of the hippocampus or frontal cortex is associated with reduced performance on memory tasks, in which subjects are incidentally exposed to cues at specific places in the environment and are subsequently asked to recollect the location at which the cue was experienced. Here, we examined the roles of the rodent hippocampus and frontal cortex in cue-directed attention during encoding of memory for the location of a single incidentally experienced cue. During a spatial sensory preconditioning task, rats explored an elevated platform while an auditory cue was incidentally presented at one corner. The opposite corner acted as an unpaired control location. The rats demonstrated recollection of location by avoiding the paired corner after the auditory cue was in turn paired with shock. Damage to either the dorsal hippocampus or the frontal cortex impaired this memory ability. However, we also found that hippocampal lesions enhanced attention directed towards the cue during the encoding phase, while frontal cortical lesions reduced cue-directed attention. These results suggest that the deficit in spatial sensory preconditioning caused by frontal cortical damage may be mediated by inattention to the location of cues during the latent encoding phase, while deficits following hippocampal damage must be related to other mechanisms such as generation of neural plasticity. PMID:27999366

  6. Effectiveness of auditory and tactile crossmodal cues in a dual-task visual and auditory scenario.

    Science.gov (United States)

    Hopkins, Kevin; Kass, Steven J; Blalock, Lisa Durrance; Brill, J Christopher

    2017-05-01

    In this study, we examined how spatially informative auditory and tactile cues affected participants' performance on a visual search task while they simultaneously performed a secondary auditory task. Visual search task performance was assessed via reaction time and accuracy. Tactile and auditory cues provided the approximate location of the visual target within the search display. The inclusion of tactile and auditory cues improved performance in comparison to the no-cue baseline conditions. In comparison to the no-cue conditions, both tactile and auditory cues resulted in faster response times in the visual search only (single task) and visual-auditory (dual-task) conditions. However, the effectiveness of auditory and tactile cueing for visual task accuracy was shown to be dependent on task-type condition. Crossmodal cueing remains a viable strategy for improving task performance without increasing attentional load within a singular sensory modality. Practitioner Summary: Crossmodal cueing with dual-task performance has not been widely explored, yet has practical applications. We examined the effects of auditory and tactile crossmodal cues on visual search performance, with and without a secondary auditory task. Tactile cues aided visual search accuracy when also engaged in a secondary auditory task, whereas auditory cues did not.

  7. Selective attention modulates human auditory brainstem responses: relative contributions of frequency and spatial cues.

    Directory of Open Access Journals (Sweden)

    Alexandre Lehmann

Full Text Available Selective attention is the mechanism that allows one to focus on a particular stimulus while filtering out a range of other stimuli, for instance, on a single conversation in a noisy room. Attending to one sound source rather than another changes activity in the human auditory cortex, but it is unclear whether attention to different acoustic features, such as voice pitch and speaker location, modulates subcortical activity. Studies using a dichotic listening paradigm indicated that auditory brainstem processing may be modulated by the direction of attention. We investigated whether endogenous selective attention to one of two speech signals affects amplitude and phase locking in auditory brainstem responses when the signals were discriminable either by frequency content alone or by frequency content and spatial location. Frequency-following responses to the speech sounds were significantly modulated in both conditions. The modulation was specific to the task-relevant frequency band. The effect was stronger when both frequency and spatial information were available. Patterns of response varied between participants and were correlated with psychophysical discriminability of the stimuli, suggesting that the modulation was biologically relevant. Our results demonstrate that auditory brainstem responses are susceptible to efferent modulation related to behavioral goals. Furthermore, they suggest that mechanisms of selective attention actively shape activity at early subcortical processing stages according to task relevance and based on frequency and spatial cues.

  8. Attentional reorienting triggers spatial asymmetries in a search task with cross-modal spatial cueing.

    Directory of Open Access Journals (Sweden)

    Rebecca E Paladini

Full Text Available Cross-modal spatial cueing can affect performance in a visual search task. For example, search performance improves if a visual target and an auditory cue originate from the same spatial location, and it deteriorates if they originate from different locations. Moreover, it has recently been postulated that multisensory settings, i.e., experimental settings in which critical stimuli are concurrently presented in different sensory modalities (e.g., visual and auditory), may trigger asymmetries in visuospatial attention: a facilitation has been observed for visual stimuli presented in the right compared to the left visual space. However, it remains unclear whether auditory cueing of attention differentially affects search performance in the left and the right hemifields in audio-visual search tasks. The present study investigated whether spatial asymmetries would occur in a search task with cross-modal spatial cueing. Participants completed a visual search task that contained no auditory cues (i.e., a unimodal visual condition) as well as spatially congruent, spatially incongruent, and spatially non-informative auditory cues. To further assess participants' accuracy in localising the auditory cues, a unimodal auditory spatial localisation task was also administered. The results demonstrated no left/right asymmetries in the unimodal visual search condition. Both an additional incongruent and a spatially non-informative auditory cue resulted in lateral asymmetries, with search times increased for targets presented in the left compared to the right hemifield. No such spatial asymmetry was observed in the congruent condition. However, participants' performance in the congruent condition was modulated by their tone localisation accuracy. The findings of the present study demonstrate that spatial asymmetries in multisensory processing depend on the validity of the cross-modal cues, and occur under specific attentional conditions, i.e., when

  9. Visual unimodal grouping mediates auditory attentional bias in visuo-spatial working memory.

    Science.gov (United States)

    Botta, Fabiano; Lupiáñez, Juan; Sanabria, Daniel

    2013-09-01

    Audiovisual links in spatial attention have been reported in many previous studies. However, the effectiveness of auditory spatial cues in biasing the information encoding into visuo-spatial working memory (VSWM) is still relatively unknown. In this study, we addressed this issue by combining a cuing paradigm with a change detection task in VSWM. Moreover, we manipulated the perceptual organization of the to-be-remembered visual stimuli. We hypothesized that the auditory effect on VSWM would depend on the perceptual association between the auditory cue and the visual probe. Results showed, for the first time, a significant auditory attentional bias in VSWM. However, the effect was observed only when the to-be-remembered visual stimuli were organized in two distinctive visual objects. We propose that these results shed new light on audio-visual crossmodal links in spatial attention suggesting that, apart from the spatio-temporal contingency, the likelihood of perceptual association between the auditory cue and the visual target can have a large impact on crossmodal attentional biases. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Multisensory Cues Capture Spatial Attention Regardless of Perceptual Load

    Science.gov (United States)

    Santangelo, Valerio; Spence, Charles

    2007-01-01

    We compared the ability of auditory, visual, and audiovisual (bimodal) exogenous cues to capture visuo-spatial attention under conditions of no load versus high perceptual load. Participants had to discriminate the elevation (up vs. down) of visual targets preceded by either unimodal or bimodal cues under conditions of high perceptual load (in…

  11. Sound arithmetic: auditory cues in the rehabilitation of impaired fact retrieval.

    Science.gov (United States)

    Domahs, Frank; Zamarian, Laura; Delazer, Margarete

    2008-04-01

The present single-case study describes the rehabilitation of an acquired impairment of multiplication fact retrieval. In addition to a conventional drill approach, one set of problems was preceded by auditory cues while the other set was not. After extensive repetition, non-specific improvements were observed for all trained problems (e.g., 3 * 7) as well as for their non-trained complementary problems (e.g., 7 * 3). Beyond this general improvement, specific therapy effects were found for problems trained with auditory cues. These specific effects were attributed to the involvement of implicit memory systems and/or attentional processes during training. Thus, the present results demonstrate that cues in the training of arithmetic facts do not have to be visual to be effective.

  12. Examining age-related differences in auditory attention control using a task-switching procedure.

    Science.gov (United States)

    Lawo, Vera; Koch, Iring

    2014-03-01

    Using a novel task-switching variant of dichotic selective listening, we examined age-related differences in the ability to intentionally switch auditory attention between 2 speakers defined by their sex. In our task, young (M age = 23.2 years) and older adults (M age = 66.6 years) performed a numerical size categorization on spoken number words. The task-relevant speaker was indicated by a cue prior to auditory stimulus onset. The cuing interval was either short or long and varied randomly trial by trial. We found clear performance costs with instructed attention switches. These auditory attention switch costs decreased with prolonged cue-stimulus interval. Older adults were generally much slower (but not more error prone) than young adults, but switching-related effects did not differ across age groups. These data suggest that the ability to intentionally switch auditory attention in a selective listening task is not compromised in healthy aging. We discuss the role of modality-specific factors in age-related differences.

  13. Cross-modal cueing in audiovisual spatial attention

    DEFF Research Database (Denmark)

    Blurton, Steven Paul; Greenlee, Mark W.; Gondan, Matthias

    2015-01-01

    effects have been reported for endogenous visual cues while exogenous cues seem to be mostly ineffective. In three experiments, we investigated cueing effects on the processing of audiovisual signals. In Experiment 1 we used endogenous cues to investigate their effect on the detection of auditory, visual......, and audiovisual targets presented with onset asynchrony. Consistent cueing effects were found in all target conditions. In Experiment 2 we used exogenous cues and found cueing effects only for visual target detection, but not auditory target detection. In Experiment 3 we used predictive exogenous cues to examine...

  14. Age-dependent impairment of auditory processing under spatially focused and divided attention: an electrophysiological study.

    Science.gov (United States)

    Wild-Wall, Nele; Falkenstein, Michael

    2010-01-01

Using event-related potentials (ERPs), the present study examined whether age-related differences in preparation and processing emerge especially under divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues, as reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to fulfil task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely, as reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues, especially under divided attention, and latent difficulties in suppressing irrelevant information.

  15. The interaction of cognitive load and attention-directing cues in driving.

    Science.gov (United States)

    Lee, Yi-Ching; Lee, John D; Boyle, Linda Ng

    2009-06-01

    This study investigated the effect of a nondriving cognitively loading task on the relationship between drivers' endogenous and exogenous control of attention. Previous studies have shown that cognitive load leads to a withdrawal of attention from the forward scene and a narrowed field of view, which impairs hazard detection. Posner's cue-target paradigm was modified to study how endogenous and exogenous cues interact with cognitive load to influence drivers' attention in a complex dynamic situation. In a driving simulator, pedestrian crossing signs that predicted the spatial location of pedestrians acted as endogenous cues. To impose cognitive load on drivers, we had them perform an auditory task that simulated the demands of emerging in-vehicle technology. Irrelevant exogenous cues were added to half of the experimental drives by including scene clutter. The validity of endogenous cues influenced how drivers scanned for pedestrian targets. Cognitive load delayed drivers' responses, and scene clutter reduced drivers' fixation durations to pedestrians. Cognitive load diminished the influence of exogenous cues to attract attention to irrelevant areas, and drivers were more affected by scene clutter when the endogenous cues were invalid. Cognitive load suppresses interference from irrelevant exogenous cues and delays endogenous orienting of attention in driving. The complexity of everyday tasks, such as driving, is better captured experimentally in paradigms that represent the interactive nature of attention and processing load.

  16. Verbal cues effectively orient children's auditory attention in a CV-syllable dichotic listening paradigm.

    Science.gov (United States)

    Phélip, Marion; Donnot, Julien; Vauclair, Jacques

    2015-12-18

    In their groundbreaking work featuring verbal dichotic listening tasks, Mondor and Bryden showed that tone cues do not enhance children's attentional orienting, in contrast to adults. The magnitude of the children's right-ear advantage was not attenuated when their attention was directed to the left ear. Verbal cues did, however, appear to favour the orientation of attention at around 10 years, although stimulus-onset asynchronies (SOAs), which ranged between 450 and 750 ms, were not rigorously controlled. The aim of our study was therefore to investigate the role of both types of cues in a typical CV-syllable dichotic listening task administered to 8- to 10-year-olds, applying a protocol as similar as possible to that used by Mondor and Bryden, but controlling for SOA as well as for cued ear. Results confirmed that verbal cues are more effective than tone cues in orienting children's attention. However, in contrast to adults, no effect of SOA was observed. We discuss the relative difficulty young children have processing CV syllables, as well as the role of top-down processes in attentional orienting abilities.

  17. Motor Training: Comparison of Visual and Auditory Coded Proprioceptive Cues

    Directory of Open Access Journals (Sweden)

    Philip Jepson

    2012-05-01

Full Text Available Self-perception of body posture and movement is achieved through multi-sensory integration, particularly the utilisation of vision and proprioceptive information derived from muscles and joints. Disruption to these processes can occur following a neurological accident, such as stroke, leading to sensory and physical impairment. Rehabilitation can be helped through the use of augmented visual and auditory biofeedback to stimulate neuro-plasticity, but the effective design and application of feedback, particularly in the auditory domain, is non-trivial. Simple auditory feedback was tested by comparing the stepping accuracy of normal subjects when given a visual spatial target (step length) and an auditory temporal target (step duration). A baseline measurement of step length and duration was taken using optical motion capture. Subjects (n=20) took 20 ‘training’ steps (baseline ±25%) using either an auditory target (950 Hz tone, bell-shaped gain envelope) or a visual target (spot marked on the floor), and were then asked to replicate the target step (length or duration, corresponding to training) with all feedback removed. Visual cues yielded a mean percentage error of 11.5% (SD 7.0%); auditory cues, 12.9% (SD 11.8%). Visual cues elicit a high degree of accuracy both in training and in follow-up un-cued tasks; despite the novelty of the auditory cues for subjects, their mean accuracy approached that for visual cues, and initial results suggest that a limited amount of practice using auditory cues can improve performance.

  18. The Role of Auditory Cues in the Spatial Knowledge of Blind Individuals

    Science.gov (United States)

    Papadopoulos, Konstantinos; Papadimitriou, Kimon; Koutsoklenis, Athanasios

    2012-01-01

    The study presented here sought to explore the role of auditory cues in the spatial knowledge of blind individuals by examining the relation between the perceived auditory cues and the landscape of a given area and by investigating how blind individuals use auditory cues to create cognitive maps. The findings reveal that several auditory cues…

  19. Auditory Emotional Cues Enhance Visual Perception

    Science.gov (United States)

    Zeelenberg, Rene; Bocanegra, Bruno R.

    2010-01-01

    Recent studies show that emotional stimuli impair performance to subsequently presented neutral stimuli. Here we show a cross-modal perceptual enhancement caused by emotional cues. Auditory cue words were followed by a visually presented neutral target word. Two-alternative forced-choice identification of the visual target was improved by…

  20. Volume Attenuation and High Frequency Loss as Auditory Depth Cues in Stereoscopic 3D Cinema

    Science.gov (United States)

    Manolas, Christos; Pauletto, Sandra

    2014-09-01

Assisted by the technological advances of the past decades, stereoscopic 3D (S3D) cinema is currently in the process of being established as a mainstream form of entertainment. The main focus of this collaborative effort is placed on the creation of immersive S3D visuals. However, with few exceptions, little attention has been given so far to the potential effect of the soundtrack on such environments. Sound has considerable potential both to enhance the impact of the S3D visual information and to expand the S3D cinematic world beyond the boundaries of the visuals. This article reports on our research into the possibilities of using auditory depth cues within the soundtrack as a means of affecting the perception of depth within cinematic S3D scenes. We study two main distance-related auditory cues: high-end frequency loss and overall volume attenuation. A series of experiments explored the effectiveness of these auditory cues. The results, although not conclusive, indicate that the studied auditory cues can influence the audience's judgement of depth in cinematic S3D scenes, sometimes in unexpected ways. We conclude that 3D filmmaking can benefit from further studies on the effectiveness of specific sound design techniques to enhance S3D cinema.
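
    The two distance cues studied in this record, overall volume attenuation and high-frequency loss, correspond to standard practice in audio rendering: level falls off with distance, and high frequencies are absorbed more strongly than low ones. A minimal illustrative sketch follows; it is not the authors' production pipeline, and the inverse-distance gain law, the 16 kHz reference cutoff, and the cutoff-vs-distance mapping are assumptions chosen only to make the idea concrete:

```python
import numpy as np

def apply_depth_cues(signal, distance, sample_rate=44100, ref_distance=1.0):
    """Render an auditory depth impression by (1) attenuating overall volume
    with an inverse-distance law and (2) rolling off high frequencies with a
    one-pole low-pass filter whose cutoff falls as distance grows.
    The cutoff mapping is an illustrative assumption, not a measured model."""
    # Overall volume attenuation: inverse-distance law (-6 dB per doubling).
    gain = ref_distance / max(distance, ref_distance)

    # High-frequency loss: cutoff starts at 16 kHz at the reference distance
    # and roughly halves per doubling of distance (assumed mapping).
    cutoff = 16000.0 * ref_distance / max(distance, ref_distance)

    # One-pole low-pass filter, applied sample by sample.
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff / sample_rate)
    out = np.empty(len(signal), dtype=float)
    state = 0.0
    for i, x in enumerate(signal):
        state += alpha * (gain * x - state)
        out[i] = state
    return out
```

    Under these assumptions, doubling the distance both halves the signal amplitude and halves the low-pass cutoff, so a "far" source sounds quieter and duller than a "near" one.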

  1. Attention Cueing and Activity Equally Reduce False Alarm Rate in Visual-Auditory Associative Learning through Improving Memory.

    Science.gov (United States)

    Nikouei Mahani, Mohammad-Ali; Haghgoo, Hojjat Allah; Azizi, Solmaz; Nili Ahmadabadi, Majid

    2016-01-01

In our daily life, we continually exploit already-learned multisensory associations and form new ones when facing novel situations. Improving associative learning results in higher cognitive capabilities. We experimentally and computationally studied the learning performance of healthy subjects in a visual-auditory associative learning task across active learning, attention-cueing learning, and passive learning modes. According to our results, the learning mode had no significant effect on learning associations of congruent pairs. In addition, subjects' performance in learning congruent samples was not correlated with their vigilance score. Nevertheless, the vigilance score was significantly correlated with learning performance for non-congruent pairs. Moreover, in the last block of the passive learning mode, subjects made significantly more mistakes in judging non-congruent pairs as associated and consciously reported lower confidence. These results indicate that attention and activity equally enhanced visual-auditory associative learning for non-congruent pairs, while the false alarm rate in the passive learning mode did not decrease after the second block. We investigated the cause of the higher false alarm rate in the passive learning mode using a computational model composed of a reinforcement-learning module and a memory-decay module. The results suggest that a higher rate of memory decay is the source of the additional mistakes and the lower reported confidence for non-congruent pairs in the passive learning mode.
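
    The abstract's computational account, a reinforcement-learning module whose learned associations shrink under a memory-decay module, can be caricatured in a few lines. This is a toy sketch, not the authors' model: the delta-rule update, the learning rate, and the decay rates are all assumed values chosen only to show why faster decay leaves weaker learned associations:

```python
def simulate_learning(trials, alpha=0.3, decay=0.05):
    """Toy associative learner: a prediction-error (delta-rule) update pulls
    the association strength toward each observed outcome (1 = pair presented
    as associated, 0 = not), and a memory-decay step shrinks the strength
    after every trial. Returns the final association strength in [0, 1].
    All parameter values are illustrative assumptions."""
    strength = 0.0
    for outcome in trials:
        # Reinforcement-learning module: move toward the observed outcome.
        strength += alpha * (outcome - strength)
        # Memory-decay module: forget a fixed fraction between trials.
        strength *= 1.0 - decay
    return strength
```

    With the same trial sequence, raising the decay rate lowers the final strength the learner retains, which is the qualitative pattern the authors invoke to explain the persistently high false alarm rate in the passive mode.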

  2. Individualization of music-based rhythmic auditory cueing in Parkinson's disease.

    Science.gov (United States)

    Bella, Simone Dalla; Dotov, Dobromir; Bardy, Benoît; de Cock, Valérie Cochen

    2018-06-04

    Gait dysfunctions in Parkinson's disease can be partly relieved by rhythmic auditory cueing. This consists in asking patients to walk with a rhythmic auditory stimulus such as a metronome or music. The effect on gait is visible immediately in terms of increased speed and stride length. Moreover, training programs based on rhythmic cueing can have long-term benefits. The effect of rhythmic cueing, however, varies from one patient to the other. Patients' response to the stimulation may depend on rhythmic abilities, often deteriorating with the disease. Relatively spared abilities to track the beat favor a positive response to rhythmic cueing. On the other hand, most patients with poor rhythmic abilities either do not respond to the cues or experience gait worsening when walking with cues. An individualized approach to rhythmic auditory cueing with music is proposed to cope with this variability in patients' response. This approach calls for using assistive mobile technologies capable of delivering cues that adapt in real time to patients' gait kinematics, thus affording step synchronization to the beat. Individualized rhythmic cueing can provide a safe and cost-effective alternative to standard cueing that patients may want to use in their everyday lives. © 2018 New York Academy of Sciences.
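
    The individualized approach described above calls for a cueing device whose beat timing adapts in real time to the patient's gait. One common way to sketch such adaptation (not necessarily the authors' algorithm) is a correcting metronome that nudges its inter-beat interval a fixed fraction of the way toward each observed inter-step interval; the correction gain of 0.25 is an assumption for illustration:

```python
def adapt_cue_interval(ibi, step_intervals, gain=0.25):
    """Adaptive metronome sketch: after each observed step, move the cue's
    inter-beat interval (ibi, in seconds) a fraction `gain` of the way toward
    the patient's measured inter-step interval, so the cue tracks the
    patient's gait instead of forcing a fixed tempo.
    The correction gain of 0.25 is an illustrative assumption."""
    history = []
    for isi in step_intervals:
        ibi += gain * (isi - ibi)  # partial correction toward observed gait
        history.append(ibi)
    return history
```

    Starting from a 1.0 s beat, a patient stepping steadily at 1.2 s intervals pulls the cue toward 1.2 s over a few tens of steps, so the stimulus converges on the gait rather than the reverse.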

  3. Selective Attention to Auditory Memory Neurally Enhances Perceptual Precision.

    Science.gov (United States)

    Lim, Sung-Joo; Wöstmann, Malte; Obleser, Jonas

    2015-12-09

    Selective attention to a task-relevant stimulus facilitates encoding of that stimulus into a working memory representation. It is less clear whether selective attention also improves the precision of a stimulus already represented in memory. Here, we investigate the behavioral and neural dynamics of selective attention to representations in auditory working memory (i.e., auditory objects) using psychophysical modeling and model-based analysis of electroencephalographic signals. Human listeners performed a syllable pitch discrimination task where two syllables served as to-be-encoded auditory objects. Valid (vs neutral) retroactive cues were presented during retention to allow listeners to selectively attend to the to-be-probed auditory object in memory. Behaviorally, listeners represented auditory objects in memory more precisely (expressed by steeper slopes of a psychometric curve) and made faster perceptual decisions when valid compared to neutral retrocues were presented. Neurally, valid compared to neutral retrocues elicited a larger frontocentral sustained negativity in the evoked potential as well as enhanced parietal alpha/low-beta oscillatory power (9-18 Hz) during memory retention. Critically, individual magnitudes of alpha oscillatory power (7-11 Hz) modulation predicted the degree to which valid retrocues benefitted individuals' behavior. Our results indicate that selective attention to a specific object in auditory memory does benefit human performance not by simply reducing memory load, but by actively engaging complementary neural resources to sharpen the precision of the task-relevant object in memory. Can selective attention improve the representational precision with which objects are held in memory? And if so, what are the neural mechanisms that support such improvement? These issues have been rarely examined within the auditory modality, in which acoustic signals change and vanish on a milliseconds time scale. Introducing a new auditory memory

  4. Visual Form Cues, Biological Motions, Auditory Cues, and Even Olfactory Cues Interact to Affect Visual Sex Discriminations

    OpenAIRE

    Rick Van Der Zwan; Anna Brooks; Duncan Blair; Coralia Machatch; Graeme Hacker

    2011-01-01

    Johnson and Tassinary (2005) proposed that visually perceived sex is signalled by structural or form cues. They suggested also that biological motion cues signal sex, but do so indirectly. We previously have shown that auditory cues can mediate visual sex perceptions (van der Zwan et al., 2009). Here we demonstrate that structural cues to body shape are alone sufficient for visual sex discriminations but that biological motion cues alone are not. Interestingly, biological motions can resolve ...

  5. Spatial auditory attention is modulated by tactile priming.

    Science.gov (United States)

    Menning, Hans; Ackermann, Hermann; Hertrich, Ingo; Mathiak, Klaus

    2005-07-01

    Previous studies have shown that cross-modal processing affects perception at a variety of neuronal levels. In this study, event-related brain responses were recorded via whole-head magnetoencephalography (MEG). Spatial auditory attention was directed via tactile pre-cues (primes) to one of four locations in the peripersonal space (left and right hand versus face). Auditory stimuli were white noise bursts, convolved with head-related transfer functions, which ensured spatial perception of the four locations. Tactile primes (200-300 ms prior to acoustic onset) were applied randomly to one of these locations. Attentional load was controlled by three different visual distraction tasks. The auditory P50m (about 50 ms after stimulus onset) showed a significant "proximity" effect (larger responses to face stimulation) as well as a "contralaterality" effect between side of stimulation and hemisphere. The tactile primes essentially reduced both the P50m and N100m components. However, facial tactile pre-stimulation yielded an enhanced ipsilateral N100m. These results show that earlier responses are mainly governed by exogenous stimulus properties whereas cross-sensory interaction is spatially selective at a later (endogenous) processing stage.
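The stimulus-generation step described here (noise bursts convolved with head-related transfer functions) reduces to a per-ear convolution. A minimal sketch, in which the impulse responses are crude placeholders (a pure delay plus attenuation), not measured HRTFs:

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono stimulus with a pair of head-related impulse responses."""
    return np.stack([np.convolve(mono, hrir_left),
                     np.convolve(mono, hrir_right)])

fs = 48000
rng = np.random.default_rng(0)
burst = rng.standard_normal(int(0.05 * fs))       # 50-ms white-noise burst

# Placeholder HRIRs: measured filters would encode the delay, level, and
# spectral shaping for one of the four locations (hands/face, left/right)
hrir_left = np.zeros(256); hrir_left[10] = 1.0    # shorter path to left ear
hrir_right = np.zeros(256); hrir_right[34] = 0.6  # longer path, head shadow

stereo = render_binaural(burst, hrir_left, hrir_right)
print(stereo.shape)   # (2, 2655)
```

Swapping in measured HRIR pairs per location would reproduce the externalized spatial percepts the study relies on.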

  6. Spatial Hearing with Incongruent Visual or Auditory Room Cues

    Science.gov (United States)

    Gil-Carvajal, Juan C.; Cubick, Jens; Santurette, Sébastien; Dau, Torsten

    2016-11-01

    In day-to-day life, humans usually perceive the location of sound sources as outside their heads. This externalized auditory spatial perception can be reproduced through headphones by recreating the sound pressure generated by the source at the listener’s eardrums. This requires the acoustical features of the recording environment and listener’s anatomy to be recorded at the listener’s ear canals. Although the resulting auditory images can be indistinguishable from real-world sources, their externalization may be less robust when the playback and recording environments differ. Here we tested whether a mismatch between playback and recording room reduces perceived distance, azimuthal direction, and compactness of the auditory image, and whether this is mostly due to incongruent auditory cues or to expectations generated from the visual impression of the room. Perceived distance ratings decreased significantly when collected in a more reverberant environment than the recording room, whereas azimuthal direction and compactness remained room independent. Moreover, modifying visual room-related cues had no effect on these three attributes, while incongruent auditory room-related cues between the recording and playback room did affect distance perception. Consequently, the external perception of virtual sounds depends on the degree of congruency between the acoustical features of the environment and the stimuli.

  7. Auditory feedback blocks memory benefits of cueing during sleep.

    Science.gov (United States)

    Schreiner, Thomas; Lehmann, Mick; Rasch, Björn

    2015-10-28

    It is now widely accepted that re-exposure to memory cues during sleep reactivates memories and can improve later recall. However, the underlying mechanisms are still unknown. As reactivation during wakefulness renders memories sensitive to updating, it remains an intriguing question whether reactivated memories during sleep also become susceptible to incorporating further information after the cue. Here we show that the memory benefits of cueing Dutch vocabulary during sleep are in fact completely blocked when memory cues are directly followed by either correct or conflicting auditory feedback, or a pure tone. In addition, immediate (but not delayed) auditory stimulation abolishes the characteristic increases in oscillatory theta and spindle activity typically associated with successful reactivation during sleep as revealed by high-density electroencephalography. We conclude that plastic processes associated with theta and spindle oscillations occurring during a sensitive period immediately after the cue are necessary for stabilizing reactivated memory traces during sleep.

  8. Evidence for cue-independent spatial representation in the human auditory cortex during active listening.

    Science.gov (United States)

    Higgins, Nathan C; McLaughlin, Susan A; Rinne, Teemu; Stecker, G Christopher

    2017-09-05

    Few auditory functions are as important or as universal as the capacity for auditory spatial awareness (e.g., sound localization). That ability relies on sensitivity to acoustical cues, particularly interaural time and level differences (ITD and ILD), that correlate with sound-source locations. Under nonspatial listening conditions, cortical sensitivity to ITD and ILD takes the form of broad contralaterally dominated response functions. It is unknown, however, whether that sensitivity reflects representations of the specific physical cues or a higher-order representation of auditory space (i.e., integrated cue processing), nor is it known whether responses to spatial cues are modulated by active spatial listening. To investigate, sensitivity to parametrically varied ITD or ILD cues was measured using fMRI during spatial and nonspatial listening tasks. Task type varied across blocks where targets were presented in one of three dimensions: auditory location, pitch, or visual brightness. Task effects were localized primarily to lateral posterior superior temporal gyrus (pSTG) and modulated binaural-cue response functions differently in the two hemispheres. Active spatial listening (location tasks) enhanced both contralateral and ipsilateral responses in the right hemisphere but maintained or enhanced contralateral dominance in the left hemisphere. Two observations suggest integrated processing of ITD and ILD. First, overlapping regions in medial pSTG exhibited significant sensitivity to both cues. Second, successful classification of multivoxel patterns was observed for both cue types and, critically, for cross-cue classification. Together, these results suggest a higher-order representation of auditory space in the human auditory cortex that at least partly integrates the specific underlying cues.
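The two binaural cues named here can be estimated directly from a two-channel signal. A minimal sketch (the toy signal, the broadband-only estimates, and the helper name `itd_ild` are assumptions for illustration):

```python
import numpy as np

def itd_ild(left, right, fs):
    """Estimate interaural time difference (s) and level difference (dB)."""
    # ITD: lag of the peak of the interaural cross-correlation
    xcorr = np.correlate(left, right, mode="full")
    lag = np.argmax(xcorr) - (len(right) - 1)  # samples by which `left` lags
    itd = -lag / fs                            # positive when the left ear leads
    # ILD: broadband RMS level ratio between the two ears
    ild = 20 * np.log10(np.sqrt(np.mean(left ** 2)) / np.sqrt(np.mean(right ** 2)))
    return itd, ild

# Toy input: the right channel is a 0.5-ms delayed, 6-dB attenuated copy of the
# left channel, roughly mimicking a broadband source to the listener's left
fs = 48000
rng = np.random.default_rng(0)
sig = rng.standard_normal(fs // 10)
delay = int(0.0005 * fs)                       # 24 samples
left = np.concatenate([sig, np.zeros(delay)])
right = np.concatenate([np.zeros(delay), 0.5 * sig])

itd, ild = itd_ild(left, right, fs)
print(round(itd * 1000, 3), round(ild, 2))     # 0.5 6.02
```

The study varies exactly these two quantities parametrically while holding the task constant.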

  9. Grasp cueing and joint attention.

    Science.gov (United States)

    Tschentscher, Nadja; Fischer, Martin H

    2008-10-01

    We studied how two different hand posture cues affect joint attention in normal observers. Visual targets appeared over lateralized objects, with different delays after centrally presented hand postures. Attention was cued by either hand direction or the congruency between hand aperture and object size. Participants pressed a button when they detected a target. Direction cues alone facilitated target detection following short delays but aperture cues alone were ineffective. In contrast, when hand postures combined direction and aperture cues, aperture congruency effects without directional congruency effects emerged and persisted, but only for power grips. These results suggest that parallel parameter specification makes joint attention mechanisms exquisitely sensitive to the timing and content of contextual cues.

  10. Modeling Auditory-Haptic Interface Cues from an Analog Multi-line Telephone

    Science.gov (United States)

    Begault, Durand R.; Anderson, Mark R.; Bittner, Rachael M.

    2012-01-01

    The Western Electric Company produced a multi-line telephone during the 1940s-1970s using a six-button interface design that provided robust tactile, haptic and auditory cues regarding the "state" of the communication system. This multi-line telephone was used as a model for a trade study comparison of two interfaces: a touchscreen interface (iPad) versus a pressure-sensitive strain gauge button interface (Phidget USB interface controllers). The experiment and its results are detailed in the authors' AES 133rd convention paper "Multimodal Information Management: Evaluation of Auditory and Haptic Cues for NextGen Communication Displays". This Engineering Brief describes how the interface logic, visual indications, and auditory cues of the original telephone were synthesized using MAX/MSP, including the logic for line selection, line hold, and priority line activation.

  11. Rhythmic Auditory Cueing in Motor Rehabilitation for Stroke Patients: Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Yoo, Ga Eul; Kim, Soo Ji

    2016-01-01

    Given the increasing evidence demonstrating the effects of rhythmic auditory cueing for motor rehabilitation of stroke patients, this synthesized analysis is needed in order to improve rehabilitative practice and maximize clinical effectiveness. This study aimed to systematically analyze the literature on rhythmic auditory cueing for motor rehabilitation of stroke patients by highlighting the outcome variables, type of cueing, and stage of stroke. A systematic review with meta-analysis of randomized controlled or clinically controlled trials was conducted. Electronic databases and music therapy journals were searched for studies including stroke, the use of rhythmic auditory cueing, and motor outcomes, such as gait and upper-extremity function. A total of 10 studies (RCT or CCT) with 356 individuals were included for meta-analysis. There were large effect sizes (Hedges's g = 0.984 for walking velocity; Hedges's g = 0.840 for cadence; Hedges's g = 0.760 for stride length; and Hedges's g = 0.456 for Fugl-Meyer test scores) in the use of rhythmic auditory cueing. Additional subgroup analysis demonstrated that although the type of rhythmic cueing and stage of stroke did not lead to statistically substantial group differences, the effect sizes and heterogeneity values in each subgroup implied possible differences in treatment effect. This study corroborates the beneficial effects of rhythmic auditory cueing, supporting its expanded application to broadened areas of rehabilitation for stroke patients. It also suggests future investigation of differential outcomes depending on the type and intensity of rhythmic auditory cueing implemented.
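The Hedges's g values reported above are standardized mean differences with a small-sample bias correction. A sketch of the computation (the group statistics below are made-up numbers, not data from the reviewed trials):

```python
import math

def hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference with small-sample correction (Hedges' g)."""
    # Pooled standard deviation across treatment and control groups
    s_pooled = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                         / (n_t + n_c - 2))
    d = (mean_t - mean_c) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)         # small-sample correction factor
    return j * d

# Hypothetical example: cued vs. uncued walking velocity (m/min)
g = hedges_g(mean_t=52.0, sd_t=10.0, n_t=20, mean_c=42.0, sd_c=10.0, n_c=20)
print(round(g, 3))   # 0.98
```

A meta-analysis then pools such per-study g values, weighting each by its inverse variance.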

  12. Auditory Verbal Cues Alter the Perceived Flavor of Beverages and Ease of Swallowing: A Psychometric and Electrophysiological Analysis

    Directory of Open Access Journals (Sweden)

    Aya Nakamura

    2013-01-01

    Full Text Available We investigated the possible effects of auditory verbal cues on flavor perception and swallow physiology for younger and elderly participants. Apple juice, aojiru (grass juice), and water were ingested with or without auditory verbal cues. Flavor perception and ease of swallowing were measured using a visual analog scale, and swallow physiology by surface electromyography and cervical auscultation. The auditory verbal cues had significant positive effects on flavor and ease of swallowing as well as on swallow physiology. The taste score and the ease of swallowing score significantly increased when the participant’s anticipation was primed by accurate auditory verbal cues. There was no significant effect of auditory verbal cues on distaste score. Regardless of age, the maximum suprahyoid muscle activity significantly decreased when a beverage was ingested without auditory verbal cues. The interval between the onset of swallowing sounds and the peak timing point of the infrahyoid muscle activity significantly shortened when the anticipation induced by the cue was contradicted in the elderly participant group. These results suggest that auditory verbal cues can improve the perceived flavor of beverages and swallow physiology.

  13. Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss.

    Science.gov (United States)

    Kolarik, Andrew J; Moore, Brian C J; Zahorik, Pavel; Cirstea, Silvia; Pardhan, Shahina

    2016-02-01

    Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss.

  14. Auditory attention activates peripheral visual cortex.

    Directory of Open Access Journals (Sweden)

    Anthony D Cate

    Full Text Available BACKGROUND: Recent neuroimaging studies have revealed that putatively unimodal regions of visual cortex can be activated during auditory tasks in sighted as well as in blind subjects. However, the task determinants and functional significance of auditory occipital activations (AOAs) remain unclear. METHODOLOGY/PRINCIPAL FINDINGS: We examined AOAs in an intermodal selective attention task to distinguish whether they were stimulus-bound or recruited by higher-level cognitive operations associated with auditory attention. Cortical surface mapping showed that auditory occipital activations were localized to retinotopic visual cortex subserving the far peripheral visual field. AOAs depended strictly on the sustained engagement of auditory attention and were enhanced in more difficult listening conditions. In contrast, unattended sounds produced no AOAs regardless of their intensity, spatial location, or frequency. CONCLUSIONS/SIGNIFICANCE: Auditory attention, but not passive exposure to sounds, routinely activated peripheral regions of visual cortex when subjects attended to sound sources outside the visual field. Functional connections between auditory cortex and visual cortex subserving the peripheral visual field appear to underlie the generation of AOAs, which may reflect the priming of visual regions to process soon-to-appear objects associated with unseen sound sources.

  15. Cueing listeners to attend to a target talker progressively improves word report as the duration of the cue-target interval lengthens to 2,000 ms.

    Science.gov (United States)

    Holmes, Emma; Kitterick, Padraig T; Summerfield, A Quentin

    2018-04-25

    Endogenous attention is typically studied by presenting instructive cues in advance of a target stimulus array. For endogenous visual attention, task performance improves as the duration of the cue-target interval increases up to 800 ms. Less is known about how endogenous auditory attention unfolds over time or the mechanisms by which an instructive cue presented in advance of an auditory array improves performance. The current experiment used five cue-target intervals (0, 250, 500, 1,000, and 2,000 ms) to compare four hypotheses for how preparatory attention develops over time in a multi-talker listening task. Young adults were cued to attend to a target talker who spoke in a mixture of three talkers. Visual cues indicated the target talker's spatial location or their gender. Participants directed attention to location and gender simultaneously ("objects") at all cue-target intervals. Participants were consistently faster and more accurate at reporting words spoken by the target talker when the cue-target interval was 2,000 ms than 0 ms. In addition, the latency of correct responses progressively shortened as the duration of the cue-target interval increased from 0 to 2,000 ms. These findings suggest that the mechanisms involved in preparatory auditory attention develop gradually over time, taking at least 2,000 ms to reach optimal configuration, yet providing cumulative improvements in speech intelligibility as the duration of the cue-target interval increases from 0 to 2,000 ms. These results demonstrate an improvement in performance for cue-target intervals longer than those that have been reported previously in the visual or auditory modalities.

  16. Using Auditory Cues to Perceptually Extract Visual Data in Collaborative, Immersive Big-Data Display Systems

    Science.gov (United States)

    Lee, Wendy

    The advent of multisensory display systems, such as virtual and augmented reality, has fostered a new relationship between humans and space. Not only can these systems mimic real-world environments, they have the ability to create a new space typology made solely of data. In these spaces, two-dimensional information is displayed in three dimensions, requiring human senses to be used to understand virtual, attention-based elements. Studies in the field of big data have predominately focused on visual representations and extractions of information with little focus on sounds. The goal of this research is to evaluate the most efficient methods of perceptually extracting visual data using auditory stimuli in immersive environments. Using Rensselaer's CRAIVE-Lab, a virtual reality space with 360-degree panorama visuals and an array of 128 loudspeakers, participants were asked questions based on complex visual displays using a variety of auditory cues ranging from sine tones to camera shutter sounds. Analysis of the speed and accuracy of participant responses revealed that auditory cues that were more favorable for localization and were positively perceived were best for data extraction and could help create more user-friendly systems in the future.

  17. A dominance hierarchy of auditory spatial cues in barn owls.

    Directory of Open Access Journals (Sweden)

    Ilana B Witten

    2010-04-01

    Full Text Available Barn owls integrate spatial information across frequency channels to localize sounds in space. We presented barn owls with synchronous sounds that contained different bands of frequencies (3-5 kHz and 7-9 kHz) from different locations in space. When the owls were confronted with the conflicting localization cues from two synchronous sounds of equal level, their orienting responses were dominated by one of the sounds: they oriented toward the location of the low frequency sound when the sources were separated in azimuth; in contrast, they oriented toward the location of the high frequency sound when the sources were separated in elevation. We identified neural correlates of this behavioral effect in the optic tectum (OT, superior colliculus in mammals), which contains a map of auditory space and is involved in generating orienting movements to sounds. We found that low frequency cues dominate the representation of sound azimuth in the OT space map, whereas high frequency cues dominate the representation of sound elevation. We argue that the dominance hierarchy of localization cues reflects several factors: (1) the relative amplitude of the sound providing the cue, (2) the resolution with which the auditory system measures the value of a cue, and (3) the spatial ambiguity in interpreting the cue. These same factors may contribute to the relative weighting of sound localization cues in other species, including humans.

  18. Spatial selective auditory attention in the presence of reverberant energy: individual differences in normal-hearing listeners.

    Science.gov (United States)

    Ruggles, Dorea; Shinn-Cunningham, Barbara

    2011-06-01

    Listeners can selectively attend to a desired target by directing attention to known target source features, such as location or pitch. Reverberation, however, reduces the reliability of the cues that allow a target source to be segregated and selected from a sound mixture. Given this, it is likely that reverberant energy interferes with selective auditory attention. Anecdotal reports suggest that the ability to focus spatial auditory attention degrades even with early aging, yet there is little evidence that middle-aged listeners have behavioral deficits on tasks requiring selective auditory attention. The current study was designed to look for individual differences in selective attention ability and to see if any such differences correlate with age. Normal-hearing adults, ranging in age from 18 to 55 years, were asked to report a stream of digits located directly ahead in a simulated rectangular room. Simultaneous, competing masker digit streams were simulated at locations 15° left and right of center. The level of reverberation was varied to alter task difficulty by interfering with localization cues (increasing localization blur). Overall, performance was best in the anechoic condition and worst in the high-reverberation condition. Listeners nearly always reported a digit from one of the three competing streams, showing that reverberation did not render the digits unintelligible. Importantly, inter-subject differences were extremely large. These differences, however, were not significantly correlated with age, memory span, or hearing status. These results show that listeners with audiometrically normal pure tone thresholds differ in their ability to selectively attend to a desired source, a task important in everyday communication. Further work is necessary to determine if these differences arise from differences in peripheral auditory function or in more central function.

  19. Encoding of Sucrose's Palatability in the Nucleus Accumbens Shell and Its Modulation by Exteroceptive Auditory Cues

    Directory of Open Access Journals (Sweden)

    Miguel Villavicencio

    2018-05-01

    Full Text Available Although the palatability of sucrose is the primary reason why it is overconsumed, it is not well understood how it is encoded in the nucleus accumbens shell (NAcSh), a brain region involved in reward, feeding, and sensory/motor transformations. Similarly, untouched are issues regarding how an external auditory stimulus affects sucrose palatability and, in the NAcSh, the neuronal correlates of this behavior. To address these questions in behaving rats, we investigated how food-related auditory cues modulate sucrose's palatability. The goals are to determine whether NAcSh neuronal responses would track sucrose's palatability (as measured by the increase in hedonically positive oromotor responses, i.e., lick rate) and sucrose concentration, and how the NAcSh processes auditory information. Using brief-access tests, we found that sucrose's palatability was enhanced by exteroceptive auditory cues that signal the start and the end of a reward epoch. With only the start cue the rejection of water was accelerated, and the sucrose/water ratio was enhanced, indicating greater palatability. However, the start cue also fragmented licking patterns and decreased caloric intake. In the presence of both start and stop cues, the animals fed continuously and increased their caloric intake. Analysis of the licking microstructure confirmed that auditory cues (either signaling the start alone or start/stop) enhanced sucrose's oromotor-palatability responses. Recordings of extracellular single-unit activity identified several distinct populations of NAcSh responses that tracked either the sucrose palatability responses or the sucrose concentrations by increasing or decreasing their activity. Another neural population fired synchronously with licking and exhibited an enhancement in their coherence with increasing sucrose concentrations. The populations of NAcSh's Palatability-related and Lick-Inactive neurons were the most important for decoding sucrose's palatability.
Only the Lick ...

  20. The effects of divided attention on auditory priming.

    Science.gov (United States)

    Mulligan, Neil W; Duke, Marquinn; Cooper, Angela W

    2007-09-01

    Traditional theorizing stresses the importance of attentional state during encoding for later memory, based primarily on research with explicit memory. Recent research has begun to investigate the role of attention in implicit memory but has focused almost exclusively on priming in the visual modality. The present experiments examined the effect of divided attention on auditory implicit memory, using auditory perceptual identification, word-stem completion and word-fragment completion. Participants heard study words under full attention conditions or while simultaneously carrying out a distractor task (the divided attention condition). In Experiment 1, a distractor task with low response frequency failed to disrupt later auditory priming (but diminished explicit memory as assessed with auditory recognition). In Experiment 2, a distractor task with greater response frequency disrupted priming on all three of the auditory priming tasks as well as the explicit test. These results imply that although auditory priming is less reliant on attention than explicit memory, it is still greatly affected by at least some divided-attention manipulations. These results are consistent with research using visual priming tasks and have relevance for hypotheses regarding attention and auditory priming.

  1. The attenuation of auditory neglect by implicit cues.

    Science.gov (United States)

    Coleman, A Rand; Williams, J Michael

    2006-09-01

    This study examined implicit semantic and rhyming cues on perception of auditory stimuli among nonaphasic participants who suffered a lesion of the right cerebral hemisphere and auditory neglect of sound perceived by the left ear. Because language represents an elaborate processing of auditory stimuli and the language centers were intact among these patients, it was hypothesized that interactive verbal stimuli presented in a dichotic manner would attenuate neglect. The selected participants were administered an experimental dichotic listening test composed of six types of word pairs: unrelated words, synonyms, antonyms, categorically related words, compound words, and rhyming words. Presentation of word pairs that were semantically related resulted in a dramatic reduction of auditory neglect. Dichotic presentations of rhyming words exacerbated auditory neglect. These findings suggest that the perception of auditory information is strongly affected by the specific content conveyed by the auditory system. Language centers will process a degraded stimulus that contains salient language content. A degraded auditory stimulus is neglected if it is devoid of content that activates the language centers or other cognitive systems. In general, these findings suggest that auditory neglect involves a complex interaction of intact and impaired cerebral processing centers with content that is selectively processed by these centers.

  2. Attention and memory protection: Interactions between retrospective attention cueing and interference.

    Science.gov (United States)

    Makovski, Tal; Pertzov, Yoni

    2015-01-01

    Visual working memory (VWM) and attention have a number of features in common, but despite extensive research it is still unclear how the two interact. Can focused attention improve VWM precision? Can it protect VWM from interference? Here we used a partial-report, continuous-response orientation memory task to examine how attention and interference affect different aspects of VWM and how they interact with one another. Both attention and interference were orthogonally manipulated during the retention interval. Attention was manipulated by presenting informative retro-cues, whereas interference was manipulated by introducing a secondary interfering task. Mixture-model analyses revealed that retro-cues, compared to uninformative cues, improved all aspects of performance: Attention increased recall precision and decreased guessing rate and swap-errors (reporting a wrong item in memory). Similarly, performing a secondary task impaired all aspects of the VWM task. In particular, an interaction between retro-cue and secondary task interference was found primarily for swap-errors. Together these results suggest that both the quantity and quality of VWM representations are sensitive to attention cueing and interference modulations, and they highlight the role of attention in protecting the feature-location associations needed to access the correct items in memory.
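The "mixture-model analyses" referred to in this record typically follow the Zhang and Luck continuous-report model: each response error is either centered on the target (a von Mises component, whose concentration indexes precision) or a uniform guess. A sketch on simulated data (the true parameters g = 0.2 and kappa = 8 are invented for the demonstration; the study's full model also includes a swap component for reports of wrong items):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import vonmises

def neg_log_lik(params, errors):
    """Negative log-likelihood of the two-component mixture model."""
    g, kappa = params
    p = (1 - g) * vonmises.pdf(errors, kappa) + g / (2 * np.pi)
    return -np.sum(np.log(p))

# Simulate 1,000 report errors: 20% uniform guesses, 80% von Mises (kappa = 8)
rng = np.random.default_rng(0)
n = 1000
is_guess = rng.random(n) < 0.2
errors = np.where(is_guess,
                  rng.uniform(-np.pi, np.pi, n),
                  rng.vonmises(0.0, 8.0, n))

fit = minimize(neg_log_lik, x0=[0.5, 2.0], args=(errors,),
               bounds=[(1e-3, 1 - 1e-3), (0.1, 100.0)])
g_hat, kappa_hat = fit.x   # recovered guess rate and memory precision
```

In this framework, the retro-cue benefits reported above correspond to a lower fitted guess rate and a higher fitted concentration than under uninformative cues.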

  3. Effects of auditory cues on gait initiation and turning in patients with Parkinson's disease.

    Science.gov (United States)

    Gómez-González, J; Martín-Casas, P; Cano-de-la-Cuerda, R

    2016-12-08

    To review the available scientific evidence about the effectiveness of auditory cues during gait initiation and turning in patients with Parkinson's disease. We conducted a literature search in the following databases: Brain, PubMed, Medline, CINAHL, Scopus, Science Direct, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Library Plus, CENTRAL, Trip Database, PEDro, DARE, OTseeker, and Google Scholar. We included all studies published between 2007 and 2016 that evaluated the influence of auditory cues on independent gait initiation and turning in patients with Parkinson's disease. The methodological quality of the studies was assessed with the Jadad scale. We included 13 studies, all of which had a low methodological quality (Jadad scale score ≤ 2). In these studies, high-intensity, high-frequency auditory cues had a positive impact on gait initiation and turning. More specifically, they 1) improved spatiotemporal and kinematic parameters; 2) decreased freezing, turning duration, and falls; and 3) increased gait initiation speed, muscle activation, and gait speed and cadence in patients with Parkinson's disease. Studies of better methodological quality are needed to establish the Parkinson's disease stage in which auditory cues are most beneficial, as well as to determine the most effective type and frequency of the auditory cue during gait initiation and turning in patients with Parkinson's disease.

  4. The influence of an auditory-memory attention-demanding task on postural control in blind persons.

    Science.gov (United States)

    Melzer, Itshak; Damry, Elad; Landau, Anat; Yagev, Ronit

    2011-05-01

In order to evaluate the effect of an auditory-memory attention-demanding task on balance control, nine blind adults were compared to nine age- and gender-matched sighted controls. This issue is particularly relevant for the blind population, in whom postural control must be assessed under "real life" combined motor and cognitive demands. The study aimed to explore whether an auditory-memory attention-demanding cognitive task would influence postural control in blind persons and to compare this with blindfolded sighted persons. Subjects were instructed to minimize body sway during narrow-base upright standing on a single force platform under two conditions: 1) standing still (single task); 2) as in 1) while performing an auditory-memory attention-demanding cognitive task (dual task). Subjects in both groups were required to stand blindfolded with their eyes closed. Center of Pressure displacement data were collected and analyzed using summary statistics and stabilogram-diffusion analysis. Blind and sighted subjects had similar postural sway in the eyes-closed condition. However, for the dual compared to the single task, sighted subjects showed a significant decrease in postural sway while blind subjects did not. The auditory-memory attention-demanding cognitive task had no interference effect on balance control in blind subjects. It seems that sighted individuals used auditory cues to compensate for momentary loss of vision, whereas blind subjects did not. This may suggest that blind and sighted people use different sensorimotor strategies to achieve stability. Copyright © 2010 Elsevier Ltd. All rights reserved.
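
    Stabilogram-diffusion analysis, used in the study above, treats the centre-of-pressure (COP) trajectory like a random walk and plots mean square displacement against time lag. A minimal sketch on a simulated one-dimensional COP trace (all values illustrative, not data from the study):

    ```python
    import numpy as np

    def stabilogram_diffusion(cop, fs, max_lag_s=2.0):
        """Mean square displacement <dx^2(dt)> of a COP trace as a
        function of time lag dt: the curve plotted in
        stabilogram-diffusion analysis."""
        max_lag = int(max_lag_s * fs)
        lags = np.arange(1, max_lag + 1)
        msd = np.array([np.mean((cop[m:] - cop[:-m]) ** 2) for m in lags])
        return lags / fs, msd

    # Toy COP signal: random-walk-like sway sampled at 100 Hz for 60 s.
    rng = np.random.default_rng(1)
    cop = np.cumsum(rng.normal(0, 0.1, 6000))

    dt, msd = stabilogram_diffusion(cop, fs=100)
    # Short-term slope of the diffusion plot (proportional to the
    # short-term diffusion coefficient):
    slope = np.polyfit(dt[:50], msd[:50], 1)[0]
    print(round(slope, 2))
    ```

    On real data the short- and long-term slopes (and the crossover point between them) are compared across conditions and groups.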

  5. Adapting the Theory of Visual Attention (TVA) to model auditory attention

    DEFF Research Database (Denmark)

    Roberts, Katherine L.; Andersen, Tobias; Kyllingsbæk, Søren

Mathematical and computational models have provided useful insights into normal and impaired visual attention, but less progress has been made in modelling auditory attention. We are developing a Theory of Auditory Attention (TAA), based on an influential visual model, the Theory of Visual Attention (TVA). We report that TVA provides a good fit to auditory data when the stimuli are closely matched to those used in visual studies. In the basic visual TVA task, participants view a brief display of letters and are asked to report either all of the letters (whole report) or a subset of letters (e.g., …) […] the auditory data, producing good estimates of the rate at which information is encoded (C), the minimum exposure duration required for processing to begin (t0), and the relative attentional weight to targets versus distractors (α). Future work will address the issue of target-distractor confusion, and extend […]
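
    The TVA parameters named above (C, t0, α) come from a fixed-capacity exponential race: an item's probability of being encoded by exposure duration t is 1 − exp(−v(t − t0)), where its rate v is its share of the total capacity C. A minimal sketch with illustrative parameter values (not estimates from the study):

    ```python
    import numpy as np

    def p_encoded(t, v, t0):
        """TVA exponential race: probability an item is encoded into
        short-term memory by exposure duration t (ms), given race rate v
        (items/ms) and perceptual threshold t0 (ms)."""
        t = np.asarray(t, dtype=float)
        return np.where(t > t0, 1.0 - np.exp(-v * (t - t0)), 0.0)

    # Illustrative parameters (assumed, not fitted values):
    C, t0 = 0.05, 20.0                 # capacity (items/ms), threshold (ms)
    w_t, w_d = 1.0, 0.4                # attentional weights
    alpha = w_d / w_t                  # relative distractor weight

    # Two targets and two distractors compete for the fixed capacity C;
    # each item's race rate is its share of the total weight.
    total_w = 2 * w_t + 2 * w_d
    v_target = C * w_t / total_w
    v_distractor = C * w_d / total_w   # equals v_target * alpha

    for t in (50, 100, 200):
        print(t, float(p_encoded(t, v_target, t0)),
                 float(p_encoded(t, v_distractor, t0)))
    ```

    Fitting these curves to report accuracy at several exposure durations yields the C, t0, and α estimates the abstract refers to.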

  6. Posttraining handling facilitates memory for auditory-cue fear conditioning in rats.

    Science.gov (United States)

    Hui, Isabel R; Hui, Gabriel K; Roozendaal, Benno; McGaugh, James L; Weinberger, Norman M

    2006-09-01

    A large number of studies have indicated that stress exposure or the administration of stress hormones and other neuroactive drugs immediately after a learning experience modulates the consolidation of long-term memory. However, there has been little investigation into how arousal induced by handling of the animals in order to administer these drugs affects memory. Therefore, the present study examined whether the posttraining injection or handling procedure per se affects memory of auditory-cue classical fear conditioning. Male Sprague-Dawley rats, which had been pre-handled on three days for 1 min each prior to conditioning, received three pairings of a single-frequency auditory stimulus and footshock, followed immediately by either a subcutaneous injection of a vehicle solution or brief handling without injection. A control group was placed back into their home cages without receiving any posttraining treatment. Retention was tested 24 h later in a novel chamber and suppression of ongoing motor behavior during a 10-s presentation of the auditory-cue served as the measure of conditioned fear. Animals that received posttraining injection or handling did not differ from each other but showed significantly less stimulus-induced movement compared to the non-handled control group. These findings thus indicate that the posttraining injection or handling procedure is sufficiently arousing or stressful to facilitate memory consolidation of auditory-cue classical fear conditioning.

  7. What You Don't Notice Can Harm You: Age-Related Differences in Detecting Concurrent Visual, Auditory, and Tactile Cues.

    Science.gov (United States)

    Pitts, Brandon J; Sarter, Nadine

    2018-06-01

    Objective This research sought to determine whether people can perceive and process three nonredundant (and unrelated) signals in vision, hearing, and touch at the same time and how aging and concurrent task demands affect this ability. Background Multimodal displays have been shown to improve multitasking and attention management; however, their potential limitations are not well understood. The majority of studies on multimodal information presentation have focused on the processing of only two concurrent and, most often, redundant cues by younger participants. Method Two experiments were conducted in which younger and older adults detected and responded to a series of singles, pairs, and triplets of visual, auditory, and tactile cues in the absence (Experiment 1) and presence (Experiment 2) of an ongoing simulated driving task. Detection rates, response times, and driving task performance were measured. Results Compared to younger participants, older adults showed longer response times and higher error rates in response to cues/cue combinations. Older participants often missed the tactile cue when three cues were combined. They sometimes falsely reported the presence of a visual cue when presented with a pair of auditory and tactile signals. Driving performance suffered most in the presence of cue triplets. Conclusion People are more likely to miss information if more than two concurrent nonredundant signals are presented to different sensory channels. Application The findings from this work help inform the design of multimodal displays and ensure their usefulness across different age groups and in various application domains.

  8. Role of Speaker Cues in Attention Inference

    Directory of Open Access Journals (Sweden)

    Jin Joo Lee

    2017-10-01

Current state-of-the-art approaches to emotion recognition primarily focus on modeling the nonverbal expressions of the sole individual without reference to contextual elements such as the co-presence of the partner. In this paper, we demonstrate that the accurate inference of listeners’ social-emotional state of attention depends on accounting for the nonverbal behaviors of their storytelling partner, namely their speaker cues. To gain a deeper understanding of the role of speaker cues in attention inference, we conduct investigations into real-world interactions of children (5–6 years old) storytelling with their peers. Through in-depth analysis of human–human interaction data, we first identify nonverbal speaker cues (i.e., backchannel-inviting cues) and listener responses (i.e., backchannel feedback). We then demonstrate how speaker cues can modify the interpretation of attention-related backchannels as well as serve as a means to regulate the responsiveness of listeners. We discuss the design implications of our findings toward our primary goal of developing attention recognition models for storytelling robots, and we argue that social robots can proactively use speaker cues to form more accurate inferences about the attentive state of their human partners.

  9. Attentional bias for craving-related (chocolate) food cues.

    Science.gov (United States)

    Kemps, Eva; Tiggemann, Marika

    2009-12-01

    In this study, we investigated attentional biases for craving-related food cues. A pictorial dot probe task was used to assess selective attentional processing of one particular highly desired food, namely chocolate, relative to that of other highly desired foods. In Experiment 1, we examined biased processing of chocolate cues in habitual (trait) chocolate cravers, whereas in Experiment 2 we investigated the effect of experimentally induced (state) chocolate cravings on such processing. As predicted, habitual chocolate cravers (Experiment 1) and individuals in whom a craving for chocolate was temporarily induced (Experiment 2) showed speeded detection of probes replacing chocolate-related pictures, demonstrating an attentional bias for chocolate cues. Subsequent examination indicated that in both experiments the observed attentional biases stemmed from difficulty in disengaging attention from chocolate cues rather than from a shift of attention toward such cues. The findings have important theoretical and practical implications.
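
    In a dot-probe task, the attentional bias score is the mean reaction time when the probe replaces the neutral cue minus the mean when it replaces the craved cue; adding a neutral-neutral baseline lets the bias be split into engagement and disengagement components, as in the follow-up analysis above. A toy computation with hypothetical reaction times (all numbers invented):

    ```python
    import numpy as np

    # Hypothetical per-subject mean RTs (ms) from a dot-probe task.
    rt_probe_at_chocolate = np.array([512, 498, 505, 520, 490.])
    rt_probe_at_neutral   = np.array([541, 530, 528, 550, 522.])
    rt_neutral_baseline   = np.array([525, 519, 515, 536, 508.])  # neutral-neutral trials

    # Positive bias = faster when the probe replaces the chocolate cue.
    bias = rt_probe_at_neutral.mean() - rt_probe_at_chocolate.mean()
    # Engagement: speeding relative to baseline at the chocolate location.
    engagement = rt_neutral_baseline.mean() - rt_probe_at_chocolate.mean()
    # Disengagement: slowing relative to baseline at the neutral location.
    disengagement = rt_probe_at_neutral.mean() - rt_neutral_baseline.mean()

    print(f"bias {bias:.1f} ms = engagement {engagement:.1f} "
          f"+ disengagement {disengagement:.1f}")
    ```

    A bias driven mainly by the disengagement term is the pattern the authors report: difficulty pulling attention away from chocolate cues rather than a faster shift toward them.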

  10. Effect of rhythmic auditory cueing on gait in cerebral palsy: a systematic review and meta-analysis.

    Science.gov (United States)

    Ghai, Shashank; Ghai, Ishan; Effenberg, Alfred O

    2018-01-01

Auditory entrainment can influence gait performance in movement disorders. The entrainment can incite neurophysiological and musculoskeletal changes to enhance motor execution. However, a consensus as to its effects based on gait in people with cerebral palsy is still warranted. A systematic review and meta-analysis were carried out to analyze the effects of rhythmic auditory cueing on spatiotemporal and kinematic parameters of gait in people with cerebral palsy. Systematic identification of published literature was performed adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses and American Academy for Cerebral Palsy and Developmental Medicine guidelines, from inception until July 2017, on online databases: Web of Science, PEDro, EBSCO, Medline, Cochrane, Embase and ProQuest. Kinematic and spatiotemporal gait parameters were evaluated in a meta-analysis across studies. Of 547 records, nine studies involving 227 participants (108 children/119 adults) met our inclusion criteria. The qualitative review suggested beneficial effects of rhythmic auditory cueing on gait performance among all included studies. The meta-analysis revealed beneficial effects of rhythmic auditory cueing on gait dynamic index (Hedge's g = 0.9), gait velocity (1.1), cadence (0.3), and stride length (0.5). This review for the first time suggests converging evidence toward application of rhythmic auditory cueing to enhance gait performance and stability in people with cerebral palsy. This article details underlying neurophysiological mechanisms and use of cueing as an efficient home-based intervention. It bridges gaps in the literature, and suggests translational approaches on how rhythmic auditory cueing can be incorporated in rehabilitation approaches to enhance gait performance in people with cerebral palsy.
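
    The effect sizes reported above are Hedge's g, i.e. Cohen's d with a small-sample bias correction. A minimal implementation (the example means, SDs, and group sizes are invented, not values from the meta-analysis):

    ```python
    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Hedge's g: standardized mean difference (Cohen's d on the
        pooled SD) times the small-sample correction
        J = 1 - 3 / (4*(n1 + n2) - 9)."""
        pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                              / (n1 + n2 - 2))
        d = (m1 - m2) / pooled_sd
        return d * (1 - 3 / (4 * (n1 + n2) - 9))

    # Hypothetical cued vs. uncued gait velocity (m/s) in two small groups.
    g = hedges_g(1.10, 0.20, 15, 0.95, 0.18, 15)
    print(round(g, 2))
    ```

    A meta-analysis then pools such per-study g values, typically weighting each by its inverse variance.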

  11. Auditory and visual cueing modulate cycling speed of older adults and persons with Parkinson's disease in a Virtual Cycling (V-Cycle) system.

    Science.gov (United States)

    Gallagher, Rosemary; Damodaran, Harish; Werner, William G; Powell, Wendy; Deutsch, Judith E

    2016-08-19

Evidence-based virtual environments (VEs) that incorporate compensatory strategies such as cueing may change motor behavior and increase exercise intensity while also being engaging and motivating. The purpose of this study was to determine if persons with Parkinson's disease and age-matched healthy adults responded to auditory and visual cueing embedded in a bicycling VE as a method to increase exercise intensity. We tested two groups of participants, persons with Parkinson's disease (PD) (n = 15) and age-matched healthy adults (n = 13), as they cycled on a stationary bicycle while interacting with a VE. Participants cycled under two conditions: auditory cueing (provided by a metronome) and visual cueing (represented as central road markers in the VE). The auditory condition had four trials in which auditory cues or the VE were presented alone or in combination. The visual condition had five trials in which the VE and visual cue rate presentation was manipulated. Data were analyzed by condition using factorial RM-ANOVAs with planned t-tests corrected for multiple comparisons. There were no differences in pedaling rates between groups for both the auditory and visual cueing conditions. Persons with PD increased their pedaling rate in the auditory (F = 4.78, p = 0.029) and visual cueing (F = 26.48, p < …) conditions […] auditory (F = 24.72, p < …) […] visual cueing (F = 40.69, p < …) […] The visual condition in age-matched healthy adults showed a step-wise increase in pedaling rate (p = 0.003 to p < …) […] visual cues (p < …) […] visual cues in order to obtain an increase in cycling intensity. The combination of the VE and auditory cues was neither additive nor interfering. These data serve as preliminary evidence that embedding auditory and visual cues in a VE to alter cycling speed is a method of increasing exercise intensity that may promote fitness.

  12. Effect of rhythmic auditory cueing on parkinsonian gait: A systematic review and meta-analysis.

    Science.gov (United States)

    Ghai, Shashank; Ghai, Ishan; Schmitz, Gerd; Effenberg, Alfred O

    2018-01-11

The use of rhythmic auditory cueing to enhance gait performance in parkinsonian patients is an emerging area of interest. Different theories and underlying neurophysiological mechanisms have been suggested for ascertaining the enhancement in motor performance. However, a consensus as to its effects based on characteristics of effective stimuli and training dosage has still not been reached. A systematic review and meta-analysis was carried out to analyze the effects of different auditory feedbacks on gait and postural performance in patients affected by Parkinson's disease. Systematic identification of published literature was performed adhering to PRISMA guidelines, from inception until May 2017, on online databases: Web of Science, PEDro, EBSCO, MEDLINE, Cochrane, EMBASE and ProQuest. Of 4204 records, 50 studies, involving 1892 participants, met our inclusion criteria. The analysis revealed an overall positive effect on gait velocity and stride length, and a negative effect on cadence, with application of auditory cueing. Neurophysiological mechanisms, training dosage, effects of higher information processing constraints, and use of cueing as an adjunct with medications are thoroughly discussed. This review bridges the gaps in literature by suggesting application of rhythmic auditory cueing in conventional rehabilitation approaches to enhance motor performance and quality of life in the parkinsonian community.

  13. The Relationship between Types of Attention and Auditory Processing Skills: Reconsidering Auditory Processing Disorder Diagnosis

    Science.gov (United States)

    Stavrinos, Georgios; Iliadou, Vassiliki-Maria; Edwards, Lindsey; Sirimanna, Tony; Bamiou, Doris-Eva

    2018-01-01

Measures of attention have been found to correlate with specific auditory processing tests in samples of children suspected of Auditory Processing Disorder (APD), but these relationships have not been adequately investigated. Despite evidence linking auditory attention and deficits/symptoms of APD, measures of attention are not routinely used in APD diagnostic protocols. The aim of the study was to examine the relationship between auditory and visual attention tests and auditory processing tests in children with APD, and to assess whether a proposed diagnostic protocol for APD, including measures of attention, could provide useful information for APD management. A pilot study including 27 children, aged 7–11 years, referred for APD assessment was conducted. The validated test of everyday attention for children, with visual and auditory attention tasks, the listening in spatialized noise sentences test, the children's communication checklist questionnaire and tests from a standard APD diagnostic test battery were administered. Pearson's partial correlation analysis examining the relationship between these tests and Cochrane's Q test analysis comparing proportions of diagnosis under each proposed battery were conducted. Divided auditory and divided auditory-visual attention strongly correlated with the dichotic digits test, r = 0.68, p < … […] attention battery identified as having Attention Deficits (ADs). The proposed APD battery excluding AD cases did not have a significantly different diagnosis proportion than the standard APD battery. Finally, the newly proposed diagnostic battery, identifying an inattentive subtype of APD, identified five children who would have otherwise been considered not having ADs. The findings show that a subgroup of children with APD demonstrates underlying sustained and divided attention deficits. Attention deficits in children with APD appear to be centred around the auditory modality, but further examination of types of attention in both […]

  14. Absence of both auditory evoked potentials and auditory percepts dependent on timing cues.

    Science.gov (United States)

    Starr, A; McPherson, D; Patterson, J; Don, M; Luxford, W; Shannon, R; Sininger, Y; Tonakawa, L; Waring, M

    1991-06-01

    An 11-yr-old girl had an absence of sensory components of auditory evoked potentials (brainstem, middle and long-latency) to click and tone burst stimuli that she could clearly hear. Psychoacoustic tests revealed a marked impairment of those auditory perceptions dependent on temporal cues, that is, lateralization of binaural clicks, change of binaural masked threshold with changes in signal phase, binaural beats, detection of paired monaural clicks, monaural detection of a silent gap in a sound, and monaural threshold elevation for short duration tones. In contrast, auditory functions reflecting intensity or frequency discriminations (difference limens) were only minimally impaired. Pure tone audiometry showed a moderate (50 dB) bilateral hearing loss with a disproportionate severe loss of word intelligibility. Those auditory evoked potentials that were preserved included (1) cochlear microphonics reflecting hair cell activity; (2) cortical sustained potentials reflecting processing of slowly changing signals; and (3) long-latency cognitive components (P300, processing negativity) reflecting endogenous auditory cognitive processes. Both the evoked potential and perceptual deficits are attributed to changes in temporal encoding of acoustic signals perhaps occurring at the synapse between hair cell and eighth nerve dendrites. The results from this patient are discussed in relation to previously published cases with absent auditory evoked potentials and preserved hearing.

  15. Effect of rhythmic auditory cueing on gait in cerebral palsy: a systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Ghai S

    2017-12-01

Shashank Ghai,1 Ishan Ghai,2 Alfred O. Effenberg1 1Institute for Sports Science, Leibniz University Hannover, Hannover, Germany; 2School of Life Sciences, Jacobs University, Bremen, Germany Abstract: Auditory entrainment can influence gait performance in movement disorders. The entrainment can incite neurophysiological and musculoskeletal changes to enhance motor execution. However, a consensus as to its effects based on gait in people with cerebral palsy is still warranted. A systematic review and meta-analysis were carried out to analyze the effects of rhythmic auditory cueing on spatiotemporal and kinematic parameters of gait in people with cerebral palsy. Systematic identification of published literature was performed adhering to Preferred Reporting Items for Systematic Reviews and Meta-Analyses and American Academy for Cerebral Palsy and Developmental Medicine guidelines, from inception until July 2017, on online databases: Web of Science, PEDro, EBSCO, Medline, Cochrane, Embase and ProQuest. Kinematic and spatiotemporal gait parameters were evaluated in a meta-analysis across studies. Of 547 records, nine studies involving 227 participants (108 children/119 adults) met our inclusion criteria. The qualitative review suggested beneficial effects of rhythmic auditory cueing on gait performance among all included studies. The meta-analysis revealed beneficial effects of rhythmic auditory cueing on gait dynamic index (Hedge's g = 0.9), gait velocity (1.1), cadence (0.3), and stride length (0.5). This review for the first time suggests converging evidence toward application of rhythmic auditory cueing to enhance gait performance and stability in people with cerebral palsy. This article details underlying neurophysiological mechanisms and use of cueing as an efficient home-based intervention. It bridges gaps in the literature, and suggests translational approaches on how rhythmic auditory cueing can be incorporated in rehabilitation approaches to enhance gait performance in people with cerebral palsy.

  16. A psychophysical imaging method evidencing auditory cue extraction during speech perception: a group analysis of auditory classification images.

    Science.gov (United States)

    Varnet, Léo; Knoblauch, Kenneth; Serniclaes, Willy; Meunier, Fanny; Hoen, Michel

    2015-01-01

Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remain undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique that allows experimenters to estimate the relative importance of time-frequency regions in categorizing natural speech utterances in noise. Importantly, this technique enables the testing of hypotheses on the listening strategies of participants at the group level. We exemplify this approach by identifying the acoustic cues involved in da/ga categorization with two phonetic contexts, Al- or Ar-. The application of Auditory Classification Images to our group of 16 participants revealed significant critical regions on the second and third formant onsets, as predicted by the literature, as well as an unexpected temporal cue on the first formant. Finally, through a cluster-based nonparametric test, we demonstrate that this method is sufficiently sensitive to detect fine modifications of the classification strategies between different utterances of the same phoneme.

  17. The effects of transient attention on spatial resolution and the size of the attentional cue.

    Science.gov (United States)

    Yeshurun, Yaffa; Carrasco, Marisa

    2008-01-01

    It has been shown that transient attention enhances spatial resolution, but is the effect of transient attention on spatial resolution modulated by the size of the attentional cue? Would a gradual increase in the size of the cue lead to a gradual decrement in spatial resolution? To test these hypotheses, we used a texture segmentation task in which performance depends on spatial resolution, and systematically manipulated the size of the attentional cue: A bar of different lengths (Experiment 1) or a frame of different sizes (Experiments 2-3) indicated the target region in a texture segmentation display. Observers indicated whether a target patch region (oriented line elements in a background of an orthogonal orientation), appearing at a range of eccentricities, was present in the first or the second interval. We replicated the attentional enhancement of spatial resolution found with small cues; attention improved performance at peripheral locations but impaired performance at central locations. However, there was no evidence of gradual resolution decrement with large cues. Transient attention enhanced spatial resolution at the attended location when it was attracted to that location by a small cue but did not affect resolution when it was attracted by a large cue. These results indicate that transient attention cannot adapt its operation on spatial resolution on the basis of the size of the attentional cue.

  18. Measuring Auditory Selective Attention using Frequency Tagging

    Directory of Open Access Journals (Sweden)

    Hari M Bharadwaj

    2014-02-01

Frequency tagging of sensory inputs (presenting stimuli that fluctuate periodically at rates to which the cortex can phase lock) has been used to study attentional modulation of neural responses to inputs in different sensory modalities. For visual inputs, the visual steady-state response (VSSR) at the frequency modulating an attended object is enhanced, while the VSSR to a distracting object is suppressed. In contrast, the effect of attention on the auditory steady-state response (ASSR) is inconsistent across studies. However, most auditory studies analyzed results at the sensor level or used only a small number of equivalent current dipoles to fit cortical responses. In addition, most studies of auditory spatial attention used dichotic stimuli (independent signals at the ears) rather than more natural, binaural stimuli. Here, we asked whether these methodological choices help explain discrepant results. Listeners attended to one of two competing speech streams, one simulated from the left and one from the right, that were modulated at different frequencies. Using distributed source modeling of magnetoencephalography results, we estimate how spatially directed attention modulates the ASSR in neural regions across the whole brain. Attention enhances the ASSR power at the frequency of the attended stream in the contralateral auditory cortex. The attended-stream modulation frequency also drives phase-locked responses in the left (but not right) precentral sulcus (lPCS), a region implicated in control of eye gaze and visual spatial attention. Importantly, this region shows no phase locking to the distracting stream, suggesting that the lPCS is engaged in an attention-specific manner. Modeling results that take account of the geometry and phases of the cortical sources phase locked to the two streams (including hemispheric asymmetry of lPCS activity) help partly explain why past ASSR studies of auditory spatial attention yield seemingly contradictory results.
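
    At its core, a frequency-tagging analysis measures spectral power at each stream's modulation frequency and compares attended versus ignored tags. A toy single-channel sketch (the MEG source-modeling step is omitted; the signal, tag frequencies, and amplitudes are all simulated, not data from the study):

    ```python
    import numpy as np

    fs, dur = 1000, 10.0                 # sampling rate (Hz), duration (s)
    t = np.arange(0, dur, 1 / fs)
    f_attend, f_ignore = 37.0, 43.0      # illustrative tag frequencies (Hz)

    # Toy "neural" signal: the attended stream's tag is stronger, plus noise.
    rng = np.random.default_rng(2)
    signal = (1.0 * np.sin(2 * np.pi * f_attend * t)
              + 0.4 * np.sin(2 * np.pi * f_ignore * t)
              + rng.normal(0, 1.0, t.size))

    # Power spectrum; 10 s of data gives 0.1 Hz resolution, so both tags
    # fall exactly on FFT bins.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    def tag_power(f):
        return spectrum[np.argmin(np.abs(freqs - f))]

    ratio = tag_power(f_attend) / tag_power(f_ignore)
    print(ratio > 1)   # check: attended tag carries more power
    ```

    Choosing tag frequencies that land exactly on FFT bins (an integer number of cycles per analysis window) avoids spectral leakage between the two tags.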

  19. Entrainment to an auditory signal: Is attention involved?

    NARCIS (Netherlands)

    Kunert, R.; Jongman, S.R.

    2017-01-01

Many natural auditory signals, including music and language, change periodically. The effect of such auditory rhythms on the brain is unclear, however. One widely held view, dynamic attending theory, proposes that the attentional system entrains to the rhythm and increases attention at moments of […]

  20. Nonword repetition in adults who stutter: The effects of stimuli stress and auditory-orthographic cues.

    Directory of Open Access Journals (Sweden)

    Geoffrey A Coalson

Adults who stutter (AWS) are less accurate in their immediate repetition of novel phonological sequences compared to adults who do not stutter (AWNS). The present study examined whether manipulation of the following two aspects of traditional nonword repetition tasks unmasks distinct weaknesses in phonological working memory in AWS: (1) presentation of stimuli with less-frequent stress patterns, and (2) removal of auditory-orthographic cues immediately prior to response. Fifty-two participants (26 AWS, 26 AWNS) produced 12 bisyllabic nonwords in the presence of corresponding auditory-orthographic cues (i.e., immediate repetition task) and in the absence of auditory-orthographic cues (i.e., short-term recall task). Half of each cohort (13 AWS, 13 AWNS) was exposed to the stimuli with high-frequency trochaic stress, and half (13 AWS, 13 AWNS) to identical stimuli with lower-frequency iambic stress. No differences in immediate repetition accuracy for trochaic or iambic nonwords were observed for either group. However, AWS were less accurate when recalling iambic nonwords than trochaic nonwords in the absence of auditory-orthographic cues. Manipulation of two factors which may minimize phonological demand during standard nonword repetition tasks increased the number of errors in AWS compared to AWNS. These findings suggest greater vulnerability of phonological working memory in AWS, even when producing nonwords as short as two syllables.

  1. Improving visual spatial working memory in younger and older adults: effects of cross-modal cues.

    Science.gov (United States)

    Curtis, Ashley F; Turner, Gary R; Park, Norman W; Murtha, Susan J E

    2017-11-06

    Spatially informative auditory and vibrotactile (cross-modal) cues can facilitate attention but little is known about how similar cues influence visual spatial working memory (WM) across the adult lifespan. We investigated the effects of cues (spatially informative or alerting pre-cues vs. no cues), cue modality (auditory vs. vibrotactile vs. visual), memory array size (four vs. six items), and maintenance delay (900 vs. 1800 ms) on visual spatial location WM recognition accuracy in younger adults (YA) and older adults (OA). We observed a significant interaction between spatially informative pre-cue type, array size, and delay. OA and YA benefitted equally from spatially informative pre-cues, suggesting that attentional orienting prior to WM encoding, regardless of cue modality, is preserved with age.  Contrary to predictions, alerting pre-cues generally impaired performance in both age groups, suggesting that maintaining a vigilant state of arousal by facilitating the alerting attention system does not help visual spatial location WM.

  2. Modulation frequency as a cue for auditory speed perception.

    Science.gov (United States)

    Senna, Irene; Parise, Cesare V; Ernst, Marc O

    2017-07-12

    Unlike vision, the mechanisms underlying auditory motion perception are poorly understood. Here we describe an auditory motion illusion revealing a novel cue to auditory speed perception: the temporal frequency of amplitude modulation (AM-frequency), typical for rattling sounds. Naturally, corrugated objects sliding across each other generate rattling sounds whose AM-frequency tends to directly correlate with speed. We found that AM-frequency modulates auditory speed perception in a highly systematic fashion: moving sounds with higher AM-frequency are perceived as moving faster than sounds with lower AM-frequency. Even more interestingly, sounds with higher AM-frequency also induce stronger motion aftereffects. This reveals the existence of specialized neural mechanisms for auditory motion perception, which are sensitive to AM-frequency. Thus, in spatial hearing, the brain successfully capitalizes on the AM-frequency of rattling sounds to estimate the speed of moving objects. This tightly parallels previous findings in motion vision, where spatio-temporal frequency of moving displays systematically affects both speed perception and the magnitude of the motion aftereffects. Such an analogy with vision suggests that motion detection may rely on canonical computations, with similar neural mechanisms shared across the different modalities. © 2017 The Author(s).

  3. Musical experience shapes top-down auditory mechanisms: evidence from masking and auditory attention performance.

    Science.gov (United States)

    Strait, Dana L; Kraus, Nina; Parbery-Clark, Alexandra; Ashley, Richard

    2010-03-01

    A growing body of research suggests that cognitive functions, such as attention and memory, drive perception by tuning sensory mechanisms to relevant acoustic features. Long-term musical experience also modulates lower-level auditory function, although the mechanisms by which this occurs remain uncertain. In order to tease apart the mechanisms that drive perceptual enhancements in musicians, we posed the question: do well-developed cognitive abilities fine-tune auditory perception in a top-down fashion? We administered a standardized battery of perceptual and cognitive tests to adult musicians and non-musicians, including tasks either more or less susceptible to cognitive control (e.g., backward versus simultaneous masking) and more or less dependent on auditory or visual processing (e.g., auditory versus visual attention). Outcomes indicate lower perceptual thresholds in musicians specifically for auditory tasks that relate with cognitive abilities, such as backward masking and auditory attention. These enhancements were observed in the absence of group differences for the simultaneous masking and visual attention tasks. Our results suggest that long-term musical practice strengthens cognitive functions and that these functions benefit auditory skills. Musical training bolsters higher-level mechanisms that, when impaired, relate to language and literacy deficits. Thus, musical training may serve to lessen the impact of these deficits by strengthening the corticofugal system for hearing. 2009 Elsevier B.V. All rights reserved.

  4. Cueing spatial attention through timing and probability.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2013-01-01

    Even when focused on an effortful task we retain the ability to detect salient environmental information, and even irrelevant visual stimuli can be automatically detected. However, the extent to which unattended information affects attentional control is not fully understood. Here we provide evidence of how the brain spontaneously organizes its cognitive resources by shifting attention between a selective-attending and a stimulus-driven mode within a single task. Using a spatial cueing paradigm, we investigated the effect of cue-target asynchronies as a function of their probabilities of occurrence (i.e., relative frequency). Results show that this accessory information modulates attentional shifts. A valid spatial cue improved participants' performance as compared to an invalid one only in trials in which target onset was highly predictable because of its more frequent occurrence. Conversely, cueing proved ineffective when spatial cue and target were associated according to a less frequent asynchrony. These patterns of response depended on the asynchronies' probability and not on their duration. Our findings demonstrate that through fine-grained decision-making, performed trial by trial, the brain uses implicit information to decide whether or not to voluntarily shift spatial attention. As if following a cost-planning strategy, the cognitive effort of shifting attention in response to the cue is made only when the expected advantages are higher. In a trade-off competition for cognitive resources, voluntary/automatic attending may thus be a more complex process than expected. Copyright © 2011 Elsevier Ltd. All rights reserved.
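
The cue-validity logic in this record (faster responses when the cue correctly indicates the target location, but only at the frequent asynchrony) can be sketched numerically. A minimal illustration with invented reaction times; none of these numbers come from the study:

```python
from statistics import mean

def cue_validity_effect(rts_valid, rts_invalid):
    """Cue-validity effect: mean RT on invalid-cue trials minus mean RT
    on valid-cue trials (positive values = a cueing benefit)."""
    return mean(rts_invalid) - mean(rts_valid)

# Hypothetical RTs in ms, split by how probable the cue-target
# asynchrony was. The reported pattern: a clear validity effect at the
# frequent asynchrony, none at the rare one.
frequent = {"valid": [352, 360, 348], "invalid": [395, 401, 388]}
rare = {"valid": [371, 379, 366], "invalid": [374, 377, 369]}

effect_frequent = cue_validity_effect(frequent["valid"], frequent["invalid"])
effect_rare = cue_validity_effect(rare["valid"], rare["invalid"])
```

With these invented values the validity effect is large (about 41 ms) at the frequent asynchrony and negligible (about 1 ms) at the rare one, mirroring the pattern the abstract describes.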

  5. A psychophysical imaging method evidencing auditory cue extraction during speech perception: a group analysis of auditory classification images.

    Directory of Open Access Journals (Sweden)

    Léo Varnet

    Full Text Available Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remain undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique that allows experimenters to estimate the relative importance of time-frequency regions in categorizing natural speech utterances in noise. Importantly, this technique enables the testing of hypotheses on the listening strategies of participants at the group level. We exemplify this approach by identifying the acoustic cues involved in da/ga categorization with two phonetic contexts, Al- or Ar-. The application of Auditory Classification Images to our group of 16 participants revealed significant critical regions on the second and third formant onsets, as predicted by the literature, as well as an unexpected temporal cue on the first formant. Finally, through a cluster-based nonparametric test, we demonstrate that this method is sufficiently sensitive to detect fine modifications of the classification strategies between different utterances of the same phoneme.

  6. Auditory and Visual Attention Performance in Children With ADHD: The Attentional Deficiency of ADHD Is Modality Specific.

    Science.gov (United States)

    Lin, Hung-Yu; Hsieh, Hsieh-Chun; Lee, Posen; Hong, Fu-Yuan; Chang, Wen-Dien; Liu, Kuo-Cheng

    2017-08-01

    This study explored auditory and visual attention in children with ADHD. In a randomized, two-period crossover design, 50 children with ADHD and 50 age- and sex-matched typically developing peers were measured with the Test of Variables of Attention (TOVA). The deficiency of visual attention is more serious than that of auditory attention in children with ADHD. On the auditory modality, only the deficit of attentional inconsistency is sufficient to explain most cases of ADHD; however, most of the children with ADHD suffered from deficits of sustained attention, response inhibition, and attentional inconsistency on the visual modality. Our results also showed that the deficit of attentional inconsistency is the most important indicator in diagnosing and intervening in ADHD when both auditory and visual modalities are considered. The findings provide strong evidence that the deficits of auditory attention are different from those of visual attention in children with ADHD.

  7. Stroke caused auditory attention deficits in children

    Directory of Open Access Journals (Sweden)

    Karla Maria Ibraim da Freiria Elias

    2013-01-01

    Full Text Available OBJECTIVE: To verify auditory selective attention in children with stroke. METHODS: Dichotic tests of binaural separation (non-verbal and consonant-vowel) and binaural integration (digits and the Staggered Spondaic Words Test, SSW) were applied in 13 children (7 boys, aged 7 to 16 years) with unilateral stroke confirmed by neurological examination and neuroimaging. RESULTS: Attention performance differed significantly from that of the control group in both kinds of tests. In the non-verbal test, identification at the ear opposite the lesion was diminished in the free-recall stage and, in the following stages, a difficulty in directing attention was detected. In the consonant-vowel test, a modification in perceptual asymmetry and difficulty in focusing during the attended stages was found. In the digits and SSW tests, ipsilateral, contralateral and bilateral deficits were detected, depending on the characteristics of the lesions and the demand of the task. CONCLUSION: Stroke caused auditory attention deficits when dealing with simultaneous sources of auditory information.

  8. Retro-cue benefits in working memory without sustained focal attention.

    Science.gov (United States)

    Rerko, Laura; Souza, Alessandra S; Oberauer, Klaus

    2014-07-01

    In working memory (WM) tasks, performance can be boosted by directing attention to one memory object: When a retro-cue in the retention interval indicates which object will be tested, responding is faster and more accurate (the retro-cue benefit). We tested whether the retro-cue benefit in WM depends on sustained attention to the cued object by inserting an attention-demanding interruption task between the retro-cue and the memory test. In the first experiment, the interruption task required participants to shift their visual attention away from the cued representation and to a visual classification task on colors. In the second and third experiments, the interruption task required participants to shift their focal attention within WM: Attention was directed away from the cued representation by probing another representation from the memory array prior to probing the cued object. The retro-cue benefit was not attenuated by shifts of perceptual attention or by shifts of attention within WM. We concluded that sustained attention is not needed to maintain the cued representation in a state of heightened accessibility.

  9. The role of temporal coherence in auditory stream segregation

    DEFF Research Database (Denmark)

    Christiansen, Simon Krogholt

    The ability to perceptually segregate concurrent sound sources and focus one’s attention on a single source at a time is essential for the ability to use acoustic information. While perceptual experiments have determined a range of acoustic cues that help facilitate auditory stream segregation, it is not clear how the auditory system realizes the task. This thesis presents a study of the mechanisms involved in auditory stream segregation. Through a combination of psychoacoustic experiments, designed to characterize the influence of acoustic cues on auditory stream formation, and computational models of auditory processing, the role of auditory preprocessing and temporal coherence in auditory stream formation was evaluated. The computational model presented in this study assumes that auditory stream segregation occurs when sounds stimulate non-overlapping neural populations in a temporally incoherent...

  10. Visual selective attention in amnestic mild cognitive impairment.

    Science.gov (United States)

    McLaughlin, Paula M; Anderson, Nicole D; Rich, Jill B; Chertkow, Howard; Murtha, Susan J E

    2014-11-01

    Subtle deficits in visual selective attention have been found in amnestic mild cognitive impairment (aMCI). However, few studies have explored performance on visual search paradigms or the Simon task, which are known to be sensitive to disease severity in Alzheimer's patients. Furthermore, there is limited research investigating how deficiencies can be ameliorated with exogenous support (auditory cues). Sixteen individuals with aMCI and 14 control participants completed 3 experimental tasks that varied in demand and cue availability: visual search-alerting, visual search-orienting, and Simon task. Visual selective attention was influenced by aMCI, auditory cues, and task characteristics. Visual search abilities were relatively consistent across groups. The aMCI participants were impaired on the Simon task when working memory was required, but conflict resolution was similar to controls. Spatially informative orienting cues improved response times, whereas spatially neutral alerting cues did not influence performance. Finally, spatially informative auditory cues benefited the aMCI group more than controls in the visual search task, specifically at the largest array size where orienting demands were greatest. These findings suggest that individuals with aMCI have working memory deficits and subtle deficiencies in orienting attention and rely on exogenous information to guide attention. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Cross-modal attention influences auditory contrast sensitivity: Decreasing visual load improves auditory thresholds for amplitude- and frequency-modulated sounds.

    Science.gov (United States)

    Ciaramitaro, Vivian M; Chow, Hiu Mei; Eglington, Luke G

    2017-03-01

    We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers had to attend to two consecutive intervals of sounds and report which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid sequential visual presentation (two consecutive intervals of streams of visual letters) and had to report which interval contained a particular color (low load, demanding less attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, hence improving auditory thresholds. Auditory detection thresholds were lower (that is, auditory sensitivity was improved) for both amplitude- and frequency-modulated sounds when observers engaged in a less demanding (compared to a more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality, cross-modal attention, can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms for visual and auditory attention.

  12. Multiple reward-cue contingencies favor expectancy over uncertainty in shaping the reward-cue attentional salience.

    Science.gov (United States)

    De Tommaso, Matteo; Mastropasqua, Tommaso; Turatto, Massimo

    2018-01-25

    Reward-predicting cues attract attention because of their motivational value. A debated question regards the conditions under which the cue's attentional salience is governed more by reward expectancy than by reward uncertainty. To help shed light on this issue, here we manipulated expectancy and uncertainty using three levels of reward-cue contingency, so that, for example, a high level of reward expectancy (p = .8) was compared with the highest level of reward uncertainty (p = .5). In Experiment 1, the best reward-cue during conditioning was preferentially attended in a subsequent visual search task. This result was replicated in Experiment 2, in which the cues were matched in terms of response history. In Experiment 3, we implemented a hybrid procedure consisting of two phases: an omission contingency procedure during conditioning, followed by a visual search task as in the previous experiments. Crucially, during both phases, the reward-cues were never task relevant. Results confirmed that, when multiple reward-cue contingencies are explored by a human observer, expectancy is the major factor controlling both the attentional and the oculomotor salience of the reward-cue.

  13. Attention to irrelevant cues is related to positive symptoms in schizophrenia.

    Science.gov (United States)

    Morris, Richard; Griffiths, Oren; Le Pelley, Michael E; Weickert, Thomas W

    2013-05-01

    Many modern learning theories assume that the amount of attention to a cue depends on how well that cue predicted important events in the past. Schizophrenia is associated with deficits in attention, and recent theories of psychosis have argued that positive symptoms such as delusions and hallucinations are related to a failure of selective attention. However, evidence demonstrating that attention to irrelevant cues is related to positive symptoms in schizophrenia is lacking. We used a novel method of measuring attention to nonpredictive (and thus irrelevant) cues in a causal learning test (Le Pelley ME, McLaren IP. Learned associability and associative change in human causal learning. Q J Exp Psychol B. 2003;56:68-79) to assess whether healthy adults and people with schizophrenia discriminate previously predictive and nonpredictive cues. In a series of experiments with independent samples, we demonstrated: (1) when people with schizophrenia who had severe positive symptoms successfully distinguished between predictive and nonpredictive cues during training, they failed to discriminate between predictive and nonpredictive cues relative to healthy adults during subsequent testing; and (2) learning about nonpredictive cues was correlated with more severe positive symptom scores in schizophrenia. These results suggest that positive symptoms of schizophrenia are related to increased attention to nonpredictive cues during causal learning. This deficit in selective attention results in learning irrelevant causal associations and may be the basis of positive symptoms in schizophrenia.

  14. The time course of attentional deployment in contextual cueing.

    Science.gov (United States)

    Jiang, Yuhong V; Sigstad, Heather M; Swallow, Khena M

    2013-04-01

    The time course of attention is a major characteristic on which different types of attention diverge. In addition to explicit goals and salient stimuli, spatial attention is influenced by past experience. In contextual cueing, behaviorally relevant stimuli are more quickly found when they appear in a spatial context that has previously been encountered than when they appear in a new context. In this study, we investigated the time that it takes for contextual cueing to develop following the onset of search layout cues. In three experiments, participants searched for a T target in an array of Ls. Each array was consistently associated with a single target location. In a testing phase, we manipulated the stimulus onset asynchrony (SOA) between the repeated spatial layout and the search display. Contextual cueing was equivalent for a wide range of SOAs between 0 and 1,000 ms. The lack of an increase in contextual cueing with increasing cue durations suggests that as an implicit learning mechanism, contextual cueing cannot be effectively used until search begins.

  15. Contributions of Sensory Coding and Attentional Control to Individual Differences in Performance in Spatial Auditory Selective Attention Tasks.

    Science.gov (United States)

    Dai, Lengshi; Shinn-Cunningham, Barbara G

    2016-01-01

    Listeners with normal hearing thresholds (NHTs) differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in the cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials (ERPs) from the scalp (reflecting cortical responses to sound) and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. 
These results support the hypothesis that differences in behavioral ability amongst listeners with NHTs can arise from both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.
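
The subcortical coding-strength measure described above (how much the EFR degrades for fully masked versus partially masked tones) can be sketched as follows. This is only an illustration: the retained-amplitude ratio, the variable names, and all per-listener values are assumptions, not the study's actual metric or data:

```python
def coding_strength(efr_partial, efr_full):
    """Assumed index: fraction of EFR amplitude retained under full
    masking (the study's exact degradation metric may differ)."""
    return efr_full / efr_partial

def pearson_r(xs, ys):
    """Pearson correlation, written out with plain Python only."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-listener EFR amplitudes (uV) under partial and full
# masking, and attention-task scores (proportion correct).
efr_partial = [0.42, 0.55, 0.38, 0.61, 0.47]
efr_full = [0.30, 0.29, 0.33, 0.28, 0.31]
behavior = [0.77, 0.66, 0.85, 0.60, 0.74]

coding = [coding_strength(p, f) for p, f in zip(efr_partial, efr_full)]
r = pearson_r(coding, behavior)  # stronger coding tracks better behavior
```

In this invented data set, listeners whose EFR survives full masking better also score higher behaviorally, so the correlation comes out strongly positive, analogous in spirit to the individual-differences analysis the abstract reports.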

  16. Contributions of sensory coding and attentional control to individual differences in performance in spatial auditory selective attention tasks

    Directory of Open Access Journals (Sweden)

    Lengshi Dai

    2016-10-01

    Full Text Available Listeners with normal hearing thresholds differ in their ability to steer attention to whatever sound source is important. This ability depends on top-down executive control, which modulates the sensory representation of sound in cortex. Yet, this sensory representation also depends on the coding fidelity of the peripheral auditory system. Both of these factors may thus contribute to the individual differences in performance. We designed a selective auditory attention paradigm in which we could simultaneously measure envelope following responses (EFRs, reflecting peripheral coding), onset event-related potentials from the scalp (ERPs, reflecting cortical responses to sound), and behavioral scores. We performed two experiments that varied stimulus conditions to alter the degree to which performance might be limited due to fine stimulus details vs. due to control of attentional focus. Consistent with past work, in both experiments we find that attention strongly modulates cortical ERPs. Importantly, in Experiment I, where coding fidelity limits the task, individual behavioral performance correlates with subcortical coding strength (derived by computing how the EFR is degraded for fully masked tones compared to partially masked tones); however, in this experiment, the effects of attention on cortical ERPs were unrelated to individual subject performance. In contrast, in Experiment II, where sensory cues for segregation are robust (and thus less of a limiting factor on task performance), inter-subject behavioral differences correlate with subcortical coding strength. In addition, after factoring out the influence of subcortical coding strength, behavioral differences are also correlated with the strength of attentional modulation of ERPs. These results support the hypothesis that differences in behavioral ability amongst listeners with normal hearing thresholds can arise from both subcortical coding differences and differences in attentional control, depending on stimulus characteristics.

  17. Matching cue size and task properties in exogenous attention.

    Science.gov (United States)

    Burnett, Katherine E; d'Avossa, Giovanni; Sapir, Ayelet

    2013-01-01

    Exogenous attention is an involuntary, reflexive orienting response that results in enhanced processing at the attended location. The standard view is that this enhancement generalizes across visual properties of a stimulus. We test whether the size of an exogenous cue sets the attentional field and whether this leads to different effects on stimuli with different visual properties. In a dual task with a random-dot kinematogram (RDK) in each quadrant of the screen, participants discriminated the direction of moving dots in one RDK and localized one red dot. Precues were uninformative and consisted of either a large or a small luminance-change frame. The motion discrimination task showed attentional effects following both large and small exogenous cues. The red dot probe localization task showed attentional effects following a small cue, but not a large cue. Two additional experiments showed that the different effects on localization were not due to reduced spatial uncertainty or suppression of RDK dots in the surround. These results indicate that the effects of exogenous attention depend on the size of the cue and the properties of the task, suggesting the involvement of receptive fields with different sizes in different tasks. These attentional effects are likely to be driven by bottom-up mechanisms in early visual areas.

  18. Association of blood antioxidants status with visual and auditory sustained attention.

    Science.gov (United States)

    Shiraseb, Farideh; Siassi, Fereydoun; Sotoudeh, Gity; Qorbani, Mostafa; Rostami, Reza; Sadeghi-Firoozabadi, Vahid; Narmaki, Elham

    2015-01-01

    A low antioxidant status has been shown to result in oxidative stress and cognitive impairment. Because antioxidants can protect the nervous system, a better blood antioxidant status might be expected to relate to sustained attention. However, the relationship between the blood antioxidant status and visual and auditory sustained attention has not been investigated. The aim of this study was to evaluate the association of fruit and vegetable intake and the blood antioxidant status with visual and auditory sustained attention in women. This cross-sectional study was performed on 400 healthy women (20-50 years) who attended the sports clubs of Tehran Municipality. Sustained attention was evaluated based on the Integrated Visual and Auditory Continuous Performance Test using the Integrated Visual and Auditory (IVA) software. A 24-hour food recall questionnaire was used to estimate fruit and vegetable intake. Serum total antioxidant capacity (TAC), and erythrocyte superoxide dismutase (SOD) and glutathione peroxidase (GPx) activities were measured in 90 participants. After adjusting for energy intake, age, body mass index (BMI), years of education, and physical activity, higher reported fruit and vegetable intake was associated with better visual and auditory sustained attention. Blood antioxidant measures were likewise associated with visual and auditory sustained attention after adjusting for age, years of education, physical activity, energy, BMI, and caffeine intake. In conclusion, better visual and auditory sustained attention is associated with a better blood antioxidant status; improvement of the antioxidant status through an appropriate dietary intake can therefore possibly enhance sustained attention.

  19. Auditory-Cortex Short-Term Plasticity Induced by Selective Attention

    Science.gov (United States)

    Jääskeläinen, Iiro P.; Ahveninen, Jyrki

    2014-01-01

    The ability to concentrate on relevant sounds in the acoustic environment is crucial for everyday function and communication. Converging lines of evidence suggest that transient functional changes in auditory-cortex neurons, “short-term plasticity”, might explain this fundamental function. Under conditions of strongly focused attention, enhanced processing of attended sounds can take place at very early latencies (~50 ms from sound onset) in primary auditory cortex and possibly even at earlier latencies in subcortical structures. More robust selective-attention short-term plasticity is manifested as modulation of responses peaking at ~100 ms from sound onset in functionally specialized nonprimary auditory-cortical areas, by way of stimulus-specific reshaping of neuronal receptive fields that supports filtering of selectively attended sound features from task-irrelevant ones. Such effects have been shown to take hold within seconds of shifting the attentional focus. There are findings suggesting that the reshaping of neuronal receptive fields is even stronger at longer auditory-cortex response latencies (~300 ms from sound onset). These longer-latency short-term plasticity effects seem to build up more gradually, within tens of seconds after shifting the focus of attention. Importantly, some of the auditory-cortical short-term plasticity effects observed during selective attention predict enhancements in behaviorally measured sound discrimination performance. PMID:24551458

  20. Auditory and visual sustained attention in Down syndrome.

    Science.gov (United States)

    Faught, Gayle G; Conners, Frances A; Himmelberger, Zachary M

    2016-01-01

    Sustained attention (SA) is important to task performance and development of higher functions. It emerges as a separable component of attention during preschool and shows incremental improvements during this stage of development. The current study investigated if auditory and visual SA match developmental level or are particular challenges for youth with DS. Further, we sought to determine if there were modality effects in SA that could predict those seen in short-term memory (STM). We compared youth with DS to typically developing youth matched for nonverbal mental age and receptive vocabulary. Groups completed auditory and visual sustained attention to response tests (SARTs) and STM tasks. Results indicated groups performed similarly on both SARTs, even over varying cognitive ability. Further, within groups participants performed similarly on auditory and visual SARTs, thus SA could not predict modality effects in STM. However, SA did generally predict a significant portion of unique variance in groups' STM. Ultimately, results suggested both auditory and visual SA match developmental level in DS. Further, SA generally predicts STM, though SA does not necessarily predict the pattern of poor auditory relative to visual STM characteristic of DS. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Auditory attention: time of day and type of school

    Directory of Open Access Journals (Sweden)

    Picolini, Mirela Machado

    2010-06-01

    Full Text Available Introduction: Sustained auditory attention is crucial for the development of certain communication and learning skills. Objective: To evaluate the effect of the time of day and the type of school attended by children on their capacity for sustained auditory attention. Method: We performed a prospective study of 50 volunteer children of both sexes, aged 7 years, with normal hearing, no learning or behavioral problems, and no complaints regarding attention. Participants underwent the Sustained Auditory Attention Ability Test (SAAAT). Performance was evaluated by the total error score and the decrease of vigilance. Analysis of variance (ANOVA) with a significance level of 5% (p<0.05) was used for statistical analysis. Results: Compared with the test's normative values for the age group evaluated, statistically significant differences were found for errors of inattention (p=0.041, p=0.027) and the total error score (p=0.033, p=0.024) across assessment periods and school types, respectively. Conclusion: Children evaluated in the afternoon and children studying in public schools showed poorer sustained auditory attention.

  2. Superior pre-attentive auditory processing in musicians.

    Science.gov (United States)

    Koelsch, S; Schröger, E; Tervaniemi, M

    1999-04-26

    The present study focuses on influences of long-term experience on auditory processing, providing the first evidence for pre-attentively superior auditory processing in musicians. This was revealed by the brain's automatic change-detection response, which is reflected electrically as the mismatch negativity (MMN) and generated by the operation of sensory (echoic) memory, the earliest cognitive memory system. Major chords and single tones were presented to both professional violinists and non-musicians under ignore and attend conditions. Slightly impure chords, presented among perfect major chords, elicited a distinct MMN in professional musicians, but not in non-musicians. This demonstrates that, compared to non-musicians, musicians are superior at pre-attentively extracting more information from musically relevant stimuli. Since effects of long-term experience on pre-attentive auditory processing have so far been reported only for language-specific phonemes, these results indicate that sensory memory mechanisms can be modulated by training on a more general level.
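
The MMN described in this record is conventionally computed as a deviant-minus-standard difference wave. A minimal sketch with hypothetical, hand-made epochs (the six-sample traces and the latency window are invented for illustration):

```python
def grand_average(epochs):
    """Average a list of equal-length single-trial ERP epochs (uV)."""
    n = len(epochs)
    return [sum(trial[i] for trial in epochs) / n for i in range(len(epochs[0]))]

def mmn_amplitude(standard_epochs, deviant_epochs, window):
    """MMN is the deviant-minus-standard difference wave; its amplitude
    is typically taken as the most negative deflection within a latency
    window (given here as a pair of sample indices)."""
    std = grand_average(standard_epochs)
    dev = grand_average(deviant_epochs)
    diff = [d - s for d, s in zip(dev, std)]
    lo, hi = window
    return min(diff[lo:hi])

# Hypothetical 6-sample epochs: deviants evoke an extra negativity.
standards = [[0.0, 0.2, 0.1, 0.0, 0.1, 0.0],
             [0.1, 0.3, 0.2, 0.1, 0.0, 0.1]]
deviants = [[0.0, 0.2, -1.1, -1.6, -0.4, 0.0],
            [0.1, 0.1, -0.9, -1.4, -0.6, 0.1]]

mmn = mmn_amplitude(standards, deviants, window=(2, 5))
```

The most negative deflection of the difference wave inside the window serves as the MMN amplitude; in real EEG work the epochs would be baseline-corrected, artifact-rejected averages over many trials.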

  3. Emotional cues enhance the attentional effects on spatial and temporal resolution.

    Science.gov (United States)

    Bocanegra, Bruno R; Zeelenberg, René

    2011-12-01

    In the present study, we demonstrated that the emotional significance of a spatial cue enhances the effect of covert attention on spatial and temporal resolution (i.e., our ability to discriminate small spatial details and fast temporal flicker). Our results indicated that fearful face cues, as compared with neutral face cues, enhanced the attentional benefits in spatial resolution but also enhanced the attentional deficits in temporal resolution. Furthermore, we observed that the overall magnitudes of individuals' attentional effects correlated strongly with the magnitude of the emotion × attention interaction effect. Combined, these findings provide strong support for the idea that emotion enhances the strength of a cue's attentional response.

  4. Grabbing attention without knowing: Automatic capture of attention by subliminal cues.

    NARCIS (Netherlands)

    Mulckhuijse, M.G.J.; Talsma, D.; Theeuwes, J.

    2007-01-01

    The present study shows that an abrupt onset cue that is not consciously perceived can cause attentional facilitation followed by inhibition at the cued location. The observation of this classic biphasic effect of facilitation followed by inhibition of return (IOR) suggests that the subliminal cue...

  5. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander eJones

    2015-01-01

    Full Text Available Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes that have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct covert spatial attention to the left or right, and responded as quickly as possible to a target. The lateralized target (visual or auditory) was then presented at the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps, and the target was presented in or out of sync (early or late) with this rhythmic cue. The results showed that participants were faster responding to spatially attended compared with unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions: responses were faster to targets presented in sync with the rhythm compared with targets appearing too early, in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced temporal expectancy in the other modality, suggesting that temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting that these two processes largely influence stimulus processing independently.

  6. Higher dietary diversity is related to better visual and auditory sustained attention.

    Science.gov (United States)

    Shiraseb, Farideh; Siassi, Fereydoun; Qorbani, Mostafa; Sotoudeh, Gity; Rostami, Reza; Narmaki, Elham; Yavari, Parvaneh; Aghasi, Mohadeseh; Shaibu, Osman Mohammed

    2016-04-01

    Attention is a complex cognitive function that is necessary for learning, for following social norms of behaviour, and for effective performance of responsibilities and duties. It is especially important in sensitive occupations requiring sustained attention. Improvement of dietary diversity (DD) is recognised as an important factor in health promotion, but its association with sustained attention is unknown. The aim of this study was to determine the association between auditory and visual sustained attention and DD. A cross-sectional study was carried out on 400 women aged 20-50 years who attended sports clubs at Tehran Municipality. Sustained attention was evaluated on the basis of the Integrated Visual and Auditory Continuous Performance Test using Integrated Visual and Auditory software. A single 24-h dietary recall questionnaire was used for DD assessment. Dietary diversity scores (DDS) were determined using the FAO guidelines. The mean visual and auditory sustained attention scores were 40·2 (sd 35·2) and 42·5 (sd 38), respectively. The mean DDS was 4·7 (sd 1·5). After adjusting for age, education years, physical activity, energy intake and BMI, mean visual and auditory sustained attention showed a significant increase as the quartiles of DDS increased (P=0·001). In addition, the mean subscales of attention, including auditory consistency and vigilance, visual persistence, visual and auditory focus, speed, comprehension and full attention, increased significantly with increasing DDS (P<0·05). These findings indicate that higher dietary diversity is associated with better visual and auditory sustained attention.
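
As a rough illustration of how a dietary diversity score of the kind used above can be derived from a 24-h recall, the sketch below counts the distinct food groups consumed. The nine-group breakdown follows common FAO guidance, but the exact grouping, function name, and food items are assumptions for illustration, not the study's actual coding scheme.

```python
# Hypothetical FAO-style dietary diversity score (DDS): count how many
# distinct food groups appear in a single 24-h dietary recall.
# The nine groups below are an assumed grouping, not the study's own.

FOOD_GROUPS = {
    "starchy_staples", "dark_green_leafy_vegetables",
    "vitamin_a_rich_fruits_and_vegetables", "other_fruits_and_vegetables",
    "organ_meat", "meat_and_fish", "eggs", "legumes_nuts_seeds",
    "milk_and_milk_products",
}

def dietary_diversity_score(recalled_items):
    """Return the number of distinct food groups consumed (0-9)."""
    consumed = {group for _item, group in recalled_items if group in FOOD_GROUPS}
    return len(consumed)

recall = [
    ("bread", "starchy_staples"), ("rice", "starchy_staples"),
    ("spinach", "dark_green_leafy_vegetables"),
    ("yogurt", "milk_and_milk_products"),
]
print(dietary_diversity_score(recall))  # → 3 (two staples count once)
```

Note that repeated items from the same group (bread and rice above) contribute only once, which is what makes the score a diversity measure rather than a quantity measure.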

  7. Heavy drinking, impulsivity and attentional narrowing following alcohol cue exposure.

    Science.gov (United States)

    Hicks, Joshua A; Fields, Sherecce; Davis, William E; Gable, Philip A

    2015-08-01

    Research shows that alcohol-related stimuli have the propensity to capture attention among individuals motivated to consume alcohol. Research has further demonstrated that impulsive individuals are especially prone to this type of attentional bias. Recent work suggests that alcohol cue exposure can also produce a general narrowing of attention, consistent with the activation of approach motivational states. Based on previous models of addiction and recent research on the activation of approach motivational states, we predicted that impulsive individuals would demonstrate a constriction of attentional focus in response to alcohol cue exposure. Participants (n = 392) completed a task assessing attentional breadth in response to alcohol and non-alcohol cues, followed by measures of alcohol use and impulsivity. The findings revealed that impulsivity scores predicted narrowing of attentional scope following the presentation of alcohol cues for heavier drinkers but not for light drinkers. These results suggest that impulsive individuals who drink more heavily demonstrate a narrowing of attention in the presence of alcohol-related incentive cues. Implications for how these findings might account for the link between impulsivity and alcohol use and misuse are discussed.

  8. Self-Regulation of the Primary Auditory Cortex Attention Via Directed Attention Mediated By Real Time fMRI Neurofeedback

    Science.gov (United States)

    2017-05-05

    Professional presentation approval memo (59 MDW/SGYU) for the paper "Self-regulation of the primary auditory cortex attention via directed attention mediated by real-time fMRI neurofeedback" (Sherwood et al.), which addresses reducing auditory cortex hyperactivity through self-regulation of the primary auditory cortex (A1) based on real-time functional magnetic resonance imaging neurofeedback.

  9. Attentional Bias for Uncertain Cues of Shock in Human Fear Conditioning: Evidence for Attentional Learning Theory

    Science.gov (United States)

    Koenig, Stephan; Uengoer, Metin; Lachnit, Harald

    2017-01-01

    We conducted a human fear conditioning experiment in which three different color cues were followed by an aversive electric shock on 0, 50, and 100% of the trials, and thus induced low (L), partial (P), and high (H) shock expectancy, respectively. The cues differed with respect to the strength of their shock association (L < P < H) and the uncertainty of their prediction (L < P > H). During conditioning we measured pupil dilation and ocular fixations to index differences in the attentional processing of the cues. After conditioning, the shock-associated colors were introduced as irrelevant distracters during visual search for a shape target while shocks were no longer administered and we analyzed the cues’ potential to capture and hold overt attention automatically. Our findings suggest that fear conditioning creates an automatic attention bias for the conditioned cues that depends on their correlation with the aversive outcome. This bias was exclusively linked to the strength of the cues’ shock association for the early attentional processing of cues in the visual periphery, but additionally was influenced by the uncertainty of the shock prediction after participants fixated on the cues. These findings are in accord with attentional learning theories that formalize how associative learning shapes automatic attention. PMID:28588466

  10. Stimulus-driven attentional capture by subliminal onset cues

    NARCIS (Netherlands)

    Schoeberl, T.; Fuchs, I.; Theeuwes, J.; Ansorge, U.

    2015-01-01

    In two experiments, we tested whether subliminal abrupt onset cues capture attention in a stimulus-driven way. An onset cue was presented 16 ms prior to the stimulus display that consisted of clearly visible color targets. The onset cue was presented either on the same side as the target (the valid cue condition) or on the opposite side (the invalid cue condition).

  11. Thalamic and parietal brain morphology predicts auditory category learning.

    Science.gov (United States)

    Scharinger, Mathias; Henry, Molly J; Erb, Julia; Meyer, Lars; Obleser, Jonas

    2014-01-01

    Auditory categorization is a vital skill involving the attribution of meaning to acoustic events, engaging domain-specific (i.e., auditory) as well as domain-general (e.g., executive) brain networks. A listener's ability to categorize novel acoustic stimuli should therefore depend on both, with the domain-general network being particularly relevant for adaptively changing listening strategies and directing attention to relevant acoustic cues. Here we assessed adaptive listening behavior, using complex acoustic stimuli with an initially salient (but later degraded) spectral cue and a secondary, duration cue that remained nondegraded. We employed voxel-based morphometry (VBM) to identify cortical and subcortical brain structures whose individual neuroanatomy predicted task performance and the ability to optimally switch to making use of temporal cues after spectral degradation. Behavioral listening strategies were assessed by logistic regression and revealed mainly strategy switches in the expected direction, with considerable individual differences. Gray-matter probability in the left inferior parietal lobule (BA 40) and left precentral gyrus was predictive of "optimal" strategy switch, while gray-matter probability in thalamic areas, comprising the medial geniculate body, co-varied with overall performance. Taken together, our findings suggest that successful auditory categorization relies on domain-specific neural circuits in the ascending auditory pathway, while adaptive listening behavior depends more on brain structure in parietal cortex, enabling the (re)direction of attention to salient stimulus properties. © 2013 Published by Elsevier Ltd.

  12. Attentional Bias for Uncertain Cues of Shock in Human Fear Conditioning: Evidence for Attentional Learning Theory

    Directory of Open Access Journals (Sweden)

    Stephan Koenig

    2017-05-01

    Full Text Available We conducted a human fear conditioning experiment in which three different color cues were followed by an aversive electric shock on 0, 50, and 100% of the trials, and thus induced low (L), partial (P), and high (H) shock expectancy, respectively. The cues differed with respect to the strength of their shock association (L < P < H) and the uncertainty of their prediction (L < P > H). During conditioning we measured pupil dilation and ocular fixations to index differences in the attentional processing of the cues. After conditioning, the shock-associated colors were introduced as irrelevant distracters during visual search for a shape target while shocks were no longer administered and we analyzed the cues’ potential to capture and hold overt attention automatically. Our findings suggest that fear conditioning creates an automatic attention bias for the conditioned cues that depends on their correlation with the aversive outcome. This bias was exclusively linked to the strength of the cues’ shock association for the early attentional processing of cues in the visual periphery, but additionally was influenced by the uncertainty of the shock prediction after participants fixated on the cues. These findings are in accord with attentional learning theories that formalize how associative learning shapes automatic attention.

  13. Attentional modulation of auditory steady-state responses.

    Science.gov (United States)

    Mahajan, Yatin; Davis, Chris; Kim, Jeesun

    2014-01-01

    Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex.
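
The frequency-tagging stimuli described above can be approximated in a few lines: white noise amplitude-modulated at a tag frequency, with a different tag delivered to each ear. The sketch below is a hypothetical illustration; the sampling rate, duration, modulation depth, and function name are assumptions, not parameters taken from the study.

```python
import numpy as np

def am_noise(duration_s, mod_freq_hz, fs=44100, depth=1.0, seed=0):
    """White noise amplitude-modulated at mod_freq_hz (a frequency-tagging stimulus)."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(duration_s * fs)) / fs
    carrier = rng.standard_normal(t.size)          # broadband white-noise carrier
    envelope = 1.0 + depth * np.sin(2 * np.pi * mod_freq_hz * t)  # AM envelope
    return envelope * carrier

# One dichotic pair from the paradigm: 16 Hz tag to one ear, 23.5 Hz to the other.
left = am_noise(2.0, 16.0)
right = am_noise(2.0, 23.5, seed=1)
stereo = np.stack([left, right], axis=1)  # shape (samples, 2): left/right channels
```

Because each ear's stream carries a distinct modulation frequency, the ASSR evoked by each stream appears at its own tag frequency in the EEG spectrum, which is what lets attention effects be attributed to one stream or the other.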

  14. Psychometric properties of Persian version of the Sustained Auditory Attention Capacity Test in children with attention deficit-hyperactivity disorder.

    Science.gov (United States)

    Soltanparast, Sanaz; Jafari, Zahra; Sameni, Seyed Jalal; Salehi, Masoud

    2014-01-01

    The purpose of the present study was to evaluate the psychometric properties (validity and reliability) of the Persian version of the Sustained Auditory Attention Capacity Test (SAACT) in children with attention deficit hyperactivity disorder (ADHD). The Persian version of the SAACT was constructed to assess sustained auditory attention using the method of Feniman and colleagues (2007). The test assesses a child's attentional deficit by determining inattention and impulsiveness errors, the total SAACT score, and an attention span reduction index. To determine validity and reliability, both the Rey Auditory Verbal Learning Test and the Persian SAACT were administered to 46 normal children and 41 children with ADHD, all right-handed, aged between 7 and 11 years, and of both genders. In determining convergent validity, a significant negative correlation was found between the three parts of the Rey Auditory Verbal Learning Test (first, fifth, and immediate recall) and all indicators of the SAACT except attention span reduction. Comparing test scores between the normal and ADHD groups, discriminant validity analysis showed significant differences in all indicators of the test except attention span reduction (p<0.05). The Persian version of the Sustained Auditory Attention Capacity Test thus has good validity and reliability, comparable with other established tests, and can be used to identify children with attention deficits and those suspected of having attention deficit hyperactivity disorder.

  15. Sustained selective attention to competing amplitude-modulations in human auditory cortex.

    Science.gov (United States)

    Riecke, Lars; Scharke, Wolfgang; Valente, Giancarlo; Gutschalk, Alexander

    2014-01-01

    Auditory selective attention plays an essential role in identifying sounds of interest in a scene, but the neural underpinnings are still incompletely understood. Recent findings demonstrate that neural activity that is time-locked to a particular amplitude-modulation (AM) is enhanced in the auditory cortex when the modulated stream of sounds is selectively attended to under sensory competition with other streams. However, the target sounds used in the previous studies differed not only in their AM, but also in other sound features, such as carrier frequency or location. Thus, it remains uncertain whether the observed enhancements reflect AM-selective attention. The present study aims at dissociating the effect of AM frequency on response enhancement in auditory cortex by using an ongoing auditory stimulus that contains two competing targets differing exclusively in their AM frequency. Electroencephalography results showed a sustained response enhancement for auditory attention compared to visual attention, but not for AM-selective attention (attended AM frequency vs. ignored AM frequency). In contrast, the response to the ignored AM frequency was enhanced, although a brief trend toward response enhancement occurred during the initial 15 s. Together with the previous findings, these observations indicate that selective enhancement of attended AMs in auditory cortex is adaptive under sustained AM-selective attention. This finding has implications for our understanding of cortical mechanisms for feature-based attentional gain control.

  16. Sustained Selective Attention to Competing Amplitude-Modulations in Human Auditory Cortex

    Science.gov (United States)

    Riecke, Lars; Scharke, Wolfgang; Valente, Giancarlo; Gutschalk, Alexander

    2014-01-01

    Auditory selective attention plays an essential role in identifying sounds of interest in a scene, but the neural underpinnings are still incompletely understood. Recent findings demonstrate that neural activity that is time-locked to a particular amplitude-modulation (AM) is enhanced in the auditory cortex when the modulated stream of sounds is selectively attended to under sensory competition with other streams. However, the target sounds used in the previous studies differed not only in their AM, but also in other sound features, such as carrier frequency or location. Thus, it remains uncertain whether the observed enhancements reflect AM-selective attention. The present study aims at dissociating the effect of AM frequency on response enhancement in auditory cortex by using an ongoing auditory stimulus that contains two competing targets differing exclusively in their AM frequency. Electroencephalography results showed a sustained response enhancement for auditory attention compared to visual attention, but not for AM-selective attention (attended AM frequency vs. ignored AM frequency). In contrast, the response to the ignored AM frequency was enhanced, although a brief trend toward response enhancement occurred during the initial 15 s. Together with the previous findings, these observations indicate that selective enhancement of attended AMs in auditory cortex is adaptive under sustained AM-selective attention. This finding has implications for our understanding of cortical mechanisms for feature-based attentional gain control. PMID:25259525

  17. Interhemispheric interaction expands attentional capacity in an auditory selective attention task.

    Science.gov (United States)

    Scalf, Paige E; Banich, Marie T; Erickson, Andrew B

    2009-04-01

    Previous work from our laboratory indicates that interhemispheric interaction (IHI) functionally increases the attentional capacity available to support performance on visual tasks (Banich in The asymmetrical brain, pp 261-302, 2003). Because manipulations of both computational complexity and selection demand alter the benefits of IHI to task performance, we argue that IHI may be a general strategy for meeting increases in attentional demand. Other researchers, however, have suggested that the apparent benefits of IHI to attentional capacity are an epiphenomenon of the organization of the visual system (Fecteau and Enns in Neuropsychologia 43:1412-1428, 2005; Marsolek et al. in Neuropsychologia 40:1983-1999, 2002). In the current experiment, we investigate whether IHI increases attentional capacity outside the visual system by manipulating the selection demands of an auditory temporal pattern-matching task. We find that IHI expands attentional capacity in the auditory system. This suggests that the benefits of requiring IHI derive from a functional increase in attentional capacity rather than the organization of a specific sensory modality.

  18. Attentional bias in smokers: exposure to dynamic smoking cues in contemporary movies.

    Science.gov (United States)

    Lochbuehler, Kirsten; Voogd, Hubert; Scholte, Ron H J; Engels, Rutger C M E

    2011-04-01

    Research has shown that smokers have an attentional bias for pictorial smoking cues. The objective of the present study was to examine whether smokers also have an attentional bias for dynamic smoking cues in contemporary movies and therefore fixate more quickly, more often and for longer periods of time on dynamic smoking cues than non-smokers. By drawing upon established methods for assessing attentional biases for pictorial cues, we aimed to develop a new method for assessing attentional biases for dynamic smoking cues. We examined smokers' and non-smokers' eye movements while watching a movie clip by using eye-tracking technology. The sample consisted of 16 smoking and 17 non-smoking university students. Our results confirm the results of traditional pictorial attentional bias research. Smokers initially directed their gaze more quickly towards smoking-related cues (p = 0.01), focusing on them more often (p = 0.05) and for a longer duration (p = 0.01) compared with non-smokers. Thus, smoking cues in movies directly affect the attention of smokers. These findings indicate that the effects of dynamic smoking cues, in addition to other environmental smoking cues, need to be taken into account in smoking cessation therapies in order to increase successful smoking cessation and to prevent relapses.

  19. Modification of sudden onset auditory ERP by involuntary attention to visual stimuli.

    Science.gov (United States)

    Oray, Serkan; Lu, Zhong-Lin; Dawson, Michael E

    2002-03-01

    To investigate the cross-modal nature of the exogenous attention system, we studied how involuntary attention in the visual modality affects ERPs elicited by sudden onset of events in the auditory modality. Relatively loud auditory white noise bursts were presented to subjects with random and long inter-trial intervals. The noise bursts were either presented alone, or paired with a visual stimulus with a visual to auditory onset asynchrony of 120 ms. In a third condition, the visual stimuli were shown alone. All three conditions, auditory alone, visual alone, and paired visual/auditory, were randomly inter-mixed and presented with equal probabilities. Subjects were instructed to fixate on a point in front of them without task instructions concerning either the auditory or visual stimuli. ERPs were recorded from 28 scalp sites throughout every experimental session. Compared to ERPs in the auditory alone condition, pairing the auditory noise bursts with the visual stimulus reduced the amplitude of the auditory N100 component at Cz by 40% and the auditory P200/P300 component at Cz by 25%. No significant topographical change was observed in the scalp distributions of the N100 and P200/P300. Our results suggest that involuntary attention to visual stimuli suppresses early sensory (N100) as well as late cognitive (P200/P300) processing of sudden auditory events. The activation of the exogenous attention system by sudden auditory onset can be modified by involuntary visual attention in a cross-modal, passive prepulse inhibition paradigm.

  20. Grabbing attention without knowing: Automatic capture of attention by subliminal spatial cues

    NARCIS (Netherlands)

    Mulckhuyse, Manon; Talsma, D.; Theeuwes, Jan

    2007-01-01

    The present study shows that an abrupt onset cue that is not consciously perceived can cause attentional facilitation followed by inhibition at the cued location. The observation of this classic biphasic effect of facilitation followed by inhibition of return (IOR) suggests that the subliminal cue captured attention automatically, in a stimulus-driven fashion.

  1. Top down modulation of attention to food cues via working memory.

    Science.gov (United States)

    Higgs, Suzanne; Rutters, Femke; Thomas, Jason M; Naish, Katherine; Humphreys, Glyn W

    2012-08-01

    Attentional biases towards food cues may be linked to the development of obesity. The present study investigated the mechanisms underlying attentional biases to food cues by assessing the role of top down influences, such as working memory (WM). We assessed whether attention in normal-weight, sated participants was drawn to food items specifically when that food item was held in WM. Twenty-three participants (15 f/8 m, age 23.4±5 years, BMI 23.5±4 kg/m²) took part in a laboratory based study assessing reaction times to food and non-food stimuli. Participants were presented with an initial cue stimulus to either hold in WM or to merely attend to, and then searched for the target (a circle) in a two-item display. On valid trials the target was flanked by a picture matching the cue, on neutral trials the display did not contain a picture matching the cue, and on invalid trials the distractor (a square) was flanked by a picture matching the cue. Cues were food, cars or stationery items. We observed that, relative to the effects with non-food stimuli, food items in WM strongly affected attention when the memorised cue re-appeared in the search display. In particular there was an enhanced response on valid trials, when the re-appearance of the memorised cue coincided with the search target. There were no effects of cue category on attentional guidance when the cues were merely attended to but not held in WM. These data point towards food having a strong effect on top-down guidance of search from working memory, and suggest a mechanism whereby individuals who are preoccupied with thoughts of food, for example obese individuals, show facilitated detection of food cues in the environment. Copyright © 2012 Elsevier Ltd. All rights reserved.
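
Across cueing paradigms like this one (and the dot-probe tasks in earlier records), the attentional bias measure typically reduces to a reaction-time contrast between valid and invalid trials. The sketch below illustrates that contrast with invented trial data; the condition names and RT values are purely illustrative, not data from any study here.

```python
# Hypothetical sketch: a cue-validity effect is the mean RT cost of invalid
# cueing relative to valid cueing. All trial data below are invented.

from statistics import mean

trials = [
    {"condition": "valid", "rt_ms": 412}, {"condition": "valid", "rt_ms": 398},
    {"condition": "neutral", "rt_ms": 431}, {"condition": "neutral", "rt_ms": 425},
    {"condition": "invalid", "rt_ms": 455}, {"condition": "invalid", "rt_ms": 449},
]

def mean_rt(condition):
    """Mean reaction time (ms) over all trials of one condition."""
    return mean(t["rt_ms"] for t in trials if t["condition"] == condition)

# Positive value => responses faster at the cued location, i.e. a bias toward the cue.
validity_effect = mean_rt("invalid") - mean_rt("valid")
print(validity_effect)  # RT cost of invalid cueing, in ms
```

Neutral trials allow the effect to be split further into a benefit (neutral minus valid) and a cost (invalid minus neutral), which is how engagement and disengagement components are often distinguished.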

  2. Attentional Modulation of Auditory Steady-State Responses

    Science.gov (United States)

    Mahajan, Yatin; Davis, Chris; Kim, Jeesun

    2014-01-01

    Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex. PMID:25334021

  3. Attentional modulation of auditory steady-state responses.

    Directory of Open Access Journals (Sweden)

    Yatin Mahajan

    Full Text Available Auditory selective attention enables task-relevant auditory events to be enhanced and irrelevant ones suppressed. In the present study we used a frequency tagging paradigm to investigate the effects of attention on auditory steady state responses (ASSR). The ASSR was elicited by simultaneously presenting two different streams of white noise, amplitude modulated at either 16 and 23.5 Hz or 32.5 and 40 Hz. The two different frequencies were presented to each ear and participants were instructed to selectively attend to one ear or the other (confirmed by behavioral evidence). The results revealed that modulation of ASSR by selective attention depended on the modulation frequencies used and whether the activation was contralateral or ipsilateral. Attention enhanced the ASSR for contralateral activation from either ear for 16 Hz and suppressed the ASSR for ipsilateral activation for 16 Hz and 23.5 Hz. For modulation frequencies of 32.5 or 40 Hz attention did not affect the ASSR. We propose that the pattern of enhancement and inhibition may be due to binaural suppressive effects on ipsilateral stimulation and the dominance of contralateral hemisphere during dichotic listening. In addition to the influence of cortical processing asymmetries, these results may also reflect a bias towards inhibitory ipsilateral and excitatory contralateral activation present at the level of inferior colliculus. That the effect of attention was clearest for the lower modulation frequencies suggests that such effects are likely mediated by cortical brain structures or by those in close proximity to cortex.

  4. The auditory attention status in Iranian bilingual and monolingual people

    Directory of Open Access Journals (Sweden)

    Nayiere Mansoori

    2013-05-01

    Full Text Available Background and Aim: Bilingualism, a much-discussed topic in psychology and linguistics, can influence speech processing. Among the tests for assessing auditory processing, the dichotic digit test was designed to study divided auditory attention. Our study compared auditory attention between Iranian bilingual and monolingual young adults. Methods: This cross-sectional study was conducted on 60 students, 30 Turkish-Persian bilinguals and 30 Persian monolinguals, aged between 18 and 30 years and of both genders. The dichotic digit test was performed on young individuals with normal peripheral hearing and right-hand preference. Results: No significant difference was found between the dichotic digit test results of monolinguals and bilinguals (p=0.195), nor between the results of the right and left ears in the monolingual (p=0.460) and bilingual (p=0.054) groups. The mean score of women was significantly higher than that of men (p=0.031). Conclusion: There was no significant difference between bilinguals and monolinguals in divided auditory attention; it seems that acquisition of a second language at a younger age has no noticeable effect on this type of auditory attention.

  5. Neural dynamics underlying attentional orienting to auditory representations in short-term memory.

    Science.gov (United States)

    Backer, Kristina C; Binns, Malcolm A; Alain, Claude

    2015-01-21

    Sounds are ephemeral. Thus, coherent auditory perception depends on "hearing" back in time: retrospectively attending that which was lost externally but preserved in short-term memory (STM). Current theories of auditory attention assume that sound features are integrated into a perceptual object, that multiple objects can coexist in STM, and that attention can be deployed to an object in STM. Recording electroencephalography from humans, we tested these assumptions, elucidating feature-general and feature-specific neural correlates of auditory attention to STM. Alpha/beta oscillations and frontal and posterior event-related potentials indexed feature-general top-down attentional control to one of several coexisting auditory representations in STM. Particularly, task performance during attentional orienting was correlated with alpha/low-beta desynchronization (i.e., power suppression). However, attention to one feature could occur without simultaneous processing of the second feature of the representation. Therefore, auditory attention to memory relies on both feature-specific and feature-general neural dynamics. Copyright © 2015 the authors 0270-6474/15/351307-12$15.00/0.

  6. Estimating the relative weights of visual and auditory tau versus heuristic-based cues for time-to-contact judgments in realistic, familiar scenes by older and younger adults.

    Science.gov (United States)

    Keshavarz, Behrang; Campos, Jennifer L; DeLucia, Patricia R; Oberfeld, Daniel

    2017-04-01

    Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background). Here we evaluated TTC estimates by using a traffic scene with an approaching vehicle to evaluate the weights of visual and auditory TTC cues under more realistic conditions. Younger (18-39 years) and older (65+ years) participants made TTC estimates in three sensory conditions: visual-only, auditory-only, and audio-visual. Stimuli were presented within an immersive virtual-reality environment, and cue weights were calculated for both visual cues (e.g., visual τ, final optical size) and auditory cues (e.g., auditory τ, final sound pressure level). The results demonstrated the use of visual τ as well as heuristic cues in the visual-only condition. TTC estimates in the auditory-only condition, however, were primarily based on an auditory heuristic cue (final sound pressure level), rather than on auditory τ. In the audio-visual condition, the visual cues dominated overall, with the highest weight being assigned to visual τ by younger adults, and a more equal weighting of visual τ and heuristic cues in older adults. Overall, better characterizing the effects of combined sensory inputs, stimulus characteristics, and age on the cues used to estimate TTC will provide important insights into how these factors may affect everyday behavior.
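
    The tau variable described above can be made concrete: it is the object's instantaneous optical angle divided by that angle's rate of change, which under constant closing speed equals the true time to contact (distance / speed). A short sketch, with the vehicle width, distance, and speed invented for illustration:

```python
import math

# Sketch of the visual tau cue: optical angle over its rate of change.
# With constant closing speed this ratio equals the true TTC (distance/speed).
# Vehicle width, distance, and speed are invented for illustration.

def optical_angle(width_m, distance_m):
    """Visual angle (radians) subtended by an object of the given width."""
    return 2.0 * math.atan(width_m / (2.0 * distance_m))

def tau(theta, dtheta_dt):
    """Tau: optical angle divided by its instantaneous rate of change."""
    return theta / dtheta_dt

d, v, w, dt = 50.0, 10.0, 1.8, 1e-4   # metres, m/s, metres, seconds
theta_now = optical_angle(w, d)
theta_next = optical_angle(w, d - v * dt)
estimate = tau(theta_now, (theta_next - theta_now) / dt)
print(round(estimate, 2))  # close to the true TTC of d / v = 5 s
```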

  7. The spectrotemporal filter mechanism of auditory selective attention

    Science.gov (United States)

    Lakatos, Peter; Musacchia, Gabriella; O’Connell, Monica N.; Falchier, Arnaud Y.; Javitt, Daniel C.; Schroeder, Charles E.

    2013-01-01

    SUMMARY While we have convincing evidence that attention to auditory stimuli modulates neuronal responses at or before the level of primary auditory cortex (A1), the underlying physiological mechanisms are unknown. We found that attending to rhythmic auditory streams resulted in the entrainment of ongoing oscillatory activity reflecting rhythmic excitability fluctuations in A1. Strikingly, while the rhythm of the entrained oscillations in A1 neuronal ensembles reflected the temporal structure of the attended stream, the phase depended on the attended frequency content. Counter-phase entrainment across differently tuned A1 regions resulted in both the amplification and sharpening of responses at attended time points, in essence acting as a spectrotemporal filter mechanism. Our data suggest that selective attention generates a dynamically evolving model of attended auditory stimulus streams in the form of modulatory subthreshold oscillations across tonotopically organized neuronal ensembles in A1 that enhances the representation of attended stimuli. PMID:23439126

  8. Auditory and visual capture during focused visual attention

    NARCIS (Netherlands)

    Koelewijn, T.; Bronkhorst, A.W.; Theeuwes, J.

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies

  9. Contextual cueing improves attentional guidance, even when guidance is supposedly optimal.

    Science.gov (United States)

    Harris, Anthony M; Remington, Roger W

    2017-05-01

    Visual search through previously encountered contexts typically produces reduced reaction times compared with search through novel contexts. This contextual cueing benefit is well established, but there is debate regarding its underlying mechanisms. Eye-tracking studies have consistently shown a reduced number of fixations with repetition, supporting improvements in attentional guidance as the source of contextual cueing. However, contextual cueing benefits have been shown in conditions in which attentional guidance should already be optimal, namely when attention is captured to the target location by an abrupt onset, or under pop-out conditions. These results have been used to argue for a response-related account of contextual cueing. Here, we combine eye tracking with response time to examine the mechanisms behind contextual cueing in spatially cued and pop-out conditions. Three experiments find consistent response time benefits with repetition, which appear to be driven almost entirely by a reduction in the number of fixations, supporting improved attentional guidance as the mechanism behind contextual cueing. No differences were observed in the time between fixating the target and responding, our proxy for response-related processes. Furthermore, the correlation between contextual cueing magnitude and the reduction in the number of fixations on repeated contexts approaches 1. These results argue strongly that attentional guidance is facilitated by familiar search contexts, even when guidance is near-optimal.
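
    The two measures related above, the contextual cueing magnitude (novel-minus-repeated RT) and the reduction in fixation counts, can be sketched along with their Pearson correlation. All data values below are invented for illustration:

```python
# Sketch: contextual cueing magnitude, fixation-count reduction, and their
# Pearson correlation. Values are invented, chosen so r approaches 1 as in
# the abstract above.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

rt_novel = [950, 1010, 980, 1040, 990]     # ms, one value per participant
rt_repeated = [880, 930, 905, 950, 940]
fix_novel = [5.2, 5.8, 5.5, 6.1, 5.6]      # mean fixations per trial
fix_repeated = [4.4, 4.9, 4.6, 5.0, 5.1]

cueing = [n - r for n, r in zip(rt_novel, rt_repeated)]
fix_drop = [n - r for n, r in zip(fix_novel, fix_repeated)]
print(mean(cueing), round(pearson(cueing, fix_drop), 2))
```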

  10. Changes in otoacoustic emissions during selective auditory and visual attention.

    Science.gov (United States)

    Walsh, Kyle P; Pasanen, Edward G; McFadden, Dennis

    2015-05-01

    Previous studies have demonstrated that the otoacoustic emissions (OAEs) measured during behavioral tasks can have different magnitudes when subjects are attending selectively or not attending. The implication is that the cognitive and perceptual demands of a task can affect the first neural stage of auditory processing-the sensory receptors themselves. However, the directions of the reported attentional effects have been inconsistent, the magnitudes of the observed differences typically have been small, and comparisons across studies have been made difficult by significant procedural differences. In this study, a nonlinear version of the stimulus-frequency OAE (SFOAE), called the nSFOAE, was used to measure cochlear responses from human subjects while they simultaneously performed behavioral tasks requiring selective auditory attention (dichotic or diotic listening), selective visual attention, or relative inattention. Within subjects, the differences in nSFOAE magnitude between inattention and attention conditions were about 2-3 dB for both auditory and visual modalities, and the effect sizes for the differences typically were large for both nSFOAE magnitude and phase. These results reveal that the cochlear efferent reflex is differentially active during selective attention and inattention, for both auditory and visual tasks, although they do not reveal how attention is improved when efferent activity is greater.
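
    The comparison reported above, a 2-3 dB difference in nSFOAE magnitude with large effect sizes, can be sketched as a mean difference plus a Cohen's d with a pooled SD. The magnitudes below are invented, chosen to land in the reported range:

```python
import math
from statistics import mean, stdev

# Sketch of the inattention-vs-attention comparison: dB difference in
# nSFOAE magnitude and a Cohen's d effect size. All values are invented.

inattention = [12.1, 11.4, 12.8, 11.9, 12.5]  # nSFOAE magnitude, dB
attention = [9.8, 9.1, 10.4, 9.6, 10.0]

diff = mean(inattention) - mean(attention)
pooled_sd = math.sqrt((stdev(inattention) ** 2 + stdev(attention) ** 2) / 2)
cohens_d = diff / pooled_sd
print(round(diff, 2), round(cohens_d, 1))  # d > 0.8 is conventionally 'large'
```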

  12. Alcohol-cue exposure effects on craving and attentional bias in underage college-student drinkers.

    Science.gov (United States)

    Ramirez, Jason J; Monti, Peter M; Colwill, Ruth M

    2015-06-01

    The effect of alcohol-cue exposure on eliciting craving has been well documented, and numerous theoretical models assert that craving is a clinically significant construct central to the motivation and maintenance of alcohol-seeking behavior. Furthermore, some theories propose a relationship between craving and attention, such that cue-induced increases in craving bias attention toward alcohol cues, which, in turn, perpetuates craving. This study examined the extent to which alcohol cues induce craving and bias attention toward alcohol cues among underage college-student drinkers. We designed within-subject cue-reactivity and visual-probe tasks to assess in vivo alcohol-cue exposure effects on craving and attentional bias in 39 undergraduate college drinkers (ages 18-20). Participants expressed greater subjective craving to drink alcohol following in vivo cue exposure to a commonly consumed beer compared with water exposure. Furthermore, following alcohol-cue exposure, participants exhibited greater attentional biases toward alcohol cues as measured by a visual-probe task. In addition to the cue-exposure effects on craving and attentional bias, within-subject differences in craving across sessions marginally predicted within-subject differences in attentional bias. Implications for both theory and practice are discussed.
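
    The visual-probe bias index implied above is typically computed as mean RT when the probe replaces the neutral picture minus mean RT when it replaces the alcohol picture, with positive scores indicating attention drawn to the alcohol cue. A sketch with invented RTs:

```python
from statistics import mean

# Sketch of a visual-probe attentional bias score. Positive values mean the
# probe was detected faster at the alcohol cue's location. RTs (ms) are
# invented for illustration.

def attentional_bias(rt_probe_at_neutral, rt_probe_at_alcohol):
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_alcohol)

after_alcohol_exposure = attentional_bias([520, 540, 530], [495, 505, 500])
after_water_exposure = attentional_bias([516, 524, 520], [514, 522, 518])
print(after_alcohol_exposure, after_water_exposure)
```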

  13. Modelling auditory attention: Insights from the Theory of Visual Attention (TVA)

    DEFF Research Database (Denmark)

    Roberts, K. L.; Andersen, Tobias; Kyllingsbæk, Søren

    We report initial progress towards creating an auditory analogue of a mathematical model of visual attention: the 'Theory of Visual Attention' (TVA; Bundesen, 1990). TVA is one of the best established models of visual attention. It assumes that visual stimuli are initially processed in parallel, and that there is a 'race' for selection and representation in visual short term memory (VSTM). In the basic TVA task, participants view a brief display of letters and are asked to report either all of the letters (whole report) or a subset of the letters (e.g., the red letters; partial report). Fitting the model … modelled using a log-logistic function than an exponential function. A more challenging difference is that in the partial report task, there is more target-distractor confusion for auditory than visual stimuli. This failure of object-formation (prior to attentional object-selection) is not yet effectively …
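
    TVA's race idea can be sketched with its classical assumption of exponentially distributed processing times (the abstract above suggests log-logistic times fit audition better). Item labels, rates, and the memory capacity K below are invented for illustration:

```python
import random

# Sketch of TVA's race into short-term memory: each item's encoding time is
# exponentially distributed with rate v (its processing capacity), and the
# first K finishers are stored. Rates, K, and labels are assumptions.

def tva_whole_report(rates, K, rng):
    """Return the items winning the race into short-term memory (capacity K)."""
    finish = {item: rng.expovariate(v) for item, v in rates.items()}
    return sorted(finish, key=finish.get)[:K]

rng = random.Random(0)
rates = {"A": 2.0, "B": 2.0, "C": 0.5, "D": 0.5}  # attended items race faster
counts = {item: 0 for item in rates}
for _ in range(10_000):
    for item in tva_whole_report(rates, K=2, rng=rng):
        counts[item] += 1
print(counts)  # high-rate items are reported far more often
```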

  14. Selective attention to phonology dynamically modulates initial encoding of auditory words within the left hemisphere.

    Science.gov (United States)

    Yoncheva, Yuliya; Maurer, Urs; Zevin, Jason D; McCandliss, Bruce D

    2014-08-15

    Selective attention to phonology, i.e., the ability to attend to sub-syllabic units within spoken words, is a critical precursor to literacy acquisition. Recent functional magnetic resonance imaging evidence has demonstrated that a left-lateralized network of frontal, temporal, and posterior language regions, including the visual word form area, supports this skill. The current event-related potential (ERP) study investigated the temporal dynamics of selective attention to phonology during spoken word perception. We tested the hypothesis that selective attention to phonology dynamically modulates stimulus encoding by recruiting left-lateralized processes specifically while the information critical for performance is unfolding. Selective attention to phonology was captured by manipulating listening goals: skilled adult readers attended to either rhyme or melody within auditory stimulus pairs. Each pair superimposed rhyming and melodic information ensuring identical sensory stimulation. Selective attention to phonology produced distinct early and late topographic ERP effects during stimulus encoding. Data-driven source localization analyses revealed that selective attention to phonology led to significantly greater recruitment of left-lateralized posterior and extensive temporal regions, which was notably concurrent with the rhyme-relevant information within the word. Furthermore, selective attention effects were specific to auditory stimulus encoding and not observed in response to cues, arguing against the notion that they reflect sustained task setting. Collectively, these results demonstrate that selective attention to phonology dynamically engages a left-lateralized network during the critical time-period of perception for achieving phonological analysis goals. These findings suggest a key role for selective attention in on-line phonological computations. Furthermore, these findings motivate future research on the role that neural mechanisms of attention may

  15. Visual and Auditory Cue Effects on Risk Assessment in a Highway Training Simulation

    NARCIS (Netherlands)

    Toet, A.; Houtkamp, J.M.; Meulen, van der R.

    2013-01-01

    We investigated whether manipulation of visual and auditory depth and speed cues can affect a user’s sense of risk for a low-cost nonimmersive virtual environment (VE) representing a highway environment with traffic incidents. The VE is currently used in an examination program to assess procedural

  17. Auditory and Visual Capture during Focused Visual Attention

    Science.gov (United States)

    Koelewijn, Thomas; Bronkhorst, Adelbert; Theeuwes, Jan

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies have not differentiated between capture by onsets…

  18. The prelimbic cortex directs attention toward predictive cues during fear learning.

    Science.gov (United States)

    Sharpe, Melissa J; Killcross, Simon

    2015-06-01

    The prelimbic cortex is argued to promote conditioned fear expression, at odds with appetitive research implicating this region in attentional processing. Consistent with an attentional account, we report that the effect of prelimbic lesions on fear expression depends on the degree of competition between contextual and discrete cues. Further, when competition from contextual cues is low, we found that PL inactivation resulted in animals expressing fear toward irrelevant discrete cues; an effect selective to inactivation during the learning phase and not during retrieval. These data demonstrate that the prelimbic cortex modulates attention toward cues to preferentially direct fear responding on the basis of their predictive value.

  19. The role of working memory in auditory selective attention.

    Science.gov (United States)

    Dalton, Polly; Santangelo, Valerio; Spence, Charles

    2009-11-01

    A growing body of research now demonstrates that working memory plays an important role in controlling the extent to which irrelevant visual distractors are processed during visual selective attention tasks (e.g., Lavie, Hirst, De Fockert, & Viding, 2004). Recently, it has been shown that the successful selection of tactile information also depends on the availability of working memory (Dalton, Lavie, & Spence, 2009). Here, we investigate whether working memory plays a role in auditory selective attention. Participants focused their attention on short continuous bursts of white noise (targets) while attempting to ignore pulsed bursts of noise (distractors). Distractor interference in this auditory task, as measured in terms of the difference in performance between congruent and incongruent distractor trials, increased significantly under high (vs. low) load in a concurrent working-memory task. These results provide the first evidence demonstrating a causal role for working memory in reducing interference by irrelevant auditory distractors.
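
    The interference measure above is the congruency effect (incongruent minus congruent RT), computed separately under low and high concurrent working-memory load. A sketch with invented RTs:

```python
from statistics import mean

# Sketch of distractor interference under working-memory load: the
# congruency effect grows when working memory is occupied. RTs (ms) are
# invented for illustration.

def interference(rt_incongruent, rt_congruent):
    return mean(rt_incongruent) - mean(rt_congruent)

low_load = interference([610, 620, 615], [590, 600, 595])
high_load = interference([680, 700, 690], [610, 630, 620])
print(low_load, high_load)
```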

  20. The influence of cueing on attentional focus in perceptual decision making.

    Science.gov (United States)

    Yang, Cheng-Ta; Little, Daniel R; Hsu, Ching-Chun

    2014-11-01

    Selective attention has been known to play an important role in decision making. In the present study, we combined a cueing paradigm with a redundant-target detection task to examine how attention affects the decision process when detecting the redundant targets. Cue validity was manipulated in two experiments. The results showed that when the cue was 50 % valid in one experiment, the participants adopted a parallel self-terminating processing strategy, indicative of a diffuse attentional focus on both target locations. When the cue was 100 % valid in the second experiment, all of the participants switched to a serial self-terminating processing strategy, which in our study indicated focused attention to a single target location. This study demonstrates the flexibility of the decision mechanism and highlights the importance of top-down control in selecting a decision strategy.
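
    The two strategies named above can be caricatured for a display with one target at one of two locations: parallel self-terminating processing monitors both locations at once and stops when the target's channel finishes, while serial self-terminating checks one location first and stops early only if it holds the target. This is a highly simplified sketch under assumed Gaussian channel times, ignoring redundant-target trials and other details of the paradigm:

```python
import random

# Simplified contrast of parallel vs serial self-terminating processing.
# Channel-time distributions and the 50%-valid cue are assumptions.

def parallel_st(t_target, t_other):
    """All locations processed simultaneously; respond when target finishes."""
    return t_target

def serial_st(t_first, t_second, target_is_first):
    """Locations checked one at a time; stop as soon as the target is found."""
    return t_first if target_is_first else t_first + t_second

rng = random.Random(1)
par, ser = [], []
for _ in range(10_000):
    t1, t2 = rng.gauss(400, 40), rng.gauss(400, 40)   # per-location times, ms
    target_first = rng.random() < 0.5                 # 50%-valid cue
    t_target, t_other = (t1, t2) if target_first else (t2, t1)
    par.append(parallel_st(t_target, t_other))
    ser.append(serial_st(t1, t2, target_first))
print(round(sum(par) / len(par)), round(sum(ser) / len(ser)))
```

    With a 50%-valid cue the serial strategy pays a large cost on invalid trials, which is one reason participants would abandon it unless the cue is fully reliable.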

  1. Selective attention to smoking cues in former smokers.

    Science.gov (United States)

    Rehme, Anne K; Bey, Katharina; Frommann, Ingo; Mogg, Karin; Bradley, Brendan P; Bludau, Julia; Block, Verena; Sträter, Birgitta; Schütz, Christian G; Wagner, Michael

    2018-02-01

    Repeated drug use modifies the emotional and cognitive processing of drug-associated cues. These changes are supposed to persist even after prolonged abstinence. Several studies demonstrated that smoking cues selectively attract the attention of smokers, but empirical evidence for such an attentional bias among successful quitters is inconclusive. Here, we investigated whether attentional biases persist after smoking cessation. Thirty-eight former smokers, 34 current smokers, and 29 non-smokers participated in a single experimental session. We used three measures of attentional bias for smoking stimuli: A visual probe task with short (500ms) and long (2000ms) picture stimulus durations, and a modified Stroop task with smoking-related and neutral words. Former smokers and current smokers, as compared to non-smokers, showed an attentional bias in visual orienting to smoking pictures in the 500ms condition of the visual probe task. The Stroop interference index of smoking words was negatively related to nicotine dependence in current smokers. Former smokers and mildly dependent smokers, as compared to non-smokers, showed increased interference by smoking words in the Stroop task. Neither current nor former smokers showed an attentional bias in maintained attention (2000ms visual probe task). In conclusion, even after prolonged abstinence smoking cues retain incentive salience in former smokers, who differed from non-smokers on two attentional bias indices. Attentional biases in former smokers operate mainly in early involuntary rather than in controlled processing, and may represent a vulnerability factor for relapse. Therefore, smoking cessation programs should strengthen self-control abilities to prevent relapses.
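
    The modified-Stroop interference index used above is typically mean colour-naming RT for smoking-related words minus neutral words, with positive values indicating slowing caused by smoking content. A sketch with invented RTs:

```python
from statistics import mean

# Sketch of a Stroop interference index for smoking-related words. RTs (ms)
# are invented for illustration.

def stroop_interference(rt_smoking_words, rt_neutral_words):
    return mean(rt_smoking_words) - mean(rt_neutral_words)

former_smokers = stroop_interference([700, 710, 705], [670, 680, 675])
non_smokers = stroop_interference([660, 665, 670], [658, 663, 668])
print(former_smokers, non_smokers)  # greater interference in former smokers
```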

  2. Effect of handedness on auditory attentional performance in ADHD students

    Directory of Open Access Journals (Sweden)

    Schmidt SL

    2017-12-01

    Full Text Available Sergio L Schmidt,1,2 Ana Lucia Novais Carvaho,3 Eunice N Simoes2 1Department of Neurophysiology, State University of Rio de Janeiro, Rio de Janeiro, 2Neurology Department, Federal University of the State of Rio de Janeiro, Rio de Janeiro, 3Department of Psychology, Fluminense Federal University, Niteroi, Brazil Abstract: The relationship between handedness and attentional performance is poorly understood. Continuous performance tests (CPTs) using visual stimuli are commonly used to assess subjects suffering from attention deficit hyperactivity disorder (ADHD). However, auditory CPTs are considered more useful than visual ones for evaluating classroom attentional problems. A previous study reported a significant effect of handedness on students' performance on a visual CPT. Here, we examined whether handedness would also affect CPT performance using only auditory stimuli. From an initial sample of 337 students, 11 matched pairs were selected. Repeated ANOVAs showed a significant effect of handedness on attentional performance, which was exhibited even in the control group. Left-handers made more commission errors than right-handers. The results were interpreted as indicating that the association between ADHD and handedness reflects that consistent left-handers are less lateralized and have decreased interhemispheric connections. The auditory attentional data suggest that left-handers have problems in the impulsivity/hyperactivity domain. In ADHD, clinical therapeutics and rehabilitation must take handedness into account, because consistent sinistrals are more impulsive than dextrals. Keywords: attention, ADHD, consistent left-handers, auditory attention, continuous performance test

  3. Cognitive Training Enhances Auditory Attention Efficiency in Older Adults

    Directory of Open Access Journals (Sweden)

    Jennifer L. O’Brien

    2017-10-01

    Full Text Available Auditory cognitive training (ACT) improves attention in older adults; however, the underlying neurophysiological mechanisms are still unknown. The present study examined the effects of ACT on the P3b event-related potential, reflecting attention allocation (amplitude) and speed of processing (latency) during stimulus categorization, and on the P1-N1-P2 complex, reflecting perceptual processing (amplitude and latency). Participants completed an auditory oddball task before and after 10 weeks of ACT (n = 9) or a no-contact control period (n = 15). Parietal P3b amplitudes to oddball stimuli decreased at post-test in the trained group as compared to those in the control group, and frontal P3b amplitudes showed a similar trend, potentially reflecting more efficient attentional allocation after ACT. No advantages for the ACT group were evident for auditory perceptual processing or speed of processing in this small sample. Our results provide preliminary evidence that ACT may enhance the efficiency of attention allocation, which may account for the positive impact of ACT on the everyday functioning of older adults.

  4. Using auditory pre-information to solve the cocktail-party problem: electrophysiological evidence for age-specific differences.

    Science.gov (United States)

    Getzmann, Stephan; Lewald, Jörg; Falkenstein, Michael

    2014-01-01

    Speech understanding in complex and dynamic listening environments requires (a) auditory scene analysis, namely auditory object formation and segregation, and (b) allocation of the attentional focus to the talker of interest. There is evidence that pre-information is actively used to facilitate these two aspects of the so-called "cocktail-party" problem. Here, a simulated multi-talker scenario was combined with electroencephalography to study scene analysis and allocation of attention in young and middle-aged adults. Sequences of short words (combinations of brief company names and stock-price values) from four talkers at different locations were simultaneously presented, and the detection of target names and the discrimination between critical target values were assessed. Immediately prior to speech sequences, auditory pre-information was provided via cues that either prepared auditory scene analysis or attentional focusing, or non-specific pre-information was given. While performance was generally better in younger than older participants, both age groups benefited from auditory pre-information. The analysis of the cue-related event-related potentials revealed age-specific differences in the use of pre-cues: Younger adults showed a pronounced N2 component, suggesting early inhibition of concurrent speech stimuli; older adults exhibited a stronger late P3 component, suggesting increased resource allocation to process the pre-information. In sum, the results argue for an age-specific utilization of auditory pre-information to improve listening in complex dynamic auditory environments.

  5. Using auditory pre-information to solve the cocktail-party problem: electrophysiological evidence for age-specific differences

    Directory of Open Access Journals (Sweden)

    Stephan eGetzmann

    2014-12-01

    Full Text Available Speech understanding in complex and dynamic listening environments requires (a) auditory scene analysis, namely auditory object formation and segregation, and (b) allocation of the attentional focus to the talker of interest. There is evidence that pre-information is actively used to facilitate these two aspects of the so-called cocktail-party problem. Here, a simulated multi-talker scenario was combined with electroencephalography to study scene analysis and allocation of attention in young and middle-aged adults. Sequences of short words (combinations of brief company names and stock-price values) from four talkers at different locations were simultaneously presented, and the detection of target names and the discrimination between critical target values were assessed. Immediately prior to speech sequences, auditory pre-information was provided via cues that either prepared auditory scene analysis or attentional focusing, or non-specific pre-information was given. While performance was generally better in younger than older participants, both age groups benefited from auditory pre-information. The analysis of the cue-related event-related potentials revealed age-specific differences in the use of pre-cues: Younger adults showed a pronounced N2 component, suggesting early inhibition of concurrent speech stimuli; older adults exhibited a stronger late P3 component, suggesting increased resource allocation to process the pre-information. In sum, the results argue for an age-specific utilization of auditory pre-information to improve listening in complex dynamic auditory environments.

  6. Shifting attention among working memory representations: testing cue type, awareness, and strategic control.

    Science.gov (United States)

    Berryhill, Marian E; Richmond, Lauren L; Shay, Cara S; Olson, Ingrid R

    2012-01-01

    It is well known that visual working memory (VWM) performance is modulated by attentional cues presented during encoding. Interestingly, retrospective cues presented after encoding, but prior to the test phase also improve performance. This improvement in performance is termed the retro-cue benefit. We investigated whether the retro-cue benefit is sensitive to cue type, whether participants were aware of their improvement in performance due to the retro-cue, and whether the effect was under strategic control. Experiment 1 compared the potential cueing benefits of abrupt onset retro-cues relying on bottom-up attention, number retro-cues relying on top-down attention, and arrow retro-cues, relying on a mixture of both. We found a significant retro-cue effect only for arrow retro-cues. In Experiment 2, we tested participants' awareness of their use of the informative retro-cue and found that they were aware of their improved performance. In Experiment 3, we asked whether participants have strategic control over the retro-cue. The retro-cue was difficult to ignore, suggesting that strategic control is low. The retro-cue effect appears to be within conscious awareness but not under full strategic control.

  7. Retrosplenial Cortex Is Required for the Retrieval of Remote Memory for Auditory Cues

    Science.gov (United States)

    Todd, Travis P.; Mehlman, Max L.; Keene, Christopher S.; DeAngeli, Nicole E.; Bucci, David J.

    2016-01-01

    The retrosplenial cortex (RSC) has a well-established role in contextual and spatial learning and memory, consistent with its known connectivity with visuo-spatial association areas. In contrast, RSC appears to have little involvement with delay fear conditioning to an auditory cue. However, all previous studies have examined the contribution of…

  8. Selective attention to phonology dynamically modulates initial encoding of auditory words within the left hemisphere

    Science.gov (United States)

    Yoncheva; Maurer, Urs; Zevin, Jason; McCandliss, Bruce

    2015-01-01

    Selective attention to phonology, i.e., the ability to attend to sub-syllabic units within spoken words, is a critical precursor to literacy acquisition. Recent functional magnetic resonance imaging evidence has demonstrated that a left-lateralized network of frontal, temporal, and posterior language regions, including the visual word form area, supports this skill. The current event-related potential (ERP) study investigated the temporal dynamics of selective attention to phonology during spoken word perception. We tested the hypothesis that selective attention to phonology dynamically modulates stimulus encoding by recruiting left-lateralized processes specifically while the information critical for performance is unfolding. Selective attention to phonology was captured by manipulating listening goals: skilled adult readers attended to either rhyme or melody within auditory stimulus pairs. Each pair superimposed rhyming and melodic information ensuring identical sensory stimulation. Selective attention to phonology produced distinct early and late topographic ERP effects during stimulus encoding. Data-driven source localization analyses revealed that selective attention to phonology led to significantly greater recruitment of left-lateralized posterior and extensive temporal regions, which was notably concurrent with the rhyme-relevant information within the word. Furthermore, selective attention effects were specific to auditory stimulus encoding and not observed in response to cues, arguing against the notion that they reflect sustained task setting. Collectively, these results demonstrate that selective attention to phonology dynamically engages a left-lateralized network during the critical time-period of perception for achieving phonological analysis goals. These findings support the key role of selective attention to phonology in the development of literacy and motivate future research on the neural bases of the interaction between phonological

  9. Attentional bias for food cues in advertising among overweight and hungry children

    NARCIS (Netherlands)

    Folkvord, F.; Anschutz, D.J.; Buijzen, M.A.

    2015-01-01

    Attentional bias theory suggests that an increased motivation to receive or avoid a rewarding substance elevates automatic selective attention toward cues that are related to that specific substance. Until now, no study has examined attentional bias toward food cues in food advertisements, even

  10. Sad facial cues inhibit temporal attention: evidence from an event-related potential study.

    Science.gov (United States)

    Kong, Xianxian; Chen, Xiaoqiang; Tan, Bo; Zhao, Dandan; Jin, Zhenlan; Li, Ling

    2013-06-19

    We examined the influence of different emotional cues (happy or sad) on temporal attention (short or long interval) using behavioral as well as event-related potential recordings during a Stroop task. Emotional stimuli cued short and long time intervals, inducing 'sad-short', 'sad-long', 'happy-short', and 'happy-long' conditions. Following the intervals, participants performed a numeric Stroop task. Behavioral results showed the temporal attention effects in the sad-long, happy-long, and happy-short conditions, in which valid cues quickened the reaction times, but not in the sad-short condition. N2 event-related potential components showed sad cues to have decreased activity for short intervals compared with long intervals, whereas happy cues did not. Taken together, these findings provide evidence for different modulation of sad and happy facial cues on temporal attention. Furthermore, sad cues inhibit temporal attention, resulting in longer reaction time and decreased neural activity in the short interval by diverting more attentional resources.

  11. Failure of the extended contingent attentional capture account in multimodal settings

    Directory of Open Access Journals (Sweden)

    Rob H.J. Van der Lubbe

    2006-01-01

    Full Text Available Sudden changes in our environment, like sound bursts or light flashes, are thought to automatically attract our attention, thereby affecting responses to subsequent targets, although an alternative view (the contingent attentional capture account) holds that stimuli only capture our attention when they match target features. In the current study, we examined whether an extended version of the latter view can explain exogenous cuing effects on speed and accuracy of performance to targets (uncued-cued) in multimodal settings, in which auditory and visual stimuli co-occur. To this end, we determined whether observed effects of visual and auditory cues, which were always intermixed, depend on top-down settings in "pure" blocks, in which only one target modality occurred, as compared to "mixed" blocks, in which targets were either visual or auditory. Results revealed that unimodal and crossmodal cuing effects depend on top-down settings. However, our findings were not in accordance with predictions derived from the extended contingent attentional capture account. Specifically, visual cues showed comparable effects for visual targets in pure and mixed blocks, but also a comparable effect for auditory targets in pure blocks, and, most surprisingly, an opposite effect in mixed blocks. The latter result suggests that visual stimuli may distract attention from the auditory modality when the modality of the forthcoming target is unknown. The results additionally revealed that the Simon effect, the influence of correspondence or not between stimulus and response side, is modulated by exogenous cues in unimodal settings, but not in crossmodal settings. These findings accord with the view that attention plays an important role in the Simon effect, and additionally question the directness of links between maps of visual and auditory space.
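The cuing measure this record names in passing ("uncued-cued") reduces to a difference of mean reaction times. A minimal sketch of that scoring; the function name and RT values are invented for illustration, not taken from the study:

```python
# Exogenous cuing effect scored as uncued minus cued reaction time (ms).
# Positive values indicate facilitation at the cued location; negative
# values indicate a reversed (distraction-like) effect, as reported for
# visual cues with auditory targets in mixed blocks.

def cuing_effect_ms(rt_uncued_ms, rt_cued_ms):
    """Cuing effect: mean uncued RT minus mean cued RT, in milliseconds."""
    return rt_uncued_ms - rt_cued_ms

print(cuing_effect_ms(520, 495))  # 25 -> facilitation at the cued location
print(cuing_effect_ms(480, 500))  # -20 -> an inverted effect
```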

  12. Listenmee and Listenmee smartphone application: synchronizing walking to rhythmic auditory cues to improve gait in Parkinson's disease.

    Science.gov (United States)

    Lopez, William Omar Contreras; Higuera, Carlos Andres Escalante; Fonoff, Erich Talamoni; Souza, Carolina de Oliveira; Albicker, Ulrich; Martinez, Jairo Alberto Espinoza

    2014-10-01

    Evidence supports the use of rhythmic external auditory signals to improve gait in PD patients (Arias & Cudeiro, 2008; Kenyon & Thaut, 2000; McIntosh, Rice & Thaut, 1994; McIntosh et al., 1997; Morris, Iansek, & Matyas, 1994; Thaut, McIntosh, & Rice, 1997; Suteerawattananon, Morris, Etnyre, Jankovic, & Protas, 2004; Willems, Nieuwboer, Chavert, & Desloovere, 2006). However, few prototypes are available for daily use, and to our knowledge, none utilize a smartphone application allowing individualized sounds and cadence. Therefore, we analyzed the effects on gait of Listenmee®, an intelligent glasses system with a portable auditory device, and present its smartphone application, the Listenmee app®, offering over 100 different sounds and an adjustable metronome to individualize the cueing rate, as well as its smartwatch with an accelerometer to detect the magnitude and direction of the proper acceleration and to track calorie count, sleep patterns, step count and daily distances. The present study included patients with idiopathic PD who presented gait disturbances, including freezing. Auditory rhythmic cues were delivered through Listenmee®. Performance was analyzed in a motion and gait analysis laboratory. The results revealed significant improvements in gait performance on three major dependent variables: walking speed by 38.1%, cadence by 28.1% and stride length by 44.5%. Our findings suggest that auditory cueing through Listenmee® may significantly enhance gait performance. Further studies are needed to elucidate the potential role and maximize the benefits of these portable devices. Copyright © 2014 Elsevier B.V. All rights reserved.
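The adjustable metronome described above maps an individualized cueing rate to an interval between successive cues. A minimal sketch of that arithmetic; the function name is hypothetical and not part of any actual Listenmee API:

```python
# Converting a cueing rate in beats per minute to the interval between
# auditory cues in milliseconds, as an adjustable metronome must do.

def cue_interval_ms(cueing_rate_bpm):
    """Milliseconds between successive cues at the given rate (BPM)."""
    return 60_000 / cueing_rate_bpm

print(cue_interval_ms(100))  # 600.0 ms between cues
print(cue_interval_ms(120))  # 500.0 ms between cues
```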

  13. Attentional bias to food-related visual cues: is there a role in obesity?

    Science.gov (United States)

    Doolan, K J; Breslin, G; Hanna, D; Gallagher, A M

    2015-02-01

    The incentive sensitisation model of obesity suggests that modification of the dopaminergic associated reward systems in the brain may result in increased awareness of food-related visual cues present in the current food environment. Having a heightened awareness of these visual food cues may impact on food choices and eating behaviours, with those who are most aware of, or demonstrate greater attention to, food-related stimuli potentially being at greater risk of overeating and subsequent weight gain. To date, research related to attentional responses to visual food cues has been both limited and conflicting. Such inconsistent findings may in part be explained by the use of different methodological approaches to measure attentional bias and by the impact of other factors, such as hunger levels, energy density of visual food cues and individual eating style traits, that may influence visual attention to food-related cues beyond weight status alone. This review examines the various methodologies employed to measure attentional bias, with a particular focus on the role that attentional processing of food-related visual cues may have in obesity. Based on the findings of this review, it appears to be too early to clarify the role that visual attention to food-related cues may have in obesity. The results, however, highlight the importance of considering the most appropriate methodology for measuring attentional bias, and the characteristics of the study populations targeted, when interpreting results to date and designing future studies.

  14. Auditory and visual sustained attention in children with speech sound disorder.

    Directory of Open Access Journals (Sweden)

    Cristina F B Murphy

    Full Text Available Although research has demonstrated that children with specific language impairment (SLI) and reading disorder (RD) exhibit sustained attention deficits, no study has investigated sustained attention in children with speech sound disorder (SSD). Given the overlap of symptoms, such as phonological memory deficits, between these different language disorders (i.e., SLI, SSD and RD) and the relationships between working memory, attention and language processing, it is worthwhile to investigate whether deficits in sustained attention also occur in children with SSD. A total of 55 children (18 diagnosed with SSD (8.11 ± 1.231) and 37 typically developing children (8.76 ± 1.461)) were invited to participate in this study. Auditory and visual sustained-attention tasks were applied. Children with SSD performed worse on these tasks; they committed a greater number of auditory false alarms and exhibited a significant decline in performance over the course of the auditory detection task. The extent to which performance is related to auditory perceptual difficulties and probable working memory deficits is discussed. Further studies are needed to better understand the specific nature of these deficits and their clinical implications.

  15. Role of Speaker Cues in Attention Inference

    OpenAIRE

    Jin Joo Lee; Cynthia Breazeal; David DeSteno

    2017-01-01

    Current state-of-the-art approaches to emotion recognition primarily focus on modeling the nonverbal expressions of the sole individual without reference to contextual elements such as the co-presence of the partner. In this paper, we demonstrate that the accurate inference of listeners’ social-emotional state of attention depends on accounting for the nonverbal behaviors of their storytelling partner, namely their speaker cues. To gain a deeper understanding of the role of speaker cues in at...

  16. Auditory attention in childhood and adolescence: An event-related potential study of spatial selective attention to one of two simultaneous stories

    Science.gov (United States)

    Karns, Christina M.; Isbell, Elif; Giuliano, Ryan J.; Neville, Helen J.

    2015-01-01

    Auditory selective attention is a critical skill for goal-directed behavior, especially where noisy distractions may impede focusing attention. To better understand the developmental trajectory of auditory spatial selective attention in an acoustically complex environment, in the current study we measured auditory event-related potentials (ERPs) in human children across five age groups: 3–5 years; 10 years; 13 years; 16 years; and young adults using a naturalistic dichotic listening paradigm, characterizing the ERP morphology for nonlinguistic and linguistic auditory probes embedded in attended and unattended stories. We documented robust maturational changes in auditory evoked potentials that were specific to the types of probes. Furthermore, we found a remarkable interplay between age and attention-modulation of auditory evoked potentials in terms of morphology and latency from the early years of childhood through young adulthood. The results are consistent with the view that attention can operate across age groups by modulating the amplitude of maturing auditory early-latency evoked potentials or by invoking later endogenous attention processes. Development of these processes is not uniform for probes with different acoustic properties within our acoustically dense speech-based dichotic listening task. In light of the developmental differences we demonstrate, researchers conducting future attention studies of children and adolescents should be wary of combining analyses across diverse ages. PMID:26002721

  17. Auditory attention in childhood and adolescence: An event-related potential study of spatial selective attention to one of two simultaneous stories.

    Science.gov (United States)

    Karns, Christina M; Isbell, Elif; Giuliano, Ryan J; Neville, Helen J

    2015-06-01

    Auditory selective attention is a critical skill for goal-directed behavior, especially where noisy distractions may impede focusing attention. To better understand the developmental trajectory of auditory spatial selective attention in an acoustically complex environment, in the current study we measured auditory event-related potentials (ERPs) across five age groups: 3-5 years; 10 years; 13 years; 16 years; and young adults. Using a naturalistic dichotic listening paradigm, we characterized the ERP morphology for nonlinguistic and linguistic auditory probes embedded in attended and unattended stories. We documented robust maturational changes in auditory evoked potentials that were specific to the types of probes. Furthermore, we found a remarkable interplay between age and attention-modulation of auditory evoked potentials in terms of morphology and latency from the early years of childhood through young adulthood. The results are consistent with the view that attention can operate across age groups by modulating the amplitude of maturing auditory early-latency evoked potentials or by invoking later endogenous attention processes. Development of these processes is not uniform for probes with different acoustic properties within our acoustically dense speech-based dichotic listening task. In light of the developmental differences we demonstrate, researchers conducting future attention studies of children and adolescents should be wary of combining analyses across diverse ages. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Electrophysiological evidence for altered visual, but not auditory, selective attention in adolescent cochlear implant users.

    Science.gov (United States)

    Harris, Jill; Kamke, Marc R

    2014-11-01

    Selective attention fundamentally alters sensory perception, but little is known about the functioning of attention in individuals who use a cochlear implant. This study aimed to investigate visual and auditory attention in adolescent cochlear implant users. Event related potentials were used to investigate the influence of attention on visual and auditory evoked potentials in six cochlear implant users and age-matched normally-hearing children. Participants were presented with streams of alternating visual and auditory stimuli in an oddball paradigm: each modality contained frequently presented 'standard' and infrequent 'deviant' stimuli. Across different blocks attention was directed to either the visual or auditory modality. For the visual stimuli attention boosted the early N1 potential, but this effect was larger for cochlear implant users. Attention was also associated with a later P3 component for the visual deviant stimulus, but there was no difference between groups in the later attention effects. For the auditory stimuli, attention was associated with a decrease in N1 latency as well as a robust P3 for the deviant tone. Importantly, there was no difference between groups in these auditory attention effects. The results suggest that basic mechanisms of auditory attention are largely normal in children who are proficient cochlear implant users, but that visual attention may be altered. Ultimately, a better understanding of how selective attention influences sensory perception in cochlear implant users will be important for optimising habilitation strategies. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. Cue-Elicited Increases in Incentive Salience for Marijuana: Craving, Demand, and Attentional Bias

    Science.gov (United States)

    Metrik, Jane; Aston, Elizabeth R.; Kahler, Christopher W.; Rohsenow, Damaris J.; McGeary, John E.; Knopik, Valerie S.; MacKillop, James

    2016-01-01

    Background Incentive salience is a multidimensional construct that includes craving, drug value relative to other reinforcers, and implicit motivation such as attentional bias to drug cues. Laboratory cue reactivity (CR) paradigms have been used to evaluate marijuana incentive salience with measures of craving, but not with behavioral economic measures of marijuana demand or implicit attentional processing tasks. Methods This within-subjects study used a new CR paradigm to examine multiple dimensions of marijuana’s incentive salience and to compare CR-induced increases in craving and demand. Frequent marijuana users (N=93, 34% female) underwent exposure to neutral cues then to lit marijuana cigarettes. Craving, marijuana demand via a marijuana purchase task, and heart rate were assessed after each cue set. A modified Stroop task with cannabis and control words was completed after the marijuana cues as a measure of attentional bias. Results Relative to neutral cues, marijuana cues significantly increased subjective craving and demand indices of intensity (i.e., drug consumed at $0) and Omax (i.e., peak drug expenditure). Elasticity significantly decreased following marijuana cues, reflecting sustained purchase despite price increases. Craving was correlated with demand indices (r’s: 0.23–0.30). Marijuana users displayed significant attentional bias for cannabis-related words after marijuana cues. Cue-elicited increases in intensity were associated with greater attentional bias for marijuana words. Conclusions Greater incentive salience indexed by subjective, behavioral economic, and implicit measures was observed after marijuana versus neutral cues, supporting multidimensional assessment. The study highlights the utility of a behavioral economic approach in detecting cue-elicited changes in marijuana incentive salience. PMID:27515723
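The demand indices defined above (intensity as drug consumed at $0, Omax as peak drug expenditure) can be computed directly from purchase-task responses. A minimal sketch under invented data; elasticity, which requires fitting a demand curve, is omitted:

```python
# Hypothetical purchase-task data: prices per unit and reported
# consumption at each price. Values are invented for illustration.

def demand_indices(prices, consumption):
    """Intensity = consumption at zero price; Omax = peak expenditure."""
    expenditure = [p * c for p, c in zip(prices, consumption)]
    intensity = consumption[prices.index(0)]  # units purchased at $0
    omax = max(expenditure)                   # peak spend across prices
    return intensity, omax

prices = [0, 1, 2, 5, 10]
consumption = [10, 8, 7, 4, 1]
print(demand_indices(prices, consumption))  # (10, 20): peak spend is $5 x 4 units
```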

  20. Enhancing Auditory Selective Attention Using a Visually Guided Hearing Aid

    Science.gov (United States)

    2017-01-01

    Purpose Listeners with hearing loss, as well as many listeners with clinically normal hearing, often experience great difficulty segregating talkers in a multiple-talker sound field and selectively attending to the desired “target” talker while ignoring the speech from unwanted “masker” talkers and other sources of sound. This listening situation forms the classic “cocktail party problem” described by Cherry (1953) that has received a great deal of study over the past few decades. In this article, a new approach to improving sound source segregation and enhancing auditory selective attention is described. The conceptual design, current implementation, and results obtained to date are reviewed and discussed in this article. Method This approach, embodied in a prototype “visually guided hearing aid” (VGHA) currently used for research, employs acoustic beamforming steered by eye gaze as a means for improving the ability of listeners to segregate and attend to one sound source in the presence of competing sound sources. Results The results from several studies demonstrate that listeners with normal hearing are able to use an attention-based “spatial filter” operating primarily on binaural cues to selectively attend to one source among competing spatially distributed sources. Furthermore, listeners with sensorineural hearing loss generally are less able to use this spatial filter as effectively as are listeners with normal hearing especially in conditions high in “informational masking.” The VGHA enhances auditory spatial attention for speech-on-speech masking and improves signal-to-noise ratio for conditions high in “energetic masking.” Visual steering of the beamformer supports the coordinated actions of vision and audition in selective attention and facilitates following sound source transitions in complex listening situations. Conclusions Both listeners with normal hearing and with sensorineural hearing loss may benefit from the acoustic

  1. Grasping the other's attention: The role of animacy in action cueing of joint attention

    NARCIS (Netherlands)

    Lindemann, O.; Nuku, P.; Rüschemeyer, S.A.; Bekkering, H.

    2011-01-01

    The current experiment investigates the role of animacy on grasp-cueing effects as investigated in joint attention research. In a simple detection task participants responded to the colour change of one of two objects of identical size. Before the target onset, we presented a cueing stimulus

  2. Competition between auditory and visual spatial cues during visual task performance

    NARCIS (Netherlands)

    Koelewijn, T.; Bronkhorst, A.; Theeuwes, J.

    2009-01-01

    There is debate in the crossmodal cueing literature as to whether capture of visual attention by means of sound is a fully automatic process. Recent studies show that when visual attention is endogenously focused sound still captures attention. The current study investigated whether there is

  3. Action experience changes attention to kinematic cues

    Directory of Open Access Journals (Sweden)

    Courtney eFilippi

    2016-02-01

    Full Text Available The current study used remote corneal reflection eye-tracking to examine the relationship between motor experience and action anticipation in 13-month-old infants. To measure online anticipation of actions, infants watched videos where the actor’s hand provided kinematic information (in its orientation) about the type of object that the actor was going to reach for. The actor’s hand orientation either matched the orientation of a rod (congruent cue) or did not match the orientation of the rod (incongruent cue). To examine relations between motor experience and action anticipation, we used a 2 (reach first vs. observe first) x 2 (congruent kinematic cue vs. incongruent kinematic cue) between-subjects design. We show that 13-month-old infants in the observe first condition spontaneously generate rapid online visual predictions to congruent hand orientation cues and do not visually anticipate when presented incongruent cues. We further demonstrate that the speed with which these infants generate predictions to congruent motor cues is correlated with their own ability to pre-shape their hands. Finally, we demonstrate that following reaching experience, infants generate rapid predictions to both congruent and incongruent hand shape cues—suggesting that short-term experience changes attention to kinematics.

  4. Attention, awareness, and the perception of auditory scenes

    Directory of Open Access Journals (Sweden)

    Joel S Snyder

    2012-02-01

    Full Text Available Auditory perception and cognition entail both low-level and high-level processes, which are likely to interact with each other to create our rich conscious experience of soundscapes. Recent research that we review has revealed numerous influences of high-level factors, such as attention, intention, and prior experience, on conscious auditory perception. Recently, studies have shown that auditory scene analysis tasks can exhibit multistability in a manner very similar to ambiguous visual stimuli, presenting a unique opportunity to study neural correlates of auditory awareness and the extent to which mechanisms of perception are shared across sensory modalities. Research has also led to a growing number of techniques through which auditory perception can be manipulated and even completely suppressed. Such findings have important consequences for our understanding of the mechanisms of perception and should also allow scientists to precisely distinguish the contributions of different higher-level influences.

  5. Investigating Hemispheric Lateralization of Reflexive Attention to Gaze and Arrow Cues

    Science.gov (United States)

    Marotta, Andrea; Lupianez, Juan; Casagrande, Maria

    2012-01-01

    Recent studies have demonstrated that central cues, such as eyes and arrows, reflexively trigger attentional shifts. However, it is not clear whether the attentional mechanisms induced by these two cues are similar or rather differ in some important way. We investigated hemispheric lateralization of the orienting effects induced by the two cue…

  6. Identification of Auditory Object-Specific Attention from Single-Trial Electroencephalogram Signals via Entropy Measures and Machine Learning

    Directory of Open Access Journals (Sweden)

    Yun Lu

    2018-05-01

    Full Text Available Existing research has revealed that auditory attention can be tracked from ongoing electroencephalography (EEG) signals. The aim of this novel study was to investigate the identification of people’s attention to a specific auditory object from single-trial EEG signals via entropy measures and machine learning. Approximate entropy (ApEn), sample entropy (SampEn), composite multiscale entropy (CmpMSE) and fuzzy entropy (FuzzyEn) were used to extract the informative features of EEG signals under three kinds of auditory object-specific attention (Rest, Auditory Object1 Attention (AOA1) and Auditory Object2 Attention (AOA2)). Linear discriminant analysis and the support vector machine (SVM) were used to construct two auditory attention classifiers. The statistical results of entropy measures indicated that there were significant differences in the values of ApEn, SampEn, CmpMSE and FuzzyEn between Rest, AOA1 and AOA2. For the SVM-based auditory attention classifier, the auditory object-specific attention of Rest, AOA1 and AOA2 could be identified from EEG signals using ApEn, SampEn, CmpMSE and FuzzyEn as features, and the identification rates were significantly different from chance level. The optimal identification was achieved by the SVM-based auditory attention classifier using CmpMSE with the scale factor τ = 10. This study demonstrated a novel solution to identify the auditory object-specific attention from single-trial EEG signals without the need to access the auditory stimulus.
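Of the entropy features named above, sample entropy is the simplest to illustrate. The sketch below is a plain-Python rendering of the standard SampEn definition applied to toy signals; the parameters (m, tolerance fraction) and signals are illustrative assumptions and do not reproduce the study's pipeline (CmpMSE features plus SVM classification):

```python
import math
import random

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of a 1-D signal: -ln(A/B), where B counts pairs of
    matching templates of length m and A of length m+1 (Chebyshev distance
    within tolerance r, self-matches excluded). Lower values indicate a
    more regular, predictable signal."""
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    r = r_frac * sd  # tolerance as a fraction of the signal's SD

    def matching_pairs(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b, a = matching_pairs(m), matching_pairs(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A regular signal should score lower (more predictable) than noise.
regular = [math.sin(0.4 * i) for i in range(200)]
random.seed(0)
noisy = [random.random() for _ in range(200)]
print(sample_entropy(regular) < sample_entropy(noisy))  # True
```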

  7. Attention-driven auditory cortex short-term plasticity helps segregate relevant sounds from noise.

    Science.gov (United States)

    Ahveninen, Jyrki; Hämäläinen, Matti; Jääskeläinen, Iiro P; Ahlfors, Seppo P; Huang, Samantha; Lin, Fa-Hsuan; Raij, Tommi; Sams, Mikko; Vasios, Christos E; Belliveau, John W

    2011-03-08

    How can we concentrate on relevant sounds in noisy environments? A "gain model" suggests that auditory attention simply amplifies relevant and suppresses irrelevant afferent inputs. However, it is unclear whether this suffices when attended and ignored features overlap to stimulate the same neuronal receptive fields. A "tuning model" suggests that, in addition to gain, attention modulates feature selectivity of auditory neurons. We recorded magnetoencephalography, EEG, and functional MRI (fMRI) while subjects attended to tones delivered to one ear and ignored opposite-ear inputs. The attended ear was switched every 30 s to quantify how quickly the effects evolve. To produce overlapping inputs, the tones were presented alone vs. during white-noise masking notch-filtered ±1/6 octaves around the tone center frequencies. Amplitude modulation (39 vs. 41 Hz in opposite ears) was applied for "frequency tagging" of attention effects on maskers. Noise masking reduced early (50-150 ms; N1) auditory responses to unattended tones. In support of the tuning model, selective attention canceled out this attenuating effect but did not modulate the gain of 50-150 ms activity to nonmasked tones or steady-state responses to the maskers themselves. These tuning effects originated at nonprimary auditory cortices, purportedly occupied by neurons that, without attention, have wider frequency tuning than ±1/6 octaves. The attentional tuning evolved rapidly, during the first few seconds after attention switching, and correlated with behavioral discrimination performance. In conclusion, a simple gain model alone cannot explain auditory selective attention. In nonprimary auditory cortices, attention-driven short-term plasticity retunes neurons to segregate relevant sounds from noise.
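The "frequency tagging" manipulation described above amplitude-modulates each ear's masker at a distinct rate (39 vs. 41 Hz) so that the steady-state response to each input can be tracked separately in the recordings. A minimal sketch of generating such a stimulus; the sampling rate, carrier frequency, and duration are illustrative assumptions, not the study's stimulus parameters:

```python
import math

def am_tone(carrier_hz, mod_hz, dur_s, fs=8000, depth=1.0):
    """Sinusoidal carrier with a 0..1 amplitude envelope oscillating at
    mod_hz, i.e. a 'frequency-tagged' tone."""
    return [
        (1 + depth * math.sin(2 * math.pi * mod_hz * t / fs)) / 2
        * math.sin(2 * math.pi * carrier_hz * t / fs)
        for t in range(int(dur_s * fs))
    ]

left = am_tone(1000, 39, 0.5)   # left-ear masker tagged at 39 Hz
right = am_tone(1000, 41, 0.5)  # right-ear masker tagged at 41 Hz
```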

  8. Music training relates to the development of neural mechanisms of selective auditory attention.

    Science.gov (United States)

    Strait, Dana L; Slater, Jessica; O'Connell, Samantha; Kraus, Nina

    2015-04-01

    Selective attention decreases trial-to-trial variability in cortical auditory-evoked activity. This effect increases over the course of maturation, potentially reflecting the gradual development of selective attention and inhibitory control. Work in adults indicates that music training may alter the development of this neural response characteristic, especially over brain regions associated with executive control: in adult musicians, attention decreases variability in auditory-evoked responses recorded over prefrontal cortex to a greater extent than in nonmusicians. We aimed to determine whether this musician-associated effect emerges during childhood, when selective attention and inhibitory control are under development. We compared cortical auditory-evoked variability to attended and ignored speech streams in musicians and nonmusicians across three age groups: preschoolers, school-aged children and young adults. Results reveal that childhood music training is associated with reduced auditory-evoked response variability recorded over prefrontal cortex during selective auditory attention in school-aged child and adult musicians. Preschoolers, on the other hand, demonstrate no impact of selective attention on cortical response variability and no musician distinctions. This finding is consistent with the gradual emergence of attention during this period and may suggest no pre-existing differences in this attention-related cortical metric between children who undergo music training and those who do not. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Auditory event-related responses to diphthongs in different attention conditions

    DEFF Research Database (Denmark)

    Morris, David Jackson; Steinmetzger, Kurt; Tøndering, John

    2016-01-01

    The modulation of auditory event-related potentials (ERP) by attention generally results in larger amplitudes when stimuli are attended. We measured the P1-N1-P2 acoustic change complex elicited with synthetic overt (second formant, F2 = 1000 Hz) and subtle (F2 = 100 Hz) diphthongs, while subjects (i) attended to the auditory stimuli, (ii) ignored the auditory stimuli and watched a film, and (iii) diverted their attention to a visual discrimination task. Responses elicited by diphthongs where F2 values rose and fell were found to be different, and this precluded their combined analysis. Multivariate analysis of ERP components from the rising F2 changes showed main effects of attention on P2 amplitude and latency, and N1-P2 amplitude. P2 amplitude decreased by 40% between the attend and ignore conditions, and by 60% between the attend and divert conditions. The effect of diphthong magnitude

  10. Auditory measures of selective and divided attention in young and older adults using single-talker competition.

    Science.gov (United States)

    Humes, Larry E; Lee, Jae Hee; Coughlin, Maureen P

    2006-11-01

    In this study, two experiments were conducted on auditory selective and divided attention in which the listening task involved the identification of words in sentences spoken by one talker while a second talker produced a very similar competing sentence. Ten young normal-hearing (YNH) and 13 elderly hearing-impaired (EHI) listeners participated in each experiment. The type of attention cue used was the main difference between experiments. Across both experiments, several consistent trends were observed. First, in eight of the nine divided-attention tasks across both experiments, the EHI subjects performed significantly worse than the YNH subjects. By comparison, significant differences in performance between age groups were only observed on three of the nine selective-attention tasks. Finally, there were consistent individual differences in performance across both experiments. Correlational analyses performed on the data from the 13 older adults suggested that the individual differences in performance were associated with individual differences in memory (digit span). Among the elderly, differences in age or differences in hearing loss did not contribute to the individual differences observed in either experiment.

  11. The effect of self-control on attentional bias for alcohol cues in male heavy drinkers.

    NARCIS (Netherlands)

    Teunissen, H.A.; Spijkerman, R.; Schoenmakers, T.M.; Vohs, K.D.; Engels, R.C.M.

    2012-01-01

    Attentional bias for alcohol cues increases craving and subsequent alcohol consumption. Override processes can be used to disengage attention from alcohol cues. This requires self-control and implies that depletion of self-control would impair the ability to disengage attention from alcohol cues.

  12. The Effect of Auditory Cueing on the Spatial and Temporal Gait Coordination in Healthy Adults.

    Science.gov (United States)

    Almarwani, Maha; Van Swearingen, Jessie M; Perera, Subashan; Sparto, Patrick J; Brach, Jennifer S

    2017-12-27

    Walk ratio, defined as step length divided by cadence, indicates the coordination of gait. During free walking, deviation from the preferential walk ratio may reveal abnormalities of walking patterns. The purpose of this study was to examine the impact of rhythmic auditory cueing (metronome) on the neuromotor control of gait at different walking speeds. Forty adults (mean age 26.6 ± 6.0 years) participated in the study. Gait characteristics were collected using a computerized walkway. At the preferred walking speed, there was no significant difference in walk ratio between uncued (walk ratio = .0064 ± .0007 m/steps/min) and metronome-cued walking (walk ratio = .0064 ± .0007 m/steps/min; p = .791). A higher walk ratio at the slower speed was observed with metronome-cued (walk ratio = .0071 ± .0008 m/steps/min) compared to uncued walking (walk ratio = .0068 ± .0007 m/steps/min), and a lower walk ratio at the faster speed was observed with metronome-cued (walk ratio = .0060 ± .0009 m/steps/min) compared to uncued walking (walk ratio = .0062 ± .0009 m/steps/min; p = .005). In healthy adults, metronome cues may become an attentionally demanding task and thereby disrupt the spatial and temporal integration of gait at nonpreferred speeds.
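
The walk ratio above is simple arithmetic (step length in meters divided by cadence in steps/min); a minimal sketch for checking the reported values, with function and variable names of my own:

```python
def walk_ratio(step_length_m: float, cadence_steps_per_min: float) -> float:
    """Walk ratio = step length (m) / cadence (steps/min)."""
    return step_length_m / cadence_steps_per_min

# A step length of 0.70 m at a cadence of 110 steps/min gives a walk ratio
# of about .0064 m/steps/min, matching the preferred-speed values above.
print(round(walk_ratio(0.70, 110.0), 4))
```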

  13. Attention-related modulation of auditory brainstem responses during contralateral noise exposure.

    Science.gov (United States)

    Ikeda, Kazunari; Sekiguchi, Takahiro; Hayashi, Akiko

    2008-10-29

    As determinants facilitating attention-related modulation of the auditory brainstem response (ABR), two experimental factors were examined: (i) auditory discrimination; and (ii) contralateral masking intensity. Tone pips at 80 dB sound pressure level were presented to the left ear via either single-tone exposures or oddball exposures, whereas white noise was delivered continuously to the right ear at variable intensities (none to 80 dB sound pressure level). Each participant performed two tasks during stimulation: either reading a book (ignoring task) or detecting target tones (attentive task). Task-related modulation within the ABR range was found only during oddball exposures at contralateral masking intensities greater than or equal to 60 dB. Attention-related modulation of ABR can thus be detected reliably during auditory discrimination under contralateral masking of sufficient intensity.

  14. Selective and divided attention modulates auditory-vocal integration in the processing of pitch feedback errors.

    Science.gov (United States)

    Liu, Ying; Hu, Huijing; Jones, Jeffery A; Guo, Zhiqiang; Li, Weifeng; Chen, Xi; Liu, Peng; Liu, Hanjun

    2015-08-01

    Speakers rapidly adjust their ongoing vocal productions to compensate for errors they hear in their auditory feedback. It is currently unclear what role attention plays in these vocal compensations. This event-related potential (ERP) study examined the influence of selective and divided attention on the vocal and cortical responses to pitch errors heard in auditory feedback regarding ongoing vocalisations. During the production of a sustained vowel, participants briefly heard their vocal pitch shifted up two semitones while they actively attended to auditory or visual events (selective attention), or both auditory and visual events (divided attention), or were not told to attend to either modality (control condition). The behavioral results showed that attending to the pitch perturbations elicited larger vocal compensations than attending to the visual stimuli. Moreover, ERPs were likewise sensitive to the attentional manipulations: P2 responses to pitch perturbations were larger when participants attended to the auditory stimuli compared to when they attended to the visual stimuli, and compared to when they were not explicitly told to attend to either the visual or auditory stimuli. By contrast, dividing attention between the auditory and visual modalities caused suppressed P2 responses relative to all the other conditions and caused enhanced N1 responses relative to the control condition. These findings provide strong evidence for the influence of attention on the mechanisms underlying the auditory-vocal integration in the processing of pitch feedback errors. In addition, selective attention and divided attention appear to modulate the neurobehavioral processing of pitch feedback errors in different ways. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  15. Gender-specific effects of prenatal and adolescent exposure to tobacco smoke on auditory and visual attention.

    Science.gov (United States)

    Jacobsen, Leslie K; Slotkin, Theodore A; Mencl, W Einar; Frost, Stephen J; Pugh, Kenneth R

    2007-12-01

    Prenatal exposure to active maternal tobacco smoking elevates risk of cognitive and auditory processing deficits, and of smoking in offspring. Recent preclinical work has demonstrated a sex-specific pattern of reduction in cortical cholinergic markers following prenatal, adolescent, or combined prenatal and adolescent exposure to nicotine, the primary psychoactive component of tobacco smoke. Given the importance of cortical cholinergic neurotransmission to attentional function, we examined auditory and visual selective and divided attention in 181 male and female adolescent smokers and nonsmokers with and without prenatal exposure to maternal smoking. Groups did not differ in age, educational attainment, symptoms of inattention, or years of parent education. A subset of 63 subjects also underwent functional magnetic resonance imaging while performing an auditory and visual selective and divided attention task. Among females, exposure to tobacco smoke during prenatal or adolescent development was associated with reductions in auditory and visual attention performance accuracy that were greatest in female smokers with prenatal exposure (combined exposure). Among males, combined exposure was associated with marked deficits in auditory attention, suggesting greater vulnerability of neurocircuitry supporting auditory attention to insult stemming from developmental exposure to tobacco smoke in males. Activation of brain regions that support auditory attention was greater in adolescents with prenatal or adolescent exposure to tobacco smoke relative to adolescents with neither prenatal nor adolescent exposure to tobacco smoke. These findings extend earlier preclinical work and suggest that, in humans, prenatal and adolescent exposure to nicotine exerts gender-specific deleterious effects on auditory and visual attention, with concomitant alterations in the efficiency of neurocircuitry supporting auditory attention.

  16. Focal Suppression of Distractor Sounds by Selective Attention in Auditory Cortex.

    Science.gov (United States)

    Schwartz, Zachary P; David, Stephen V

    2018-01-01

    Auditory selective attention is required for parsing crowded acoustic environments, but cortical systems mediating the influence of behavioral state on auditory perception are not well characterized. Previous neurophysiological studies suggest that attention produces a general enhancement of neural responses to important target sounds versus irrelevant distractors. However, behavioral studies suggest that in the presence of masking noise, attention provides a focal suppression of distractors that compete with targets. Here, we compared effects of attention on cortical responses to masking versus non-masking distractors, controlling for effects of listening effort and general task engagement. We recorded single-unit activity from primary auditory cortex (A1) of ferrets during behavior and found that selective attention decreased responses to distractors masking targets in the same spectral band, compared with spectrally distinct distractors. This suppression enhanced neural target detection thresholds, suggesting that limited attention resources serve to focally suppress responses to distractors that interfere with target detection. Changing effort by manipulating target salience consistently modulated spontaneous but not evoked activity. Task engagement and changing effort tended to affect the same neurons, while attention affected an independent population, suggesting that distinct feedback circuits mediate effects of attention and effort in A1. © The Author 2017. Published by Oxford University Press.

  17. Dynamic crossmodal links revealed by steady-state responses in auditory-visual divided attention.

    Science.gov (United States)

    de Jong, Ritske; Toffanin, Paolo; Harbers, Marten

    2010-01-01

    Frequency tagging has often been used to study intramodal attention but not intermodal attention. We used EEG and simultaneous frequency tagging of auditory and visual sources to study intermodal focused and divided attention in detection and discrimination performance. Divided-attention costs were smaller, but still significant, in detection than in discrimination. The auditory steady-state response (SSR) showed no effects of attention at frontocentral locations, but did so at occipital locations, where it was evident only when attention was divided between audition and vision. Similarly, the visual SSR at occipital locations was substantially enhanced when attention was divided across modalities. Both effects were equally present in detection and discrimination. We suggest that both effects reflect a common cause: an attention-dependent influence of auditory information processing on early cortical stages of visual information processing, mediated by enhanced effective connectivity between the two modalities under conditions of divided attention. Copyright (c) 2009 Elsevier B.V. All rights reserved.
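
The frequency-tagging readout described above can be sketched in a few lines: the SSR amplitude at a tagged frequency is the magnitude of the Fourier component at that frequency. This is an illustrative sketch, not the authors' analysis pipeline; the function name and scaling convention are my own, and it assumes the tag frequency falls on an exact FFT bin (an integer number of cycles per epoch).

```python
import numpy as np

def ssr_amplitude(eeg: np.ndarray, fs: float, tag_hz: float) -> float:
    """Amplitude of the steady-state response at the tagged frequency."""
    n = eeg.shape[-1]
    spectrum = np.fft.rfft(eeg) / (n / 2)   # scale so a unit sinusoid -> 1
    bin_idx = int(round(tag_hz * n / fs))   # tag assumed to sit on an exact bin
    return float(np.abs(spectrum[..., bin_idx]))
```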

  18. Auditory and visual capture during focused visual attention

    OpenAIRE

    Koelewijn, T.; Bronkhorst, A.W.; Theeuwes, J.

    2009-01-01

    It is well known that auditory and visual onsets presented at a particular location can capture a person's visual attention. However, the question of whether such attentional capture disappears when attention is focused endogenously beforehand has not yet been answered. Moreover, previous studies have not differentiated between capture by onsets presented at a nontarget (invalid) location and possible performance benefits occurring when the target location is (validly) cued. In this study, th...

  19. Do the affective properties of smoking-related cues influence attentional and approach biases in cigarette smokers?

    Science.gov (United States)

    Bradley, B P; Field, M; Healy, H; Mogg, K

    2008-09-01

    Research indicates that drug-related cues elicit attention and approach biases in drug users. However, attentional biases are not unique to addiction (e.g., they are also found for emotional information). This study examined whether attentional and approach biases in cigarette smokers are mediated by the motivational salience of cues (relevance to drug-taking), rather than by their affective properties (subjective liking of the cues). Cues included pleasant and unpleasant smoking-related pictures. Attentional biases, approach tendencies and subjective evaluation of the cues were assessed on visual probe, stimulus-response compatibility and rating tasks, respectively. Compared with non-smokers, smokers showed a greater attentional bias for both pleasant and unpleasant smoking-related cues presented for 2000 ms, but not for 200 ms. Smokers showed a greater approach bias for unpleasant cues, although the groups did not differ significantly in approach bias for pleasant smoking-related cues. Smokers rated both pleasant and unpleasant smoking pictures more positively than did non-smokers. Results suggest that a bias to maintain attention on smoking-related cues in young adult smokers is primarily a function of drug-relevance, rather than affective properties, of the cues. In contrast, approach tendencies and pleasantness judgements were influenced by drug use, drug-relevance and the affective properties of the cues.

  20. Great cormorants (Phalacrocorax carbo) can detect auditory cues while diving

    Science.gov (United States)

    Hansen, Kirstin Anderson; Maxwell, Alyssa; Siebert, Ursula; Larsen, Ole Næsbye; Wahlberg, Magnus

    2017-06-01

    In-air hearing in birds has been thoroughly investigated. Sound provides birds with auditory information for species and individual recognition from their complex vocalizations, as well as cues while foraging and for avoiding predators. Some 10% of existing species of birds obtain their food under the water surface. Whether some of these birds make use of acoustic cues while underwater is unknown. An interesting species in this respect is the great cormorant (Phalacrocorax carbo), being one of the most effective marine predators and relying on the aquatic environment for food year round. Here, its underwater hearing abilities were investigated using psychophysics, where the bird learned to detect the presence or absence of a tone while submerged. The greatest sensitivity was found at 2 kHz, with an underwater hearing threshold of 71 dB re 1 μPa rms. The great cormorant is better at hearing underwater than expected, and the hearing thresholds are comparable to seals and toothed whales in the frequency band 1-4 kHz. This opens up the possibility of cormorants and other aquatic birds having special adaptations for underwater hearing and making use of underwater acoustic cues from, e.g., conspecifics, their surroundings, as well as prey and predators.

  1. Neural effects of cognitive control load on auditory selective attention.

    Science.gov (United States)

    Sabri, Merav; Humphries, Colin; Verber, Matthew; Liebenthal, Einat; Binder, Jeffrey R; Mangalathu, Jain; Desai, Anjali

    2014-08-01

    Whether and how working memory disrupts or alters auditory selective attention is unclear. We compared simultaneous event-related potentials (ERP) and functional magnetic resonance imaging (fMRI) responses associated with task-irrelevant sounds across high and low working memory load in a dichotic-listening paradigm. Participants performed n-back tasks (1-back, 2-back) in one ear (Attend ear) while ignoring task-irrelevant speech sounds in the other ear (Ignore ear). The effects of working memory load on selective attention were observed at 130-210ms, with higher load resulting in greater irrelevant syllable-related activation in localizer-defined regions in auditory cortex. The interaction between memory load and presence of irrelevant information revealed stronger activations primarily in frontal and parietal areas due to presence of irrelevant information in the higher memory load. Joint independent component analysis of ERP and fMRI data revealed that the ERP component in the N1 time-range is associated with activity in superior temporal gyrus and medial prefrontal cortex. These results demonstrate a dynamic relationship between working memory load and auditory selective attention, in agreement with the load model of attention and the idea of common neural resources for memory and attention. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Attentional bias in smokers: exposure to dynamic smoking cues in contemporary movies

    NARCIS (Netherlands)

    Lochbühler, K.C.; Voogd, H.F.J.M.; Scholte, R.H.J.; Engels, R.C.M.E.

    2011-01-01

    Research has shown that smokers have an attentional bias for pictorial smoking cues. The objective of the present study was to examine whether smokers also have an attentional bias for dynamic smoking cues in contemporary movies and therefore fixate more quickly, more often and for longer periods of

  3. EEG phase reset due to auditory attention: an inverse time-scale approach

    International Nuclear Information System (INIS)

    Low, Yin Fen; Strauss, Daniel J

    2009-01-01

    We propose a novel tool to evaluate the electroencephalograph (EEG) phase reset due to auditory attention by utilizing an inverse analysis of the instantaneous phase for the first time. EEGs were acquired through auditory attention experiments with a maximum entropy stimulation paradigm. We examined single sweeps of the auditory late response (ALR) with the complex continuous wavelet transform. The phase in the frequency band that is associated with auditory attention (6–10 Hz, termed the theta–alpha border) was reset to the mean phase of the averaged EEGs. The inverse transform was applied to reconstruct the phase-modified signal. We found significant enhancement of the N100 wave in the reconstructed signal. Analysis of the phase noise shows the effects of phase jittering on the generation of the N100 wave, implying that a preferred phase is necessary to generate the event-related potential (ERP). Power spectrum analysis shows a remarkable increase of evoked power but little change of total power after stabilizing the phase of the EEGs. Furthermore, resetting the phase only at the theta border of no-attention data to the mean phase of attention data yields a result that resembles the attention data. These results show strong connections between EEGs and ERPs; in particular, we suggest that the presentation of an auditory stimulus triggers the phase-reset process at the theta–alpha border, which leads to the emergence of the N100 wave. It is concluded that our study reinforces other studies on the importance of the EEG in ERP genesis.

  4. EEG phase reset due to auditory attention: an inverse time-scale approach.

    Science.gov (United States)

    Low, Yin Fen; Strauss, Daniel J

    2009-08-01

    We propose a novel tool to evaluate the electroencephalograph (EEG) phase reset due to auditory attention by utilizing an inverse analysis of the instantaneous phase for the first time. EEGs were acquired through auditory attention experiments with a maximum entropy stimulation paradigm. We examined single sweeps of the auditory late response (ALR) with the complex continuous wavelet transform. The phase in the frequency band that is associated with auditory attention (6-10 Hz, termed the theta-alpha border) was reset to the mean phase of the averaged EEGs. The inverse transform was applied to reconstruct the phase-modified signal. We found significant enhancement of the N100 wave in the reconstructed signal. Analysis of the phase noise shows the effects of phase jittering on the generation of the N100 wave, implying that a preferred phase is necessary to generate the event-related potential (ERP). Power spectrum analysis shows a remarkable increase of evoked power but little change of total power after stabilizing the phase of the EEGs. Furthermore, resetting the phase only at the theta border of no-attention data to the mean phase of attention data yields a result that resembles the attention data. These results show strong connections between EEGs and ERPs; in particular, we suggest that the presentation of an auditory stimulus triggers the phase-reset process at the theta-alpha border, which leads to the emergence of the N100 wave. It is concluded that our study reinforces other studies on the importance of the EEG in ERP genesis.
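
The phase-reset manipulation in the records above can be sketched schematically. The sketch substitutes a Butterworth band-pass plus Hilbert analytic signal for the paper's complex continuous wavelet transform, so it is an illustrative variant under that assumption, not the authors' method; all names are my own.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def reset_band_phase(sweeps: np.ndarray, fs: float, band=(6.0, 10.0)) -> np.ndarray:
    """Replace each sweep's instantaneous phase in `band` with the phase of
    the across-sweep average, keeping each sweep's amplitude envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, sweeps, axis=1)  # band-limit each single sweep
    # Target phase: instantaneous phase of the band-limited averaged EEG.
    target = np.angle(hilbert(filtfilt(b, a, sweeps.mean(axis=0))))
    analytic = hilbert(filtered, axis=1)
    return np.abs(analytic) * np.cos(target)   # amplitude kept, phase replaced

# After the reset, sweeps with random phase add coherently, so the average
# (the "ERP") is enhanced, as reported for the N100 above.
```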

  5. Dissociable influences of auditory object vs. spatial attention on visual system oscillatory activity.

    Directory of Open Access Journals (Sweden)

    Jyrki Ahveninen

    Full Text Available Given that both auditory and visual systems have anatomically separate object identification ("what") and spatial ("where") pathways, it is of interest whether attention-driven cross-sensory modulations occur separately within these feature domains. Here, we investigated how auditory "what" vs. "where" attention tasks modulate activity in visual pathways using cortically constrained source estimates of magnetoencephalographic (MEG) oscillatory activity. In the absence of visual stimuli or tasks, subjects were presented with a sequence of auditory-stimulus pairs and instructed to selectively attend to phonetic ("what") vs. spatial ("where") aspects of these sounds, or to listen passively. To investigate sustained modulatory effects, oscillatory power was estimated from time periods between sound-pair presentations. In comparison to attention to sound locations, phonetic auditory attention was associated with stronger alpha (7-13 Hz) power in several visual areas (primary visual cortex; lingual, fusiform, and inferior temporal gyri; lateral occipital cortex), as well as in higher-order visual/multisensory areas including lateral/medial parietal and retrosplenial cortices. Region-of-interest (ROI) analyses of dynamic changes, from which the sustained effects had been removed, suggested further power increases during Attend Phoneme vs. Location centered at the alpha range 400-600 ms after the onset of the second sound of each stimulus pair. These results suggest distinct modulations of visual system oscillatory activity during auditory attention to sound object identity ("what") vs. sound location ("where"). The alpha modulations could be interpreted to reflect enhanced crossmodal inhibition of feature-specific visual pathways and adjacent audiovisual association areas during "what" vs. "where" auditory attention.

  6. Watch out! Magnetoencephalographic evidence for early modulation of attention orienting by fearful gaze cueing.

    Directory of Open Access Journals (Sweden)

    Fanny Lachat

    Full Text Available Others' gaze and emotional facial expression are important cues for the process of attention orienting. Here, we investigated with magnetoencephalography (MEG) whether the combination of averted gaze and fearful expression may elicit a selectively early effect of attention orienting on the brain responses to targets. We used the direction of gaze of centrally presented fearful and happy faces as the spatial attention orienting cue in a Posner-like paradigm where the subjects had to detect a target checkerboard presented at gazed-at (valid trials) or non-gazed-at (invalid trials) locations of the screen. We showed that the combination of averted gaze and fearful expression resulted in a very early attention orienting effect in the form of additional parietal activity between 55 and 70 ms for the valid versus invalid targets following fearful gaze cues. No such effect was obtained for the targets following happy gaze cues. This early cue-target validity effect, selective to fearful gaze cues, involved the left superior parietal region and the left lateral middle occipital region. These findings provide the first evidence for an effect of attention orienting induced by fearful gaze in the time range of C1. In doing so, they demonstrate the selective impact of combined gaze and fearful expression cues in the process of attention orienting.

  7. Extensive Tonotopic Mapping across Auditory Cortex Is Recapitulated by Spectrally Directed Attention and Systematically Related to Cortical Myeloarchitecture.

    Science.gov (United States)

    Dick, Frederic K; Lehet, Matt I; Callaghan, Martina F; Keller, Tim A; Sereno, Martin I; Holt, Lori L

    2017-12-13

    Auditory selective attention is vital in natural soundscapes. But it is unclear how attentional focus on the primary dimension of auditory representation (acoustic frequency) might modulate basic auditory functional topography during active listening. In contrast to visual selective attention, which is supported by motor-mediated optimization of input across saccades and pupil dilation, the primate auditory system has fewer means of differentially sampling the world. This makes spectrally directed endogenous attention a particularly crucial aspect of auditory attention. Using a novel functional paradigm combined with quantitative MRI, we establish in male and female listeners that human frequency-band-selective attention drives activation in both myeloarchitectonically estimated auditory core, and across the majority of tonotopically mapped nonprimary auditory cortex. The attentionally driven best-frequency maps show strong concordance with sensory-driven maps in the same subjects across much of the temporal plane, with poor concordance in areas outside traditional auditory cortex. There is significantly greater activation across most of auditory cortex when best frequency is attended, versus ignored; the same regions do not show this enhancement when attending to the least-preferred frequency band. Finally, the results demonstrate that there is spatial correspondence between the degree of myelination and the strength of the tonotopic signal across a number of regions in auditory cortex. Strong frequency preferences across tonotopically mapped auditory cortex spatially correlate with R1-estimated myeloarchitecture, indicating shared functional and anatomical organization that may underlie intrinsic auditory regionalization. SIGNIFICANCE STATEMENT Perception is an active process, especially sensitive to attentional state. Listeners direct auditory attention to track a violin's melody within an ensemble performance, or to follow a voice in a crowded cafe. Although

  8. Neuronal effects of nicotine during auditory selective attention.

    Science.gov (United States)

    Smucny, Jason; Olincy, Ann; Eichman, Lindsay S; Tregellas, Jason R

    2015-06-01

    Although the attention-enhancing effects of nicotine have been behaviorally and neurophysiologically well-documented, its localized functional effects during selective attention are poorly understood. In this study, we examined the neuronal effects of nicotine during auditory selective attention in healthy human nonsmokers. We hypothesized to observe significant effects of nicotine in attention-associated brain areas, driven by nicotine-induced increases in activity as a function of increasing task demands. A single-blind, prospective, randomized crossover design was used to examine neuronal response associated with a go/no-go task after 7 mg nicotine or placebo patch administration in 20 individuals who underwent functional magnetic resonance imaging at 3T. The task design included two levels of difficulty (ordered vs. random stimuli) and two levels of auditory distraction (silence vs. noise). Significant treatment × difficulty × distraction interaction effects on neuronal response were observed in the hippocampus, ventral parietal cortex, and anterior cingulate. In contrast to our hypothesis, U and inverted U-shaped dependencies were observed between the effects of nicotine on response and task demands, depending on the brain area. These results suggest that nicotine may differentially affect neuronal response depending on task conditions. These results have important theoretical implications for understanding how cholinergic tone may influence the neurobiology of selective attention.

  9. Mindful attention reduces neural and self-reported cue-induced craving in smokers

    Science.gov (United States)

    Creswell, John David; Tabibnia, Golnaz; Julson, Erica; Kober, Hedy; Tindle, Hilary A.

    2013-01-01

    An emerging body of research suggests that mindfulness-based interventions may be beneficial for smoking cessation and the treatment of other addictive disorders. One way that mindfulness may facilitate smoking cessation is through the reduction of craving to smoking cues. The present work considers whether mindful attention can reduce self-reported and neural markers of cue-induced craving in treatment seeking smokers. Forty-seven (n = 47) meditation-naïve treatment-seeking smokers (12-h abstinent from smoking) viewed and made ratings of smoking and neutral images while undergoing functional magnetic resonance imaging (fMRI). Participants were trained and instructed to view these images passively or with mindful attention. Results indicated that mindful attention reduced self-reported craving to smoking images, and reduced neural activity in a craving-related region of subgenual anterior cingulate cortex (sgACC). Moreover, a psychophysiological interaction analysis revealed that mindful attention reduced functional connectivity between sgACC and other craving-related regions compared to passively viewing smoking images, suggesting that mindfulness may decouple craving neurocircuitry when viewing smoking cues. These results provide an initial indication that mindful attention may describe a ‘bottom-up’ attention to one’s present moment experience in ways that can help reduce subjective and neural reactivity to smoking cues in smokers. PMID:22114078

  10. Associative cueing of attention through implicit feature-location binding.

    Science.gov (United States)

    Girardi, Giovanna; Nico, Daniele

    2017-09-01

    In order to assess associative learning between two task-irrelevant features in cueing spatial attention, we devised a task in which participants had to make an identity comparison between two sequential visual stimuli. Unbeknownst to them, the location of the second stimulus could be predicted by the colour of the first or by a concurrent sound. Although unnecessary for the identity-matching judgment, the predictive features thus provided an arbitrary association favouring the spatial anticipation of the second stimulus. A significant advantage was found, with faster responses at predicted compared to non-predicted locations. Results clearly demonstrated an associative cueing of attention via a second-order arbitrary feature/location association, but with a substantial discrepancy depending on the sensory modality of the predictive feature. With colour as the predictive feature, significant advantages emerged only after the completion of three blocks of trials. On the contrary, sound affected responses from the first block of trials, and significant advantages were manifest from the beginning of the second. The possible mechanisms underlying the associative cueing of attention in both conditions are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Attention-dependent allocation of auditory processing resources as measured by mismatch negativity.

    Science.gov (United States)

    Dittmann-Balcar, A; Thienel, R; Schall, U

    1999-12-16

    Mismatch negativity (MMN) is a pre-attentive event-related potential measure of echoic memory. However, recent studies suggest attention-related modulation of MMN. This study investigated duration-elicited MMN in healthy subjects (n = 12) who performed a visual discrimination task and, subsequently, an auditory discrimination task across a series of increasing task difficulty. MMN amplitude was found to be maximal at centro-frontal electrode sites without hemispheric differences. Comparison of the two attend conditions (visual vs. auditory) revealed larger MMN amplitudes at Fz in the visual task, without differences across task difficulty. However, significantly smaller MMN in the most demanding auditory condition supports the notion of a limited processing capacity whose resources are modulated by attention in response to task requirements.

  12. Black Cigarette Smokers Report More Attention to Smoking Cues Than White Smokers: Implications for Smoking Cessation.

    Science.gov (United States)

    Robinson, Cendrine D; Pickworth, Wallace B; Heishman, Stephen J; Wetter, David W; Cinciripini, Paul M; Li, Yisheng; Rowell, Brigid; Waters, Andrew J

    2015-08-01

    Black cigarette smokers have lower rates of smoking cessation compared with Whites. However, the mechanisms underlying these differences are not clear. Many Blacks live in communities saturated by tobacco advertisements. These cue-rich environments may undermine cessation attempts by provoking smoking. Moreover, attentional bias to smoking cues (attention capture by smoking cues) has been linked to lower cessation outcomes. Cessation attempts among Blacks may be compromised by attentional bias to smoking cues and a cue-rich environment. Attention to smoking cues in Black and White smokers was examined in 2 studies. In both studies, assessments were completed during 2 laboratory visits: a nonabstinent session and an abstinent session. In study 1, nontreatment-seeking smokers (99 Whites, 104 Blacks) completed the Subjective Attentional Bias Questionnaire (SABQ; a self-report measure of attention to cues) and the Smoking Stroop task (a reaction time measure of attentional bias to smoking cues). In study 2, 110 White and 74 Black treatment-seeking smokers completed these assessments and attempted to quit. In study 1, Blacks reported higher ratings than Whites on the SABQ (p = .005). In study 2, Blacks also reported higher ratings than Whites on the SABQ (p = .003). In study 2, Blacks had lower biochemically verified point prevalence abstinence than Whites, and the between-race difference in outcome was partially mediated by SABQ ratings. Blacks reported greater attention to smoking cues than Whites, possibly due to between-race differences in environments. Greater attention to smoking cues may undermine cessation attempts. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Evoked potential correlates of selective attention with multi-channel auditory inputs

    Science.gov (United States)

    Schwent, V. L.; Hillyard, S. A.

    1975-01-01

    Ten subjects were presented with random, rapid sequences of four auditory tones which were separated in pitch and apparent spatial position. The N1 component of the auditory vertex evoked potential (EP) measured relative to a baseline was observed to increase with attention. It was concluded that the N1 enhancement reflects a finely tuned selective attention to one stimulus channel among several concurrent, competing channels. This EP enhancement probably increases with increased information load on the subject.

  14. Modulatory Effects of Attention on Lateral Inhibition in the Human Auditory Cortex.

    Science.gov (United States)

    Engell, Alva; Junghöfer, Markus; Stein, Alwina; Lau, Pia; Wunderlich, Robert; Wollbrink, Andreas; Pantev, Christo

    2016-01-01

    Reduced neural processing of a tone is observed when it is presented after a sound whose spectral range closely frames the frequency of the tone. This observation might be explained by the mechanism of lateral inhibition (LI) due to inhibitory interneurons in the auditory system. So far, several characteristics of bottom-up influences on LI have been identified, while the influence of top-down processes such as directed attention on LI has not been investigated. Hence, the study at hand aims at investigating the modulatory effects of focused attention on LI in the human auditory cortex. In the magnetoencephalograph, we presented two types of masking sounds (white noise vs. white noise passed through a notch filter centered at a specific frequency), followed by a test tone with a frequency corresponding to the center frequency of the notch filter. Simultaneously, subjects were presented with visual input on a screen. To modulate the focus of attention, subjects were instructed to concentrate either on the auditory input or on the visual stimuli. More specifically, on one half of the trials, subjects were instructed to detect small deviations in loudness in the masking sounds, while on the other half subjects were asked to detect target stimuli on the screen. The results revealed a reduction in neural activation due to LI, which was larger during auditory compared to visual focused attention. Attentional modulations of LI were observed in two post-N1m time intervals. These findings underline the robustness of reduced neural activation due to LI in the auditory cortex and point towards the important role of attention in the modulation of this mechanism at more evaluative processing stages.

  15. Does Contextual Cueing Guide the Deployment of Attention?

    OpenAIRE

    Kunar, Melina A.; Flusberg, Stephen; Horowitz, Todd S.; Wolfe, Jeremy M.

    2007-01-01

    Contextual cueing experiments show that when displays are repeated, reaction times (RTs) to find a target decrease over time even when observers are not aware of the repetition. It has been thought that the context of the display guides attention to the target. We tested this hypothesis by comparing the effects of guidance in a standard search task to the effects of contextual cueing. Firstly, in standard search, an improvement in guidance causes search slopes (derived from RT × Set Size func...

  16. Implicit and explicit selective attention to smoking cues in smokers indexed by brain potentials.

    Science.gov (United States)

    Littel, Marianne; Franken, Ingmar H A

    2011-04-01

    Substance use disorders are characterized by cognitive processing biases, such as automatically detecting and orienting attention towards drug-related stimuli. However, it is unclear how, when and what kind of attention (i.e. implicit, explicit) interacts with the processing of these stimuli. In addition, it is unclear whether smokers are hypersensitive to emotionally significant cues in general or to smoking-related cues in particular. The present event-related potential study aimed to enhance insight in drug-related processing biases by manipulating attention for smoking and other motivationally relevant (emotional) cues in smokers and non-smokers using a visual oddball task. Each of the stimulus categories served as a target (explicit attention; counting) or as a non-target (implicit attention; oddball) category. Compared with non-smokers, smokers' P300 (350-600 ms) was enhanced to smoking pictures under both attentional conditions. P300 amplitude did not differ between groups in response to positive, negative, and neutral cues. It can be concluded from this study that attention manipulation affects the P300 differently in smokers and non-smokers. Smokers display a specific bias to smoking-related cues, and this bias is present during both explicit and implicit attentional processing. Overall, it can be concluded that both explicit and implicit attentional processes appear to play an important role in drug-related processing bias.

  17. Does Contextual Cueing Guide the Deployment of Attention?

    Science.gov (United States)

    Kunar, Melina A.; Flusberg, Stephen; Horowitz, Todd S.; Wolfe, Jeremy M.

    2008-01-01

    Contextual cueing experiments show that when displays are repeated, reaction times (RTs) to find a target decrease over time even when observers are not aware of the repetition. It has been thought that the context of the display guides attention to the target. We tested this hypothesis by comparing the effects of guidance in a standard search task to the effects of contextual cueing. Firstly, in standard search, an improvement in guidance causes search slopes (derived from RT × Set Size functions) to decrease. In contrast, we found that search slopes in contextual cueing did not become more efficient over time (Experiment 1). Secondly, when guidance is optimal (e.g. in easy feature search) we still found a small, but reliable contextual cueing effect (Experiments 2a and 2b), suggesting that other factors, such as response selection, contribute to the effect. Experiment 3 supported this hypothesis by showing that the contextual cueing effect disappeared when we added interference to the response selection process. Overall, our data suggest that the relationship between guidance and contextual cueing is weak and that response selection can account for part of the effect. PMID:17683230
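    The search slopes discussed in this abstract are the per-item cost obtained by fitting a line to mean reaction time as a function of set size. As an illustration only (the function name and values below are hypothetical, not the study's data), a least-squares fit of an RT × Set Size function might be sketched as:

    ```python
    # Illustrative sketch: search efficiency as the slope (ms/item) of the
    # RT x Set Size function, via an ordinary least-squares line fit.
    def search_slope(set_sizes, mean_rts):
        """Return (slope in ms/item, intercept in ms) of a linear RT x Set Size fit."""
        n = len(set_sizes)
        mx = sum(set_sizes) / n
        my = sum(mean_rts) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(set_sizes, mean_rts))
        var = sum((x - mx) ** 2 for x in set_sizes)
        slope = cov / var
        intercept = my - slope * mx
        return slope, intercept

    # Hypothetical mean RTs of 520, 640, 760 ms at set sizes 4, 8, 12
    slope, intercept = search_slope([4, 8, 12], [520, 640, 760])
    print(slope, intercept)  # 30.0 400.0 -> a fairly inefficient search
    ```

    A shallower slope over repeated displays would indicate more efficient guidance; the abstract's point is that contextual cueing reduced overall RTs without flattening this slope.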

  18. Auditory Selective Attention in Cerebral-Palsied Individuals.

    Science.gov (United States)

    Laraway, Lee Ann

    1985-01-01

    To examine differences between the auditory selective attention abilities of normal and cerebral-palsied individuals, 23 cerebral-palsied and 23 normal subjects (aged 5-21) were asked to repeat a series of 30 items in the presence of intermittent white noise. Results indicated that cerebral-palsied individuals perform significantly more poorly when the…

  19. Auditory attention enhances processing of positive and negative words in inferior and superior prefrontal cortex.

    Science.gov (United States)

    Wegrzyn, Martin; Herbert, Cornelia; Ethofer, Thomas; Flaisch, Tobias; Kissler, Johanna

    2017-11-01

    Visually presented emotional words are processed preferentially, and effects of emotional content are similar to those of explicit attention deployment in that both amplify visual processing. However, auditory processing of emotional words is less well characterized, and interactions between emotional content and task-induced attention have not been fully understood. Here, we investigate auditory processing of emotional words, focussing on how auditory attention to positive and negative words impacts their cerebral processing. A functional magnetic resonance imaging (fMRI) study manipulating word valence and attention allocation was performed. Participants heard negative, positive and neutral words to which they either listened passively or attended by counting negative or positive words, respectively. Regardless of valence, active processing compared to passive listening increased activity in primary auditory cortex, left intraparietal sulcus, and right superior frontal gyrus (SFG). The attended valence elicited stronger activity in left inferior frontal gyrus (IFG) and left SFG, in line with these regions' role in semantic retrieval and evaluative processing. No evidence for valence-specific attentional modulation in auditory regions or distinct valence-specific regional activations (i.e., negative > positive or positive > negative) was obtained. Thus, allocation of auditory attention to positive and negative words can substantially increase their processing in higher-order language and evaluative brain areas without modulating early stages of auditory processing. Inferior and superior frontal brain structures mediate interactions between emotional content, attention, and working memory when prosodically neutral speech is processed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Enhanced attentional bias towards sexually explicit cues in individuals with and without compulsive sexual behaviours.

    Directory of Open Access Journals (Sweden)

    Daisy J Mechelmans

    Full Text Available Compulsive sexual behaviour (CSB) is relatively common and has been associated with significant distress and psychosocial impairments. CSB has been conceptualized as either an impulse control disorder or a non-substance 'behavioural' addiction. Substance use disorders are commonly associated with attentional biases to drug cues, which are believed to reflect processes of incentive salience. Here we assessed male CSB subjects compared with age-matched male healthy controls using a dot-probe task to assess attentional bias to sexually explicit cues. We show that, compared with healthy volunteers, CSB subjects have enhanced attentional bias to explicit cues but not neutral cues, particularly at early stimulus latencies. Our findings suggest an enhanced attentional bias to explicit cues, possibly related to an early orienting attentional response. This finding dovetails with our recent observation that sexually explicit videos were associated with greater activity in a neural network similar to that observed in drug-cue-reactivity studies. Greater desire or wanting, rather than liking, was further associated with activity in this neural network. These studies together provide support for an incentive motivation theory of addiction underlying the aberrant response towards sexual cues in CSB.

  1. Effects of incongruent auditory and visual room-related cues on sound externalization

    DEFF Research Database (Denmark)

    Carvajal, Juan Camilo Gil; Santurette, Sébastien; Cubick, Jens

    Sounds presented via headphones are typically perceived inside the head. However, the illusion of a sound source located out in space away from the listener’s head can be generated with binaural headphone-based auralization systems by convolving anechoic sound signals with a binaural room impulse...... response (BRIR) measured with miniature microphones placed in the listener’s ear canals. Sound externalization of such virtual sounds can be very convincing and robust but there have been reports that the illusion might break down when the listening environment differs from the room in which the BRIRs were...... recorded [1,2,3]. This may be due to incongruent auditory cues between the recording and playback room during sound reproduction [2]. Alternatively, an expectation effect caused by the visual impression of the room may affect the position of the perceived auditory image [3]. Here, we systematically...

  2. The effect of offset cues on saccade programming and covert attention.

    Science.gov (United States)

    Smith, Daniel T; Casteau, Soazig

    2018-02-01

    Salient peripheral events trigger fast, "exogenous" covert orienting. The influential premotor theory of attention argues that covert orienting of attention depends upon planned but unexecuted eye movements. One problem with this theory is that salient peripheral events, such as offsets, appear to summon attention when used to measure covert attention (e.g., in the Posner cueing task) but appear not to elicit oculomotor preparation in tasks that require overt orienting (e.g., the remote distractor paradigm). Here, we examined the effects of peripheral offsets on covert attention and saccade preparation. Experiment 1 suggested that transient offsets summoned attention in a manual detection task without triggering motor preparation in a saccadic localisation task, although there was a high proportion of saccadic capture errors on "no-target" trials, where a cue was presented but no target appeared. In Experiment 2, "no-target" trials were removed. Here, transient offsets produced both attentional facilitation and faster saccadic responses on valid cue trials. A third experiment showed that the permanent disappearance of an object also elicited attentional facilitation and faster saccadic reaction times. These experiments demonstrate that offsets trigger both saccade programming and covert attentional orienting, consistent with the idea that exogenous, covert orienting is tightly coupled with oculomotor activation. The finding that no-go trials attenuate oculomotor priming effects offers a way to reconcile the current findings with previous claims of a dissociation between covert attention and oculomotor control in paradigms that utilise a high proportion of catch trials.

  3. Effects of selective attention on the electrophysiological representation of concurrent sounds in the human auditory cortex.

    Science.gov (United States)

    Bidet-Caulet, Aurélie; Fischer, Catherine; Besle, Julien; Aguera, Pierre-Emmanuel; Giard, Marie-Helene; Bertrand, Olivier

    2007-08-29

    In noisy environments, we use auditory selective attention to actively ignore distracting sounds and select relevant information, as during a cocktail party to follow one particular conversation. The present electrophysiological study aims at deciphering the spatiotemporal organization of the effect of selective attention on the representation of concurrent sounds in the human auditory cortex. Sound onset asynchrony was manipulated to induce the segregation of two concurrent auditory streams. Each stream consisted of amplitude modulated tones at different carrier and modulation frequencies. Electrophysiological recordings were performed in epileptic patients with pharmacologically resistant partial epilepsy, implanted with depth electrodes in the temporal cortex. Patients were presented with the stimuli while they either performed an auditory distracting task or actively selected one of the two concurrent streams. Selective attention was found to affect steady-state responses in the primary auditory cortex, and transient and sustained evoked responses in secondary auditory areas. The results provide new insights on the neural mechanisms of auditory selective attention: stream selection during sound rivalry would be facilitated not only by enhancing the neural representation of relevant sounds, but also by reducing the representation of irrelevant information in the auditory cortex. Finally, they suggest a specialization of the left hemisphere in the attentional selection of fine-grained acoustic information.

  4. Impact of Auditory Selective Attention on Verbal Short-Term Memory and Vocabulary Development

    Science.gov (United States)

    Majerus, Steve; Heiligenstein, Lucie; Gautherot, Nathalie; Poncelet, Martine; Van der Linden, Martial

    2009-01-01

    This study investigated the role of auditory selective attention capacities as a possible mediator of the well-established association between verbal short-term memory (STM) and vocabulary development. A total of 47 6- and 7-year-olds were administered verbal immediate serial recall and auditory attention tasks. Both task types probed processing…

  5. Automatic selective attention as a function of sensory modality in aging.

    Science.gov (United States)

    Guerreiro, Maria J S; Adam, Jos J; Van Gerven, Pascal W M

    2012-03-01

    It was recently hypothesized that age-related differences in selective attention depend on sensory modality (Guerreiro, M. J. S., Murphy, D. R., & Van Gerven, P. W. M. (2010). The role of sensory modality in age-related distraction: A critical review and a renewed view. Psychological Bulletin, 136, 975-1022. doi:10.1037/a0020731). So far, this hypothesis has not been tested in automatic selective attention. The current study addressed this issue by investigating age-related differences in automatic spatial cueing effects (i.e., facilitation and inhibition of return [IOR]) across sensory modalities. Thirty younger (mean age = 22.4 years) and 25 older adults (mean age = 68.8 years) performed 4 left-right target localization tasks, involving all combinations of visual and auditory cues and targets. We used stimulus onset asynchronies (SOAs) of 100, 500, 1,000, and 1,500 ms between cue and target. The results showed facilitation (shorter reaction times with valid relative to invalid cues at shorter SOAs) in the unimodal auditory and in both cross-modal tasks but not in the unimodal visual task. In contrast, there was IOR (longer reaction times with valid relative to invalid cues at longer SOAs) in both unimodal tasks but not in either of the cross-modal tasks. Most important, these spatial cueing effects were independent of age. The results suggest that the modality hypothesis of age-related differences in selective attention does not extend into the realm of automatic selective attention.
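    The facilitation and IOR effects reported in this abstract are both defined by the same contrast, the invalid-minus-valid RT difference at each SOA: a positive difference (faster at the cued location) is facilitation, a negative one is inhibition of return. A minimal sketch of that computation, using made-up RT values rather than the study's data:

    ```python
    # Illustrative sketch: spatial cueing effect per SOA as invalid RT minus
    # valid RT. Positive = facilitation (typical at short SOAs);
    # negative = inhibition of return (IOR, typical at long SOAs).
    def cueing_effects(valid_rt, invalid_rt):
        """valid_rt / invalid_rt: dicts mapping SOA (ms) -> mean RT (ms)."""
        effects = {}
        for soa in sorted(valid_rt):
            diff = invalid_rt[soa] - valid_rt[soa]
            label = "facilitation" if diff > 0 else "IOR" if diff < 0 else "none"
            effects[soa] = (diff, label)
        return effects

    # Hypothetical mean RTs (ms) at the four SOAs used in the study
    valid = {100: 310, 500: 325, 1000: 345, 1500: 350}
    invalid = {100: 335, 500: 330, 1000: 330, 1500: 332}
    for soa, (diff, label) in cueing_effects(valid, invalid).items():
        print(soa, diff, label)
    ```

    With these toy numbers the effect flips sign as SOA grows, which is the classic facilitation-then-IOR time course the abstract tests across modality pairings.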

  6. Tuning In to Sound: Frequency-Selective Attentional Filter in Human Primary Auditory Cortex

    Science.gov (United States)

    Da Costa, Sandra; van der Zwaag, Wietske; Miller, Lee M.; Clarke, Stephanie

    2013-01-01

    Cocktail parties, busy streets, and other noisy environments pose a difficult challenge to the auditory system: how to focus attention on selected sounds while ignoring others? Neurons of primary auditory cortex, many of which are sharply tuned to sound frequency, could help solve this problem by filtering selected sound information based on frequency-content. To investigate whether this occurs, we used high-resolution fMRI at 7 tesla to map the fine-scale frequency-tuning (1.5 mm isotropic resolution) of primary auditory areas A1 and R in six human participants. Then, in a selective attention experiment, participants heard low (250 Hz)- and high (4000 Hz)-frequency streams of tones presented at the same time (dual-stream) and were instructed to focus attention onto one stream versus the other, switching back and forth every 30 s. Attention to low-frequency tones enhanced neural responses within low-frequency-tuned voxels relative to high, and when attention switched the pattern quickly reversed. Thus, like a radio, human primary auditory cortex is able to tune into attended frequency channels and can switch channels on demand. PMID:23365225

  7. Attentional bias to betel quid cues: An eye tracking study.

    Science.gov (United States)

    Shen, Bin; Chiu, Meng-Chun; Li, Shuo-Heng; Huang, Guo-Joe; Liu, Ling-Jun; Ho, Ming-Chou

    2016-09-01

    The World Health Organization regards betel quid as a human carcinogen, and DSM-IV and ICD-10 dependence symptoms may develop with heavy use. This study, conducted in central Taiwan, investigated whether betel quid chewers exhibit overt orienting and selectively respond to betel quid cues. Twenty-four male chewers' and 23 male nonchewers' eye movements to betel-quid-related pictures and matched pictures were assessed during a visual probe task. The eye movement index showed that betel quid chewers were more likely to initially direct their gaze to the betel quid cues, t(23) = 3.70, indicating an attentional bias in betel quid chewers. The results demonstrated that the betel quid chewers (but not the nonchewers) were more likely to initially direct their gaze to the betel quid cues, and spent more time fixating on them. These findings suggest that when attention is directly measured through the eye tracking technique, this methodology may be more sensitive for detecting attentional biases in betel quid chewers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Modulatory Effects of Attention on Lateral Inhibition in the Human Auditory Cortex.

    Directory of Open Access Journals (Sweden)

    Alva Engell

    Full Text Available Reduced neural processing of a tone is observed when it is presented after a sound whose spectral range closely frames the frequency of the tone. This observation might be explained by the mechanism of lateral inhibition (LI) due to inhibitory interneurons in the auditory system. So far, several characteristics of bottom-up influences on LI have been identified, while the influence of top-down processes such as directed attention on LI has not been investigated. Hence, the study at hand aims at investigating the modulatory effects of focused attention on LI in the human auditory cortex. In the magnetoencephalograph, we presented two types of masking sounds (white noise vs. white noise passed through a notch filter centered at a specific frequency), followed by a test tone with a frequency corresponding to the center frequency of the notch filter. Simultaneously, subjects were presented with visual input on a screen. To modulate the focus of attention, subjects were instructed to concentrate either on the auditory input or on the visual stimuli. More specifically, on one half of the trials, subjects were instructed to detect small deviations in loudness in the masking sounds, while on the other half subjects were asked to detect target stimuli on the screen. The results revealed a reduction in neural activation due to LI, which was larger during auditory compared to visual focused attention. Attentional modulations of LI were observed in two post-N1m time intervals. These findings underline the robustness of reduced neural activation due to LI in the auditory cortex and point towards the important role of attention in the modulation of this mechanism at more evaluative processing stages.

  9. Switching auditory attention using spatial and non-spatial features recruits different cortical networks.

    Science.gov (United States)

    Larson, Eric; Lee, Adrian K C

    2014-01-01

    Switching attention between different stimuli of interest based on particular task demands is important in many everyday settings. In audition in particular, switching attention between different speakers of interest that are talking concurrently is often necessary for effective communication. Recently, it has been shown by multiple studies that auditory selective attention suppresses the representation of unwanted streams in auditory cortical areas in favor of the target stream of interest. However, the neural processing that guides this selective attention process is not well understood. Here we investigated the cortical mechanisms involved in switching attention based on two different types of auditory features. By combining magneto- and electro-encephalography (M-EEG) with an anatomical MRI constraint, we examined the cortical dynamics involved in switching auditory attention based on either spatial or pitch features. We designed a paradigm where listeners were cued in the beginning of each trial to switch or maintain attention halfway through the presentation of concurrent target and masker streams. By allowing listeners time to switch during a gap in the continuous target and masker stimuli, we were able to isolate the mechanisms involved in endogenous, top-down attention switching. Our results show a double dissociation between the involvement of right temporoparietal junction (RTPJ) and the left inferior parietal supramarginal part (LIPSP) in tasks requiring listeners to switch attention based on space and pitch features, respectively, suggesting that switching attention based on these features involves at least partially separate processes or behavioral strategies. © 2013 Elsevier Inc. All rights reserved.

  10. Time-resolved neuroimaging of visual short term memory consolidation by post-perceptual attention shifts.

    Science.gov (United States)

    Hecht, Marcus; Thiemann, Ulf; Freitag, Christine M; Bender, Stephan

    2016-01-15

    Post-perceptual cues can enhance visual short term memory encoding even after the offset of the visual stimulus. However, both the mechanisms by which the sensory stimulus characteristics are buffered as well as the mechanisms by which post-perceptual selective attention enhances short term memory encoding remain unclear. We analyzed late post-perceptual event-related potentials (ERPs) in visual change detection tasks (100ms stimulus duration) by high-resolution ERP analysis to elucidate these mechanisms. The effects of early and late auditory post-cues (300ms or 850ms after visual stimulus onset) as well as the effects of a visual interference stimulus were examined in 27 healthy right-handed adults. Focusing attention with post-perceptual cues at both latencies significantly improved memory performance, i.e. sensory stimulus characteristics were available for up to 850ms after stimulus presentation. Passive watching of the visual stimuli without auditory cue presentation evoked a slow negative wave (N700) over occipito-temporal visual areas. N700 was strongly reduced by a visual interference stimulus which impeded memory maintenance. In contrast, contralateral delay activity (CDA) still developed in this condition after the application of auditory post-cues and was thereby dissociated from N700. CDA and N700 seem to represent two different processes involved in short term memory encoding. While N700 could reflect visual post processing by automatic attention attraction, CDA may reflect the top-down process of searching selectively for the required information through post-perceptual attention. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Contextual Cueing Improves Attentional Guidance, Even When Guidance Is Supposedly Optimal

    OpenAIRE

    Harris, A. M.; Remington, R. W.

    2017-01-01

    Visual search through previously encountered contexts typically produces reduced reaction times compared with search through novel contexts. This contextual cueing benefit is well established, but there is debate regarding its underlying mechanisms. Eye-tracking studies have consistently shown reduced number of fixations with repetition, supporting improvements in attentional guidance as the source of contextual cueing. However, contextual cueing benefits have been shown in conditions in whic...

  12. Infants' Selective Attention to Reliable Visual Cues in the Presence of Salient Distractors

    Science.gov (United States)

    Tummeltshammer, Kristen Swan; Mareschal, Denis; Kirkham, Natasha Z.

    2014-01-01

    With many features competing for attention in their visual environment, infants must learn to deploy attention toward informative cues while ignoring distractions. Three eye tracking experiments were conducted to investigate whether 6- and 8-month-olds (total N = 102) would shift attention away from a distractor stimulus to learn a cue-reward…

  13. Priming T2 in a Visual and Auditory Attentional Blink Task

    NARCIS (Netherlands)

    Burg, E. van der; Olivers, C.N.L.; Bronkhorst, A.W.; Theeuwes, J.

    2008-01-01

    Participants performed an attentional blink (AB) task including digits as targets and letters as distractors within the visual and auditory domains. Prior to the rapid serial visual presentation, a visual or auditory prime was presented in the form of a digit that was identical to the second target

  14. Influence of memory, attention, IQ and age on auditory temporal processing tests: preliminary study

    OpenAIRE

    Murphy, Cristina Ferraz Borges; Zachi, Elaine Cristina; Roque, Daniela Tsubota; Ventura, Dora Selma Fix; Schochat, Eliane

    2014-01-01

    PURPOSE: To investigate the existence of correlations between the performance of children in auditory temporal tests (Frequency Pattern and Gaps in Noise - GIN) and IQ, attention, memory, and age measurements. METHOD: Fifteen typically developing individuals between the ages of 7 and 12 years with normal hearing participated in the study. Auditory temporal processing tests (GIN and Frequency Pattern), as well as a memory test (Digit Span), attention tests (auditory and visual modality) and ...

  15. Medial parietal cortex activation related to attention control involving alcohol cues

    NARCIS (Netherlands)

    Gladwin, Thomas E.; ter Mors-Schulte, Mieke H. J.; Ridderinkhof, K. Richard; Wiers, Reinout W.

    2013-01-01

    Automatic attentional engagement toward and disengagement from alcohol cues play a role in alcohol use and dependence. In the current study, social drinkers performed a spatial cueing task designed to evoke conflict between such automatic processes and task instructions, a potentially important task

  16. Should visual speech cues (speechreading) be considered when fitting hearing aids?

    Science.gov (United States)

    Grant, Ken

    2002-05-01

    When talker and listener are face-to-face, visual speech cues become an important part of the communication environment, and yet, these cues are seldom considered when designing hearing aids. Models of auditory-visual speech recognition highlight the importance of complementary versus redundant speech information for predicting auditory-visual recognition performance. Thus, for hearing aids to work optimally when visual speech cues are present, it is important to know whether the cues provided by amplification and the cues provided by speechreading complement each other. In this talk, data will be reviewed that show nonmonotonicity between auditory-alone speech recognition and auditory-visual speech recognition, suggesting that efforts designed solely to improve auditory-alone recognition may not always result in improved auditory-visual recognition. Data will also be presented showing that one of the most important speech cues for enhancing auditory-visual speech recognition performance, voicing, is often the cue that benefits least from amplification.

  17. Effect of rhythmic auditory cueing on gait in people with Alzheimer disease.

    Science.gov (United States)

    Wittwer, Joanne E; Webster, Kate E; Hill, Keith

    2013-04-01

    To determine whether rhythmic music and metronome cues alter spatiotemporal gait measures and gait variability in people with Alzheimer disease (AD). A repeated-measures study requiring participants to walk under different cueing conditions. University movement laboratory. Of the people (N=46) who met study criteria (a diagnosis of probable AD and ability to walk 100 m) at routine medical review, 30 (16 men; mean age ± SD, 80±6 y; revised Addenbrooke's Cognitive Examination range, 26-79) volunteered to participate. Participants walked 4 times over an electronic walkway, synchronizing to (1) rhythmic music and (2) a metronome, each set at the individual's mean baseline comfortable-speed cadence. Gait spatiotemporal measures and gait variability (coefficient of variation [CV]) were recorded. Data from individual walks under each condition were combined. A 1-way repeated-measures analysis of variance was used to compare uncued baseline, cued, and retest measures. Gait velocity decreased with both music and metronome cues compared with baseline (baseline, 110.5 cm/s; music, 103.4 cm/s; metronome, 105.4 cm/s), primarily because of significant decreases in stride length (baseline, 120.9 cm; music, 112.5 cm; metronome, 114.8 cm) with both cue types. This was coupled with increased stride length variability compared with baseline (baseline CV, 3.4%; music CV, 4.3%; metronome CV, 4.5%) with both cue types. These changes did not persist at (uncued) retest. Temporal variability was unchanged. Rhythmic auditory cueing at comfortable-speed tempo produced deleterious effects on gait in a single session in this group with AD. The deterioration in spatial gait parameters may result from impaired executive function associated with AD. Further research should investigate whether these instantaneous cue effects are altered with more practice or with learning methods tailored to people with cognitive impairment. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
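
    The gait-variability measure above, the coefficient of variation (CV), is simply the standard deviation of a gait parameter expressed as a percentage of its mean. A minimal sketch, with hypothetical stride-length values chosen only to illustrate the cued-walking pattern reported in the abstract:

```python
# Coefficient of variation (CV) of stride length, the gait-variability
# measure used in the study: CV = (standard deviation / mean) * 100.
# The stride-length values below are hypothetical, for illustration only.
from statistics import mean, stdev

def stride_cv(strides_cm):
    """Return the coefficient of variation (%) of a list of stride lengths."""
    return stdev(strides_cm) / mean(strides_cm) * 100

baseline = [120.0, 122.0, 119.5, 121.5, 120.5]   # uncued walk
cued     = [112.0, 117.0, 110.5, 116.0, 109.0]   # walk cued by metronome

# Cueing lowers mean stride length and raises its variability,
# mirroring the direction of the effect reported in the abstract.
print(f"baseline CV: {stride_cv(baseline):.1f}%")
print(f"cued CV:     {stride_cv(cued):.1f}%")
```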

  18. Modeling the Effects of Attentional Cueing on Meditators

    NARCIS (Netherlands)

    van Vugt, Marieke K.; van den Hurk, Paul M.

    2016-01-01

    Training in meditation has been shown to affect functioning of several attentional subsystems, most prominently conflict monitoring, and to some extent orienting. These previous findings described the effects of cueing and manipulating stimulus congruency on response times and accuracies. However,

  19. A common source of attention for auditory and visual tracking.

    Science.gov (United States)

    Fougnie, Daryl; Cockhren, Jurnell; Marois, René

    2018-05-01

    Tasks that require tracking visual information reveal the severe limitations of our capacity to attend to multiple objects that vary in time and space. Although these limitations have been extensively characterized in the visual domain, very little is known about tracking information in other sensory domains. Does tracking auditory information exhibit characteristics similar to those of tracking visual information, and to what extent do these two tracking tasks draw on the same attention resources? We addressed these questions by asking participants to perform either single or dual tracking tasks from the same (visual-visual) or different (visual-auditory) perceptual modalities, with the difficulty of the tracking tasks being manipulated across trials. The results revealed that performing two concurrent tracking tasks, whether they were in the same or different modalities, affected tracking performance as compared to performing each task alone (concurrence costs). Moreover, increasing task difficulty also led to increased costs in both the single-task and dual-task conditions (load-dependent costs). The comparison of concurrence costs between visual-visual and visual-auditory dual-task performance revealed slightly greater interference when two visual tracking tasks were paired. Interestingly, however, increasing task difficulty led to equivalent costs for visual-visual and visual-auditory pairings. We concluded that visual and auditory tracking draw largely, though not exclusively, on common central attentional resources.

  20. Dietary self-control influences top-down guidance of attention to food cues.

    Science.gov (United States)

    Higgs, Suzanne; Dolmans, Dirk; Humphreys, Glyn W; Rutters, Femke

    2015-01-01

    Motivational objects attract attention due to their rewarding properties, but less is known about the role that top-down cognitive processes play in the attention paid to motivationally relevant objects and how this is affected by relevant behavioral traits. Here we assess how thinking about food affects attentional guidance to food items and how this is modulated by traits relating to dietary self-control. Participants completed two tasks in which they were presented with an initial cue (food or non-food) to either hold in working memory (memory task) or to merely attend to (priming task). Holding food items in working memory strongly affected attention when the memorized cue re-appeared in the search display. Tendency towards disinhibited eating was associated with greater attention to food versus non-food pictures in both the priming and working memory tasks, consistent with greater attention to food cues per se. Successful dieters, defined as those high in dietary restraint and low in tendency to disinhibition, showed reduced attention to food when holding food-related information in working memory. These data suggest a strong top-down effect of thinking about food on attention to food items and indicate that the suppression of food items in working memory could be a marker of dieting success.

  1. Dietary self-control influences top-down guidance of attention to food cues

    Directory of Open Access Journals (Sweden)

    Suzanne eHiggs

    2015-04-01

    Motivational objects attract attention due to their rewarding properties, but less is known about the role that top-down cognitive processes play in the attention paid to motivationally relevant objects and how this is affected by relevant behaviour traits. Here we assess how thinking about food affects attentional guidance to food items and how this is modulated by traits relating to dietary self-control. Participants completed two tasks in which they were presented with an initial cue (food or non-food) to either hold in working memory (memory task) or to merely attend to (priming task). Holding food items in working memory strongly affected attention when the memorized cue re-appeared in the search display. Tendency towards disinhibited eating was associated with greater attention to food versus non-food pictures in both the priming and working memory tasks, consistent with greater attention to food cues per se. Successful dieters, defined as those high in dietary restraint and low in tendency to disinhibition, showed reduced attention to food when holding food-related information in working memory. These data suggest a strong top-down effect of thinking about food on attention to food items and indicate that the suppression of food items in working memory could be a marker of dieting success.

  2. No two cues are alike: Depth of learning during infancy is dependent on what orients attention.

    Science.gov (United States)

    Wu, Rachel; Kirkham, Natasha Z

    2010-10-01

    Human infants develop a variety of attentional mechanisms that allow them to extract relevant information from a cluttered multimodal world. We know that both social and nonsocial cues shift infants' attention, but not how these cues differentially affect learning of multimodal events. Experiment 1 used social cues to direct 8- and 4-month-olds' attention to two audiovisual events (i.e., animations of a cat or dog accompanied by particular sounds) while identical distractor events played in another location. Experiment 2 directed 8-month-olds' attention with colorful flashes to the same events. Experiment 3 measured baseline learning without attention cues, both with the familiarization and test trials (no cue condition) and with only the test trials (test control condition). The 8-month-olds exposed to social cues showed specific learning of audiovisual events. The 4-month-olds displayed only general spatial learning from social cues, suggesting that specific learning of audiovisual events from social cues may be a function of experience. Infants cued with the colorful flashes looked indiscriminately to both cued locations during test (similar to the 4-month-olds learning from social cues), despite attending to the training trials for the same duration as the 8-month-olds with the social cues. Results from Experiment 3 indicated that the learning effects in Experiments 1 and 2 resulted from exposure to the different cues and multimodal events. We discuss these findings in terms of the perceptual differences and relevance of the cues. Copyright 2010 Elsevier Inc. All rights reserved.

  3. All I saw was the cake. Hunger effects on attentional capture by visual food cues.

    Science.gov (United States)

    Piech, Richard M; Pastorino, Michael T; Zald, David H

    2010-06-01

    While effects of hunger on motivation and food reward value are well-established, far less is known about the effects of hunger on cognitive processes. Here, we deployed the emotional blink of attention paradigm to investigate the impact of visual food cues on attentional capture under conditions of hunger and satiety. Participants were asked to detect targets which appeared in a rapid visual stream after different types of task-irrelevant distractors. We observed that food stimuli acquired increased power to capture attention and prevent target detection when participants were hungry. This occurred despite monetary incentives to perform well. Our findings suggest an attentional mechanism through which hunger heightens perception of food cues. As an objective behavioral marker of attentional sensitivity to food cues, the emotional attentional blink paradigm may provide a useful technique for studying individual differences and state manipulations in the sensitivity to food cues. Published by Elsevier Ltd.

  4. Irrelevant Auditory and Visual Events Induce a Visual Attentional Blink

    NARCIS (Netherlands)

    Van der Burg, Erik; Nieuwenstein, Mark R.; Theeuwes, Jan; Olivers, Christian N. L.

    2013-01-01

    In the present study we investigated whether a task-irrelevant distractor can induce a visual attentional blink pattern. Participants were asked to detect only a visual target letter (A, B, or C) and to ignore the preceding auditory, visual, or audiovisual distractor. An attentional blink was

  5. Auditory-Motor Control of Vocal Production during Divided Attention: Behavioral and ERP Correlates.

    Science.gov (United States)

    Liu, Ying; Fan, Hao; Li, Jingting; Jones, Jeffery A; Liu, Peng; Zhang, Baofeng; Liu, Hanjun

    2018-01-01

    When people hear unexpected perturbations in auditory feedback, they produce rapid compensatory adjustments of their vocal behavior. Recent evidence has shown enhanced vocal compensations and cortical event-related potentials (ERPs) in response to attended pitch feedback perturbations, suggesting that this reflex-like behavior is influenced by selective attention. Less is known, however, about auditory-motor integration for voice control during divided attention. The present cross-modal study investigated the behavioral and ERP correlates of auditory feedback control of vocal pitch production during divided attention. During the production of sustained vowels, 32 young adults were instructed to simultaneously attend to both pitch feedback perturbations they heard and flashing red lights they saw. The presentation rate of the visual stimuli was varied to produce a low, intermediate, and high attentional load. The behavioral results showed that the low-load condition elicited significantly smaller vocal compensations for pitch perturbations than the intermediate-load and high-load conditions. As well, the cortical processing of vocal pitch feedback was also modulated as a function of divided attention. When compared to the low-load and intermediate-load conditions, the high-load condition elicited significantly larger N1 responses and smaller P2 responses to pitch perturbations. These findings provide the first neurobehavioral evidence that divided attention can modulate auditory feedback control of vocal pitch production.

  6. Towards a framework for attention cueing in instructional animations: Guidelines for research and design

    NARCIS (Netherlands)

    B.B. de Koning (Björn); H.K. Tabbers (Huib); R.M.J.P. Rikers (Remy); G.W.C. Paas (Fred)

    2009-01-01

    This paper examines the transferability of successful cueing approaches from text and static visualization research to animations. Theories of visual attention and learning as well as empirical evidence for the instructional effectiveness of attention cueing are reviewed and, based on

  7. Sensorineural hearing loss degrades behavioral and physiological measures of human spatial selective auditory attention

    Science.gov (United States)

    Dai, Lengshi; Best, Virginia; Shinn-Cunningham, Barbara G.

    2018-01-01

    Listeners with sensorineural hearing loss often have trouble understanding speech amid other voices. While poor spatial hearing is often implicated, direct evidence is weak; moreover, studies suggest that reduced audibility and degraded spectrotemporal coding may explain such problems. We hypothesized that poor spatial acuity leads to difficulty deploying selective attention, which normally filters out distracting sounds. In listeners with normal hearing, selective attention causes changes in the neural responses evoked by competing sounds, which can be used to quantify the effectiveness of attentional control. Here, we used behavior and electroencephalography to explore whether control of selective auditory attention is degraded in hearing-impaired (HI) listeners. Normal-hearing (NH) and HI listeners identified a simple melody presented simultaneously with two competing melodies, each simulated from different lateral angles. We quantified performance and attentional modulation of cortical responses evoked by these competing streams. Compared with NH listeners, HI listeners had poorer sensitivity to spatial cues, performed more poorly on the selective attention task, and showed less robust attentional modulation of cortical responses. Moreover, across NH and HI individuals, these measures were correlated. While both groups showed cortical suppression of distracting streams, this modulation was weaker in HI listeners, especially when attending to a target at midline, surrounded by competing streams. These findings suggest that hearing loss interferes with the ability to filter out sound sources based on location, contributing to communication difficulties in social situations. These findings also have implications for technologies aiming to use neural signals to guide hearing aid processing. PMID:29555752

  8. Gender differences in pre-attentive change detection for visual but not auditory stimuli.

    Science.gov (United States)

    Yang, Xiuxian; Yu, Yunmiao; Chen, Lu; Sun, Hailian; Qiao, Zhengxue; Qiu, Xiaohui; Zhang, Congpei; Wang, Lin; Zhu, Xiongzhao; He, Jincai; Zhao, Lun; Yang, Yanjie

    2016-01-01

    Despite ongoing debate about gender differences in pre-attention processes, little is known about gender effects on change detection for auditory and visual stimuli. We explored gender differences in change detection while processing duration information in auditory and visual modalities. We investigated pre-attentive processing of duration information using a deviant-standard reverse oddball paradigm (50 ms/150 ms) for auditory and visual mismatch negativity (aMMN and vMMN) in males and females (n=21/group). In the auditory modality, decrement and increment aMMN were observed at 150-250 ms after the stimulus onset, and there was no significant gender effect on MMN amplitudes in temporal or fronto-central areas. In contrast, in the visual modality, only increment vMMN was observed at 180-260 ms after the onset of stimulus, and it was higher in males than in females. No gender effect was found in change detection for auditory stimuli, but change detection was facilitated for visual stimuli in males. Gender effects should be considered in clinical studies of pre-attention for visual stimuli. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
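
    The mismatch negativity (MMN) in such an oddball paradigm is obtained as a deviant-minus-standard difference wave, whose most negative point within the analysis window is taken as the MMN amplitude. A minimal sketch with hypothetical mean amplitudes (µV):

```python
# Mismatch negativity (MMN) is measured as the deviant-minus-standard
# ERP difference wave. The amplitudes below are hypothetical mean values
# (µV) at successive post-stimulus time points, for illustration only.

def difference_wave(deviant_uv, standard_uv):
    """Deviant-minus-standard amplitudes, computed pointwise."""
    return [d - s for d, s in zip(deviant_uv, standard_uv)]

standard = [-0.2, -0.4, -0.5, -0.6, -0.5, -0.4, -0.3, -0.2]
deviant  = [-0.3, -1.0, -1.8, -2.4, -2.2, -1.5, -0.8, -0.4]

mmn = difference_wave(deviant, standard)
# The most negative point of the difference wave is the MMN peak.
print(f"MMN peak: {min(mmn):.1f} µV")
```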

  9. Visual attention to food cues in obesity: an eye-tracking study.

    Science.gov (United States)

    Doolan, Katy J; Breslin, Gavin; Hanna, Donncha; Murphy, Kate; Gallagher, Alison M

    2014-12-01

    Based on the theory of incentive sensitization, the aim of this study was to investigate differences in attentional processing of food-related visual cues between normal-weight and overweight/obese males and females. Twenty-six normal-weight (14M, 12F) and 26 overweight/obese (14M, 12F) adults completed a visual probe task and an eye-tracking paradigm. Reaction times and eye movements to food and control images were collected during both a fasted and fed condition in a counterbalanced design. Participants had greater visual attention towards high-energy-density food images compared to low-energy-density food images regardless of hunger condition. This was most pronounced in overweight/obese males, who had significantly greater maintained attention towards high-energy-density food images when compared with their normal-weight counterparts; however, no between-weight-group differences were observed for female participants. High-energy-density food images appear to capture visual attention more readily than low-energy-density food images. Results also suggest the possibility of an altered visual food cue-associated reward system in overweight/obese males. Attentional processing of food cues may play a role in eating behaviors and thus should be taken into consideration as part of an integrated approach to curbing obesity. © 2014 The Obesity Society.

  10. Great cormorants (Phalacrocorax carbo) can detect auditory cues while diving

    DEFF Research Database (Denmark)

    Hansen, Kirstin Anderson; Maxwell, Alyssa; Siebert, Ursula

    2017-01-01

    In-air hearing in birds has been thoroughly investigated. Sound provides birds with auditory information for species and individual recognition from their complex vocalizations, as well as cues while foraging and for avoiding predators. Some 10% of existing species of birds obtain their food under the water surface. Whether some of these birds make use of acoustic cues while underwater is unknown. An interesting species in this respect is the great cormorant (Phalacrocorax carbo), being one of the most effective marine predators and relying on the aquatic environment for food year round. Here, its underwater hearing abilities were investigated using psychophysics, where the bird learned to detect the presence or absence of a tone while submerged. The greatest sensitivity was found at 2 kHz, with an underwater hearing threshold of 71 dB re 1 μPa rms. The great cormorant is better at hearing underwater...
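
    The threshold above is expressed on the underwater decibel scale, SPL = 20·log10(p_rms / 1 μPa); note that in-air levels are conventionally referenced to 20 μPa, so the two scales are offset and not directly comparable. A quick conversion sketch:

```python
# Convert between rms sound pressure and sound pressure level (SPL)
# referenced to 1 μPa, the underwater convention used in the abstract:
#   SPL = 20 * log10(p_rms / p_ref),  p_ref = 1e-6 Pa.
import math

P_REF_UNDERWATER = 1e-6  # 1 μPa in pascals

def spl_db(p_rms_pa, p_ref=P_REF_UNDERWATER):
    """Sound pressure level (dB) for an rms pressure in pascals."""
    return 20 * math.log10(p_rms_pa / p_ref)

def pressure_pa(spl, p_ref=P_REF_UNDERWATER):
    """Rms pressure (Pa) for a sound pressure level in dB."""
    return p_ref * 10 ** (spl / 20)

# The cormorant's 71 dB re 1 μPa threshold corresponds to about 3.55 mPa.
print(f"{pressure_pa(71) * 1e3:.2f} mPa")
```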

  11. Feature-based and object-based attention orientation during short-term memory maintenance.

    Science.gov (United States)

    Ku, Yixuan

    2015-12-01

    Top-down attention biases the short-term memory (STM) processing at multiple stages. Orienting attention during the maintenance period of STM by a retrospective cue (retro-cue) strengthens the representation of the cued item and improves the subsequent STM performance. In a recent article, Backer et al. (Backer KC, Binns MA, Alain C. J Neurosci 35: 1307-1318, 2015) extended these findings from the visual to the auditory domain and combined electroencephalography to dissociate neural mechanisms underlying feature-based and object-based attention orientation. Both event-related potentials and neural oscillations explained the behavioral benefits of retro-cues and favored the theory that feature-based and object-based attention orientation were independent. Copyright © 2015 the American Physiological Society.

  12. Intentional switching in auditory selective attention: Exploring age-related effects in a spatial setup requiring speech perception.

    Science.gov (United States)

    Oberem, Josefa; Koch, Iring; Fels, Janina

    2017-06-01

    Using a binaural-listening paradigm, age-related differences in the ability to intentionally switch auditory selective attention between two speakers, defined by their spatial location, were examined. To this end, 40 normal-hearing participants (20 young, mean age 24.8 years; 20 older, mean age 67.8 years) were tested. The spatial reproduction of stimuli was provided by headphones using head-related transfer functions of an artificial head. Spoken number words of two speakers were presented simultaneously to participants from two out of eight locations on the horizontal plane. Guided by a visual cue indicating the spatial location of the target speaker, the participants were asked to categorize the target's number word as smaller vs. greater than five while ignoring the distractor's speech. Results showed significantly higher reaction times and error rates for older participants. The relative influence of the spatial switch of the target speaker (switch or repetition of the speaker's direction in space) was identical across age groups. Congruency effects (stimuli spoken by target and distractor may evoke the same answer or different answers) were increased for older participants and depended on the target's position. Results suggest that the ability to intentionally switch auditory attention to a newly cued location was unimpaired, whereas it was generally harder for older participants to suppress processing of the distractor's speech. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Cortical Brain Activity Reflecting Attentional Biasing Toward Reward-Predicting Cues Covaries with Economic Decision-Making Performance.

    Science.gov (United States)

    San Martín, René; Appelbaum, Lawrence G; Huettel, Scott A; Woldorff, Marty G

    2016-01-01

    Adaptive choice behavior depends critically on identifying and learning from outcome-predicting cues. We hypothesized that attention may be preferentially directed toward certain outcome-predicting cues. We studied this possibility by analyzing event-related potential (ERP) responses in humans during a probabilistic decision-making task. Participants viewed pairs of outcome-predicting visual cues and then chose to wager either a small (i.e., loss-minimizing) or large (i.e., gain-maximizing) amount of money. The cues were bilaterally presented, which allowed us to extract the relative neural responses to each cue by using a contralateral-versus-ipsilateral ERP contrast. We found an early lateralized ERP response, whose features matched the attention-shift-related N2pc component and whose amplitude scaled with the learned reward-predicting value of the cues as predicted by an attention-for-reward model. Consistently, we found a double dissociation involving the N2pc. Across participants, gain-maximization positively correlated with the N2pc amplitude to the most reliable gain-predicting cue, suggesting an attentional bias toward such cues. Conversely, loss-minimization was negatively correlated with the N2pc amplitude to the most reliable loss-predicting cue, suggesting an attentional avoidance toward such stimuli. These results indicate that learned stimulus-reward associations can influence rapid attention allocation, and that differences in this process are associated with individual differences in economic decision-making performance. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
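
    The contralateral-versus-ipsilateral contrast used to extract the N2pc pairs, on each trial, the electrode over the hemisphere opposite the cued side with its mirror electrode. A minimal sketch with hypothetical amplitudes (the electrode labels PO7/PO8 in the comments are illustrative, not taken from the study):

```python
# Contralateral-minus-ipsilateral ERP contrast used to isolate the N2pc.
# For each trial, the electrode over the hemisphere opposite the cued side
# is "contralateral"; the same-side electrode is "ipsilateral".
# The amplitudes (µV) below are hypothetical, for illustration only.

def n2pc(trials):
    """Mean contralateral-minus-ipsilateral amplitude across trials.

    Each trial is (cue_side, left_electrode_uv, right_electrode_uv).
    """
    diffs = []
    for cue_side, left_uv, right_uv in trials:
        if cue_side == "left":   # right hemisphere is contralateral
            diffs.append(right_uv - left_uv)
        else:                    # cue on right: left hemisphere contralateral
            diffs.append(left_uv - right_uv)
    return sum(diffs) / len(diffs)

trials = [
    ("left",  -1.0, -2.2),   # (cue side, PO7 µV, PO8 µV) -- hypothetical
    ("right", -2.4, -1.1),
    ("left",  -0.8, -1.9),
    ("right", -2.1, -0.9),
]

# A negative value indicates the attention-related N2pc negativity.
print(f"N2pc amplitude: {n2pc(trials):.2f} µV")
```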

  14. Frequency-specific attentional modulation in human primary auditory cortex and midbrain

    NARCIS (Netherlands)

    Riecke, Lars; Peters, Judith C; Valente, Giancarlo; Poser, Benedikt A; Kemper, Valentin G; Formisano, Elia; Sorger, Bettina

    2018-01-01

    Paying selective attention to an audio frequency selectively enhances activity within primary auditory cortex (PAC) at the tonotopic site (frequency channel) representing that frequency. Animal PAC neurons achieve this 'frequency-specific attentional spotlight' by adapting their frequency tuning,

  15. Selective attention reduces physiological noise in the external ear canals of humans. I: Auditory attention

    Science.gov (United States)

    Walsh, Kyle P.; Pasanen, Edward G.; McFadden, Dennis

    2014-01-01

    In this study, a nonlinear version of the stimulus-frequency OAE (SFOAE), called the nSFOAE, was used to measure cochlear responses from human subjects while they simultaneously performed behavioral tasks requiring, or not requiring, selective auditory attention. Appended to each stimulus presentation, and included in the calculation of each nSFOAE response, was a 30-ms silent period that was used to estimate the level of the inherent physiological noise in the ear canals of our subjects during each behavioral condition. Physiological-noise magnitudes were higher (noisier) for all subjects in the inattention task, and lower (quieter) in the selective auditory-attention tasks. These noise measures initially were made at the frequency of our nSFOAE probe tone (4.0 kHz), but the same attention effects also were observed across a wide range of frequencies. We attribute the observed differences in physiological-noise magnitudes between the inattention and attention conditions to different levels of efferent activation associated with the differing attentional demands of the behavioral tasks. One hypothesis is that when the attentional demand is relatively great, efferent activation is relatively high, and a decrease in the gain of the cochlear amplifier leads to lower-amplitude cochlear activity, and thus a smaller measure of noise from the ear. PMID:24732069
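
    The noise estimate described above reduces, in essence, to the rms magnitude of the microphone signal recorded during the silent period. A minimal sketch, with hypothetical sample values chosen only to illustrate the quieter-under-attention pattern reported here:

```python
# Estimating noise magnitude from a silent recording segment, as with the
# 30-ms silent period appended to each nSFOAE stimulus. The magnitude is
# taken here as the rms of the ear-canal microphone samples; all sample
# values below are hypothetical, for illustration only.
import math

def rms(samples):
    """Root-mean-square magnitude of a sequence of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Two hypothetical silent-period noise floors: the attention condition is
# quieter than inattention, mirroring the direction of the reported effect.
inattention_noise = [0.004, -0.005, 0.006, -0.004, 0.005, -0.006]
attention_noise   = [0.002, -0.002, 0.003, -0.002, 0.002, -0.003]

print(f"inattention rms: {rms(inattention_noise):.4f}")
print(f"attention rms:   {rms(attention_noise):.4f}")
```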

  16. Visual form predictions facilitate auditory processing at the N1.

    Science.gov (United States)

    Paris, Tim; Kim, Jeesun; Davis, Chris

    2017-02-20

    Auditory-visual (AV) events often involve a leading visual cue (e.g. auditory-visual speech) that allows the perceiver to generate predictions about the upcoming auditory event. Electrophysiological evidence suggests that when an auditory event is predicted, processing is sped up, i.e., the N1 component of the ERP occurs earlier (N1 facilitation). However, it is not clear (1) whether N1 facilitation is based specifically on predictive rather than multisensory integration and (2) which particular properties of the visual cue it is based on. The current experiment used artificial AV stimuli in which visual cues predicted but did not co-occur with auditory cues. Visual form cues (high and low salience) and the auditory-visual pairing were manipulated so that auditory predictions could be based on form and timing or on timing only. The results showed that N1 facilitation occurred only for combined form and temporal predictions. These results suggest that faster auditory processing (as indicated by N1 facilitation) is based on predictive processing generated by a visual cue that clearly predicts both what and when the auditory stimulus will occur. Copyright © 2016. Published by Elsevier Ltd.

  17. Auditory selective attention in adolescents with major depression: An event-related potential study.

    Science.gov (United States)

    Greimel, E; Trinkl, M; Bartling, J; Bakos, S; Grossheinrich, N; Schulte-Körne, G

    2015-02-01

    Major depression (MD) is associated with deficits in selective attention. Previous studies in adults with MD using event-related potentials (ERPs) reported abnormalities in the neurophysiological correlates of auditory selective attention. However, it is yet unclear whether these findings can be generalized to MD in adolescence. Thus, the aim of the present ERP study was to explore the neural mechanisms of auditory selective attention in adolescents with MD. 24 male and female unmedicated adolescents with MD and 21 control subjects were included in the study. ERPs were collected during an auditory oddball paradigm. Depressive adolescents tended to show a longer N100 latency to target and non-target tones. Moreover, MD subjects showed a prolonged latency of the P200 component to targets. Across groups, longer P200 latency was associated with a decreased tendency toward disinhibited behavior as assessed by a behavioral questionnaire. To be able to draw more precise conclusions about differences between the neural bases of selective attention in adolescents vs. adults with MD, future studies should include both age groups and apply the same experimental setting across all subjects. The study provides strong support for abnormalities in the neurophysiological bases of selective attention in adolescents with MD at early stages of auditory information processing. Absent group differences in later ERP components reflecting voluntary attentional processes stand in contrast to results reported in adults with MD and may suggest that adolescents with MD possess mechanisms to compensate for abnormalities in the early stages of selective attention. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Auditory Attentional Capture: Effects of Singleton Distractor Sounds

    Science.gov (United States)

    Dalton, Polly; Lavie, Nilli

    2004-01-01

    The phenomenon of attentional capture by a unique yet irrelevant singleton distractor has typically been studied in visual search. In this article, the authors examine whether a similar phenomenon occurs in the auditory domain. Participants searched sequences of sounds for targets defined by frequency, intensity, or duration. The presence of a…

  19. Contingent capture of involuntary visual attention interferes with detection of auditory stimuli.

    Science.gov (United States)

    Kamke, Marc R; Harris, Jill

    2014-01-01

    The involuntary capture of attention by salient visual stimuli can be influenced by the behavioral goals of an observer. For example, when searching for a target item, irrelevant items that possess the target-defining characteristic capture attention more strongly than items not possessing that feature. Such contingent capture involves a shift of spatial attention toward the item with the target-defining characteristic. It is not clear, however, if the associated decrements in performance for detecting the target item are entirely due to involuntary orienting of spatial attention. To investigate whether contingent capture also involves a non-spatial interference, adult observers were presented with streams of visual and auditory stimuli and were tasked with simultaneously monitoring for targets in each modality. Visual and auditory targets could be preceded by a lateralized visual distractor that either did, or did not, possess the target-defining feature (a specific color). In agreement with the contingent capture hypothesis, target-colored distractors interfered with visual detection performance (response time and accuracy) more than distractors that did not possess the target color. Importantly, the same pattern of results was obtained for the auditory task: visual target-colored distractors interfered with sound detection. The decrement in auditory performance following a target-colored distractor suggests that contingent capture involves a source of processing interference in addition to that caused by a spatial shift of attention. Specifically, we argue that distractors possessing the target-defining characteristic enter a capacity-limited, serial stage of neural processing, which delays detection of subsequently presented stimuli regardless of the sensory modality.

  20. Contingent capture of involuntary visual attention interferes with detection of auditory stimuli

    Directory of Open Access Journals (Sweden)

    Marc R. Kamke

    2014-06-01

    Full Text Available The involuntary capture of attention by salient visual stimuli can be influenced by the behavioral goals of an observer. For example, when searching for a target item, irrelevant items that possess the target-defining characteristic capture attention more strongly than items not possessing that feature. Such contingent capture involves a shift of spatial attention toward the item with the target-defining characteristic. It is not clear, however, if the associated decrements in performance for detecting the target item are entirely due to involuntary orienting of spatial attention. To investigate whether contingent capture also involves a non-spatial interference, adult observers were presented with streams of visual and auditory stimuli and were tasked with simultaneously monitoring for targets in each modality. Visual and auditory targets could be preceded by a lateralized visual distractor that either did, or did not, possess the target-defining feature (a specific color). In agreement with the contingent capture hypothesis, target-colored distractors interfered with visual detection performance (response time and accuracy) more than distractors that did not possess the target color. Importantly, the same pattern of results was obtained for the auditory task: visual target-colored distractors interfered with sound detection. The decrement in auditory performance following a target-colored distractor suggests that contingent capture involves a source of processing interference in addition to that caused by a spatial shift of attention. Specifically, we argue that distractors possessing the target-defining characteristic enter a capacity-limited, serial stage of neural processing, which delays detection of subsequently presented stimuli regardless of the sensory modality.

  1. Peripheral Visual Cues: Their Fate in Processing and Effects on Attention and Temporal-Order Perception.

    Science.gov (United States)

    Tünnermann, Jan; Scharlau, Ingrid

    2016-01-01

    Peripheral visual cues lead to large shifts in psychometric distributions of temporal-order judgments. In one view, such shifts are attributed to attention speeding up processing of the cued stimulus, so-called prior entry. However, sometimes these shifts are so large that it is unlikely that they are caused by attention alone. Here we tested the prevalent alternative explanation that the cue is sometimes confused with the target on a perceptual level, bolstering the shift of the psychometric function. We applied a novel model of cued temporal-order judgments, derived from Bundesen's Theory of Visual Attention. We found that cue-target confusions indeed contribute to shifting psychometric functions. However, cue-induced changes in the processing rates of the target stimuli play an important role, too. At smaller cueing intervals, the cue increased the processing speed of the target. At larger intervals, inhibition of return was predominant. Earlier studies of cued TOJs were insensitive to these effects because in psychometric distributions they are concealed by the conjoint effects of cue-target confusions and processing rate changes.

  2. Validating emotional attention regulation as a component of emotional intelligence: A Stroop approach to individual differences in tuning in to and out of nonverbal cues.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Jang, Daisung; Sharma, Sudeep; Sanchez-Burks, Jeffrey

    2017-03-01

    Emotional intelligence (EI) has captivated researchers and the public alike, but it has been challenging to establish its components as objective abilities. Self-report scales lack divergent validity from personality traits, and few ability tests have objectively correct answers. We adapt the Stroop task to introduce a new facet of EI called emotional attention regulation (EAR), which involves focusing emotion-related attention for the sake of information processing rather than for the sake of regulating one's own internal state. EAR includes 2 distinct components. First, tuning in to nonverbal cues involves identifying nonverbal cues while ignoring alternate content, that is, emotion recognition under conditions of distraction by competing stimuli. Second, tuning out of nonverbal cues involves ignoring nonverbal cues while identifying alternate content, that is, the ability to interrupt emotion recognition when needed to focus attention elsewhere. An auditory test of valence included positive and negative words spoken in positive and negative vocal tones. A visual test of approach-avoidance included green- and red-colored facial expressions depicting happiness and anger. The error rates for incongruent trials met the key criteria for establishing the validity of an EI test, in that the measure demonstrated test-retest reliability, convergent validity with other EI measures, divergent validity from factors such as general processing speed and, for the most part, personality, and predictive validity, in this case for well-being. By demonstrating that facets of EI can be validly theorized and empirically assessed, the results also speak to the validity of EI more generally.
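
    The error-rate measure described in this abstract can be illustrated with a small scoring sketch. This is a hypothetical reconstruction, not the authors' scoring code: the trial encoding, field names, and the simple incongruent-minus-congruent interference score are illustrative assumptions.

```python
# Hypothetical trial records for the auditory valence task described above:
# each trial pairs a spoken word's valence with the vocal tone's valence,
# and the participant reports one dimension while ignoring the other.
trials = [
    {"word": "pos", "tone": "pos", "response": "pos"},  # congruent, correct
    {"word": "neg", "tone": "pos", "response": "neg"},  # incongruent, error on tone task
    {"word": "neg", "tone": "neg", "response": "neg"},  # congruent, correct
    {"word": "pos", "tone": "neg", "response": "pos"},  # incongruent, error on tone task
]

def error_rate(trials, target, congruent):
    """Error rate over trials whose congruency (word valence == tone valence)
    matches `congruent`, scoring responses against the `target` dimension
    ('tone' for tuning in to nonverbal cues, 'word' for tuning out)."""
    sel = [t for t in trials if (t["word"] == t["tone"]) == congruent]
    if not sel:
        return float("nan")
    errors = sum(t["response"] != t[target] for t in sel)
    return errors / len(sel)

# Interference score: incongruent minus congruent error rate on the tone task
interference = error_rate(trials, "tone", congruent=False) - error_rate(trials, "tone", congruent=True)
print(interference)  # 1.0 with this toy data
```

    With real data the same scoring would be applied separately to the tuning-in and tuning-out conditions before computing reliability and validity statistics.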

  3. Step-to-step variability in treadmill walking: influence of rhythmic auditory cueing.

    Directory of Open Access Journals (Sweden)

    Philippe Terrier

    Full Text Available While walking, human beings continuously adjust step length (SpL), step time (SpT), step speed (SpS = SpL/SpT), and step width (SpW) by integrating both feedforward and feedback mechanisms. These motor control processes result in correlations of gait parameters between consecutive strides (statistical persistence). Constraining gait with a speed cue (treadmill) and/or a rhythmic auditory cue (metronome) modifies the statistical persistence to anti-persistence. The objective was to analyze whether the combined effect of treadmill and rhythmic auditory cueing (RAC) modified not only statistical persistence, but also fluctuation magnitude (standard deviation, SD) and stationarity of SpL, SpT, SpS and SpW. Twenty healthy subjects performed 6 × 5 min walking tests at various imposed speeds on a treadmill instrumented with foot-pressure sensors. Freely-chosen walking cadences were assessed during the first three trials, and then imposed accordingly in the last trials with a metronome. Fluctuation magnitude (SD) of SpT, SpL, SpS and SpW was assessed, as well as a NonStationarity Index (NSI), which estimates the dispersion of local means in the time series (SD of 20 local means over 10 steps). No effect of RAC on fluctuation magnitude (SD) was observed. SpW was not modified by RAC, which is likely evidence that lateral foot placement is regulated separately. Stationarity (NSI) was modified by RAC in the same manner as the persistence pattern: the treadmill induced low NSI in the time series of SpS, and high NSI in SpT and SpL. On the contrary, SpT, SpL and SpS exhibited low NSI under the RAC condition. We used a relatively short sample of consecutive strides (100) as compared to the number of strides usually required to analyze fluctuation dynamics (200 to 1000 strides). Therefore, the responsiveness of the stationarity measure (NSI) to cued walking opens the perspective of performing short walking tests adapted to patients with a reduced gait perimeter.

  4. Human pupillary dilation response to deviant auditory stimuli: Effects of stimulus properties and voluntary attention

    Directory of Open Access Journals (Sweden)

    Hsin-I eLiao

    2016-02-01

    Full Text Available A unique sound that deviates from a repetitive background sound induces signature neural responses, such as mismatch negativity and the novelty P3 response in electro-encephalography studies. Here we show that a deviant auditory stimulus induces a human pupillary dilation response (PDR) that is sensitive to the stimulus properties, irrespective of whether attention is directed to the sounds or not. In an auditory oddball sequence, we used white noise and 2000-Hz tones as oddballs against repeated 1000-Hz tones. Participants' pupillary responses were recorded while they listened to the auditory oddball sequence. In Experiment 1, they were not involved in any task. Results show that pupils dilated to the noise oddballs for approximately 4 s, but no such PDR was found for the 2000-Hz tone oddballs. In Experiment 2, two types of visual oddballs were presented synchronously with the auditory oddballs. Participants discriminated the auditory or visual oddballs while trying to ignore stimuli from the other modality. The purpose of this manipulation was to direct attention to, or away from, the auditory sequence. In Experiment 3, the visual oddballs and the auditory oddballs were always presented asynchronously to prevent residual attention on the to-be-ignored oddballs due to their concurrence with the attended oddballs. Results show that pupils dilated to both the noise and 2000-Hz tone oddballs in all conditions. Most importantly, PDRs to noise were larger than those to the 2000-Hz tone oddballs regardless of the attention condition in both experiments. The overall results suggest that the stimulus-dependent factor of the PDR appears to be independent of attention.

  5. Human Pupillary Dilation Response to Deviant Auditory Stimuli: Effects of Stimulus Properties and Voluntary Attention.

    Science.gov (United States)

    Liao, Hsin-I; Yoneya, Makoto; Kidani, Shunsuke; Kashino, Makio; Furukawa, Shigeto

    2016-01-01

    A unique sound that deviates from a repetitive background sound induces signature neural responses, such as mismatch negativity and the novelty P3 response in electro-encephalography studies. Here we show that a deviant auditory stimulus induces a human pupillary dilation response (PDR) that is sensitive to the stimulus properties, irrespective of whether attention is directed to the sounds or not. In an auditory oddball sequence, we used white noise and 2000-Hz tones as oddballs against repeated 1000-Hz tones. Participants' pupillary responses were recorded while they listened to the auditory oddball sequence. In Experiment 1, they were not involved in any task. Results show that pupils dilated to the noise oddballs for approximately 4 s, but no such PDR was found for the 2000-Hz tone oddballs. In Experiment 2, two types of visual oddballs were presented synchronously with the auditory oddballs. Participants discriminated the auditory or visual oddballs while trying to ignore stimuli from the other modality. The purpose of this manipulation was to direct attention to, or away from, the auditory sequence. In Experiment 3, the visual oddballs and the auditory oddballs were always presented asynchronously to prevent residual attention on the to-be-ignored oddballs due to their concurrence with the attended oddballs. Results show that pupils dilated to both the noise and 2000-Hz tone oddballs in all conditions. Most importantly, PDRs to noise were larger than those to the 2000-Hz tone oddballs regardless of the attention condition in both experiments. The overall results suggest that the stimulus-dependent factor of the PDR appears to be independent of attention.
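
    The abstract does not describe its analysis pipeline, but a pupillary dilation response is conventionally quantified as the baseline-corrected change in pupil diameter after stimulus onset. A minimal sketch under that assumption; the function name, sampling rate, and window lengths are illustrative choices, not the paper's parameters.

```python
import numpy as np

def pupil_dilation_response(trace, fs=60, baseline_s=0.5, window_s=4.0):
    """Baseline-corrected mean pupil dilation for one trial.
    trace: pupil diameter samples; stimulus onset is assumed to fall
    immediately after the baseline period."""
    b = int(baseline_s * fs)            # baseline samples before onset
    w = int(window_s * fs)              # analysis window after onset
    baseline = np.mean(trace[:b])
    return np.mean(trace[b:b + w] - baseline)

# Toy traces (arbitrary units): one trial that dilates after onset, one flat
fs = 60
t = np.arange(int(4.5 * fs)) / fs
dilating = 3.0 + 0.2 * np.clip(t - 0.5, 0.0, None)  # ramps up after 0.5 s
flat = np.full_like(t, 3.0)
print(pupil_dilation_response(dilating, fs) > pupil_dilation_response(flat, fs))  # True
```

    Averaging such per-trial values by oddball type (noise vs. 2000-Hz tone) and attention condition would yield the condition comparisons reported above.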

  6. Control of Auditory Attention in Children With Specific Language Impairment.

    Science.gov (United States)

    Victorino, Kristen R; Schwartz, Richard G

    2015-08-01

    Children with specific language impairment (SLI) appear to demonstrate deficits in attention and its control. Selective attention involves the cognitive control of attention directed toward a relevant stimulus and simultaneous inhibition of attention toward irrelevant stimuli. The current study examined attention control during a cross-modal word recognition task. Twenty participants with SLI (ages 9-12 years) and 20 age-matched peers with typical language development (TLD) listened to words through headphones and were instructed to attend to the words in 1 ear while ignoring the words in the other ear. They were simultaneously presented with pictures and asked to make a lexical decision about whether the pictures and auditory words were the same or different. Accuracy and reaction time were measured in 5 conditions, in which the stimulus in the unattended channel was manipulated. The groups performed with similar accuracy. Compared with their peers with TLD, children with SLI had slower reaction times overall and different within-group patterns of performance by condition. Children with TLD showed efficient inhibitory control in conditions that required active suppression of competing stimuli. Participants with SLI had difficulty exerting control over their auditory attention in all conditions, with particular difficulty inhibiting distractors of all types.

  7. Attention-driven auditory cortex short-term plasticity helps segregate relevant sounds from noise

    OpenAIRE

    Ahveninen, Jyrki; Hämäläinen, Matti; Jääskeläinen, Iiro P.; Ahlfors, Seppo P.; Huang, Samantha; Lin, Fa-Hsuan; Raij, Tommi; Sams, Mikko; Vasios, Christos E.; Belliveau, John W.

    2011-01-01

    How can we concentrate on relevant sounds in noisy environments? A “gain model” suggests that auditory attention simply amplifies relevant and suppresses irrelevant afferent inputs. However, it is unclear whether this suffices when attended and ignored features overlap to stimulate the same neuronal receptive fields. A “tuning model” suggests that, in addition to gain, attention modulates feature selectivity of auditory neurons. We recorded magnetoencephalography, EEG, and functional MRI (fMR...

  8. Broken Expectations: Violation of Expectancies, Not Novelty, Captures Auditory Attention

    Science.gov (United States)

    Vachon, Francois; Hughes, Robert W.; Jones, Dylan M.

    2012-01-01

    The role of memory in behavioral distraction by auditory attentional capture was investigated: We examined whether capture is a product of the novelty of the capturing event (i.e., the absence of a recent memory for the event) or its violation of learned expectancies on the basis of a memory for an event structure. Attentional capture--indicated…

  9. Dynamic crossmodal links revealed by steady-state responses in auditory-visual divided attention

    NARCIS (Netherlands)

    de Jong, Ritske; Toffanin, Paolo; Harbers, Marten; Martens, Sander

    Frequency tagging has often been used to study intramodal attention but not intermodal attention. We used EEG and simultaneous frequency tagging of auditory and visual sources to study intermodal focused and divided attention in detection and discrimination performance. Divided-attention costs were

  10. Size and synchronization of auditory cortex promotes musical, literacy, and attentional skills in children.

    Science.gov (United States)

    Seither-Preisler, Annemarie; Parncutt, Richard; Schneider, Peter

    2014-08-13

    Playing a musical instrument is associated with numerous neural processes that continuously modify the human brain and may facilitate characteristic auditory skills. In a longitudinal study, we investigated the auditory and neural plasticity of musical learning in 111 young children (aged 7-9 y) as a function of the intensity of instrumental practice and musical aptitude. Because of the frequent co-occurrence of central auditory processing disorders and attentional deficits, we also tested 21 children with attention deficit (hyperactivity) disorder [AD(H)D]. Magnetic resonance imaging and magnetoencephalography revealed enlarged Heschl's gyri and enhanced right-left hemispheric synchronization of the primary evoked response (P1) to harmonic complex sounds in children who spent more time practicing a musical instrument. The anatomical characteristics were positively correlated with frequency discrimination, reading, and spelling skills. Conversely, AD(H)D children showed reduced volumes of Heschl's gyri and enhanced volumes of the plana temporalia that were associated with a distinct bilateral P1 asynchrony. This may indicate a risk for central auditory processing disorders that are often associated with attentional and literacy problems. The longitudinal comparisons revealed a very high stability of auditory cortex morphology and gray matter volumes, suggesting that the combined anatomical and functional parameters are neural markers of musicality and attention deficits. Educational and clinical implications are considered. Copyright © 2014 the authors 0270-6474/14/3410937-13$15.00/0.

  11. Distraction task rather than focal attention modulates gamma activity associated with auditory steady-state responses (ASSRs)

    DEFF Research Database (Denmark)

    Griskova-Bulanova, Inga; Ruksenas, Osvaldas; Dapsys, Kastytis

    2011-01-01

    To explore the modulation of the auditory steady-state response (ASSR) by experimental tasks differing in attentional focus and arousal level.

  12. Vestibular Stimulation and Auditory Perception in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Azin Salamati

    2014-09-01

    Full Text Available Objectives: Rehabilitation strategies play a pivotal role in relieving inappropriate behaviors and improving children's performance during school. Concentration and visual and auditory comprehension in children are crucial to effective learning and have drawn interest from researchers and clinicians. Vestibular function deficits usually cause a high level of alertness and vigilance, problems in maintaining focus and paying selective attention, and alterations in precision and attention to the stimulus. The aim of this study is to investigate the correlation between vestibular stimulation and auditory perception in children with attention deficit hyperactivity disorder. Methods: In total, 30 children aged from 7 to 12 years with attention deficit hyperactivity disorder participated in this study. They were assessed based on the criteria of the Diagnostic and Statistical Manual of Mental Disorders. After obtaining guardian and parental consent, they were enrolled and randomly assigned, matched on age, to intervention and control groups. The Integrated Visual and Auditory continuous performance test was carried out as a pre-test. Those in the intervention group received vestibular stimulation during the therapy sessions, twice a week for 10 weeks. At the end, the test was administered to both groups as a post-test. Results: Pre- and post-test scores were measured, and the mean differences between the two groups were compared. Statistical analyses found a significant difference in the mean differences regarding auditory comprehension improvement. Discussion: The findings suggest that vestibular training is a reliable and powerful treatment option for attention deficit hyperactivity disorder, especially along with other trainings, meaning that stimulating the sense of balance highlights the importance of the interaction between inhibition and cognition.

  13. Selective Attention to Visual Stimuli Using Auditory Distractors Is Altered in Alpha-9 Nicotinic Receptor Subunit Knock-Out Mice.

    Science.gov (United States)

    Terreros, Gonzalo; Jorratt, Pascal; Aedo, Cristian; Elgoyhen, Ana Belén; Delano, Paul H

    2016-07-06

    During selective attention, subjects voluntarily focus their cognitive resources on a specific stimulus while ignoring others. Top-down filtering of peripheral sensory responses by higher structures of the brain has been proposed as one of the mechanisms responsible for selective attention. A prerequisite to accomplish top-down modulation of the activity of peripheral structures is the presence of corticofugal pathways. The mammalian auditory efferent system is a unique neural network that originates in the auditory cortex and projects to the cochlear receptor through the olivocochlear bundle, and it has been proposed to function as a top-down filter of peripheral auditory responses during attention to cross-modal stimuli. However, to date, there is no conclusive evidence of the involvement of olivocochlear neurons in selective attention paradigms. Here, we trained wild-type and α-9 nicotinic receptor subunit knock-out (KO) mice, which lack cholinergic transmission between medial olivocochlear neurons and outer hair cells, in a two-choice visual discrimination task and studied the behavioral consequences of adding different types of auditory distractors. In addition, we evaluated the effects of contralateral noise on auditory nerve responses as a measure of the individual strength of the olivocochlear reflex. We demonstrate that KO mice have a reduced olivocochlear reflex strength and perform poorly in a visual selective attention paradigm. These results confirm that an intact medial olivocochlear transmission aids in ignoring auditory distraction during selective attention to visual stimuli. The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through the olivocochlear system. It has been proposed to function as a top-down filter of peripheral auditory responses during attention to cross-modal stimuli. However, to date, there is no conclusive evidence of the involvement of olivocochlear

  14. Switching in the Cocktail Party: Exploring Intentional Control of Auditory Selective Attention

    Science.gov (United States)

    Koch, Iring; Lawo, Vera; Fels, Janina; Vorlander, Michael

    2011-01-01

    Using a novel variant of dichotic selective listening, we examined the control of auditory selective attention. In our task, subjects had to respond selectively to one of two simultaneously presented auditory stimuli (number words), always spoken by a female and a male speaker, by performing a numerical size categorization. The gender of the…

  15. Impact of Spatial and Verbal Short-Term Memory Load on Auditory Spatial Attention Gradients.

    Science.gov (United States)

    Golob, Edward J; Winston, Jenna; Mock, Jeffrey R

    2017-01-01

    Short-term memory load can impair attentional control, but prior work shows that the extent of the effect ranges from being very general to very specific. One factor for the mixed results may be reliance on point estimates of memory load effects on attention. Here we used auditory attention gradients as an analog measure to map out the impact of short-term memory load over space. Verbal or spatial information was maintained during an auditory spatial attention task, and performance was compared with a no-load condition. Stimuli were presented from five virtual locations in the frontal azimuth plane, and subjects focused on the midline. Reaction times progressively increased for lateral stimuli, indicating an attention gradient. Spatial load further slowed responses at lateral locations, particularly in the left hemispace, but had little effect at midline. Verbal memory load had no (Experiment 1), or a minimal (Experiment 2), influence on reaction times. Spatial and verbal load increased switch costs between memory encoding and attention tasks relative to the no-load condition. The findings show that short-term memory influences the distribution of auditory attention over space, and that the specific pattern depends on the type of information in short-term memory.

  16. Impact of Spatial and Verbal Short-Term Memory Load on Auditory Spatial Attention Gradients

    Directory of Open Access Journals (Sweden)

    Edward J. Golob

    2017-11-01

    Full Text Available Short-term memory load can impair attentional control, but prior work shows that the extent of the effect ranges from being very general to very specific. One factor for the mixed results may be reliance on point estimates of memory load effects on attention. Here we used auditory attention gradients as an analog measure to map out the impact of short-term memory load over space. Verbal or spatial information was maintained during an auditory spatial attention task, and performance was compared with a no-load condition. Stimuli were presented from five virtual locations in the frontal azimuth plane, and subjects focused on the midline. Reaction times progressively increased for lateral stimuli, indicating an attention gradient. Spatial load further slowed responses at lateral locations, particularly in the left hemispace, but had little effect at midline. Verbal memory load had no (Experiment 1), or a minimal (Experiment 2), influence on reaction times. Spatial and verbal load increased switch costs between memory encoding and attention tasks relative to the no-load condition. The findings show that short-term memory influences the distribution of auditory attention over space, and that the specific pattern depends on the type of information in short-term memory.

  17. Comparative Evaluation of Auditory Attention in 7 to 9 Year Old Learning Disabled Students

    Directory of Open Access Journals (Sweden)

    Fereshteh Amiriani

    2011-06-01

    Full Text Available Background and Aim: Learning disability is a term that refers to a group of disorders manifesting as listening, reading, writing, or mathematical problems. These children mostly have attention difficulties in the classroom that lead to many learning problems. In this study we aimed to compare the auditory attention of 7 to 9 year old children with learning disability to a non-learning-disabled, age-matched normal group. Methods: Twenty-seven male 7 to 9 year old students with learning disability and 27 age- and sex-matched normal controls were selected with non-probability simple sampling. In order to evaluate auditory selective and divided attention, Farsi versions of the speech-in-noise and dichotic digits tests were used, respectively. Results: Comparison of mean scores of the Farsi version of the speech-in-noise test in both ears of 7 and 8 year-old students in the two groups indicated no significant difference (p>0.05). Mean scores of 9 year-old controls were significantly higher than those of the cases only in the right ear (p=0.033). However, no significant difference was observed between mean scores of the dichotic digits test assessing the right ear of 9 year-old learning disability and non-learning disability students (p>0.05). Moreover, mean scores of 7 and 8 year-old students with learning disability were lower than those of their normal peers in the left ear (p>0.05). Conclusion: Selective auditory attention is not affected at the optimal signal-to-noise ratio, while divided attention seems to be affected by maturational delay of the auditory system or central auditory system disorders.

  18. Saccade frequency response to visual cues during gait in Parkinson's disease: the selective role of attention.

    Science.gov (United States)

    Stuart, Samuel; Lord, Sue; Galna, Brook; Rochester, Lynn

    2018-04-01

    Gait impairment is a core feature of Parkinson's disease (PD), with implications for falls risk. Visual cues improve gait in PD, but the underlying mechanisms are unclear. Evidence suggests that attention and vision play an important role; however, the relative contribution of each is unclear. Measurement of visual exploration (specifically saccade frequency) during gait allows for real-time measurement of attention and vision. Understanding how visual cues influence visual exploration may allow inferences about the underlying mechanisms of the response, which could help in developing effective therapeutics. This study aimed to examine saccade frequency during gait in response to a visual cue in PD and older adults, and to investigate the roles of attention and vision in visual cue response in PD. A mobile eye-tracker measured saccade frequency during gait in 55 people with PD and 32 age-matched controls. Participants walked in a straight line with and without a visual cue (50 cm transverse lines) presented under single-task and dual-task (concurrent digit span recall) conditions. Saccade frequency was reduced when walking in PD compared to controls; however, visual cues ameliorated this saccadic deficit. Visual cues significantly increased saccade frequency in both PD and controls under both single-task and dual-task conditions. Attention, rather than visual function, was central to the saccade frequency and gait response to visual cues in PD. In conclusion, this study highlights the impact of visual cues on visual exploration when walking and the important role of attention in PD. Understanding these complex features will help inform intervention development. © 2018 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  19. Conflict adaptation in time: foreperiods as contextual cues for attentional adjustment.

    Science.gov (United States)

    Wendt, Mike; Kiesel, Andrea

    2011-10-01

    Interference evoked by distractor stimulus information, such as flankers in the Eriksen task, is reduced when the proportion of conflicting stimuli is increased. This modulation is sensitive to contextual cues such as stimulus location or color, suggesting attentional adjustment to conflict contingencies on the basis of context information. In the present study, we explored whether conflict adjustment is modulated by temporal variation of conflict likelihood. To this end, we associated low and high proportions of conflict stimuli with foreperiods of different lengths. Flanker interference was higher with foreperiods associated with low conflict proportions, suggesting that participants use the foreperiod as a contextual cue for attentional adjustment. We conjecture that participants initially adopt the strategy useful for conflict contingencies associated with short foreperiods, and then readjust during the trial, in the absence of any additional exogenous cue, when the imperative stimulus has not occurred during a certain time interval.

  20. Inhibition of histone deacetylase 3 via RGFP966 facilitates cortical plasticity underlying unusually accurate auditory associative cue memory for excitatory and inhibitory cue-reward associations.

    Science.gov (United States)

    Shang, Andrea; Bylipudi, Sooraz; Bieszczad, Kasia M

    2018-05-31

    Epigenetic mechanisms are key for regulating long-term memory (LTM) and are known to exert control on memory formation in multiple systems of the adult brain, including the sensory cortex. One epigenetic mechanism is chromatin modification by histone acetylation. Blocking the action of histone deacetylases (HDACs), which normally negatively regulate LTM by repressing transcription, has been shown to enable memory formation. Indeed, HDAC inhibition appears to facilitate memory by altering the dynamics of gene expression events important for memory consolidation. However, less understood are the ways in which molecular-level consolidation processes alter subsequent memory to enhance storage or facilitate retrieval. Here we used a sensory perspective to investigate whether the characteristics of memory formed with HDAC inhibitors are different from naturally formed memory. One possibility is that HDAC inhibition enables memory to form with greater sensory detail than normal. Because the auditory system undergoes learning-induced remodeling that provides substrates for sound-specific LTM, we aimed to identify behavioral effects of HDAC inhibition on memory for specific sound features using a standard model of auditory associative cue-reward learning, memory, and cortical plasticity. We found that three systemic post-training treatments of an HDAC3 inhibitor (RGFP966, Abcam Inc.) in rats in the early phase of training facilitated auditory discriminative learning, changed auditory cortical tuning, and increased the specificity for acoustic frequency formed in memory of both excitatory (S+) and inhibitory (S-) associations for at least 2 weeks. The findings support that epigenetic mechanisms act on neural and behavioral sensory acuity to increase the precision of associative cue memory, which can be revealed by studying the sensory characteristics of long-term associative memory formation with HDAC inhibitors.

  1. Using multisensory cues to facilitate air traffic management.

    Science.gov (United States)

    Ngo, Mary K; Pierce, Russell S; Spence, Charles

    2012-12-01

    In the present study, we sought to investigate whether auditory and tactile cuing could be used to facilitate a complex, real-world air traffic management scenario. Auditory and tactile cuing provides an effective means of improving both the speed and accuracy of participants' performance in a variety of laboratory-based visual target detection and identification tasks. A low-fidelity air traffic simulation task was used in which participants monitored and controlled aircraft. The participants had to ensure that the aircraft landed or exited at the correct altitude, speed, and direction and that they maintained a safe separation from all other aircraft and boundaries. The performance measures recorded included en route time, handoff delay, and conflict resolution delay (the performance measure of interest). In a baseline condition, the aircraft in conflict was highlighted in red (visual cue), and in the experimental conditions, this standard visual cue was accompanied by a simultaneously presented auditory, vibrotactile, or audiotactile cue. Participants responded significantly more rapidly, but no less accurately, to conflicts when presented with an additional auditory or audiotactile cue than with either a vibrotactile or visual cue alone. Auditory and audiotactile cues have the potential for improving operator performance by reducing the time it takes to detect and respond to potential visual target events. These results have important implications for the design and use of multisensory cues in air traffic management.

  2. Does the sight of physical threat induce a tactile processing bias? Modality-specific attentional facilitation induced by viewing threatening pictures.

    Science.gov (United States)

    Van Damme, Stefaan; Gallace, Alberto; Spence, Charles; Crombez, Geert; Moseley, G Lorimer

    2009-02-09

    Threatening stimuli are thought to bias spatial attention toward the location from which the threat is presented. Although this effect is well-established in the visual domain, little is known regarding whether tactile attention is similarly affected by threatening pictures. We hypothesised that tactile attention might be more affected by cues implying physical threat to a person's bodily tissues than by cues implying general threat. In the present study, participants made temporal order judgments (TOJs) concerning which of a pair of tactile (or auditory) stimuli, one presented to either hand, at a range of inter-stimulus intervals, had been presented first. A picture (showing physical threat, general threat, or no threat) was presented in front of one or the other hand shortly before the tactile stimuli. The results revealed that tactile attention was biased toward the side on which the picture was presented, and that this effect was significantly larger for physical threat pictures than for general threat or neutral pictures. By contrast, the bias in auditory attention toward the side of the picture was significantly larger for general threat pictures than for physical threat pictures or neutral pictures. These findings therefore demonstrate a modality-specific effect of physically threatening cues on the processing of tactile stimuli, and of generally threatening cues on auditory information processing. These results demonstrate that the processing of tactile information from the body part closest to the threatening stimulus is prioritized over tactile information from elsewhere on the body.

  3. Can you hear me now? Musical training shapes functional brain networks for selective auditory attention and hearing speech in noise

    Directory of Open Access Journals (Sweden)

    Dana L Strait

    2011-06-01

    Full Text Available Even in the quietest of rooms, our senses are perpetually inundated by a barrage of sounds, requiring the auditory system to adapt to a variety of listening conditions in order to extract signals of interest (e.g., one speaker's voice amidst others). Brain networks that promote selective attention are thought to sharpen the neural encoding of a target signal, suppressing competing sounds and enhancing perceptual performance. Here, we ask: does musical training benefit cortical mechanisms that underlie selective attention to speech? To answer this question, we assessed the impact of selective auditory attention on cortical auditory-evoked response variability in musicians and nonmusicians. Outcomes indicate strengthened brain networks for selective auditory attention in musicians, in that musicians but not nonmusicians demonstrate decreased prefrontal response variability with auditory attention. Results are interpreted in the context of previous work from our laboratory documenting perceptual and subcortical advantages in musicians for the hearing and neural encoding of speech in background noise. Musicians' neural proficiency for selectively engaging and sustaining auditory attention to language indicates a potential benefit of music for auditory training. Given the importance of auditory attention for the development of language-related skills, musical training may aid in the prevention, habilitation and remediation of children with a wide range of attention-based language and learning impairments.

  4. Auditory Attention and Comprehension During a Simulated Night Shift: Effects of Task Characteristics.

    Science.gov (United States)

    Pilcher, June J; Jennings, Kristen S; Phillips, Ginger E; McCubbin, James A

    2016-11-01

    The current study investigated performance on a dual auditory task during a simulated night shift. Night shifts and sleep deprivation negatively affect performance on vigilance-based tasks, but less is known about their effects on complex tasks. Because language processing is necessary for successful work performance, it is important to understand how it is affected by night work and sleep deprivation. Sixty-two participants completed a simulated night shift resulting in 28 hr of total sleep deprivation. Performance on a vigilance task and a dual auditory language task was examined across four testing sessions. The results indicate that working at night negatively impacts vigilance, auditory attention, and comprehension. The effects on the auditory task varied based on the content of the auditory material: when the material was interesting and easy, the participants performed better, whereas night work had a greater negative effect when the auditory material was less interesting and more difficult. These findings support research showing that vigilance decreases during the night. The results suggest that auditory comprehension suffers when individuals are required to work at night. Maintaining attention and controlling effort, especially on passages that are less interesting or more difficult, could improve performance during night shifts. The results from the current study apply to many work environments where decision making is necessary in response to complex auditory information. Better predicting the effects of night work on language processing is important for developing improved means of coping with shiftwork. © 2016, Human Factors and Ergonomics Society.

  5. Attentional Capture by Deviant Sounds: A Noncontingent Form of Auditory Distraction?

    Science.gov (United States)

    Vachon, François; Labonté, Katherine; Marsh, John E.

    2017-01-01

    The occurrence of an unexpected, infrequent sound in an otherwise homogeneous auditory background tends to disrupt the ongoing cognitive task. This "deviation effect" is typically explained in terms of attentional capture whereby the deviant sound draws attention away from the focal activity, regardless of the nature of this activity.…

  6. Music-induced positive mood broadens the scope of auditory attention.

    Science.gov (United States)

    Putkinen, Vesa; Makkonen, Tommi; Eerola, Tuomas

    2017-07-01

    Previous studies indicate that positive mood broadens the scope of visual attention, which can manifest as heightened distractibility. We used event-related potentials (ERP) to investigate whether music-induced positive mood has comparable effects on selective attention in the auditory domain. Subjects listened to experimenter-selected happy, neutral or sad instrumental music and afterwards participated in a dichotic listening task. Distractor sounds in the unattended channel elicited responses related to early sound encoding (N1/MMN) and bottom-up attention capture (P3a) while target sounds in the attended channel elicited a response related to top-down-controlled processing of task-relevant stimuli (P3b). For the subjects in a happy mood, the N1/MMN responses to the distractor sounds were enlarged while the P3b elicited by the target sounds was diminished. Behaviorally, these subjects tended to show heightened error rates on target trials following the distractor sounds. Thus, the ERP and behavioral results indicate that the subjects in a happy mood allocated their attentional resources more diffusely across the attended and the to-be-ignored channels. Therefore, the current study extends previous research on the effects of mood on visual attention and indicates that even unfamiliar instrumental music can broaden the scope of auditory attention via its effects on mood. © The Author (2017). Published by Oxford University Press.

  7. Pre-attentive, context-specific representation of fear memory in the auditory cortex of rat.

    Directory of Open Access Journals (Sweden)

    Akihiro Funamizu

    Full Text Available Neural representation in the auditory cortex is rapidly modulated by both top-down attention and bottom-up stimulus properties, in order to improve perception in a given context. Learning-induced, pre-attentive map plasticity has also been studied in the anesthetized cortex; however, little attention has been paid to rapid, context-dependent modulation. We hypothesize that context-specific learning leads to pre-attentively modulated, multiplex representation in the auditory cortex. Here, we investigate map plasticity in the auditory cortices of anesthetized rats conditioned in a context-dependent manner, such that a conditioned stimulus (CS) of a 20-kHz tone and an unconditioned stimulus (US) of a mild electrical shock were associated only under a noisy auditory context, but not in silence. After the conditioning, although no distinct plasticity was found in the tonotopic map, tone-evoked responses were more noise-resistive than pre-conditioning. Yet, the conditioned group showed a reduced spread of activation to each tone with noise, but not with silence, associated with a sharpening of frequency tuning. The encoding accuracy index of neurons showed that conditioning deteriorated the accuracy of tone-frequency representations in the noisy condition at off-CS regions, but not at CS regions, suggesting that arbitrary tones around the frequency of the CS were more likely perceived as the CS in a specific context, where the CS was associated with the US. These results together demonstrate that learning-induced plasticity in the auditory cortex occurs in a context-dependent manner.

  8. Pre-attentive, context-specific representation of fear memory in the auditory cortex of rat.

    Science.gov (United States)

    Funamizu, Akihiro; Kanzaki, Ryohei; Takahashi, Hirokazu

    2013-01-01

    Neural representation in the auditory cortex is rapidly modulated by both top-down attention and bottom-up stimulus properties, in order to improve perception in a given context. Learning-induced, pre-attentive map plasticity has also been studied in the anesthetized cortex; however, little attention has been paid to rapid, context-dependent modulation. We hypothesize that context-specific learning leads to pre-attentively modulated, multiplex representation in the auditory cortex. Here, we investigate map plasticity in the auditory cortices of anesthetized rats conditioned in a context-dependent manner, such that a conditioned stimulus (CS) of a 20-kHz tone and an unconditioned stimulus (US) of a mild electrical shock were associated only under a noisy auditory context, but not in silence. After the conditioning, although no distinct plasticity was found in the tonotopic map, tone-evoked responses were more noise-resistive than pre-conditioning. Yet, the conditioned group showed a reduced spread of activation to each tone with noise, but not with silence, associated with a sharpening of frequency tuning. The encoding accuracy index of neurons showed that conditioning deteriorated the accuracy of tone-frequency representations in the noisy condition at off-CS regions, but not at CS regions, suggesting that arbitrary tones around the frequency of the CS were more likely perceived as the CS in a specific context, where the CS was associated with the US. These results together demonstrate that learning-induced plasticity in the auditory cortex occurs in a context-dependent manner.

  9. The footprints of visual attention during search with 100% valid and 100% invalid cues.

    Science.gov (United States)

    Eckstein, Miguel P; Pham, Binh T; Shimozaki, Steven S

    2004-06-01

    Human performance during visual search typically improves when spatial cues indicate the possible target locations. In many instances, the performance improvement is quantitatively predicted by a Bayesian or quasi-Bayesian observer in which visual attention simply selects the information at the cued locations without changing the quality of processing or sensitivity and ignores the information at the uncued locations. Aside from the general good agreement between the effect of the cue on model and human performance, there has been little independent confirmation that humans are effectively selecting the relevant information. In this study, we used the classification image technique to assess the effectiveness of spatial cues in the attentional selection of relevant locations and suppression of irrelevant locations indicated by spatial cues. Observers searched for a bright target among dimmer distractors that might appear (with 50% probability) in one of eight locations in visual white noise. The possible target location was indicated using a 100% valid box cue or seven 100% invalid box cues in which the only potential target location was uncued. For both conditions, we found statistically significant perceptual templates shaped as differences of Gaussians at the relevant locations, with no perceptual templates at the irrelevant locations. We did not find statistically significant differences between the shapes of the inferred perceptual templates for the 100% valid and 100% invalid cue conditions. The results confirm the idea that during search visual attention allows the observer to effectively select relevant information and ignore irrelevant information. The results for the 100% invalid cues condition suggest that the selection process is not drawn automatically to the cue but can be under the observer's voluntary control.
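
    The classification-image technique mentioned above has a simple computational core: under white-noise stimulation, average the noise fields separately by the observer's response and take the difference. A minimal sketch with a simulated observer (all parameters are illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_pix = 5000, 64

# A Gaussian bump standing in for the observer's perceptual template
template = np.exp(-((np.arange(n_pix) - 32.0) ** 2) / (2 * 4.0 ** 2))

noise = rng.normal(0.0, 1.0, size=(n_trials, n_pix))   # white-noise field per trial
signal_present = rng.random(n_trials) < 0.5            # target shown on half the trials
evidence = noise @ template + np.where(signal_present, 2.0, 0.0)
said_yes = evidence > 1.0                              # simulated yes/no decisions

# Classification image: mean noise on "yes" trials minus mean noise on "no" trials.
# It recovers a noisy, scaled copy of the template the decisions were based on.
cimg = noise[said_yes].mean(axis=0) - noise[~said_yes].mean(axis=0)
```

    With enough trials, the classification image correlates strongly with the template; this is the sense in which the technique reveals which locations the observer actually used.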

  10. Interaction between endogenous and exogenous orienting in crossmodal attention.

    Science.gov (United States)

    Chen, Xiaoxi; Chen, Qi; Gao, Dingguo; Yue, Zhenzhu

    2012-08-01

    Using a cue-target paradigm, we investigated the interaction between endogenous and exogenous orienting in cross-modal attention. A peripheral (exogenous) cue was presented after a central (endogenous) cue with a variable time interval. The endogenous and exogenous cues were presented in one sensory modality (auditory in Experiment 1 and visual in Experiment 2) whereas the target was presented in another modality. Both experiments showed a significant endogenous cuing effect (longer reaction times in the invalid condition than in the valid condition). However, exogenous cuing produced a facilitatory effect in both experiments in response to the target when endogenous cuing was valid, but it elicited a facilitatory effect in Experiment 1 and an inhibitory effect in Experiment 2 when endogenous cuing was invalid. These findings indicate that endogenous and exogenous cuing can co-operate in orienting attention to the crossmodal target. Moreover, the interaction between endogenous and exogenous orienting of attention is modulated by the modality between the cue and the target. © 2012 The Authors. Scandinavian Journal of Psychology © 2012 The Scandinavian Psychological Associations.

  11. Auditory stream segregation using amplitude modulated bandpass noise

    Directory of Open Access Journals (Sweden)

    Yingjiu eNie

    2015-08-01

    Full Text Available The purpose of this study was to investigate the roles of spectral overlap and amplitude modulation (AM) rate in stream segregation for noise signals, as well as to test the build-up effect based on these two cues. Segregation ability was evaluated using an objective paradigm with listeners' attention focused on stream segregation. Stimulus sequences consisted of two interleaved sets of bandpass noise bursts (A and B bursts). The A and B bursts differed in spectrum, AM rate, or both, and the amount of the difference between the two sets of noise bursts was varied. Long and short sequences were studied to investigate the build-up effect for segregation based on spectral and AM-rate differences. Results showed the following: 1. Stream segregation ability increased with greater spectral separation. 2. Larger AM-rate separations were associated with stronger segregation abilities. 3. Spectral separation was found to elicit the build-up effect for the range of spectral differences assessed in the current study. 4. AM-rate separation interacted with spectral separation, suggesting an additive effect of spectral separation and AM-rate separation on segregation build-up. The findings suggest that, when normal-hearing listeners direct their attention toward segregation, they are able to segregate auditory streams based on reduced spectral contrast cues that vary by the amount of spectral overlap. Further, regardless of the spectral separation, they were able to use AM-rate difference as a secondary, weaker cue. Based on spectral differences, listeners can segregate auditory streams better as the listening duration is prolonged, i.e., sparse spectral cues elicit build-up segregation; however, AM-rate differences only appear to elicit build-up in combination with spectral difference cues.
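
    A rough sketch of how such stimuli can be generated (illustrative parameters only; this is not the study's stimulus code): bandpass-filter Gaussian noise in the frequency domain, then impose a sinusoidal amplitude envelope at the desired AM rate.

```python
import numpy as np

fs = 16000                 # sampling rate in Hz (illustrative)
dur = 0.1                  # burst duration in seconds
n = int(fs * dur)
rng = np.random.default_rng(1)

def am_bandpass_burst(f_lo, f_hi, am_rate, depth=1.0):
    """Bandpass Gaussian noise burst, amplitude-modulated at am_rate Hz."""
    spec = np.fft.rfft(rng.normal(size=n))
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0        # brick-wall bandpass
    carrier = np.fft.irfft(spec, n)
    t = np.arange(n) / fs
    return carrier * (1.0 + depth * np.sin(2 * np.pi * am_rate * t))

# Two interleavable bursts differing in both spectrum and AM rate
burst_a = am_bandpass_burst(500, 1000, am_rate=40)
burst_b = am_bandpass_burst(1500, 3000, am_rate=80)
```

    Modulation spreads each spectral component into sidebands at plus or minus the AM rate, so the bursts stay essentially confined to their passbands while carrying distinct envelope cues.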

  12. Selective attention in normal and impaired hearing.

    Science.gov (United States)

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  13. The Prelimbic Cortex Directs Attention toward Predictive Cues during Fear Learning

    Science.gov (United States)

    Sharpe, Melissa J.; Killcross, Simon

    2015-01-01

    The prelimbic cortex is argued to promote conditioned fear expression, at odds with appetitive research implicating this region in attentional processing. Consistent with an attentional account, we report that the effect of prelimbic lesions on fear expression depends on the degree of competition between contextual and discrete cues. Further, when…

  14. A Brief Period of Postnatal Visual Deprivation Alters the Balance between Auditory and Visual Attention.

    Science.gov (United States)

    de Heering, Adélaïde; Dormal, Giulia; Pelland, Maxime; Lewis, Terri; Maurer, Daphne; Collignon, Olivier

    2016-11-21

    Is a short and transient period of visual deprivation early in life sufficient to induce lifelong changes in how we attend to, and integrate, simple visual and auditory information [1, 2]? This question is of crucial importance given the recent demonstration in both animals and humans that a period of blindness early in life permanently affects the brain networks dedicated to visual, auditory, and multisensory processing [1-16]. To address this issue, we compared a group of adults who had been treated for congenital bilateral cataracts during early infancy with a group of normally sighted controls on a task requiring simple detection of lateralized visual and auditory targets, presented alone or in combination. Redundancy gains obtained from the audiovisual conditions were similar between groups and surpassed the reaction time distribution predicted by Miller's race model. However, in comparison to controls, cataract-reversal patients were faster at processing simple auditory targets and showed differences in how they shifted attention across modalities. Specifically, they were faster at switching attention from visual to auditory inputs than in the reverse situation, while an opposite pattern was observed for controls. Overall, these results reveal that the absence of visual input during the first months of life does not prevent the development of audiovisual integration but enhances the salience of simple auditory inputs, leading to a different crossmodal distribution of attentional resources between auditory and visual stimuli. Copyright © 2016 Elsevier Ltd. All rights reserved.
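
    Miller's race-model inequality referenced above states that, for any race between independent channels, the redundant-target RT distribution can be no faster than the sum of the unimodal distributions: P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t). A toy check with made-up reaction times (not the study's data):

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times, evaluated at t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / rts.size

def race_model_bound(rt_a, rt_v, t):
    """Upper bound on P(RT_AV <= t) under any race model (capped at 1)."""
    return np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)

# Hypothetical RTs in ms; the audiovisual condition is faster than either unimodal one
rt_a  = [320, 340, 360, 380, 400]
rt_v  = [330, 350, 370, 390, 410]
rt_av = [250, 260, 270, 280, 290]

t = np.arange(240, 420, 10)
# True at any t => the race model is rejected at that point of the distribution
violation = ecdf(rt_av, t) > race_model_bound(rt_a, rt_v, t)
```

    A violation anywhere along the time axis implies the two modalities were integrated rather than merely racing, which is the interpretation behind surpassing the race-model prediction.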

  15. Exogenous Attentional Capture by Subliminal Abrupt-Onset Cues: Evidence from Contrast-Polarity Independent Cueing Effects

    NARCIS (Netherlands)

    Fuchs, I.; Theeuwes, J.; Ansorge, U.

    2013-01-01

    In the present study, we tested whether subliminal abrupt-onset cues capture attention in a bottom-up or top-down controlled manner. For our tests, we varied the searched-for target-contrast polarity (i.e., dark or light targets against a gray background) over four experiments. In line with the

  16. Neural Correlates of Selective Attention With Hearing Aid Use Followed by ReadMyQuips Auditory Training Program.

    Science.gov (United States)

    Rao, Aparna; Rishiq, Dania; Yu, Luodi; Zhang, Yang; Abrams, Harvey

    The objectives of this study were to investigate the effects of hearing aid use and the effectiveness of ReadMyQuips (RMQ), an auditory training program, on speech perception performance and auditory selective attention using electrophysiological measures. RMQ is an audiovisual training program designed to improve speech perception in everyday noisy listening environments. Participants were adults with mild to moderate hearing loss who were first-time hearing aid users. After 4 weeks of hearing aid use, the experimental group completed RMQ training in 4 weeks, and the control group received listening practice on audiobooks during the same period. Cortical late event-related potentials (ERPs) and the Hearing in Noise Test (HINT) were administered at prefitting, pretraining, and post-training to assess effects of hearing aid use and RMQ training. An oddball paradigm allowed tracking of changes in P3a and P3b ERPs to distractors and targets, respectively. Behavioral measures were also obtained while ERPs were recorded from participants. After 4 weeks of hearing aid use but before auditory training, HINT results did not show a statistically significant change, but there was a significant P3a reduction. This reduction in P3a was correlated with improvement in d prime (d') in the selective attention task. Increased P3b amplitudes were also correlated with improvement in d' in the selective attention task. After training, this correlation between P3b and d' remained in the experimental group, but not in the control group. Similarly, HINT testing showed improved speech perception post training only in the experimental group. The criterion calculated in the auditory selective attention task showed a reduction only in the experimental group after training. ERP measures in the auditory selective attention task did not show any changes related to training. Hearing aid use was associated with a decrement in involuntary attention switch to distractors in the auditory selective
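
    The d' and criterion indices reported for the selective-attention task are standard signal-detection quantities computed from hit and false-alarm rates. A sketch with hypothetical trial counts (the log-linear correction used here is one common convention, not necessarily the authors'):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and response criterion (c) from a yes/no task.
    The +0.5 / +1.0 log-linear correction keeps the z-scores finite when
    a rate would otherwise be exactly 0 or 1."""
    hr = (hits + 0.5) / (hits + misses + 1.0)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    d_prime = z(hr) - z(far)
    criterion = -0.5 * (z(hr) + z(far))
    return d_prime, criterion

# Hypothetical counts for one listener in one session
d, c = dprime_criterion(hits=40, misses=10, false_alarms=5, correct_rejections=45)
```

    An improvement in selective attention appears as a larger d across sessions, while the reduction in criterion described above appears as a smaller c.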

  17. The human auditory brainstem response to running speech reveals a subcortical mechanism for selective attention.

    Science.gov (United States)

    Forte, Antonio Elia; Etard, Octave; Reichenbach, Tobias

    2017-10-10

    Humans excel at selectively listening to a target speaker in background noise such as competing voices. While the encoding of speech in the auditory cortex is modulated by selective attention, it remains debated whether such modulation occurs already in subcortical auditory structures. Investigating the contribution of the human brainstem to attention has, in particular, been hindered by the tiny amplitude of the brainstem response. Its measurement normally requires a large number of repetitions of the same short sound stimuli, which may lead to a loss of attention and to neural adaptation. Here we develop a mathematical method to measure the auditory brainstem response to running speech, an acoustic stimulus that does not repeat and that has a high ecological validity. We employ this method to assess the brainstem's activity when a subject listens to one of two competing speakers, and show that the brainstem response is consistently modulated by attention.

  18. Learning to Match Auditory and Visual Speech Cues: Social Influences on Acquisition of Phonological Categories

    Science.gov (United States)

    Altvater-Mackensen, Nicole; Grossmann, Tobias

    2015-01-01

    Infants' language exposure largely involves face-to-face interactions providing acoustic and visual speech cues but also social cues that might foster language learning. Yet, both audiovisual speech information and social information have so far received little attention in research on infants' early language development. Using a preferential…

  19. The Effects of Attention Cueing on Visualizers' Multimedia Learning

    Science.gov (United States)

    Yang, Hui-Yu

    2016-01-01

    The present study examines how various types of attention cueing and cognitive preference affect learners' comprehension of a cardiovascular system and cognitive load. EFL learners were randomly assigned to one of four conditions: non-signal, static-blood-signal, static-blood-static-arrow-signal, and animation-signal. The results indicated that…

  20. A Persian version of the sustained auditory attention capacity test and its results in normal children

    Directory of Open Access Journals (Sweden)

    Sanaz Soltanparast

    2013-03-01

    Full Text Available Background and Aim: Sustained attention refers to the ability to maintain attention to target stimuli over a sustained period of time. This study was conducted to develop a Persian version of the sustained auditory attention capacity test and to study its results in normal children. Methods: To develop the Persian version of the sustained auditory attention capacity test, speech stimuli were used, as in the original version. The stimuli consisted of the words of a 21-word list of monosyllabic words, repeated 20 times in random order. The test was carried out at a comfortable hearing level using binaural (diotic) presentation on 46 normal children of 7 to 11 years of age of both genders. Results: There was a significant effect of age on the average impulsiveness error score (p=0.004) and on the total score of the sustained auditory attention capacity test (p=0.005). No significant relationship was found between age and the average inattention error score or the attention reduction span index. Gender did not have a significant impact on the various indices of the test. Conclusion: The results of this test in a group of normal-hearing children confirm its ability to measure sustained auditory attention capacity through speech stimuli.

  1. In search of the focus of attention in working memory: 13 years of the retro-cue effect.

    Science.gov (United States)

    Souza, Alessandra S; Oberauer, Klaus

    2016-10-01

    The concept of attention has a prominent place in cognitive psychology. Attention can be directed not only to perceptual information, but also to information in working memory (WM). Evidence for an internal focus of attention has come from the retro-cue effect: Performance in tests of visual WM is improved when attention is guided to the test-relevant contents of WM ahead of testing them. The retro-cue paradigm has served as a test bed to empirically investigate the functions and limits of the focus of attention in WM. In this article, we review the growing body of (behavioral) studies on the retro-cue effect. We evaluate the degrees of experimental support for six hypotheses about what causes the retro-cue effect: (1) Attention protects representations from decay, (2) attention prioritizes the selected WM contents for comparison with a probe display, (3) attended representations are strengthened in WM, (4) not-attended representations are removed from WM, (5) a retro-cue to the retrieval target provides a head start for its retrieval before decision making, and (6) attention protects the selected representation from perceptual interference. The extant evidence provides support for the last four of these hypotheses.

  2. Bottom-up influences of voice continuity in focusing selective auditory attention.

    Science.gov (United States)

    Bressler, Scott; Masud, Salwa; Bharadwaj, Hari; Shinn-Cunningham, Barbara

    2014-01-01

    Selective auditory attention causes a relative enhancement of the neural representation of important information and suppression of the neural representation of distracting sound, which enables a listener to analyze and interpret information of interest. Some studies suggest that in both vision and in audition, the "unit" on which attention operates is an object: an estimate of the information coming from a particular external source out in the world. In this view, which object ends up in the attentional foreground depends on the interplay of top-down, volitional attention and stimulus-driven, involuntary attention. Here, we test the idea that auditory attention is object based by exploring whether continuity of a non-spatial feature (talker identity, a feature that helps acoustic elements bind into one perceptual object) also influences selective attention performance. In Experiment 1, we show that perceptual continuity of target talker voice helps listeners report a sequence of spoken target digits embedded in competing reversed digits spoken by different talkers. In Experiment 2, we provide evidence that this benefit of voice continuity is obligatory and automatic, as if voice continuity biases listeners by making it easier to focus on a subsequent target digit when it is perceptually linked to what was already in the attentional foreground. Our results support the idea that feature continuity enhances streaming automatically, thereby influencing the dynamic processes that allow listeners to successfully attend to objects through time in the cacophony that assails our ears in many everyday settings.

  3. AD/HD and the Capture of Attention by Briefly Exposed Delay-Related Cues: Evidence from a Conditioning Paradigm

    Science.gov (United States)

    Sonuga-Barke, Edmund J. S.; De Houwer, Jan; De Ruiter, Karen; Ajzenstzen, Michal; Holland, Sarah

    2004-01-01

    Background: The selective attention of children with attention deficit/hyperactivity disorder (AD/HD) to briefly exposed delay-related cues was examined in two experiments using a dot-probe conditioning paradigm. Method: Colour cues were paired with negatively (i.e., imposition of delay) and positively valenced cues (i.e., escape from or avoidance…

  4. An analysis of auditory cues for inclusion in a close quarters battle room clearing operation

    OpenAIRE

    Greenwald, Thomas W.

    2002-01-01

    Approved for public release; distribution is unlimited. The purpose of this thesis is to examine which auditory cues need to be included in a virtual representation of a Close Quarters Combat Room Clearing Operation. Future missions of the United States Armed Forces, especially those of the Army and Marine Corps, are increasingly likely to be conducted in cities or built-up areas. A critical need exists for MOUT (Military Operations in Urban Terrain) training by our armed forces, and the en...

  5. Influence of memory, attention, IQ and age on auditory temporal processing tests: preliminary study.

    Science.gov (United States)

    Murphy, Cristina Ferraz Borges; Zachi, Elaine Cristina; Roque, Daniela Tsubota; Ventura, Dora Selma Fix; Schochat, Eliane

    2014-01-01

    To investigate the existence of correlations between the performance of children in auditory temporal tests (Frequency Pattern and Gaps in Noise--GIN) and IQ, attention, memory, and age measurements. Fifteen typically developing children aged 7 to 12 years with normal hearing participated in the study. Auditory temporal processing tests (GIN and Frequency Pattern), as well as a memory test (Digit Span), attention tests (auditory and visual modality), and an intelligence test (RAVEN Progressive Matrices) were applied. A significant, positive correlation between Frequency Pattern test performance and age was found, which was considered good (p<0.01, 75.6%). There were no significant correlations between the GIN test and the other variables tested. Auditory temporal skills seem to be influenced by different factors: while performance in temporal ordering seems to be influenced by maturational processes, performance in temporal resolution was not influenced by any of the aspects investigated.

  6. Real color captures attention and overrides spatial cues in grapheme-color synesthetes but not in controls.

    Science.gov (United States)

    van Leeuwen, Tessa M; Hagoort, Peter; Händel, Barbara F

    2013-08-01

    Grapheme-color synesthetes perceive color when reading letters or digits. We investigated oscillatory brain signals of synesthetes vs. controls using magnetoencephalography. Brain oscillations specifically in the alpha band (∼10 Hz) have two interesting features: alpha has been linked to inhibitory processes and can act as a marker for attention. The possible role of reduced inhibition as an underlying cause of synesthesia, as well as the precise role of attention in synesthesia, is widely discussed. To assess alpha power effects due to synesthesia, synesthetes as well as matched controls viewed synesthesia-inducing graphemes, colored control graphemes, and non-colored control graphemes while brain activity was recorded. Subjects had to report a color change at the end of each trial, which allowed us to assess the strength of synesthesia in each synesthete. Since color (synesthetic or real) might capture attention, we also included an attentional cue in our paradigm that could direct covert attention. In controls the attentional cue always caused a lateralization of alpha power, with a contralateral decrease and an ipsilateral increase over occipital sensors. In synesthetes, however, the influence of the cue was overruled by color: independent of the attentional cue, alpha power decreased contralateral to the color (synesthetic or real). This indicates that in synesthetes color guides attention. This was confirmed by reaction time effects due to color, i.e., faster RTs for the color side independent of the cue. Finally, the stronger the observed color-dependent alpha lateralization, the stronger was the manifestation of synesthesia as measured by congruency effects of synesthetic colors on RTs. Behavioral and imaging results indicate that color induces a location-specific, automatic shift of attention towards color in synesthetes but not in controls. We hypothesize that this mechanism can facilitate coupling of grapheme and color during the development of synesthesia.
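    The alpha lateralization reported here is conventionally quantified as a normalized difference between ipsilateral and contralateral alpha power. The abstract does not state a formula, so the following Python sketch uses a common formulation with illustrative power values, not data from the study:

    ```python
    def alpha_lateralization_index(ipsi_power, contra_power):
        """Normalized alpha lateralization: positive when ipsilateral
        alpha power exceeds contralateral power, the pattern expected
        over occipital sensors when covert attention shifts to one side."""
        return (ipsi_power - contra_power) / (ipsi_power + contra_power)

    # An ipsilateral increase plus a contralateral decrease relative to a
    # common baseline yields a positive index.
    print(round(alpha_lateralization_index(1.2, 0.8), 3))  # → 0.2
    ```

    A larger index for synesthetes on color-present trials would correspond to the stronger color-driven lateralization described above.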

  7. In the presence of conflicting gaze cues, fearful expression and eye-size guide attention.

    Science.gov (United States)

    Carlson, Joshua M; Aday, Jacob

    2017-10-19

    Humans are social beings that often interact in multi-individual environments. As such, we are frequently confronted with nonverbal social signals, including eye-gaze direction, from multiple individuals. Yet, the factors that allow for the prioritisation of certain gaze cues over others are poorly understood. Using a modified conflicting gaze paradigm, we tested the hypothesis that fearful gaze would be favoured amongst competing gaze cues. We further hypothesised that this effect is related to the increased sclera exposure, which is characteristic of fearful expressions. Across three experiments, we found that fearful, but not happy, gaze guides observers' attention over competing non-emotional gaze. The guidance of attention by fearful gaze appears to be linked to increased sclera exposure. However, differences in sclera exposure do not prioritise competing gazes of other types. Thus, fearful gaze guides attention among competing cues and this effect is facilitated by increased sclera exposure - but increased sclera exposure per se does not guide attention. The prioritisation of fearful gaze over non-emotional gaze likely represents an adaptive means of selectively attending to survival-relevant spatial locations.

  8. Auditory spatial attention to speech and complex non-speech sounds in children with autism spectrum disorder.

    Science.gov (United States)

    Soskey, Laura N; Allen, Paul D; Bennetto, Loisa

    2017-08-01

    One of the earliest observable impairments in autism spectrum disorder (ASD) is a failure to orient to speech and other social stimuli. Auditory spatial attention, a key component of orienting to sounds in the environment, has been shown to be impaired in adults with ASD. Additionally, specific deficits in orienting to social sounds could be related to increased acoustic complexity of speech. We aimed to characterize auditory spatial attention in children with ASD and neurotypical controls, and to determine the effect of auditory stimulus complexity on spatial attention. In a spatial attention task, target and distractor sounds were played randomly in rapid succession from speakers in a free-field array. Participants attended to a central or peripheral location, and were instructed to respond to target sounds at the attended location while ignoring nearby sounds. Stimulus-specific blocks evaluated spatial attention for simple non-speech tones, speech sounds (vowels), and complex non-speech sounds matched to vowels on key acoustic properties. Children with ASD had significantly more diffuse auditory spatial attention than neurotypical children when attending front, indicated by increased responding to sounds at adjacent non-target locations. No significant differences in spatial attention emerged based on stimulus complexity. Additionally, in the ASD group, more diffuse spatial attention was associated with more severe ASD symptoms but not with general inattention symptoms. Spatial attention deficits have important implications for understanding social orienting deficits and atypical attentional processes that contribute to core deficits of ASD. Autism Res 2017, 10: 1405-1416. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.

  9. Visual Attention in Flies-Dopamine in the Mushroom Bodies Mediates the After-Effect of Cueing.

    Science.gov (United States)

    Koenig, Sebastian; Wolf, Reinhard; Heisenberg, Martin

    2016-01-01

    Visual environments may simultaneously comprise stimuli of different significance. Often such stimuli require incompatible responses. Selective visual attention allows an animal to respond exclusively to the stimuli at a certain location in the visual field. In the process of establishing its focus of attention the animal can be influenced by external cues. Here we characterize the behavioral properties and neural mechanism of cueing in the fly Drosophila melanogaster. A cue can be attractive, repulsive, or ineffective depending upon, for example, its visual properties and location in the visual field. Dopamine signaling in the brain is required to maintain the effect of cueing once the cue has disappeared. Raising or lowering dopamine at the synapse abolishes this after-effect. Specifically, dopamine is necessary and sufficient in the αβ-lobes of the mushroom bodies. Evidence is provided for an involvement of the αβ-posterior Kenyon cells.

  10. Aberrant interference of auditory negative words on attention in patients with schizophrenia.

    Directory of Open Access Journals (Sweden)

    Norichika Iwashiro

    Full Text Available Previous research suggests that deficits in attention-emotion interaction are implicated in schizophrenia symptoms. Although disruption in auditory processing is crucial in the pathophysiology of schizophrenia, deficits in the interaction between emotional processing of auditorily presented language stimuli and auditory attention have not yet been clarified. To address this issue, the current study used a dichotic listening task to examine 22 patients with schizophrenia and 24 age-, sex-, parental socioeconomic background-, handedness-, dexterous ear-, and intelligence quotient-matched healthy controls. The participants completed a word recognition task on the attended side in which a word with emotionally valenced content (negative/positive/neutral) was presented to one ear and a different neutral word was presented to the other ear. Participants selectively attended to either ear. In the control subjects, presentation of negative but not positive word stimuli provoked a significantly prolonged reaction time compared with presentation of neutral word stimuli. This interference effect for negative words existed whether or not subjects directed attention to the negative words. The interference effect was significantly smaller in the patients with schizophrenia than in the healthy controls. Furthermore, a smaller interference effect was significantly correlated with more severe positive symptoms and delusional behavior in the patients with schizophrenia. The present findings suggest that an aberrant interaction between semantic processing of negative emotional content and auditory attention plays a role in the production of positive symptoms in schizophrenia.
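    The interference effect reported above is a contrast of mean reaction times between negative-word and neutral-word trials. A minimal Python sketch of that contrast (the RT values are illustrative, not data from the study):

    ```python
    from statistics import mean

    def interference_effect(rts_negative, rts_neutral):
        """Emotional interference in ms: prolongation of mean reaction
        time on negative-word trials relative to neutral-word trials.
        Smaller values correspond to the reduced interference reported
        for the patient group."""
        return mean(rts_negative) - mean(rts_neutral)

    # Hypothetical per-trial RTs (ms) for two groups.
    controls = interference_effect([720, 740, 730], [680, 700, 690])
    patients = interference_effect([700, 710, 705], [695, 705, 700])
    print(controls, patients)
    ```

    In the pattern described by the abstract, the control value would exceed the patient value, and the patient value would correlate negatively with positive-symptom severity.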

  11. Frequency-specific attentional modulation in human primary auditory cortex and midbrain.

    Science.gov (United States)

    Riecke, Lars; Peters, Judith C; Valente, Giancarlo; Poser, Benedikt A; Kemper, Valentin G; Formisano, Elia; Sorger, Bettina

    2018-07-01

    Paying selective attention to an audio frequency selectively enhances activity within primary auditory cortex (PAC) at the tonotopic site (frequency channel) representing that frequency. Animal PAC neurons achieve this 'frequency-specific attentional spotlight' by adapting their frequency tuning, yet comparable evidence in humans is scarce. Moreover, whether the spotlight operates in human midbrain is unknown. To address these issues, we studied the spectral tuning of frequency channels in human PAC and inferior colliculus (IC), using 7-T functional magnetic resonance imaging (FMRI) and frequency mapping, while participants focused on different frequency-specific sounds. We found that shifts in frequency-specific attention alter the response gain, but not tuning profile, of PAC frequency channels. The gain modulation was strongest in low-frequency channels and varied near-monotonically across the tonotopic axis, giving rise to the attentional spotlight. We observed less prominent, non-tonotopic spatial patterns of attentional modulation in IC. These results indicate that the frequency-specific attentional spotlight in human PAC as measured with FMRI arises primarily from tonotopic gain modulation, rather than adapted frequency tuning. Moreover, frequency-specific attentional modulation of afferent sound processing in human IC seems to be considerably weaker, suggesting that the spotlight diminishes toward this lower-order processing stage. Our study sheds light on how the human auditory pathway adapts to the different demands of selective hearing. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  12. Context Modulates Congruency Effects in Selective Attention to Social Cues

    Directory of Open Access Journals (Sweden)

    Andrea Ravagli

    2018-06-01

    Full Text Available Head and gaze directions are used during social interactions as essential cues to infer where someone attends. When head and gaze are oriented toward opposite directions, we need to extract socially meaningful information despite stimulus conflict. Recently, a cognitive and neural mechanism for filtering out conflicting stimuli has been identified while performing non-social attention tasks. This mechanism is engaged proactively when conflict is anticipated in a high proportion of trials and reactively when conflict occurs infrequently. Here, we investigated whether a similar mechanism is at play for limiting distraction from conflicting social cues during gaze or head direction discrimination tasks in contexts with different probabilities of conflict. Results showed that, for the gaze direction task only (Experiment 1), inverse efficiency (IE) scores for distractor-absent trials (i.e., faces with averted gaze and centrally oriented head) were larger (indicating worse performance) when these trials were intermixed with congruent/incongruent distractor-present trials (i.e., faces with averted gaze and tilted head in the same/opposite direction) relative to when the same distractor-absent trials were shown in isolation. Moreover, on distractor-present trials, IE scores for congruent (vs. incongruent) head-gaze pairs in blocks with rare conflict were larger than in blocks with frequent conflict, suggesting that adaptation to conflict was more efficient than adaptation to infrequent events. However, when the task required discrimination of head orientation while ignoring gaze direction, performance was not impacted by either block-level or current-trial congruency (Experiment 2), unless the cognitive load of the task was increased by adding a concurrent task (Experiment 3). Overall, our study demonstrates that during attention to social cues proactive cognitive control mechanisms are modulated by the expectation of conflicting stimulus information at both
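    The inverse efficiency (IE) scores used above combine speed and accuracy so that both slowing and extra errors worsen the score. The standard formulation, mean correct RT divided by proportion correct, can be sketched as:

    ```python
    def inverse_efficiency(mean_correct_rt_ms, proportion_correct):
        """IE score in ms: mean correct RT divided by accuracy.
        Larger values indicate worse performance, so a condition can
        score worse through slower responses, more errors, or both."""
        if not 0 < proportion_correct <= 1:
            raise ValueError("proportion_correct must be in (0, 1]")
        return mean_correct_rt_ms / proportion_correct

    # Same mean RT, lower accuracy -> larger (worse) IE score.
    print(round(inverse_efficiency(600, 0.95), 1))  # → 631.6
    print(round(inverse_efficiency(600, 0.80), 1))  # → 750.0
    ```

    Comparing IE scores across conditions, as the study does for distractor-absent vs. distractor-present blocks, guards against speed-accuracy trade-offs masking a congruency effect.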

  13. Integration of auditory and tactile inputs in musical meter perception.

    Science.gov (United States)

    Huang, Juan; Gamble, Darik; Sarnlertsophon, Kristine; Wang, Xiaoqin; Hsiao, Steven

    2013-01-01

    Musicians often say that they not only hear but also "feel" music. To explore the contribution of tactile information to "feeling" music, we investigated the degree that auditory and tactile inputs are integrated in humans performing a musical meter-recognition task. Subjects discriminated between two types of sequences, "duple" (march-like rhythms) and "triple" (waltz-like rhythms), presented in three conditions: (1) unimodal inputs (auditory or tactile alone); (2) various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts; and (3) bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70-85 %) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70-90 %) when all of the metrically important notes are assigned to one channel and is reduced to 60 % when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90 %). Performance dropped dramatically when subjects were presented with incongruent auditory cues (10 %), as opposed to incongruent tactile cues (60 %), demonstrating that auditory input dominates meter perception. These observations support the notion that meter perception is a cross-modal percept with tactile inputs underlying the perception of "feeling" music.

  14. Auditory prediction during speaking and listening.

    Science.gov (United States)

    Sato, Marc; Shiller, Douglas M

    2018-02-02

    In the present EEG study, the role of auditory prediction in speech was explored through the comparison of auditory cortical responses during active speaking and passive listening to the same acoustic speech signals. Two manipulations of sensory prediction accuracy were used during the speaking task: (1) a real-time change in vowel F1 feedback (reducing prediction accuracy relative to unaltered feedback) and (2) presenting a stable auditory target rather than a visual cue to speak (enhancing auditory prediction accuracy during baseline productions, and potentially enhancing the perturbing effect of altered feedback). While subjects compensated for the F1 manipulation, no difference between the auditory-cue and visual-cue conditions were found. Under visually-cued conditions, reduced N1/P2 amplitude was observed during speaking vs. listening, reflecting a motor-to-sensory prediction. In addition, a significant correlation was observed between the magnitude of behavioral compensatory F1 response and the magnitude of this speaking induced suppression (SIS) for P2 during the altered auditory feedback phase, where a stronger compensatory decrease in F1 was associated with a stronger the SIS effect. Finally, under the auditory-cued condition, an auditory repetition-suppression effect was observed in N1/P2 amplitude during the listening task but not active speaking, suggesting that auditory predictive processes during speaking and passive listening are functionally distinct. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. A randomised controlled trial evaluating the effect of an individual auditory cueing device on freezing and gait speed in people with Parkinson's disease

    Directory of Open Access Journals (Sweden)

    Lynch Deirdre

    2008-12-01

    Full Text Available Abstract Background: Parkinson's disease is a progressive neurological disorder resulting from a degeneration of dopamine-producing cells in the substantia nigra. Clinical symptoms typically affect gait pattern and motor performance. Evidence suggests that individual auditory cueing devices may be used effectively for the management of gait and freezing in people with Parkinson's disease. The primary aim of the randomised controlled trial is to evaluate the effect of an individual auditory cueing device on freezing and gait speed in people with Parkinson's disease. Methods: A prospective multi-centre randomised crossover design trial will be conducted. Forty-seven subjects will be randomised into either Group A or Group B, each with a control and intervention phase. Baseline measurements will be recorded using the Freezing of Gait Questionnaire as the primary outcome measure and three secondary outcome measures: the 10 m Walk Test, the Timed "Up & Go" Test and the Modified Falls Efficacy Scale. Assessments are taken three times over a 3-week period. A follow-up assessment will be completed after three months. A secondary aim of the study is to evaluate the impact of such a device on the quality of life of people with Parkinson's disease using a qualitative methodology. Conclusion: The Apple iPod-Shuffle™ and similar devices provide a cost-effective and innovative platform for integration of individual auditory cueing devices into clinical, social and home environments, and are shown to have an immediate effect on gait, with improvements in walking speed, stride length and freezing. It is evident that individual auditory cueing devices are of benefit to people with Parkinson's disease, and the aim of this randomised controlled trial is to maximise the benefits by allowing the individual to use devices in both a clinical and social setting, with minimal disruption to their daily routine. Trial registration: The protocol for this study is registered

  16. Effect of the cognitive-motor dual-task using auditory cue on balance of survivors with chronic stroke: a pilot study.

    Science.gov (United States)

    Choi, Wonjae; Lee, GyuChang; Lee, Seungwon

    2015-08-01

    To investigate the effect of a cognitive-motor dual-task using auditory cues on the balance of patients with chronic stroke. Randomized controlled trial. Inpatient rehabilitation center. Thirty-seven individuals with chronic stroke. The participants were randomly allocated to the dual-task group (n=19) and the single-task group (n=18). The dual-task group performed a cognitive-motor dual-task in which they carried a circular ring from side to side according to a random auditory cue during treadmill walking. The single-task group walked on a treadmill only. All subjects completed 15 min per session, three times per week, for four weeks with conventional rehabilitation five times per week over the four weeks. Before and after intervention, both static and dynamic balance were measured with a force platform and using the Timed Up and Go (TUG) test. The dual-task group showed significant improvement in all variables compared to the single-task group, except for anteroposterior (AP) sway velocity with eyes open and TUG at follow-up: mediolateral (ML) sway velocity with eyes open (dual-task group vs. single-task group: 2.11 mm/s vs. 0.38 mm/s), ML sway velocity with eyes closed (2.91 mm/s vs. 1.35 mm/s), AP sway velocity with eyes closed (4.84 mm/s vs. 3.12 mm/s). After intervention, all variables showed significant improvement in the dual-task group compared to baseline. The study results suggest that the performance of a cognitive-motor dual-task using auditory cues may influence balance improvements in chronic stroke patients. © The Author(s) 2014.

  17. Involuntary attention with uncertainty: peripheral cues improve perception of masked letters, but may impair perception of low-contrast letters.

    Science.gov (United States)

    Kerzel, Dirk; Gauch, Angélique; Buetti, Simona

    2010-10-01

    Improvements of perceptual performance following the presentation of peripheral cues have been ascribed to accelerated accrual of information, enhanced contrast perception, and decision bias. We investigated effects of peripheral cues on the perception of Gabor and letter stimuli. Non-predictive, peripheral cues improved perceptual accuracy when the stimuli were masked. In contrast, peripheral cues degraded perception of low-contrast letters and did not affect the perception of low-contrast Gabors. The results suggest that involuntary attention accelerates accrual of information but are not entirely consistent with the idea that involuntary attention enhances subjective contrast. Rather, peripheral cues may cause crowding with single letter targets of low contrast. Further, we investigated the effect of the amount of uncertainty on involuntary attention. Cueing effects were (initially) larger when there were more possible target locations. In addition, cueing effects were larger when error feedback was absent and observers had no knowledge of results. Despite these strategic factors, location uncertainty was not sufficient to produce cueing effects, showing that location uncertainty paired with non-predictive cues reveals perceptual and not (only) decisional processes.

  18. Atomoxetine effects on attentional bias to drug-related cues in cocaine dependent individuals

    NARCIS (Netherlands)

    Passamonti, L. (Luca); M. Luijten (Maartje); Ziauddeen, H.; I. Coyle-Gilchrist (Ian); Rittman, T.; Brain, S.A.E.; Regenthal, R.; I.H.A. Franken (Ingmar); Sahakian, B.J.; Bullmore, E.T.; Robbins, T.W.; Ersche, K.D.

    2017-01-01

    textabstractRationale: Biased attention towards drug-related cues and reduced inhibitory control over the regulation of drug-intake characterize drug addiction. The noradrenaline system has been critically implicated in both attentional and response inhibitory processes and is directly affected by

  19. Do current and former cigarette smokers have an attentional bias for e-cigarette cues?

    Science.gov (United States)

    Lochbuehler, Kirsten; Wileyto, E Paul; Tang, Kathy Z; Mercincavage, Melissa; Cappella, Joseph N; Strasser, Andrew A

    2018-03-01

    The similarity of e-cigarettes to tobacco cigarettes with regard to shape and usage raises the question of whether e-cigarette cues have the same incentive motivational properties as tobacco cigarette cues. The objective of the present study was to examine whether e-cigarette cues capture and hold smokers' and former smokers' attention and whether the attentional focus is associated with subsequent craving for tobacco cigarettes. It was also examined whether device type (cigalike or mod) moderated this relationship. Participants (46 current daily smokers, 38 former smokers, 48 non-smokers) were randomly assigned to a device type condition in which their eye-movements were assessed while completing a visual probe task. Craving was assessed before and after the task. Smokers, but not former or non-smokers, maintained their gaze longer on e-cigarette than on neutral pictures (p = 0.004). No difference in dwell time was found between device types. None of the smoking status groups showed faster initial fixations or faster reaction times to e-cigarette compared with neutral cues. Baseline craving was associated with dwell time on e-cigarette cues (p = 0.004). Longer dwell time on e-cigarette cues was associated with more favorable attitudes towards e-cigarettes. These findings indicate that e-cigarette cues may contribute to craving for tobacco cigarettes and suggest the potential regulation of e-cigarette marketing.
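    The dwell-time measure in such eye-tracking visual probe tasks indexes attentional bias as the extra time gaze is maintained on substance-related pictures relative to the paired neutral pictures. A minimal sketch with illustrative numbers (the field names are assumptions, not the study's variables):

    ```python
    def dwell_time_bias(trials):
        """Mean dwell-time bias in ms across trials: positive values mean
        gaze was maintained longer on e-cigarette pictures than on the
        paired neutral pictures, the pattern reported for current smokers."""
        diffs = [t["cue_dwell_ms"] - t["neutral_dwell_ms"] for t in trials]
        return sum(diffs) / len(diffs)

    # Two hypothetical trials from one participant.
    trials = [
        {"cue_dwell_ms": 950, "neutral_dwell_ms": 820},
        {"cue_dwell_ms": 1010, "neutral_dwell_ms": 870},
    ]
    print(dwell_time_bias(trials))  # → 135.0
    ```

    A per-participant bias score like this is what would be correlated with baseline craving and attitude measures.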

  20. A Comparison of Selective Auditory Attention Abilities in Open-Space Versus Closed Classroom Students.

    Science.gov (United States)

    Reinertsen, Gloria M.

    A study compared performances on a test of selective auditory attention between students educated in open-space versus closed classroom environments. An open-space classroom environment was defined as having no walls separating it from hallways or other classrooms. It was hypothesized that the incidence of auditory figure-ground (ability to focus…

  1. Time to Guide: Evidence for Delayed Attentional Guidance in Contextual Cueing.

    Science.gov (United States)

    Kunar, Melina A; Flusberg, Stephen J; Wolfe, Jeremy M

    2008-01-01

    Contextual cueing experiments show that, when displays are repeated, reaction times (RTs) to find a target decrease over time even when the observers are not aware of the repetition. Recent evidence suggests that this benefit in standard contextual cueing tasks is not likely to be due to an improvement in attentional guidance (Kunar, Flusberg, Horowitz & Wolfe, 2007). Nevertheless, we ask whether guidance can help participants find the target in a repeated display, if they are given sufficient time to encode the display. In Experiment 1 we increased the display complexity so that it took participants longer to find the target. Here we found a larger effect of guidance than in a condition with shorter RTs. Experiment 2 gave participants prior exposure to the display context. The data again showed that with more time participants could implement guidance to help find the target, provided that there was something in the search stimuli locations to guide attention to. The data suggest that although the benefit in a standard contextual cueing task is unlikely to be a result of guidance, guidance can play a role if it is given time to develop.

  2. Coupling between Theta Oscillations and Cognitive Control Network during Cross-Modal Visual and Auditory Attention: Supramodal vs Modality-Specific Mechanisms.

    Science.gov (United States)

    Wang, Wuyi; Viswanathan, Shivakumar; Lee, Taraz; Grafton, Scott T

    2016-01-01

    Cortical theta band oscillations (4-8 Hz) in EEG signals have been shown to be important for a variety of different cognitive control operations in visual attention paradigms. However, the synchronization source of these signals as defined by fMRI BOLD activity, and the extent to which theta oscillations play a role in multimodal attention, remain unknown. Here we investigated the extent to which cross-modal visual and auditory attention impacts theta oscillations. Using a simultaneous EEG-fMRI paradigm, healthy human participants performed an attentional vigilance task with six cross-modal conditions using naturalistic stimuli. To assess supramodal mechanisms, modulation of theta oscillation amplitude for attention to either visual or auditory stimuli was correlated with BOLD activity by conjunction analysis. Negative correlation was localized to cortical regions associated with the default mode network (DMN), and positive correlation to ventral premotor areas. Modality-specific attention to visual stimuli was marked by a positive correlation of theta and BOLD activity in a fronto-parietal area that was not observed in the auditory condition. A positive correlation of theta and BOLD activity was observed in auditory cortex, while a negative correlation was observed in visual cortex during auditory attention. The data support a supramodal interaction of theta activity with DMN function, and modality-specific processes within fronto-parietal networks related to top-down, theta-mediated cognitive control in cross-modal visual attention. On the other hand, in sensory cortices there are opposing effects of theta activity during cross-modal auditory attention.

  3. Attention, memory, and auditory processing in 10- to 15-year-old children with listening difficulties.

    Science.gov (United States)

    Sharma, Mridula; Dhamani, Imran; Leung, Johahn; Carlile, Simon

    2014-12-01

    The aim of this study was to examine attention, memory, and auditory processing in children with reported listening difficulty in noise (LDN) despite having clinically normal hearing. Twenty-one children with LDN and 15 children with no listening concerns (controls) participated. The clinically normed auditory processing tests included the Frequency/Pitch Pattern Test (FPT; Musiek, 2002), the Dichotic Digits Test (Musiek, 1983), the Listening in Spatialized Noise-Sentences (LiSN-S) test (Dillon, Cameron, Glyde, Wilson, & Tomlin, 2012), gap detection in noise (Baker, Jayewardene, Sayle, & Saeed, 2008), and masking level difference (MLD; Wilson, Moncrieff, Townsend, & Pillion, 2003). Also included were research-based psychoacoustic tasks, such as auditory stream segregation, localization, sinusoidal amplitude modulation (SAM), and fine structure perception. All were also evaluated on attention and memory test batteries. The LDN group was significantly slower switching their auditory attention and had poorer inhibitory control. Additionally, the group mean results showed significantly poorer performance on FPT, MLD, 4-Hz SAM, and memory tests. Close inspection of the individual data revealed that only 5 participants (out of 21) in the LDN group showed significantly poor performance on FPT compared with clinical norms. Further testing revealed the frequency discrimination of these 5 children to be significantly impaired. Thus, the LDN group showed deficits in attention switching and inhibitory control, whereas only a subset of these participants demonstrated an additional frequency resolution deficit.
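    The 4-Hz sinusoidal amplitude modulation (SAM) stimulus used above is a carrier tone whose envelope fluctuates at the modulation rate. A sketch of generating such a stimulus (parameter values are illustrative, not the study's exact settings):

    ```python
    import math

    def sam_tone(duration_s=0.5, fs=8000, fc=1000.0, fm=4.0, depth=1.0):
        """Sinusoidally amplitude-modulated (SAM) tone: a carrier at
        fc Hz whose amplitude envelope fluctuates at fm Hz (e.g. the
        4-Hz rate in the task above) with the given modulation depth.
        Returns a list of samples at sampling rate fs."""
        n = int(duration_s * fs)
        return [
            (1.0 + depth * math.sin(2 * math.pi * fm * t / fs))
            * math.sin(2 * math.pi * fc * t / fs)
            for t in range(n)
        ]

    samples = sam_tone()
    print(len(samples))  # → 4000
    ```

    Detection thresholds for the modulation depth at a given rate are what SAM tasks typically measure, so depth would be the adaptively varied parameter.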

  4. Action video games improve reading abilities and visual-to-auditory attentional shifting in English-speaking children with dyslexia.

    Science.gov (United States)

    Franceschini, Sandro; Trevisan, Piergiorgio; Ronconi, Luca; Bertoni, Sara; Colmar, Susan; Double, Kit; Facoetti, Andrea; Gori, Simone

    2017-07-19

    Dyslexia is characterized by difficulties in learning to read and there is some evidence that action video games (AVG), without any direct phonological or orthographic stimulation, improve reading efficiency in Italian children with dyslexia. However, the cognitive mechanism underlying this improvement and the extent to which the benefits of AVG training would generalize to deep English orthography, remain two critical questions. During reading acquisition, children have to integrate written letters with speech sounds, rapidly shifting their attention from visual to auditory modality. In our study, we tested reading skills and phonological working memory, visuo-spatial attention, auditory, visual and audio-visual stimuli localization, and cross-sensory attentional shifting in two matched groups of English-speaking children with dyslexia before and after they played AVG or non-action video games. The speed of words recognition and phonological decoding increased after playing AVG, but not non-action video games. Furthermore, focused visuo-spatial attention and visual-to-auditory attentional shifting also improved only after AVG training. This unconventional reading remediation program also increased phonological short-term memory and phoneme blending skills. Our report shows that an enhancement of visuo-spatial attention and phonological working memory, and an acceleration of visual-to-auditory attentional shifting can directly translate into better reading in English-speaking children with dyslexia.

  5. Role of the right inferior parietal cortex in auditory selective attention: An rTMS study.

    Science.gov (United States)

    Bareham, Corinne A; Georgieva, Stanimira D; Kamke, Marc R; Lloyd, David; Bekinschtein, Tristan A; Mattingley, Jason B

    2018-02-01

    Selective attention is the process of directing limited capacity resources to behaviourally relevant stimuli while ignoring competing stimuli that are currently irrelevant. Studies in healthy human participants and in individuals with focal brain lesions have suggested that the right parietal cortex is crucial for resolving competition for attention. Following right-hemisphere damage, for example, patients may have difficulty reporting a brief, left-sided stimulus if it occurs with a competitor on the right, even though the same left stimulus is reported normally when it occurs alone. Such "extinction" of contralesional stimuli has been documented for all the major sense modalities, but it remains unclear whether its occurrence reflects involvement of one or more specific subregions of the temporo-parietal cortex. Here we employed repetitive transcranial magnetic stimulation (rTMS) over the right hemisphere to examine the effect of disruption of two candidate regions - the supramarginal gyrus (SMG) and the superior temporal gyrus (STG) - on auditory selective attention. Eighteen neurologically normal, right-handed participants performed an auditory task, in which they had to detect target digits presented within simultaneous dichotic streams of spoken distractor letters in the left and right channels, both before and after 20 min of 1 Hz rTMS over the SMG, STG or a somatosensory control site (S1). Across blocks, participants were asked to report on auditory streams in the left, right, or both channels, which yielded focused and divided attention conditions. Performance was unchanged for the two focused attention conditions, regardless of stimulation site, but was selectively impaired for contralateral left-sided targets in the divided attention condition following stimulation of the right SMG, but not the STG or S1. Our findings suggest a causal role for the right inferior parietal cortex in auditory selective attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Investigating the time course of tactile reflexive attention using a non-spatial discrimination task.

    Science.gov (United States)

    Miles, Eleanor; Poliakoff, Ellen; Brown, Richard J

    2008-06-01

    Peripheral cues are thought to facilitate responses to stimuli presented at the same location because they lead to exogenous attention shifts. Facilitation has been observed in numerous studies of visual and auditory attention, but there have been only four demonstrations of tactile facilitation, all in studies with potential confounds. Three studies used a spatial (finger versus thumb) discrimination task, where the cue could have provided a spatial framework that might have assisted the discrimination of subsequent targets presented on the same side as the cue. The final study circumvented this problem by using a non-spatial discrimination; however, the cues were informative and interspersed with visual cues which may have affected the attentional effects observed. In the current study, therefore, we used a non-spatial tactile frequency discrimination task following a non-informative tactile white noise cue. When the target was presented 150 ms after the cue, we observed faster discrimination responses to targets presented on the same side compared to the opposite side as the cue; by 1000 ms, responses were significantly faster to targets presented on the opposite side to the cue. Thus, we demonstrated that tactile attentional facilitation can be observed in a non-spatial discrimination task, under unimodal conditions and with entirely non-predictive cues. Furthermore, we provide the first demonstration of significant tactile facilitation and tactile inhibition of return within a single experiment.

  7. Time to learn: evidence for two types of attentional guidance in contextual cueing.

    Science.gov (United States)

    Ogawa, Hirokazu; Watanabe, Katsumi

    2010-01-01

    Repetition of the same spatial configurations of a search display implicitly facilitates performance of a visual-search task when the target location in the display is fixed. The improvement of performance is referred to as contextual cueing. We examined whether the association process between target location and surrounding configuration of distractors occurs during active search or at the instant the target is found. To dissociate these two processes, we changed the surrounding configuration of the distractors at the instant of target detection so that the layout where the participants had searched for the target and the layout presented at the instant of target detection differed. The results demonstrated that both processes are responsible for the contextual-cueing effect, but they differ in the accuracies of attentional guidance and their time courses, suggesting that two different types of attentional-guidance processes may be involved in contextual cueing.

  8. Attentional Bias to Food Cues in Youth with Loss of Control Eating

    Science.gov (United States)

    Shank, Lisa M.; Tanofsky-Kraff, Marian; Nelson, Eric E.; Shomaker, Lauren B.; Ranzenhofer, Lisa M.; Hannallah, Louise M.; Field, Sara E.; Vannucci, Anna; Bongiorno, Diana M.; Brady, Sheila M.; Condarco, Tania; Demidowich, Andrew; Kelly, Nichole R.; Cassidy, Omni; Simmons, W. Kyle; Engel, Scott G.; Pine, Daniel S.; Yanovski, Jack A.

    2014-01-01

    Emerging data indicate that adults with binge eating may exhibit an attentional bias toward highly palatable foods, which may promote obesogenic eating patterns and excess weight gain. However, it is unknown to what extent youth with loss of control (LOC) eating display a similar bias. We therefore studied 76 youth (14.5 ± 2.3 y; 86.8% female; BMI-z 1.7 ± 0.73) with (n = 47) and without (n = 29) reported LOC eating. Following a breakfast to reduce hunger, youth participated in a computerized visual probe task of sustained attention that assessed reaction time to pairs of pictures consisting of highly palatable foods, less palatable foods, and neutral household objects. Although sustained attentional bias did not differ by LOC eating presence and was unrelated to body weight, a two-way interaction between BMI-z and LOC eating was observed (p = .01), such that only among youth with LOC eating, attentional bias toward highly palatable foods versus neutral objects was positively associated with BMI-z. These findings suggest that LOC eating and body weight interact in their association with attentional bias to highly palatable food cues, and may partially explain the mixed literature linking attentional bias to food cues with excess body weight. PMID:25435490

  9. Attending to auditory memory.

    Science.gov (United States)

    Zimmermann, Jacqueline F; Moscovitch, Morris; Alain, Claude

    2016-06-01

    Attention to memory describes the process of attending to memory traces when the object is no longer present. It has been studied primarily for representations of visual stimuli, with only a few studies examining attention to sound object representations in short-term memory. Here, we review the interplay of attention and auditory memory with an emphasis on 1) attending to auditory memory in the absence of related external stimuli (i.e., reflective attention) and 2) effects of existing memory on guiding attention. Attention to auditory memory is discussed in the context of change deafness, and we argue that failures to detect changes in our auditory environments are most likely the result of a faulty comparison system of incoming and stored information. Also, objects are the primary building blocks of auditory attention, but attention can also be directed to individual features (e.g., pitch). We review short-term and long-term memory guided modulation of attention based on characteristic features, location, and/or semantic properties of auditory objects, and propose that auditory attention to memory pathways emerge after sensory memory. A neural model for auditory attention to memory is developed, which comprises two separate pathways in the parietal cortex, one involved in attention to higher-order features and the other involved in attention to sensory information. This article is part of a Special Issue entitled SI: Auditory working memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Auditory and Visual Working Memory Functioning in College Students with Attention-Deficit/Hyperactivity Disorder and/or Learning Disabilities.

    Science.gov (United States)

    Liebel, Spencer W; Nelson, Jason M

    2017-12-01

    We investigated the auditory and visual working memory functioning in college students with attention-deficit/hyperactivity disorder, learning disabilities, and clinical controls. We examined the role attention-deficit/hyperactivity disorder subtype status played in working memory functioning. The unique influence that both domains of working memory have on reading and math abilities was investigated. A sample of 268 individuals seeking postsecondary education comprised four groups in the present study: 110 had an attention-deficit/hyperactivity disorder diagnosis only, 72 had a learning disability diagnosis only, 35 had comorbid attention-deficit/hyperactivity disorder and learning disability diagnoses, and 60 individuals without either of these disorders comprised a clinical control group. Participants underwent a comprehensive neuropsychological evaluation, and licensed psychologists employed a multi-informant, multi-method approach in obtaining diagnoses. In the attention-deficit/hyperactivity disorder only group, there was no difference between auditory and visual working memory functioning, t(100) = -1.57, p = .12. In the learning disability group, however, auditory working memory functioning was significantly weaker compared with visual working memory, t(71) = -6.19, p attention-deficit/hyperactivity disorder only group, there were no auditory or visual working memory functioning differences between participants with either a predominantly inattentive type or a combined type diagnosis. Visual working memory did not incrementally contribute to the prediction of academic achievement skills. Individuals with attention-deficit/hyperactivity disorder did not demonstrate significant working memory differences compared with clinical controls. Individuals with a learning disability demonstrated weaker auditory working memory than individuals in either the attention-deficit/hyperactivity or clinical control groups. © The Author 2017. Published by Oxford University Press.

  11. Impaired Facilitatory Mechanisms of Auditory Attention After Damage of the Lateral Prefrontal Cortex

    OpenAIRE

    Bidet-Caulet, Aurélie; Buchanan, Kelly G.; Viswanath, Humsini; Black, Jessica; Scabini, Donatella; Bonnet-Brilhault, Frédérique; Knight, Robert T.

    2014-01-01

    There is growing evidence that auditory selective attention operates via distinct facilitatory and inhibitory mechanisms enabling selective enhancement and suppression of sound processing, respectively. The lateral prefrontal cortex (LPFC) plays a crucial role in the top-down control of selective attention. However, whether the LPFC controls facilitatory, inhibitory, or both attentional mechanisms is unclear. Facilitatory and inhibitory mechanisms were assessed, in patients with LPFC damage, ...

  12. Time to guide: evidence for delayed attentional guidance in contextual cueing

    OpenAIRE

    Kunar, Melina A.; Flusberg, Stephen J.; Wolfe, Jeremy M

    2008-01-01

    Contextual cueing experiments show that, when displays are repeated, reaction times (RTs) to find a target decrease over time even when the observers are not aware of the repetition. Recent evidence suggests that this benefit in standard contextual cueing tasks is not likely to be due to an improvement in attentional guidance (Kunar, Flusberg, Horowitz, & Wolfe, 2007). Nevertheless, we ask whether guidance can help participants find the target in a repeated display, if they are given sufficie...

  13. Atomoxetine effects on attentional bias to drug-related cues in cocaine dependent individuals.

    Science.gov (United States)

    Passamonti, Luca; Luijten, M; Ziauddeen, H; Coyle-Gilchrist, I T S; Rittman, T; Brain, S A E; Regenthal, R; Franken, I H A; Sahakian, B J; Bullmore, E T; Robbins, T W; Ersche, K D

    2017-08-01

    Biased attention towards drug-related cues and reduced inhibitory control over the regulation of drug-intake characterize drug addiction. The noradrenaline system has been critically implicated in both attentional and response inhibitory processes and is directly affected by drugs such as cocaine. We examined the potentially beneficial effects of the noradrenaline reuptake inhibitor atomoxetine in improving cognitive control during two tasks that used cocaine- and non-cocaine-related stimuli. A double-blind, placebo-controlled, and cross-over psycho-pharmacological design was employed. A single oral dose of atomoxetine (40 mg) was administered to 28 cocaine-dependent individuals (CDIs) and 28 healthy controls. All participants performed a pictorial attentional bias task involving both cocaine- and non-cocaine-related pictures as well as a verbal go/no-go task composed of cocaine- and food-related words. As expected, CDIs showed attentional bias to cocaine-related cues whilst controls did not. More importantly, however, atomoxetine, relative to placebo, significantly attenuated attentional bias in CDIs (F 26  = 6.73, P = 0.01). During the go/no-go task, there was a treatment × trial × group interaction, although this finding only showed a trend towards statistical significance (F 26  = 3.38, P = 0.07). Our findings suggest that atomoxetine reduces attentional bias to drug-related cues in CDIs. This may result from atomoxetine's modulation of the balance between tonic/phasic activity in the locus coeruleus and the possibly parallel enhancement of noradrenergic neurotransmission within the prefrontal cortex. Studying how cognitive enhancers such as atomoxetine influence key neurocognitive indices in cocaine addiction may help to develop reliable biomarkers for patient stratification in future clinical trials.

  14. Dietary self-control influences top-down guidance of attention to food cues

    OpenAIRE

    Higgs, Suzanne; Dolmans, Dirk; Humphreys, Glyn W.; Rutters, Femke

    2015-01-01

    Motivational objects attract attention due to their rewarding properties, but less is known about the role that top–down cognitive processes play in the attention paid to motivationally relevant objects and how this is affected by relevant behavioral traits. Here we assess how thinking about food affects attentional guidance to food items and how this is modulated by traits relating to dietary self-control. Participants completed two tasks in which they were presented with an initial cue (foo...

  15. Auditory temporal preparation induced by rhythmic cues during concurrent auditory working memory tasks.

    Science.gov (United States)

    Cutanda, Diana; Correa, Ángel; Sanabria, Daniel

    2015-06-01

    The present study investigated whether participants can develop temporal preparation driven by auditory isochronous rhythms when concurrently performing an auditory working memory (WM) task. In Experiment 1, participants had to respond to an auditory target presented after a regular or an irregular sequence of auditory stimuli while concurrently performing a Sternberg-type WM task. Results showed that participants responded faster after regular compared with irregular rhythms and that this effect was not affected by WM load; however, the lack of a significant main effect of WM load made it difficult to draw any conclusion regarding the influence of the dual-task manipulation in Experiment 1. In order to enhance dual-task interference, Experiment 2 combined the auditory rhythm procedure with an auditory N-Back task, which required WM updating (monitoring and coding of the information) and was presumably more demanding than the mere rehearsal of the WM task used in Experiment 1. Results now clearly showed dual-task interference effects (slower reaction times [RTs] in the high- vs. the low-load condition). However, such interference did not affect temporal preparation induced by rhythms, with faster RTs after regular than after irregular sequences in the high-load and low-load conditions. These results revealed that secondary tasks demanding memory updating, relative to tasks just demanding rehearsal, produced larger interference effects on overall RTs in the auditory rhythm task. Nevertheless, rhythm regularity exerted a strong temporal preparation effect that survived the interference of the WM task even when both tasks competed for processing resources within the auditory modality. (c) 2015 APA, all rights reserved.

  16. Bottom-up influences of voice continuity in focusing selective auditory attention

    OpenAIRE

    Bressler, Scott; Masud, Salwa; Bharadwaj, Hari; Shinn-Cunningham, Barbara

    2014-01-01

    Selective auditory attention causes a relative enhancement of the neural representation of important information and suppression of the neural representation of distracting sound, which enables a listener to analyze and interpret information of interest. Some studies suggest that in both vision and in audition, the “unit” on which attention operates is an object: an estimate of the information coming from a particular external source out in the world. In this view, which object ends up in the...

  17. Pre-Attentive Auditory Processing of Lexicality

    Science.gov (United States)

    Jacobsen, Thomas; Horvath, Janos; Schroger, Erich; Lattner, Sonja; Widmann, Andreas; Winkler, Istvan

    2004-01-01

    The effects of lexicality on auditory change detection based on auditory sensory memory representations were investigated by presenting oddball sequences of repeatedly presented stimuli, while participants ignored the auditory stimuli. In a cross-linguistic study of Hungarian and German participants, stimulus sequences were composed of words that…

  18. Reward processing in the value-driven attention network: reward signals tracking cue identity and location.

    Science.gov (United States)

    Anderson, Brian A

    2017-03-01

    Through associative reward learning, arbitrary cues acquire the ability to automatically capture visual attention. Previous studies have examined the neural correlates of value-driven attentional orienting, revealing elevated activity within a network of brain regions encompassing the visual corticostriatal loop [caudate tail, lateral occipital complex (LOC) and early visual cortex] and intraparietal sulcus (IPS). Such attentional priority signals raise a broader question concerning how visual signals are combined with reward signals during learning to create a representation that is sensitive to the confluence of the two. This study examines reward signals during the cued reward training phase commonly used to generate value-driven attentional biases. High, compared with low, reward feedback preferentially activated the value-driven attention network, in addition to regions typically implicated in reward processing. Further examination of these reward signals within the visual system revealed information about the identity of the preceding cue in the caudate tail and LOC, and information about the location of the preceding cue in IPS, while early visual cortex represented both location and identity. The results reveal teaching signals within the value-driven attention network during associative reward learning, and further suggest functional specialization within different regions of this network during the acquisition of an integrated representation of stimulus value. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  19. Feeling music: integration of auditory and tactile inputs in musical meter perception.

    Science.gov (United States)

    Huang, Juan; Gamble, Darik; Sarnlertsophon, Kristine; Wang, Xiaoqin; Hsiao, Steven

    2012-01-01

    Musicians often say that they not only hear, but also "feel" music. To explore the contribution of tactile information in "feeling" musical rhythm, we investigated the degree that auditory and tactile inputs are integrated in humans performing a musical meter recognition task. Subjects discriminated between two types of sequences, 'duple' (march-like rhythms) and 'triple' (waltz-like rhythms) presented in three conditions: 1) Unimodal inputs (auditory or tactile alone), 2) Various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts, and 3) Simultaneously presented bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70%-85%) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70%-90%) when all of the metrically important notes are assigned to one channel and is reduced to 60% when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90%). Performance drops dramatically when subjects were presented with incongruent auditory cues (10%), as opposed to incongruent tactile cues (60%), demonstrating that auditory input dominates meter perception. We believe that these results are the first demonstration of cross-modal sensory grouping between any two senses.

  20. Examining Age-Related Differences in Auditory Attention Control Using a Task-Switching Procedure

    OpenAIRE

    Vera Lawo; Iring Koch

    2014-01-01

    Objectives. Using a novel task-switching variant of dichotic selective listening, we examined age-related differences in the ability to intentionally switch auditory attention between 2 speakers defined by their sex.

  1. Modeling the Development of Audiovisual Cue Integration in Speech Perception.

    Science.gov (United States)

    Getz, Laura M; Nordeen, Elke R; Vrabic, Sarah C; Toscano, Joseph C

    2017-03-21

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues.
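    The GMM approach described in this abstract rests on standard expectation-maximization over cue distributions. As an illustration only, here is a minimal one-dimensional EM fit (the paper's models are multidimensional and learn audiovisual cue weights; the function name, quantile initialization, and diagonal setup below are assumptions for this sketch, not the authors' implementation):

    ```python
    import numpy as np

    def fit_gmm_1d(x, k=2, iters=200):
        """Fit a k-component 1-D Gaussian mixture with EM (minimal sketch).

        Returns component means, variances, and mixing weights, e.g. for
        learning two phonological categories from a single acoustic cue."""
        x = np.asarray(x, dtype=float)
        # deterministic initialization: spread means across the data quantiles
        mu = np.quantile(x, (np.arange(k) + 1.0) / (k + 1.0))
        var = np.full(k, np.var(x))
        pi = np.full(k, 1.0 / k)
        for _ in range(iters):
            # E-step: posterior responsibility of each component for each point
            log_lik = -0.5 * ((x[:, None] - mu) ** 2 / var + np.log(2 * np.pi * var))
            resp = pi * np.exp(log_lik)
            resp /= resp.sum(axis=1, keepdims=True)
            # M-step: re-estimate parameters from weighted data
            nk = resp.sum(axis=0)
            mu = (resp * x[:, None]).sum(axis=0) / nk
            var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
            pi = nk / len(x)
        return mu, var, pi
    ```

    With two well-separated cue distributions, the recovered means and weights approximate the generating categories, which is the sense in which distributional statistics alone can yield phonological categories.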

  2. Feature-Selective Attention Adaptively Shifts Noise Correlations in Primary Auditory Cortex.

    Science.gov (United States)

    Downer, Joshua D; Rapone, Brittany; Verhein, Jessica; O'Connor, Kevin N; Sutter, Mitchell L

    2017-05-24

    Sensory environments often contain an overwhelming amount of information, with both relevant and irrelevant information competing for neural resources. Feature attention mediates this competition by selecting the sensory features needed to form a coherent percept. How attention affects the activity of populations of neurons to support this process is poorly understood, because population coding is typically studied through simulations in which one sensory feature is encoded without competition. Therefore, to study the effects of feature attention on population-based neural coding, investigations must be extended to include stimuli with both relevant and irrelevant features. We measured noise correlations (r_noise) within small neural populations in primary auditory cortex while rhesus macaques performed a novel feature-selective attention task. We found that the effect of feature-selective attention on r_noise depended not only on the population tuning to the attended feature, but also on the tuning to the distractor feature. To attempt to explain how these observed effects might support enhanced perceptual performance, we propose an extension of a simple and influential model in which shifts in r_noise can simultaneously enhance the representation of the attended feature while suppressing the distractor. These findings present a novel mechanism by which attention modulates neural populations to support sensory processing in cluttered environments. SIGNIFICANCE STATEMENT Although feature-selective attention constitutes one of the building blocks of listening in natural environments, its neural bases remain obscure. To address this, we developed a novel auditory feature-selective attention task and measured noise correlations (r_noise) in rhesus macaque A1 during task performance. Unlike previous studies showing that the effect of attention on r_noise depends on population tuning to the attended feature, we show that the effect of attention depends on the tuning
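    The r_noise measure itself is conventionally computed as the Pearson correlation of trial-to-trial response fluctuations between a pair of neurons, after removing each neuron's mean response to every stimulus. A generic sketch under assumed inputs (the function name and data layout are illustrative, not taken from the paper):

    ```python
    import numpy as np

    def noise_correlation(resp_a, resp_b, stimulus_ids):
        """Pairwise noise correlation (r_noise): Pearson correlation of
        trial-to-trial fluctuations of two neurons' responses, after
        z-scoring each neuron's responses within each stimulus condition
        (which removes the stimulus-driven, "signal" component)."""
        za, zb = [], []
        for s in np.unique(stimulus_ids):
            mask = stimulus_ids == s
            a, b = resp_a[mask], resp_b[mask]
            # assumes each stimulus has several trials and nonzero variance
            za.append((a - a.mean()) / a.std())
            zb.append((b - b.mean()) / b.std())
        za, zb = np.concatenate(za), np.concatenate(zb)
        return float(np.corrcoef(za, zb)[0, 1])
    ```

    Shared trial-to-trial variability (e.g. a common gain fluctuation) drives r_noise above zero even when the two neurons prefer different stimuli, which is why attention-dependent shifts in r_noise can reshape what a population code conveys.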

  3. Changes in cue reactivity and attentional bias following experimental cue exposure and response prevention: a laboratory study of the effects of D-cycloserine in heavy drinkers.

    Science.gov (United States)

    Kamboj, Sunjeev K; Massey-Chase, Rachel; Rodney, Lydia; Das, Ravi; Almahdi, Basil; Curran, H Valerie; Morgan, Celia J A

    2011-09-01

    The effects of D-cycloserine (DCS) in animal models of anxiety disorders and addiction indicate a role for N-methyl D-aspartate (NMDA) receptors in extinction learning. Exposure/response prevention treatments for anxiety disorders in humans are enhanced by DCS, suggesting a promising co-therapy regime, mediated by NMDA receptors. Exposure/response prevention may also be effective in problematic drinkers, and DCS might enhance habituation to cues in these individuals. Since heavy drinkers show ostensible conditioned responses to alcohol cues, habituation following exposure/response prevention should be evident in these drinkers, with DCS enhancing this effect. The objective of this study is to investigate the effect of DCS on exposure/response prevention in heavy drinkers. In a randomised, double-blind, placebo-controlled study, heavy social drinkers recruited from the community received either DCS (125 mg; n = 19) or placebo (n = 17) 1 h prior to each of two sessions of exposure/response prevention. Cue reactivity and attentional bias were assessed during these two sessions and at a third follow-up session. Between-session drinking behaviour was recorded. Robust cue reactivity and attentional bias to alcohol cues was evident, as expected of heavy drinkers. Within- and between-session habituation of cue reactivity, as well as a reduction in attentional bias to alcohol cues over time was found. However, there was no evidence of greater habituation in the DCS group. Subtle stimulant effects (increased subjective contentedness and euphoria) which were unrelated to exposure/response prevention were found following DCS. DCS does not appear to enhance habituation of alcohol cue reactivity in heavy non-dependent drinkers. Its utility in enhancing treatments based on exposure/response prevention in dependent drinkers or drug users remains open.

  4. A new test of attention in listening (TAIL) predicts auditory performance.

    Science.gov (United States)

    Zhang, Yu-Xuan; Barry, Johanna G; Moore, David R; Amitay, Sygal

    2012-01-01

    Attention modulates auditory perception, but there are currently no simple tests that specifically quantify this modulation. To fill the gap, we developed a new, easy-to-use test of attention in listening (TAIL) based on reaction time. On each trial, two clearly audible tones were presented sequentially, either at the same or different ears. The frequency of the tones was also either the same or different (by at least two critical bands). When the task required same/different frequency judgments, presentation at the same ear significantly speeded responses and reduced errors. A same/different ear (location) judgment was likewise facilitated by keeping tone frequency constant. Perception was thus influenced by involuntary orienting of attention along the task-irrelevant dimension. When information in the two stimulus dimensions was congruent (same-frequency same-ear, or different-frequency different-ear), responses were faster and more accurate than when it was incongruent (same-frequency different-ear, or different-frequency same-ear), suggesting the involvement of executive control to resolve conflicts. In total, the TAIL yielded five independent outcome measures: (1) baseline reaction time, indicating information processing efficiency, (2) involuntary orienting of attention to frequency and (3) location, and (4) conflict resolution for frequency and (5) location. Processing efficiency and conflict resolution accounted for up to 45% of individual variance in the low- and high-threshold variants of three psychoacoustic tasks assessing temporal and spectral processing. Involuntary orienting of attention to the irrelevant dimension did not correlate with perceptual performance on these tasks. Given that TAIL measures are unlikely to be limited by perceptual sensitivity, we suggest that the correlations reflect modulation of perceptual performance by attention. The TAIL thus has the power to identify and separate contributions of different components of attention.
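The outcome measures described above are simple reaction-time contrasts. The following is a minimal sketch of how such scores could be derived from per-trial data in a frequency-judgment block; the field names and exact contrasts are illustrative assumptions, not the published TAIL scoring procedure.

```python
from statistics import mean

def tail_scores(trials):
    """Derive TAIL-style scores for a same/different frequency-judgment block.

    Each trial is a dict with 'rt' (ms) plus 'same_freq' and 'same_ear' flags.
    Field names and contrasts are illustrative, not the published scoring.
    """
    def mean_rt(pred):
        return mean(t['rt'] for t in trials if pred(t))

    def congruent(t):
        # congruent: same-frequency same-ear, or different-frequency different-ear
        return t['same_freq'] == t['same_ear']

    return {
        # baseline reaction time: overall processing efficiency
        'baseline_rt': mean(t['rt'] for t in trials),
        # involuntary orienting: cost of a change in the task-irrelevant ear
        'orienting_location': mean_rt(lambda t: not t['same_ear'])
                              - mean_rt(lambda t: t['same_ear']),
        # conflict resolution: incongruent minus congruent dimension pairings
        'conflict_location': mean_rt(lambda t: not congruent(t))
                             - mean_rt(lambda t: congruent(t)),
    }
```

With idealized mean RTs of 500, 560, 580, and 540 ms for the four frequency-ear pairings, the sketch yields a 10 ms location-orienting cost and a 50 ms congruency effect.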

  5. Nonlinear dynamics of human locomotion: effects of rhythmic auditory cueing on local dynamic stability

    Directory of Open Access Journals (Sweden)

    Philippe eTerrier

    2013-09-01

    Full Text Available It has been observed that time series of gait parameters (stride length (SL), stride time (ST), and stride speed (SS)) exhibit long-term persistence and fractal-like properties. Synchronizing steps with rhythmic auditory stimuli modifies the persistent fluctuation pattern to anti-persistence. Another nonlinear method estimates the degree of resilience of gait control to small perturbations, i.e. the local dynamic stability (LDS). The method makes use of the maximal Lyapunov exponent, which estimates how fast a nonlinear system embedded in a reconstructed state space (attractor) diverges after an infinitesimal perturbation. We propose to use an instrumented treadmill to simultaneously measure basic gait parameters (time series of SL, ST, and SS, from which the statistical persistence among consecutive strides can be assessed) and the trajectory of the center of pressure (from which the LDS can be estimated). In 20 healthy participants, the response to rhythmic auditory cueing (RAC) of LDS and of statistical persistence (assessed with detrended fluctuation analysis (DFA)) was compared. By analyzing the divergence curves, we observed that long-term LDS (computed as the inverse of the average logarithmic rate of divergence between the 4th and the 10th strides downstream from nearest neighbors in the reconstructed attractor) was strongly enhanced (relative change +47%). That is likely the indication of a more dampened dynamics. The change in short-term LDS (divergence over one step) was smaller (+3%). DFA results (scaling exponents) confirmed an anti-persistent pattern in ST, SL, and SS. Long-term LDS (but not short-term LDS) and scaling exponents exhibited a significant correlation between them (r=0.7). Both phenomena probably result from the more conscious/voluntary gait control that is required by RAC. We suggest that LDS and statistical persistence should be used to evaluate the efficiency of cueing therapy in patients with neurological gait disorders.
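The detrended fluctuation analysis used to classify stride series as persistent or anti-persistent reduces to a short algorithm: integrate the mean-removed series, detrend it in windows of increasing size, and read the scaling exponent off the log-log slope of fluctuation versus window size. A minimal sketch follows; it is not the authors' implementation, and the window sizes and linear detrending are common defaults rather than their stated choices.

```python
import numpy as np

def dfa_alpha(x):
    """Detrended fluctuation analysis: return the scaling exponent alpha.

    alpha ~ 0.5 for white noise, ~1.0 for persistent 1/f-like fluctuations,
    and < 0.5 for anti-persistent series (as reported under auditory cueing).
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    n = len(y)
    scales = np.unique(np.floor(
        np.logspace(np.log10(4), np.log10(n // 4), 20)).astype(int))
    flucts = []
    for s in scales:
        rms = []
        for i in range(n // s):                      # non-overlapping windows
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)             # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

# Sanity check: uncorrelated noise should scale with alpha close to 0.5.
alpha_white = dfa_alpha(np.random.default_rng(0).standard_normal(2048))
```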

  6. Demonstrating the potential for dynamic auditory stimulation to contribute to motion sickness.

    Directory of Open Access Journals (Sweden)

    Behrang Keshavarz

    Full Text Available Auditory cues can create the illusion of self-motion (vection) in the absence of visual or physical stimulation. The present study aimed to determine whether auditory cues alone can also elicit motion sickness and how auditory cues contribute to motion sickness when added to visual motion stimuli. Twenty participants were seated in front of a curved projection display and were exposed to a virtual scene that constantly rotated around the participant's vertical axis. The virtual scene contained either visual-only, auditory-only, or a combination of corresponding visual and auditory cues. All participants performed all three conditions in a counterbalanced order. Participants tilted their heads alternately towards the right or left shoulder in all conditions during stimulus exposure in order to create pseudo-Coriolis effects and to maximize the likelihood of motion sickness. Measurements of motion sickness (onset, severity), vection (latency, strength, duration), and postural steadiness (center of pressure) were recorded. Results showed that adding auditory cues to the visual stimuli did not, on average, affect motion sickness and postural steadiness, but it did reduce vection onset times and increase vection strength compared to pure visual or pure auditory stimulation. Eighteen of the 20 participants reported at least slight motion sickness in the two conditions including visual stimuli. More interestingly, six participants also reported slight motion sickness during pure auditory stimulation, and two of the six stopped the pure auditory test session due to motion sickness. The present study is the first to demonstrate that motion sickness may be caused by pure auditory stimulation, which we refer to as "auditorily induced motion sickness".

  7. Selective attention to emotional cues and emotion recognition in healthy subjects: the role of mineralocorticoid receptor stimulation.

    Science.gov (United States)

    Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja

    2016-09-01

    Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3 years), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and a shift in selective attention toward sad faces after fludrocortisone compared to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.
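In dot-probe tasks like this one, the attentional bias score is simply the mean reaction-time difference between probes replacing the neutral cue and probes replacing the emotional cue. A sketch, with an assumed per-trial record layout (the 'probe_at' field name is illustrative):

```python
from statistics import mean

def attentional_bias(trials):
    """Dot-probe bias score in ms: RT when the probe replaces the neutral cue
    minus RT when it replaces the emotional cue. Positive values indicate
    attention drawn toward the emotional cue; negative values, away from it.
    The 'probe_at' field name is a hypothetical convention.
    """
    at_neutral = mean(t['rt'] for t in trials if t['probe_at'] == 'neutral')
    at_emotional = mean(t['rt'] for t in trials if t['probe_at'] == 'emotional')
    return at_neutral - at_emotional
```

For example, mean RTs of 525 ms at the neutral location and 500 ms at the emotional location give a bias of +25 ms, i.e., attention captured by the emotional cue.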

  8. What Grasps and Holds 8-Month-Old Infants' Looking Attention? The Effects of Object Size and Depth Cues

    OpenAIRE

    Guan, Yu; Corbetta, Daniela

    2012-01-01

    The current eye-tracking study explored the relative impact of object size and depth cues on 8-month-old infants' visual attention processes. A series of slides containing 3 objects of either different or same size were displayed on backgrounds with varying depth cues. The distribution of infants' first looks (a measure of initial attention switch) and infants' looking durations (a measure of sustained attention) at the objects were analyzed. Results revealed that the large objects captured i...

  9. Long-term memory biases auditory spatial attention.

    Science.gov (United States)

    Zimmermann, Jacqueline F; Moscovitch, Morris; Alain, Claude

    2017-10-01

    Long-term memory (LTM) has been shown to bias attention to a previously learned visual target location. Here, we examined whether memory-predicted spatial location can facilitate the detection of a faint pure tone target embedded in real world audio clips (e.g., soundtrack of a restaurant). During an initial familiarization task, participants heard audio clips, some of which included a lateralized target (p = 50%). On each trial participants indicated whether the target was presented from the left, right, or was absent. Following a 1 hr retention interval, participants were presented with the same audio clips, which now all included a target. In Experiment 1, participants showed memory-based gains in response time and d'. Experiment 2 showed that temporal expectations modulate attention, with greater memory-guided attention effects on performance when temporal context was reinstated from learning (i.e., when timing of the target within audio clips was not changed from initially learned timing). Experiment 3 showed that while conscious recall of target locations was modulated by exposure to target-context associations during learning (i.e., better recall with higher number of learning blocks), the influence of LTM associations on spatial attention was not reduced (i.e., number of learning blocks did not affect memory-guided attention). Both Experiments 2 and 3 showed gains in performance related to target-context associations, even for associations that were not explicitly remembered. Together, these findings indicate that memory for audio clips is acquired quickly and is surprisingly robust; both implicit and explicit LTM for the location of a faint target tone modulated auditory spatial attention. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. A right-ear bias of auditory selective attention is evident in alpha oscillations.

    Science.gov (United States)

    Payne, Lisa; Rogers, Chad S; Wingfield, Arthur; Sekuler, Robert

    2017-04-01

    Auditory selective attention makes it possible to pick out one speech stream that is embedded in a multispeaker environment. We adapted a cued dichotic listening task to examine suppression of a speech stream lateralized to the nonattended ear, and to evaluate the effects of attention on the right ear's well-known advantage in the perception of linguistic stimuli. After being cued to attend to input from either their left or right ear, participants heard two different four-word streams presented simultaneously to the separate ears. Following each dichotic presentation, participants judged whether a spoken probe word had been in the attended ear's stream. We used EEG signals to track participants' spatial lateralization of auditory attention, which is marked by interhemispheric differences in EEG alpha (8-14 Hz) power. A right-ear advantage (REA) was evident in faster response times and greater sensitivity in distinguishing attended from unattended words. Consistent with the REA, we found strongest parietal and right frontotemporal alpha modulation during the attend-right condition. These findings provide evidence for a link between selective attention and the REA during directed dichotic listening. © 2016 Society for Psychophysiological Research.
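The interhemispheric alpha (8-14 Hz) differences tracked in this study can be summarized by a simple index: band power from each hemisphere, normalized as (right - left) / (right + left). The sketch below uses a plain periodogram on two hypothetical single-channel signals; an actual EEG pipeline would use Welch or multitaper estimates over artifact-cleaned epochs and electrode clusters.

```python
import numpy as np

def alpha_power(signal, fs, band=(8.0, 14.0)):
    """Mean periodogram power within the alpha band."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def lateralization_index(left, right, fs):
    """(right - left) / (right + left) alpha power; positive values mean
    relatively more right-hemisphere alpha."""
    r, l = alpha_power(right, fs), alpha_power(left, fs)
    return (r - l) / (r + l)

# Demo: a strong 10 Hz rhythm on the right channel drives the index toward +1.
fs = 250
t = np.arange(0, 2, 1.0 / fs)
right = np.sin(2 * np.pi * 10 * t)
left = 0.1 * right
idx = lateralization_index(left, right, fs)
```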

  11. Orienting attention in visual working memory requires central capacity: decreased retro-cue effects under dual-task conditions.

    Science.gov (United States)

    Janczyk, Markus; Berryhill, Marian E

    2014-04-01

    The retro-cue effect (RCE) describes superior working memory performance for validly cued stimulus locations long after encoding has ended. Importantly, this happens with delays beyond the range of iconic memory. In general, the RCE is a stable phenomenon that emerges under varied stimulus configurations and timing parameters. We investigated its susceptibility to dual-task interference to determine the attentional requirements at the time point of cue onset and encoding. In Experiment 1, we compared single- with dual-task conditions. In Experiment 2, we borrowed from the psychological refractory period paradigm and compared conditions with high and low (dual-) task overlap. The secondary task was always binary tone discrimination requiring a manual response. Across both experiments, an RCE was found, but it was diminished in magnitude in the critical dual-task conditions. A previous study did not find evidence that sustained attention is required in the interval between cue offset and test. Our results apparently contradict these findings and point to a critical time period around cue onset and briefly thereafter during which attention is required.

  12. Spatial attention triggered by unimodal, crossmodal, and bimodal exogenous cues: a comparison of reflexive orienting mechanisms

    NARCIS (Netherlands)

    Santangelo, Valerio; van der Lubbe, Robert Henricus Johannes; Belardinelli, Marta Olivetti; Postma, Albert

    The aim of this study was to establish whether spatial attention triggered by bimodal exogenous cues acts differently as compared to unimodal and crossmodal exogenous cues due to crossmodal integration. In order to investigate this issue, we examined cuing effects in discrimination tasks and

  13. [Some electrophysiological and hemodynamic characteristics of auditory selective attention in norm and schizophrenia].

    Science.gov (United States)

    Lebedeva, I S; Akhadov, T A; Petriaĭkin, A V; Kaleda, V G; Barkhatova, A N; Golubev, S A; Rumiantseva, E E; Vdovenko, A M; Fufaeva, E A; Semenova, N A

    2011-01-01

    Six patients in the state of remission after the first episode of juvenile schizophrenia and seven sex- and age-matched mentally healthy subjects were examined by fMRI and ERP methods. The auditory oddball paradigm was applied. Differences in P300 parameters did not reach the level of significance; however, a significantly higher hemodynamic response to target stimuli was found in patients bilaterally in the supramarginal gyrus and in the right medial frontal gyrus, which points to a pathology of these brain areas in supporting auditory selective attention.

  14. The right planum temporale is involved in stimulus-driven, auditory attention--evidence from transcranial magnetic stimulation.

    Directory of Open Access Journals (Sweden)

    Marco Hirnstein

    Full Text Available It is well known that the planum temporale (PT) area in the posterior temporal lobe carries out spectro-temporal analysis of auditory stimuli, which is crucial for speech, for example. There are suggestions that the PT is also involved in auditory attention, specifically in the discrimination and selection of stimuli from the left and right ear. However, direct evidence is missing so far. To examine the role of the PT in auditory attention we asked fourteen participants to complete the Bergen Dichotic Listening Test. In this test two different consonant-vowel syllables (e.g., "ba" and "da") are presented simultaneously, one to each ear, and participants are asked to verbally report the syllable they heard best or most clearly. Thus attentional selection of a syllable is stimulus-driven. Each participant completed the test three times: after their left and right PT (located with anatomical brain scans) had been stimulated with repetitive transcranial magnetic stimulation (rTMS), which transiently interferes with normal brain functioning in the stimulated sites, and after sham stimulation, where participants were led to believe they had been stimulated but no rTMS was applied (control). After sham stimulation the typical right ear advantage emerged, that is, participants reported relatively more right than left ear syllables, reflecting a left-hemispheric dominance for language. rTMS over the right but not left PT significantly reduced the right ear advantage. This was the result of participants reporting more left and fewer right ear syllables after right PT stimulation, suggesting there was a leftward shift in stimulus selection. Taken together, our findings point to a new function of the PT in addition to auditory perception: particularly the right PT is involved in stimulus selection and (stimulus-driven) auditory attention.

  15. Measuring effects of voluntary attention: a comparison among predictive arrow, colour, and number cues.

    Science.gov (United States)

    Olk, Bettina; Tsankova, Elena; Petca, A Raisa; Wilhelm, Adalbert F X

    2014-10-01

    The Posner cueing paradigm is one of the most widely used paradigms in attention research. When employing it, it is critical to understand which type of orienting a cue triggers. It has been suggested that large effects elicited by predictive arrow cues reflect an interaction of involuntary and voluntary orienting. This conclusion is based on comparisons of cueing effects of predictive arrows, nonpredictive arrows (involuntary orienting), and predictive numbers (voluntary orienting). Experiment 1 investigated whether this conclusion is restricted to comparisons with number cues and showed results similar to those of previous studies, now for comparisons with predictive colour cues, indicating that the earlier conclusion can be generalized. Experiment 2 assessed whether the size of a cueing effect is related to the ease of deriving direction information from a cue, based on the rationale that effects for arrows may be larger because it may be easier to process direction information given by symbols such as arrows than that given by other cues. Indeed, direction information is derived faster and more accurately from arrows than from colour and number cues in a direction judgement task, and cueing effects are larger for arrows than for the other cues. Importantly, though, performance in the two tasks is not correlated. Hence, the large cueing effects of arrows are not a result of the ease of information processing, but of the types of orienting that the arrows elicit.

  16. Rhythmic auditory cueing to improve walking in patients with neurological conditions other than Parkinson's disease--what is the evidence?

    Science.gov (United States)

    Wittwer, Joanne E; Webster, Kate E; Hill, Keith

    2013-01-01

    To investigate whether synchronising over-ground walking to rhythmic auditory cues improves temporal and spatial gait measures in adults with neurological clinical conditions other than Parkinson's disease. A search was performed in June 2011 using the computerised databases AGELINE, AMED, AMI, CINAHL, Current Contents, EMBASE, MEDLINE, PsycINFO and PUBMED, and extended using hand-searching of relevant journals and article reference lists. Methodological quality was independently assessed by two reviewers. A best evidence synthesis was applied to rate levels of evidence. Fourteen studies, four of which were randomized controlled trials (RCTs), met the inclusion criteria. Patient groups included those with stroke (six studies); Huntington's disease and spinal cord injury (two studies each); traumatic brain injury, dementia, multiple sclerosis and normal pressure hydrocephalus (one study each). The best evidence synthesis found moderate evidence of improved velocity and stride length of people with stroke following gait training with rhythmic music. Insufficient evidence was found for other included neurological disorders due to low study numbers and poor methodological quality of some studies. Synchronising walking to rhythmic auditory cues can result in short-term improvement in gait measures of people with stroke. Further high quality studies are needed before recommendations for clinical practice can be made.

  17. Auditory Selective Attention: an introduction and evidence for distinct facilitation and inhibition mechanisms

    OpenAIRE

    Mikyska, Constanze Elisabeth Anna

    2012-01-01

    Objective Auditory selective attention is a complex brain function that is still not completely understood. The classic example is the so-called “cocktail party effect” (Cherry, 1953), which describes the impressive ability to focus one’s attention on a single voice from a multitude of voices. This means that particular stimuli in the environment are enhanced in contrast to other ones of lower priority that are ignored. To be able to understand how attention can influence the perception and p...

  18. Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks

    OpenAIRE

    Moisala, Mona; Salmela, Viljami; Salo, Emma; Carlson, Synnove; Vuontela, Virve; Salonen, Oili; Alho, Kimmo

    2015-01-01

    Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dua...

  19. Attentional Bias to Food Cues in Youth with Loss of Control Eating

    Science.gov (United States)

    2016-05-20

    ...bias toward highly palatable foods versus neutral objects was positively associated with BMI-z. These findings suggest that LOC eating and body weight

  20. The Role of Search Speed in the Contextual Cueing of Children's Attention.

    Science.gov (United States)

    Darby, Kevin; Burling, Joseph; Yoshida, Hanako

    2014-01-01

    The contextual cueing effect is a robust phenomenon in which repeated exposure to the same arrangement of random elements guides attention to relevant information by constraining search. The effect is measured using an object search task in which a target (e.g., the letter T) is located within repeated or nonrepeated visual contexts (e.g., configurations of the letter L). Decreasing response times for the repeated configurations indicates that contextual information has facilitated search. Although the effect is robust among adult participants, recent attempts to document the effect in children have yielded mixed results. We examined the effect of search speed on contextual cueing with school-aged children, comparing three types of stimuli that promote different search times in order to observe how speed modulates this effect. Reliable effects of search time were found, suggesting that visual search speed uniquely constrains the role of attention toward contextually cued information.

  1. Early auditory evoked potential is modulated by selective attention and related to individual differences in visual working memory capacity.

    Science.gov (United States)

    Giuliano, Ryan J; Karns, Christina M; Neville, Helen J; Hillyard, Steven A

    2014-12-01

    A growing body of research suggests that the predictive power of working memory (WM) capacity for measures of intellectual aptitude is due to the ability to control attention and select relevant information. Crucially, attentional mechanisms implicated in controlling access to WM are assumed to be domain-general, yet reports of enhanced attentional abilities in individuals with larger WM capacities are primarily within the visual domain. Here, we directly test the link between WM capacity and early attentional gating across sensory domains, hypothesizing that measures of visual WM capacity should predict an individual's capacity to allocate auditory selective attention. To address this question, auditory ERPs were recorded in a linguistic dichotic listening task, and individual differences in ERP modulations by attention were correlated with estimates of WM capacity obtained in a separate visual change detection task. Auditory selective attention enhanced ERP amplitudes at an early latency (ca. 70-90 msec), with larger P1 components elicited by linguistic probes embedded in an attended narrative. Moreover, this effect was associated with greater individual estimates of visual WM capacity. These findings support the view that domain-general attentional control mechanisms underlie the wide variation of WM capacity across individuals.

  2. Attention deficits revealed by passive auditory change detection for pure tones and lexical tones in ADHD children.

    Science.gov (United States)

    Yang, Ming-Tao; Hsu, Chun-Hsien; Yeh, Pei-Wen; Lee, Wang-Tso; Liang, Jao-Shwann; Fu, Wen-Mei; Lee, Chia-Ying

    2015-01-01

    Inattention (IA) has been a major problem in children with attention deficit/hyperactivity disorder (ADHD), accounting for their behavioral and cognitive dysfunctions. However, there are at least three processing steps underlying attentional control for auditory change detection, namely pre-attentive change detection, involuntary attention orienting, and attention reorienting for further evaluation. This study aimed to examine whether children with ADHD would show deficits in any of these subcomponents by using mismatch negativity (MMN), P3a, and late discriminative negativity (LDN) as event-related potential (ERP) markers, under the passive auditory oddball paradigm. Two types of stimuli, pure tones and Mandarin lexical tones, were used to examine if the deficits were general across linguistic and non-linguistic domains. Participants included 15 native Mandarin-speaking children with ADHD and 16 age-matched controls (across groups, age ranged between 6 and 15 years). Two passive auditory oddball paradigms (lexical tones and pure tones) were applied. The pure tone oddball paradigm included a standard stimulus (1000 Hz, 80%) and two deviant stimuli (1015 and 1090 Hz, 10% each). The Mandarin lexical tone oddball paradigm's standard stimulus was /yi3/ (80%) and two deviant stimuli were /yi1/ and /yi2/ (10% each). The results showed no MMN difference, but did show attenuated P3a and enhanced LDN to the large deviants for both pure and lexical tone changes in the ADHD group. Correlation analysis showed that children with higher ADHD tendency, as indexed by parents' and teachers' ratings of ADHD symptoms, showed less positive P3a amplitudes when responding to large lexical tone deviants. Thus, children with ADHD showed impaired auditory change detection for both pure tones and lexical tones in both involuntary attention switching and attention reorienting for further evaluation. These ERP markers may therefore be used for the evaluation of anti-ADHD drugs that aim to

  3. Attention deficits revealed by passive auditory change detection for pure tones and lexical tones in ADHD children

    Directory of Open Access Journals (Sweden)

    Ming-Tao eYang

    2015-08-01

    Full Text Available Inattention has been a major problem in children with attention deficit/hyperactivity disorder (ADHD), accounting for their behavioral and cognitive dysfunctions. However, there are at least three processing steps underlying attentional control for auditory change detection, namely pre-attentive change detection, involuntary attention orienting, and attention reorienting for further evaluation. This study aimed to examine whether children with ADHD would show deficits in any of these subcomponents by using mismatch negativity (MMN), P3a, and late discriminative negativity (LDN) as event-related potential (ERP) markers, under the passive auditory oddball paradigm. Two types of stimuli, pure tones and Mandarin lexical tones, were used to examine if the deficits were general across linguistic and non-linguistic domains. Participants included 15 native Mandarin-speaking children with ADHD and 16 age-matched controls (across groups, age ranged between 6 and 15 years). Two passive auditory oddball paradigms (lexical tones and pure tones) were applied. The pure tone paradigm included a standard stimulus (1000 Hz, 80%) and two deviant stimuli (1015 Hz and 1090 Hz, 10% each). The Mandarin lexical tone paradigm's standard stimulus was /yi3/ (80%) and the two deviant stimuli were /yi1/ and /yi2/ (10% each). The results showed no MMN difference, but did show attenuated P3a and enhanced LDN to the large deviants for both pure and lexical tone changes in the ADHD group. Correlation analysis showed that children with higher ADHD tendency, as indexed by parents' and teachers' ratings of ADHD symptoms, showed less positive P3a amplitudes when responding to large lexical tone deviants. Thus, children with ADHD showed impaired auditory change detection for both pure tones and lexical tones in both involuntary attention switching and attention reorienting for further evaluation. These ERP markers may therefore be used for the evaluation of anti-ADHD drugs that aim to alleviate these

  4. Attention effects at auditory periphery derived from human scalp potentials: displacement measure of potentials.

    Science.gov (United States)

    Ikeda, Kazunari; Hayashi, Akiko; Sekiguchi, Takahiro; Era, Shukichi

    2006-10-01

    In humans, it is difficult to identify the attention effect at the auditory periphery with electrophysiological measures such as the auditory brainstem response (ABR), whereas the centrifugal effect has been detected by measuring otoacoustic emissions. This research developed a measure responsive to the shift of human scalp potentials within a brief post-stimulus period (13 ms), namely displacement percentage, and applied it in an experiment to retrieve the peripheral attention effect. In the present experimental paradigm, tone pips were presented to the left ear while the other ear was masked by white noise. Twelve participants each completed two conditions, either ignoring or attending to the tone pips. Relative to the averaged scalp potentials in the ignoring condition, a shift of the potentials was found within the early component range during the attentive condition, and the displacement percentage then revealed a significant magnitude difference between the two conditions. These results suggest that, using a measure representing the potential shift itself, the peripheral effect of attention can be detected from human scalp potentials.

  5. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence.

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D; Chait, Maria

    2016-09-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence, the coincidence of sound elements in and across time, is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals ("stochastic figure-ground": SFG) comprised a sequence of brief broadband chords containing random pure tone components that vary from one chord to another. Occasional tone repetitions across chords are perceived as "figures" popping out of a stochastic "ground." Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing from about 150 ms after figure onset, that reflect the emergence of the "figure" from the randomly varying "ground." Neural sources underlying this bottom-up driven figure-ground segregation were localized to the planum temporale and the intraparietal sulcus, demonstrating that the latter area, outside the "classic" auditory system, is also involved in the early stages of auditory scene analysis. © The Author 2016. Published by Oxford University Press.

  6. Linking attentional processes and conceptual problem solving: visual cues facilitate the automaticity of extracting relevant information from diagrams.

    Science.gov (United States)

    Rouinfar, Amy; Agra, Elise; Larson, Adam M; Rebello, N Sanjay; Loschky, Lester C

    2014-01-01

    This study investigated links between visual attention processes and conceptual problem solving. This was done by overlaying visual cues on conceptual physics problem diagrams to direct participants' attention to relevant areas to facilitate problem solving. Participants (N = 80) individually worked through four problem sets, each containing a diagram, while their eye movements were recorded. Each diagram contained regions that were relevant to solving the problem correctly and separate regions related to common incorrect responses. Problem sets contained an initial problem, six isomorphic training problems, and a transfer problem. The cued condition saw visual cues overlaid on the training problems. Participants' verbal responses were used to determine their accuracy. This study produced two major findings. First, short duration visual cues which draw attention to solution-relevant information and aid in organizing and integrating it, facilitate both immediate problem solving and generalization of that ability to new problems. Thus, visual cues can facilitate re-representing a problem and overcoming impasse, enabling a correct solution. Importantly, these cueing effects on problem solving did not involve the solvers' attention necessarily embodying the solution to the problem, but were instead caused by solvers attending to and integrating relevant information in the problems into a solution path. Second, this study demonstrates that when such cues are used across multiple problems, solvers can automatize the extraction of problem-relevant information. These results suggest that low-level attentional selection processes provide a necessary gateway for relevant information to be used in problem solving, but are generally not sufficient for correct problem solving. Instead, factors that lead a solver to an impasse and to organize and integrate problem information also greatly facilitate arriving at correct solutions.

  7. A treat for the eyes. An eye-tracking study on children's attention to unhealthy and healthy food cues in media content.

    Science.gov (United States)

    Spielvogel, Ines; Matthes, Jörg; Naderer, Brigitte; Karsay, Kathrin

    2018-06-01

    Based on cue reactivity theory, food cues embedded in media content can lead to physiological and psychological responses in children. Research suggests that unhealthy food cues are represented more extensively and interactively in children's media environments than healthy ones. However, it is not yet clear whether children react differently to unhealthy compared to healthy food cues. In an experimental study with 56 children (55.4% girls; M age = 8.00, SD = 1.58), we used eye-tracking to determine children's attention to unhealthy and healthy food cues embedded in a narrative cartoon movie. Besides varying the food type (i.e., healthy vs. unhealthy), we also manipulated the integration levels of food cues with characters (i.e., level of food integration; no interaction vs. handling vs. consumption), and we assessed children's individual susceptibility factors by measuring the impact of their hunger level. Our results indicated that unhealthy food cues attract children's visual attention to a larger extent than healthy cues. However, their initial visual interest did not differ between unhealthy and healthy food cues. Furthermore, an increase in the level of food integration led to an increase in visual attention. Our findings showed no moderating impact of hunger. We conclude that especially unhealthy food cues with an interactive connection trigger cue reactivity in children. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Spatiotemporal Relationships among Audiovisual Stimuli Modulate Auditory Facilitation of Visual Target Discrimination.

    Science.gov (United States)

    Li, Qi; Yang, Huamin; Sun, Fang; Wu, Jinglong

    2015-03-01

    Sensory information is multimodal; through audiovisual interaction, task-irrelevant auditory stimuli tend to speed response times and increase visual perception accuracy. However, mechanisms underlying these performance enhancements have remained unclear. We hypothesize that task-irrelevant auditory stimuli might provide reliable temporal and spatial cues for visual target discrimination and behavioral response enhancement. Using signal detection theory, the present study investigated the effects of spatiotemporal relationships on auditory facilitation of visual target discrimination. Three experiments were conducted where an auditory stimulus maintained reliable temporal and/or spatial relationships with visual target stimuli. Results showed that perception sensitivity (d') to visual target stimuli was enhanced only when a task-irrelevant auditory stimulus maintained reliable spatiotemporal relationships with a visual target stimulus. When the auditory stimulus provided only reliable spatial or only reliable temporal information, perception sensitivity was not enhanced. These results suggest that reliable spatiotemporal relationships between visual and auditory signals are required for audiovisual integration during a visual discrimination task, most likely due to a spread of attention. These results also indicate that auditory facilitation of visual target discrimination follows from late-stage cognitive processes rather than early-stage sensory processes. © 2015 SAGE Publications.
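    The sensitivity measure d' reported above comes from signal detection theory: the z-transform of the hit rate minus the z-transform of the false-alarm rate. A minimal sketch in Python (the log-linear correction and the trial counts below are illustrative assumptions, not values from this study):

```python
from statistics import NormalDist  # stdlib inverse-normal CDF

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each cell) keeps the
    z-transform finite when an observed rate is 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# illustrative counts: 45 hits, 5 misses, 10 false alarms, 40 correct rejections
print(round(d_prime(45, 5, 10, 40), 2))
```

Equal hit and false-alarm rates give d' = 0 (no sensitivity); larger positive values mean better discrimination of signal from noise.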

  9. Increased psychophysiological parameters of attention in non-psychotic individuals with auditory verbal hallucinations

    DEFF Research Database (Denmark)

    van Lutterveld, Remko; Oranje, Bob; Abramovic, Lucija

    2010-01-01

    with an auditory oddball paradigm in 18 non-psychotic individuals with AVH and 18 controls. RESULTS: P300 amplitude was increased in the AVH group as compared to controls, reflecting superior effortful attention. A trend in the same direction was found for processing negativity. No significant differences were...... found for mismatch negativity. CONCLUSION: Contrary to our expectations, non-psychotic individuals with AVH show increased rather than decreased psychophysiological measures of effortful attention compared to healthy controls, refuting a pivotal role of decreased effortful attention...

  10. Attention and alcohol cues: a role for medial parietal cortex and shifting away from alcohol features?

    Directory of Open Access Journals (Sweden)

    Thomas Edward Gladwin

    2013-12-01

    Full Text Available Attention plays a central role in theories of alcohol dependence; however, its precise role in alcohol-related biases is not yet clear. In the current study, social drinkers performed a spatial cueing task designed to evoke conflict between automatic processes due to incentive salience and control exerted to follow task-related goals. Such conflict is a potentially important task feature from the perspective of dual-process models of addiction. Subjects received instructions either to direct their attention towards pictures of alcoholic beverages and away from non-alcoholic beverages, or to direct their attention towards pictures of non-alcoholic beverages and away from alcoholic beverages. A probe stimulus was likely to appear at the attended location, so that both spatial and non-spatial interference was possible. Activation in medial parietal cortex was found during Approach Alcohol versus Avoid Alcohol blocks. This region is associated with the (possibly automatic) shifting of attention between stimulus features, suggesting that subjects may have shifted attention away from certain features of alcoholic cues when attention had to be directed towards an upcoming stimulus at their location. Further, activation in voxels close to this region was negatively correlated with riskier drinking behavior. A tentative interpretation of the results is that risky drinking may be associated with a reduced tendency to shift attention away from potentially distracting task-irrelevant alcohol cues. The results suggest novel hypotheses and directions for future study, in particular towards the potential therapeutic use of training the ability to shift attention away from alcohol-related stimulus features.

  11. I can see what you are saying: Auditory labels reduce visual search times.

    Science.gov (United States)

    Cho, Kit W

    2016-10-01

    The present study explored the self-directed-speech effect, the finding that relative to silent reading of a label (e.g., DOG), saying it aloud reduces visual search reaction times (RTs) for locating a target picture among distractors. Experiment 1 examined whether this effect is due to a confound in the differences in the number of cues in self-directed speech (two) vs. silent reading (one) and tested whether self-articulation is required for the effect. The results showed that self-articulation is not required and that merely hearing the auditory label reduces visual search RTs relative to silent reading. This finding also rules out the number of cues confound. Experiment 2 examined whether hearing an auditory label activates more prototypical features of the label's referent and whether the auditory-label benefit is moderated by the target's imagery concordance (the degree to which the target picture matches the mental picture that is activated by a written label for the target). When the target imagery concordance was high, RTs following the presentation of a high prototypicality picture or auditory cue were comparable and shorter than RTs following a visual label or low prototypicality picture cue. However, when the target imagery concordance was low, RTs following an auditory cue were shorter than the comparable RTs following the picture cues and visual-label cue. The results suggest that an auditory label activates both prototypical and atypical features of a concept and can facilitate visual search RTs even when compared to picture primes. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. The Benefit of Attention-to-Memory Depends on the Interplay of Memory Capacity and Memory Load

    Science.gov (United States)

    Lim, Sung-Joo; Wöstmann, Malte; Geweke, Frederik; Obleser, Jonas

    2018-01-01

    Humans can be cued to attend to an item in memory, which facilitates and enhances the perceptual precision in recalling this item. Here, we demonstrate that this facilitating effect of attention-to-memory hinges on the overall degree of memory load. The benefit an individual draws from attention-to-memory depends on her overall working memory performance, measured as sensitivity (d′) in a retroactive cue (retro-cue) pitch discrimination task. While listeners maintained 2, 4, or 6 auditory syllables in memory, we provided valid or neutral retro-cues to direct listeners’ attention to one, to-be-probed syllable in memory. Participants’ overall memory performance (i.e., perceptual sensitivity d′) was relatively unaffected by the presence of valid retro-cues across memory loads. However, a more fine-grained analysis using psychophysical modeling shows that valid retro-cues elicited faster pitch-change judgments and improved perceptual precision. Importantly, as memory load increased, listeners’ overall working memory performance correlated with inter-individual differences in the degree to which precision improved (r = 0.39, p = 0.029). Under high load, individuals with low working memory profited least from attention-to-memory. Our results demonstrate that retrospective attention enhances perceptual precision of attended items in memory but listeners’ optimal use of informative cues depends on their overall memory abilities. PMID:29520246
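    The individual-differences result above (r = 0.39, p = 0.029) is a Pearson correlation between overall working memory performance and the retro-cue benefit in precision. A minimal sketch of that statistic in Python (the two score lists are hypothetical illustrations, not data from the study):

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# hypothetical: overall memory d' per listener vs. precision benefit of a valid retro-cue
memory_dprime = [0.8, 1.1, 1.5, 1.9, 2.3, 2.6]
cue_benefit = [0.02, 0.10, 0.05, 0.12, 0.20, 0.15]
print(round(pearson_r(memory_dprime, cue_benefit), 2))
```

A positive r, as in the abstract, indicates that listeners with better overall memory performance drew a larger precision benefit from the retro-cue.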

  13. The Benefit of Attention-to-Memory Depends on the Interplay of Memory Capacity and Memory Load

    Directory of Open Access Journals (Sweden)

    Sung-Joo Lim

    2018-02-01

    Full Text Available Humans can be cued to attend to an item in memory, which facilitates and enhances the perceptual precision in recalling this item. Here, we demonstrate that this facilitating effect of attention-to-memory hinges on the overall degree of memory load. The benefit an individual draws from attention-to-memory depends on her overall working memory performance, measured as sensitivity (d′) in a retroactive cue (retro-cue) pitch discrimination task. While listeners maintained 2, 4, or 6 auditory syllables in memory, we provided valid or neutral retro-cues to direct listeners’ attention to one, to-be-probed syllable in memory. Participants’ overall memory performance (i.e., perceptual sensitivity d′) was relatively unaffected by the presence of valid retro-cues across memory loads. However, a more fine-grained analysis using psychophysical modeling shows that valid retro-cues elicited faster pitch-change judgments and improved perceptual precision. Importantly, as memory load increased, listeners’ overall working memory performance correlated with inter-individual differences in the degree to which precision improved (r = 0.39, p = 0.029). Under high load, individuals with low working memory profited least from attention-to-memory. Our results demonstrate that retrospective attention enhances perceptual precision of attended items in memory but listeners’ optimal use of informative cues depends on their overall memory abilities.

  14. Heads First: Visual Aftereffects Reveal Hierarchical Integration of Cues to Social Attention.

    Directory of Open Access Journals (Sweden)

    Sarah Cooney

    Full Text Available Determining where another person is attending is an important skill for social interaction that relies on various visual cues, including the turning direction of the head and body. This study reports a novel high-level visual aftereffect that addresses the important question of how these sources of information are combined in gauging social attention. We show that adapting to images of heads turned 25° to the right or left produces a perceptual bias in judging the turning direction of subsequently presented bodies. In contrast, little to no change in the judgment of head orientation occurs after adapting to extremely oriented bodies. The unidirectional nature of the aftereffect suggests that cues from the human body signaling social attention are combined in a hierarchical fashion and is consistent with evidence from single-cell recording studies in nonhuman primates showing that information about head orientation can override information about body posture when both are visible.

  15. Acquisition of Conditioning between Methamphetamine and Cues in Healthy Humans.

    Directory of Open Access Journals (Sweden)

    Joel S Cavallo

    Full Text Available Environmental stimuli repeatedly paired with drugs of abuse can elicit conditioned responses that are thought to promote future drug seeking. We recently showed that healthy volunteers acquired conditioned responses to auditory and visual stimuli after just two pairings with methamphetamine (MA, 20 mg, oral). This study extended these findings by systematically varying the number of drug-stimuli pairings. We expected that more pairings would result in stronger conditioning. Three groups of healthy adults were randomly assigned to receive 1, 2 or 4 pairings (Groups P1, P2 and P4, Ns = 13, 16, 16, respectively) of an auditory-visual stimulus with MA, and another stimulus with placebo (PBO). Drug-cue pairings were administered in an alternating, counterbalanced order, under double-blind conditions, during 4 hr sessions. MA produced prototypic subjective effects (mood, ratings of drug effects) and alterations in physiology (heart rate, blood pressure). Although subjects did not exhibit increased behavioral preference for, or emotional reactivity to, the MA-paired cue after conditioning, they did exhibit an increase in attentional bias (initial gaze toward the drug-paired stimulus). Further, subjects who had four pairings reported "liking" the MA-paired cue more than the PBO cue after conditioning. Thus, the number of drug-stimulus pairings, varying from one to four, had only modest effects on the strength of conditioned responses. Further studies investigating the parameters under which drug conditioning occurs will help to identify risk factors for developing drug abuse, and provide new treatment strategies.

  16. Multimodal computational attention for scene understanding and robotics

    CERN Document Server

    Schauerte, Boris

    2016-01-01

    This book presents state-of-the-art computational attention models that have been successfully tested in diverse application areas and can build the foundation for artificial systems to efficiently explore, analyze, and understand natural scenes. It gives a comprehensive overview of the most recent computational attention models for processing visual and acoustic input. It covers the biological background of visual and auditory attention, as well as bottom-up and top-down attentional mechanisms, and discusses various applications. In the first part, new approaches for bottom-up visual and acoustic saliency models are presented and applied to the task of audio-visual scene exploration of a robot. In the second part, the influence of top-down cues on attention modeling is investigated.

  17. Using auditory-visual speech to probe the basis of noise-impaired consonant-vowel perception in dyslexia and auditory neuropathy

    Science.gov (United States)

    Ramirez, Joshua; Mann, Virginia

    2005-08-01

    Both dyslexics and auditory neuropathy (AN) subjects show inferior consonant-vowel (CV) perception in noise, relative to controls. To better understand these impairments, natural acoustic speech stimuli that were masked in speech-shaped noise at various intensities were presented to dyslexic, AN, and control subjects either in isolation or accompanied by visual articulatory cues. AN subjects were expected to benefit from the pairing of visual articulatory cues and auditory CV stimuli, provided that their speech perception impairment reflects a relatively peripheral auditory disorder. Assuming that dyslexia reflects a general impairment of speech processing rather than a disorder of audition, dyslexics were not expected to similarly benefit from an introduction of visual articulatory cues. The results revealed an increased effect of noise masking on the perception of isolated acoustic stimuli by both dyslexic and AN subjects. More importantly, dyslexics showed less effective use of visual articulatory cues in identifying masked speech stimuli and lower visual baseline performance relative to AN subjects and controls. Last, a significant positive correlation was found between reading ability and the ameliorating effect of visual articulatory cues on speech perception in noise. These results suggest that some reading impairments may stem from a central deficit of speech processing.

  18. Common region wins the competition between extrinsic grouping cues: Evidence from a task without explicit attention to grouping.

    Science.gov (United States)

    Montoro, Pedro R; Villalba-García, Cristina; Luna, Dolores; Hinojosa, José A

    2017-12-01

    The competition between perceptual grouping factors is a relatively ignored topic, especially in the case of extrinsic grouping cues (e.g., common region or connectedness). Recent studies have examined the integration of extrinsic cues using tasks that induce selective attention to groups based on different grouping cues. However, this procedure could generate alternative strategies for task performance that are unrelated to the perceptual grouping operations. In the current work, we used an indirect task (i.e., a repetition discrimination task) without explicit attention to grouping cues to further examine the rules that govern dominance between competing extrinsic grouping factors. This procedure allowed us to obtain an unbiased measure of the competition between common region and connectedness cues acting within the same display. The results corroborate previous data showing that grouping by common region dominated the perceived organization of the display, even though the phenomenological strength of the grouping cues was equated for each participant by means of a preliminary scaling task. Our results highlight the relevance of using indirect tasks as an essential tool for the systematic study of the integration of extrinsic grouping cues.

  19. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention

    Directory of Open Access Journals (Sweden)

    Saki Takao

    2018-01-01

    Full Text Available The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed the gaze-cueing effect. Although this effect arises when the duration of stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural studies on cognition suggest that Westerners tend to use a context-independent analytical strategy to process visual environments, whereas Asians use a context-dependent holistic approach. We hypothesized that Japanese participants would not demonstrate the gaze-cueing effect at longer SOAs because they are more sensitive to contextual information, such as the knowledge that the direction of a gaze is not predictive. Furthermore, we hypothesized that American participants would demonstrate the gaze-cueing effect at the long SOAs because they tend to follow gaze direction whether it is predictive or not. In Experiment 1, American participants demonstrated the gaze-cueing effect at the long SOA, indicating that their attention was driven by the central non-predictive gaze direction regardless of the SOAs. In Experiment 2, Japanese participants demonstrated no gaze-cueing effect at the long SOA, suggesting that the Japanese participants exercised voluntary control of their attention, which inhibited the gaze-cueing effect with the long SOA. Our findings suggest that the control of visual spatial attention elicited by social stimuli systematically differs between American and Japanese individuals.

  20. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention.

    Science.gov (United States)

    Takao, Saki; Yamani, Yusuke; Ariga, Atsunori

    2017-01-01

    The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed the gaze-cueing effect. Although this effect arises when the duration of stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural studies on cognition suggest that Westerners tend to use a context-independent analytical strategy to process visual environments, whereas Asians use a context-dependent holistic approach. We hypothesized that Japanese participants would not demonstrate the gaze-cueing effect at longer SOAs because they are more sensitive to contextual information, such as the knowledge that the direction of a gaze is not predictive. Furthermore, we hypothesized that American participants would demonstrate the gaze-cueing effect at the long SOAs because they tend to follow gaze direction whether it is predictive or not. In Experiment 1, American participants demonstrated the gaze-cueing effect at the long SOA, indicating that their attention was driven by the central non-predictive gaze direction regardless of the SOAs. In Experiment 2, Japanese participants demonstrated no gaze-cueing effect at the long SOA, suggesting that the Japanese participants exercised voluntary control of their attention, which inhibited the gaze-cueing effect with the long SOA. Our findings suggest that the control of visual spatial attention elicited by social stimuli systematically differs between American and Japanese individuals.

  1. Early Auditory Evoked Potential Is Modulated by Selective Attention and Related to Individual Differences in Visual Working Memory Capacity

    Science.gov (United States)

    Giuliano, Ryan J.; Karns, Christina M.; Neville, Helen J.; Hillyard, Steven A.

    2015-01-01

    A growing body of research suggests that the predictive power of working memory (WM) capacity for measures of intellectual aptitude is due to the ability to control attention and select relevant information. Crucially, attentional mechanisms implicated in controlling access to WM are assumed to be domain-general, yet reports of enhanced attentional abilities in individuals with larger WM capacities are primarily within the visual domain. Here, we directly test the link between WM capacity and early attentional gating across sensory domains, hypothesizing that measures of visual WM capacity should predict an individual’s capacity to allocate auditory selective attention. To address this question, auditory ERPs were recorded in a linguistic dichotic listening task, and individual differences in ERP modulations by attention were correlated with estimates of WM capacity obtained in a separate visual change detection task. Auditory selective attention enhanced ERP amplitudes at an early latency (ca. 70–90 msec), with larger P1 components elicited by linguistic probes embedded in an attended narrative. Moreover, this effect was associated with greater individual estimates of visual WM capacity. These findings support the view that domain-general attentional control mechanisms underlie the wide variation of WM capacity across individuals. PMID:25000526

  2. Nonverbal spatially selective attention in 4- and 5-year-old children.

    Science.gov (United States)

    Sanders, Lisa D; Zobel, Benjamin H

    2012-07-01

    Under some conditions 4- and 5-year-old children can differentially process sounds from attended and unattended locations. In fact, the latency of spatially selective attention effects on auditory processing as measured with event-related potentials (ERPs) is quite similar in young children and adults. However, it is not clear if developmental differences in the polarity, distribution, and duration of attention effects are best attributed to acoustic characteristics, availability of non-spatial attention cues, task demands, or domain. In the current study adults and children were instructed to attend to one of two simultaneously presented soundscapes (e.g., city sounds or night sounds) to detect targets (e.g., car horn or owl hoot) in the attended channel only. Probes presented from the same location as the attended soundscape elicited a larger negativity by 80 ms after onset in both adults and children. This initial negative difference (Nd) was followed by a larger positivity for attended probes in adults and another negativity for attended probes in children. The results indicate that the neural systems by which attention modulates early auditory processing are available for young children even when presented with nonverbal sounds. They also suggest important interactions between attention, acoustic characteristics, and maturity on auditory evoked potentials. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Attentional bias toward high-calorie food-cues and trait motor impulsivity interactively predict weight gain

    Directory of Open Access Journals (Sweden)

    Adrian Meule

    2016-05-01

    Full Text Available Strong bottom-up impulses and weak top-down control may interactively lead to overeating and, consequently, weight gain. In the present study, female university freshmen were tested at the start of the first semester and again at the start of the second semester. Attentional bias toward high- or low-calorie food-cues was assessed using a dot-probe paradigm, and participants completed the Barratt Impulsiveness Scale. Attentional bias and motor impulsivity interactively predicted change in body mass index: motor impulsivity positively predicted weight gain only when participants showed an attentional bias toward high-calorie food-cues. Attentional and non-planning impulsivity were unrelated to weight change. Results support findings showing that weight gain is prospectively predicted by a combination of weak top-down control (i.e., high impulsivity) and strong bottom-up impulses (i.e., high automatic motivational drive toward high-calorie food stimuli). They also highlight the fact that only specific aspects of impulsivity are relevant in eating and weight regulation.
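    Dot-probe attentional bias scores like the one above are conventionally computed as the mean response time when the probe replaces the neutral cue minus the mean response time when it replaces the salient cue, so positive values indicate attention drawn toward the salient (here, high-calorie) cue. A minimal sketch (the reaction times are hypothetical, not data from the study):

```python
def attentional_bias(rt_probe_at_neutral, rt_probe_at_salient):
    """Dot-probe bias score in ms: mean RT when the probe appears at the
    neutral cue's location minus mean RT when it appears at the salient
    cue's location. Positive values = faster responding at the salient
    location, i.e. an attentional bias toward the salient cue."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_salient)

# hypothetical reaction times in milliseconds
print(round(attentional_bias([512, 498, 530], [480, 465, 492]), 1))
```

A score near zero means the probe location did not matter, i.e. no spatial bias toward either cue type.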

  4. Internet-based attention bias modification for social anxiety: a randomised controlled comparison of training towards negative and training towards positive cues.

    Science.gov (United States)

    Boettcher, Johanna; Leek, Linda; Matson, Lisa; Holmes, Emily A; Browning, Michael; MacLeod, Colin; Andersson, Gerhard; Carlbring, Per

    2013-01-01

    Biases in attention processes are thought to play a crucial role in the aetiology and maintenance of Social Anxiety Disorder (SAD). The goal of the present study was to examine the efficacy of a programme intended to train attention towards positive cues and a programme intended to train attention towards negative cues. In a randomised, controlled, double-blind design, the impact of these two training conditions on both selective attention and social anxiety was compared to that of a control training condition. A modified dot probe task was used, delivered via the internet. A total of 129 individuals, diagnosed with SAD, were randomly assigned to one of these three conditions and took part in a 14-day programme with daily training/control sessions. Participants in all three groups did not on average display an attentional bias prior to the training. Critically, results on change in attention bias implied that significantly differential change in selective attention to threat was not detected in the three conditions. However, symptoms of social anxiety reduced significantly from pre- to follow-up assessment in all three conditions (d_within = 0.63-1.24), with the procedure intended to train attention towards threat cues producing, relative to the control condition, a significantly greater reduction of social fears. There were no significant differences in social anxiety outcome between the training condition intended to induce attentional bias towards positive cues and the control condition. To our knowledge, this is the first RCT where a condition intended to induce attention bias to negative cues yielded greater emotional benefits than a control condition. Intriguingly, changes in symptoms are unlikely to be by the mechanism of change in attention processes since there was no change detected in bias per se. Implications of this finding for future research on attention bias modification in social anxiety are discussed. ClinicalTrials.gov NCT01463137.

  5. Influence of negative affect on selective attention to smoking-related cues and urge to smoke in cigarette smokers.

    Science.gov (United States)

    Bradley, Brendan P; Garner, Matthew; Hudson, Laura; Mogg, Karin

    2007-07-01

    According to recent models of addiction, negative affect plays an important role in maintaining drug dependence. The study investigated the effect of negative mood on attentional biases for smoking-related cues and smoking urge in cigarette smokers. Eye movements to smoking-related and control pictures, and manual response times to probes, were recorded during a visual probe task. Smoking urges and mood were assessed by self-report measures. Negative affect was manipulated experimentally as a within-participants independent variable; that is, each participant received negative and neutral mood induction procedures, in counterbalanced order in separate sessions, before the attentional task. There were two groups of participants: smokers and nonsmokers. Smokers showed (i) a greater tendency to shift gaze initially towards smoking-related cues, and (ii) greater urge to smoke when they were in negative mood compared with neutral mood. Manual response time data suggested that smokers showed a greater tendency than nonsmokers to maintain attention on smoking-related cues, irrespective of mood. The results offer partial support for the view that negative mood increases selective attention to drug cues, and urge to smoke, in smokers. The findings are discussed in relation to an affective processing model of negative reinforcement in drug dependence.

  6. Speed on the dance floor: Auditory and visual cues for musical tempo.

    Science.gov (United States)

    London, Justin; Burger, Birgitta; Thompson, Marc; Toiviainen, Petri

    2016-02-01

    Musical tempo is most strongly associated with the rate of the beat or "tactus," which may be defined as the most prominent rhythmic periodicity present in the music, typically in a range of 1.67-2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingstone, 2004; Spence, 2015). A five-part experiment was performed to assess the integration of auditory and visual information in judgments of musical tempo. Participants rated the speed of six classic R&B songs on a seven-point scale while observing an animated figure dancing to them. Participants were presented with original and time-stretched (±5%) versions of each song in audio-only, audio+video (A+V), and video-only conditions. In some videos the animations were of spontaneous movements to the different time-stretched versions of each song, and in other videos the animations were of "vigorous" versus "relaxed" interpretations of the same auditory stimulus. Two main results were observed. First, in all conditions with audio, even though participants were able to correctly rank the original vs. time-stretched versions of each song, a song-specific tempo-anchoring effect was observed, such that sped-up versions of slower songs were judged to be faster than slowed-down versions of faster songs, even when their objective beat rates were the same. Second, when viewing a vigorous dancing figure in the A+V condition, participants gave faster tempo ratings than from the audio alone or when viewing the same audio with a relaxed dancing figure. The implications of this illusory tempo percept for cross-modal sensory integration and
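    The tactus range and time-stretch percentages above are simple arithmetic on beat rate: a periodicity in Hz maps to beats per minute by multiplying by 60, and a ±5% time-stretch scales the tempo by the same factor. A minimal sketch (function names are illustrative, not from the study):

```python
def hz_to_bpm(hz):
    """Convert a beat periodicity in Hz to beats per minute."""
    return hz * 60.0

def time_stretched_bpm(bpm, percent):
    """Tempo after time-stretching by +/- percent (positive = faster)."""
    return bpm * (1.0 + percent / 100.0)

# the typical tactus range of 1.67-2 Hz corresponds to roughly 100-120 BPM
print(hz_to_bpm(1.67), hz_to_bpm(2.0))
# a 120-BPM song time-stretched by +5% and -5%
print(time_stretched_bpm(120, 5), time_stretched_bpm(120, -5))
```

This makes concrete why a ±5% stretch is a subtle manipulation: at 120 BPM it shifts the beat rate by only 6 beats per minute in either direction.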

  7. Paying attention to attention in recognition memory: insights from models and electrophysiology.

    Science.gov (United States)

    Dubé, Chad; Payne, Lisa; Sekuler, Robert; Rotello, Caren M

    2013-12-01

    Reliance on remembered facts or events requires memory for their sources, that is, the contexts in which those facts or events were embedded. Understanding of source retrieval has been stymied by the fact that uncontrolled fluctuations of attention during encoding can cloud results of key importance to theoretical development. To address this issue, we combined electrophysiology (high-density electroencephalogram, EEG, recordings) with computational modeling of behavioral results. We manipulated subjects' attention to an auditory attribute, whether the source of individual study words was a male or female speaker. Posterior alpha-band (8-14 Hz) power in subjects' EEG increased after a cue to ignore the voice of the person who was about to speak. Receiver-operating-characteristic analysis validated our interpretation of oscillatory dynamics as a marker of attention to source information. With attention under experimental control, computational modeling showed unequivocally that memory for source (male or female speaker) reflected a continuous signal detection process rather than a threshold recollection process.
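    The receiver-operating-characteristic logic referred to above can be sketched as follows: under an equal-variance Gaussian (continuous) signal-detection model, z-transformed hit and false-alarm rates fall on a straight line with slope 1, the kind of signature that distinguishes a continuous process from threshold recollection. The criteria and d' below are illustrative values, not the study's data:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse standard-normal CDF

def zroc_points(hit_rates, fa_rates):
    """z-transform cumulative hit/false-alarm rates into zROC coordinates."""
    return [(z(f), z(h)) for h, f in zip(hit_rates, fa_rates)]

# Cumulative rates across confidence criteria for an equal-variance
# Gaussian model with d' = 1 (hypothetical numbers).
d_prime = 1.0
criteria = [1.5, 1.0, 0.5, 0.0, -0.5]
hits = [1 - NormalDist(mu=d_prime).cdf(c) for c in criteria]
fas = [1 - NormalDist().cdf(c) for c in criteria]

pts = zroc_points(hits, fas)
# Continuous model: every zROC segment has slope 1 and intercept d'.
slopes = [(pts[i + 1][1] - pts[i][1]) / (pts[i + 1][0] - pts[i][0])
          for i in range(len(pts) - 1)]
```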

  8. Intentional attention switching in dichotic listening: exploring the efficiency of nonspatial and spatial selection.

    Science.gov (United States)

    Lawo, Vera; Fels, Janina; Oberem, Josefa; Koch, Iring

    2014-10-01

    Using an auditory variant of task switching, we examined the ability to intentionally switch attention in a dichotic-listening task. In our study, participants responded selectively to one of two simultaneously presented auditory number words (spoken by a female and a male voice, one to each ear) by categorizing its numerical magnitude. The mapping of gender (female vs. male) and ear (left vs. right) was unpredictable. The to-be-attended feature (gender or ear, respectively) was indicated by a visual selection cue prior to auditory stimulus onset. In Experiment 1, explicitly cued switches of the relevant feature dimension (e.g., from gender to ear) and switches of the relevant feature within a dimension (e.g., from male to female) occurred in an unpredictable manner. We found large performance costs when the relevant feature switched, but switches of the relevant feature dimension incurred only small additional costs. The feature-switch costs were larger in ear-relevant than in gender-relevant trials. In Experiment 2, we replicated these findings using a simplified design (i.e., only within-dimension switches with blocked dimensions). In Experiment 3, we examined preparation effects by manipulating the cueing interval and found a preparation benefit only when ear was cued. Together, our data suggest that the largest part of attentional switch costs arises from reconfiguration at the level of relevant auditory features (e.g., left vs. right) rather than feature dimensions (ear vs. gender). Additionally, our findings suggest that ear-based target selection benefits more from preparation time (i.e., time to direct attention to one ear) than gender-based target selection.
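    The switch costs reported above are, in essence, mean reaction-time differences between switch and repeat trials. A toy sketch with invented RTs, not the study's data:

```python
def mean(xs):
    return sum(xs) / len(xs)

def switch_cost(trials):
    """trials: list of (rt_ms, switched) pairs.
    Returns mean switch-trial RT minus mean repeat-trial RT."""
    switch_rts = [rt for rt, switched in trials if switched]
    repeat_rts = [rt for rt, switched in trials if not switched]
    return mean(switch_rts) - mean(repeat_rts)

# Hypothetical RTs: within-dimension feature switches (e.g. male -> female)
# are costly; dimension switches (e.g. gender -> ear) add comparatively little.
feature_trials = [(620, True), (640, True), (510, False), (530, False)]
dimension_trials = [(630, True), (650, True), (615, False), (635, False)]

feature_cost = switch_cost(feature_trials)      # 110 ms
dimension_cost = switch_cost(dimension_trials)  # 15 ms
```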

  9. A Fröhlich effect and representational gravity in memory for auditory pitch.

    Science.gov (United States)

    Hubbard, Timothy L; Ruppel, Susan E

    2013-08-01

    Memory for the initial pitch of an auditory target that increased or decreased in auditory frequency was examined. Memory was displaced forward in the direction of pitch motion, and this is consistent with the Fröhlich effect previously observed for visual targets moving in visual physical space. The Fröhlich effect for pitch increased with faster target velocity and decreased if an auditory cue with the same pitch as the initial pitch of the target was presented before the target was presented. The Fröhlich effect was larger for descending pitch motion than for ascending pitch motion, and this is consistent with an influence of representational gravity. The data suggest that representation of auditory frequency space exhibits some of the same biases as representation of visual physical space, and implications for theories of attention in displacement and for crossmodal and multisensory representation of space are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved)

  10. The Role of Search Speed in the Contextual Cueing of Children’s Attention

    Science.gov (United States)

    Darby, Kevin; Burling, Joseph; Yoshida, Hanako

    2013-01-01

    The contextual cueing effect is a robust phenomenon in which repeated exposure to the same arrangement of random elements guides attention to relevant information by constraining search. The effect is measured using an object search task in which a target (e.g., the letter T) is located within repeated or nonrepeated visual contexts (e.g., configurations of the letter L). Decreasing response times for the repeated configurations indicates that contextual information has facilitated search. Although the effect is robust among adult participants, recent attempts to document the effect in children have yielded mixed results. We examined the effect of search speed on contextual cueing with school-aged children, comparing three types of stimuli that promote different search times in order to observe how speed modulates this effect. Reliable effects of search time were found, suggesting that visual search speed uniquely constrains the role of attention toward contextually cued information. PMID:24505167

  11. Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue

    Directory of Open Access Journals (Sweden)

    Ashley J Booth

    2015-06-01

    Full Text Available The ease of synchronising movements to a rhythmic cue is dependent on the modality of the cue presentation: timing accuracy is much higher when synchronising with discrete auditory rhythms than with an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronising with an auditory target metronome in the presence of a second, visual distracting metronome, the distraction is stronger when the visual cue contains spatial-temporal information rather than temporal information only. The present study investigates individuals' ability to synchronise movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centred on a large projection screen. The target dot was surrounded by 2, 8 or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100 or 200 ms. We found participants' timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronise movements. Subsequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance.
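    The distractor manipulation described above can be sketched as identical oscillatory trajectories offset in time: a 100- or 200-ms phase lead is simply a negative time shift of the target's sinusoidal trajectory. Frequency and amplitude below are assumed values, not taken from the study:

```python
import math

def dot_y(t, freq_hz=0.75, amplitude=1.0, lag_s=0.0):
    """Vertical position of a dot at time t (seconds).

    Positive lag_s makes the dot trail the target; negative lag_s makes
    it lead. Frequency and amplitude are illustrative assumptions.
    """
    return amplitude * math.sin(2 * math.pi * freq_hz * (t - lag_s))

t = 1.0
target = dot_y(t)
leader = dot_y(t, lag_s=-0.2)    # distractor with a 200-ms phase lead
trailer = dot_y(t, lag_s=0.2)    # distractor with a 200-ms phase lag
```

A leading distractor is thus equivalent to sampling the target trajectory slightly in the future, which is why it can pre-empt the salient turnaround events the participants synchronise to.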

  12. Auditory Stream Segregation Improves Infants' Selective Attention to Target Tones Amid Distracters

    Science.gov (United States)

    Smith, Nicholas A.; Trainor, Laurel J.

    2011-01-01

    This study examined the role of auditory stream segregation in the selective attention to target tones in infancy. Using a task adapted from Bregman and Rudnicky's 1975 study and implemented in a conditioned head-turn procedure, infant and adult listeners had to discriminate the temporal order of 2,200 and 2,400 Hz target tones presented alone,…

  13. Relevance of Spectral Cues for Auditory Spatial Processing in the Occipital Cortex of the Blind

    Science.gov (United States)

    Voss, Patrice; Lepore, Franco; Gougoux, Frédéric; Zatorre, Robert J.

    2011-01-01

    We have previously shown that some blind individuals can localize sounds more accurately than their sighted counterparts when one ear is obstructed, and that this ability is strongly associated with occipital cortex activity. Given that spectral cues are important for monaurally localizing sounds when one ear is obstructed, and that blind individuals are more sensitive to small spectral differences, we hypothesized that enhanced use of spectral cues via occipital cortex mechanisms could explain the better performance of blind individuals in monaural localization. Using positron-emission tomography (PET), we scanned blind and sighted persons as they discriminated between sounds originating from a single spatial position, but with different spectral profiles that simulated different spatial positions based on head-related transfer functions. We show here that a sub-group of early blind individuals showing superior monaural sound localization abilities performed significantly better than any other group on this spectral discrimination task. For all groups, performance was best for stimuli simulating peripheral positions, consistent with the notion that spectral cues are more helpful for discriminating peripheral sources. PET results showed cerebral blood flow increases in the occipital cortex in all blind groups, but this was also the case in the sighted group. A voxel-wise covariation analysis showed that more occipital recruitment was associated with better performance across all blind subjects but not the sighted. An inter-regional covariation analysis showed that the occipital activity in the blind covaried with that of several frontal and parietal regions known for their role in auditory spatial processing. 
Overall, these results support the notion that the superior ability of a sub-group of early-blind individuals to localize sounds is mediated by their superior ability to use spectral cues, and that this ability is subserved by cortical processing in
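    The stimulus manipulation described above, playing sounds from one physical position while simulating other positions, amounts to filtering the source with a direction-specific head-related impulse response (HRIR). A sketch with toy placeholder HRIRs; measured HRIRs are far longer and ear-specific:

```python
import numpy as np

def simulate_direction(signal, hrir):
    """Convolve a source signal with a (toy) head-related impulse response."""
    return np.convolve(signal, hrir)

rng = np.random.default_rng(0)
noise_burst = rng.standard_normal(1000)          # broadband source signal

# Placeholder impulse responses standing in for measured HRIRs.
hrir_front = np.array([1.0, 0.3, 0.1])
hrir_periphery = np.array([0.6, -0.4, 0.25])

front = simulate_direction(noise_burst, hrir_front)
periphery = simulate_direction(noise_burst, hrir_periphery)
```

Because only the spectral profile differs between the two outputs, discriminating them isolates sensitivity to spectral cues, as in the task above.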

  14. Age differences in visual-auditory self-motion perception during a simulated driving task

    Directory of Open Access Journals (Sweden)

    Robert eRamkhalawansingh

    2016-04-01

    Full Text Available Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion.

  15. Testing a cue outside the training context increases attention to the contexts and impairs performance in human predictive learning.

    Science.gov (United States)

    Aristizabal, José A; Ramos-Álvarez, Manuel M; Callejas-Aguilera, José E; Rosas, Juan M

    2017-12-01

    One experiment in human predictive learning explored the impact of a context change on attention to contexts and predictive ratings controlled by the cue. In Context A, cue X was paired with an outcome four times, while cue Y was presented without an outcome four times in Context B. In both contexts, filler cues were presented without the outcome. During the test, target cues X and Y were presented either in the context where they were trained or in the alternative context. With the context change, expectation of the outcome, expressed as predictive ratings, decreased in the presence of X and increased in the presence of Y. Looking at the contexts, expressed as a percentage of the overall gaze dwell time on a trial, was high across the four training trials and increased with the context change. Results suggest that the presentation of unexpected information leads to increases in attention to contextual cues. Implications for contextual control of behavior are discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D.; Chait, Maria

    2016-01-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence—the coincidence of sound elements in and across time—is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals ("stochastic figure-ground": SFG) comprised a sequence of brief broadband chords containing random pure-tone components that vary from one chord to another. Occasional tone repetitions across chords are perceived as "figures" popping out of a stochastic "ground." Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing from about 150 ms after figure onset, that reflect the emergence of the "figure" from the randomly varying "ground." Neural sources underlying this bottom-up driven figure-ground segregation were localized to planum temporale and the intraparietal sulcus, demonstrating that the latter area, outside the "classic" auditory system, is also involved in the early stages of auditory scene analysis. PMID:27325682
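    A minimal sketch of how an SFG-like sequence can be generated: each chord draws its pure-tone components at random, except for a small set of "figure" frequencies that repeat in every chord. All parameter values (chord count, tones per chord, frequency grid) are assumptions for illustration, not the study's stimulus parameters:

```python
import random

def sfg_sequence(n_chords=10, tones_per_chord=8, n_figure=3, seed=0):
    """Return (figure_freqs, chords). Figure components repeat in every
    chord; the remaining components are redrawn at random per chord."""
    random.seed(seed)
    # Pool of pure-tone frequencies on a semitone grid (values assumed).
    freq_pool = [round(179 * 2 ** (k / 12), 1) for k in range(60)]
    figure = random.sample(freq_pool, n_figure)      # fixed across chords
    ground_pool = [f for f in freq_pool if f not in figure]
    chords = []
    for _ in range(n_chords):
        ground = random.sample(ground_pool, tones_per_chord - n_figure)
        chords.append(sorted(figure + ground))
    return figure, chords

figure, chords = sfg_sequence()
```

Because every chord is spectrally random, only the across-chord repetition of the figure components (temporal coherence) distinguishes figure from ground.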

  17. Persistent fluctuations in stride intervals under fractal auditory stimulation.

    Science.gov (United States)

    Marmelat, Vivien; Torre, Kjerstin; Beek, Peter J; Daffertshofer, Andreas

    2014-01-01

    Stride sequences of healthy gait are characterized by persistent long-range correlations, which become anti-persistent in the presence of an isochronous metronome. The latter phenomenon is of particular interest because auditory cueing is generally considered to reduce stride variability and may hence be beneficial for stabilizing gait. Complex systems tend to match their correlation structure when synchronizing. In gait training, can one capitalize on this tendency by using a fractal metronome rather than an isochronous one? In two complementary experiments, we examined whether auditory cues with fractal variations in inter-beat intervals yield fractal inter-stride interval variability similar to that of self-paced walking, in contrast to isochronous auditory cueing. In Experiment 1, participants walked on a treadmill while being paced by either an isochronous or a fractal metronome with different variation strengths between beats, in order to test whether participants managed to synchronize with a fractal metronome and to determine the amount of variability necessary for participants to switch from anti-persistent to persistent inter-stride intervals. Participants did synchronize with the metronome despite its fractal randomness. The corresponding coefficient of variation of inter-beat intervals was fixed in Experiment 2, in which participants walked on a treadmill while being paced by non-isochronous metronomes with different scaling exponents. As expected, inter-stride intervals showed persistent correlations similar to self-paced walking only when cueing contained persistent correlations. Our results open up a new window to optimize rhythmic auditory cueing for gait stabilization by integrating fractal fluctuations in the inter-beat intervals.
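    One common way to build such a fractal metronome is spectral synthesis: shape noise to a 1/f^beta power spectrum, then rescale the series to a target mean inter-beat interval and coefficient of variation. A sketch under those assumptions (the study's actual generation method and parameter values may differ; the naive inverse DFT keeps the sketch dependency-free):

```python
import cmath, math, random

def fractal_intervals(n=256, beta=1.0, mean_s=1.0, cv=0.02, seed=1):
    """Generate n inter-beat intervals (seconds) whose fluctuations follow
    an approximate 1/f**beta spectrum (beta = 1: persistent, fractal)."""
    random.seed(seed)
    spectrum = [0j] * n
    for k in range(1, n // 2):
        amp = (1.0 / k) ** (beta / 2)              # power ~ 1/k**beta
        phase = random.uniform(0, 2 * math.pi)
        spectrum[k] = amp * cmath.exp(1j * phase)
        spectrum[n - k] = spectrum[k].conjugate()  # keep the series real
    # Naive inverse DFT (O(n^2), fine for small n and zero dependencies).
    series = [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n)
                  for k in range(n)).real / n
              for t in range(n)]
    m = sum(series) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in series) / n)
    # Rescale to the target mean interval and coefficient of variation.
    return [mean_s + (x - m) / sd * (cv * mean_s) for x in series]

beats = fractal_intervals()
```

With beta near 0 the same generator approximates the "non-isochronous but uncorrelated" control, so a family of scaling exponents can be produced by varying one parameter.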

  18. Complex-tone pitch representations in the human auditory system

    DEFF Research Database (Denmark)

    Bianchi, Federica

    Understanding how the human auditory system processes the physical properties of an acoustical stimulus to give rise to a pitch percept is a fascinating aspect of hearing research. Since most natural sounds are harmonic complex tones, this work focused on the nature of the pitch-relevant cues that are necessary for the auditory system to retrieve the pitch of complex sounds. The existence of different pitch-coding mechanisms for low-numbered (spectrally resolved) and high-numbered (unresolved) harmonics was investigated by comparing pitch-discrimination performance across different cohorts of listeners… in listeners with SNHL, it is likely that HI listeners rely on the enhanced envelope cues to retrieve the pitch of unresolved harmonics. Hence, the relative importance of pitch cues may be altered in HI listeners, whereby envelope cues may be used instead of TFS cues to obtain a similar performance in pitch…

  19. Rapid Auditory System Adaptation Using a Virtual Auditory Environment

    Directory of Open Access Journals (Sweden)

    Gaëtan Parseihian

    2011-10-01

    Full Text Available Various studies have highlighted plasticity of the auditory system driven by visual stimuli, which limits the trained region to the visual field. The aim of the present study is to investigate auditory system adaptation using an audio-kinesthetic platform. Participants were placed in a Virtual Auditory Environment allowing the association of the physical position of a virtual sound source with an alternate set of acoustic spectral cues, or Head-Related Transfer Function (HRTF), through the use of a tracked ball manipulated by the subject. This set-up has the advantage of not being limited to the visual field while also offering a natural perception-action coupling through the constant awareness of one's hand position. Adaptation to non-individualized HRTFs was realized through a spatial search game application. A total of 25 subjects participated: subjects presented with modified cues using non-individualized HRTFs, and a control group using individually measured HRTFs to account for any learning effect due to the game itself. The training game lasted 12 minutes and was repeated over 3 consecutive days. Adaptation effects were measured with repeated localization tests. Results showed a significant performance improvement for vertical localization and a significant reduction in the front/back confusion rate after 3 sessions.

  20. Internet-Based Attention Bias Modification for Social Anxiety: A Randomised Controlled Comparison of Training towards Negative and Training Towards Positive Cues

    Science.gov (United States)

    Boettcher, Johanna; Leek, Linda; Matson, Lisa; Holmes, Emily A.; Browning, Michael; MacLeod, Colin; Andersson, Gerhard; Carlbring, Per

    2013-01-01

    Biases in attention processes are thought to play a crucial role in the aetiology and maintenance of Social Anxiety Disorder (SAD). The goal of the present study was to examine the efficacy of a programme intended to train attention towards positive cues and a programme intended to train attention towards negative cues. In a randomised, controlled, double-blind design, the impact of these two training conditions on both selective attention and social anxiety was compared to that of a control training condition. A modified dot-probe task was used, delivered via the internet. A total of 129 individuals, diagnosed with SAD, were randomly assigned to one of these three conditions and took part in a 14-day programme with daily training/control sessions. Participants in all three groups did not on average display an attentional bias prior to the training. Critically, results on change in attention bias implied that significantly differential change in selective attention to threat was not detected in the three conditions. However, symptoms of social anxiety reduced significantly from pre- to follow-up-assessment in all three conditions (d_within = 0.63–1.24), with the procedure intended to train attention towards threat cues producing, relative to the control condition, a significantly greater reduction of social fears. There were no significant differences in social anxiety outcome between the training condition intended to induce attentional bias towards positive cues and the control condition. To our knowledge, this is the first RCT where a condition intended to induce attention bias to negative cues yielded greater emotional benefits than a control condition. Intriguingly, changes in symptoms are unlikely to be by the mechanism of change in attention processes since there was no change detected in bias per se. Implications of this finding for future research on attention bias modification in social anxiety are discussed. Trial Registration Clinical

  2. Selective attention and the auditory vertex potential. 1: Effects of stimulus delivery rate

    Science.gov (United States)

    Schwent, V. L.; Hillyard, S. A.; Galambos, R.

    1975-01-01

    Enhancement of the auditory vertex potentials with selective attention to dichotically presented tone pips was found to be critically sensitive to the range of inter-stimulus intervals in use. Only at the shortest intervals was a clear-cut enhancement of the N1 component observed for stimuli delivered to the attended ear.

  3. Empathy, Pain and Attention: Cues that Predict Pain Stimulation to the Partner and the Self Capture Visual Attention

    Directory of Open Access Journals (Sweden)

    Lingdan Wu

    2017-09-01

    Full Text Available Empathy motivates helping and cooperative behaviors and plays an important role in social interactions and personal communication. The present research examined the hypothesis that a state of empathy guides attention towards stimuli significant to others in a similar way as to stimuli relevant to the self. Sixteen couples in romantic partnerships were examined in a pain-related empathy paradigm including an anticipation phase and a stimulation phase. Abstract visual symbols (i.e., arrows and flashes) signaled the delivery of a Pain or Nopain stimulus to the partner or the self while dense-sensor event-related potentials (ERPs) were simultaneously recorded from both persons. During the anticipation phase, stimuli predicting Pain compared to Nopain stimuli to the partner elicited a larger early posterior negativity (EPN) and late positive potential (LPP), which were similar in topography and latency to the EPN and LPP modulations elicited by stimuli signaling pain for the self. Noteworthy, using abstract cue symbols to cue Pain and Nopain stimuli suggests that these effects are not driven by perceptual features. The findings demonstrate that symbolic stimuli relevant for the partner capture attention, which implies a state of empathy to the pain of the partner. From a broader perspective, states of empathy appear to regulate attention processing according to the perceived needs and goals of the partner.

  4. Interaction of streaming and attention in human auditory cortex.

    Science.gov (United States)

    Gutschalk, Alexander; Rupp, André; Dykstra, Andrew R

    2015-01-01

    Serially presented tones are sometimes segregated into two perceptually distinct streams. An ongoing debate is whether this basic streaming phenomenon reflects automatic processes or requires attention focused to the stimuli. Here, we examined the influence of focused attention on streaming-related activity in human auditory cortex using magnetoencephalography (MEG). Listeners were presented with a dichotic paradigm in which left-ear stimuli consisted of canonical streaming stimuli (ABA_ or ABAA) and right-ear stimuli consisted of a classical oddball paradigm. In phase one, listeners were instructed to attend the right-ear oddball sequence and detect rare deviants. In phase two, they were instructed to attend the left ear streaming stimulus and report whether they heard one or two streams. The frequency difference (ΔF) of the sequences was set such that the smallest and largest ΔF conditions generally induced one- and two-stream percepts, respectively. Two intermediate ΔF conditions were chosen to elicit bistable percepts (i.e., either one or two streams). Attention enhanced the peak-to-peak amplitude of the P1-N1 complex, but only for ambiguous ΔF conditions, consistent with the notion that automatic mechanisms for streaming tightly interact with attention and that the latter is of particular importance for ambiguous sound sequences.

  5. Development of Attentional Control of Verbal Auditory Perception from Middle to Late Childhood: Comparisons to Healthy Aging

    Science.gov (United States)

    Passow, Susanne; Müller, Maike; Westerhausen, René; Hugdahl, Kenneth; Wartenburger, Isabell; Heekeren, Hauke R.; Lindenberger, Ulman; Li, Shu-Chen

    2013-01-01

    Multitalker situations confront listeners with a plethora of competing auditory inputs, and hence require selective attention to relevant information, especially when the perceptual saliency of distracting inputs is high. This study augmented the classical forced-attention dichotic listening paradigm by adding an interaural intensity manipulation…

  6. A Cueing Procedure To Control Impulsivity in Children with Attention Deficit Hyperactivity Disorder.

    Science.gov (United States)

    Posavac, Heidi D.; Sheridan, Susan M.; Posavac, Steven S.

    1999-01-01

    Tests the efficacy of a cueing procedure for improving the impulse regulation of four boys with Attention Deficit Hyperactivity Disorder (ADHD) during social skills training. Behavioral data suggested that all subjects demonstrated positive changes in impulse regulation. Likewise, the treatment effects appeared to have produced positive effects on…

  7. Is the effect of tinnitus on auditory steady-state response amplitude mediated by attention?

    Directory of Open Access Journals (Sweden)

    Eugen eDiesch

    2012-05-01

    Full Text Available Objectives: The amplitude of the auditory steady-state response (ASSR) is enhanced in tinnitus. As ASSR amplitude is also enhanced by attention, the effect of tinnitus on ASSR amplitude could be interpreted as an effect of attention mediated by tinnitus. As attention effects on the N1 are significantly larger than those on the ASSR, if the effect of tinnitus on ASSR amplitude were due to attention, there should be similar amplitude enhancement effects in tinnitus for the N1 component of the auditory evoked response. Methods: MEG recordings of auditory evoked responses which were previously examined for the ASSR (Diesch et al., 2010) were analysed with respect to the N1m component. Like the ASSR previously, the N1m was analysed in the source domain (source space projection). Stimuli were amplitude-modulated tones with one of three carrier frequencies: a frequency matching the tinnitus frequency, or a surrogate frequency 1½ octaves above the audiometric edge frequency in controls; the audiometric edge frequency; and a frequency below the audiometric edge. Results: In the earlier ASSR study (Diesch et al., 2010), the ASSR amplitude in tinnitus patients, but not in controls, was significantly larger in the (surrogate) tinnitus condition than in the edge condition. In the present study, both tinnitus patients and healthy controls show an N1m-amplitude profile identical to that of ASSR amplitudes in healthy controls. N1m amplitudes elicited by tonal frequencies located at the audiometric edge and at the (surrogate) tinnitus frequency are smaller than N1m amplitudes elicited by sub-edge tones and do not differ from each other. Conclusions: There is no N1-amplitude enhancement effect in tinnitus. The enhancement effect of tinnitus on ASSR amplitude cannot be accounted for in terms of attention induced by tinnitus.

  8. Visual and cross-modal cues increase the identification of overlapping visual stimuli in Balint's syndrome.

    Science.gov (United States)

    D'Imperio, Daniela; Scandola, Michele; Gobbetto, Valeria; Bulgarelli, Cristina; Salgarello, Matteo; Avesani, Renato; Moro, Valentina

    2017-10-01

    Cross-modal interactions improve the processing of external stimuli, particularly when an isolated sensory modality is impaired. When information from different modalities is integrated, object recognition is facilitated, probably as a result of bottom-up and top-down processes. The aim of this study was to investigate the potential effects of cross-modal stimulation in a case of simultanagnosia. We report a detailed analysis of clinical symptoms and an 18F-fluorodeoxyglucose (FDG) brain positron emission tomography/computed tomography (PET/CT) study of a patient affected by Balint's syndrome, a rare and invasive visual-spatial disorder following bilateral parieto-occipital lesions. An experiment was conducted to investigate the effects of visual and nonvisual cues on performance in tasks involving the recognition of overlapping pictures. Four modalities of sensory cues were used: visual, tactile, olfactory, and auditory. Data from neuropsychological tests showed the presence of ocular apraxia, optic ataxia, and simultanagnosia. The results of the experiment indicate a positive effect of the cues on the recognition of overlapping pictures, not only in the identification of the congruent valid-cued stimulus (target) but also in the identification of the other, noncued stimuli. All the sensory modalities analyzed (except the auditory stimulus) were efficacious in terms of increasing visual recognition. Cross-modal integration improved the patient's ability to recognize overlapping figures. However, while in the visual unimodal modality both bottom-up (priming, familiarity effect, disengagement of attention) and top-down processes (mental representation and short-term memory, the endogenous orientation of attention) are involved, in cross-modal integration it is semantic representations that mainly activate visual recognition processes. These results are potentially useful for the design of rehabilitation training for attentional and visual-perceptual deficits.

  9. Brain correlates of the orientation of auditory spatial attention onto speaker location in a "cocktail-party" situation.

    Science.gov (United States)

    Lewald, Jörg; Hanenberg, Christina; Getzmann, Stephan

    2016-10-01

    Successful speech perception in complex auditory scenes with multiple competing speakers requires spatial segregation of auditory streams into perceptually distinct and coherent auditory objects and focusing of attention toward the speaker of interest. Here, we focused on the neural basis of this remarkable capacity of the human auditory system and investigated the spatiotemporal sequence of neural activity within the cortical network engaged in solving the "cocktail-party" problem. Twenty-eight subjects localized a target word in the presence of three competing sound sources. The analysis of the ERPs revealed an anterior contralateral subcomponent of the N2 (N2ac), computed as the difference waveform for targets to the left minus targets to the right. The N2ac peaked at about 500 ms after stimulus onset, and larger N2ac amplitudes were associated with better localization performance. Cortical source localization for the contrast of left versus right targets at the time of the N2ac revealed a maximum in the region around the left superior frontal sulcus and frontal eye field, both of which are known to be involved in processing of auditory spatial information. In addition, a posterior-contralateral late positive subcomponent (LPCpc) occurred at a latency of about 700 ms. Both these subcomponents are potential correlates of allocation of spatial attention to the target under cocktail-party conditions. © 2016 Society for Psychophysiological Research.
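
    The N2ac computation described above (left-target minus right-target ERPs, with a peak read off in a post-stimulus window) can be sketched as follows. The sampling rate, epoch, search window, and random data are hypothetical placeholders, not the study's parameters:

```python
import numpy as np

fs = 500                                # Hz; assumed sampling rate
t = np.arange(-0.2, 1.0, 1 / fs)        # epoch from -200 to 1000 ms
rng = np.random.default_rng(0)
erp_left = rng.normal(size=t.size)      # trial-averaged ERP at one anterior site, left targets
erp_right = rng.normal(size=t.size)     # same site, right-target average

# N2ac as defined in the abstract: difference waveform,
# left-target ERP minus right-target ERP
n2ac = erp_left - erp_right

# peak latency within a post-stimulus search window (300-700 ms, illustrative)
win = (t >= 0.3) & (t <= 0.7)
peak_ms = t[win][np.argmax(np.abs(n2ac[win]))] * 1000
```

    With real data the two averages would come from artifact-cleaned, baseline-corrected epochs; the subtraction and peak search are unchanged.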

  10. Visual cues and listening effort: individual variability.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2011-10-01

    To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and 2 presentation modalities (audio only [AO] and auditory-visual [AV]). Signal-to-noise ratios were adjusted to provide matched speech recognition across audio-only and AV noise conditions. Also measured were subjective perceptions of listening effort and 2 predictive variables: (a) lipreading ability and (b) WMC. Objective and subjective results indicated that listening effort increased in the presence of noise, but on average the addition of visual cues did not significantly affect the magnitude of listening effort. Although there was substantial individual variability, on average participants who were better lipreaders or had larger WMCs demonstrated reduced listening effort in noise in AV conditions. Overall, the results support the hypothesis that integrating auditory and visual cues requires cognitive resources in some participants. The data indicate that low lipreading ability or low WMC is associated with relatively effortful integration of auditory and visual information in noise.

  11. The absence of an auditory-visual attentional blink is not due to echoic memory.

    Science.gov (United States)

    Van der Burg, Erik; Olivers, Christian N; Bronkhorst, Adelbert W; Koelewijn, Thomas; Theeuwes, Jan

    2007-10-01

    The second of two targets is often missed when presented shortly after the first target--a phenomenon referred to as the attentional blink (AB). Whereas the AB is a robust phenomenon within sensory modalities, the evidence for cross-modal ABs is rather mixed. Here, we test the possibility that the absence of an auditory-visual AB for visual letter recognition when streams of tones are used is due to the efficient use of echoic memory, allowing for the postponement of auditory processing. However, forcing participants to immediately process the auditory target, either by presenting interfering sounds during retrieval or by making the first target directly relevant for a speeded response to the second target, did not result in a return of a cross-modal AB. The findings argue against echoic memory as an explanation for efficient cross-modal processing. Instead, we hypothesized that a cross-modal AB may be observed when the different modalities use common representations, such as semantic representations. In support of this, a deficit for visual letter recognition returned when the auditory task required a distinction between spoken digits and letters.

  12. Short-term plasticity in auditory cognition.

    Science.gov (United States)

    Jääskeläinen, Iiro P; Ahveninen, Jyrki; Belliveau, John W; Raij, Tommi; Sams, Mikko

    2007-12-01

    Converging lines of evidence suggest that auditory system short-term plasticity can enable several perceptual and cognitive functions that have been previously considered as relatively distinct phenomena. Here we review recent findings suggesting that auditory stimulation, auditory selective attention and cross-modal effects of visual stimulation each cause transient excitatory and (surround) inhibitory modulations in the auditory cortex. These modulations might adaptively tune hierarchically organized sound feature maps of the auditory cortex (e.g. tonotopy), thus filtering relevant sounds during rapidly changing environmental and task demands. This could support auditory sensory memory, pre-attentive detection of sound novelty, enhanced perception during selective attention, influence of visual processing on auditory perception and longer-term plastic changes associated with perceptual learning.

  13. Behavioral and Brain Measures of Phasic Alerting Effects on Visual Attention.

    Science.gov (United States)

    Wiegand, Iris; Petersen, Anders; Finke, Kathrin; Bundesen, Claus; Lansner, Jon; Habekost, Thomas

    2017-01-01

    In the present study, we investigated effects of phasic alerting on visual attention in a partial report task, in which half of the displays were preceded by an auditory warning cue. Based on the computational Theory of Visual Attention (TVA), we estimated parameters of spatial and non-spatial aspects of visual attention and measured event-related lateralizations (ERLs) over visual processing areas. We found that the TVA parameter sensory effectiveness a, which is thought to reflect visual processing capacity, significantly increased with phasic alerting. By contrast, the distribution of visual processing resources according to task relevance and spatial position, as quantified in the parameters top-down control α and spatial bias w_index, was not modulated by phasic alerting. On the electrophysiological level, the latencies of ERLs in response to the task displays were reduced following the warning cue. These results suggest that phasic alerting facilitates visual processing in a general, unselective manner and that this effect originates in early stages of visual information processing.

  14. Atypical auditory refractory periods in children from lower socio-economic status backgrounds: ERP evidence for a role of selective attention.

    Science.gov (United States)

    Stevens, Courtney; Paulsen, David; Yasen, Alia; Neville, Helen

    2015-02-01

    Previous neuroimaging studies indicate that lower socio-economic status (SES) is associated with reduced effects of selective attention on auditory processing. Here, we investigated whether lower SES is also associated with differences in a stimulus-driven aspect of auditory processing: the neural refractory period, or reduced amplitude response at faster rates of stimulus presentation. Thirty-two children aged 3 to 8 years participated, and were divided into two SES groups based on maternal education. Event-related brain potentials were recorded to probe stimuli presented at interstimulus intervals (ISIs) of 200, 500, or 1000 ms. These probes were superimposed on story narratives when attended and ignored, permitting a simultaneous experimental manipulation of selective attention. Results indicated that group differences in refractory periods differed as a function of attention condition. Children from higher SES backgrounds showed full neural recovery by 500 ms for attended stimuli, but required at least 1000 ms for unattended stimuli. In contrast, children from lower SES backgrounds showed similar refractory effects to attended and unattended stimuli, with full neural recovery by 500 ms. Thus, in higher SES children only, one functional consequence of selective attention is attenuation of the response to unattended stimuli, particularly at rapid ISIs, altering basic properties of the auditory refractory period. Together, these data indicate that differences in selective attention impact basic aspects of auditory processing in children from lower SES backgrounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Persistent fluctuations in stride intervals under fractal auditory stimulation.

    Directory of Open Access Journals (Sweden)

    Vivien Marmelat

    Full Text Available Stride sequences of healthy gait are characterized by persistent long-range correlations, which become anti-persistent in the presence of an isochronous metronome. The latter phenomenon is of particular interest because auditory cueing is generally considered to reduce stride variability and may hence be beneficial for stabilizing gait. Complex systems tend to match their correlation structure when synchronizing. In gait training, can one capitalize on this tendency by using a fractal metronome rather than an isochronous one? We examined whether auditory cues with fractal variations in inter-beat intervals yield similar fractal inter-stride interval variability as isochronous auditory cueing in two complementary experiments. In Experiment 1, participants walked on a treadmill while being paced by either an isochronous or a fractal metronome with different variation strengths between beats in order to test whether participants managed to synchronize with a fractal metronome and to determine the necessary amount of variability for participants to switch from anti-persistent to persistent inter-stride intervals. Participants did synchronize with the metronome despite its fractal randomness. The corresponding coefficient of variation of inter-beat intervals was fixed in Experiment 2, in which participants walked on a treadmill while being paced by non-isochronous metronomes with different scaling exponents. As expected, inter-stride intervals showed persistent correlations similar to self-paced walking only when cueing contained persistent correlations. Our results open up a new window to optimize rhythmic auditory cueing for gait stabilization by integrating fractal fluctuations in the inter-beat intervals.
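
    A common way to build such a fractal metronome is spectral synthesis of 1/f^β noise, rescaled to a target mean interval and coefficient of variation. A minimal sketch, with illustrative parameter values not taken from the study (β ≈ 1 gives the persistent "fractal" correlations discussed above; β = 0 gives uncorrelated jitter):

```python
import numpy as np

def fractal_intervals(n, mean_s=0.55, cv=0.02, beta=1.0, seed=0):
    """Inter-beat intervals (seconds) with 1/f^beta spectral scaling.

    The amplitude spectrum is shaped as f^(-beta/2) with random phases,
    then the series is rescaled to a fixed mean and coefficient of
    variation, mimicking the fixed-CV manipulation of Experiment 2
    (all parameter values here are illustrative).
    """
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n)
    amp = np.ones_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2)   # shape the amplitude spectrum
    amp[0] = 0.0                         # zero-mean fluctuation
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    series = np.fft.irfft(amp * np.exp(1j * phases), n)
    series = (series - series.mean()) / series.std()   # standardize
    return mean_s * (1 + cv * series)    # mean = mean_s, CV = cv

beats = fractal_intervals(512, beta=1.0)
```

    The cumulative sum of `beats` gives the beep onset times; varying `beta` between 0 and 1 sweeps the metronome from isochronous-like jitter to persistent fractal timing.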

  16. The effects of the dopamine D₃ receptor antagonist GSK598809 on attentional bias to palatable food cues in overweight and obese subjects.

    Science.gov (United States)

    Nathan, Pradeep J; O'Neill, Barry V; Mogg, Karin; Bradley, Brendan P; Beaver, John; Bani, Massimo; Merlo-Pich, Emilio; Fletcher, Paul C; Swirski, Bridget; Koch, Annelize; Dodds, Chris M; Bullmore, Edward T

    2012-03-01

    The mesolimbic dopamine system plays a critical role in the reinforcing effects of rewards. Evidence from pre-clinical studies suggests that D₃ receptor antagonists may attenuate the motivational impact of rewarding cues. In this study we examined the acute effects of the D₃ receptor antagonist GSK598809 on attentional bias to rewarding food cues in overweight to obese individuals (n=26, BMI mean=32.7±3.7, range 27-40 kg/m²) who reported binge and emotional eating. We also determined whether individual differences in restrained eating style modulated the effects of GSK598809 on attentional bias. The study utilized a randomized, double-blind, placebo-controlled cross-over design with each participant tested following acute administration of placebo and GSK598809 (175 mg). Attentional bias was assessed by the visual probe task and modified Stroop task using food-related words. Overall GSK598809 had no effects on attentional bias in either the visual probe or food Stroop tasks. However, the effect of GSK598809 on both visual probe and food Stroop attentional bias scores was inversely correlated with a measure of eating restraint allowing the identification of two subpopulations, low- and high-restrained eaters. Low-restrained eaters had a significant attentional bias towards food cues in both tasks under placebo, and this was attenuated by GSK598809. In contrast, high-restrained eaters showed no attentional bias to food cues following either placebo or GSK598809. These findings suggest that excessive attentional bias to food cues generated by individual differences in eating traits can be modulated by D₃ receptor antagonists, warranting further investigation with measures of eating behaviour and weight loss.

  17. Does Attention Play a Role in Dynamic Receptive Field Adaptation to Changing Acoustic Salience in A1?

    OpenAIRE

    Fritz, Jonathan; Elhilali, Mounya; David, Stephen; Shamma, Shihab

    2007-01-01

    Acoustic filter properties of A1 neurons can dynamically adapt to stimulus statistics, classical conditioning, instrumental learning and the changing auditory attentional focus. We have recently developed an experimental paradigm that allows us to view cortical receptive field plasticity on-line as the animal meets different behavioral challenges by attending to salient acoustic cues and changing its cortical filters to enhance performance. We propose that attention is the key trigger that in...

  18. Difference in Perseverative Errors during a Visual Attention Task with Auditory Distractors in Alpha-9 Nicotinic Receptor Subunit Wild Type and Knock-Out Mice.

    Science.gov (United States)

    Jorratt, Pascal; Delano, Paul H; Delgado, Carolina; Dagnino-Subiabre, Alexies; Terreros, Gonzalo

    2017-01-01

    The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through olivocochlear (OC) neurons. Medial OC neurons make cholinergic synapses with outer hair cells (OHCs) through nicotinic receptors constituted by α9 and α10 subunits. One of the physiological functions of the α9 nicotinic receptor subunit (α9-nAChR) is the suppression of auditory distractors during selective attention to visual stimuli. In a recent study we demonstrated that the behavioral performance of alpha-9 nicotinic receptor knock-out (KO) mice is altered during selective attention to visual stimuli with auditory distractors since they made fewer correct responses and more omissions than wild type (WT) mice. As the inhibition of the behavioral responses to irrelevant stimuli is an important mechanism of the selective attention processes, behavioral errors are relevant measures that can reflect altered inhibitory control. Errors produced during a cued attention task can be classified as premature, target and perseverative errors. Perseverative responses can be considered as an inability to inhibit the repetition of an action already planned, while premature responses can be considered as an index of the ability to wait or retain an action. Here, we studied premature, target and perseverative errors during a visual attention task with auditory distractors in WT and KO mice. We found that α9-KO mice made fewer perseverative errors with longer latencies than WT mice in the presence of auditory distractors. In addition, although we found no significant difference in the number of target errors between genotypes, KO mice made more short-latency target errors than WT mice during the presentation of auditory distractors. The fewer perseverative errors made by α9-KO mice could be explained by a reduced motivation for reward and an increased impulsivity during decision making with auditory distraction in KO mice.

  19. Object-based spatial attention when objects have sufficient depth cues.

    Science.gov (United States)

    Takeya, Ryuji; Kasai, Tetsuko

    2015-01-01

    Attention directed to a part of an object tends to obligatorily spread over all of the spatial regions that belong to the object, which may be critical for rapid object-recognition in cluttered visual scenes. Previous studies have generally used simple rectangles as objects and have shown that attention spreading is reflected by amplitude modulation in the posterior N1 component (150-200 ms poststimulus) of event-related potentials, while other interpretations (i.e., rectangular holes) may arise implicitly in early visual processing stages. By using modified Kanizsa-type stimuli that provided less ambiguity of depth ordering, the present study examined early event-related potential spatial-attention effects for connected and separated objects, both of which were perceived in front of (Experiment 1) and in back of (Experiment 2) the surroundings. Typical P1 (100-140 ms) and N1 (150-220 ms) attention effects of ERP in response to unilateral probes were observed in both experiments. Importantly, the P1 attention effect was decreased for connected objects compared to separated objects only in Experiment 1, and the typical object-based modulations of N1 were not observed in either experiment. These results suggest that spatial attention spreads over a figural object at earlier stages of processing than previously indicated, in three-dimensional visual scenes with multiple depth cues.

  20. Visual attention to alcohol cues and responsible drinking statements within alcohol advertisements and public health campaigns: Relationships with drinking intentions and alcohol consumption in the laboratory.

    Science.gov (United States)

    Kersbergen, Inge; Field, Matt

    2017-06-01

    Both alcohol advertising and public health campaigns increase alcohol consumption in the short term, and this may be attributable to attentional capture by alcohol-related cues in both types of media. The present studies investigated the association between (a) visual attention to alcohol cues and responsible drinking statements in alcohol advertising and public health campaigns, and (b) next-week drinking intentions (Study 1) and drinking behavior in the lab (Study 2). In Study 1, 90 male participants viewed 1 of 3 TV alcohol adverts (conventional advert; advert that emphasized responsible drinking; or public health campaign; between-subjects manipulation) while their visual attention to alcohol cues and responsible drinking statements was recorded, before reporting their drinking intentions. Study 2 used a within-subjects design in which 62 participants (27% male) viewed alcohol and soda advertisements while their attention to alcohol/soda cues and responsible drinking statements was recorded, before completing a bogus taste test with different alcoholic and nonalcoholic drinks. In both studies, alcohol cues attracted more attention than responsible drinking statements, except when viewing a public health TV campaign. Attention to responsible drinking statements was not associated with intentions to drink alcohol over the next week (Study 1) or alcohol consumption in the lab (Study 2). However, attention to alcohol portrayal cues within alcohol advertisements was associated with ad lib alcohol consumption in Study 2, although attention to other types of alcohol cues (brand logos, glassware, and packaging) was not associated. Future studies should investigate how responsible drinking statements might be improved to attract more attention. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Effects of divided attention and operating room noise on perception of pulse oximeter pitch changes: a laboratory study.

    Science.gov (United States)

    Stevenson, Ryan A; Schlesinger, Joseph J; Wallace, Mark T

    2013-02-01

    Anesthesiology requires performing visually oriented procedures while monitoring auditory information about a patient's vital signs. A concern in operating room environments is the amount of competing information and the effects that divided attention has on patient monitoring, such as detecting auditory changes in arterial oxygen saturation via pulse oximetry. The authors measured the impact of visual attentional load and auditory background noise on the ability of anesthesia residents to monitor the pulse oximeter auditory display in a laboratory setting. Accuracies and response times were recorded reflecting anesthesiologists' abilities to detect changes in oxygen saturation across three levels of visual attention in quiet and with noise. Results show that visual attentional load substantially affects the ability to detect changes in oxygen saturation concentrations conveyed by auditory cues signaling 99 and 98% saturation. These effects are compounded by auditory noise, with up to a 17% decline in performance. These deficits are seen in the ability to accurately detect a change in oxygen saturation and in speed of response. Most anesthesia accidents are initiated by small errors that cascade into serious events. Lack of monitor vigilance and inattention are two of the more commonly cited factors. Reducing such errors is thus a priority for improving patient safety. Specifically, efforts to reduce distractors and decrease background noise should be considered during induction and emergence, periods of especially high risk, when anesthesiologists have to attend to many tasks and are thus susceptible to error.

  2. Effects of divided attention and operating room noise on perception of pulse oximeter pitch changes: A laboratory study

    Science.gov (United States)

    Stevenson, Ryan A.; Schlesinger, Joseph J.; Wallace, Mark T.

    2012-01-01

    Background Anesthesiology requires performing visually-oriented procedures while monitoring auditory information about a patient’s vital signs. A concern in operating room environments is the amount of competing information and the effects that divided attention has on patient monitoring, such as detecting auditory changes in arterial oxygen saturation via pulse oximetry. Methods We measured the impact of visual attentional load and auditory background noise on the ability of anesthesia residents to monitor the pulse oximeter auditory display in a laboratory setting. Accuracies and response times were recorded reflecting anesthesiologists’ abilities to detect changes in oxygen saturation across three levels of visual attention in quiet and with noise. Results Results show that visual attentional load substantially impacts the ability to detect changes in oxygen saturation levels conveyed by auditory cues signaling 99 and 98% saturation. These effects are compounded by auditory noise, with up to a 17% decline in performance. These deficits are seen in the ability to accurately detect a change in oxygen saturation and in speed of response. Conclusions Most anesthesia accidents are initiated by small errors that cascade into serious events. Lack of monitor vigilance and inattention are two of the more commonly cited factors. Reducing such errors is thus a priority for improving patient safety. Specifically, efforts to reduce distractors and lower background noise should be considered during induction and emergence, periods of especially high risk, when anesthesiologists must attend to many tasks and are thus susceptible to error. PMID:23263015

  3. Comparing Auditory-Only and Audiovisual Word Learning for Children with Hearing Loss.

    Science.gov (United States)

    McDaniel, Jena; Camarata, Stephen; Yoder, Paul

    2018-05-15

    Although reducing visual input to emphasize auditory cues is a common practice in pediatric auditory (re)habilitation, the extant literature offers minimal empirical evidence for whether unisensory auditory-only (AO) or multisensory audiovisual (AV) input is more beneficial to children with hearing loss for developing spoken language skills. Using an adapted alternating treatments single case research design, we evaluated the effectiveness and efficiency of a receptive word learning intervention with and without access to visual speechreading cues. Four preschool children with prelingual hearing loss participated. Based on probes without visual cues, three participants demonstrated strong evidence for learning in the AO and AV conditions relative to a control (no-teaching) condition. No participants demonstrated a differential rate of learning between AO and AV conditions. Neither an inhibitory effect predicted by a unisensory theory nor a beneficial effect predicted by a multisensory theory for providing visual cues was identified. Clinical implications are discussed.

  4. What causes IOR? Attention or perception? - manipulating cue and target luminance in either blocked or mixed condition.

    Science.gov (United States)

    Zhao, Yuanyuan; Heinke, Dietmar

    2014-12-01

    Inhibition of return (IOR) refers to the performance disadvantage when detecting a target presented at a previously cued location. The current paper contributes to the long-standing debate over whether IOR is caused by attentional processing or perceptual processing. We present a series of four experiments which varied the cue luminance in mixed and blocked conditions. We hypothesised that if inhibition was initiated by an attentional process the size of IOR should not vary in the blocked condition, as participants should be able to adapt to the level of cue luminance. However, if a perceptual process triggers inhibition, both experimental manipulations should lead to varying levels of IOR. Indeed, we found evidence for the latter hypothesis. In addition, we also varied the target luminance in blocked and mixed conditions. Both manipulations, cue luminance and target luminance, affected IOR in an additive fashion, suggesting that the two stimuli affect human behaviour at different processing stages. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Dimension-based attention in visual short-term memory.

    Science.gov (United States)

    Pilling, Michael; Barrett, Doug J K

    2016-07-01

    We investigated how dimension-based attention influences visual short-term memory (VSTM). This was done through examining the effects of cueing a feature dimension in two perceptual comparison tasks (change detection and sameness detection). In both tasks, a memory array and a test array consisting of a number of colored shapes were presented successively, interleaved by a blank interstimulus interval (ISI). In Experiment 1 (change detection), the critical event was a feature change in one item across the memory and test arrays. In Experiment 2 (sameness detection), the critical event was the absence of a feature change in one item across the two arrays. Auditory cues indicated the feature dimension (color or shape) of the critical event with 80 % validity; the cues were presented either prior to the memory array, during the ISI, or simultaneously with the test array. In Experiment 1, the cue validity influenced sensitivity only when the cue was given at the earliest position; in Experiment 2, the cue validity influenced sensitivity at all three cue positions. We attributed the greater effectiveness of top-down guidance by cues in the sameness detection task to the more active nature of the comparison process required to detect sameness events (Hyun, Woodman, Vogel, Hollingworth, & Luck, Journal of Experimental Psychology: Human Perception and Performance, 35, 1140-1160, 2009).
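
    "Sensitivity" in such change/sameness-detection experiments is conventionally summarized as d′: the z-transformed hit rate minus the z-transformed false-alarm rate. A stdlib-only sketch (the rates below are made-up illustrations, not values from the study):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Signal-detection sensitivity: z(hits) - z(false alarms)."""
    z = NormalDist().inv_cdf   # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# valid-cue vs. invalid-cue conditions (hypothetical rates)
d_valid = d_prime(0.85, 0.20)
d_invalid = d_prime(0.75, 0.25)
```

    Comparing d′ across cue positions and validity conditions, as the abstract does, separates genuine sensitivity changes from mere shifts in response bias.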

  6. Listen, you are writing! Speeding up online spelling with a dynamic auditory BCI

    Directory of Open Access Journals (Sweden)

    Martijn eSchreuder

    2011-10-01

    Full Text Available Representing an intuitive spelling interface for Brain-Computer Interfaces (BCI) in the auditory domain is not straightforward. In consequence, all existing approaches based on event-related potentials (ERP) rely at least partially on a visual representation of the interface. This online study introduces an auditory spelling interface that eliminates the necessity for such a visualization. In up to two sessions, a group of healthy subjects (N=21) was asked to use a text entry application, utilizing the spatial cues of the AMUSE paradigm (Auditory Multiclass Spatial ERP). The speller relies on the auditory sense both for stimulation and the core feedback. Without prior BCI experience, 76% of the participants were able to write a full sentence during the first session. By exploiting the advantages of a newly introduced dynamic stopping method, a maximum writing speed of 1.41 characters/minute (7.55 bits/minute) could be reached during the second session (average: 0.94 char/min, 5.26 bits/min). For the first time, the presented work shows that an auditory BCI can reach performances similar to state-of-the-art visual BCIs based on covert attention. These results represent an important step towards a purely auditory BCI.
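
    Bits-per-minute figures like those above are commonly derived from the Wolpaw information-transfer-rate formula, bits per selection = log2 N + P log2 P + (1 - P) log2((1 - P)/(N - 1)). A sketch for illustration only; the class count, accuracy, and selection rate below are hypothetical, not a recomputation of the reported 7.55 bits/min:

```python
from math import log2

def itr_bits_per_selection(n_classes, accuracy):
    """Wolpaw information transfer rate per selection
    for an n_classes-way interface at the given accuracy."""
    p = accuracy
    if p >= 1.0:
        return log2(n_classes)
    bits = (log2(n_classes)
            + p * log2(p)
            + (1 - p) * log2((1 - p) / (n_classes - 1)))
    return max(bits, 0.0)   # below-chance accuracy floors at 0

# e.g. a 30-class speller at 90% accuracy, 2 selections/min (all hypothetical)
bits = itr_bits_per_selection(30, 0.9)
rate_bits_per_min = bits * 2.0
```

    Multiplying bits per selection by selections per minute yields the bits/min headline figure; different papers count selections and errors differently, which is why reported rates are not always directly comparable.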

  7. Different effects of the two types of spatial pre-cueing: what precisely is "attention" in Di Lollo's and Enns' substitution masking theory?

    Science.gov (United States)

    Luiga, I; Bachmann, T

    2007-11-01

    Enns and Di Lollo [Psychological Science, 8 (2), 135-139, 1997] have introduced the object substitution theory of visual masking. Object substitution masking occurs when focusing attention on the target is delayed. However, Posner (Quarterly Journal of Experimental Psychology, 32, 3-25, 1980) has already shown that attention can be directed to a target at least in two ways: intentionally (endogenously) and automatically (exogenously). We conducted two experiments to explore the effects of endogenous and exogenous cues on substitution masking. The results showed that when attention was shifted to the target location automatically (using a local peripheral pre-cue), masking was attenuated. A decrease in target identification dependent on a delay of mask offset, typical to substitution masking, was not observed. However, strong substitution masking occurred when the target location was not pre-cued or when attention was directed to the target location intentionally (using a symbolic pre-cue displayed centrally). The hypothesis of two different mechanisms of attentional control in substitution masking was confirmed.

  8. Difference in Perseverative Errors during a Visual Attention Task with Auditory Distractors in Alpha-9 Nicotinic Receptor Subunit Wild Type and Knock-Out Mice

    Directory of Open Access Journals (Sweden)

    Pascal Jorratt

    2017-11-01

    Full Text Available The auditory efferent system is a neural network that originates in the auditory cortex and projects to the cochlear receptor through olivocochlear (OC) neurons. Medial OC neurons make cholinergic synapses with outer hair cells (OHCs) through nicotinic receptors constituted by α9 and α10 subunits. One of the physiological functions of the α9 nicotinic receptor subunit (α9-nAChR) is the suppression of auditory distractors during selective attention to visual stimuli. In a recent study we demonstrated that the behavioral performance of alpha-9 nicotinic receptor knock-out (KO) mice is altered during selective attention to visual stimuli with auditory distractors since they made fewer correct responses and more omissions than wild type (WT) mice. As the inhibition of the behavioral responses to irrelevant stimuli is an important mechanism of the selective attention processes, behavioral errors are relevant measures that can reflect altered inhibitory control. Errors produced during a cued attention task can be classified as premature, target and perseverative errors. Perseverative responses can be considered as an inability to inhibit the repetition of an action already planned, while premature responses can be considered as an index of the ability to wait or retain an action. Here, we studied premature, target and perseverative errors during a visual attention task with auditory distractors in WT and KO mice. We found that α9-KO mice made fewer perseverative errors with longer latencies than WT mice in the presence of auditory distractors. In addition, although we found no significant difference in the number of target errors between genotypes, KO mice made more short-latency target errors than WT mice during the presentation of auditory distractors. The fewer perseverative errors made by α9-KO mice could be explained by a reduced motivation for reward and an increased impulsivity during decision making with auditory distraction in KO mice.

  9. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention

    OpenAIRE

    Saki Takao; Yusuke Yamani; Atsunori Ariga

    2018-01-01

    The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed the gaze-cueing effect. Although this effect arises even when the stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural st...

  10. Speaker's voice as a memory cue.

    Science.gov (United States)

    Campeanu, Sandra; Craik, Fergus I M; Alain, Claude

    2015-02-01

    Speaker's voice occupies a central role as the cornerstone of auditory social interaction. Here, we review the evidence suggesting that speaker's voice constitutes an integral context cue in auditory memory. Investigation into the nature of voice representation as a memory cue is essential to understanding auditory memory and the neural correlates which underlie it. Evidence from behavioral and electrophysiological studies suggests that while specific voice reinstatement (i.e., same speaker) often appears to facilitate word memory even without attention to voice at study, the presence of a partial benefit of similar voices between study and test is less clear. In terms of explicit memory experiments utilizing unfamiliar voices, encoding methods appear to play a pivotal role. Voice congruency effects have been found when voice is specifically attended at study (i.e., when relatively shallow, perceptual encoding takes place). These behavioral findings coincide with neural indices of memory performance such as the parietal old/new recollection effect and the late right frontal effect. The former distinguishes between correctly identified old words and correctly identified new words, and reflects voice congruency only when voice is attended at study. Characterization of the latter likely depends upon voice memory, rather than word memory. There is also evidence to suggest that voice effects can be found in implicit memory paradigms. However, the presence of voice effects appears to depend greatly on the task employed. Using a word identification task, perceptual similarity between study and test conditions is, as for explicit memory tests, crucial. In addition, the type of noise employed appears to have a differential effect. While voice effects have been observed when white noise is used at both study and test, using multi-talker babble does not confer the same results. In terms of neuroimaging research modulations, characterization of an implicit memory effect

  11. Cigarette Cue Attentional Bias in Cocaine-Smoking and Non-Cocaine-Using Cigarette Smokers.

    Science.gov (United States)

    Marks, Katherine R; Alcorn, Joseph L; Stoops, William W; Rush, Craig R

    2016-09-01

    Cigarette smoking in cocaine users is nearly four times higher than the national prevalence, and cocaine use increases cigarette smoking. The mechanisms underlying cigarette smoking in cocaine-using individuals need to be identified to promote cigarette and cocaine abstinence. Previous studies have examined the salience of cigarette and cocaine cues separately. The present aim was to determine whether cigarette attentional bias (AB) is higher in cigarette smokers who smoke cocaine relative to individuals who only smoke cigarettes. Twenty cigarette smokers who smoke cocaine and 20 non-cocaine-using cigarette smokers completed a visual probe task with eye-tracking technology. During this task, the magnitude of cigarette and cocaine AB was assessed through orienting bias, fixation time, and response time. Cocaine users displayed an orienting bias towards cigarette cues. Cocaine users also endorsed a more urgent desire to smoke to relieve negative affect associated with cigarette craving than non-cocaine users (g = 0.6). Neither group displayed a cigarette AB, as measured by fixation time. Cocaine users, but not non-cocaine users, displayed a cocaine AB as measured by orienting bias (g = 2.0) and fixation time (g = 1.2). There were no significant effects for response time data. Cocaine-smoking cigarette smokers display an initial orienting bias toward cigarette cues, but not sustained cigarette AB. The incentive motivation underlying cigarette smoking also differs: cocaine smokers report a more urgent desire to smoke to relieve negative affect. Identifying differences in motivation to smoke cigarettes may provide new treatment targets for cigarette and cocaine use disorders. These results suggest that cocaine-smoking cigarette smokers display an initial orienting bias towards cigarette cues, but not sustained attention towards cigarette cues, relative to non-cocaine-using smokers. Smoked cocaine users also report a more urgent desire to smoke to relieve negative affect.
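The group comparisons in this record are reported as Hedges' g. As a minimal, illustrative sketch of how that effect size is computed (the means, SDs, and the fixation-time framing below are made-up numbers, not data from the study):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Small-sample correction factor (Hedges' J, common approximation)
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * j

# Hypothetical fixation times (ms): cocaine group vs. non-cocaine group
print(round(hedges_g(420.0, 50.0, 20, 360.0, 50.0, 20), 2))  # ≈ 1.18
```

With 20 participants per group, the correction factor J is about 0.98, so reported g values at this sample size sit only slightly below the uncorrected Cohen's d.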

  12. Behavioral and Brain Measures of Phasic Alerting Effects on Visual Attention

    DEFF Research Database (Denmark)

    Wiegand, Iris Michaela; Petersen, Anders; Finke, Kathrin

    2017-01-01

    In the present study, we investigated effects of phasic alerting on visual attention in a partial report task, in which half of the displays were preceded by an auditory warning cue. Based on the computational Theory of Visual Attention (TVA), we estimated parameters of spatial and non-spatial aspects of visual attention and measured event-related lateralizations (ERLs) over visual processing areas. We found that the TVA parameter sensory effectiveness a, which is thought to reflect visual processing capacity, significantly increased with phasic alerting. By contrast, the distribution of visual … These results suggest that phasic alerting facilitates visual processing in a general, unselective manner and that this effect originates in early stages of visual information processing.

  13. Visual and auditory reaction time for air traffic controllers using quantitative electroencephalograph (QEEG) data.

    Science.gov (United States)

    Abbass, Hussein A; Tang, Jiangjun; Ellejmi, Mohamed; Kirby, Stephen

    2014-12-01

    The use of quantitative electroencephalograph in the analysis of air traffic controllers' performance can reveal with a high temporal resolution those mental responses associated with different task demands. To understand the relationship between visual and auditory correct responses, reaction time, and the corresponding brain areas and functions, air traffic controllers were given an integrated visual and auditory continuous reaction task. Strong correlations were found between correct responses to the visual target and the theta band in the frontal lobe, the total power in the medial of the parietal lobe and the theta-to-beta ratio in the left side of the occipital lobe. Incorrect visual responses triggered activations in additional bands including the alpha band in the medial of the frontal and parietal lobes, and the Sensorimotor Rhythm in the medial of the parietal lobe. Controllers' responses to visual cues were found to be more accurate but slower than their corresponding performance on auditory cues. These results suggest that controllers are more susceptible to overload when more visual cues are used in the air traffic control system, and more errors are pruned as more auditory cues are used. Therefore, workload studies should be carried out to assess the usefulness of additional cues and their interactions with the air traffic control environment.
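The QEEG indices above (theta-band power, total power, theta-to-beta ratio) are ratios of average spectral power within fixed frequency bands. A minimal sketch of one common way to compute such a ratio from a raw EEG channel (the band limits and the plain periodogram estimator here are illustrative assumptions, not the pipeline used in the study):

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Mean periodogram power of signal x within [f_lo, f_hi) Hz."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    band = (freqs >= f_lo) & (freqs < f_hi)
    return psd[band].mean()

def theta_beta_ratio(x, fs):
    """Theta (4-8 Hz) to beta (13-30 Hz) power ratio, a common QEEG index."""
    return band_power(x, fs, 4.0, 8.0) / band_power(x, fs, 13.0, 30.0)

# Sanity check on a synthetic 6 Hz sine: theta power should dominate
fs = 256
t = np.arange(4 * fs) / fs
ratio = theta_beta_ratio(np.sin(2 * np.pi * 6 * t), fs)  # >> 1
```

In practice the periodogram would be replaced by an averaged estimator (e.g. Welch's method) over artifact-free epochs, but the band-ratio logic is the same.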

  14. Spatial attention and reading ability: ERP correlates of flanker and cue-size effects in good and poor adult phonological decoders.

    Science.gov (United States)

    Matthews, Allison Jane; Martin, Frances Heritage

    2015-12-01

    To investigate facilitatory and inhibitory processes during selective attention among adults with good (n=17) and poor (n=14) phonological decoding skills, a go/nogo flanker task was completed while EEG was recorded. Participants responded to a middle target letter flanked by compatible or incompatible flankers. The target was surrounded by a small or large circular cue which was presented simultaneously or 500ms prior. Poor decoders showed a greater RT cost for incompatible stimuli preceded by large cues and less RT benefit for compatible stimuli. Poor decoders also showed reduced modulation of ERPs by cue-size at left hemisphere posterior sites (N1) and by flanker compatibility at right hemisphere posterior sites (N1) and frontal sites (N2), consistent with processing differences in fronto-parietal attention networks. These findings have potential implications for understanding the relationship between spatial attention and phonological decoding in dyslexia. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Visual Input Enhances Selective Speech Envelope Tracking in Auditory Cortex at a ‘Cocktail Party’

    Science.gov (United States)

    Golumbic, Elana Zion; Cogan, Gregory B.; Schroeder, Charles E.; Poeppel, David

    2013-01-01

    Our ability to selectively attend to one auditory signal amidst competing input streams, epitomized by the ‘Cocktail Party’ problem, continues to stimulate research from various approaches. How this demanding perceptual feat is achieved from a neural systems perspective remains unclear and controversial. It is well established that neural responses to attended stimuli are enhanced compared to responses to ignored ones, but responses to ignored stimuli are nonetheless highly significant, leading to interference in performance. We investigated whether congruent visual input of an attended speaker enhances cortical selectivity in auditory cortex, leading to diminished representation of ignored stimuli. We recorded magnetoencephalographic (MEG) signals from human participants as they attended to segments of natural continuous speech. Using two complementary methods of quantifying the neural response to speech, we found that viewing a speaker’s face enhances the capacity of auditory cortex to track the temporal speech envelope of that speaker. This mechanism was most effective in a ‘Cocktail Party’ setting, promoting preferential tracking of the attended speaker, whereas without visual input no significant attentional modulation was observed. These neurophysiological results underscore the importance of visual input in resolving perceptual ambiguity in a noisy environment. Since visual cues in speech precede the associated auditory signals, they likely serve a predictive role in facilitating auditory processing of speech, perhaps by directing attentional resources to appropriate points in time when to-be-attended acoustic input is expected to arrive. PMID:23345218

  16. Working memory capacity and visual-verbal cognitive load modulate auditory-sensory gating in the brainstem: toward a unified view of attention.

    Science.gov (United States)

    Sörqvist, Patrik; Stenfelt, Stefan; Rönnberg, Jerker

    2012-11-01

    Two fundamental research questions have driven attention research in the past: One concerns whether selection of relevant information among competing, irrelevant, information takes place at an early or at a late processing stage; the other concerns whether the capacity of attention is limited by a central, domain-general pool of resources or by independent, modality-specific pools. In this article, we contribute to these debates by showing that the auditory-evoked brainstem response (an early stage of auditory processing) to task-irrelevant sound decreases as a function of central working memory load (manipulated with a visual-verbal version of the n-back task). Furthermore, individual differences in central/domain-general working memory capacity modulated the magnitude of the auditory-evoked brainstem response, but only in the high working memory load condition. The results support a unified view of attention whereby the capacity of a late/central mechanism (working memory) modulates early precortical sensory processing.

  17. Prestimulus subsequent memory effects for auditory and visual events.

    Science.gov (United States)

    Otten, Leun J; Quayle, Angela H; Puvaneswaran, Bhamini

    2010-06-01

    It has been assumed that the effective encoding of information into memory primarily depends on neural activity elicited when an event is initially encountered. Recently, it has been shown that memory formation also relies on neural activity just before an event. The precise role of such activity in memory is currently unknown. Here, we address whether prestimulus activity affects the encoding of auditory and visual events, is set up on a trial-by-trial basis, and varies as a function of the type of recognition judgment an item later receives. Electrical brain activity was recorded from the scalps of 24 healthy young adults while they made semantic judgments on randomly intermixed series of visual and auditory words. Each word was preceded by a cue signaling the modality of the upcoming word. Auditory words were preceded by auditory cues and visual words by visual cues. A recognition memory test with remember/know judgments followed after a delay of about 45 min. As observed previously, a negative-going, frontally distributed modulation just before visual word onset predicted later recollection of the word. Crucially, the same effect was found for auditory words and observed on stay as well as switch trials. These findings emphasize the flexibility and general role of prestimulus activity in memory formation, and support a functional interpretation of the activity in terms of semantic preparation. At least with an unpredictable trial sequence, the activity is set up anew on each trial.

  18. Express attentional re-engagement but delayed entry into consciousness following invalid spatial cues in visual search.

    Directory of Open Access Journals (Sweden)

    Benoit Brisson

    Full Text Available BACKGROUND: In predictive spatial cueing studies, reaction times (RT are shorter for targets appearing at cued locations (valid trials than at other locations (invalid trials. An increase in the amplitude of early P1 and/or N1 event-related potential (ERP components is also present for items appearing at cued locations, reflecting early attentional sensory gain control mechanisms. However, it is still unknown at which stage in the processing stream these early amplitude effects are translated into latency effects. METHODOLOGY/PRINCIPAL FINDINGS: Here, we measured the latency of two ERP components, the N2pc and the sustained posterior contralateral negativity (SPCN, to evaluate whether visual selection (as indexed by the N2pc and visual short-term memory processes (as indexed by the SPCN are delayed in invalid trials compared to valid trials. The P1 was larger contralateral to the cued side, indicating that attention was deployed to the cued location prior to the target onset. Despite these early amplitude effects, the N2pc onset latency was unaffected by cue validity, indicating an express, quasi-instantaneous re-engagement of attention in invalid trials. In contrast, latency effects were observed for the SPCN, and these were correlated with the RT effect. CONCLUSIONS/SIGNIFICANCE: Results show that latency differences that could explain the RT cueing effects must occur after visual selection processes giving rise to the N2pc, but at or before transfer into visual short-term memory, as reflected by the SPCN, at least in discrimination tasks in which the target is presented concurrently with at least one distractor. Given that the SPCN was previously associated with conscious report, these results further show that entry into consciousness is delayed following invalid cues.
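The RT cueing effect in such studies is simply the mean RT on invalid trials minus the mean RT on valid trials. A minimal sketch (the trial list is hypothetical, not data from this experiment):

```python
from statistics import mean

def cue_validity_effect(trials):
    """Mean invalid-trial RT minus mean valid-trial RT, in ms.

    `trials` is a list of (validity, rt_ms) pairs, with validity in
    {'valid', 'invalid'}. A positive value indicates faster responses
    at the cued location (a validity benefit).
    """
    valid = [rt for v, rt in trials if v == 'valid']
    invalid = [rt for v, rt in trials if v == 'invalid']
    return mean(invalid) - mean(valid)

# Hypothetical trials: valid-cue RTs are shorter
trials = [('valid', 340), ('valid', 360), ('invalid', 395), ('invalid', 405)]
effect = cue_validity_effect(trials)  # positive -> validity benefit
```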

  19. Neural dynamics of object-based multifocal visual spatial attention and priming: object cueing, useful-field-of-view, and crowding.

    Science.gov (United States)

    Foley, Nicholas C; Grossberg, Stephen; Mingolla, Ennio

    2012-08-01

    How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued locations? What factors underlie individual differences in the timing and frequency of such attentional shifts? How do transient and sustained spatial attentional mechanisms work and interact? How can volition, mediated via the basal ganglia, influence the span of spatial attention? A neural model is developed of how spatial attention in the where cortical stream coordinates view-invariant object category learning in the what cortical stream under free viewing conditions. The model simulates psychological data about the dynamics of covert attention priming and switching requiring multifocal attention without eye movements. The model predicts how "attentional shrouds" are formed when surface representations in cortical area V4 resonate with spatial attention in posterior parietal cortex (PPC) and prefrontal cortex (PFC), while shrouds compete among themselves for dominance. Winning shrouds support invariant object category learning, and active surface-shroud resonances support conscious surface perception and recognition. Attentive competition between multiple objects and cues simulates reaction-time data from the two-object cueing paradigm. The relative strength of sustained surface-driven and fast-transient motion-driven spatial attention controls individual differences in reaction time for invalid cues. Competition between surface-driven attentional shrouds controls individual differences in detection rate of peripheral targets in useful-field-of-view tasks. The model proposes how the strength of competition can be mediated, though learning or momentary changes in volition, by the basal ganglia. A new explanation of

  20. Perceptual Plasticity for Auditory Object Recognition

    Science.gov (United States)

    Heald, Shannon L. M.; Van Hedger, Stephen C.; Nusbaum, Howard C.

    2017-01-01

    In our auditory environment, we rarely experience the exact acoustic waveform twice. This is especially true for communicative signals that have meaning for listeners. In speech and music, the acoustic signal changes as a function of the talker (or instrument), speaking (or playing) rate, and room acoustics, to name a few factors. Yet, despite this acoustic variability, we are able to recognize a sentence or melody as the same across various kinds of acoustic inputs and determine meaning based on listening goals, expectations, context, and experience. The recognition process relates acoustic signals to prior experience despite variability in signal-relevant and signal-irrelevant acoustic properties, some of which could be considered as “noise” in service of a recognition goal. However, some acoustic variability, if systematic, is lawful and can be exploited by listeners to aid in recognition. Perceivable changes in systematic variability can herald a need for listeners to reorganize perception and reorient their attention to more immediately signal-relevant cues. This view is not incorporated currently in many extant theories of auditory perception, which traditionally reduce psychological or neural representations of perceptual objects and the processes that act on them to static entities. While this reduction is likely done for the sake of empirical tractability, such a reduction may seriously distort the perceptual process to be modeled. We argue that perceptual representations, as well as the processes underlying perception, are dynamically determined by an interaction between the uncertainty of the auditory signal and constraints of context. This suggests that the process of auditory recognition is highly context-dependent in that the identity of a given auditory object may be intrinsically tied to its preceding context. To argue for the flexible neural and psychological updating of sound-to-meaning mappings across speech and music, we draw upon examples

  1. Auditory Integration Training

    Directory of Open Access Journals (Sweden)

    Zahra Jafari

    2002-07-01

    Full Text Available Auditory integration training (AIT) is a hearing enhancement training process for sensory input anomalies found in individuals with autism, attention deficit hyperactivity disorder, dyslexia, hyperactivity, learning disability, language impairments, pervasive developmental disorder, central auditory processing disorder, attention deficit disorder, depression, and hyperacute hearing. AIT, recently introduced in the United States, has received much notice of late following the release of The Sound of a Miracle, by Annabel Stehli. In her book, Mrs. Stehli describes before and after auditory integration training experiences with her daughter, who was diagnosed at age four as having autism.

  2. S-ketamine influences strategic allocation of attention but not exogenous capture of attention.

    Science.gov (United States)

    Fuchs, Isabella; Ansorge, Ulrich; Huber-Huber, Christoph; Höflich, Anna; Lanzenberger, Rupert

    2015-09-01

    We investigated whether s-ketamine differentially affects strategic allocation of attention. In Experiment 1, (1) a less visible cue was weakly masked by the onsets of competing placeholders or (2) a better visible cue was not masked because it was presented in isolation. Both types of cue appeared more often opposite of the target (75%) than at target position (25%). With this setup, we tested for strategic attention shifts to the opposite side of the cues and for exogenous attentional capture toward the cue's side in a short cue-target interval, as well as for (reverse) cueing effects in a long cue-target interval after s-ketamine and after placebo treatment in a double-blind within-participant design. We found reduced strategic attention shifts after cues presented without placeholders for the s-ketamine compared to the placebo treatment in the short interval, indicating an early effect on the strategic allocation of attention. No differences between the two treatments were found for exogenous attentional capture by less visible cues, suggesting that s-ketamine does not affect exogenous attentional capture in the presence of competing distractors. Experiment 2 confirmed that the competing onsets of the placeholders prevented the strategic cueing effect. Taken together, the results indicate that s-ketamine affects strategic attentional capture, but not exogenous attentional capture. The findings point to a more prominent role of s-ketamine during top-down controlled forms of attention that require suppression of automatic capture than during automatic capture itself. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. An auditory cue-depreciation effect.

    Science.gov (United States)

    Gibson, J M; Watkins, M J

    1991-01-01

    An experiment is reported in which subjects first heard a list of words and then tried to identify these same words from degraded utterances. Paralleling previous findings in the visual modality, the probability of identifying a given utterance was reduced when the utterance was immediately preceded by other, more degraded, utterances of the same word. A second experiment replicated this "cue-depreciation effect" and in addition found the effect to be weakened, if not eliminated, when the target word was not included in the initial list or when the test was delayed by two days.

  4. Interaction of Object Binding Cues in Binaural Masking Pattern Experiments.

    Science.gov (United States)

    Verhey, Jesko L; Lübken, Björn; van de Par, Steven

    2016-01-01

    Object binding cues such as binaural and across-frequency modulation cues are likely to be used by the auditory system to separate sounds from different sources in complex auditory scenes. The present study investigates the interaction of these cues in a binaural masking pattern paradigm where a sinusoidal target is masked by a narrowband noise. It was hypothesised that beating between signal and masker may contribute to signal detection when signal and masker do not spectrally overlap, but that this cue could not be used in combination with interaural cues. To test this hypothesis, an additional sinusoidal interferer was added to the noise masker with a lower frequency than the noise, whereas the target had a higher frequency than the noise. Thresholds increase when the interferer is added. This effect is largest when the spectral interferer-masker and masker-target distances are equal. The result supports the hypothesis that modulation cues contribute to signal detection in the classical masking paradigm and that these are analysed with modulation bandpass filters. A monaural model including an across-frequency modulation process is presented that accounts for this effect. Interestingly, the interferer also affects dichotic thresholds, indicating that modulation cues also play a role in binaural processing.

  5. Brain dynamics of visual attention during anticipation and encoding of threat- and safe-cues in spider-phobic individuals.

    Science.gov (United States)

    Michalowski, Jaroslaw M; Pané-Farré, Christiane A; Löw, Andreas; Hamm, Alfons O

    2015-09-01

    This study systematically investigated the sensitivity of the phobic attention system by measuring event-related potentials (ERPs) in spider-phobic and non-phobic volunteers in a context where spider and neutral pictures were presented (phobic threat condition) and in contexts where no phobic but unpleasant and neutral or only neutral pictures were displayed (phobia-irrelevant conditions). In a between-group study, participants were assigned to phobia-irrelevant conditions either before or after the exposure to spider pictures (pre-exposure vs post-exposure participants). Additionally, each picture was preceded by a fixation cross presented in one of three different colors that were informative about the category of an upcoming picture. In the phobic threat condition, spider-phobic participants showed a larger P1 than controls for all pictures and signal cues. Moreover, individuals with spider phobia who were sensitized by the exposure to phobic stimuli (i.e. post-exposure participants) responded with an increased P1 also in phobia-irrelevant conditions. In contrast, no group differences between spider-phobic and non-phobic individuals were observed in the P1-amplitudes during viewing of phobia-irrelevant stimuli in the pre-exposure group. In addition, cues signaling neutral pictures elicited decreased stimulus-preceding negativity (SPN) compared with cues signaling emotional pictures. Moreover, emotional pictures and cues signaling emotional pictures evoked larger early posterior negativity (EPN) and late positive potential (LPP) than neutral stimuli. Spider phobics showed greater selective attention effects than controls for phobia-relevant pictures (increased EPN and LPP) and cues (increased LPP and SPN). Increased sensitization of the attention system observed in spider-phobic individuals might facilitate fear conditioning and promote generalization of fear, playing an important role in the maintenance of anxiety disorders.

  6. Beyond the real world: attention debates in auditory mismatch negativity.

    Science.gov (United States)

    Chung, Kyungmi; Park, Jin Young

    2018-04-11

    The aim of this study was to address the potential for the auditory mismatch negativity (aMMN) to be used in applied event-related potential (ERP) studies by determining whether the aMMN would be an attention-dependent ERP component and could be differently modulated across visual tasks or virtual reality (VR) stimuli with different visual properties and visual complexity levels. A total of 80 participants, aged 19-36 years, were assigned to either a reading-task (21 men and 19 women) or a VR-task (22 men and 18 women) group. Two visual-task groups of healthy young adults were matched in age, sex, and handedness. All participants were instructed to focus only on the given visual tasks and ignore auditory change detection. While participants in the reading-task group read text slides, those in the VR-task group viewed three 360° VR videos in a random order and rated how visually complex the given virtual environment was immediately after each VR video ended. Inconsistent with the finding of a partial significant difference in perceived visual complexity in terms of brightness of virtual environments, both visual properties of distance and brightness showed no significant differences in the modulation of aMMN amplitudes. A further analysis was carried out to compare elicited aMMN amplitudes of a typical MMN task and an applied VR task. No significant difference in the aMMN amplitudes was found across the two groups who completed visual tasks with different visual-task demands. In conclusion, the aMMN is a reliable ERP marker of preattentive cognitive processing for auditory deviance detection.

  7. The effects of interstimulus interval on event-related indices of attention: an auditory selective attention test of perceptual load theory.

    Science.gov (United States)

    Gomes, Hilary; Barrett, Sophia; Duff, Martin; Barnhardt, Jack; Ritter, Walter

    2008-03-01

    We examined the impact of perceptual load by manipulating interstimulus interval (ISI) in two auditory selective attention studies that varied in the difficulty of the target discrimination. In the paradigm, channels were separated by frequency and target/deviant tones were softer in intensity. Three ISI conditions were presented: fast (300ms), medium (600ms) and slow (900ms). Behavioral (accuracy and RT) and electrophysiological measures (Nd, P3b) were observed. In both studies, participants evidenced poorer accuracy during the fast ISI condition than the slow suggesting that ISI impacted task difficulty. However, none of the three measures of processing examined, Nd amplitude, P3b amplitude elicited by unattended deviant stimuli, or false alarms to unattended deviants, were impacted by ISI in the manner predicted by perceptual load theory. The prediction based on perceptual load theory, that there would be more processing of irrelevant stimuli under conditions of low as compared to high perceptual load, was not supported in these auditory studies. Task difficulty/perceptual load impacts the processing of irrelevant stimuli in the auditory modality differently than predicted by perceptual load theory, and perhaps differently than in the visual modality.

  8. Active auditory experience in infancy promotes brain plasticity in Theta and Gamma oscillations

    Directory of Open Access Journals (Sweden)

    Gabriella Musacchia

    2017-08-01

    Full Text Available Language acquisition in infants is driven by on-going neural plasticity that is acutely sensitive to environmental acoustic cues. Recent studies showed that attention-based experience with non-linguistic, temporally-modulated auditory stimuli sharpens cortical responses. A previous ERP study from this laboratory showed that interactive auditory experience via behavior-based feedback (AEx), over a 6-week period from 4 to 7 months of age, confers a processing advantage compared to passive auditory exposure (PEx) or maturation alone (Naïve Control, NC). Here, we provide a follow-up investigation of the underlying neural oscillatory patterns in these three groups. In AEx infants, Standard stimuli with invariant frequency (STD) elicited greater Theta-band (4–6 Hz) activity in Right Auditory Cortex (RAC), as compared to NC infants, and Deviant stimuli with rapid frequency change (DEV) elicited larger responses in Left Auditory Cortex (LAC). PEx and NC counterparts showed less-mature bilateral patterns. AEx infants also displayed stronger Gamma (33–37 Hz) activity in the LAC during DEV discrimination, compared to NCs, while NC and PEx groups demonstrated bilateral activity in this band, if at all. This suggests that interactive acoustic experience with non-linguistic stimuli can promote a distinct, robust and precise cortical pattern during rapid auditory processing, perhaps reflecting mechanisms that support fine-tuning of early acoustic mapping.

  9. Assessment of rival males through the use of multiple sensory cues in the fruitfly Drosophila pseudoobscura.

    Directory of Open Access Journals (Sweden)

    Chris P Maguire

    Environments vary stochastically, and animals need to behave in ways that best fit the conditions in which they find themselves. The social environment is particularly variable, and responding appropriately to it can be vital for an animal's success. However, cues of social environment are not always reliable, and animals may need to balance accuracy against the risk of failing to respond if local conditions or interfering signals prevent them detecting a cue. Recent work has shown that many male Drosophila fruit flies respond to the presence of rival males, and that these responses increase their success in acquiring mates and fathering offspring. In Drosophila melanogaster, males detect rivals using auditory, tactile and olfactory cues. However, males fail to respond to rivals if any two of these senses are not functioning: a single cue is not enough to produce a response. Here we examined cue use in the detection of rival males in a distantly related Drosophila species, D. pseudoobscura, where auditory, olfactory, tactile and visual cues were manipulated to assess the importance of each sensory cue singly and in combination. In contrast to D. melanogaster, male D. pseudoobscura require intact olfactory and tactile cues to respond to rivals. Visual cues were not important for detecting rival D. pseudoobscura, while results on auditory cues appeared puzzling. This difference in cue use in two species in the same genus suggests that cue use is evolutionarily labile, and may evolve in response to ecological or life history differences between species.

  10. Assessment of attention threshold in rats by titration of visual cue duration during the five choice serial reaction time task

    Science.gov (United States)

    Martin, Thomas J.; Grigg, Amanda; Kim, Susy A.; Ririe, Douglas G.; Eisenach, James C.

    2014-01-01

    Background: The 5 choice serial reaction time task (5CSRTT) is commonly used to assess attention in rodents. We sought to develop a variant of the 5CSRTT that would speed training to objective success criteria, and to test whether this variant could determine attention capability in each subject. New method: Fischer 344 rats were trained to perform a variant of the 5CSRTT in which the duration of visual cue presentation (cue duration) was titrated between trials based upon performance. The cue duration was decreased when the subject made a correct response, or increased with incorrect responses or omissions. Additionally, test-day challenges were provided, consisting of lengthening the intertrial interval and inclusion of a visual distracting stimulus. Results: Rats readily titrated the cue duration to less than 1 sec in 25 training sessions or less (mean ± SEM, 22.9 ± 0.7), and the median cue duration (MCD) was calculated as a measure of attention threshold. Increasing the intertrial interval increased premature responses, decreased the number of trials completed, and increased the MCD. Decreasing the intertrial interval and the time allotted for consuming the food reward demonstrated that a minimum of 3.5 sec is required for rats to consume two food pellets and successfully attend to the next trial. Visual distraction in the form of a 3 Hz flashing light increased the MCD and both premature and time-out responses. Comparison with existing method: The titration variant of the 5CSRTT is a useful method that dynamically measures attention threshold across a wide range of subject performance, and significantly decreases the time required for training. Task challenges produce similar effects in the titration method as reported for the classical procedure. Conclusions: The titration 5CSRTT method is an efficient training procedure for assessing attention and can be utilized to assess the limit in performance ability across subjects and various schedule manipulations.
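
The between-trial titration rule described above is, in effect, a one-up/one-down staircase on cue duration. A minimal sketch of that rule and of the median-cue-duration (MCD) threshold estimate (starting duration, step size, and floor are illustrative assumptions, not values from the paper):

```python
import statistics

def titrate_cue_duration(outcomes, start=10.0, step=0.5, floor=0.2):
    """Track cue duration (sec) across trials: shorten it after a
    correct response, lengthen it after an incorrect response or
    omission. Returns the duration presented on each trial."""
    duration = start
    durations = []
    for correct in outcomes:
        durations.append(duration)
        duration = max(floor, duration - step) if correct else duration + step
    return durations

# Hypothetical trial outcomes (True = correct response):
outcomes = [True, True, False, True, True, True, False, True]
durations = titrate_cue_duration(outcomes)

# The median cue duration (MCD) serves as the attention-threshold
# measure, as in the abstract above.
mcd = statistics.median(durations)
```

With these toy outcomes the durations converge downward and the MCD summarizes where performance stabilizes; the real task additionally scores premature and time-out responses, which this sketch omits.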

  11. Developing Spatial Knowledge in the Absence of Vision: Allocentric and Egocentric Representations Generated by Blind People When Supported by Auditory Cues

    Directory of Open Access Journals (Sweden)

    Luca Latini Corazzini

    2010-10-01

    The study of visuospatial representations and visuospatial memory can profit from the analysis of the performance of specific groups. In particular, the surprising skills and limitations of blind people may be an important source of information. For example, converging evidence indicates that, even though blind individuals are able to develop both egocentric and allocentric space representations, the latter tend to be much more restricted than those in blindfolded sighted individuals. However, no study has yet explored whether this conclusion also holds when people receive practice with the spatial environment and are supported by auditory stimuli. The present research examined these issues with an experimental apparatus based on the Morris water maze (Morris et al., 1982). In this setup, blind people and blindfolded controls were given the opportunity to develop knowledge of the environment with the support of simultaneous auditory cues. The results show that even in this favourable case blind people spontaneously continue to rely on an egocentric spatial representation.

  12. Hippocampal volume and auditory attention on a verbal memory task with adult survivors of pediatric brain tumor.

    Science.gov (United States)

    Jayakar, Reema; King, Tricia Z; Morris, Robin; Na, Sabrina

    2015-03-01

    We examined the nature of verbal memory deficits and the possible hippocampal underpinnings in long-term adult survivors of childhood brain tumor. Thirty-five survivors (M = 24.10 ± 4.93 years at testing; 54% female), on average 15 years post-diagnosis, and 59 typically developing adults (M = 22.40 ± 4.35 years, 54% female) participated. Automated FMRIB Software Library (FSL) tools were used to measure hippocampal, putamen, and whole brain volumes. The California Verbal Learning Test-Second Edition (CVLT-II) was used to assess verbal memory. Hippocampal, F(1, 91) = 4.06, ηp² = .04; putamen, F(1, 91) = 11.18, ηp² = .11; and whole brain, F(1, 92) = 18.51, ηp² = .17, volumes were significantly lower for survivors than controls. The memory indices of auditory attention list span (Trial 1: F(1, 92) = 12.70, η² = .12) and final list learning (Trial 5: F(1, 92) = 6.01, η² = .06) were also significantly lower for survivors. Hippocampal volume was associated with auditory attention, but with none of the other CVLT-II indices. Secondary analyses for the effect of treatment factors are presented. Volumetric differences between survivors and controls exist for the whole brain and for subcortical structures on average 15 years post-diagnosis. Treatment factors seem to have a unique effect on subcortical structures. Memory differences between survivors and controls are largely contingent upon auditory attention list span. Only hippocampal volume is associated with the auditory attention list span component of verbal memory. These findings are particularly robust for survivors treated with radiation. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  13. Trading of dynamic interaural time and level difference cues and its effect on the auditory motion-onset response measured with electroencephalography.

    Science.gov (United States)

    Altmann, Christian F; Ueda, Ryuhei; Bucher, Benoit; Furukawa, Shigeto; Ono, Kentaro; Kashino, Makio; Mima, Tatsuya; Fukuyama, Hidenao

    2017-10-01

    Interaural time (ITD) and level differences (ILD) constitute the two main cues for sound localization in the horizontal plane. Despite extensive research in animal models and humans, the mechanism of how these two cues are integrated into a unified percept is still far from clear. In this study, our aim was to test with human electroencephalography (EEG) whether integration of dynamic ITD and ILD cues is reflected in the so-called motion-onset response (MOR), an evoked potential elicited by moving sound sources. To this end, ITD and ILD trajectories were determined individually by cue trading psychophysics. We then measured EEG while subjects were presented with either static click-trains or click-trains that contained a dynamic portion at the end. The dynamic part was created by combining ITD with ILD either congruently to elicit the percept of a right/leftward moving sound, or incongruently to elicit the percept of a static sound. In two experiments that differed in the method to derive individual dynamic cue trading stimuli, we observed an MOR with at least a change-N1 (cN1) component for both the congruent and incongruent conditions at about 160-190 ms after motion-onset. A significant change-P2 (cP2) component for both the congruent and incongruent ITD/ILD combination was found only in the second experiment peaking at about 250 ms after motion onset. In sum, this study shows that a sound which - by a combination of counter-balanced ITD and ILD cues - induces a static percept can still elicit a motion-onset response, indicative of independent ITD and ILD processing at the level of the MOR - a component that has been proposed to be, at least partly, generated in non-primary auditory cortex. Copyright © 2017 Elsevier Inc. All rights reserved.
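
Cue trading of the kind used in this study is commonly summarized by a trading ratio (microseconds of ITD judged equivalent to one decibel of ILD). A minimal sketch of how congruent cues reinforce a lateralized percept while counter-balanced cues cancel (the ratio, function names, and all values are illustrative assumptions; measured trading ratios vary widely across listeners and stimuli):

```python
def equivalent_itd(ild_db, trading_ratio_us_per_db=50.0):
    """Convert an ILD (dB) into the ITD (microseconds) that would,
    on its own, produce the same perceived lateral position."""
    return ild_db * trading_ratio_us_per_db

def net_lateralization_itd(itd_us, ild_db, trading_ratio_us_per_db=50.0):
    """Combine an explicit ITD with the ITD-equivalent of an ILD.
    Positive values lateralize toward one ear; zero predicts a
    centered, static percept."""
    return itd_us + equivalent_itd(ild_db, trading_ratio_us_per_db)

# Congruent pair: both cues push the auditory image the same way.
congruent = net_lateralization_itd(200.0, 4.0)     # 200 + 200 µs
# Incongruent pair: the ILD is chosen to cancel the ITD, so the
# stimulus is predicted to sound static despite both cues changing.
incongruent = net_lateralization_itd(200.0, -4.0)  # 200 - 200 µs
```

The study's key observation is that even the cancelling (incongruent) combination still evoked a motion-onset response, which is what suggests the two cues are processed independently at that stage.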

  14. Visuospatial information processing load and the ratio between parietal cue and target P3 amplitudes in the Attentional Network Test.

    Science.gov (United States)

    Abramov, Dimitri M; Pontes, Monique; Pontes, Adailton T; Mourao-Junior, Carlos A; Vieira, Juliana; Quero Cunha, Carla; Tamborino, Tiago; Galhanone, Paulo R; deAzevedo, Leonardo C; Lazarev, Vladimir V

    2017-04-24

    In ERP studies of cognitive processes during attentional tasks, the cue signals containing information about the target can increase the amplitude of the parietal cue P3 in relation to the 'neutral' temporal cue, and reduce the subsequent target P3 when this information is valid, i.e. corresponds to the target's attributes. The present study compared the cue-to-target P3 ratios in neutral and visuospatial cueing, in order to estimate the contribution of valid visuospatial information from the cue to target stages of the task performance, in terms of cognitive load. The P3 characteristics were also correlated with the results of individuals' performance of the visuospatial tasks, in order to estimate the relationship of the observed ERP with spatial reasoning. In 20 typically developing boys, aged 10-13 years (11.3 ± 0.86), the intelligence quotient (I.Q.) was estimated by the Block Design and Vocabulary subtests from the WISC-III. The subjects performed the Attentional Network Test (ANT) accompanied by EEG recording. The cued two-choice task had three equiprobable cue conditions: No cue, with no information about the target; Neutral (temporal) cue, with an asterisk in the center of the visual field, predicting the target onset; and Spatial cues, with an asterisk in the upper or lower hemifield, predicting the onset and corresponding location of the target. The ERPs were estimated for the mid-frontal (Fz) and mid-parietal (Pz) scalp derivations. In the Pz, the Neutral cue P3 had a lower amplitude than the Spatial cue P3; whereas for the target ERPs, the P3 of the Neutral cue condition was larger than that of the Spatial cue condition. However, the sums of the magnitudes of the cue and target P3 were equal in the spatial and neutral cueing, probably indicating that in both cases the equivalent information processing load is included in either the cue or the target reaction, respectively. Meanwhile, in the Fz, the analog ERP components for both the cue and target…

  15. Selective attention and avoidance on a pictorial cueing task during stress in clinically anxious and depressed participants.

    Science.gov (United States)

    Ellenbogen, Mark A; Schwartzman, Alex E

    2009-02-01

    Although it is well established that attentional biases exist in anxious populations, the specific components of visual orienting towards and away from emotional stimuli are not well delineated. The present study was designed to examine these processes. We used a modified spatial cueing task to assess the speed of engagement and disengagement from supraliminal and masked pictorial cues depicting threat, dysphoria, or neutral content in 36 clinically anxious, 41 depressed and 41 control participants. Participants were randomly assigned to a stress or neutral condition. During stress, anxious participants were slow to disengage from masked left hemifield pictures depicting threat or dysphoria, but were quick to disengage from supraliminal threat pictures. Information processing in anxious participants during stress was characterized by early selective attention to emotional stimuli, occurring prior to full conscious awareness, followed by effortful avoidance of threat. Depressed participants were distinct from the anxious group, displaying selective attention for stimuli depicting dysphoria, but not threat, during the neutral condition. In sum, attentional biases in clinical populations are associated with difficulties in the disengagement component of visual orienting. Further, a vigilant-avoidant pattern of attentional bias may represent a strategic attempt to compensate for the early activation of a fear response.
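
The engagement/disengagement decomposition used in spatial cueing studies like this one compares emotional and neutral cues separately on valid trials (target at the cued location) and invalid trials (target opposite the cue). A minimal sketch of those contrasts (function names and RT values are hypothetical, not the study's data):

```python
def validity_effect(rt_invalid, rt_valid):
    """Cue validity effect (ms): slower responses when the target
    appears at the uncued location."""
    return rt_invalid - rt_valid

def engagement_bias(rt_valid_neutral, rt_valid_emotional):
    """Speeded engagement: positive values mean the emotional cue
    drew attention faster than a neutral cue on valid trials."""
    return rt_valid_neutral - rt_valid_emotional

def disengagement_bias(rt_invalid_emotional, rt_invalid_neutral):
    """Delayed disengagement: positive values mean attention left
    the emotional cue more slowly on invalid trials."""
    return rt_invalid_emotional - rt_invalid_neutral

# Hypothetical mean RTs (ms) illustrating a threat cue that leaves
# engagement unchanged but delays disengagement, the pattern the
# abstract reports for masked threat pictures under stress.
engagement = engagement_bias(520.0, 520.0)
disengagement = disengagement_bias(585.0, 560.0)
```

Separating the two contrasts is what lets such studies attribute an overall validity effect to faster capture by the cue, slower release from it, or both.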

  16. Obese adults have visual attention bias for food cue images: evidence for altered reward system function.

    Science.gov (United States)

    Castellanos, E H; Charboneau, E; Dietrich, M S; Park, S; Bradley, B P; Mogg, K; Cowan, R L

    2009-09-01

    The major aim of this study was to investigate whether the motivational salience of food cues (as reflected by their attention-grabbing properties) differs between obese and normal-weight subjects in a manner consistent with altered reward system function in obesity. A total of 18 obese and 18 normal-weight, otherwise healthy, adult women between the ages of 18 and 35 participated in an eye-tracking paradigm in combination with a visual probe task. Eye movements and reaction time to food and non-food images were recorded during both fasted and fed conditions in a counterbalanced design. Eating behavior and hunger level were assessed by self-report measures. Obese individuals had higher scores than normal-weight individuals on self-report measures of responsiveness to external food cues and vulnerability to disruptions in control of eating behavior. Both obese and normal-weight individuals demonstrated increased gaze duration for food compared to non-food images in the fasted condition. In the fed condition, however, despite reduced hunger in both groups, obese individuals maintained the increased attention to food images, whereas normal-weight individuals had similar gaze duration for food and non-food images. Additionally, obese individuals had preferential orienting toward food images at the onset of each image. Obese and normal-weight individuals did not differ in reaction time measures in the fasted or fed condition. Food cue incentive salience is elevated equally in normal-weight and obese individuals during fasting. Obese individuals retain incentive salience for food cues despite feeding and decreased self-report of hunger. Sensitization to food cues in the environment and their dysregulation in obese individuals may play a role in the development and/or maintenance of obesity.

  17. Comprehensive evaluation of a child with an auditory brainstem implant.

    Science.gov (United States)

    Eisenberg, Laurie S; Johnson, Karen C; Martinez, Amy S; DesJardin, Jean L; Stika, Carren J; Dzubak, Danielle; Mahalak, Mandy Lutz; Rector, Emily P

    2008-02-01

    We had an opportunity to evaluate an American child whose family traveled to Italy to receive an auditory brainstem implant (ABI). The goal of this evaluation was to obtain insight into possible benefits derived from the ABI and to begin developing assessment protocols for pediatric clinical trials. Case study. Tertiary referral center. Pediatric ABI Patient 1 was born with auditory nerve agenesis. Auditory brainstem implant surgery was performed in December, 2005, in Verona, Italy. The child was assessed at the House Ear Institute, Los Angeles, in July 2006 at the age of 3 years 11 months. Follow-up assessment has continued at the HEAR Center in Birmingham, Alabama. Auditory brainstem implant. Performance was assessed for the domains of audition, speech and language, intelligence and behavior, quality of life, and parental factors. Patient 1 demonstrated detection of sound, speech pattern perception with visual cues, and inconsistent auditory-only vowel discrimination. Language age with signs was approximately 2 years, and vocalizations were increasing. Of normal intelligence, he exhibited attention deficits with difficulty completing structured tasks. Twelve months later, this child was able to identify speech patterns consistently; closed-set word identification was emerging. These results were within the range of performance for a small sample of similarly aged pediatric cochlear implant users. Pediatric ABI assessment with a group of well-selected children is needed to examine risk versus benefit in this population and to analyze whether open-set speech recognition is achievable.

  18. Visual attention to food cues is differentially modulated by gustatory-hedonic and post-ingestive attributes.

    Science.gov (United States)

    Garcia-Burgos, David; Lao, Junpeng; Munsch, Simone; Caldara, Roberto

    2017-07-01

    Although attentional biases towards food cues may play a critical role in food choices and eating behaviours, it remains largely unexplored which specific food attribute governs visual attentional deployment. The allocation of visual attention might be modulated by anticipatory postingestive consequences, by taste sensations derived from eating itself, or both. Therefore, in order to obtain a comprehensive understanding of the attentional mechanisms involved in the processing of food-related cues, we recorded eye movements to five categories of well-standardised pictures: neutral non-food, high-calorie, good taste, distaste and dangerous food. In particular, forty-four healthy adults of both sexes were assessed with an antisaccade paradigm (which requires the generation of a voluntary saccade and the suppression of a reflexive one) and a free viewing paradigm (which involves the free visual exploration of two images). The results showed that observers directed their initial fixations more often and faster on items with high survival relevance such as nutrients and possible dangers, although an increase in antisaccade error rates was only detected for high-calorie items. We also found longer prosaccade fixation durations and initial fixation duration bias scores reflecting maintained attention towards the high-calorie, good taste and danger categories, while shorter reaction times to correct an incorrect prosaccade reflected less difficulty in inhibiting distasteful images. Altogether, these findings suggest that visual attention is differentially modulated by both accepted and rejected food attributes, but also that normal-weight, non-eating-disordered individuals exhibit enhanced approach to food's postingestive effects and avoidance of distasteful items (such as bitter vegetables or pungent products). Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Emotionally negative pictures increase attention to a subsequent auditory stimulus.

    Science.gov (United States)

    Tartar, Jaime L; de Almeida, Kristen; McIntosh, Roger C; Rosselli, Monica; Nash, Allan J

    2012-01-01

    Emotionally negative stimuli serve as a mechanism of biological preparedness to enhance attention. We hypothesized that emotionally negative stimuli would also serve as motivational priming to increase attention resources for subsequent stimuli. To that end, we tested 11 participants in a dual sensory modality task, wherein emotionally negative pictures were contrasted with emotionally neutral pictures and each picture was followed 600 ms later by a tone in an auditory oddball paradigm. Each trial began with a picture displayed for 200 ms; half of the trials began with an emotionally negative picture and half of the trials began with an emotionally neutral picture; 600 ms following picture presentation, the participants heard either an oddball tone or a standard tone. At the end of each trial (picture followed by tone), the participants categorized, with a button press, the picture and tone combination. As expected, and consistent with previous studies, we found an enhanced visual late positive potential (latency range=300-700 ms) to the negative picture stimuli. We further found that compared to neutral pictures, negative pictures resulted in early attention and orienting effects to subsequent tones (measured through an enhanced N1 and N2) and sustained attention effects only to the subsequent oddball tones (measured through late processing negativity, latency range=400-700 ms). Number pad responses to both the picture and tone category showed the shortest response latencies and greatest percentage of correct picture-tone categorization on the negative picture followed by oddball tone trials. Consistent with previous work on natural selective attention, our results support the idea that emotional stimuli can alter attention resource allocation. This finding has broad implications for human attention and performance as it specifically shows the conditions in which an emotionally negative stimulus can result in extended stimulus evaluation.

  20. Attentional bias and disinhibition toward gaming cues are related to problem gaming in male adolescents.

    Science.gov (United States)

    van Holst, Ruth J; Lemmens, Jeroen S; Valkenburg, Patti M; Peter, Jochen; Veltman, Dick J; Goudriaan, Anna E

    2012-06-01

    The aim of this study was to examine whether behavioral tendencies commonly related to addictive behaviors are also related to problematic computer and video game playing in adolescents. The study of attentional bias and response inhibition, characteristic for addictive disorders, is relevant to the ongoing discussion on whether problematic gaming should be classified as an addictive disorder. We tested the relation between self-reported levels of problem gaming and two behavioral domains: attentional bias and response inhibition. Ninety-two male adolescents performed two attentional bias tasks (addiction-Stroop, dot-probe) and a behavioral inhibition task (go/no-go). Self-reported problem gaming was measured by the game addiction scale, based on the Diagnostic and Statistical Manual of Mental Disorders-fourth edition criteria for pathological gambling and time spent on computer and/or video games. Male adolescents with higher levels of self-reported problem gaming displayed signs of error-related attentional bias to game cues. Higher levels of problem gaming were also related to more errors on response inhibition, but only when game cues were presented. These findings are in line with the findings of attentional bias reported in clinically recognized addictive disorders, such as substance dependence and pathological gambling, and contribute to the discussion on the proposed concept of "Addiction and Related Disorders" (which may include non-substance-related addictive behaviors) in the Diagnostic and Statistical Manual of Mental Disorders-fourth edition. Copyright © 2012 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  1. Contextual cueing of pop-out visual search: when context guides the deployment of attention.

    Science.gov (United States)

    Geyer, Thomas; Zehetleitner, Michael; Müller, Hermann J

    2010-05-01

    Visual context information can guide attention in demanding (i.e., inefficient) search tasks. When participants are repeatedly presented with identically arranged ('repeated') displays, reaction times are faster relative to newly composed ('non-repeated') displays. The present article examines whether this 'contextual cueing' effect also operates in simple (i.e., efficient) search tasks and, if so, whether it influences target, rather than response, selection. Singleton-feature targets were detected faster when the search items were presented in repeated, rather than non-repeated, arrangements. Importantly, repeated, relative to novel, displays also led to an increase in signal detection accuracy. Thus, contextual cueing can expedite the selection of pop-out targets, most likely by enhancing feature contrast signals at the overall-salience computation stage.
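
The two dependent measures in this kind of study, the RT benefit for repeated displays and signal detection accuracy, can be sketched as follows (all names and values are illustrative, not the study's data; d' is the standard z(hit) - z(false alarm) sensitivity index):

```python
from statistics import NormalDist, mean

def contextual_cueing_ms(rt_repeated, rt_novel):
    """Contextual cueing effect: mean RT benefit (ms) for repeated
    over newly composed display arrangements."""
    return mean(rt_novel) - mean(rt_repeated)

def d_prime(hit_rate, false_alarm_rate):
    """Signal detection sensitivity: z(hit) - z(false alarm)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical per-trial RTs (ms):
rt_repeated = [420.0, 430.0, 410.0]
rt_novel = [450.0, 460.0, 440.0]
benefit = contextual_cueing_ms(rt_repeated, rt_novel)

# Higher accuracy for repeated displays appears as a larger d'
# at the same false-alarm rate (rates here are made up).
dp_repeated = d_prime(0.95, 0.05)
dp_novel = d_prime(0.85, 0.05)
```

Reporting the accuracy effect as d' rather than raw percent correct is what licenses the article's claim that repetition improves target selection itself, not merely response bias.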

  2. The contribution of dynamic visual cues to audiovisual speech perception.

    Science.gov (United States)

    Jaekl, Philip; Pesquita, Ana; Alsius, Agnes; Munhall, Kevin; Soto-Faraco, Salvador

    2015-08-01

    Seeing a speaker's facial gestures can significantly improve speech comprehension, especially in noisy environments. However, the nature of the visual information from the speaker's facial movements that is relevant for this enhancement is still unclear. Like auditory speech signals, visual speech signals unfold over time and contain both dynamic configural information and luminance-defined local motion cues; two information sources that are thought to engage anatomically and functionally separate visual systems. Whereas some past studies have highlighted the importance of local, luminance-defined motion cues in audiovisual speech perception, the contribution of dynamic configural information signalling changes in form over time has not yet been assessed. We therefore attempted to single out the contribution of dynamic configural information to audiovisual speech processing. To this aim, we measured word identification performance in noise using unimodal auditory stimuli, and with audiovisual stimuli. In the audiovisual condition, speaking faces were presented as point light displays achieved via motion capture of the original talker. Point light displays could be isoluminant, to minimise the contribution of effective luminance-defined local motion information, or with added luminance contrast, allowing the combined effect of dynamic configural cues and local motion cues. Audiovisual enhancement was found in both the isoluminant and contrast-based luminance conditions compared to an auditory-only condition, demonstrating, for the first time, the specific contribution of dynamic configural cues to audiovisual speech improvement. These findings imply that globally processed changes in a speaker's facial shape contribute significantly towards the perception of articulatory gestures and the analysis of audiovisual speech. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Evidence for impairments in using static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism.

    Science.gov (United States)

    Goldberg, Melissa C; Mostow, Allison J; Vecera, Shaun P; Larson, Jennifer C Gidley; Mostofsky, Stewart H; Mahone, E Mark; Denckla, Martha B

    2008-09-01

    We examined the ability to use static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism (HFA) compared to typically developing children (TD). The task was organized such that on valid trials, gaze cues were directed toward the same spatial location as the appearance of an upcoming target, while on invalid trials gaze cues were directed to an opposite location. Unlike TD children, children with HFA showed no advantage in reaction time (RT) on valid trials compared to invalid trials (i.e., no significant validity effect). The two stimulus onset asynchronies (200 ms, 700 ms) did not differentially affect these findings. The results suggest that children with HFA show impairments in utilizing static line drawings of gaze cues to orient visual-spatial attention.

  4. Low-level visual attention and its relation to joint attention in autism spectrum disorder.

    Science.gov (United States)

    Jaworski, Jessica L Bean; Eigsti, Inge-Marie

    2017-04-01

    Visual attention is integral to social interaction and is a critical building block for development in other domains (e.g., language). Furthermore, atypical attention (especially joint attention) is one of the earliest markers of autism spectrum disorder (ASD). The current study assesses low-level visual attention and its relation to social attentional processing in youth with ASD and typically developing (TD) youth, aged 7 to 18 years. The findings indicate difficulty overriding incorrect attentional cues in ASD, particularly with non-social (arrow) cues relative to social (face) cues. The findings also show reduced competition in ASD from cues that remain on-screen. Furthermore, social attention, autism severity, and age were all predictors of competing cue processing. The results suggest that individuals with ASD may be biased towards speeded rather than accurate responding, and further, that reduced engagement with visual information may impede responses to visual attentional cues. Once attention is engaged, individuals with ASD appear to interpret directional cues as meaningful. These findings from a controlled, experimental paradigm were mirrored in results from an ecologically valid measure of social attention. Attentional difficulties may be exacerbated during the complex and dynamic experience of actual social interaction. Implications for intervention are discussed.

  5. P2-23: Deficits on Preference but Not Attention in Patients with Depression: Evidence from Gaze Cue

    Directory of Open Access Journals (Sweden)

    Jingling Li

    2012-10-01

    Gaze is an important social cue and can easily capture attention. Our preference judgments are biased by others' gaze; that is, we prefer objects gazed at by happy or neutral faces and dislike objects gazed at by disgust faces. Since patients with depression have a negative bias in emotional perception, we hypothesized that they may make different preference judgments on gazed-at objects than healthy controls. Twenty-one patients with major depressive disorder and 21 healthy age-matched controls completed an object categorization task and then rated their preference for those objects. In the categorization task, a schematic face either gazed toward or away from the to-be-categorized object. The results showed that both groups categorized gazed-at objects faster than non-gazed objects, suggesting that patients did not have deficits in their attention to gaze cues. Nevertheless, healthy controls preferred gazed-at objects more than non-gazed objects, while patients showed no significant preference. Our results indicate that patients with depression have deficits in their social cognition rather than in basic attentional mechanisms.

  6. Review: Auditory Integration Training

    Directory of Open Access Journals (Sweden)

    Zahra Ja'fari

    2003-01-01

    Auditory integration training (AIT) is a hearing enhancement training process for sensory input anomalies found in individuals with autism, attention deficit hyperactivity disorder, dyslexia, hyperactivity, learning disability, language impairments, pervasive developmental disorder, central auditory processing disorder, attention deficit disorder, depression, and hyperacute hearing. AIT, recently introduced in the United States, has received much notice of late following the release of The Sound of a Miracle by Annabel Stehli. In her book, Mrs. Stehli describes before-and-after auditory integration training experiences with her daughter, who was diagnosed at age four as having autism.

  7. Visible propagation from invisible exogenous cueing.

    Science.gov (United States)

    Lin, Zhicheng; Murray, Scott O

    2013-09-20

    Perception and performance are affected not just by what we see but also by what we do not see: inputs that escape our awareness. While conscious processing and unconscious processing have been assumed to be separate and independent, here we report the propagation of unconscious exogenous cueing as determined by conscious motion perception. In a paradigm combining masked exogenous cueing and apparent motion, we show that, when an onset cue was rendered invisible, the unconscious exogenous cueing effect traveled, manifesting at uncued locations (4° apart) in accordance with conscious perception of visual motion; the effect diminished when the cue-to-target distance was 8°. In contrast, conscious exogenous cueing manifested at both distances. Further evidence reveals that the unconscious and conscious nonretinotopic effects could not be explained by an attentional gradient, nor by bottom-up, energy-based motion mechanisms; rather, they were subserved by top-down, tracking-based motion mechanisms. We thus term these effects mobile cueing. Taken together, unconscious mobile cueing effects (a) demonstrate a previously unknown degree of flexibility of unconscious exogenous attention; (b) embody a simultaneous dissociation and association of attention and consciousness, in which exogenous attention can occur without cue awareness ("dissociation"), yet at the same time its effect is contingent on conscious motion tracking ("association"); and (c) underscore the interaction of conscious and unconscious processing, providing evidence for an unconscious effect that is not automatic but controlled.

  8. Volitional Mechanisms Mediate the Cuing Effect of Pitch on Attention Orienting: The Influences of Perceptual Difficulty and Response Pressure.

    Science.gov (United States)

    Chiou, Rocco; Rich, Anina N

    2015-02-01

    Our cognitive system tends to link auditory pitch with spatial location in a specific manner (i.e., high-pitched sounds are usually associated with an upper location, and low sounds with a lower location). Recent studies have demonstrated that this cross-modality association biases the allocation of visual attention and affects performance despite the auditory stimuli being irrelevant to the behavioural task. There is, however, a discrepancy between studies in their interpretation of the underlying mechanisms. Whereas we have previously claimed that the pitch-location mapping is mediated by volitional shifts of attention (Chiou & Rich, 2012, Perception, 41, 339-353), other researchers suggest that this cross-modal effect reflects automatic shifts of attention (Mossbridge, Grabowecky, & Suzuki, 2011, Cognition, 121, 133-139). Here we report a series of three experiments examining the effects of perceptual and response-related pressure on the ability of nonpredictive pitch to bias visual attention. We compare it with two control cues: a predictive pitch that triggers voluntary attention shifts and a salient peripheral flash that evokes involuntary shifts. The results show that the effect of nonpredictive pitch is abolished by pressure at either the perceptual or the response level. By contrast, the effects of the two control cues remain significant, demonstrating the robustness of informative and perceptually salient stimuli in directing attention. This distinction suggests that, in contexts of high perceptual demand and response pressure, cognitive resources are primarily engaged by task-relevant stimuli, which effectively prevents uninformative pitch from orienting attention to its cross-modally associated location. These findings are consistent with the hypothesis that the link between pitch and location affects attentional deployment via volitional rather than automatic mechanisms. © 2015 SAGE Publications.

  9. From ear to body: the auditory-motor loop in spatial cognition.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Warusfel, Olivier

    2014-01-01

    Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimited area in order to find a hidden auditory target, i.e., a sound that was triggered only when walking on a precise location of the area. The position of this target could be coded in relation to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which participants had to memorize the location of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of the search paths revealed how auditory information was coded to memorize the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favor of the hypothesis that the brain has access to a modality-invariant representation of external space.
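    The abstract does not describe the HRTF synthesis itself, but the dominant azimuthal cue such a renderer produces, the interaural time difference (ITD), can be approximated with Woodworth's spherical-head formula. A minimal sketch under assumed head-radius and speed-of-sound values (not the authors' implementation):

    ```python
    import math

    SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C
    HEAD_RADIUS = 0.0875     # m; assumed average adult head radius

    def interaural_time_difference(azimuth_deg):
        """Woodworth's spherical-head approximation of the interaural
        time difference (seconds) for a far-field source.

        azimuth_deg: 0 = straight ahead, +90 = directly to the right.
        """
        theta = math.radians(azimuth_deg)
        return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

    # A source at 90° reaches the far ear roughly 0.66 ms later than the near ear.
    itd_90 = interaural_time_difference(90.0)
    ```

    A real binaural renderer applies measured HRTF filters per ear rather than a pure delay, but delays of this magnitude are what listeners exploit for left-right localization.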

  10. From ear to body: the auditory-motor loop in spatial cognition

    Directory of Open Access Journals (Sweden)

    Isabelle eViaud-Delmon

    2014-09-01

    Full Text Available Spatial memory is mainly studied through the visual sensory modality: navigation tasks in humans rarely integrate dynamic and spatial auditory information. In order to study how a spatial scene can be memorized on the basis of auditory and idiothetic cues only, we constructed an auditory equivalent of the Morris water maze, a task widely used to assess spatial learning and memory in rodents. Participants were equipped with wireless headphones, which delivered a soundscape updated in real time according to their movements in 3D space. A wireless tracking system (video infrared with passive markers) was used to send the coordinates of the subject's head to the sound rendering system. The rendering system used advanced HRTF-based synthesis of directional cues and room acoustic simulation for the auralization of a realistic acoustic environment. Participants were guided blindfolded in an experimental room. Their task was to explore a delimited area in order to find a hidden auditory target, i.e., a sound that was triggered only when walking on a precise location of the area. The position of this target could be coded in relationship to auditory landmarks constantly rendered during the exploration of the area. The task was composed of a practice trial, 6 acquisition trials during which they had to memorise the localisation of the target, and 4 test trials in which some aspects of the auditory scene were modified. The task ended with a probe trial in which the auditory target was removed. The configuration of the search paths revealed how auditory information was coded to memorise the position of the target, and suggested that space can be efficiently coded without visual information in normally sighted subjects. In conclusion, space representation can be based on sensorimotor and auditory cues only, providing another argument in favour of the hypothesis that the brain has access to a modality-invariant representation of external space.

  11. Auditory Motion Elicits a Visual Motion Aftereffect.

    Science.gov (United States)

    Berger, Christopher C; Ehrsson, H Henrik

    2016-01-01

    The visual motion aftereffect is a visual illusion in which exposure to continuous motion in one direction leads to a subsequent illusion of visual motion in the opposite direction. Previous findings have been mixed with regard to whether this visual illusion can be induced cross-modally by auditory stimuli. Based on research on multisensory perception demonstrating the profound influence auditory perception can have on the interpretation and perceived motion of visual stimuli, we hypothesized that exposure to auditory stimuli with strong directional motion cues should induce a visual motion aftereffect. Here, we demonstrate that horizontally moving auditory stimuli induced a significant visual motion aftereffect, an effect that was driven primarily by a change in visual motion perception following exposure to leftward moving auditory stimuli. This finding is consistent with the notion that visual and auditory motion perception rely on at least partially overlapping neural substrates.

  12. Auditory Motion Elicits a Visual Motion Aftereffect

    Directory of Open Access Journals (Sweden)

    Christopher C. Berger

    2016-12-01

    Full Text Available The visual motion aftereffect is a visual illusion in which exposure to continuous motion in one direction leads to a subsequent illusion of visual motion in the opposite direction. Previous findings have been mixed with regard to whether this visual illusion can be induced cross-modally by auditory stimuli. Based on research on multisensory perception demonstrating the profound influence auditory perception can have on the interpretation and perceived motion of visual stimuli, we hypothesized that exposure to auditory stimuli with strong directional motion cues should induce a visual motion aftereffect. Here, we demonstrate that horizontally moving auditory stimuli induced a significant visual motion aftereffect—an effect that was driven primarily by a change in visual motion perception following exposure to leftward moving auditory stimuli. This finding is consistent with the notion that visual and auditory motion perception rely on at least partially overlapping neural substrates.

  13. The "where" of social attention: Head and body direction aftereffects arise from representations specific to cue type and not direction alone.

    Science.gov (United States)

    Lawson, Rebecca P; Calder, Andrew J

    2016-01-01

    Human beings have remarkable social attention skills. From the initial processing of cues, such as eye gaze, head direction, and body orientation, we perceive where other people are attending, allowing us to draw inferences about the intentions, desires, and dispositions of others. But before we can infer why someone is attending to something in the world we must first accurately represent where they are attending. Here we investigate the "where" of social attention perception, and employ adaptation paradigms to ascertain how head and body orientation are visually represented in the human brain. Across two experiments we show that the representation of two cues to social attention (head and body orientation) exists at the category-specific level. This suggests that aftereffects do not arise from "social attention cells" discovered in macaques or from abstract representations of "leftness" or "rightness."

  14. Depression, not PTSD, is associated with attentional biases for emotional visual cues in early traumatized individuals with PTSD

    Directory of Open Access Journals (Sweden)

    Charlotte Elisabeth Wittekind

    2015-01-01

    Full Text Available Using variants of the emotional Stroop task (EST), a large number of studies have demonstrated attentional biases in individuals with PTSD across different types of trauma. However, the specificity and robustness of the emotional Stroop effect in PTSD have been questioned recently. In particular, the paradigm cannot disentangle the underlying cognitive mechanisms. Transgenerational studies provide evidence that the consequences of trauma are not limited to the traumatized people, but extend to close relatives, especially the children. To further investigate attentional biases in PTSD and to shed light on the underlying cognitive mechanism(s), a spatial-cueing paradigm with pictures of different emotional valence (neutral, anxiety, depression, trauma) was administered to individuals displaced as children during World War II with (n = 22) and without PTSD (n = 26), as well as to nontraumatized controls (n = 22). To assess whether parental PTSD is associated with biased information processing in children, one adult offspring of each participant was also included in the study. PTSD was not associated with attentional biases for trauma-related stimuli, and there was no evidence for a transgenerational transmission of biased information processing. However, when the samples were regrouped based on current depression, a reduced inhibition of return (IOR) effect emerged for depression-related cues. IOR refers to the phenomenon that, with longer intervals between cue and target, the validity effect is reversed: uncued locations are associated with shorter and cued locations with longer RTs. The results diverge from EST studies and demonstrate that findings on attentional biases yield equivocal results across different paradigms. Attentional biases for trauma-related material may appear only for verbal but not for visual stimuli in an elderly population with childhood trauma and PTSD. Future studies should more closely investigate whether findings from younger trauma populations also manifest in older populations.
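    The IOR pattern defined above amounts to a sign change in the validity effect (invalid-trial RT minus valid-trial RT) as the cue-target interval grows. A toy computation, with hypothetical reaction times rather than the study's data:

    ```python
    from statistics import mean

    def validity_effect(rt_invalid, rt_valid):
        """Cueing effect in ms: positive values mean facilitation at the
        cued location; negative values indicate inhibition of return."""
        return mean(rt_invalid) - mean(rt_valid)

    # Hypothetical reaction times (ms); illustrative only.
    short_soa = validity_effect(rt_invalid=[380, 395, 410],
                                rt_valid=[350, 360, 355])   # positive: facilitation
    long_soa = validity_effect(rt_invalid=[340, 352, 346],
                               rt_valid=[375, 390, 381])    # negative: IOR
    ```

    A "reduced IOR effect" in the study's terms would correspond to this long-interval difference being closer to zero for depression-related cues than for neutral ones.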

  15. Modality-specificity of Selective Attention Networks

    OpenAIRE

    Stewart, Hannah J.; Amitay, Sygal

    2015-01-01

    Objective: To establish the modality specificity and generality of selective attention networks. Method: Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resoluti...

  16. [Attention characteristics of children with different clinical subtypes of attention deficit hyperactivity disorder].

    Science.gov (United States)

    Liu, Wen-Long; Zhao, Xu; Tan, Jian-Hui; Wang, Juan

    2014-09-01

    To explore the attention characteristics of children with different clinical subtypes of attention deficit hyperactivity disorder (ADHD) and to provide a basis for clinical intervention. A total of 345 children diagnosed with ADHD were selected and the subtypes were identified. Attention assessment was performed with the integrated visual and auditory continuous performance test at diagnosis, and the visual and auditory attention characteristics were compared between children with different subtypes. A total of 122 normal children were recruited as a control group and their attention characteristics were compared with those of children with ADHD. The scores of full scale attention quotient (AQ) and full scale response control quotient (RCQ) of children with all three subtypes of ADHD were significantly lower than those of normal children (P<0.05), and scores of the predominantly inattentive subtype were significantly lower than those of the hyperactive/impulsive subtype (P<0.05). The attention function of children with ADHD is worse than that of normal children, and the impairment of visual attention function is more severe than that of auditory attention function. The degree of functional impairment of visual or auditory attention shows no significant differences among the three subtypes of ADHD.

  17. Nicotine, auditory sensory memory and attention in a human ketamine model of schizophrenia: moderating influence of a hallucinatory trait

    Directory of Open Access Journals (Sweden)

    Verner eKnott

    2012-09-01

    Full Text Available Background: The procognitive actions of the nicotinic acetylcholine receptor (nAChR) agonist nicotine are believed, in part, to motivate the excessive cigarette smoking in schizophrenia, a disorder associated with deficits in multiple cognitive domains, including low-level auditory sensory processes and higher-order attention-dependent operations. Objectives: As N-methyl-D-aspartate receptor (NMDAR) hypofunction has been shown to contribute to these cognitive impairments, the primary aims of this healthy-volunteer study were: (a) to shed light on the separate and interactive roles of nAChR and NMDAR systems in the modulation of auditory sensory memory (and sustained attention), as indexed by the auditory event-related brain potential mismatch negativity (MMN), and (b) to examine how these effects are moderated by a predisposition to auditory hallucinations/delusions (HD). Methods: In a randomized, double-blind, placebo-controlled design involving a low intravenous dose of ketamine (0.04 mg/kg) and a 4 mg dose of nicotine gum, MMN and performance on a rapid visual information processing (RVIP) task of sustained attention were examined in 24 healthy controls psychometrically stratified as being lower (L-HD, n = 12) or higher (H-HD, n = 12) in HD propensity. Results: Ketamine significantly slowed MMN, and reduced MMN in H-HD, with amplitude attenuation being blocked by the co-administration of nicotine. Nicotine significantly enhanced response speed (reaction time) and accuracy (increased % hits and d΄) and reduced false alarms on the RVIP, with improved performance accuracy being prevented when nicotine was administered with ketamine. Both % hits and d΄, as well as reaction time, were poorer in H-HD (vs. L-HD), and while hit rate and d΄ were increased by nicotine in H-HD, reaction time was slowed by ketamine in L-HD. Conclusions: Nicotine alleviated ketamine-induced sensory memory impairments and improved attention, particularly in individuals prone to HD.
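    The hit-rate and d΄ measures reported above come from signal detection theory: d΄ combines hit and false-alarm rates into a sensitivity index that is unaffected by response bias. A minimal illustration with made-up rates, not the study's data:

    ```python
    from statistics import NormalDist

    def d_prime(hit_rate, false_alarm_rate):
        """Signal detection sensitivity: d' = z(hits) - z(false alarms),
        where z is the inverse of the standard normal CDF."""
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(false_alarm_rate)

    # Illustrative rates only (not data from the study above).
    baseline = d_prime(0.80, 0.10)   # ≈ 2.12
    improved = d_prime(0.90, 0.05)   # ≈ 2.93
    ```

    An increase in d΄ with nicotine, as reported here, means hits rose and/or false alarms fell in a way a simple shift in response criterion cannot explain.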

  18. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson’s Disease

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E.; Soriano, Christina T.

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson's disease (PD). Results from controlled studies of group-delivered dance for people with mild-to-moderate stage PD have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered ones (tango). In all of these dance forms, specific movement patterns are initially learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that extends previously reported benefits of dance for people with PD. The method relies primarily on improvisational verbal auditory cueing, with less emphasis on directed movement instruction. It builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. Here, we present an introduction to the proposed method, detailing its methodological specifics and pointing to future directions. The viewpoint advances an embodied cognitive approach with ecological validity in helping people with PD meet the changing demands of daily living. PMID:26925029

  19. Comparison of congruence judgment and auditory localization tasks for assessing the spatial limits of visual capture.

    Science.gov (United States)

    Bosen, Adam K; Fleming, Justin T; Brown, Sarah E; Allen, Paul D; O'Neill, William E; Paige, Gary D

    2016-12-01

    Vision typically has better spatial accuracy and precision than audition and as a result often captures auditory spatial perception when visual and auditory cues are presented together. One determinant of visual capture is the amount of spatial disparity between auditory and visual cues: when disparity is small, visual capture is likely to occur, and when disparity is large, visual capture is unlikely. Previous experiments have used two methods to probe how visual capture varies with spatial disparity. First, congruence judgment assesses perceived unity between cues by having subjects report whether or not auditory and visual targets came from the same location. Second, auditory localization assesses the graded influence of vision on auditory spatial perception by having subjects point to the remembered location of an auditory target presented with a visual target. Previous research has shown that when both tasks are performed concurrently they produce similar measures of visual capture, but this may not hold when tasks are performed independently. Here, subjects alternated between tasks independently across three sessions. A Bayesian inference model of visual capture was used to estimate perceptual parameters for each session, which were compared across tasks. Results demonstrated that the range of audiovisual disparities over which visual capture was likely to occur was narrower in auditory localization than in congruence judgment, which the model indicates was caused by subjects adjusting their prior expectation that targets originated from the same location in a task-dependent manner.

  20. Comparison of Congruence Judgment and Auditory Localization Tasks for Assessing the Spatial Limits of Visual Capture

    Science.gov (United States)

    Bosen, Adam K.; Fleming, Justin T.; Brown, Sarah E.; Allen, Paul D.; O'Neill, William E.; Paige, Gary D.

    2016-01-01

    Vision typically has better spatial accuracy and precision than audition, and as a result often captures auditory spatial perception when visual and auditory cues are presented together. One determinant of visual capture is the amount of spatial disparity between auditory and visual cues: when disparity is small, visual capture is likely to occur, and when disparity is large, visual capture is unlikely. Previous experiments have used two methods to probe how visual capture varies with spatial disparity. First, congruence judgment assesses perceived unity between cues by having subjects report whether or not auditory and visual targets came from the same location. Second, auditory localization assesses the graded influence of vision on auditory spatial perception by having subjects point to the remembered location of an auditory target presented with a visual target. Previous research has shown that when both tasks are performed concurrently they produce similar measures of visual capture, but this may not hold when tasks are performed independently. Here, subjects alternated between tasks independently across three sessions. A Bayesian inference model of visual capture was used to estimate perceptual parameters for each session, which were compared across tasks. Results demonstrated that the range of audio-visual disparities over which visual capture was likely to occur was narrower in auditory localization than in congruence judgment, which the model indicates was caused by subjects adjusting their prior expectation that targets originated from the same location in a task-dependent manner. PMID:27815630
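    The abstract does not give the model's equations; the standard causal-inference formulation of audiovisual capture (Gaussian likelihoods, a zero-mean Gaussian spatial prior, and a prior probability of a shared source) computes the posterior probability of a common cause as sketched below. All parameter values are illustrative assumptions, not the authors' fits:

    ```python
    import math

    def common_cause_posterior(x_a, x_v, sigma_a, sigma_v, sigma_p, p_common):
        """Posterior probability that auditory (x_a) and visual (x_v)
        measurements arose from one source, given sensory noise sigma_a
        and sigma_v, spatial prior width sigma_p, and prior p_common."""
        # Likelihood of the measurement pair under a single shared source.
        var_c = (sigma_a**2 * sigma_v**2 + sigma_a**2 * sigma_p**2
                 + sigma_v**2 * sigma_p**2)
        like_c1 = (math.exp(-0.5 * ((x_a - x_v)**2 * sigma_p**2
                                    + x_a**2 * sigma_v**2
                                    + x_v**2 * sigma_a**2) / var_c)
                   / (2 * math.pi * math.sqrt(var_c)))
        # Likelihood under two independent sources.
        var_a = sigma_a**2 + sigma_p**2
        var_v = sigma_v**2 + sigma_p**2
        like_c2 = (math.exp(-0.5 * x_a**2 / var_a) / math.sqrt(2 * math.pi * var_a)
                   * math.exp(-0.5 * x_v**2 / var_v) / math.sqrt(2 * math.pi * var_v))
        return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Small audiovisual disparity -> capture likely; large disparity -> unlikely.
    near = common_cause_posterior(x_a=2.0, x_v=0.0, sigma_a=8.0, sigma_v=1.0,
                                  sigma_p=20.0, p_common=0.5)
    far = common_cause_posterior(x_a=30.0, x_v=0.0, sigma_a=8.0, sigma_v=1.0,
                                 sigma_p=20.0, p_common=0.5)
    ```

    In this framework, the task-dependent prior the authors describe corresponds to fitting different values of `p_common` for the localization and congruence-judgment sessions.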

  1. Auditory Automotive Mechanics Diagnostic Achievement Test. Center Technical Paper No. 2.

    Science.gov (United States)

    Swanson, Richard Arthur

    The Auditory Automotive Mechanics Diagnostic Achievement Test assesses an automobile mechanic's ability to determine mechanical faults from auditory cues alone. The 44-item test and its instructions are recorded on magnetic tape; answer choices are presented on tape, and are also written in the printed test booklets. The norming and validity…

  2. Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks.

    Science.gov (United States)

    Moisala, Mona; Salmela, Viljami; Salo, Emma; Carlson, Synnöve; Vuontela, Virve; Salonen, Oili; Alho, Kimmo

    2015-01-01

    Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dual-tasking) did not recruit additional cortical regions, but resulted in increased activity in medial and lateral frontal regions which were also activated by the component tasks when performed separately. Areas involved in semantic language processing were revealed predominantly in the left lateral prefrontal cortex by contrasting incongruent with congruent sentences. These areas also showed significant activity increases during divided attention in relation to selective attention. In the sensory cortices, no crossmodal inhibition was observed during divided attention when compared with selective attention to one modality. Our results suggest that the observed performance decrements during dual-tasking are due to interference of the two tasks because they utilize the same part of the cortex. Moreover, semantic dual-tasking did not appear to recruit additional brain areas in comparison with single tasking, and no crossmodal inhibition was observed during intermodal divided attention.

  3. Brain activity during divided and selective attention to auditory and visual sentence comprehension tasks

    Science.gov (United States)

    Moisala, Mona; Salmela, Viljami; Salo, Emma; Carlson, Synnöve; Vuontela, Virve; Salonen, Oili; Alho, Kimmo

    2015-01-01

    Using functional magnetic resonance imaging (fMRI), we measured brain activity of human participants while they performed a sentence congruence judgment task in either the visual or auditory modality separately, or in both modalities simultaneously. Significant performance decrements were observed when attention was divided between the two modalities compared with when one modality was selectively attended. Compared with selective attention (i.e., single tasking), divided attention (i.e., dual-tasking) did not recruit additional cortical regions, but resulted in increased activity in medial and lateral frontal regions which were also activated by the component tasks when performed separately. Areas involved in semantic language processing were revealed predominantly in the left lateral prefrontal cortex by contrasting incongruent with congruent sentences. These areas also showed significant activity increases during divided attention in relation to selective attention. In the sensory cortices, no crossmodal inhibition was observed during divided attention when compared with selective attention to one modality. Our results suggest that the observed performance decrements during dual-tasking are due to interference of the two tasks because they utilize the same part of the cortex. Moreover, semantic dual-tasking did not appear to recruit additional brain areas in comparison with single tasking, and no crossmodal inhibition was observed during intermodal divided attention. PMID:25745395

  4. Assessing Top-Down and Bottom-Up Contributions to Auditory Stream Segregation and Integration With Polyphonic Music.

    Science.gov (United States)

    Disbergen, Niels R; Valente, Giancarlo; Formisano, Elia; Zatorre, Robert J

    2018-01-01

    Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes, however real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. Nineteen listeners also

  5. Modality-specificity of Selective Attention Networks.

    Science.gov (United States)

    Stewart, Hannah J; Amitay, Sygal

    2015-01-01

    To establish the modality specificity and generality of selective attention networks. Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess underlying attention constructs. The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled "general attention." The third component was labeled "auditory attention," as it only contained measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled "spatial orienting" and "spatial conflict," respectively; they comprised orienting and conflict resolution measures from the vANT, aANT, and TAiL attend-location task, all tasks based upon spatial judgments (e.g., the direction of a target arrow or sound location). These results do not support our a priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selectively attending to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality-specific.
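    The exploratory factor analysis above groups test scores that correlate across subjects. The intuition can be illustrated with toy data in which two spatial-orienting measures correlate strongly while an unrelated pair does not; factor extraction then assigns the correlated pair to one component. All scores below are fabricated for illustration and the variable names only echo the tests mentioned in the abstract:

    ```python
    import math

    def correlation(xs, ys):
        """Pearson correlation between two equal-length score lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

    # Fabricated scores for six hypothetical subjects.
    scores = {
        "vANT_orient": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
        "aANT_orient": [1.1, 2.2, 2.9, 4.1, 4.8, 6.2],
        "TAiL_pitch":  [6.0, 1.0, 5.0, 2.0, 4.0, 3.0],
    }

    # High within-pair correlation, low cross-pair correlation: the pattern
    # that leads a factor analysis to place the first two measures on a
    # shared "spatial orienting" component and the third elsewhere.
    r_spatial = correlation(scores["vANT_orient"], scores["aANT_orient"])
    r_cross = correlation(scores["vANT_orient"], scores["TAiL_pitch"])
    ```

    An actual exploratory factor analysis would then rotate the eigenvectors of this correlation matrix, but the component structure is already visible in which pairs correlate.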

  6. A Characterization of Visual, Semantic and Auditory Memory in Children with Combination-Type Attention Deficit, Primarily Inattentive, and a Control Group

    Science.gov (United States)

    Ramirez, Luz Angela; Arenas, Angela Maria; Henao, Gloria Cecilia

    2005-01-01

    Introduction: This investigation describes and compares characteristics of visual, semantic and auditory memory in a group of children diagnosed with combined-type attention deficit with hyperactivity, attention deficit predominating, and a control group. Method: 107 boys and girls were selected, from 7 to 11 years of age, all residents in the…

  7. The development of prospective memory in young schoolchildren: the impact of ongoing task absorption, cue salience, and cue centrality.

    Science.gov (United States)

    Kliegel, Matthias; Mahy, Caitlin E V; Voigt, Babett; Henry, Julie D; Rendell, Peter G; Aberle, Ingo

    2013-12-01

    This study presents evidence that 9- and 10-year-old children outperform 6- and 7-year-old children on a measure of event-based prospective memory and that retrieval-based factors systematically influence performance and age differences. All experiments revealed significant age effects in prospective memory even after controlling for ongoing task performance. In addition, the provision of a less absorbing ongoing task (Experiment 1), higher cue salience (Experiment 2), and cues appearing in the center of attention (Experiment 3) were each associated with better performance. Of particular developmental importance was an age by cue centrality (in or outside of the center of attention) interaction that emerged in Experiment 3. Thus, age effects were restricted to prospective memory cues appearing outside of the center of attention, suggesting that the development of prospective memory across early school years may be modulated by whether a cue requires overt monitoring beyond the immediate attentional context. Because whether a cue is in or outside of the center of attention might determine the amount of executive control needed in a prospective memory task, findings suggest that developing executive control resources may drive prospective memory development across primary school age. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Psycho-physiological assessment of a prosthetic hand sensory feedback system based on an auditory display: a preliminary study.

    Science.gov (United States)

    Gonzalez, Jose; Soma, Hirokazu; Sekine, Masashi; Yu, Wenwei

    2012-06-09

    Prosthetic hand users have to rely extensively on visual feedback, which seems to lead to a high conscious burden for the users, in order to manipulate their prosthetic devices. Indirect methods (electro-cutaneous, vibrotactile, auditory cues) have been used to convey information from the artificial limb to the amputee, but the usability and advantages of these feedback methods were explored mainly by looking at the performance results, not taking into account measurements of the user's mental effort, attention, and emotions. The main objective of this study was to explore the feasibility of using psycho-physiological measurements to assess cognitive effort when manipulating a robot hand with and without the usage of a sensory substitution system based on auditory feedback, and how these psycho-physiological recordings relate to temporal and grasping performance in a static setting. Ten male subjects (26+/- years old) participated in this study and were asked to come for 2 consecutive days. On the first day the experiment objective, tasks, and experiment setting were explained. Then, they completed a 30-minute guided training. On the second day each subject was tested in 3 different modalities: Auditory Feedback only control (AF), Visual Feedback only control (VF), and Audiovisual Feedback control (AVF). For each modality they were asked to perform 10 trials. At the end of each test, the subject had to answer the NASA TLX questionnaire. Also, during the test the subject's EEG, ECG, electro-dermal activity (EDA), and respiration rate were measured. The results show that a higher mental effort is needed when the subjects rely only on their vision, and that this effort seems to be reduced when auditory feedback is added to the human-machine interaction (multimodal feedback). Furthermore, better temporal performance and better grasping performance were obtained in the audiovisual modality. The performance improvements when using auditory cues, along with vision

  9. Psycho-physiological assessment of a prosthetic hand sensory feedback system based on an auditory display: a preliminary study

    Directory of Open Access Journals (Sweden)

    Gonzalez Jose

    2012-06-01

    Background Prosthetic hand users have to rely extensively on visual feedback, which seems to lead to a high conscious burden for the users, in order to manipulate their prosthetic devices. Indirect methods (electro-cutaneous, vibrotactile, auditory cues) have been used to convey information from the artificial limb to the amputee, but the usability and advantages of these feedback methods were explored mainly by looking at the performance results, not taking into account measurements of the user’s mental effort, attention, and emotions. The main objective of this study was to explore the feasibility of using psycho-physiological measurements to assess cognitive effort when manipulating a robot hand with and without the usage of a sensory substitution system based on auditory feedback, and how these psycho-physiological recordings relate to temporal and grasping performance in a static setting. Methods Ten male subjects (26+/- years old) participated in this study and were asked to come for 2 consecutive days. On the first day the experiment objective, tasks, and experiment setting were explained. Then, they completed a 30-minute guided training. On the second day each subject was tested in 3 different modalities: Auditory Feedback only control (AF), Visual Feedback only control (VF), and Audiovisual Feedback control (AVF). For each modality they were asked to perform 10 trials. At the end of each test, the subject had to answer the NASA TLX questionnaire. Also, during the test the subject’s EEG, ECG, electro-dermal activity (EDA), and respiration rate were measured. Results The results show that a higher mental effort is needed when the subjects rely only on their vision, and that this effort seems to be reduced when auditory feedback is added to the human-machine interaction (multimodal feedback). Furthermore, better temporal performance and better grasping performance were obtained in the audiovisual modality. Conclusions The performance
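
Both versions of this record score subjective workload with the NASA TLX questionnaire. A minimal scoring sketch, assuming the standard weighted variant (each 0-100 subscale rating is multiplied by its pairwise-comparison weight and the products divided by 15); the ratings and weights below are invented.

```python
# Hedged sketch of NASA-TLX weighted-workload scoring; data are invented.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted(ratings, weights):
    # In the weighted scheme, the 15 pairwise comparisons distribute
    # exactly 15 weight points across the six subscales.
    assert sum(weights.values()) == 15, "pairwise weights must total 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 70, "physical": 20, "temporal": 50,
           "performance": 40, "effort": 65, "frustration": 30}
weights = {"mental": 5, "physical": 1, "temporal": 2,
           "performance": 3, "effort": 3, "frustration": 1}
score = tlx_weighted(ratings, weights)
```

The resulting score stays on the 0-100 workload scale, so conditions (AF, VF, AVF) can be compared directly.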

  10. [Thalamus and Attention].

    Science.gov (United States)

    Tokoro, Kazuhiko; Sato, Hironobu; Yamamoto, Mayumi; Nagai, Yoshiko

    2015-12-01

    Attention is the process by which information selection occurs; the thalamus plays an important role in the selective attention of visual and auditory information. Selective attention is a conscious effort; however, it occurs subconsciously, as well. The lateral geniculate body (LGB) filters visual information before it reaches the cortex (bottom-up attention). The thalamic reticular nucleus (TRN) provides a strong inhibitory input to both the LGB and pulvinar. This regulation involves focusing a spotlight on important information, as well as inhibiting unnecessary background information. Behavioral contexts more strongly modulate activity of the TRN and pulvinar, influencing feedforward and feedback information transmission between the frontal, temporal, parietal and occipital cortical areas (top-down attention). The medial geniculate body (MGB) filters auditory information, and the TRN inhibits the MGB. Attentional modulation occurring in the auditory pathway among the cochlea, cochlear nucleus, superior olivary complex, and inferior colliculus is more important than that of the MGB and TRN. We also discuss the attentional consequence of thalamic hemorrhage.

  11. Developmental Dyslexia: Exploring How Much Phonological and Visual Attention Span Disorders Are Linked to Simultaneous Auditory Processing Deficits

    Science.gov (United States)

    Lallier, Marie; Donnadieu, Sophie; Valdois, Sylviane

    2013-01-01

    The simultaneous auditory processing skills of 17 dyslexic children and 17 skilled readers were measured using a dichotic listening task. Results showed that the dyslexic children exhibited difficulties reporting syllabic material when presented simultaneously. As a measure of simultaneous visual processing, visual attention span skills were…

  12. Subcortical encoding of speech cues in children with attention deficit hyperactivity disorder.

    Science.gov (United States)

    Jafari, Zahra; Malayeri, Saeed; Rostami, Reza

    2015-02-01

    There is little information about processing of nonspeech and speech stimuli at the subcortical level in individuals with attention deficit hyperactivity disorder (ADHD). The auditory brainstem response (ABR) provides information about the function of the auditory brainstem pathways. We aim to investigate the subcortical function in neural encoding of click and speech stimuli in children with ADHD. The subjects include 50 children with ADHD and 34 typically developing (TD) children between the ages of 8 and 12 years. Click ABR (cABR) and speech ABR (sABR) with 40 ms synthetic /da/ syllable stimulus were recorded. Latencies of cABR for waves III and V and duration of V-Vn (P⩽0.027), and latencies of sABR for waves A, D, E, F and O and duration of V-A (P⩽0.034), were significantly longer in children with ADHD than in TD children. There were no apparent differences in the components of the sustained frequency-following response (FFR). We conclude that children with ADHD have deficits in temporal neural encoding of both nonspeech and speech stimuli. There is a common dysfunction in the processing of click and speech stimuli at the brainstem level in children with suspected ADHD. Copyright © 2015. Published by Elsevier Ireland Ltd.
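
A hedged sketch of the kind of group comparison reported above: Welch's t statistic on ABR wave V latencies between an ADHD group and controls. All latency values are invented, and the study's exact statistical procedure is not reproduced here.

```python
# Welch's t statistic (unequal variances) on invented wave V latencies (ms).
from statistics import mean, variance

def welch_t(a, b):
    # Standard error combines the two sample variances scaled by group size.
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

adhd_ms = [6.1, 6.3, 6.0, 6.4, 6.2, 6.5]   # hypothetical ADHD latencies
td_ms   = [5.7, 5.8, 5.6, 5.9, 5.7, 5.8]   # hypothetical TD latencies
t = welch_t(adhd_ms, td_ms)
```

A positive t here reflects the pattern in the abstract: longer (delayed) latencies in the ADHD group.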

  13. Global dynamics of selective attention and its lapses in primary auditory cortex.

    Science.gov (United States)

    Lakatos, Peter; Barczak, Annamaria; Neymotin, Samuel A; McGinnis, Tammy; Ross, Deborah; Javitt, Daniel C; O'Connell, Monica Noelle

    2016-12-01

    Previous research demonstrated that while selectively attending to relevant aspects of the external world, the brain extracts pertinent information by aligning its neuronal oscillations to key time points of stimuli or their sampling by sensory organs. This alignment mechanism is termed oscillatory entrainment. We investigated the global, long-timescale dynamics of this mechanism in the primary auditory cortex of nonhuman primates, and hypothesized that lapses of entrainment would correspond to lapses of attention. By examining electrophysiological and behavioral measures, we observed that besides the lack of entrainment by external stimuli, attentional lapses were also characterized by high-amplitude alpha oscillations, with alpha frequency structuring of neuronal ensemble and single-unit operations. Entrainment and alpha-oscillation-dominated periods were strongly anticorrelated and fluctuated rhythmically at an ultra-slow rate. Our results indicate that these two distinct brain states represent externally versus internally oriented computational resources engaged by large-scale task-positive and task-negative functional networks.
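
The "high-amplitude alpha oscillations" finding above rests on estimating band power. A minimal, illustrative sketch using a naive DFT on a synthetic 10 Hz signal; the sampling rate and signal are invented, and real analyses would use FFTs on recorded neural data.

```python
# Naive DFT band-power estimate on a synthetic signal (illustration only).
import math

def band_power(signal, fs, lo, hi):
    """Sum DFT power over bins whose frequency falls in [lo, hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

fs = 100  # Hz, invented
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(200)]  # pure 10 Hz tone
alpha = band_power(sig, fs, 8, 12)    # alpha band captures the 10 Hz energy
beta = band_power(sig, fs, 18, 30)    # beta band should be near zero
```

Comparing such band-power estimates across time windows is one simple way to flag alpha-dominated (attentional lapse) periods.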

  14. Nogo stimuli do not receive more attentional suppression or response inhibition than neutral stimuli: evidence from the N2pc, PD and N2 components in a spatial cueing paradigm

    Directory of Open Access Journals (Sweden)

    Caroline eBarras

    2016-05-01

    It has been claimed that stimuli sharing the color of the nogo-target are suppressed because of the strong incentive to not process the nogo-target, but we failed to replicate this finding. Participants searched for a color singleton in the target display and indicated its shape when it was in the go color. If the color singleton in the target display was in the nogo color, they had to withhold the response. The target display was preceded by a cue display that also contained a color singleton (the cue). The cue was either in the color of the go or nogo target, or it was in an unrelated, neutral color. With cues in the go color, reaction times (RTs) were shorter when the cue appeared at the same location as the target compared to when it appeared at a different location. Also, electrophysiological recordings showed that an index of attentional selection, the N2pc, was elicited by go cues. Surprisingly, we failed to replicate cueing costs for cues in the nogo color that were originally reported by Anderson and Folk (2012). Consistently, we also failed to find an electrophysiological index of attentional suppression (the PD) for cues in the nogo color. Further, fronto-central ERPs to the cue display showed the same negativity for nogo and neutral stimuli relative to go stimuli, which is at odds with response inhibition and conflict monitoring accounts of the Nogo-N2. Thus, the modified cueing paradigm employed here provides little evidence that features associated with nogo-targets are suppressed at the level of attention or response selection. Rather, nogo-stimuli are efficiently ignored and attention is focused on features that require a response.

  15. Cueing Complex Animations: Does Direction of Attention Foster Learning Processes?

    Science.gov (United States)

    Lowe, Richard; Boucheix, Jean-Michel

    2011-01-01

    The time course of learners' processing of a complex animation was studied using a dynamic diagram of a piano mechanism. Over successive repetitions of the material, two forms of cueing (standard colour cueing and anti-cueing) were administered either before or during the animated segment of the presentation. An uncued group and two other control…

  16. An Eye Tracking Comparison of External Pointing Cues and Internal Continuous Cues in Learning with Complex Animations

    Science.gov (United States)

    Boucheix, Jean-Michel; Lowe, Richard K.

    2010-01-01

    Two experiments used eye tracking to investigate a novel cueing approach for directing learner attention to low salience, high relevance aspects of a complex animation. In the first experiment, comprehension of a piano mechanism animation containing spreading-colour cues was compared with comprehension obtained with arrow cues or no cues. Eye…

  17. Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions.

    Science.gov (United States)

    Cañadas, Elena; Lupiáñez, Juan; Kawakami, Kerry; Niedenthal, Paula M; Rodríguez-Bailón, Rosa

    2016-09-01

    Individuals spontaneously categorise other people on the basis of their gender, ethnicity and age. But what about the emotions they express? In two studies we tested the hypothesis that facial expressions are similar to other social categories in that they can function as contextual cues to control attention. In Experiment 1 we associated expressions of anger and happiness with specific proportions of congruent/incongruent flanker trials. We also created consistent and inconsistent category members within each of these two general contexts. The results demonstrated that participants exhibited a larger congruency effect when presented with faces in the emotional group associated with a high proportion of congruent trials. Notably, this effect transferred to inconsistent members of the group. In Experiment 2 we replicated the effects with faces depicting true and false smiles. Together these findings provide consistent evidence that individuals spontaneously utilise emotions to categorise others and that such categories determine the allocation of attentional control.

  18. The human brain maintains contradictory and redundant auditory sensory predictions.

    Directory of Open Access Journals (Sweden)

    Marika Pieszek

    Computational and experimental research has revealed that auditory sensory predictions are derived from regularities of the current environment by using internal generative models. However, so far, what has not been addressed is how the auditory system handles situations giving rise to redundant or even contradictory predictions derived from different sources of information. To this end, we measured error signals in the event-related brain potentials (ERPs) in response to violations of auditory predictions. Sounds could be predicted on the basis of overall probability, i.e., one sound was presented frequently and another sound rarely. Furthermore, each sound was predicted by an informative visual cue. Participants' task was to use the cue and to discriminate the two sounds as fast as possible. Violations of the probability based prediction (i.e., a rare sound) as well as violations of the visual-auditory prediction (i.e., an incongruent sound) elicited error signals in the ERPs (Mismatch Negativity [MMN] and Incongruency Response [IR]). Particular error signals were observed even when the overall probability and the visual symbol predicted different sounds. That is, the auditory system concurrently maintains and tests contradictory predictions. Moreover, if the same sound was predicted, we observed an additive error signal (scalp potential and primary current density) equaling the sum of the specific error signals. Thus, the auditory system maintains and tolerates functionally independently represented redundant and contradictory predictions. We argue that the auditory system exploits all currently active regularities in order to optimally prepare for future events.

  19. Amodal brain activation and functional connectivity in response to high-energy-density food cues in obesity.

    Science.gov (United States)

    Carnell, Susan; Benson, Leora; Pantazatos, Spiro P; Hirsch, Joy; Geliebter, Allan

    2014-11-01

    The obesogenic environment is pervasive, yet only some people become obese. The aim was to investigate whether obese individuals show differential neural responses to visual and auditory food cues, independent of cue modality. Obese (BMI 29-41, n = 10) and lean (BMI 20-24, n = 10) females underwent fMRI scanning during presentation of auditory (spoken word) and visual (photograph) cues representing high-energy-density (ED) and low-ED foods. The effect of obesity on whole-brain activation, and on functional connectivity with the midbrain/VTA, was examined. Obese compared with lean women showed greater modality-independent activation of the midbrain/VTA and putamen in response to high-ED (vs. low-ED) cues, as well as relatively greater functional connectivity between the midbrain/VTA and cerebellum. Greater responses to high-ED food cues within the midbrain/VTA and putamen, and altered functional connectivity between the midbrain/VTA and cerebellum, could contribute to excessive food intake in obese individuals. © 2014 The Obesity Society.
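
Functional connectivity of the kind reported above is commonly quantified as the Pearson correlation between two regions' BOLD time series. A minimal sketch with invented series; the study's actual seed-based connectivity analysis is more involved than this.

```python
# Pearson correlation between two invented ROI time series as a
# simplified stand-in for seed-based functional connectivity.
from statistics import mean, stdev

def connectivity(ts_a, ts_b):
    ma, mb = mean(ts_a), mean(ts_b)
    n = len(ts_a)
    cov = sum((a - ma) * (b - mb) for a, b in zip(ts_a, ts_b)) / (n - 1)
    return cov / (stdev(ts_a) * stdev(ts_b))

# Hypothetical BOLD signal values for two regions across 8 volumes.
vta        = [0.2, 0.5, 0.1, 0.8, 0.4, 0.9, 0.3, 0.7]
cerebellum = [0.1, 0.6, 0.2, 0.7, 0.5, 0.8, 0.2, 0.6]
r = connectivity(vta, cerebellum)
```

Group differences in such r values (after Fisher z-transform, typically) are what "relatively greater functional connectivity" summarizes.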

  20. Drivers anticipate lead-vehicle conflicts during automated longitudinal control: Sensory cues capture driver attention and promote appropriate and timely responses.

    Science.gov (United States)

    Morando, Alberto; Victor, Trent; Dozza, Marco

    2016-12-01

    Adaptive Cruise Control (ACC) has been shown to reduce the exposure to critical situations by maintaining a safe speed and headway. It has also been shown that drivers adapt their visual behavior in response to the driving task demand with ACC, anticipating an impending lead vehicle conflict by directing their eyes to the forward path before a situation becomes critical. The purpose of this paper is to identify the causes related to this anticipatory mechanism, by investigating drivers' visual behavior while driving with ACC when a potential critical situation is encountered, identified as a forward collision warning (FCW) onset (including false positive warnings). This paper discusses how sensory cues capture attention to the forward path in anticipation of the FCW onset. The analysis used the naturalistic database EuroFOT to examine visual behavior with respect to two manually-coded metrics, glance location and glance eccentricity, and then related the findings to vehicle data (such as speed, acceleration, and radar information). Three sensory cues (longitudinal deceleration, looming, and brake lights) were found to be relevant for capturing driver attention and increasing glances to the forward path in anticipation of the threat; the deceleration cue seems to be dominant. The results also show that the FCW acts as an effective attention-orienting mechanism when no threat anticipation is present. These findings, relevant to the study of automation, provide additional information about drivers' response to potential lead-vehicle conflicts when longitudinal control is automated. Moreover, these results suggest that sensory cues are important for alerting drivers to an impending critical situation, allowing for a prompt reaction. Copyright © 2016 Elsevier Ltd. All rights reserved.
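
The looming and deceleration cues discussed above are often summarized kinematically. A sketch of inverse time-to-collision (a standard looming proxy: closing rate over range) with a hypothetical FCW trigger threshold; the values are illustrative, not EuroFOT or production FCW parameters.

```python
# Inverse time-to-collision from radar range and range rate (illustration).
def inverse_ttc(range_m, range_rate_mps):
    """Closing rate over range, in 1/s; larger values mean faster looming."""
    if range_rate_mps >= 0:      # gap opening or constant: no looming
        return 0.0
    return -range_rate_mps / range_m

def fcw_should_fire(range_m, range_rate_mps, threshold=0.2):
    # threshold is a hypothetical tuning parameter, not a standard value
    return inverse_ttc(range_m, range_rate_mps) > threshold

closing = inverse_ttc(20.0, -5.0)   # 20 m gap, closing at 5 m/s
```

With these numbers the 20 m case looms fast enough to warn, while the same closing speed at 50 m does not, matching the intuition that looming grows as the gap shrinks.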

  1. Modality-specificity of selective attention networks

    Directory of Open Access Journals (Sweden)

    Hannah Jamieson Stewart

    2015-11-01

    Objective: To establish the modality specificity and generality of selective attention networks. Method: Forty-eight young adults completed a battery of four auditory and visual selective attention tests based upon the Attention Network framework: the visual and auditory Attention Network Tests (vANT, aANT), the Test of Everyday Attention (TEA), and the Test of Attention in Listening (TAiL). These provided independent measures for auditory and visual alerting, orienting, and conflict resolution networks. The measures were subjected to an exploratory factor analysis to assess underlying attention constructs. Results: The analysis yielded a four-component solution. The first component comprised a range of measures from the TEA and was labeled ‘general attention’. The third component was labeled ‘auditory attention’, as it only contained measures from the TAiL using pitch as the attended stimulus feature. The second and fourth components were labeled ‘spatial orienting’ and ‘spatial conflict’, respectively; they comprised orienting and conflict resolution measures from the vANT, aANT and TAiL attend-location task, all tasks based upon spatial judgments (e.g., the direction of a target arrow or sound location). Conclusions: These results do not support our a-priori hypothesis that attention networks are either modality specific or supramodal. Auditory attention separated into selectively attending to spatial and non-spatial features, with auditory spatial attention loading onto the same factor as visual spatial attention, suggesting spatial attention is supramodal. However, since our study did not include a non-spatial measure of visual attention, further research will be required to ascertain whether non-spatial attention is modality-specific.

  2. Validation of auditory detection response task method for assessing the attentional effects of cognitive load.

    Science.gov (United States)

    Stojmenova, Kristina; Sodnik, Jaka

    2018-07-04

    There are 3 standardized versions of the Detection Response Task (DRT), 2 using visual stimuli (remote DRT and head-mounted DRT) and one using tactile stimuli. In this article, we present a study that proposes and validates a type of auditory signal to be used as DRT stimulus and evaluates the proposed auditory version of this method by comparing it with the standardized visual and tactile versions. This was a within-subject design study performed in a driving simulator with 24 participants. Each participant performed 8 2-min-long driving sessions in which they had to perform 3 different tasks: driving, answering to DRT stimuli, and performing a cognitive task (n-back task). Presence of additional cognitive load and type of DRT stimuli were defined as independent variables. DRT response times and hit rates, n-back task performance, and pupil size were observed as dependent variables. Significant changes in pupil size for trials with a cognitive task compared to trials without showed that cognitive load was induced properly. Each DRT version showed a significant increase in response times and a decrease in hit rates for trials with a secondary cognitive task compared to trials without. The auditory and tactile versions produced results similar to each other, and both showed significantly larger differences in response times and hit rates than the visual version. N-back performance did not differ significantly between trials with and without DRT stimuli, nor among the different DRT stimulus modalities. The results from this study show that the auditory DRT version, using the signal implementation suggested in this article, is sensitive to the effects of cognitive load on driver's attention and is significantly better than the remote visual and tactile version for auditory-vocal cognitive (n-back) secondary tasks.
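
DRT outcomes reduce to a mean response time over hits and a hit rate per condition. A minimal scoring sketch with invented trials; the 100-2500 ms hit window follows common DRT practice but is an assumption here, not taken from this study.

```python
# Score a list of DRT trials: each entry is a response time in ms, or None
# for a miss. Responses inside the valid window count as hits. Data invented.
def score_drt(trials, lo_ms=100, hi_ms=2500):
    hits = [rt for rt in trials if rt is not None and lo_ms <= rt <= hi_ms]
    hit_rate = len(hits) / len(trials)
    mean_rt = sum(hits) / len(hits) if hits else None
    return mean_rt, hit_rate

baseline = [420, 390, 450, 410, None, 430]    # driving + DRT only (hypothetical)
loaded   = [560, None, 610, None, 580, 600]   # with n-back task added (hypothetical)
rt_base, hr_base = score_drt(baseline)
rt_load, hr_load = score_drt(loaded)
```

The invented numbers reproduce the qualitative pattern in the abstract: added cognitive load lengthens response times and lowers hit rates.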

  3. Brain activity associated with selective attention, divided attention and distraction.

    Science.gov (United States)

    Salo, Emma; Salmela, Viljami; Salmi, Juha; Numminen, Jussi; Alho, Kimmo

    2017-06-01

    Top-down controlled selective or divided attention to sounds and visual objects, as well as bottom-up triggered attention to auditory and visual distractors, has been widely investigated. However, no study has systematically compared brain activations related to all these types of attention. To this end, we used functional magnetic resonance imaging (fMRI) to measure brain activity in participants performing a tone pitch or a foveal grating orientation discrimination task, or both, distracted by novel sounds not sharing frequencies with the tones or by extrafoveal visual textures. To force focusing of attention to tones or gratings, or both, task difficulty was kept constantly high with an adaptive staircase method. A whole brain analysis of variance (ANOVA) revealed fronto-parietal attention networks for both selective auditory and visual attention. A subsequent conjunction analysis indicated partial overlaps of these networks. However, like some previous studies, the present results also suggest segregation of prefrontal areas involved in the control of auditory and visual attention. The ANOVA also suggested, and another conjunction analysis confirmed, an additional activity enhancement in the left middle frontal gyrus related to divided attention supporting the role of this area in top-down integration of dual task performance. Distractors expectedly disrupted task performance. However, contrary to our expectations, activations specifically related to the distractors were found only in the auditory and visual cortices. This suggests gating of the distractors from further processing perhaps due to strictly focused attention in the current demanding discrimination tasks. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Comparison of Gated Audiovisual Speech Identification in Elderly Hearing Aid Users and Elderly Normal-Hearing Individuals: Effects of Adding Visual Cues to Auditory Speech Stimuli.

    Science.gov (United States)

    Moradi, Shahram; Lidestam, Björn; Rönnberg, Jerker

    2016-06-17

    The present study compared elderly hearing aid (EHA) users (n = 20) with elderly normal-hearing (ENH) listeners (n = 20) in terms of isolation points (IPs, the shortest time required for correct identification of a speech stimulus) and accuracy of audiovisual gated speech stimuli (consonants, words, and final words in highly and less predictable sentences) presented in silence. In addition, we compared the IPs of audiovisual speech stimuli from the present study with auditory ones extracted from a previous study, to determine the impact of the addition of visual cues. Both participant groups achieved ceiling levels in terms of accuracy in the audiovisual identification of gated speech stimuli; however, the EHA group needed longer IPs for the audiovisual identification of consonants and words. The benefit of adding visual cues to auditory speech stimuli was more evident in the EHA group, as audiovisual presentation significantly shortened the IPs for consonants, words, and final words in less predictable sentences; in the ENH group, audiovisual presentation only shortened the IPs for consonants and words. In conclusion, although the audiovisual benefit was greater for EHA group, this group had inferior performance compared with the ENH group in terms of IPs when supportive semantic context was lacking. Consequently, EHA users needed the initial part of the audiovisual speech signal to be longer than did their counterparts with normal hearing to reach the same level of accuracy in the absence of a semantic context. © The Author(s) 2016.

  5. Objective measures of binaural masking level differences and comodulation masking release based on late auditory evoked potentials.

    Science.gov (United States)

    Epp, Bastian; Yasin, Ifat; Verhey, Jesko L

    2013-12-01

    The audibility of important sounds is often hampered due to the presence of other masking sounds. The present study investigates if a correlate of the audibility of a tone masked by noise is found in late auditory evoked potentials measured from human listeners. The audibility of the target sound at a fixed physical intensity is varied by introducing auditory cues of (i) interaural target signal phase disparity and (ii) coherent masker level fluctuations in different frequency regions. In agreement with previous studies, psychoacoustical experiments showed that both stimulus manipulations result in a masking release (i: binaural masking level difference; ii: comodulation masking release) compared to a condition where those cues are not present. Late auditory evoked potentials (N1, P2) were recorded for the stimuli at a constant masker level, but different signal levels within the same set of listeners who participated in the psychoacoustical experiment. The data indicate differences in N1 and P2 between stimuli with and without interaural phase disparities. However, differences for stimuli with and without coherent masker modulation were only found for P2, i.e., only P2 is sensitive to the increase in audibility, irrespective of the cue that caused the masking release. The amplitude of P2 is consistent with the psychoacoustical finding of an addition of the masking releases when both cues are present. Even though it cannot be concluded where along the auditory pathway the audibility is represented, the P2 component of auditory evoked potentials is a candidate for an objective measure of audibility in the human auditory system. Copyright © 2013 Elsevier B.V. All rights reserved.
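
Masking release in both paradigms above is simple threshold arithmetic: the drop in masked detection threshold when a cue (interaural phase disparity, or coherent masker modulation) is added. A sketch with invented threshold values, including the additivity suggested by the P2 result.

```python
# Masking release as a threshold difference in dB; all values are invented.
def masking_release(threshold_reference_db, threshold_cued_db):
    """Positive values mean the cue made the tone easier to detect."""
    return threshold_reference_db - threshold_cued_db

bmld = masking_release(70.0, 58.0)   # hypothetical binaural masking level difference
cmr  = masking_release(70.0, 63.0)   # hypothetical comodulation masking release
combined_prediction = bmld + cmr     # additivity pattern the P2 amplitude suggested
```

If the two releases add, a condition with both cues would be expected to show roughly their sum, which is what the P2 amplitudes were consistent with.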

  6. Multisensory object perception in infancy: 4-month-olds perceive a mistuned harmonic as a separate auditory and visual object.

    Science.gov (United States)

    Smith, Nicholas A; Folland, Nicole A; Martinez, Diana M; Trainor, Laurel J

    2017-07-01

    Infants learn to use auditory and visual information to organize the sensory world into identifiable objects with particular locations. Here we use a behavioural method to examine infants' use of harmonicity cues to auditory object perception in a multisensory context. Sounds emitted by different objects sum in the air and the auditory system must figure out which parts of the complex waveform belong to different sources (auditory objects). One important cue to this source separation is that complex tones with pitch typically contain a fundamental frequency and harmonics at integer multiples of the fundamental. Consequently, adults hear a mistuned harmonic in a complex sound as a distinct auditory object (Alain, Theunissen, Chevalier, Batty, & Taylor, 2003). Previous work by our group demonstrated that 4-month-old infants are also sensitive to this cue. They behaviourally discriminate a complex tone with a mistuned harmonic from the same complex with in-tune harmonics, and show an object-related event-related potential (ERP) electrophysiological (EEG) response to the stimulus with mistuned harmonics. In the present study we use an audiovisual procedure to investigate whether infants perceive a complex tone with an 8% mistuned harmonic as emanating from two objects, rather than merely detecting the mistuned cue. We paired in-tune and mistuned complex tones with visual displays that contained either one or two bouncing balls. Four-month-old infants showed surprise at the incongruous pairings, looking longer at the display of two balls when paired with the in-tune complex and at the display of one ball when paired with the mistuned harmonic complex. We conclude that infants use harmonicity as a cue for source separation when integrating auditory and visual information in object perception. Copyright © 2017 Elsevier B.V. All rights reserved.
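
The stimulus manipulation above can be sketched by synthesizing a harmonic complex and mistuning one harmonic by 8%. The choice of the 3rd harmonic, the fundamental, the sampling rate, and the duration are all assumptions for illustration.

```python
# Synthesize an equal-amplitude harmonic complex, optionally mistuning one
# harmonic by a percentage of its frequency. Parameters are invented.
import math

def complex_tone(f0, n_harmonics, fs, dur, mistuned=None, mistune_pct=0.0):
    n = int(fs * dur)
    samples = [0.0] * n
    for h in range(1, n_harmonics + 1):
        f = h * f0
        if h == mistuned:
            f *= 1.0 + mistune_pct / 100.0   # shift this harmonic off the series
        for t in range(n):
            samples[t] += math.sin(2 * math.pi * f * t / fs)
    return samples

fs = 8000
in_tune  = complex_tone(200, 6, fs, 0.05)
mistuned = complex_tone(200, 6, fs, 0.05, mistuned=3, mistune_pct=8.0)
```

Adults (and, per the study, 4-month-olds) tend to hear the mistuned component pop out as a second auditory object, while the in-tune complex fuses into one.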

  7. Central auditory processing outcome after stroke in children

    Directory of Open Access Journals (Sweden)

    Karla M. I. Freiria Elias

    2014-09-01

Full Text Available Objective To investigate central auditory processing in children with unilateral stroke and to verify whether the hemisphere affected by the lesion influenced auditory competence. Method 23 children (13 male) between 7 and 16 years old were evaluated through speech-in-noise tests (auditory closure), dichotic digit and staggered spondaic word tests (selective attention), and pitch pattern and duration pattern sequence tests (temporal processing); their results were compared with those of control children. Auditory competence was established according to performance in auditory analysis ability. Results The groups performed similarly in auditory closure ability, while the stroke group showed pronounced deficits in selective attention and temporal processing abilities. Most children with stroke showed auditory ability impaired to a moderate degree. Conclusion Children with stroke showed deficits in auditory processing, and the degree of impairment was not related to the hemisphere affected by the lesion.

  8. How hearing aids, background noise, and visual cues influence objective listening effort.

    Science.gov (United States)

    Picou, Erin M; Ricketts, Todd A; Hornsby, Benjamin W Y

    2013-09-01

The purpose of this article was to evaluate factors that influence the listening effort experienced when processing speech for people with hearing loss. Specifically, the change in listening effort resulting from introducing hearing aids, visual cues, and background noise was evaluated. An additional exploratory aim was to investigate the possible relationships between the magnitude of listening effort change and individual listeners' working memory capacity, verbal processing speed, or lipreading skill. Twenty-seven participants with bilateral sensorineural hearing loss were fitted with linear behind-the-ear hearing aids and tested using a dual-task paradigm designed to evaluate listening effort. The primary task was monosyllable word recognition and the secondary task was a visual reaction time task. The test conditions varied by hearing aids (unaided, aided), visual cues (auditory-only, auditory-visual), and background noise (present, absent). For all participants, the signal-to-noise ratio was set individually so that speech recognition performance in noise was approximately 60% in both the auditory-only and auditory-visual conditions. In addition to measures of listening effort, working memory capacity, verbal processing speed, and lipreading ability were measured using the Automated Operational Span Task, a Lexical Decision Task, and the Revised Shortened Utley Lipreading Test, respectively. In general, the effects measured using the objective measure of listening effort were small (~10 msec). Results indicated that background noise increased listening effort, and hearing aids reduced listening effort, while visual cues did not influence listening effort. With regard to the individual variables, verbal processing speed was negatively correlated with hearing aid benefit for listening effort; faster processors were less likely to derive benefit. Working memory capacity, verbal processing speed, and lipreading ability were related to benefit from visual cues. No
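In dual-task paradigms like this one, listening effort is typically operationalized as the slowdown of secondary-task reaction times (RTs) relative to a baseline, and hearing aid benefit as the reduction of that slowdown. A minimal sketch of the arithmetic, using made-up RTs rather than the study's data:

```python
# Listening effort as secondary-task reaction-time (RT) cost:
# mean RT in a dual-task condition minus mean RT at baseline.
from statistics import mean

def listening_effort_ms(dual_task_rts, baseline_rts):
    """Positive values mean the listening task slowed the secondary task."""
    return mean(dual_task_rts) - mean(baseline_rts)

# Hypothetical RTs (ms) for one listener
baseline = [402, 398, 410, 395]
unaided_noise = [452, 460, 449, 455]
aided_noise = [441, 450, 438, 447]

effort_unaided = listening_effort_ms(unaided_noise, baseline)
effort_aided = listening_effort_ms(aided_noise, baseline)
hearing_aid_benefit = effort_unaided - effort_aided  # reduction in effort cost
```

With numbers of this size, the benefit comes out on the order of 10 msec, matching the scale of effects the abstract calls "small".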

  9. Assessment of Spectral and Temporal Resolution in Cochlear Implant Users Using Psychoacoustic Discrimination and Speech Cue Categorization.

    Science.gov (United States)

    Winn, Matthew B; Won, Jong Ho; Moon, Il Joon

    This study was conducted to measure auditory perception by cochlear implant users in the spectral and temporal domains, using tests of either categorization (using speech-based cues) or discrimination (using conventional psychoacoustic tests). The authors hypothesized that traditional nonlinguistic tests assessing spectral and temporal auditory resolution would correspond to speech-based measures assessing specific aspects of phonetic categorization assumed to depend on spectral and temporal auditory resolution. The authors further hypothesized that speech-based categorization performance would ultimately be a superior predictor of speech recognition performance, because of the fundamental nature of speech recognition as categorization. Nineteen cochlear implant listeners and 10 listeners with normal hearing participated in a suite of tasks that included spectral ripple discrimination, temporal modulation detection, and syllable categorization, which was split into a spectral cue-based task (targeting the /ba/-/da/ contrast) and a timing cue-based task (targeting the /b/-/p/ and /d/-/t/ contrasts). Speech sounds were manipulated to contain specific spectral or temporal modulations (formant transitions or voice onset time, respectively) that could be categorized. Categorization responses were quantified using logistic regression to assess perceptual sensitivity to acoustic phonetic cues. Word recognition testing was also conducted for cochlear implant listeners. Cochlear implant users were generally less successful at utilizing both spectral and temporal cues for categorization compared with listeners with normal hearing. For the cochlear implant listener group, spectral ripple discrimination was significantly correlated with the categorization of formant transitions; both were correlated with better word recognition. Temporal modulation detection using 100- and 10-Hz-modulated noise was not correlated either with the cochlear implant subjects' categorization of
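Quantifying categorization responses with logistic regression, as described above, amounts to fitting a psychometric function whose slope indexes sensitivity to the acoustic cue: a steep slope means sharp categorization, a shallow slope means the listener is not using the cue. A self-contained sketch with hypothetical responses along a /ba/-/da/ continuum (the fitting routine and data are illustrative, not the authors' code):

```python
# Perceptual sensitivity to a phonetic cue, quantified as the slope of a
# logistic psychometric function fitted to binary categorization responses.
import math

def fit_logistic(x, y, lr=0.5, epochs=5000):
    """Fit p(response='da') = 1/(1+exp(-(b0 + b1*x))) by gradient ascent."""
    b0, b1 = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Cue value: formant-transition step along a /ba/-/da/ continuum (-1..1),
# with hypothetical responses from a listener who categorizes sharply
steps = [-1.0, -0.5, 0.0, 0.5, 1.0] * 4
responses = [0, 0, 0, 1, 1] * 4  # 0 = /ba/, 1 = /da/
intercept, slope = fit_logistic(steps, responses)  # slope = cue sensitivity
```

A cochlear implant listener who cannot exploit the spectral cue would produce flatter response curves and hence a slope near zero.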

  10. Visual Search and Target Cueing: A Comparison of Head-Mounted Versus Hand-Held Displays on the Allocation of Visual Attention

    National Research Council Canada - National Science Library

    Yeh, Michelle; Wickens, Christopher D

    1998-01-01

We conducted a study to examine the effects of target cueing and display conformality, using a hand-held or head-mounted display, on visual search tasks requiring focused and divided attention...

  11. Automaticity of phasic alertness: Evidence for a three-component model of visual cueing.

    Science.gov (United States)

    Lin, Zhicheng; Lu, Zhong-Lin

    2016-10-01

The automaticity of phasic alertness is investigated using the attention network test. Results show that the cueing effect from the alerting cue (a double cue) is strongly enhanced by the task relevance of visual cues, as determined by the informativeness of the orienting cue (a single cue) with which it is mixed (80% vs. 50% valid in predicting where the target will appear). Counterintuitively, the cueing effect from the alerting cue can be negatively affected by its visibility, such that masking the cue from awareness can reveal a cueing effect that is otherwise absent when the cue is visible. Evidently, then, top-down influences, in the form of contextual relevance and cue awareness, can have opposite influences on the cueing effect from the alerting cue. These findings lead us to the view that a visual cue can engage three components of attention (orienting, alerting, and inhibition) to determine the behavioral cueing effect. We propose that phasic alertness, particularly in the form of specific response readiness, is regulated by both internal, top-down expectation and external, bottom-up stimulus properties. In contrast to some existing views, we advance the perspective that phasic alertness is strongly tied to temporal orienting, attentional capture, and spatial orienting. Finally, we discuss how translating attention research to clinical applications would benefit from an improved ability to measure attention. To this end, controlling the degree of intraindividual variability in the attentional components and improving the precision of the measurement tools may prove vital.

  12. Amygdala and auditory cortex exhibit distinct sensitivity to relevant acoustic features of auditory emotions.

    Science.gov (United States)

    Pannese, Alessia; Grandjean, Didier; Frühholz, Sascha

    2016-12-01

    Discriminating between auditory signals of different affective value is critical to successful social interaction. It is commonly held that acoustic decoding of such signals occurs in the auditory system, whereas affective decoding occurs in the amygdala. However, given that the amygdala receives direct subcortical projections that bypass the auditory cortex, it is possible that some acoustic decoding occurs in the amygdala as well, when the acoustic features are relevant for affective discrimination. We tested this hypothesis by combining functional neuroimaging with the neurophysiological phenomena of repetition suppression (RS) and repetition enhancement (RE) in human listeners. Our results show that both amygdala and auditory cortex responded differentially to physical voice features, suggesting that the amygdala and auditory cortex decode the affective quality of the voice not only by processing the emotional content from previously processed acoustic features, but also by processing the acoustic features themselves, when these are relevant to the identification of the voice's affective value. Specifically, we found that the auditory cortex is sensitive to spectral high-frequency voice cues when discriminating vocal anger from vocal fear and joy, whereas the amygdala is sensitive to vocal pitch when discriminating between negative vocal emotions (i.e., anger and fear). Vocal pitch is an instantaneously recognized voice feature, which is potentially transferred to the amygdala by direct subcortical projections. These results together provide evidence that, besides the auditory cortex, the amygdala too processes acoustic information, when this is relevant to the discrimination of auditory emotions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. The Effects of Spatial Endogenous Pre-cueing across Eccentricities.

    Science.gov (United States)

    Feng, Jing; Spence, Ian

    2017-01-01

Frequently, we use expectations about likely locations of a target to guide the allocation of our attention. Despite the importance of this attentional process in everyday tasks, pre-cueing effects on attention, particularly endogenous pre-cueing effects, have been relatively little explored beyond an eccentricity of 20°. Given that the visual field has functional subdivisions, with attentional processes differing significantly among the foveal, perifoveal, and more peripheral areas, it remains unclear how endogenous pre-cues that carry spatial information about targets influence the allocation of attention across a large visual field (especially in the more peripheral areas). We present two experiments examining how the expectation of the location of the target shapes the distribution of attention across eccentricities in the visual field. We measured participants' ability to pick out a target among distractors in the visual field after the presentation of a highly valid cue indicating the size of the area in which the target was likely to occur, or the likely direction of the target (left or right side of the display). Our first experiment showed that participants had a higher target detection rate with faster responses, particularly at eccentricities of 20° and 30°. There was also a marginal advantage of pre-cueing effects when trials of the same size cue were blocked compared to when trials were mixed. Experiment 2 demonstrated a higher target detection rate when the target occurred at the cued direction. This pre-cueing effect was greater at larger eccentricities and with a longer cue-target interval. Our findings on the endogenous pre-cueing effects across a large visual area were summarized using a simple model to assist in conceptualizing the modifications of the distribution of attention over the visual field. We discuss our findings in light of cognitive penetration of perception, and highlight the importance of examining attentional process across

  14. The Effects of Spatial Endogenous Pre-cueing across Eccentricities

    Directory of Open Access Journals (Sweden)

    Jing Feng

    2017-06-01

Full Text Available Frequently, we use expectations about likely locations of a target to guide the allocation of our attention. Despite the importance of this attentional process in everyday tasks, pre-cueing effects on attention, particularly endogenous pre-cueing effects, have been relatively little explored beyond an eccentricity of 20°. Given that the visual field has functional subdivisions, with attentional processes differing significantly among the foveal, perifoveal, and more peripheral areas, it remains unclear how endogenous pre-cues that carry spatial information about targets influence the allocation of attention across a large visual field (especially in the more peripheral areas). We present two experiments examining how the expectation of the location of the target shapes the distribution of attention across eccentricities in the visual field. We measured participants' ability to pick out a target among distractors in the visual field after the presentation of a highly valid cue indicating the size of the area in which the target was likely to occur, or the likely direction of the target (left or right side of the display). Our first experiment showed that participants had a higher target detection rate with faster responses, particularly at eccentricities of 20° and 30°. There was also a marginal advantage of pre-cueing effects when trials of the same size cue were blocked compared to when trials were mixed. Experiment 2 demonstrated a higher target detection rate when the target occurred at the cued direction. This pre-cueing effect was greater at larger eccentricities and with a longer cue-target interval. Our findings on the endogenous pre-cueing effects across a large visual area were summarized using a simple model to assist in conceptualizing the modifications of the distribution of attention over the visual field. We discuss our findings in light of cognitive penetration of perception, and highlight the importance of examining

  15. Towards a Cognitive Model of Distraction by Auditory Novelty: The Role of Involuntary Attention Capture and Semantic Processing

    Science.gov (United States)

    Parmentier, Fabrice B. R.

    2008-01-01

Unexpected auditory stimuli are potent distractors, able to break through selective attention and disrupt performance in an unrelated visual task. This study examined the processing fate of novel sounds, assessing the extent to which their semantic content is analyzed and whether the outcome of this processing can impact subsequent behavior.…

  16. White Matter Integrity Dissociates Verbal Memory and Auditory Attention Span in Emerging Adults with Congenital Heart Disease.

    Science.gov (United States)

    Brewster, Ryan C; King, Tricia Z; Burns, Thomas G; Drossner, David M; Mahle, William T

    2015-01-01

White matter disruptions have been identified in individuals with congenital heart disease (CHD). However, no specific theory-driven relationships between microstructural white matter disruptions and cognition have been established in CHD. We conducted a two-part study. First, we identified significant differences in fractional anisotropy (FA) of emerging adults with CHD using Tract-Based Spatial Statistics (TBSS). TBSS analyses between 22 participants with CHD and 18 demographically similar controls identified five regions of normal-appearing white matter with significantly lower FA in CHD, and two with higher FA. Next, two regions of lower FA in CHD were selected to examine theory-driven differential relationships with cognition: voxels along the left uncinate fasciculus (UF; a tract theorized to contribute to verbal memory) and voxels along the right middle cerebellar peduncle (MCP; a tract previously linked to attention). In CHD, a significant positive correlation between UF FA and memory was found, r(20)=.42, p=.049 (uncorrected). There was no correlation between UF FA and auditory attention span. A positive correlation between MCP FA and auditory attention span was found, r(20)=.47, p=.027 (uncorrected). There was no correlation between MCP FA and memory. In controls, no significant relationships were identified. These results are consistent with previous literature demonstrating lower FA in younger CHD samples, and provide novel evidence for disrupted white matter integrity in emerging adults with CHD. Furthermore, a correlational double dissociation established distinct white matter circuitry (UF and MCP) and differential cognitive correlates (memory and attention span, respectively) in young adults with CHD.
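The correlational double dissociation described above rests on four Pearson correlations: each tract's FA should correlate with its theorized cognitive measure but not with the other measure. A sketch of that pattern with fabricated illustrative values (not the study's data):

```python
# Correlational double dissociation: tract A tracks measure 1 but not 2,
# tract B tracks measure 2 but not 1 (all values invented for illustration).
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

uf_fa = [0.38, 0.41, 0.44, 0.47, 0.50, 0.53]   # uncinate fasciculus FA
mcp_fa = [0.52, 0.60, 0.46, 0.57, 0.49, 0.55]  # middle cerebellar peduncle FA
memory = [82, 90, 95, 101, 108, 118]           # varies with UF FA only
attention = [5.2, 6.1, 4.6, 5.8, 4.9, 5.6]     # varies with MCP FA only

# Within-pair correlations are strong; cross-pair correlations are near zero.
r_uf_mem, r_mcp_att = pearson_r(uf_fa, memory), pearson_r(mcp_fa, attention)
r_uf_att, r_mcp_mem = pearson_r(uf_fa, attention), pearson_r(mcp_fa, memory)
```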

  17. Auditory perception of a human walker.

    Science.gov (United States)

    Cottrell, David; Campbell, Megan E J

    2014-01-01

When one hears footsteps in the hall, one instantly recognises them as a person: this is an everyday example of auditory biological motion perception. Despite the familiarity of this experience, research into this phenomenon is in its infancy compared with visual biological motion perception. Here, two experiments explored sensitivity to, and recognition of, auditory stimuli of biological and nonbiological origin. We hypothesised that the cadence of a walker gives rise to a temporal pattern of impact sounds that facilitates the recognition of human motion from auditory stimuli alone. First, a series of detection tasks compared sensitivity to three carefully matched impact sounds: footsteps, a ball bouncing, and drumbeats. Unexpectedly, participants were no more sensitive to footsteps than to impact sounds of nonbiological origin. In the second experiment participants made discriminations between pairs of the same stimuli, in a series of recognition tasks in which the temporal pattern of impact sounds was manipulated to be either that of a walker or the pattern more typical of the source event (a ball bouncing or a drumbeat). Under these conditions, there was evidence that both temporal and nontemporal cues were important in recognising these stimuli. It is proposed that the interval between footsteps, which reflects a walker's cadence, is a cue for the recognition of the sounds of a human walking.

  18. Eye Movement Evidence of Attentional Bias for Substance-Related Cues in Heroin Dependents on Methadone Maintenance Therapy.

    Science.gov (United States)

    Zhao, Hui; Yang, Bo; Zhu, Qian; Zhang, Guangqun; Xiao, Yuqin; Guo, Xiao; Huang, Xiu; Zhang, Zhuo

    2017-03-21

    Attentional biases toward substance-related stimuli might play a contributing role in addictive behaviors. This study investigated the selective attention to substance-related stimuli in heroin dependents receiving methadone maintenance therapy. Thirty outpatients receiving methadone maintenance treatment for heroin dependence and 38 healthy controls completed a visual probe task with concurrent eye movement monitoring. The results showed that the heroin group reacted faster to probes associated with substance-related pictures than neutral pictures, and they directed more initial fixations and maintained longer initial fixation durations toward substance-related pictures than neutral pictures. However, attentional bias was not correlated with addiction severity in the heroin group. These findings suggest that attentional bias towards substance-related cues occurs in heroin dependents, although this bias might not be associated with the severity of drug-using behavior.
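Two standard indices from this kind of visual probe task with eye tracking are the reaction-time (RT) bias score, i.e. neutral-probe RT minus substance-probe RT, and the proportion of initial fixations landing on substance-related pictures. A sketch with hypothetical trial data (not the study's):

```python
# Attentional-bias indices from a visual probe task with eye tracking.
from statistics import mean

# RTs (ms) to probes replacing substance-related vs. neutral pictures
rt_substance_probe = [512, 498, 505, 520]
rt_neutral_probe = [548, 539, 551, 544]
# Positive score: faster to probes at substance-cue locations (bias toward cues)
rt_bias = mean(rt_neutral_probe) - mean(rt_substance_probe)

# First fixations: 1 if the initial saccade landed on the substance picture
first_fix = [1, 1, 0, 1, 1, 0, 1, 1]
initial_orienting_bias = sum(first_fix) / len(first_fix)  # > 0.5 indicates bias
```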

  19. The impact of visual gaze direction on auditory object tracking

    OpenAIRE

    Pomper, U.; Chait, M.

    2017-01-01

Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g., when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention wh...

  20. Did You Listen to the Beat? Auditory Steady-State Responses in the Human Electroencephalogram at 4 and 7 Hz Modulation Rates Reflect Selective Attention.

    Science.gov (United States)

    Jaeger, Manuela; Bleichner, Martin G; Bauer, Anna-Katharina R; Mirkovic, Bojana; Debener, Stefan

    2018-02-27

The acoustic envelope of human speech correlates with the syllabic rate (4-8 Hz) and carries important information for intelligibility, which is typically compromised in multi-talker, noisy environments. In order to better understand the dynamics of selective auditory attention to low-frequency modulated sound sources, we conducted a two-stream auditory steady-state response (ASSR) selective attention electroencephalogram (EEG) study. The two streams consisted of 4 and 7 Hz amplitude- and frequency-modulated sounds presented from the left and right side. One of the two streams had to be attended while the other had to be ignored. The attended stream always contained a target, allowing for the behavioral confirmation of the attention manipulation. EEG ASSR power analysis revealed a significant increase in 7 Hz power for the attend conditions compared to the ignore conditions. There was no significant difference in 4 Hz power when the 4 Hz stream had to be attended compared to when it had to be ignored. This lack of 4 Hz attention modulation could be explained by a distracting effect of a third frequency at 3 Hz (the beat frequency) perceivable when the 4 and 7 Hz streams are presented simultaneously. Taken together, our results show that low-frequency modulations at syllabic rate are modulated by selective spatial attention. Whether attention effects act as enhancement of the attended stream or suppression of the to-be-ignored stream may depend on how well auditory streams can be segregated.
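ASSR power analysis of this kind extracts spectral power from the EEG at each stream's modulation rate. A minimal sketch using a synthetic one-channel signal in which the 7 Hz stream is entrained more strongly than the 4 Hz stream (a naive single-bin DFT for clarity; real pipelines would use FFT routines over many channels and trials):

```python
# Power at the 4 and 7 Hz ASSR modulation rates of a synthetic EEG trace,
# estimated from single discrete Fourier transform (DFT) bins.
import cmath
import math

fs = 250                              # sampling rate (Hz)
t = [i / fs for i in range(fs * 4)]   # 4 s of data -> 0.25 Hz bin spacing
# Synthetic EEG: attended 7 Hz stream entrained more strongly than 4 Hz
eeg = [0.5 * math.sin(2 * math.pi * 4 * ti) +
       1.5 * math.sin(2 * math.pi * 7 * ti) for ti in t]

def power_at(freq, x, fs):
    """Squared magnitude of the normalized DFT coefficient at `freq` (Hz)."""
    n = len(x)
    k = round(freq * n / fs)  # DFT bin index for this frequency
    coef = sum(xi * cmath.exp(-2j * math.pi * k * i / n)
               for i, xi in enumerate(x))
    return abs(coef / n) ** 2

p4, p7 = power_at(4, eeg, fs), power_at(7, eeg, fs)
```

For a pure sinusoid of amplitude A that falls exactly on a bin, this estimate equals A²/4, so the 7 Hz bin dominates here.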

  1. The medial prefrontal cortex and memory of cue location in the rat.

    Science.gov (United States)

    Rawson, Tim; O'Kane, Michael; Talk, Andrew

    2010-01-01

We developed a single-trial cue-location memory task in which rats experienced an auditory cue while exploring an environment. They then recalled and avoided the sound origination point after the cue was paired with shock in a separate context. Subjects with medial prefrontal cortical (mPFC) lesions made no such avoidance response, but both lesioned and control subjects avoided the cue itself when it was presented at test. A follow-up assessment revealed no spatial learning impairment in either group. These findings suggest that the rodent mPFC is required for incidental learning or recollection of the location at which a discrete cue occurred, but is not required for cue recognition or for allocentric spatial memory. Copyright 2009 Elsevier Inc. All rights reserved.

  2. Magic and Misdirection: The Influence of Social Cues on the Allocation of Visual Attention While Watching a Cups-and-Balls Routine

    Directory of Open Access Journals (Sweden)

Andreas Hergovich

    2016-05-01

    Full Text Available In recent years, a body of research that regards the scientific study of magic performances as a promising method of investigating psychological phenomena in an ecologically valid setting has emerged. Seemingly contradictory findings concerning the ability of social cues to strengthen a magic trick’s effectiveness have been published. In this experiment, an effort was made to disentangle the unique influence of different social and physical triggers of attentional misdirection on observers’ overt and covert attention. The ability of 120 participants to detect the mechanism of a cups-and-balls trick was assessed, and their visual fixations were recorded using an eye-tracker while they were watching the routine. All the investigated techniques of misdirection, including sole usage of social cues, were shown to increase the probability of missing the trick mechanism. Depending on the technique of misdirection used, very different gaze patterns were observed. A combination of social and physical techniques of misdirection influenced participants’ overt attention most effectively.

  3. Neural Correlates of Expert Behavior During a Domain-Specific Attentional Cueing Task in Badminton Players.

    Science.gov (United States)

    Wang, Chun-Hao; Tu, Kuo-Cheng

    2017-06-01

The present study aimed to investigate the neural correlates associated with sports expertise during a domain-specific task in badminton players. We compared event-related potential (ERP) activity from collegiate male badminton players and a set of matched athletic controls as they performed a badminton-specific attentional cueing task in which cue uncertainty and validity were manipulated. The data showed that, regardless of cue type, the badminton players had faster responses along with greater P3 amplitudes than the athletic controls on the task. Specifically, the contingent negative variation amplitude was smaller for the players than for the controls in the condition involving higher uncertainty. Such an effect, however, was absent in the condition with lower uncertainty. We conclude that expertise in sports is associated with proficient modulation of brain activity during cognitive and motor preparation, as well as response execution, when performing a task related to an individual's specific sport domain.

  4. Gaze Cueing by Pareidolia Faces

    Directory of Open Access Journals (Sweden)

    Kohske Takahashi

    2013-12-01

Full Text Available Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.

  5. Gaze cueing by pareidolia faces.

    Science.gov (United States)

    Takahashi, Kohske; Watanabe, Katsumi

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.
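The cueing effect in gaze-cueing studies like this one is conventionally computed as the reaction-time (RT) difference between uncued and cued target locations, evaluated separately per condition. A sketch with hypothetical RTs (not the study's data) reproducing the reported pattern:

```python
# Gaze-cueing effect: mean RT at uncued locations minus mean RT at cued
# locations, per perceptual condition (all RTs invented for illustration).
from statistics import mean

def cueing_effect_ms(rt_uncued, rt_cued):
    return mean(rt_uncued) - mean(rt_cued)

# Objects perceived as faces: robust cueing effect expected
face_effect = cueing_effect_ms([352, 360, 348, 356], [330, 326, 334, 338])
# Same objects not perceived as faces: effect essentially eliminated
nonface_effect = cueing_effect_ms([341, 337, 345, 339], [340, 338, 344, 342])
```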

  6. Predictive Power of Attention and Reading Readiness Variables on Auditory Reasoning and Processing Skills of Six-Year-Old Children

    Science.gov (United States)

    Erbay, Filiz

    2013-01-01

The aim of the present research was to describe the relation of six-year-old children's attention and reading readiness skills (general knowledge, word comprehension, sentences, and matching) to their auditory reasoning and processing skills. This was a quantitative study based on the scanning (survey) model. The research sample consisted of 204 kindergarten…

  7. Incorporating modern neuroscience findings to improve brain-computer interfaces: tracking auditory attention.

    Science.gov (United States)

    Wronkiewicz, Mark; Larson, Eric; Lee, Adrian Kc

    2016-10-01

Brain-computer interface (BCI) technology allows users to generate actions based solely on their brain signals. However, current non-invasive BCIs generally classify brain activity recorded from surface electroencephalography (EEG) electrodes, which can hinder the application of findings from modern neuroscience research. In this study, we use source imaging, a neuroimaging technique that projects EEG signals onto the surface of the brain, in a BCI classification framework. This allowed us to incorporate prior research from functional neuroimaging to target activity from a cortical region involved in auditory attention. Classifiers trained to detect attention switches performed better with source imaging projections than with EEG sensor signals. Within source imaging, including subject-specific anatomical MRI information (instead of using a generic head model) further improved classification performance. This source-based strategy also reduced accuracy variability across three dimensionality reduction techniques, a major design choice in most BCIs. Our work shows that source imaging provides clear quantitative and qualitative advantages to BCIs and highlights the value of incorporating modern neuroscience knowledge and methods into BCI systems.
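The source-imaging step described above is, at its core, a linear projection of sensor-space EEG into source space, after which classifier features are taken from a region of interest (ROI) such as auditory-attention cortex. A toy sketch with a made-up inverse operator and ROI (real operators come from forward modeling in packages such as MNE; the matrix values here are purely illustrative):

```python
# Linear projection of one EEG sensor sample into source space, followed by
# ROI feature selection for classifier training (hypothetical operator/ROI).

def matvec(m, v):
    """Multiply matrix m (list of rows) by vector v."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]

# Hypothetical inverse operator mapping 3 sensors -> 4 cortical sources
inverse_op = [
    [0.6, 0.2, 0.1],
    [0.1, 0.7, 0.2],
    [0.2, 0.1, 0.6],
    [0.3, 0.3, 0.3],
]
sensor_sample = [1.0, -0.5, 0.25]   # one EEG time sample (uV) across sensors
source_sample = matvec(inverse_op, sensor_sample)

roi = [1, 3]                        # source indices in the attention ROI
features = [source_sample[i] for i in roi]  # inputs to the BCI classifier
```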

  8. The Effect of Retrieval Cues on Visual Preferences and Memory in Infancy: Evidence for a Four-Phase Attention Function.

    Science.gov (United States)

    Bahrick, Lorraine E.; Hernandez-Reif, Maria; Pickens, Jeffrey N.

    1997-01-01

    Tested hypothesis from Bahrick and Pickens' infant attention model that retrieval cues increase memory accessibility and shift visual preferences toward greater novelty to resemble recent memories. Found that after retention intervals associated with remote or intermediate memory, previous familiarity preferences shifted to null or novelty…

  9. Readout from iconic memory and selective spatial attention involve similar neural processes.

    Science.gov (United States)

    Ruff, Christian C; Kristjánsson, Arni; Driver, Jon

    2007-10-01

    Iconic memory and spatial attention are often considered separately, but they may have functional similarities. Here we provide functional magnetic resonance imaging evidence for some common underlying neural effects. Subjects judged three visual stimuli in one hemifield of a bilateral array comprising six stimuli. The relevant hemifield for partial report was indicated by an auditory cue, administered either before the visual array (precue, spatial attention) or shortly after the array (postcue, iconic memory). Pre- and postcues led to similar activity modulations in lateral occipital cortex contralateral to the cued side. This finding indicates that readout from iconic memory can have some neural effects similar to those of spatial attention. We also found common bilateral activation of a fronto-parietal network for postcue and precue trials. These neuroimaging data suggest that some common neural mechanisms underlie selective spatial attention and readout from iconic memory. Some differences were also found; compared with precues, postcues led to higher activity in the right middle frontal gyrus.

  10. Assessing Top-Down and Bottom-Up Contributions to Auditory Stream Segregation and Integration With Polyphonic Music

    Directory of Open Access Journals (Sweden)

    Niels R. Disbergen

    2018-03-01

Full Text Available Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes; however, real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. Nineteen

  11. Auditory interfaces: The human perceiver

    Science.gov (United States)

    Colburn, H. Steven

    1991-01-01

    A brief introduction to the basic auditory abilities of the human perceiver with particular attention toward issues that may be important for the design of auditory interfaces is presented. The importance of appropriate auditory inputs to observers with normal hearing is probably related to the role of hearing as an omnidirectional, early warning system and to its role as the primary vehicle for communication of strong personal feelings.

  12. Multiperson visual focus of attention from head pose and meeting contextual cues.

    Science.gov (United States)

    Ba, Sileye O; Odobez, Jean-Marc

    2011-01-01

    This paper introduces a novel contextual model for the recognition of people's visual focus of attention (VFOA) in meetings from audio-visual perceptual cues. More specifically, instead of independently recognizing the VFOA of each meeting participant from his own head pose, we propose to jointly recognize the participants' visual attention in order to introduce context-dependent interaction models that relate to group activity and the social dynamics of communication. Meeting contextual information is represented by the location of people, conversational events identifying floor holding patterns, and a presentation activity variable. By modeling the interactions between the different contexts and their combined and sometimes contradictory impact on the gazing behavior, our model allows us to handle VFOA recognition in difficult task-based meetings involving artifacts, presentations, and moving people. We validated our model through rigorous evaluation on a publicly available and challenging data set of 12 real meetings (5 hours of data). The results demonstrated that the integration of the presentation and conversation dynamical context using our model can lead to significant performance improvements.

  13. Auditory temporal processing skills in musicians with dyslexia.

    Science.gov (United States)

    Bishop-Liebler, Paula; Welch, Graham; Huss, Martina; Thomson, Jennifer M; Goswami, Usha

    2014-08-01

    The core cognitive difficulty in developmental dyslexia involves phonological processing, but adults and children with dyslexia also have sensory impairments. Impairments in basic auditory processing show particular links with phonological impairments, and recent studies with dyslexic children across languages reveal a relationship between auditory temporal processing and sensitivity to rhythmic timing and speech rhythm. As rhythm is explicit in music, musical training might have a beneficial effect on the auditory perception of acoustic cues to rhythm in dyslexia. Here we took advantage of the presence of musicians with and without dyslexia in musical conservatoires, comparing their auditory temporal processing abilities with those of dyslexic non-musicians matched for cognitive ability. Musicians with dyslexia showed equivalent auditory sensitivity to musicians without dyslexia and also showed equivalent rhythm perception. The data support the view that extensive rhythmic experience initiated during childhood (here in the form of music training) can affect basic auditory processing skills which are found to be deficient in individuals with dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.

  14. The effects of distraction and a brief intervention on auditory and visual-spatial working memory in college students with attention deficit hyperactivity disorder.

    Science.gov (United States)

    Lineweaver, Tara T; Kercood, Suneeta; O'Keeffe, Nicole B; O'Brien, Kathleen M; Massey, Eric J; Campbell, Samantha J; Pierce, Jenna N

    2012-01-01

    Two studies addressed how young adult college students with attention deficit hyperactivity disorder (ADHD) (n = 44) compare to their nonaffected peers (n = 42) on tests of auditory and visual-spatial working memory (WM), how vulnerable they are to auditory and visual distractions, and how they are affected by a simple intervention. Students with ADHD demonstrated worse auditory WM than did controls. A near-significant trend indicated that auditory distractions interfered with the visual WM of both groups and that, whereas controls were also vulnerable to visual distractions, visual distractions improved visual WM in the ADHD group. The intervention was ineffective. Limited correlations emerged between self-reported ADHD symptoms and objective test performances; students with ADHD who perceived themselves as more symptomatic often had better WM and were less vulnerable to distractions than their ADHD peers.

  15. Multisensory brand search: How the meaning of sounds guides consumers' visual attention.

    Science.gov (United States)

    Knoeferle, Klemens M; Knoeferle, Pia; Velasco, Carlos; Spence, Charles

    2016-06-01

    Building on models of crossmodal attention, the present research proposes that brand search is inherently multisensory, in that the consumers' visual search for a specific brand can be facilitated by semantically related stimuli that are presented in another sensory modality. A series of 5 experiments demonstrates that the presentation of spatially nonpredictive auditory stimuli associated with products (e.g., usage sounds or product-related jingles) can crossmodally facilitate consumers' visual search for, and selection of, products. Eye-tracking data (Experiment 2) revealed that the crossmodal effect of auditory cues on visual search manifested itself not only in RTs, but also in the earliest stages of visual attentional processing, thus suggesting that the semantic information embedded within sounds can modulate the perceptual saliency of the target products' visual representations. Crossmodal facilitation was even observed for newly learnt associations between unfamiliar brands and sonic logos, implicating multisensory short-term learning in establishing audiovisual semantic associations. The facilitation effect was stronger when searching complex rather than simple visual displays, thus suggesting a modulatory role of perceptual load. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Differential Contributions of Nucleus Accumbens Subregions to Cue-Guided Risk/Reward Decision Making and Implementation of Conditional Rules.

    Science.gov (United States)

    Floresco, Stan B; Montes, David R; Tse, Maric M T; van Holstein, Mieke

    2018-02-21

    The nucleus accumbens (NAc) is a key node within corticolimbic circuitry for guiding action selection and cost/benefit decision making in situations involving reward uncertainty. Preclinical studies have typically assessed risk/reward decision making using assays where decisions are guided by internally generated representations of choice-outcome contingencies. Yet, real-life decisions are often influenced by external stimuli that inform about likelihoods of obtaining rewards. How different subregions of the NAc mediate decision making in such situations is unclear. Here, we used a novel assay colloquially termed the "Blackjack" task that models these types of situations. Male Long-Evans rats were trained to choose between one lever that always delivered a one-pellet reward and another that delivered four pellets with different probabilities [either 50% (good-odds) or 12.5% (poor-odds)], which were signaled by one of two auditory cues. Under control conditions, rats selected the large/risky option more often on good-odds versus poor-odds trials. Inactivation of the NAc core caused indiscriminate choice patterns. In contrast, NAc shell inactivation increased risky choice, more prominently on poor-odds trials. Additional experiments revealed that both subregions contribute to auditory conditional discrimination. NAc core or shell inactivation reduced Pavlovian approach elicited by an auditory CS+, yet shell inactivation also increased responding during presentation of a CS-. These data highlight distinct contributions for NAc subregions in decision making and reward seeking guided by discriminative stimuli. The core is crucial for implementation of conditional rules, whereas the shell refines reward seeking by mitigating the allure of larger, unlikely rewards and reducing expression of inappropriate or non-rewarded actions. SIGNIFICANCE STATEMENT Using external cues to guide decision making is crucial for adaptive behavior. 
Deficits in cue-guided behavior have been
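The choice structure of the "Blackjack" task described above reduces to a simple expected-value comparison: the risky lever is only advantageous, on average, when the cue signals good odds. A minimal sketch (the pellet counts and probabilities come from the abstract; the function name is illustrative):

```python
# Expected value of each lever in the cue-guided "Blackjack" task:
# one lever always pays 1 pellet; the risky lever pays 4 pellets with
# a cue-signaled probability (50% good-odds, 12.5% poor-odds).
def expected_pellets(p_win, payoff=4):
    """Average pellets per press on the probabilistic lever."""
    return p_win * payoff

SAFE = 1.0                           # guaranteed 1-pellet lever
good_odds = expected_pellets(0.5)    # 4 * 0.5   = 2.0 pellets
poor_odds = expected_pellets(0.125)  # 4 * 0.125 = 0.5 pellets
```

Since good_odds > SAFE > poor_odds, intact rats maximize reward by choosing the risky lever on good-odds trials and the safe lever on poor-odds trials, which is the discrimination the auditory cues support.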

  17. Training Basic Visual Attention Leads to Changes in Responsiveness to Social-Communicative Cues in 9-Month-Olds.

    Science.gov (United States)

    Forssman, Linda; Wass, Sam V

    2017-04-24

    This study investigated transfer effects of gaze-interactive attention training to more complex social and cognitive skills in infancy. Seventy 9-month-olds were assigned to a training group (n = 35) or an active control group (n = 35). Before, after, and at 6-week follow-up, both groups completed an assessment battery assessing transfer to nontrained aspects of attention control, including tabletop tasks measuring social attention in seminaturalistic contexts. Transfer effects were found on nontrained screen-based tasks but importantly also on a structured observation task assessing the infants' likelihood to respond to an adult's social-communication cues. The results causally link basic attention skills and more complex social-communicative skills and provide a principle for studying causal mechanisms of early development. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  18. Evidence for a shared representation of sequential cues that engage sign-tracking.

    Science.gov (United States)

    Smedley, Elizabeth B; Smith, Kyle S

    2018-06-19

    Sign-tracking is a phenomenon whereby cues that predict rewards come to acquire their own motivational value (incentive salience) and attract appetitive behavior. Typically, sign-tracking paradigms have used single auditory, visual, or lever cues presented prior to a reward delivery. Yet real-world events can often be predicted by a sequence of cues. We have shown that animals will sign-track to multiple cues presented in temporal sequence, and with time develop a bias in responding toward a reward-distal cue over a reward-proximal cue. Further, extinction of responding to the reward-proximal cue directly decreases responding to the reward-distal cue. One possible explanation of this result is that serial cues become representationally linked with one another. Here we provide further support for this by showing that extinction of responding to a reward-distal cue directly reduces responding to a reward-proximal cue. We suggest that the incentive salience of one cue can influence the incentive salience of the other cue. Copyright © 2018. Published by Elsevier B.V.

  19. Comparison of auditory and visual oddball fMRI in schizophrenia.

    Science.gov (United States)

    Collier, Azurii K; Wolf, Daniel H; Valdez, Jeffrey N; Turetsky, Bruce I; Elliott, Mark A; Gur, Raquel E; Gur, Ruben C

    2014-09-01

    Individuals with schizophrenia often suffer from attentional deficits, both in focusing on task-relevant targets and in inhibiting responses to distractors. Schizophrenia also has a differential impact on attention depending on modality: auditory or visual. However, it remains unclear how abnormal activation of attentional circuitry differs between auditory and visual modalities, as these two modalities have not been directly compared in the same individuals with schizophrenia. We utilized event-related functional magnetic resonance imaging (fMRI) to compare patterns of brain activation during an auditory and visual oddball task in order to identify modality-specific attentional impairment. Healthy controls (n=22) and patients with schizophrenia (n=20) completed auditory and visual oddball tasks in separate sessions. For responses to targets, the auditory modality yielded greater activation than the visual modality (A-V) in auditory cortex, insula, and parietal operculum, but visual activation was greater than auditory (V-A) in visual cortex. For responses to novels, A-V differences were found in auditory cortex, insula, and supramarginal gyrus; and V-A differences in the visual cortex, inferior temporal gyrus, and superior parietal lobule. Group differences in modality-specific activation were found only for novel stimuli; controls showed larger A-V differences than patients in prefrontal cortex and the putamen. Furthermore, for patients, greater severity of negative symptoms was associated with greater divergence of A-V novel activation in the visual cortex. Our results demonstrate that patients have more pronounced activation abnormalities in auditory compared to visual attention, and link modality-specific abnormalities to negative symptom severity. Copyright © 2014 Elsevier B.V. All rights reserved.
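The A-V and V-A comparisons reported in this abstract are, at their core, voxelwise subtractions of condition activation maps. A toy sketch of that operation (the beta values below are hypothetical, not data from the study):

```python
# Toy voxelwise modality contrast (A - V), assuming per-voxel activation
# estimates (e.g., GLM betas) for auditory and visual target responses.
# All values are illustrative placeholders.
auditory_betas = [2.1, 0.4, 1.8, 0.2]
visual_betas = [0.9, 1.6, 0.7, 1.9]

# A - V difference at each voxel
a_minus_v = [a - v for a, v in zip(auditory_betas, visual_betas)]

# Voxels preferring auditory over visual (candidates for the A-V map);
# the V-A map is simply the set with negative differences.
a_dominant = [i for i, d in enumerate(a_minus_v) if d > 0]
```

In practice these subtractions are computed over whole statistical maps and thresholded for significance, but the per-voxel arithmetic is the same.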

  20. Metronome cueing of walking reduces gait variability after a cerebellar stroke

    Directory of Open Access Journals (Sweden)

    Rachel Lindsey Wright

    2016-06-01

    Full Text Available Cerebellar stroke typically results in increased variability during walking. Previous research has suggested that auditory cueing reduces excessive variability in conditions such as Parkinson’s disease and post-stroke hemiparesis. The aim of this case report was to investigate whether the use of a metronome cue during walking could reduce excessive variability in gait parameters after a cerebellar stroke. An elderly female with a history of cerebellar stroke and recurrent falling undertook 3 standard gait trials and 3 gait trials with an auditory metronome. A Vicon system was used to collect 3-D marker trajectory data. The coefficient of variation was calculated for temporal and spatial gait parameters. Standard deviations of the joint angles were calculated and used to give a measure of joint kinematic variability. Step time, stance time and double support time variability were reduced with metronome cueing. Variability in the sagittal hip, knee and ankle angles was reduced to normal values when walking to the metronome. In summary, metronome cueing resulted in a decrease in variability for step, stance and double support times and joint kinematics. Further research is needed to establish whether a metronome may be useful in gait rehabilitation after cerebellar stroke, and whether this leads to a decreased risk of falling.
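
The coefficient of variation used in this case report is a standard normalized-variability measure, CV = SD / mean × 100. A minimal sketch of the computation (the step-time samples below are hypothetical, chosen only to illustrate the cued-versus-uncued comparison):

```python
# Coefficient of variation (CV = SD / mean * 100), the normalized
# variability measure applied here to temporal and spatial gait
# parameters. Step-time values are hypothetical, in seconds.
from statistics import mean, stdev

def coefficient_of_variation(values):
    return stdev(values) / mean(values) * 100.0

step_times_uncued = [0.62, 0.55, 0.71, 0.48, 0.66]
step_times_cued = [0.60, 0.58, 0.61, 0.59, 0.62]

cv_uncued = coefficient_of_variation(step_times_uncued)
cv_cued = coefficient_of_variation(step_times_cued)
# A lower CV under metronome cueing indicates reduced step-time
# variability, the pattern the case report describes.
```

Normalizing by the mean lets variability be compared across parameters with different magnitudes (e.g., step time versus stride length), which is why CV rather than raw SD is reported for the temporal and spatial measures.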