WorldWideScience

Sample records for auditory spatial cues

  1. Cueing Visual Attention to Spatial Locations With Auditory Cues

    OpenAIRE

    Kean, Matthew; Crawford, Trevor J

    2008-01-01

    We investigated exogenous and endogenous orienting of visual attention to the spatial location of an auditory cue. In Experiment 1, significantly faster saccades were observed to visual targets appearing ipsilateral, compared to contralateral, to the peripherally-presented cue. This advantage was greatest in an 80% target-at-cue (TAC) condition but equivalent in 20% and 50% TAC conditions. In Experiment 2, participants maintained central fixation while making an elevation judgment of the pe...

  2. Attention Modulates the Auditory Cortical Processing of Spatial and Category Cues in Naturalistic Auditory Scenes

    Science.gov (United States)

    Renvall, Hanna; Staeren, Noël; Barz, Claudia S.; Ley, Anke; Formisano, Elia

    2016-01-01

    This combined fMRI and MEG study investigated brain activations during listening and attending to natural auditory scenes. We first recorded, using in-ear microphones, vocal non-speech sounds, and environmental sounds that were mixed to construct auditory scenes containing two concurrent sound streams. During the brain measurements, subjects attended to one of the streams while spatial acoustic information of the scene was either preserved (stereophonic sounds) or removed (monophonic sounds). Compared to monophonic sounds, stereophonic sounds evoked larger blood-oxygenation-level-dependent (BOLD) fMRI responses in the bilateral posterior superior temporal areas, independent of which stimulus attribute the subject was attending to. This finding is consistent with the functional role of these regions in the (automatic) processing of auditory spatial cues. Additionally, significant differences in the cortical activation patterns depending on the target of attention were observed. Bilateral planum temporale and inferior frontal gyrus were preferentially activated when attending to stereophonic environmental sounds, whereas when subjects attended to stereophonic voice sounds, the BOLD responses were larger at the bilateral middle superior temporal gyrus and sulcus, previously reported to show voice sensitivity. In contrast, the time-resolved MEG responses were stronger for mono- than stereophonic sounds in the bilateral auditory cortices at ~360 ms after the stimulus onset when attending to the voice excerpts within the combined sounds. The observed effects suggest that during the segregation of auditory objects from the auditory background, spatial sound cues together with other relevant temporal and spectral cues are processed in an attention-dependent manner at the cortical locations generally involved in sound recognition. 
More synchronous neuronal activation during monophonic than stereophonic sound processing, as well as (local) neuronal inhibitory mechanisms in

  3. A Dominance Hierarchy of Auditory Spatial Cues in Barn Owls

    OpenAIRE

    Witten, Ilana B.; Knudsen, Phyllis F.; Knudsen, Eric I.

    2010-01-01

    BACKGROUND: Barn owls integrate spatial information across frequency channels to localize sounds in space. METHODOLOGY/PRINCIPAL FINDINGS: We presented barn owls with synchronous sounds that contained different bands of frequencies (3-5 kHz and 7-9 kHz) from different locations in space. When the owls were confronted with the conflicting localization cues from two synchronous sounds of equal level, their orienting responses were dominated by one of the sounds: they oriented toward the locatio...

  4. Selective attention modulates human auditory brainstem responses: relative contributions of frequency and spatial cues.

    Directory of Open Access Journals (Sweden)

    Alexandre Lehmann

    Full Text Available Selective attention is the mechanism that allows focusing one's attention on a particular stimulus while filtering out a range of other stimuli, for instance, on a single conversation in a noisy room. Attending to one sound source rather than another changes activity in the human auditory cortex, but it is unclear whether attention to different acoustic features, such as voice pitch and speaker location, modulates subcortical activity. Studies using a dichotic listening paradigm indicated that auditory brainstem processing may be modulated by the direction of attention. We investigated whether endogenous selective attention to one of two speech signals affects amplitude and phase locking in auditory brainstem responses when the signals were either discriminable by frequency content alone, or by frequency content and spatial location. Frequency-following responses to the speech sounds were significantly modulated in both conditions. The modulation was specific to the task-relevant frequency band. The effect was stronger when both frequency and spatial information were available. Patterns of response were variable between participants, and were correlated with psychophysical discriminability of the stimuli, suggesting that the modulation was biologically relevant. Our results demonstrate that auditory brainstem responses are susceptible to efferent modulation related to behavioral goals. Furthermore, they suggest that mechanisms of selective attention actively shape activity at early subcortical processing stages according to task relevance and based on frequency and spatial cues.

  5. Motor Training: Comparison of Visual and Auditory Coded Proprioceptive Cues

    Directory of Open Access Journals (Sweden)

    Philip Jepson

    2012-05-01

    Full Text Available Self-perception of body posture and movement is achieved through multi-sensory integration, particularly the utilisation of vision, and proprioceptive information derived from muscles and joints. Disruption to these processes can occur following a neurological accident, such as stroke, leading to sensory and physical impairment. Rehabilitation can be helped through use of augmented visual and auditory biofeedback to stimulate neuro-plasticity, but the effective design and application of feedback, particularly in the auditory domain, is non-trivial. Simple auditory feedback was tested by comparing the stepping accuracy of normal subjects when given a visual spatial target (step length) and an auditory temporal target (step duration). A baseline measurement of step length and duration was taken using optical motion capture. Subjects (n=20) took 20 ‘training’ steps (baseline ±25%) using either an auditory target (950 Hz tone, bell-shaped gain envelope) or visual target (spot marked on the floor) and were then asked to replicate the target step (length or duration, corresponding to training) with all feedback removed. Visual cues yielded a mean percentage error of 11.5% (SD ± 7.0%); auditory cues, a mean percentage error of 12.9% (SD ± 11.8%). Visual cues elicit a high degree of accuracy both in training and follow-up un-cued tasks; despite the novelty of the auditory cues for subjects, the mean accuracy of subjects approached that for visual cues, and initial results suggest a limited amount of practice using auditory cues can improve performance.
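
The accuracy measure reported above is a simple mean percentage error relative to the target. A minimal sketch of that computation; the 11.5%/12.9% figures come from the study, but the step data below are invented for illustration:

```python
def mean_percentage_error(targets, responses):
    """Mean absolute error of each response, as a percentage of its target."""
    errors = [abs(r - t) / t * 100.0 for t, r in zip(targets, responses)]
    return sum(errors) / len(errors)

# Hypothetical data: target step lengths (m) and the steps actually produced.
targets = [0.60, 0.75, 0.45]
responses = [0.66, 0.70, 0.48]
print(round(mean_percentage_error(targets, responses), 1))  # -> 7.8
```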

  6. Cross-modal cueing in audiovisual spatial attention

    OpenAIRE

    Blurton, Steven Paul; Greenlee, Mark W.; Gondan, Matthias

    2015-01-01

    Visual processing is most effective at the location of our attentional focus. It has long been known that various spatial cues can direct visuospatial attention and influence the detection of auditory targets. Cross-modal cueing, however, seems to depend on the type of the visual cue: facilitation effects have been reported for endogenous visual cues while exogenous cues seem to be mostly ineffective. In three experiments, we investigated cueing effects on the processing of audiovisual signal...

  7. Developing Spatial Knowledge in the Absence of Vision: Allocentric and Egocentric Representations Generated by Blind People When Supported by Auditory Cues

    Directory of Open Access Journals (Sweden)

    Luca Latini Corazzini

    2010-10-01

    Full Text Available The study of visuospatial representations and visuospatial memory can profit from the analysis of the performance of specific groups. In particular, the surprising skills and limitations of blind people may be an important source of information. For example, converging evidence indicates that, even though blind individuals are able to develop both egocentric and allocentric space representations, the latter tend to be much more restricted than those in blindfolded sighted individuals. However, no study has yet explored whether this conclusion also holds when people receive practice with the spatial environment and are supported by auditory stimuli. The present research examined these issues with the use of an experimental apparatus based on the Morris water maze (Morris et al., 1982). In this setup, blind people and blindfolded controls were given the opportunity to develop knowledge of the environment with the support of simultaneous auditory cues. The results show that even in this favourable case blind people spontaneously continue to rely on an egocentric spatial representation.

  8. Spatial auditory processing in pinnipeds

    Science.gov (United States)

    Holt, Marla M.

    Given the biological importance of sound for a variety of activities, pinnipeds must be able to obtain spatial information about their surroundings through acoustic input in the absence of other sensory cues. The three chapters of this dissertation address spatial auditory processing capabilities of pinnipeds in air given that these amphibious animals use acoustic signals for reproduction and survival on land. Two chapters are comparative lab-based studies that utilized psychophysical approaches conducted in an acoustic chamber. Chapter 1 addressed the frequency-dependent sound localization abilities at azimuth of three pinniped species (the harbor seal, Phoca vitulina, the California sea lion, Zalophus californianus, and the northern elephant seal, Mirounga angustirostris). While performances of the sea lion and harbor seal were consistent with the duplex theory of sound localization, the elephant seal, a low-frequency hearing specialist, showed a decreased ability to localize the highest frequencies tested. In Chapter 2, spatial release from masking (SRM), which occurs when a signal and masker are spatially separated, resulting in improvement in signal detectability relative to conditions in which they are co-located, was determined in a harbor seal and sea lion. Absolute and masked thresholds were measured at three frequencies and azimuths to determine the detection advantages afforded by this type of spatial auditory processing. Results showed that hearing sensitivity was enhanced by up to 19 and 12 dB in the harbor seal and sea lion, respectively, when the signal and masker were spatially separated. Chapter 3 was a field-based study that quantified both sender and receiver variables of the directional properties of male northern elephant seal calls produced within a communication system that serves to delineate dominance status. This included measuring call directivity patterns, observing male-male vocally-mediated interactions, and an acoustic playback study.
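
The duplex theory referenced above holds that interaural time differences (ITDs) dominate localization at low frequencies and interaural level differences at high frequencies. A sketch of the textbook Woodworth spherical-head ITD approximation; the formula and the human head radius are standard values, not taken from this dissertation (a pinniped head would need a different radius):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def woodworth_itd(azimuth_deg, head_radius=0.09):
    """Interaural time difference (seconds) for a spherical-head model.

    Woodworth's far-field approximation: ITD = (r/c) * (theta + sin(theta)),
    with theta the source azimuth in radians and r the head radius in meters.
    """
    theta = math.radians(azimuth_deg)
    return (head_radius / SPEED_OF_SOUND) * (theta + math.sin(theta))

# ITD grows monotonically from zero at the midline toward the side:
print(woodworth_itd(0.0))              # 0.0 s straight ahead
print(woodworth_itd(90.0) * 1e6)       # ~675 microseconds at 90 degrees
```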

  9. Retrosplenial Cortex Is Required for the Retrieval of Remote Memory for Auditory Cues

    Science.gov (United States)

    Todd, Travis P.; Mehlman, Max L.; Keene, Christopher S.; DeAngeli, Nicole E.; Bucci, David J.

    2016-01-01

    The retrosplenial cortex (RSC) has a well-established role in contextual and spatial learning and memory, consistent with its known connectivity with visuo-spatial association areas. In contrast, RSC appears to have little involvement with delay fear conditioning to an auditory cue. However, all previous studies have examined the contribution of…

  10. Tactile feedback improves auditory spatial localization

    OpenAIRE

    Gori, Monica; Vercillo, Tiziana; Sandini, Giulio; Burr, David

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds b...

  11. Tactile feedback improves auditory spatial localization

    OpenAIRE

    Gori, Monica; Vercillo, Tiziana; Sandini, Giulio; Burr, David

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial-bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds b...

  12. The plastic ear and perceptual relearning in auditory spatial perception.

    Science.gov (United States)

    Carlile, Simon

    2014-01-01

    The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear molds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10-60 days) performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localization, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear molds or through virtual auditory space stimulation using non-individualized spectral cues. The work with ear molds demonstrates that a relatively short period of training involving audio-motor feedback (5-10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide spatial cues but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses. PMID:25147497

  13. The plastic ear and perceptual relearning in auditory spatial perception.

    Science.gov (United States)

    Carlile, Simon

    2014-01-01

    The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear molds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10-60 days) performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localization, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear molds or through virtual auditory space stimulation using non-individualized spectral cues. The work with ear molds demonstrates that a relatively short period of training involving audio-motor feedback (5-10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide spatial cues but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses.

  14. Designing auditory cues for Parkinson's disease gait rehabilitation.

    Science.gov (United States)

    Cancela, Jorge; Moreno, Eugenio M; Arredondo, Maria T; Bonato, Paolo

    2014-01-01

    Recent work has shown that Parkinson's disease (PD) patients can benefit greatly from rehabilitation exercises based on audio cueing and music therapy. In particular, gait can benefit from repetitive sessions of exercises using auditory cues. Nevertheless, most experiments have relied on a metronome as the auditory stimulus. In this work, Human-Computer Interaction methodologies were used to design new cues that could sustain the long-term engagement of PD patients in these repetitive routines. The study was also extended to commercial music and musical pieces by analyzing features and characteristics that could improve the engagement of PD patients in rehabilitation tasks. PMID:25571327

  15. The plastic ear and perceptual relearning in auditory spatial perception.

    Directory of Open Access Journals (Sweden)

    Carlile, Simon

    2014-08-01

    Full Text Available The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear moulds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10-60 days) performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localisation, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear moulds or through virtual auditory space stimulation using non-individualised spectral cues. The work with ear moulds demonstrates that a relatively short period of training involving sensory-motor feedback (5-10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide a spatial code but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses.

  16. Auditory rhythmic cueing in movement rehabilitation: findings and possible mechanisms

    Science.gov (United States)

    Schaefer, Rebecca S.

    2014-01-01

    Moving to music is intuitive and spontaneous, and music is widely used to support movement, most commonly during exercise. Auditory cues are increasingly also used in the rehabilitation of disordered movement, by aligning actions to sounds such as a metronome or music. Here, the effect of rhythmic auditory cueing on movement is discussed and representative findings of cued movement rehabilitation are considered for several movement disorders, specifically post-stroke motor impairment, Parkinson's disease and Huntington's disease. There are multiple explanations for the efficacy of cued movement practice. Potentially relevant, non-mutually exclusive mechanisms include the acceleration of learning; qualitatively different motor learning owing to an auditory context; effects of increased temporal skills through rhythmic practices and motivational aspects of musical rhythm. Further considerations of rehabilitation paradigm efficacy focus on specific movement disorders, intervention methods and complexity of the auditory cues. Although clinical interventions using rhythmic auditory cueing do not show consistently positive results, it is argued that internal mechanisms of temporal prediction and tracking are crucial, and further research may inform rehabilitation practice to increase intervention efficacy. PMID:25385780
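
The internal temporal prediction and tracking invoked above is often modeled in the sensorimotor-synchronization literature as linear phase correction. A minimal sketch under that standard assumption; the model form and all parameter values are illustrative, not taken from this review or fitted to patient data:

```python
def simulate_tapping(n_taps=20, alpha=0.5, initial_async=0.080):
    """Linear phase-correction model of tapping to a rhythmic auditory cue.

    Each tap corrects a fraction `alpha` of the previous tap-to-cue
    asynchrony, so asynchrony[k+1] = (1 - alpha) * asynchrony[k].
    With 0 < alpha < 1 the timing error decays toward zero (stable tracking).
    """
    asynchronies = [initial_async]
    for _ in range(n_taps - 1):
        asynchronies.append((1.0 - alpha) * asynchronies[-1])
    return asynchronies

asyncs = simulate_tapping()
# An initial 80 ms asynchrony shrinks rapidly under phase correction:
print(f"start: {asyncs[0] * 1000:.1f} ms, end: {asyncs[-1] * 1000:.6f} ms")
```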

  17. Visual–auditory spatial processing in auditory cortical neurons

    OpenAIRE

    Bizley, Jennifer K.; King, Andrew J

    2008-01-01

    Neurons responsive to visual stimulation have now been described in the auditory cortex of various species, but their functions are largely unknown. Here we investigate the auditory and visual spatial sensitivity of neurons recorded in 5 different primary and non-primary auditory cortical areas of the ferret. We quantified the spatial tuning of neurons by measuring the responses to stimuli presented across a range of azimuthal positions and calculating the mutual information (MI) between the ...

  18. Auditory spatial localization: Developmental delay in children with visual impairments.

    Science.gov (United States)

    Cappagli, Giulia; Gori, Monica

    2016-01-01

    For individuals with visual impairments, auditory spatial localization is one of the most important features to navigate in the environment. Many works suggest that blind adults show similar or even enhanced performance for localization of auditory cues compared to sighted adults (Collignon, Voss, Lassonde, & Lepore, 2009). To date, the investigation of auditory spatial localization in children with visual impairments has provided contrasting results. Here we report, for the first time, that contrary to visually impaired adults, children with low vision or total blindness show a significant impairment in the localization of static sounds. These results suggest that simple auditory spatial tasks are compromised in children, and that this capacity recovers over time. PMID:27002960

  19. An auditory cue-depreciation effect.

    Science.gov (United States)

    Gibson, J M; Watkins, M J

    1991-01-01

    An experiment is reported in which subjects first heard a list of words and then tried to identify these same words from degraded utterances. Paralleling previous findings in the visual modality, the probability of identifying a given utterance was reduced when the utterance was immediately preceded by other, more degraded, utterances of the same word. A second experiment replicated this "cue-depreciation effect" and in addition found the effect to be weakened, if not eliminated, when the target word was not included in the initial list or when the test was delayed by two days.

  20. Cross-modal cueing in audiovisual spatial attention

    DEFF Research Database (Denmark)

    Blurton, Steven Paul; Greenlee, Mark W.; Gondan, Matthias

    2015-01-01

    effects have been reported for endogenous visual cues while exogenous cues seem to be mostly ineffective. In three experiments, we investigated cueing effects on the processing of audiovisual signals. In Experiment 1 we used endogenous cues to investigate their effect on the detection of auditory, visual, and audiovisual targets presented with onset asynchrony. Consistent cueing effects were found in all target conditions. In Experiment 2 we used exogenous cues and found cueing effects only for visual target detection, but not auditory target detection. In Experiment 3 we used predictive exogenous cues to examine...

  1. Spatial audition in a static virtual environment: the role of auditory-visual interaction

    Directory of Open Access Journals (Sweden)

    Isabelle Viaud-Delmon

    2009-04-01

    Full Text Available The integration of the auditory modality in virtual reality environments is known to promote the sensations of immersion and presence. However, it is also known from psychophysics studies that auditory-visual interaction obeys complex rules and that multisensory conflicts may disrupt the participant's engagement with the presented virtual scene. It is thus important to measure the accuracy of the auditory spatial cues reproduced by the auditory display and their consistency with the spatial visual cues. This study evaluates auditory localization performances under various unimodal and auditory-visual bimodal conditions in a virtual reality (VR) setup using a stereoscopic display and binaural reproduction over headphones in static conditions. The auditory localization performances observed in the present study are in line with those reported in real conditions, suggesting that VR gives rise to consistent auditory and visual spatial cues. These results validate the use of VR for future psychophysics experiments with auditory and visual stimuli. They also emphasize the importance of a spatially accurate auditory and visual rendering for VR setups.

  2. Tactile feedback improves auditory spatial localization.

    Science.gov (United States)

    Gori, Monica; Vercillo, Tiziana; Sandini, Giulio; Burr, David

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds before and after training, either with tactile feedback, verbal feedback, or no feedback. Audio thresholds were first measured with a spatial bisection task: subjects judged whether the second sound of a three sound sequence was spatially closer to the first or the third sound. The tactile feedback group underwent two audio-tactile feedback sessions of 100 trials, where each auditory trial was followed by the same spatial sequence played on the subject's forearm; auditory spatial bisection thresholds were evaluated after each session. In the verbal feedback condition, the positions of the sounds were verbally reported to the subject after each feedback trial. The no feedback group did the same sequence of trials, with no feedback. Performance improved significantly only after audio-tactile feedback. The results suggest that direct tactile feedback interacts with the auditory spatial localization system, possibly by a process of cross-sensory recalibration. Control tests with the subject rotated suggested that this effect occurs only when the tactile and acoustic sequences are spatially congruent. Our results suggest that the tactile system can be used to recalibrate the auditory sense of space. These results encourage the possibility of designing rehabilitation programs to help blind persons establish a robust auditory sense of space, through training with the tactile modality. PMID:25368587
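
The spatial bisection judgment described above is straightforward to formalize: the listener reports whether the middle sound of the three-sound sequence lies closer to the first or the third source position. A minimal sketch; the positions in degrees are hypothetical, and this is not the authors' analysis code:

```python
def bisection_response(pos1, pos2, pos3):
    """Report whether the middle sound of a three-sound sequence is
    spatially closer to the first or the third source (azimuth, degrees)."""
    return "first" if abs(pos2 - pos1) < abs(pos2 - pos3) else "third"

# Hypothetical sequence spanning -25 to +25 degrees, middle sound at -5:
print(bisection_response(-25.0, -5.0, 25.0))  # -> first (20 vs 30 degrees)
```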

  3. Tactile feedback improves auditory spatial localization

    Directory of Open Access Journals (Sweden)

    Gori, Monica

    2014-10-01

    Full Text Available Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial-bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds before and after training, either with tactile feedback, verbal feedback or no feedback. Audio thresholds were first measured with a spatial bisection task: subjects judged whether the second sound of a three sound sequence was spatially closer to the first or the third sound. The tactile-feedback group underwent two audio-tactile feedback sessions of 100 trials, where each auditory trial was followed by the same spatial sequence played on the subject’s forearm; auditory spatial bisection thresholds were evaluated after each session. In the verbal-feedback condition, the positions of the sounds were verbally reported to the subject after each feedback trial. The no-feedback group did the same sequence of trials, with no feedback. Performance improved significantly only after audio-tactile feedback. The results suggest that direct tactile feedback interacts with the auditory spatial localization system, possibly by a process of cross-sensory recalibration. Control tests with the subject rotated suggested that this effect occurs only when the tactile and acoustic sequences are spatially coherent. Our results suggest that the tactile system can be used to recalibrate the auditory sense of space. These results encourage the possibility of designing rehabilitation programs to help blind persons establish a robust auditory sense of space, through training with the tactile modality.

  4. The effect of exogenous spatial attention on auditory information processing.

    OpenAIRE

    Kanai, Kenichi; Ikeda, Kazuo; Tayama, Tadayuki

    2007-01-01

    This study investigated the effect of exogenous spatial attention on auditory information processing. In Experiments 1, 2 and 3, temporal order judgment tasks were performed to examine the effect. In Experiments 1 and 2, a cue tone was presented to either the left or right ear, followed by sequential presentation of two target tones. The subjects judged the order of presentation of the target tones. The results showed that subjects heard both tones simultaneously when the target tone, which wa...

  5. Auditory Cues Used for Wayfinding in Urban Environments by Individuals with Visual Impairments

    Science.gov (United States)

    Koutsoklenis, Athanasios; Papadopoulos, Konstantinos

    2011-01-01

    The study presented here examined which auditory cues individuals with visual impairments use more frequently and consider to be the most important for wayfinding in urban environments. It also investigated the ways in which these individuals use the most significant auditory cues. (Contains 1 table and 3 figures.)

  6. Effects of incongruent auditory and visual room-related cues on sound externalization

    DEFF Research Database (Denmark)

    Carvajal, Juan Camilo Gil; Santurette, Sébastien; Cubick, Jens;

    recorded [1,2,3]. This may be due to incongruent auditory cues between the recording and playback room during sound reproduction [2]. Alternatively, an expectation effect caused by the visual impression of the room may affect the position of the perceived auditory image [3]. Here, we systematically investigated whether incongruent auditory and visual room-related cues affected sound externalization in terms of perceived distance, azimuthal localization, and compactness....

  7. A review on auditory space adaptations to altered head-related cues

    Directory of Open Access Journals (Sweden)

    Catarina Mendonça

    2014-07-01

    In this article we present a review of current literature on adaptations to altered head-related auditory localization cues. Localization cues can be altered through ear blocks, ear molds, electronic hearing devices and altered head-related transfer functions. Three main methods have been used to induce auditory space adaptation: sound exposure, training with feedback, and explicit training. Adaptations induced by training, rather than exposure, are consistently faster. Studies on localization with altered head-related cues have reported poor initial localization, but improved accuracy and discriminability with training. Also, studies that displaced the auditory space by altering cue values reported adaptations in perceived source position to compensate for such displacements. Auditory space adaptations can last for a few months even without further contact with the learned cues. In most studies, localization with the subject’s own unaltered cues remained intact despite the adaptation to a second set of cues. Generalization is observed from trained to untrained sound source positions, but there is mixed evidence regarding cross-frequency generalization. Multiple brain areas might be involved in auditory space adaptation processes, but the auditory cortex may play a critical role. Auditory space plasticity may involve context-dependent cue reweighting.

  8. Guiding a Person with Blindness and Intellectual Disability in Indoor Travel with Fewer Auditory Cues.

    Science.gov (United States)

    Lancioni, Giulio E.; O'Reilly, Mark F.; Oliva, Doretta; Bracalente, Sandro

    1998-01-01

    This study assessed the possibility of guiding a person with blindness and intellectual disability during indoor travel with fewer auditory cues. Results indicated that infrequent presentation of the cues and the provision of extra cues in case of errors maintained high levels of independent moves, albeit of increased duration. (Author/CR)

  9. Auditory-visual spatial interaction and modularity

    Science.gov (United States)

    Radeau, M

    1994-02-01

    Results concerning the conditions for pairing visual and auditory data coming from spatially separate locations argue for cognitive impenetrability and computational autonomy, the pairing rules being the Gestalt principles of common fate and proximity. Other data provide evidence for pairing with several properties of modular functioning. Arguments for domain specificity are inferred from comparison with audio-visual speech. A suggestion of innate specification can be found in developmental data indicating that the grouping of visual and auditory signals is supported very early in life by the same principles that operate in adults. Support for a specific neural architecture comes from neurophysiological studies of the bimodal (auditory-visual) neurons of the cat superior colliculus. Auditory-visual pairing thus seems to present the four main properties of the Fodorian module.

  10. Capuchin monkeys (Cebus apella) use positive, but not negative auditory cues to infer food location

    OpenAIRE

    Heimbauer, Lisa A.; Antworth, Rebecca L.; Owren, Michael J.

    2011-01-01

    Nonhuman primates appear to capitalize more effectively on visual cues than corresponding auditory versions. For example, studies of inferential reasoning have shown that monkeys and apes readily respond to seeing that food is present (“positive” cuing) or absent (“negative” cuing). Performance is markedly less effective with auditory cues, with many subjects failing to use this input. Extending recent work, we tested eight captive tufted capuchins (Cebus apella) in locating food using positi...

  11. The possible price of auditory cueing: influence on obstacle avoidance in Parkinson's disease.

    NARCIS (Netherlands)

    Nanhoe-Mahabier, S.W.; Delval, A.; Snijders, A.H.; Weerdesteijn, V.G.; Overeem, S.; Bloem, B.R.

    2012-01-01

    BACKGROUND: Under carefully controlled conditions, rhythmic auditory cueing can improve gait in patients with Parkinson's disease (PD). In complex environments, attention paid to cueing might adversely affect gait, for example when a simultaneous task-such as avoiding obstacles-has to be executed. W

  12. A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images

    OpenAIRE

    Varnet, Léo; Knoblauch, Kenneth; Serniclaes, Willy; Meunier, Fanny; Hoen, Michel

    2015-01-01

    Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remains undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique t...

  13. Auditory Verbal Cues Alter the Perceived Flavor of Beverages and Ease of Swallowing: A Psychometric and Electrophysiological Analysis

    OpenAIRE

    Aya Nakamura; Satoshi Imaizumi

    2013-01-01

    We investigated the possible effects of auditory verbal cues on flavor perception and swallow physiology for younger and elder participants. Apple juice, aojiru (grass) juice, and water were ingested with or without auditory verbal cues. Flavor perception and ease of swallowing were measured using a visual analog scale and swallow physiology by surface electromyography and cervical auscultation. The auditory verbal cues had significant positive effects on flavor and ease of swallowing as well...

  14. Sonic morphology: Aesthetic dimensional auditory spatial awareness

    Science.gov (United States)

    Whitehouse, Martha M.

    The sound and ceramic sculpture installation, "Skirting the Edge: Experiences in Sound & Form," is an integration of art and science demonstrating the concept of sonic morphology. "Sonic morphology" is herein defined as aesthetic three-dimensional auditory spatial awareness. The exhibition explicates my empirical phenomenal observations that sound has a three-dimensional form. Composed of ceramic sculptures that allude to different social and physical situations, coupled with sound compositions that enhance and create a three-dimensional auditory and visual aesthetic experience (see accompanying DVD), the exhibition supports the research question, "What is the relationship between sound and form?" Precisely how people aurally experience three-dimensional space involves an integration of spatial properties, auditory perception, individual history, and cultural mores. People also utilize environmental sound events as a guide in social situations and in remembering their personal history, as well as a guide in moving through space. Aesthetically, sound affects the fascination, meaning, and attention one has within a particular space. Sonic morphology brings art forms such as a movie, video, sound composition, and musical performance into the cognitive scope by generating meaning from the link between the visual and auditory senses. This research examined sonic morphology as an extension of musique concrète, sound as object, originating in Pierre Schaeffer's work in the 1940s. Pointing, as John Cage did, to the corporeal three-dimensional experience of "all sound," I composed works that took their total form only through the perceiver-participant's participation in the exhibition. While contemporary artist Alvin Lucier creates artworks that draw attention to making sound visible, "Skirting the Edge" engages the perceiver-participant visually and aurally, leading to recognition of sonic morphology.

  15. Valid cues for auditory or somatosensory targets affect their perception: a signal detection approach.

    Science.gov (United States)

    Van Hulle, Lore; Van Damme, Stefaan; Crombez, Geert

    2013-01-01

    We investigated the effects of focusing attention towards auditory or somatosensory stimuli on perceptual sensitivity and response bias using a signal detection task. Participants (N = 44) performed an unspeeded detection task in which weak (individually calibrated) somatosensory or auditory stimuli were delivered. The focus of attention was manipulated by the presentation of a visual cue at the start of each trial. The visual cue consisted of the word "warmth" or the word "tone". This word cue was predictive of the corresponding target on two-thirds of the trials. As hypothesised, the results showed that cueing attention to a specific sensory modality resulted in a higher perceptual sensitivity for validly cued targets than for invalidly cued targets, as well as in a more liberal response criterion for reporting stimuli in the valid modality than in the invalid modality. The value of this experimental paradigm for investigating excessive attentional focus or hypervigilance in various non-clinical and clinical populations is discussed.
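    The sensitivity and response-bias measures reported in this abstract are the standard signal-detection quantities d′ and criterion c. As an illustrative sketch (not the authors' analysis code; the hit and false-alarm rates below are hypothetical), they can be computed from hit and false-alarm rates:

    ```python
    from statistics import NormalDist

    def sdt_measures(hit_rate, fa_rate):
        """Compute signal-detection sensitivity (d') and criterion (c).

        A lower (more negative) c indicates a more liberal response bias,
        as reported for the validly cued modality in this study.
        """
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))
        return d_prime, criterion

    # Hypothetical rates for validly vs. invalidly cued trials
    d_valid, c_valid = sdt_measures(0.80, 0.20)
    d_invalid, c_invalid = sdt_measures(0.65, 0.25)
    print(round(d_valid, 2), round(d_invalid, 2))  # valid cueing yields higher d'
    ```

    With these made-up rates, valid cueing yields both a higher d′ and a more liberal (lower) criterion than invalid cueing, mirroring the pattern of results described above.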

  16. Auditory Verbal Cues Alter the Perceived Flavor of Beverages and Ease of Swallowing: A Psychometric and Electrophysiological Analysis

    Directory of Open Access Journals (Sweden)

    Aya Nakamura

    2013-01-01

    We investigated the possible effects of auditory verbal cues on flavor perception and swallow physiology for younger and elder participants. Apple juice, aojiru (grass) juice, and water were ingested with or without auditory verbal cues. Flavor perception and ease of swallowing were measured using a visual analog scale and swallow physiology by surface electromyography and cervical auscultation. The auditory verbal cues had significant positive effects on flavor and ease of swallowing as well as on swallow physiology. The taste score and the ease of swallowing score significantly increased when the participant’s anticipation was primed by accurate auditory verbal cues. There was no significant effect of auditory verbal cues on distaste score. Regardless of age, the maximum suprahyoid muscle activity significantly decreased when a beverage was ingested without auditory verbal cues. The interval between the onset of swallowing sounds and the peak timing point of the infrahyoid muscle activity significantly shortened when the anticipation induced by the cue was contradicted in the elderly participant group. These results suggest that auditory verbal cues can improve the perceived flavor of beverages and swallow physiology.

  17. Auditory cueing in Parkinson's patients with freezing of gait. What matters most: Action-relevance or cue-continuity?

    Science.gov (United States)

    Young, William R; Shreve, Lauren; Quinn, Emma Jane; Craig, Cathy; Bronte-Stewart, Helen

    2016-07-01

    Gait disturbances are a common feature of Parkinson's disease, one of the most severe being freezing of gait. Sensory cueing is a common method used to facilitate stepping in people with Parkinson's. Recent work has shown that, compared to walking to a metronome, Parkinson's patients without freezing of gait (nFOG) showed reduced gait variability when imitating recorded sounds of footsteps made on gravel. However, it is not known if these benefits are realised through the continuity of the acoustic information or the action-relevance. Furthermore, no study has examined if these benefits extend to PD with freezing of gait. We prepared four different auditory cues (varying in action-relevance and acoustic continuity) and asked 19 Parkinson's patients (10 nFOG, 9 with freezing of gait (FOG)) to step in place to each cue. Results showed a superiority of action-relevant cues (regardless of cue-continuity) for inducing reductions in Step coefficient of variation (CV). Acoustic continuity was associated with a significant reduction in Swing CV. Neither cue-continuity nor action-relevance was independently sufficient to increase the time spent stepping before freezing. However, combining both attributes in the same cue did yield significant improvements. This study demonstrates the potential of using action-sounds as sensory cues for Parkinson's patients with freezing of gait. We suggest that the improvements shown might be considered audio-motor 'priming' (i.e., listening to the sounds of footsteps will engage sensorimotor circuitry relevant to the production of that same action, thus effectively bypassing the defective basal ganglia). PMID:27163397
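    The gait-variability outcome used in this study, the coefficient of variation (CV) of step timing, is the standard deviation expressed as a percentage of the mean. A minimal sketch (the step intervals below are made up for illustration, not data from the study):

    ```python
    from statistics import mean, stdev

    def step_cv(intervals):
        """Coefficient of variation (%) of a series of step intervals (s).

        Lower CV indicates a steadier stepping rhythm, the kind of
        improvement reported for action-relevant auditory cues.
        """
        return stdev(intervals) / mean(intervals) * 100.0

    uncued = [0.52, 0.61, 0.48, 0.66, 0.50, 0.63]  # hypothetical, variable
    cued   = [0.55, 0.57, 0.54, 0.58, 0.56, 0.55]  # hypothetical, steadier
    print(f"uncued CV = {step_cv(uncued):.1f}%, cued CV = {step_cv(cued):.1f}%")
    ```

    Normalizing by the mean makes CV comparable across patients who walk at different cadences, which is why it is preferred over raw standard deviation for gait variability.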

  18. Visual and Auditory Cue Effects on Risk Assessment in a Highway Training Simulation

    NARCIS (Netherlands)

    Toet, A.; Houtkamp, J.M.; Meulen, van der R.

    2013-01-01

    We investigated whether manipulation of visual and auditory depth and speed cues can affect a user’s sense of risk for a low-cost nonimmersive virtual environment (VE) representing a highway environment with traffic incidents. The VE is currently used in an examination program to assess procedural k

  19. Volume Attenuation and High Frequency Loss as Auditory Depth Cues in Stereoscopic 3D Cinema

    Science.gov (United States)

    Manolas, Christos; Pauletto, Sandra

    2014-09-01

    Assisted by the technological advances of the past decades, stereoscopic 3D (S3D) cinema is currently in the process of being established as a mainstream form of entertainment. The main focus of this collaborative effort is placed on the creation of immersive S3D visuals. However, with few exceptions, little attention has been given so far to the potential effect of the soundtrack on such environments. The potential of sound both as a means to enhance the impact of the S3D visual information and to expand the S3D cinematic world beyond the boundaries of the visuals is large. This article reports on our research into the possibilities of using auditory depth cues within the soundtrack as a means of affecting the perception of depth within cinematic S3D scenes. We study two main distance-related auditory cues: high-end frequency loss and overall volume attenuation. A series of experiments explored the effectiveness of these auditory cues. Results, although not conclusive, indicate that the studied auditory cues can influence the audience judgement of depth in cinematic 3D scenes, sometimes in unexpected ways. We conclude that 3D filmmaking can benefit from further studies on the effectiveness of specific sound design techniques to enhance S3D cinema.
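    The two distance cues studied, overall volume attenuation and high-frequency loss, can be approximated with the inverse-distance law and a distance-dependent low-pass cutoff. The mapping below is an illustrative sketch; the cutoff constants are assumptions for demonstration, not values from the article:

    ```python
    import math

    def distance_cues(distance_m, ref_m=1.0):
        """Approximate auditory distance cues for a point source in free field.

        Returns (gain_db, lowpass_cutoff_hz):
        - gain_db: inverse-distance (1/r) attenuation, -6 dB per doubling
        - lowpass_cutoff_hz: crude stand-in for air-absorption high-frequency
          loss; the 20 kHz baseline and 500 Hz-per-metre slope are illustrative
        """
        gain_db = -20.0 * math.log10(distance_m / ref_m)
        cutoff_hz = max(2000.0, 20000.0 - 500.0 * (distance_m - ref_m))
        return gain_db, cutoff_hz

    for d in (1.0, 2.0, 8.0):
        g, c = distance_cues(d)
        print(f"{d:4.1f} m: {g:6.1f} dB, low-pass at {c:7.0f} Hz")
    ```

    In a sound-design context, a mixer would apply the returned gain and feed the signal through a low-pass filter at the returned cutoff to move a source "deeper" into the S3D scene.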

  20. Auditory gist: recognition of very short sounds from timbre cues.

    Science.gov (United States)

    Suied, Clara; Agus, Trevor R; Thorpe, Simon J; Mesgarani, Nima; Pressnitzer, Daniel

    2014-03-01

    Sounds such as the voice or musical instruments can be recognized on the basis of timbre alone. Here, sound recognition was investigated with severely reduced timbre cues. Short snippets of naturally recorded sounds were extracted from a large corpus. Listeners were asked to report a target category (e.g., sung voices) among other sounds (e.g., musical instruments). All sound categories covered the same pitch range, so the task had to be solved on timbre cues alone. The minimum duration for which performance was above chance was found to be short, on the order of a few milliseconds, with the best performance for voice targets. Performance was independent of pitch and was maintained when stimuli contained less than a full waveform cycle. Recognition was not generally better when the sound snippets were time-aligned with the sound onset compared to when they were extracted with a random starting time. Finally, performance did not depend on feedback or training, suggesting that the cues used by listeners in the artificial gating task were similar to those relevant for longer, more familiar sounds. The results show that timbre cues for sound recognition are available at a variety of time scales, including very short ones. PMID:24606276

  1. Getting the cue: sensory contributions to auditory emotion recognition impairments in schizophrenia.

    Science.gov (United States)

    Leitman, David I; Laukka, Petri; Juslin, Patrik N; Saccente, Erica; Butler, Pamela; Javitt, Daniel C

    2010-05-01

    Individuals with schizophrenia show reliable deficits in the ability to recognize emotions from vocal expressions. Here, we examined emotion recognition ability in 23 schizophrenia patients relative to 17 healthy controls using a stimulus battery with well-characterized acoustic features. We further evaluated performance deficits relative to ancillary assessments of underlying pitch perception abilities. As predicted, patients showed reduced emotion recognition ability across a range of emotions, which correlated with impaired basic tone matching abilities. Emotion identification deficits were strongly related to pitch-based acoustic cues such as mean and variability of fundamental frequency. Whereas healthy subjects' performance varied as a function of the relative presence or absence of these cues, with higher cue levels leading to enhanced performance, schizophrenia patients showed significantly less variation in performance as a function of cue level. In contrast to pitch-based cues, both groups showed equivalent variation in performance as a function of intensity-based cues. Finally, patients were less able than controls to differentiate between expressions with high and low emotion intensity, and this deficit was also correlated with impaired tone matching ability. Both emotion identification and intensity rating deficits were unrelated to valence of intended emotions. Deficits in both auditory emotion identification and more basic perceptual abilities correlated with impaired functional outcome. Overall, these findings support the concept that auditory emotion identification deficits in schizophrenia reflect, at least in part, a relative inability to process critical acoustic characteristics of prosodic stimuli and that such deficits contribute to poor global outcome. PMID:18791077

  2. Auditory and visual spatial impression: Recent studies of three auditoria

    Science.gov (United States)

    Nguyen, Andy; Cabrera, Densil

    2004-10-01

    Auditory spatial impression is widely studied for its contribution to auditorium acoustical quality. By contrast, visual spatial impression in auditoria has received relatively little attention in formal studies. This paper reports results from a series of experiments investigating the auditory and visual spatial impression of concert auditoria. For auditory stimuli, a fragment of an anechoic recording of orchestral music was convolved with calibrated binaural impulse responses, which had been made with the dummy head microphone at a wide range of positions in three auditoria and the sound source on the stage. For visual stimuli, greyscale photographs were used, taken at the same positions in the three auditoria, with a visual target on the stage. Subjective experiments were conducted with auditory stimuli alone, visual stimuli alone, and visual and auditory stimuli combined. In these experiments, subjects rated apparent source width, listener envelopment, intimacy and source distance (auditory stimuli), and spaciousness, envelopment, stage dominance, intimacy and target distance (visual stimuli). Results show target distance to be of primary importance in auditory and visual spatial impression-thereby providing a basis for covariance between some attributes of auditory and visual spatial impression. Nevertheless, some attributes of spatial impression diverge between the senses.

  3. An exploration of spatial auditory BCI paradigms with different sounds: music notes versus beeps.

    Science.gov (United States)

    Huang, Minqiang; Daly, Ian; Jin, Jing; Zhang, Yu; Wang, Xingyu; Cichocki, Andrzej

    2016-06-01

    Visual brain-computer interfaces (BCIs) are not suitable for people who cannot reliably maintain their eye gaze. Considering that this group usually maintains audition, an auditory based BCI may be a good choice for them. In this paper, we explore two auditory patterns: (1) a pattern utilizing symmetrical spatial cues with multiple frequency beeps [called the high low medium (HLM) pattern], and (2) a pattern utilizing non-symmetrical spatial cues with six tones derived from the diatonic scale [called the diatonic scale (DS) pattern]. These two patterns are compared to each other in terms of accuracy to determine which auditory pattern is better. The HLM pattern uses three different frequency beeps and has a symmetrical spatial distribution. The DS pattern uses six spoken stimuli, which are six notes solmizated as "do", "re", "mi", "fa", "sol" and "la", and derived from the diatonic scale. These six sounds are distributed to six, spatially distributed, speakers. Thus, we compare a BCI paradigm using beeps with another BCI paradigm using tones on the diatonic scale, when the stimuli are spatially distributed. Although no significant differences are found between the ERPs, the HLM pattern performs better than the DS pattern: the online accuracy achieved with the HLM pattern is significantly higher than that achieved with the DS pattern (p = 0.0028). PMID:27275376

  4. Deceptive Body Movements Reverse Spatial Cueing in Soccer

    OpenAIRE

    Wright, MJ; Jackson, RC

    2014-01-01

    The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants ...

  5. Auditory cues increase the hippocampal response to unimodal virtual reality.

    Science.gov (United States)

    Andreano, Joseph; Liang, Kevin; Kong, Lingjun; Hubbard, David; Wiederhold, Brenda K; Wiederhold, Mark D

    2009-06-01

    Previous research suggests that the effectiveness of virtual reality exposure therapy should increase as the experience becomes more immersive. However, the neural mechanisms underlying the experience of immersion are not yet well understood. To address this question, neural activity during exposure to two virtual worlds was measured by functional magnetic resonance imaging (fMRI). Two levels of immersion were used: unimodal (video only) and multimodal (video plus audio). The results indicated increased activity in both auditory and visual sensory cortices during multimodal presentation. Additionally, multimodal presentation elicited increased activity in the hippocampus, a region well known to be involved in learning and memory. The implications of this finding for exposure therapy are discussed. PMID:19500000

  6. Tactile Cueing as a Gravitational Substitute for Spatial Navigation During Parabolic Flight

    Science.gov (United States)

    Montgomery, K. L.; Beaton, K. H.; Barba, J. M.; Cackler, J. M.; Son, J. H.; Horsfield, S. P.; Wood, S. J.

    2010-01-01

    INTRODUCTION: Spatial navigation requires an accurate awareness of orientation in your environment. The purpose of this experiment was to examine how spatial awareness was impaired with changing gravitational cues during parabolic flight, and the extent to which vibrotactile feedback of orientation could be used to help improve performance. METHODS: Six subjects were restrained in a chair tilted relative to the plane floor, and placed at random positions during the start of the microgravity phase. Subjects reported their orientation using verbal reports, and used a hand-held controller to point to a desired target location presented using a virtual reality video mask. This task was repeated with and without constant tactile cueing of "down" direction using a belt of 8 tactors placed around the mid-torso. Control measures were obtained during ground testing using both upright and tilted conditions. RESULTS: Perceptual estimates of orientation and pointing accuracy were impaired during microgravity or during rotation about an upright axis in 1g. The amount of error was proportional to the amount of chair displacement. Perceptual errors were reduced during movement about a tilted axis on earth. CONCLUSIONS: Reduced perceptual errors during tilts in 1g indicate the importance of otolith and somatosensory cues for maintaining spatial awareness. Tactile cueing may improve navigation in operational environments or clinical populations, providing a non-visual non-auditory feedback of orientation or desired direction heading.

  7. Covert Auditory Spatial Orienting: An Evaluation of the Spatial Relevance Hypothesis

    Science.gov (United States)

    Roberts, Katherine L.; Summerfield, A. Quentin; Hall, Deborah A.

    2009-01-01

    The spatial relevance hypothesis (J. J. McDonald & L. M. Ward, 1999) proposes that covert auditory spatial orienting can only be beneficial to auditory processing when task stimuli are encoded spatially. We present a series of experiments that evaluate 2 key aspects of the hypothesis: (a) that "reflexive activation of location-sensitive neurons is…

  8. Interface Design Implications for Recalling the Spatial Configuration of Virtual Auditory Environments

    Science.gov (United States)

    McMullen, Kyla A.

    Although the concept of virtual spatial audio has existed for almost twenty-five years, only in the past fifteen years has modern computing technology enabled the real-time processing needed to deliver high-precision spatial audio. Furthermore, the concept of virtually walking through an auditory environment did not exist. The applications of such an interface have numerous potential uses. Spatial audio has the potential to be used in various manners ranging from enhancing sounds delivered in virtual gaming worlds to conveying spatial locations in real-time emergency response systems. To incorporate this technology in real-world systems, various concerns should be addressed. First, to widely incorporate spatial audio into real-world systems, head-related transfer functions (HRTFs) must be inexpensively created for each user. The present study further investigated an HRTF subjective selection procedure previously developed within our research group. Users discriminated auditory cues to subjectively select their preferred HRTF from a publicly available database. Next, the issue of training to find virtual sources was addressed. Listeners participated in a localization training experiment using their selected HRTFs. The training procedure was created from the characterization of successful search strategies in prior auditory search experiments. Search accuracy significantly improved after listeners performed the training procedure. Next, in the investigation of auditory spatial memory, listeners completed three search and recall tasks with differing recall methods. Recall accuracy significantly decreased in tasks that required the storage of sound source configurations in memory. To assess the impacts of practical scenarios, the present work assessed the performance effects of: signal uncertainty, visual augmentation, and different attenuation modeling. Fortunately, source uncertainty did not affect listeners' ability to recall or identify sound sources. 
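    Rendering a virtual source with HRTFs, as described above, amounts to convolving a mono signal with the left- and right-ear head-related impulse responses (HRIRs) for the chosen direction. A self-contained sketch with toy two-tap HRIRs (real HRIRs come from a measured database, such as the publicly available one mentioned in the abstract):

    ```python
    def convolve(signal, kernel):
        """Direct-form FIR convolution (pure Python, for illustration)."""
        out = [0.0] * (len(signal) + len(kernel) - 1)
        for i, s in enumerate(signal):
            for j, k in enumerate(kernel):
                out[i + j] += s * k
        return out

    def render_binaural(mono, hrir_left, hrir_right):
        """Spatialize a mono signal by filtering it through per-ear HRIRs."""
        return convolve(mono, hrir_left), convolve(mono, hrir_right)

    # Toy HRIRs for a source on the listener's left: the left ear receives
    # the sound earlier and louder than the (head-shadowed) right ear.
    hrir_l = [0.9, 0.1]
    hrir_r = [0.0, 0.4]  # leading zero models the interaural time delay
    left, right = render_binaural([1.0, 0.0, -1.0, 0.0], hrir_l, hrir_r)
    ```

    The interaural level and time differences introduced by the two filters are the cues listeners discriminate when subjectively selecting a preferred HRTF.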

  9. Deceptive body movements reverse spatial cueing in soccer.

    Directory of Open Access Journals (Sweden)

    Michael J Wright

    The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants were divided into higher-skilled (HS) and lower-skilled (LS) groups according to soccer experience. In experiment 1, accuracy on full video clips was higher than on point-light but results followed the same overall pattern. Both HS and LS groups correctly identified direction on normal moves at all occlusion levels. For deceptive moves, LS participants were significantly worse than chance and HS participants were somewhat more accurate but nevertheless substantially impaired. In experiment 2, point-light clips were used to cue a lateral target. HS and LS groups showed faster reaction times to targets that were congruent with the direction of normal turns, and to targets incongruent with the direction of deceptive turns. The reversed cueing by deceptive moves coincided with earlier kinematic events than cueing by normal moves. It is concluded that the body kinematics of soccer players generate spatial cueing effects when viewed from an opponent's perspective. This could create a reaction time advantage when anticipating the direction of a normal move. A deceptive move is designed to turn this cueing advantage into a disadvantage. Acting on the basis of advance information, the presence of deceptive moves primes responses in the wrong direction, which may be only partly mitigated by delaying a response until veridical cues emerge.

  10. Deceptive body movements reverse spatial cueing in soccer.

    Science.gov (United States)

    Wright, Michael J; Jackson, Robin C

    2014-01-01

    The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants were divided into higher-skilled (HS) and lower-skilled (LS) groups according to soccer experience. In experiment 1, accuracy on full video clips was higher than on point-light but results followed the same overall pattern. Both HS and LS groups correctly identified direction on normal moves at all occlusion levels. For deceptive moves, LS participants were significantly worse than chance and HS participants were somewhat more accurate but nevertheless substantially impaired. In experiment 2, point-light clips were used to cue a lateral target. HS and LS groups showed faster reaction times to targets that were congruent with the direction of normal turns, and to targets incongruent with the direction of deceptive turns. The reversed cueing by deceptive moves coincided with earlier kinematic events than cueing by normal moves. It is concluded that the body kinematics of soccer players generate spatial cueing effects when viewed from an opponent's perspective. This could create a reaction time advantage when anticipating the direction of a normal move. A deceptive move is designed to turn this cueing advantage into a disadvantage. Acting on the basis of advance information, the presence of deceptive moves primes responses in the wrong direction, which may be only partly mitigated by delaying a response until veridical cues emerge. PMID:25100444

  11. Positive effects of auditory cue in locomotor pattern of people with Parkinson’s disease (off and on medication)

    Directory of Open Access Journals (Sweden)

    Natalia Madalena Rinaldi

    2014-12-01

    Gait disorders are identified in people with Parkinson’s disease. The aim of this study was to investigate the effect of auditory cues and medication on kinematic, kinetic and EMG parameters during different gait phases in people with PD and healthy elderly. Thirty subjects, distributed in two groups (Group 1, PD patients off and on medication; Group 2, healthy elderly), participated in this study and were instructed to walk in two experimental conditions: non-cued and cued. Kinematic, kinetic and electromyographic analyses were used to investigate the locomotor pattern. Changes in locomotor pattern (greater muscular activity) with the auditory cue were observed for PD patients. Regarding medication, locomotor parameters improved after levodopa intake in association with the auditory cue. These results support the hypothesis that external cueing can be used as a complement to drug therapy to improve the locomotor pattern of PD patients.

  12. The effect of visual cues on auditory stream segregation in musicians and non-musicians.

    Directory of Open Access Journals (Sweden)

    Jeremy Marozeau

    BACKGROUND: The ability to separate two interleaved melodies is an important factor in music appreciation. This ability is greatly reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues, musical training or musical context could have an effect on this ability, and potentially improve music appreciation for the hearing impaired. METHODS: Musicians (N = 18) and non-musicians (N = 19) were asked to rate the difficulty of segregating a four-note repeating melody from interleaved random distracter notes. Visual cues were provided on half the blocks, and two musical contexts were tested, with the overlap between melody and distracter notes either gradually increasing or decreasing. CONCLUSIONS: Visual cues, musical training, and musical context all affected the difficulty of extracting the melody from a background of interleaved random distracter notes. Visual cues were effective in reducing the difficulty of segregating the melody from distracter notes, even in individuals with no musical training. These results are consistent with theories that indicate an important role for central (top-down) processes in auditory streaming mechanisms, and suggest that visual cues may help the hearing-impaired enjoy music.

  13. Facilitation of learning spatial relations among locations by visual cues: generality across spatial configurations.

    Science.gov (United States)

    Sturz, Bradley R; Kelly, Debbie M; Brown, Michael F

    2010-03-01

    Spatial pattern learning permits the learning of the location of objects in space relative to each other without reference to discrete visual landmarks or environmental geometry. In the present experiment, we investigated conditions that facilitate spatial pattern learning. Specifically, human participants searched in a real environment or interactive 3-D computer-generated virtual environment open-field search task for four hidden goal locations arranged in a diamond configuration located in a 5 x 5 matrix of raised bins. Participants were randomly assigned to one of three groups: Pattern Only, Landmark + Pattern, or Cues + Pattern. All participants experienced a Training phase followed by a Testing phase. Visual cues were coincident with the goal locations during Training only in the Cues + Pattern group whereas a single visual cue at a non-goal location maintained a consistent spatial relationship with the goal locations during Training only in the Landmark + Pattern group. All groups were then tested in the absence of visual cues. Results in both environments indicated that participants in all three groups learned the spatial configuration of goal locations. The presence of the visual cues during Training facilitated acquisition of the task for the Landmark + Pattern and Cues + Pattern groups compared to the Pattern Only group. During Testing the Landmark + Pattern and Cues + Pattern groups did not differ when their respective visual cues were removed. Furthermore, during Testing the performance of these two groups was superior to the Pattern Only group. Results generalize prior research to a different configuration of spatial locations, isolate spatial pattern learning as the process facilitated by visual cues, and indicate that the facilitation of learning spatial relations among locations by visual cues does not require coincident visual cues. PMID:19777275

  14. Confusing what you heard with what you did: False action-memories from auditory cues.

    Science.gov (United States)

    Lindner, Isabel; Henkel, Linda A

    2015-12-01

    Creating a mental image of one's own performance, observing someone else performing an action, and viewing a photograph of a completed action all can lead to the illusory recollection that one has performed this action. While there are fundamental differences in the nature of these three processes, they are aligned by the fact that they involve primarily or solely the visual modality. According to the source-monitoring framework, the corresponding visual memory traces later can be mistakenly attributed to self-performance. However, when people perform actions, they engage not only vision but also other modalities, such as the auditory and tactile systems. The present study focused on the role of audition in the creation of false beliefs about performing an action and explored whether auditory cues alone, in the absence of any visual cues, can induce false beliefs and memories for actions. After performing a series of simple actions, participants listened to the sound of someone performing various actions, watched someone perform the actions, or simultaneously both heard and saw someone perform them. Some of these actions had been performed earlier by the participants and others were new. A later source-memory test revealed that all three types of processing (hearing, seeing, or hearing plus seeing someone perform the actions) led to comparable increases in false claims of having performed actions oneself. The potential mechanisms underlying false action-memories from sound and vision are discussed. PMID:25925600

  15. Attention to sound improves auditory reliability in audio-tactile spatial optimal integration

    Directory of Open Access Journals (Sweden)

    Tiziana eVercillo

    2015-05-01

    The role of attention in multisensory processing is still poorly understood. In particular, it is unclear whether directing attention toward a sensory cue dynamically reweights cue reliability during integration of multiple sensory signals. In this study, we investigated the impact of attention on combining audio-tactile signals in an optimal fashion. We used the Maximum Likelihood Estimation (MLE) model to predict audio-tactile spatial localization on the body surface. We developed a new audio-tactile device composed of several small units, each consisting of a speaker and a tactile vibrator independently controllable by external software. We tested subjects in an attentional and a non-attentional condition. In the attention experiment, participants performed a dual-task paradigm: they were required to evaluate the duration of a sound while performing an audio-tactile spatial task. Three unisensory or multisensory stimuli (conflicting or non-conflicting sounds and vibrations) arranged along the horizontal axis were presented sequentially. In the primary task, subjects had to evaluate the position of the second stimulus (the probe) with respect to the others (a space bisection task). In the secondary task, they had to report occasional changes in the duration of the second auditory stimulus. In the non-attentional task, participants performed only the primary task (space bisection). Our results showed enhanced auditory precision (and auditory weights) in the auditory attentional condition with respect to the control non-attentional condition. Interestingly, in both conditions the multisensory results are well predicted by the MLE model. The results of this study support the idea that modality-specific attention modulates multisensory integration.
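    The MLE model invoked in this record reduces to inverse-variance weighting of the single-cue estimates. The sketch below illustrates the standard formulation; the function name and example values are illustrative and not taken from the study itself.

    ```python
    import numpy as np

    def mle_fusion(x_aud, sigma_aud, x_tac, sigma_tac):
        """Maximum-likelihood fusion of an auditory and a tactile position
        estimate: each cue is weighted by its inverse variance, and the fused
        estimate has lower variance than either cue alone."""
        w_aud = (1 / sigma_aud**2) / (1 / sigma_aud**2 + 1 / sigma_tac**2)
        w_tac = 1 - w_aud
        x_fused = w_aud * x_aud + w_tac * x_tac
        sigma_fused = np.sqrt((sigma_aud**2 * sigma_tac**2)
                              / (sigma_aud**2 + sigma_tac**2))
        return x_fused, w_aud, sigma_fused
    ```

    On this account, if attending to sound improves auditory precision (smaller sigma_aud), the auditory weight w_aud rises automatically, which is the reweighting pattern the study reports.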

  16. Task-dependent calibration of auditory spatial perception through environmental visual observation

    OpenAIRE

    Luca Brayda

    2015-01-01

    Visual information is paramount to space perception, and vision influences auditory space estimation. Many studies show that simultaneous visual and auditory cues improve the precision of the final multisensory estimate. However, the amount or temporal extent of visual information that is sufficient to influence auditory perception is still unknown. It is therefore interesting to know whether vision can improve auditory precision through a short-term environmental observation preceding the audio tas...

  17. Encoding of sound localization cues by an identified auditory interneuron: effects of stimulus temporal pattern.

    Science.gov (United States)

    Samson, Annie-Hélène; Pollack, Gerald S

    2002-11-01

    An important cue for sound localization is binaural comparison of stimulus intensity. Two features of neuronal responses, response strength, i.e., spike count and/or rate, and response latency, vary with stimulus intensity, and binaural comparison of either or both might underlie localization. Previous studies at the receptor-neuron level showed that these response features are affected by the stimulus temporal pattern. When sounds are repeated rapidly, as occurs in many natural sounds, response strength decreases and latency increases, resulting in altered coding of localization cues. In this study we analyze binaural cues for sound localization at the level of an identified pair of interneurons (the left and right AN2) in the cricket auditory system, with emphasis on the effects of stimulus temporal pattern on binaural response differences. AN2 spike count decreases with rapidly repeated stimulation and latency increases. Both effects depend on stimulus intensity. Because of the difference in intensity at the two ears, binaural differences in spike count and latency change as stimulation continues. The binaural difference in spike count decreases, whereas the difference in latency increases. The proportional changes in response strength and in latency are greater at the interneuron level than at the receptor level, suggesting that factors in addition to decrement of receptor responses are involved. Intracellular recordings reveal that a slowly building, long-lasting hyperpolarization is established in AN2. At the same time, the level of depolarization reached during the excitatory postsynaptic potential (EPSP) resulting from each sound stimulus decreases. Neither these effects on membrane potential nor the changes in spiking response are accounted for by contralateral inhibition. Based on comparison of our results with earlier behavioral experiments, it is unlikely that crickets use the binaural difference in latency of AN2 responses as the main cue for

  18. Nonlinear dynamics of human locomotion: effects of rhythmic auditory cueing on local dynamic stability

    Directory of Open Access Journals (Sweden)

    Philippe eTerrier

    2013-09-01

    It has been observed that time series of gait parameters (stride length (SL), stride time (ST) and stride speed (SS)) exhibit long-term persistence and fractal-like properties. Synchronizing steps with rhythmic auditory stimuli shifts the persistent fluctuation pattern to anti-persistence. Another nonlinear method estimates the degree of resilience of gait control to small perturbations, i.e. the local dynamic stability (LDS). The method makes use of the maximal Lyapunov exponent, which estimates how fast a nonlinear system embedded in a reconstructed state space (attractor) diverges after an infinitesimal perturbation. We propose to use an instrumented treadmill to simultaneously measure basic gait parameters (time series of SL, ST and SS, from which the statistical persistence among consecutive strides can be assessed) and the trajectory of the center of pressure (from which the LDS can be estimated). In 20 healthy participants, the responses to rhythmic auditory cueing (RAC) of LDS and of statistical persistence (assessed with detrended fluctuation analysis (DFA)) were compared. By analyzing the divergence curves, we observed that long-term LDS (computed as the inverse of the average logarithmic rate of divergence between the 4th and the 10th strides downstream from nearest neighbors in the reconstructed attractor) was strongly enhanced (relative change +47%), likely an indication of more dampened dynamics. The change in short-term LDS (divergence over one step) was smaller (+3%). DFA results (scaling exponents) confirmed an anti-persistent pattern in ST, SL and SS. Long-term LDS (but not short-term LDS) and the scaling exponents exhibited a significant correlation between them (r = 0.7). Both phenomena probably result from the more conscious/voluntary gait control that RAC requires. We suggest that LDS and statistical persistence be used to evaluate the efficiency of cueing therapy in patients with neurological gait disorders.
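    The DFA scaling exponent used in this record separates persistent (alpha > 0.5) from anti-persistent (alpha < 0.5) stride-parameter series. A minimal sketch of the standard algorithm, assuming an evenly sampled series and linear detrending per window (the window sizes are illustrative defaults, not the study's settings):

    ```python
    import numpy as np

    def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
        """Detrended fluctuation analysis scaling exponent.
        alpha ~ 0.5: uncorrelated; > 0.5: persistent; < 0.5: anti-persistent."""
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())            # integrated profile
        flucts = []
        for n in scales:
            n_win = len(y) // n
            rms = []
            for i in range(n_win):
                seg = y[i * n:(i + 1) * n]
                t = np.arange(n)
                coef = np.polyfit(t, seg, 1)   # linear trend in this window
                rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
            flucts.append(np.mean(rms))
        # slope of log F(n) vs log n is the DFA exponent alpha
        alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return alpha
    ```

    Applied to a stride-time series recorded under RAC, an exponent falling below 0.5 would reproduce the anti-persistent pattern the study describes.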

  19. Spatial organization of tettigoniid auditory receptors: insights from neuronal tracing.

    Science.gov (United States)

    Strauß, Johannes; Lehmann, Gerlind U C; Lehmann, Arne W; Lakes-Harlan, Reinhard

    2012-11-01

    The auditory sense organ of Tettigoniidae (Insecta, Orthoptera) is located in the foreleg tibia and consists of scolopidial sensilla that form a row termed the crista acustica. The crista acustica is associated with the tympana and the auditory trachea. This ear is a highly ordered, tonotopic sensory system. Although the neuroanatomy of the crista acustica has been documented for several species, the most distal somata and dendrites of receptor neurons have only occasionally been described as forming an alternating or double row. We investigated the spatial arrangement of receptor cell bodies and dendrites by retrograde tracing with cobalt chloride solution. In the six tettigoniid species studied, distal receptor neurons are consistently arranged in double rows of somata rather than a linear sequence. This arrangement affects 30-50% of the overall auditory receptors. No strict correlation of somata positions between the anterio-posterior and dorso-ventral axes was evident within the distal crista acustica. Dendrites of distal receptors occasionally also occur in a double row or are even massed without clear order. Thus, a substantial part of the auditory receptors can deviate from a strictly straight organization into a more complex morphology. The linear organization of dendrites is not a morphological criterion that allows hearing organs to be distinguished from non-hearing sense organs serially homologous to ears in all species. Both the crowded arrangement of receptor somata and dendrites may result from functional constraints relating to frequency discrimination, or from developmental constraints of auditory morphogenesis in postembryonic development. PMID:22807283

  20. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson's Disease.

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E; Soriano, Christina T

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson's disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson's have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered forms (tango). In all of these dance forms, specific movement patterns are initially learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances the previously reported benefits of dance for people with Parkinson's disease (PD). The method relies primarily on improvisational verbal auditory cueing, with less emphasis on directed movement instruction. It builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. We present an introduction to the proposed method, detail its methodological specifics, and point to future directions. The viewpoint advances an embodied cognitive approach with ecological validity in helping PPD meet the changing demands of daily living. PMID:26925029

  1. Mechanisms of spatial and non-spatial auditory selective attention

    OpenAIRE

    Paltoglou, Aspasia Eleni

    2009-01-01

    Selective attention is a crucial function that encompasses all perceptual modalities and which enables us to focus on the behaviorally relevant information and ignore the rest. The main goal of the thesis is to test well-established hypotheses about the mechanisms of visual selective attention in the auditory domain using behavioral and neuroimaging methods. Two fMRI studies (Experiments 1 and 2) test the hypothesis of feature-specific attentional enhancement. This hypothesis states that ...

  2. Listenmee and Listenmee smartphone application: synchronizing walking to rhythmic auditory cues to improve gait in Parkinson's disease.

    Science.gov (United States)

    Lopez, William Omar Contreras; Higuera, Carlos Andres Escalante; Fonoff, Erich Talamoni; Souza, Carolina de Oliveira; Albicker, Ulrich; Martinez, Jairo Alberto Espinoza

    2014-10-01

    Evidence supports the use of rhythmic external auditory signals to improve gait in PD patients (Arias & Cudeiro, 2008; Kenyon & Thaut, 2000; McIntosh, Rice, & Thaut, 1994; McIntosh et al., 1997; Morris, Iansek, & Matyas, 1994; Thaut, McIntosh, & Rice, 1997; Suteerawattananon, Morris, Etnyre, Jankovic, & Protas, 2004; Willems, Nieuwboer, Chavert, & Desloovere, 2006). However, few prototypes are available for daily use, and to our knowledge, none utilize a smartphone application allowing individualized sounds and cadence. Therefore, we analyzed the effects on gait of Listenmee®, an intelligent glasses system with a portable auditory device, and present its smartphone application, the Listenmee app®, offering over 100 different sounds and an adjustable metronome to individualize the cueing rate, as well as its smartwatch with an accelerometer to detect the magnitude and direction of proper acceleration and to track calorie count, sleep patterns, step count and daily distances. The present study included patients with idiopathic PD presenting gait disturbances including freezing. Auditory rhythmic cues were delivered through Listenmee®. Performance was analyzed in a motion and gait analysis laboratory. The results revealed significant improvements in gait performance on three major dependent variables: walking speed by 38.1%, cadence by 28.1% and stride length by 44.5%. Our findings suggest that auditory cueing through Listenmee® may significantly enhance gait performance. Further studies are needed to elucidate the potential role and maximize the benefits of these portable devices. PMID:25215623

  3. Domestic pigs' (Sus scrofa domestica) use of direct and indirect visual and auditory cues in an object choice task.

    Science.gov (United States)

    Nawroth, Christian; von Borell, Eberhard

    2015-05-01

    Recently, foraging strategies have been linked to the ability to use indirect visual information. More selective feeders should express a higher aversion to losses than non-selective feeders and should therefore be more prone to avoid empty food locations. To extend these findings, we present here a series of studies investigating the use of direct and indirect visual and auditory information by an omnivorous but selective feeder, the domestic pig. Subjects had to choose between two buckets, with only one containing a reward. Before making a choice, the subjects in Experiment 1 (N = 8) received full information regarding both the baited and non-baited locations, either in the visual or the auditory domain. In this experiment, the subjects were able to use visual but not auditory cues to infer the location of the reward spontaneously. Additionally, four individuals learned to use auditory cues after a period of training. In Experiment 2 (N = 8), the pigs were given different amounts of visual information about the content of the buckets: lifting either both buckets (full information), the baited bucket (direct information), the empty bucket (indirect information) or no bucket at all (no information). The subjects as a group were able to use direct and indirect visual cues. However, over the course of the experiment, performance dropped to chance level when indirect information was provided. A final experiment (N = 3) provided preliminary results for pigs' use of indirect auditory information to infer the location of a reward. We conclude that pigs at a very young age are able to make decisions based on indirect information in the visual domain, whereas their performance in the use of indirect auditory information warrants further investigation. PMID:25650328

  4. Auditory spatial perception dynamically realigns with changing eye position.

    Science.gov (United States)

    Razavi, Babak; O'Neill, William E; Paige, Gary D

    2007-09-19

    Audition and vision both form spatial maps of the environment in the brain, and their congruency requires alignment and calibration. Because audition is referenced to the head and vision is referenced to movable eyes, the brain must accurately account for eye position to maintain alignment between the two modalities as well as perceptual space constancy. Changes in eye position are known to variably, but inconsistently, shift sound localization, suggesting subtle shortcomings in the accuracy or use of eye position signals. We systematically and directly quantified sound localization across a broad spatial range and over time after changes in eye position. A sustained fixation task addressed the spatial (steady-state) attributes of eye position-dependent effects on sound localization. Subjects continuously fixated visual reference spots straight ahead (center), to the left (20 degrees), or to the right (20 degrees) of the midline in separate sessions while localizing auditory targets using a laser pointer guided by peripheral vision. An alternating fixation task focused on the temporal (dynamic) aspects of auditory spatial shifts after changes in eye position. Localization proceeded as in sustained fixation, except that eye position alternated between the three fixation references over multiple epochs, each lasting minutes. Auditory space shifted by approximately 40% toward the new eye position and dynamically over several minutes. We propose that this spatial shift reflects an adaptation mechanism for aligning the "straight-ahead" of perceived sensory-motor maps, particularly during early childhood when normal ocular alignment is achieved, but also resolving challenges to normal spatial perception throughout life. PMID:17881531
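    The partial, gradual realignment described in this record (roughly 40% of the eye displacement, developing over minutes) can be captured by a simple gain-plus-exponential model. This is a hedged illustration only: the exponential form and the time constant are assumptions for the sketch, not values reported in the abstract.

    ```python
    import math

    def perceived_shift(eye_shift_deg, t_min, gain=0.4, tau_min=2.0):
        """Illustrative model of the adaptive shift of auditory space after a
        change in eye position: a fraction (gain ~ 0.4, from the reported ~40%)
        of the eye displacement, approached with an assumed exponential time
        course (tau_min is a hypothetical time constant, in minutes)."""
        return gain * eye_shift_deg * (1 - math.exp(-t_min / tau_min))
    ```

    For a 20-degree fixation change, the model saturates at an 8-degree auditory shift, matching the ~40% figure; the trajectory toward that asymptote is the assumed part.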

  5. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander eJones

    2015-01-01

    Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes that have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct covert spatial attention to the left or right, and responded as quickly as possible to a target. The lateralized target (visual or auditory) was then presented at the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps. The target was presented in or out of sync (early or late) with the rhythmic cue. The results showed participants were faster responding to spatially attended compared to unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions. Responses were faster to targets presented in sync with the rhythm compared to when they appeared too early in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced temporal expectancy in the other modality, suggesting that temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting these two processes largely influenced

  6. The neural circuitry underlying the executive control of auditory spatial attention

    OpenAIRE

    Wu, C-T; Weissman, D.H.; Roberts, K. C.; Woldorff, M.G.

    2007-01-01

    Although a fronto-parietal network has consistently been implicated in the control of visual spatial attention, the network that guides spatial attention in the auditory domain is not yet clearly understood. To investigate this issue, we measured brain activity using functional magnetic resonance imaging while participants performed a cued auditory spatial attention task. We found that cued orienting of auditory spatial attention activated a medial-superior distributed fronto-parietal network...

  7. Hand proximity facilitates spatial discrimination of auditory tones

    Directory of Open Access Journals (Sweden)

    Philip eTseng

    2014-06-01

    The effect of hand proximity on vision and visual attention has been well documented. In this study we tested whether such effect(s) would also be present in the auditory modality. With hands placed either near or away from the audio sources, participants performed an auditory-spatial discrimination (Exp 1: left or right side), pitch discrimination (Exp 2: high, med, or low tone), and spatial-plus-pitch (Exp 3: left or right; high, med, or low) discrimination task. In Exp 1, when hands were away from the audio source, participants consistently responded faster with their right hand regardless of stimulus location. This right-hand advantage, however, disappeared in the hands-near condition because of a significant improvement in the left hand’s reaction time. No effect of hand proximity was found in Exp 2 or 3, where a choice reaction time task requiring pitch discrimination was used. Together, these results suggest that the effect of hand proximity is not exclusive to vision alone but is also present in audition, though in a much weaker form. Most important, these findings provide evidence from auditory attention that supports the multimodal account originally raised by Reed et al. in 2006.

  8. Quadri-stability of a spatially ambiguous auditory illusion

    Directory of Open Access Journals (Sweden)

    Constance May Bainbridge

    2015-01-01

    In addition to vision, audition plays an important role in sound localization in our world. One way we estimate the motion of an auditory object moving towards or away from us is from changes in volume intensity. However, the human auditory system has unequally distributed spatial resolution, including difficulty distinguishing sounds in front of versus behind the listener. Here, we introduce a novel quadri-stable illusion, the Transverse-and-Bounce Auditory Illusion, which combines front-back confusion with changes in volume levels of a nonspatial sound to create ambiguous percepts of an object approaching and withdrawing from the listener. The sound can be perceived as traveling transversely from front to back or back to front, or bouncing to remain exclusively in front of or behind the observer. Here we demonstrate how human listeners experience this illusory phenomenon by comparing ambiguous and unambiguous stimuli for each of the four possible motion percepts. When asked to rate their confidence in perceiving each sound’s motion, participants reported equal confidence for the illusory and unambiguous stimuli. Participants perceived all four illusory motion percepts, and could not distinguish the illusion from the unambiguous stimuli. These results show that this illusion is effectively quadri-stable. In a second experiment, the illusory stimulus was looped continuously in headphones while participants identified its perceived path of motion to test properties of perceptual switching, locking, and biases. Participants were biased towards perceiving transverse compared to bouncing paths, and they became perceptually locked into alternating between front-to-back and back-to-front percepts, perhaps reflecting how auditory objects commonly move in the real world. This multi-stable auditory illusion opens opportunities for studying the perceptual, cognitive, and neural representation of objects in motion, as well as exploring multimodal perceptual

  9. Speed on the dance floor: Auditory and visual cues for musical tempo.

    Science.gov (United States)

    London, Justin; Burger, Birgitta; Thompson, Marc; Toiviainen, Petri

    2016-02-01

    Musical tempo is most strongly associated with the rate of the beat or "tactus," which may be defined as the most prominent rhythmic periodicity present in the music, typically in a range of 1.67-2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingstone, 2004; Spence, 2015). A five-part experiment was performed to assess the integration of auditory and visual information in judgments of musical tempo. Participants rated the speed of six classic R&B songs on a seven-point scale while observing an animated figure dancing to them. Participants were presented with original and time-stretched (±5%) versions of each song in audio-only, audio+video (A+V), and video-only conditions. In some videos the animations were of spontaneous movements to the different time-stretched versions of each song, and in other videos the animations were of "vigorous" versus "relaxed" interpretations of the same auditory stimulus. Two main results were observed. First, in all conditions with audio, even though participants were able to correctly rank the original vs. time-stretched versions of each song, a song-specific tempo-anchoring effect was observed, such that sped-up versions of slower songs were judged to be faster than slowed-down versions of faster songs, even when their objective beat rates were the same. Second, when viewing a vigorous dancing figure in the A+V condition, participants gave faster tempo ratings than from the audio alone or when viewing the same audio with a relaxed dancing figure. The implications of this illusory tempo percept for cross-modal sensory integration and

  10. The role of vowel perceptual cues in compensatory responses to perturbations of speech auditory feedback

    OpenAIRE

    Reilly, Kevin J.; Dougherty, Kathleen E.

    2013-01-01

    The perturbation of acoustic features in a speaker's auditory feedback elicits rapid compensatory responses that demonstrate the importance of auditory feedback for control of speech output. The current study investigated whether responses to a perturbation of speech auditory feedback vary depending on the importance of the perturbed feature to perception of the vowel being produced. Auditory feedback of speakers' first formant frequency (F1) was shifted upward by 130 mels in randomly selecte...

  11. Selective importance of the rat anterior thalamic nuclei for configural learning involving distal spatial cues.

    Science.gov (United States)

    Dumont, Julie R; Amin, Eman; Aggleton, John P

    2014-01-01

    To test potential parallels between hippocampal and anterior thalamic function, rats with anterior thalamic lesions were trained on a series of biconditional learning tasks. The anterior thalamic lesions did not disrupt learning two biconditional associations in operant chambers where a specific auditory stimulus (tone or click) had a differential outcome depending on whether it was paired with a particular visual context (spot or checkered wallpaper) or a particular thermal context (warm or cool). Likewise, rats with anterior thalamic lesions successfully learnt a biconditional task when they were reinforced for digging in one of two distinct cups (containing either beads or shredded paper), depending on the particular appearance of the local context on which the cup was placed (one of two textured floors). In contrast, the same rats were severely impaired at learning the biconditional rule to select a specific cup when in a particular location within the test room. Place learning was then tested with a series of go/no-go discriminations. Rats with anterior thalamic nuclei lesions could learn to discriminate between two locations when they were approached from a constant direction. They could not, however, use this acquired location information to solve a subsequent spatial biconditional task where those same places dictated the correct choice of digging cup. Anterior thalamic lesions produced a selective, but severe, biconditional learning deficit when the task incorporated distal spatial cues. This deficit mirrors that seen in rats with hippocampal lesions, so extending potential interdependencies between the two sites.

  12. Three-dimensional motion analysis of the effects of auditory cueing on gait pattern in patients with Parkinson's disease: a preliminary investigation.

    Science.gov (United States)

    Picelli, Alessandro; Camin, Maruo; Tinazzi, Michele; Vangelista, Antonella; Cosentino, Alessandro; Fiaschi, Antonio; Smania, Nicola

    2010-08-01

    Auditory cueing enhances gait in parkinsonian patients. Our aim was to evaluate its effects on spatiotemporal (stride length, stride time, cadence, gait speed, single and double support duration), kinematic (range of amplitude of the hip, knee and ankle joint angles registered in the sagittal plane), and kinetic (maximal values of the hip and ankle joint power) gait parameters using three-dimensional motion analysis. Eight parkinsonian patients performed 12 walking tests: 3 repetitions of 4 conditions (normal walking; cued walking at 90, 100, and 110% of the mean cadence at preferred pace). Subjects were asked to match their cadence to the cueing rhythm. In the presence of auditory cues, stride length, cadence, gait speed and the ratio of single to double support duration increased. Range of motion of the ankle joint decreased and the maximal values within the pull-off phase of the hip joint power increased. Thus, auditory cues could improve gait by modifying motor strategy in parkinsonian patients.

  13. The role of different cues in the brain mechanism on visual spatial attention

    Institute of Scientific and Technical Information of China (English)

    SONG Weiqun; LUO Yuejia; CHI Song; JI Xunming; LING Feng; ZHAO Lun; WANG Maobin; SHI Jiannong

    2006-01-01

    The visual spatial attention mechanism in the brain was studied in 16 young subjects through the visual search paradigm of precue-target by the event-related potential (ERP) technique, with the attentive ranges cued by different scales of Chinese character and region cues. The results showed that the response time for Chinese character cues was much longer than that for region cues, especially for small region cues. With exterior interferences, target stimuli were recognized much more quickly under region cues than under Chinese character cues. Compared with region cues, targets under Chinese character cues led to an increase of the posterior P1, a decrease of the N1, and an increase of the P2. It should also be noted that the differences between region cues and Chinese character cues were affected by the interference type. Under exterior interferences, no significant difference was found between region cues and Chinese character cues; however, this was not the case under interior interferences. Considering the difference between exterior and interior interferences, we conclude that as target recognition becomes more difficult, there is a clear difference in the consumption of anterior frontal resources by target stimuli under the two kinds of cues.

  14. Auditory Spatial Coding Flexibly Recruits Anterior, but Not Posterior, Visuotopic Parietal Cortex

    OpenAIRE

    Michalka, Samantha W.; Rosen, Maya L.; Kong, Lingqiang; Shinn-Cunningham, Barbara G.; Somers, David C.

    2015-01-01

    Audition and vision both convey spatial information about the environment, but much less is known about mechanisms of auditory spatial cognition than visual spatial cognition. Human cortex contains >20 visuospatial map representations but no reported auditory spatial maps. The intraparietal sulcus (IPS) contains several of these visuospatial maps, which support visuospatial attention and short-term memory (STM). Neuroimaging studies also demonstrate that parietal cortex is activated during au...

  15. The Perceptual and Statistics Characteristic of Spatial Cues and its application

    Directory of Open Access Journals (Sweden)

    Heng Wang

    2013-01-01

    Full Text Available In present mobile communication systems, low-bit-rate audio signals are expected to provide high quality. This paper investigates the mechanism of perceptual and statistical redundancy in spatial cues and establishes a selection model based on the joint perceptual and statistical characteristics of spatial cues. According to the selection model, the values of spatial cues are not quantized in frequency bands that cannot easily be perceived by human ears. Experimental results showed that this method can reduce the parametric bitrate by about 15% compared with parametric stereo, while maintaining the subjective sound quality.
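    The selection idea can be sketched generically: skip quantizing and transmitting spatial-cue parameters in bands where the cue is perceptually negligible. This is a minimal illustration only; the threshold, band values, and function names below are hypothetical, not the paper's actual model.

    ```python
    # Hypothetical sketch: transmit spatial cues (e.g., inter-channel level
    # differences, ILDs) only for bands where the cue exceeds a made-up
    # just-noticeable-difference threshold. Values are illustrative.

    def select_bands(cue_values_db, jnd_db=1.0):
        """Return indices of bands whose spatial cue is perceptually relevant."""
        return [i for i, v in enumerate(cue_values_db) if abs(v) >= jnd_db]

    ild_per_band = [0.2, 3.5, 0.8, 6.0, 0.1]   # per-band ILDs in dB (made up)
    kept = select_bands(ild_per_band)
    print(kept)                                 # only these bands are quantized
    savings = 1 - len(kept) / len(ild_per_band)
    print(f"parameter bitrate saved: {savings:.0%}")
    ```

    Dropping sub-threshold bands trades parameter bitrate against (ideally inaudible) spatial-image error, which is the mechanism behind the reported ~15% reduction.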

  16. Spatial selective auditory attention in the presence of reverberant energy: individual differences in normal-hearing listeners.

    Science.gov (United States)

    Ruggles, Dorea; Shinn-Cunningham, Barbara

    2011-06-01

    Listeners can selectively attend to a desired target by directing attention to known target source features, such as location or pitch. Reverberation, however, reduces the reliability of the cues that allow a target source to be segregated and selected from a sound mixture. Given this, it is likely that reverberant energy interferes with selective auditory attention. Anecdotal reports suggest that the ability to focus spatial auditory attention degrades even with early aging, yet there is little evidence that middle-aged listeners have behavioral deficits on tasks requiring selective auditory attention. The current study was designed to look for individual differences in selective attention ability and to see if any such differences correlate with age. Normal-hearing adults, ranging in age from 18 to 55 years, were asked to report a stream of digits located directly ahead in a simulated rectangular room. Simultaneous, competing masker digit streams were simulated at locations 15° left and right of center. The level of reverberation was varied to alter task difficulty by interfering with localization cues (increasing localization blur). Overall, performance was best in the anechoic condition and worst in the high-reverberation condition. Listeners nearly always reported a digit from one of the three competing streams, showing that reverberation did not render the digits unintelligible. Importantly, inter-subject differences were extremely large. These differences, however, were not significantly correlated with age, memory span, or hearing status. These results show that listeners with audiometrically normal pure tone thresholds differ in their ability to selectively attend to a desired source, a task important in everyday communication. Further work is necessary to determine if these differences arise from differences in peripheral auditory function or in more central function.

  17. Within-hemifield posture changes affect tactile–visual exogenous spatial cueing without spatial precision, especially in the dark

    OpenAIRE

    Kennett, Steffan; Driver, Jon

    2014-01-01

    We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual–tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual...

  18. Auditory spatial resolution in horizontal, vertical, and diagonal planes.

    Science.gov (United States)

    Grantham, D Wesley; Hornsby, Benjamin W Y; Erpenbeck, Eric A

    2003-08-01

    Minimum audible angle (MAA) and minimum audible movement angle (MAMA) thresholds were measured for stimuli in horizontal, vertical, and diagonal (60 degrees) planes. A pseudovirtual technique was employed in which signals were recorded through KEMAR's ears and played back to subjects through insert earphones. Thresholds were obtained for wideband, high-pass, and low-pass noises. Only 6 of 20 subjects obtained wideband vertical-plane MAAs less than 10 degrees, and only these 6 subjects were retained for the complete study. For all three filter conditions thresholds were lowest in the horizontal plane, slightly (but significantly) higher in the diagonal plane, and highest for the vertical plane. These results were similar in magnitude and pattern to those reported by Perrott and Saberi [J. Acoust. Soc. Am. 87, 1728-1731 (1990)] and Saberi and Perrott [J. Acoust. Soc. Am. 88, 2639-2644 (1990)], except that these investigators generally found that thresholds for diagonal planes were as good as those for the horizontal plane. The present results are consistent with the hypothesis that diagonal-plane performance is based on independent contributions from a horizontal-plane system (sensitive to interaural differences) and a vertical-plane system (sensitive to pinna-based spectral changes). Measurements of the stimuli recorded through KEMAR indicated that sources presented from diagonal planes can produce larger interaural level differences (ILDs) in certain frequency regions than would be expected based on the horizontal projection of the trajectory. Such frequency-specific ILD cues may underlie the very good performance reported in previous studies for diagonal spatial resolution. Subjects in the present study could apparently not take advantage of these cues in the diagonal-plane condition, possibly because they did not externalize the images to their appropriate positions in space or possibly because of the absence of a patterned visual field.

  19. Visual spatial cue use for guiding orientation in two-to-three-year-old children

    Directory of Open Access Journals (Sweden)

    Danielle van den Brink

    2013-12-01

    Full Text Available In spatial development, representations of the environment and the use of spatial cues change over time. To date, the influence of individual differences in skills relevant for orientation and navigation has not received much attention. The current study investigated orientation abilities on the basis of visual spatial cues in two-to-three-year-old children, and assessed factors that possibly influence spatial task performance. Thirty-month-olds and 35-month-olds performed an on-screen Virtual Reality orientation task, searching for an animated target in the presence of visual self-movement cues and landmark information. Results show that, in contrast to 30-month-old children, 35-month-olds were successful in using visual spatial cues for maintaining orientation. Neither age group benefited from landmarks present in the environment, suggesting that successful task performance relied on the use of optic flow cues, rather than object-to-object relations. Analysis of individual differences revealed that two-year-olds who were relatively more independent in comparison to their peers, as measured by the daily living skills scale of the parental questionnaire Vineland-Screener, were most successful at the orientation task. These results support previous findings indicating that the use of various spatial cues gradually improves during early childhood. Our data show that a developmental transition in spatial cue use can be witnessed within a relatively short period of only 5 months. Furthermore, this study indicates that rather than chronological age, individual differences may play a role in successful use of visual cues for spatial updating in an orientation task. Future studies are necessary to assess the exact nature of these individual differences.

  20. Increased Variability and Asymmetric Expansion of the Hippocampal Spatial Representation in a Distal Cue-Dependent Memory Task.

    Science.gov (United States)

    Park, Seong-Beom; Lee, Inah

    2016-08-01

    Place cells in the hippocampus fire at specific positions in space, and distal cues in the environment play critical roles in determining the spatial firing patterns of place cells. Many studies have shown that place fields are influenced by distal cues in foraging animals. However, it is largely unknown whether distal-cue-dependent changes in place fields appear in different ways in a memory task if distal cues bear direct significance to achieving goals. We investigated this possibility in this study. Rats were trained to choose different spatial positions in a radial arm in association with distal cue configurations formed by visual cue sets attached to movable curtains around the apparatus. The animals were initially trained to associate readily discernible distal cue configurations (0° vs. 80° angular separation between distal cue sets) with different food-well positions and then later experienced ambiguous cue configurations (14° and 66°) intermixed with the original cue configurations. Rats showed no difficulty in transferring the associated memory formed for the original cue configurations when similar cue configurations were presented. Place field positions remained at the same locations across different cue configurations, whereas stability and coherence of spatial firing patterns were significantly disrupted when ambiguous cue configurations were introduced. Furthermore, the spatial representation was extended backward and skewed more negatively at the population level when processing ambiguous cue configurations, compared with when processing the original cue configurations only. This effect was more salient for large cue-separation conditions than for small cue-separation conditions. No significant rate remapping was observed across distal cue configurations. These findings suggest that place cells in the hippocampus dynamically change their detailed firing characteristics in response to a modified cue environment and that some of the firing

  1. Within-hemifield posture changes affect tactile-visual exogenous spatial cueing without spatial precision, especially in the dark.

    Science.gov (United States)

    Kennett, Steffan; Driver, Jon

    2014-05-01

    We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual-tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual target locations, and for the remainder, the hands were aligned with the outer target locations. In Experiments 1 and 2, the inner and outer eccentricities were 17.5° and 52.5°, respectively. In Experiment 1, the arms were completely covered, and visual up-down judgments were better when on the same side as the preceding tactile cue. Cueing effects were not significantly affected by hand or target alignment. In Experiment 2, the arms were in view, and now some target responses were affected by cue alignment: Cueing for outer targets was only significant when the hands were aligned with them. In Experiment 3, we tested whether any unseen posture changes could alter the cueing effects, by widely separating the inner and outer target eccentricities (now 10° and 86°). In this case, hand alignment did affect some of the cueing effects: Cueing for outer targets was now only significant when the hands were in the outer position. Although these results confirm that proprioception can, in some cases, influence tactile-visual links in exogenous spatial attention, they also show that spatial precision is severely limited, especially when posture is unseen. PMID:24470256

  2. Multimodal information Management: Evaluation of Auditory and Haptic Cues for NextGen Communication Displays

    Science.gov (United States)

    Begault, Durand R.; Bittner, Rachel M.; Anderson, Mark R.

    2012-01-01

    Auditory communication displays within the NextGen data link system may use multiple synthetic speech messages replacing traditional ATC and company communications. The design of an interface for selecting amongst multiple incoming messages can impact both performance (time to select, audit and release a message) and preference. Two design factors were evaluated: physical pressure-sensitive switches versus flat panel "virtual switches", and the presence or absence of auditory feedback from switch contact. Performance with stimuli using physical switches was 1.2 s faster than virtual switches (2.0 s vs. 3.2 s); auditory feedback provided a 0.54 s performance advantage (2.33 s vs. 2.87 s). There was no interaction between these variables. Preference data were highly correlated with performance.

  3. Auditory and Visual Cues for Topic Maintenance with Persons Who Exhibit Dementia of Alzheimer's Type.

    Science.gov (United States)

    Teten, Amy F; Dagenais, Paul A; Friehe, Mary J

    2015-01-01

    This study compared the effectiveness of auditory and visual redirections in facilitating topic coherence for persons with Dementia of Alzheimer's Type (DAT). Five persons with moderate stage DAT engaged in conversation with the first author. Three topics related to activities of daily living (recreational activities, food, and grooming) were broached. Each topic was presented three times to each participant: once as a baseline condition, once with auditory redirection to topic, and once with visual redirection to topic. Transcripts of the interactions were scored for overall coherence. Condition was a significant factor in that the DAT participants exhibited better topic maintenance under visual and auditory conditions as opposed to baseline. In general, the performance of the participants was not affected by the topic, except for significantly higher overall coherence ratings for the visually redirected interactions dealing with the topic of food. PMID:26171273

  4. Attention Cueing and Activity Equally Reduce False Alarm Rate in Visual-Auditory Associative Learning through Improving Memory

    Science.gov (United States)

    Haghgoo, Hojjat Allah; Azizi, Solmaz; Nili Ahmadabadi, Majid

    2016-01-01

    In our daily life, we continually exploit already learned multisensory associations and form new ones when facing novel situations. Improving our associative learning results in higher cognitive capabilities. We experimentally and computationally studied the learning performance of healthy subjects in a visual-auditory sensory associative learning task across active learning, attention cueing learning, and passive learning modes. According to our results, the learning mode had no significant effect on learning associations of congruent pairs. In addition, subjects’ performance in learning congruent samples was not correlated with their vigilance score. Nevertheless, vigilance score was significantly correlated with the learning performance for non-congruent pairs. Moreover, in the last block of the passive learning mode, subjects made significantly more mistakes, accepting non-congruent pairs as associated, and consciously reported lower confidence. These results indicate that attention and activity equally enhanced visual-auditory associative learning for non-congruent pairs, while the false alarm rate in the passive learning mode did not decrease after the second block. We investigated the cause of the higher false alarm rate in the passive learning mode by using a computational model composed of a reinforcement learning module and a memory-decay module. The results suggest that a higher rate of memory decay is the source of the additional mistakes and lower reported confidence for non-congruent pairs in the passive learning mode. PMID:27314235
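    The interaction of the two modules can be sketched in a few lines. This is not the authors' model; the update rule, learning rate, and decay values below are illustrative assumptions showing only the qualitative claim that a higher memory-decay rate caps the learned association strength:

    ```python
    # Toy learner (not the paper's model): association strength decays between
    # trials (memory-decay module), then moves toward the trial outcome
    # (reinforcement-learning module).

    def update(strength, reward, lr=0.3, decay=0.1):
        """One trial: decay old memory, then reinforce toward the outcome."""
        strength *= (1 - decay)                     # memory-decay module
        return strength + lr * (reward - strength)  # reinforcement module

    s_active, s_passive = 0.0, 0.0
    for _ in range(20):                             # associated pair, reward = 1
        s_active = update(s_active, 1.0, decay=0.02)   # low decay (attended/active)
        s_passive = update(s_passive, 1.0, decay=0.30)  # high decay (passive)
    print(s_active > s_passive)  # faster decay yields a weaker association
    ```

    The weaker asymptotic strength under high decay is one way a model can reproduce both the elevated false alarm rate and the lower reported confidence in the passive mode.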

  5. Two Persons with Multiple Disabilities Use Orientation Technology with Auditory Cues to Manage Simple Indoor Traveling

    Science.gov (United States)

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Campodonico, Francesca; Oliva, Doretta

    2010-01-01

    This study was an effort to extend the evaluation of orientation technology for promoting independent indoor traveling in persons with multiple disabilities. Two participants (adults) were included, who were to travel to activity destinations within occupational settings. The orientation system involved (a) cueing sources only at the destinations…

  6. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson’s Disease

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E.; Soriano, Christina T.

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson’s disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson’s have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered (tango). In all of these dance forms, specific movement patterns initially are learned through repetition and performed in time-to-music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previous reported benefits of dance for people with Parkinson’s disease (PD). The method relies primarily on improvisational verbal auditory cueing with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. Here, we present an introduction to a proposed method, detailing its methodological specifics, and pointing to future directions. The viewpoint advances an embodied cognitive approach that has eco-validity in helping PPD meet the changing demands of daily living. PMID:26925029

  7. Verbal auditory cueing of improvisational dance: A proposed method for training agency in Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Glenna Batson

    2016-02-01

    Full Text Available Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson’s disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson’s have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered (tango). In all of these dance forms, specific movement patterns initially are learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previously reported benefits of dance for people with PD. The method relies primarily on improvisational verbal auditory cueing (VAC) with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility, but also impair spontaneity of thought and action. Dance improvisation trains spontaneity of thought, fostering open and immediate interpretation of verbally delivered movement cues. Here we present an introduction to a proposed method, detailing its methodological specifics, and pointing to future directions. The viewpoint advances an embodied cognitive approach that has eco-validity in helping PPD meet the changing demands of daily living.

  8. Perception of auditory, visual, and egocentric spatial alignment adapts differently to changes in eye position.

    Science.gov (United States)

    Cui, Qi N; Razavi, Babak; O'Neill, William E; Paige, Gary D

    2010-02-01

    Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (high-pass and low-pass noise) related to specific auditory spatial channels, 3) matched by a shift in the perceived straight-ahead (PSA), and 4) accompanied by a spatial shift for visual and/or bimodal (visual and auditory) targets. Subjects were tested in a dark echo-attenuated chamber with their heads fixed facing a cylindrical screen, behind which a mobile speaker/LED presented targets across the frontal field. Subjects fixated alternating reference spots (0, ±20 degrees) horizontally or vertically while either localizing targets or indicating PSA using a laser pointer. Results showed that the spatial shift induced by ocular eccentricity is 1) preserved for auditory targets without a visual fixation reference, 2) generalized for all frequency bands, and thus all auditory spatial channels, 3) paralleled by a shift in PSA, and 4) restricted to auditory space. Findings are consistent with a set-point control strategy by which eye position governs multimodal spatial alignment. The phenomenon is robust for auditory space and egocentric perception, and highlights the importance of controlling for eye position in the examination of spatial perception and behavior. PMID:19846626

  9. The Effect of Attentional Cueing and Spatial Uncertainty in Visual Field Testing.

    Directory of Open Access Journals (Sweden)

    Jack Phu

    Full Text Available To determine the effect of reducing spatial uncertainty by attentional cueing on contrast sensitivity at a range of spatial locations and with different stimulus sizes. Six observers underwent perimetric testing with the Humphrey Visual Field Analyzer (HFA) full threshold paradigm, and the output thresholds were compared to conditions where stimulus location was verbally cued to the observer. We varied the number of points cued, the eccentricity and spatial location, and stimulus size (Goldmann size I, III and V). Subsequently, four observers underwent laboratory-based psychophysical testing on a custom computer program using the Method of Constant Stimuli to determine frequency-of-seeing (FOS) curves with similar variables. We found that attentional cueing increased contrast sensitivity when measured using the HFA. We report a difference of approximately 2 dB with size I at peripheral and mid-peripheral testing locations. For size III, cueing had a greater effect for points presented in the periphery than in the mid-periphery. There was an exponential decay of the effect of cueing with increasing number of elements cued. Cueing a size V stimulus led to no change. FOS curves generated from laboratory-based psychophysical testing confirmed an increase in contrast detection sensitivity under the same conditions. We found that the FOS curve steepened when spatial uncertainty was reduced. We show that attentional cueing increases contrast sensitivity when using a size I or size III test stimulus on the HFA when up to 8 points are cued but not when a size V stimulus is cued. We show that this cueing also alters the slope of the FOS curve. This suggests that at least 8 points should be used to minimise potential attentional factors that may affect measurement of contrast sensitivity in the visual field.
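    A frequency-of-seeing curve is commonly modeled as a sigmoidal psychometric function of stimulus contrast. The sketch below is purely illustrative (a logistic form with made-up threshold and slope values, not the study's fitted data); it shows how a steeper slope, as reported under reduced spatial uncertainty, turns a gradual detection curve into a near step function:

    ```python
    # Illustrative logistic frequency-of-seeing (FOS) curve: probability of
    # detecting a stimulus as a function of contrast (dB). Threshold and slope
    # values are hypothetical, chosen only to contrast shallow vs steep curves.
    import math

    def fos(contrast_db, threshold_db, slope):
        """P(seen) at a given contrast (logistic psychometric function)."""
        return 1 / (1 + math.exp(-slope * (contrast_db - threshold_db)))

    uncued = [round(fos(c, threshold_db=24, slope=0.5), 2) for c in (20, 24, 28)]
    cued = [round(fos(c, threshold_db=26, slope=1.5), 2) for c in (22, 26, 30)]
    print(uncued)  # shallow curve: gradual transition around threshold
    print(cued)    # steep curve: near step-like transition around threshold
    ```

    In Method of Constant Stimuli data, the slope parameter would be fitted to the observed proportions seen at each fixed contrast level; a steeper fitted slope corresponds to less trial-to-trial uncertainty about the stimulus.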

  10. Hierarchical and serial processing in the spatial auditory cortical pathway is degraded by natural aging

    OpenAIRE

    Juarez-Salinas, Dina L.; Engle, James R.; Navarro, Xochi O.; Recanzone, Gregg H.

    2010-01-01

    The compromised abilities to localize sounds and to understand speech are two hallmark deficits in aged individuals. The auditory cortex is necessary for these processes, yet we know little about how normal aging affects these early cortical fields. In this study, we recorded the spatial tuning of single neurons in primary (area A1) and secondary (area CL) auditory cortical areas in young and aged alert rhesus macaques. We found that the neurons of aged animals had greater spontaneous and dri...

  11. A dynamic model of how feature cues guide spatial attention

    OpenAIRE

    Hamker, Fred H.

    2004-01-01

    We will describe a computational model of attention which explains the guidance of spatial attention by feedback within a distributed network. We hypothesize that feedback within the ventral pathway transfers the target template from prefrontal areas into intermediate areas like V4. The oculomotor circuit consisting of FEF, LIP and superior colliculus picks up this distributed activity and provides a continuous spatial reentry signal from premotor cells. In order to test this hypothesis, we s...

  12. The Effect of Tactile Cues on Auditory Stream Segregation Ability of Musicians and Nonmusicians

    DEFF Research Database (Denmark)

    Slater, Kyle D.; Marozeau, Jeremy

    2016-01-01

    Difficulty perceiving music is often cited as one of the main problems facing hearing-impaired listeners. It has been suggested that musical enjoyment could be enhanced if sound information absent due to impairment is transmitted via other sensory modalities such as vision or touch. In this study...... was always better; however, the magnitude of improvement with the introduction of tactile cues was similar in both groups. This study suggests that hearing-impaired listeners could potentially benefit from a system transmitting such information via a tactile modality...

  13. Flexible spatial perspective-taking: Conversational partners weigh multiple cues in collaborative tasks

    Directory of Open Access Journals (Sweden)

    Alexia eGalati

    2013-09-01

    Full Text Available Research on spatial perspective-taking often focuses on the cognitive processes of isolated individuals as they adopt or maintain imagined perspectives. Collaborative studies of spatial perspective-taking typically examine speakers’ linguistic choices, while overlooking their underlying processes and representations. We review evidence from two collaborative experiments that examine the contribution of social and representational cues to spatial perspective choices in both language and the organization of spatial memory. Across experiments, speakers organized their memory representations according to the convergence of various cues. When layouts were randomly configured and did not afford intrinsic cues, speakers encoded their partner’s viewpoint in memory, if available, but did not use it as an organizing direction. On the other hand, when the layout afforded an intrinsic structure, speakers organized their spatial memories according to the person-centered perspective reinforced by the layout’s structure. Similarly, in descriptions, speakers considered multiple cues, whether available a priori or arising during the interaction. They used partner-centered expressions more frequently (e.g., to your right) when the partner’s viewpoint was misaligned by a small offset or coincided with the layout’s structure. Conversely, they used egocentric expressions more frequently when their own viewpoint coincided with the intrinsic structure or when the partner was misaligned by a computationally difficult, oblique offset. Based on these findings, we advocate for a framework for flexible perspective-taking: people weigh multiple cues (including social ones) to make attributions about the relative difficulty of perspective-taking for each partner, and adapt behavior to minimize their collective effort. This framework is not specialized for spatial reasoning but instead emerges from the same principles and memory-dependent processes that govern perspective-taking in

  14. Training-induced plasticity of auditory localization in adult mammals.

    Directory of Open Access Journals (Sweden)

    Oliver Kacelnik

    2006-04-01

    Full Text Available Accurate auditory localization relies on neural computations based on spatial cues present in the sound waves at each ear. The values of these cues depend on the size, shape, and separation of the two ears and can therefore vary from one individual to another. As with other perceptual skills, the neural circuits involved in spatial hearing are shaped by experience during development and retain some capacity for plasticity in later life. However, the factors that enable and promote plasticity of auditory localization in the adult brain are unknown. Here we show that mature ferrets can rapidly relearn to localize sounds after having their spatial cues altered by reversibly occluding one ear, but only if they are trained to use these cues in a behaviorally relevant task, with greater and more rapid improvement occurring with more frequent training. We also found that auditory adaptation is possible in the absence of vision or error feedback. Finally, we show that this process involves a shift in sensitivity away from the abnormal auditory spatial cues to other cues that are less affected by the earplug. The mature auditory system is therefore capable of adapting to abnormal spatial information by reweighting different localization cues. These results suggest that training should facilitate acclimatization to hearing aids in the hearing impaired.
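    The cue-reweighting idea in this abstract can be caricatured as a reliability-weighted combination of azimuth estimates from different localization cues. Everything below (the cue values, the reliabilities, the function name) is a hypothetical illustration, not the model fitted in the study:

```python
import numpy as np

def combine_cues(estimates, reliabilities):
    """Reliability-weighted average of azimuth estimates from different cues.

    estimates     -- azimuth estimate (degrees) from each cue
    reliabilities -- inverse-variance reliability assigned to each cue
    Toy sketch of cue reweighting; values are illustrative only.
    """
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                       # normalize to weights summing to 1
    return float(np.dot(w, estimates))

# Before occlusion: binaural and spectral cues agree and are equally reliable.
print(combine_cues([20.0, 22.0], [1.0, 1.0]))   # -> 21.0

# With one ear plugged, the binaural estimate becomes biased and unreliable;
# down-weighting it pulls the combined estimate toward the intact cue.
print(combine_cues([35.0, 22.0], [0.1, 1.0]))
```

    Under this reading, "reweighting" is simply a change in the reliabilities assigned to each cue, shifting the combined estimate toward the cues least affected by the earplug.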

  15. Overshadowing of geometric cues by a beacon in a spatial navigation task.

    Science.gov (United States)

    Redhead, Edward S; Hamilton, Derek A; Parker, Matthew O; Chan, Wai; Allison, Craig

    2013-06-01

    In three experiments, we examined whether overshadowing of geometric cues by a discrete landmark (beacon) is due to the relative saliences of the cues. Using a virtual water maze task, human participants were required to locate a platform marked by a beacon in a distinctively shaped pool. In Experiment 1, the beacon overshadowed geometric cues in a trapezium, but not in an isosceles triangle. The longer escape latencies during acquisition in the trapezium control group with no beacon suggest that the geometric cues in the trapezium were less salient than those in the triangle. In Experiment 2, we evaluated whether generalization decrement, caused by the removal of the beacon at test, could account for overshadowing. An additional beacon was placed in an alternative corner. For the control groups, the beacons were identical; for the overshadow groups, they were visually unique. Overshadowing was again found in the trapezium. In Experiment 3, we tested whether the absence of overshadowing in the triangle was due to the geometric cues being more salient than the beacon. Following training, the beacon was relocated to a different corner. Participants approached the beacon rather than the trained platform corner, suggesting that the beacon was more salient. These results suggest that associative processes do not fully explain cue competition in the spatial domain.
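    The associative account under test here is usually formalized with the Rescorla-Wagner model, in which overshadowing falls out of relative cue salience: cues trained in compound share a common prediction error, so the more salient cue acquires more associative strength. The salience and learning-rate values below are arbitrary, for illustration only:

```python
def rescorla_wagner(saliences, beta=0.1, lam=1.0, trials=100):
    """Associative strengths for cues trained in compound (Rescorla-Wagner).

    saliences -- alpha value for each cue (e.g., beacon vs. geometry)
    Parameter values are arbitrary; this is not a fit to the experiments.
    """
    V = [0.0] * len(saliences)
    for _ in range(trials):
        error = lam - sum(V)              # prediction error shared by all cues
        V = [v + a * beta * error for v, a in zip(V, saliences)]
    return V

# Salient beacon compounded with weak geometric cues (cf. the trapezium):
v_beacon, v_geom = rescorla_wagner([0.5, 0.1])
print(v_beacon > v_geom)    # True: the beacon overshadows geometry

# With equally salient geometry (cf. the triangle), strengths are shared evenly:
print(rescorla_wagner([0.5, 0.5]))
```

    The model predicts overshadowing whenever saliences differ, which is why the reported pattern (overshadowing in the trapezium but not the triangle, plus the relocation result in Experiment 3) is taken as evidence that associative processes alone do not fully explain spatial cue competition.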

  16. Assessing implicit odor localization in humans using a cross-modal spatial cueing paradigm.

    Directory of Open Access Journals (Sweden)

    Carolin Moessnang

    Full Text Available BACKGROUND: Navigation based on chemosensory information is one of the most important skills in the animal kingdom. Studies on odor localization suggest that humans have lost this ability. However, the experimental approaches used so far were limited to explicit judgements, which might ignore a residual ability for directional smelling on an implicit level without conscious appraisal. METHODS: A novel cueing paradigm was developed in order to determine whether an implicit ability for directional smelling exists. Participants performed a visual two-alternative forced choice task in which the target was preceded either by a side-congruent or a side-incongruent olfactory spatial cue. An explicit odor localization task was implemented in a second experiment. RESULTS: No effect of cue congruency on mean reaction times could be found. However, a time by condition interaction emerged, with significantly slower responses to congruently compared to incongruently cued targets at the beginning of the experiment. This cueing effect gradually disappeared throughout the course of the experiment. In addition, participants performed at chance level in the explicit odor localization task, thus confirming the results of previous research. CONCLUSION: The implicit cueing task suggests the existence of spatial information processing in the olfactory system. Response slowing after a side-congruent olfactory cue is interpreted as a cross-modal attentional interference effect. In addition, habituation might have led to a gradual disappearance of the cueing effect. It is concluded that under immobile conditions with passive monorhinal stimulation, humans are unable to explicitly determine the location of a pure odorant. Implicitly, however, odor localization seems to exert an influence on human behaviour. To our knowledge, these data are the first to show implicit effects of odor localization on overt human behaviour and thus support the hypothesis of residual

  17. Use of local visual cues for spatial orientation in terrestrial toads (Rhinella arenarum): The role of distance to a goal.

    Science.gov (United States)

    Daneri, M Florencia; Casanave, Emma B; Muzio, Rubén N

    2015-08-01

    The use of environmental visual cues for navigation is an ability present in many groups of animals. The effect of spatial proximity between a visual cue and a goal on reorientation in an environment has been studied in several vertebrate groups, but never previously in amphibians. In this study, we tested the use of local visual cues (beacons) to orient in an open field in the terrestrial toad (Rhinella arenarum). Experiment 1 showed that toads could orient in space using 2 cues located near the rewarded container. Experiment 2 used only 1 cue placed at different distances to the goal and revealed that learning speed was affected by the proximity to the goal (the closer the cue was to the goal, the faster toads learned its location). Experiment 3 showed that the position of a cue results in a different predictive value. Toads preferred cues located closer to the goal more than those located farther away as a reference for orientation. Present results revealed, for the first time, that (a) toads can learn to orient in an open space using visual cues, and that (b) the effect of spatial proximity between a cue and a goal, a learning phenomenon previously observed in other groups of animals such as mammals, birds, fish, and invertebrates, also affects orientation in amphibians. Thus, our results suggest that toads are able to employ spatial strategies that closely parallel those described in other vertebrate groups, supporting an early evolutionary origin for these spatial orientation skills.

  18. The Use of Spatialized Speech in Auditory Interfaces for Computer Users Who Are Visually Impaired

    Science.gov (United States)

    Sodnik, Jaka; Jakus, Grega; Tomazic, Saso

    2012-01-01

    Introduction: This article reports on a study that explored the benefits and drawbacks of using spatially positioned synthesized speech in auditory interfaces for computer users who are visually impaired (that is, are blind or have low vision). The study was a practical application of such systems--an enhanced word processing application compared…

  19. Perception of Auditory, Visual, and Egocentric Spatial Alignment Adapts Differently to Changes in Eye Position

    OpenAIRE

    Cui, Qi N; Razavi, Babak; O'Neill, William E.; Paige, Gary D.

    2009-01-01

    Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (...

  20. Encoding and retrieval of landmark-related spatial cues during navigation: an fMRI study.

    Science.gov (United States)

    Wegman, Joost; Tyborowska, Anna; Janzen, Gabriele

    2014-07-01

    To successfully navigate, humans can use different cues from their surroundings. Learning locations in an environment can be supported by parallel subsystems in the hippocampus and the striatum. We used fMRI to look at differences in the use of object-related spatial cues while 47 participants actively navigated in an open-field virtual environment. In each trial, participants navigated toward a target object. During encoding, three positional cues (columns) with directional cues (shadows) were available. During retrieval, the removed target had to be replaced while either two objects without shadows (objects trial) or one object with a shadow (shadow trial) were available. Participants were informed in blocks about which type of retrieval trial was most likely to occur, thereby modulating expectations of having to rely on a single landmark or on a configuration of landmarks. How the spatial learning systems in the hippocampus and caudate nucleus were involved in these landmark-based encoding and retrieval processes was investigated. Landmark configurations can create a geometry similar to boundaries in an environment. It was found that the hippocampus was involved in encoding when relying on configurations of landmarks, whereas the caudate nucleus was involved in encoding when relying on single landmarks. This might suggest that the observed hippocampal activation for configurations of objects is linked to a spatial representation observed with environmental boundaries. Retrieval based on configurations of landmarks activated regions associated with the spatial updating of object locations for reorientation. When only a single landmark was available during retrieval, regions associated with updating the location of oneself were activated. There was also evidence that good between-participant performance was predicted by right hippocampal activation.
This study therefore sheds light on how the brain deals with changing demands on spatial processing related purely

  1. Effects of spatially correlated acoustic-tactile information on judgments of auditory circular direction

    Science.gov (United States)

    Cohen, Annabel J.; Lamothe, M. J. Reina; Toms, Ian D.; Fleming, Richard A. G.

    2002-05-01

    Cohen, Lamothe, Fleming, MacIsaac, and Lamoureux [J. Acoust. Soc. Am. 109, 2460 (2001)] reported that proximity governed circular direction judgments (clockwise/counterclockwise) of two successive tones emanating from all pairs of 12 speakers located at 30-degree intervals around a listener's head (cranium). Many listeners appeared to experience systematic front-back confusion. Diametrically opposed locations (180 degrees, a theoretically ambiguous direction) produced a direction bias pattern resembling Deutsch's tritone paradox [Deutsch, Kuyper, and Fisher, Music Percept. 5, 79-92 (1987)]. In Experiment 1 of the present study, the circular direction task was conducted in the tactile domain using 12 circumcranial points of vibration. For all 5 participants, proximity governed direction (without front-back confusion) and a simple clockwise bias was shown for 180-degree pairs. Experiment 2 tested 9 new participants in one unimodal auditory condition and two bimodal auditory-tactile conditions (spatially-correlated/spatially-uncorrelated). Correlated auditory-tactile information eliminated front-back confusion for 8 participants and replaced the ``paradoxical'' bias for 180-degree pairs with the clockwise bias. Thus, spatially correlated audio-tactile location information improves the veridical representation of 360-degree acoustic space, and modality-specific principles are implicated by the unique circular direction bias patterns for 180-degree pairs in the separate auditory and tactile modalities. [Work supported by NSERC.]
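    The proximity rule for circular direction amounts to taking the shorter arc between the two stimulus angles, with 180-degree pairs left genuinely ambiguous. A minimal sketch of that rule (an illustrative reconstruction, not the authors' analysis code) might look like:

```python
def circular_direction(a, b):
    """Direction judgment predicted by proximity for a stimulus moving from
    angle a to angle b (degrees clockwise from straight ahead).

    Returns 'CW', 'CCW', or 'ambiguous' for diametrically opposed pairs.
    """
    diff = (b - a) % 360             # wrapped angular difference in [0, 360)
    if diff == 180:
        return 'ambiguous'           # no proximity cue for opposed locations
    return 'CW' if diff < 180 else 'CCW'

print(circular_direction(30, 90))    # CW: 60 deg clockwise is the shorter arc
print(circular_direction(30, 300))   # CCW: 90 deg counterclockwise is shorter
print(circular_direction(0, 180))    # ambiguous
```

    The reported biases (front-back confusion, the clockwise bias for 180-degree tactile pairs) are exactly the cases where this proximity rule underdetermines the answer and modality-specific factors take over.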

  2. Spatially valid proprioceptive cues improve the detection of a visual stimulus

    DEFF Research Database (Denmark)

    Jackson, Carl P T; Miall, R Chris; Balslev, Daniela

    2010-01-01

    Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention......, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement...... whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant's arm...

  3. The Relationship between Visual-Spatial and Auditory-Verbal Working Memory Span in Senegalese and Ugandan Children

    OpenAIRE

    Michael J Boivin; Paul Bangirana; Rebecca C Shaffer

    2010-01-01

    BACKGROUND: Using the Kaufman Assessment Battery for Children (K-ABC) Conant et al. (1999) observed that visual and auditory working memory (WM) span were independent in both younger and older children from DR Congo, but related in older American children and in Lao children. The present study evaluated whether visual and auditory WM span were independent in Ugandan and Senegalese children. METHOD: In a linear regression analysis we used visual (Spatial Memory, Hand Movements) and auditory (N...

  4. Cross-modal activation of auditory regions during visuo-spatial working memory in early deafness.

    Science.gov (United States)

    Ding, Hao; Qin, Wen; Liang, Meng; Ming, Dong; Wan, Baikun; Li, Qiang; Yu, Chunshui

    2015-09-01

    Early deafness can reshape deprived auditory regions to enable the processing of signals from the remaining intact sensory modalities. Cross-modal activation has been observed in auditory regions during non-auditory tasks in early deaf subjects. In hearing subjects, visual working memory can evoke activation of the visual cortex, which further contributes to behavioural performance. In early deaf subjects, however, whether and how auditory regions participate in visual working memory remains unclear. We hypothesized that auditory regions may be involved in visual working memory processing and activation of auditory regions may contribute to the superior behavioural performance of early deaf subjects. In this study, 41 early deaf subjects (22 females and 19 males, age range: 20-26 years, age of onset of deafness deaf subjects exhibited faster reaction times on the spatial working memory task than did the hearing controls. Compared with hearing controls, deaf subjects exhibited increased activation in the superior temporal gyrus bilaterally during the recognition stage. This increased activation amplitude predicted faster and more accurate working memory performance in deaf subjects. Deaf subjects also had increased activation in the superior temporal gyrus bilaterally during the maintenance stage and in the right superior temporal gyrus during the encoding stage. These increased activation amplitudes also predicted faster reaction times on the spatial working memory task in deaf subjects. These findings suggest that cross-modal plasticity occurs in auditory association areas in early deaf subjects. These areas are involved in visuo-spatial working memory. Furthermore, amplitudes of cross-modal activation during the maintenance stage were positively correlated with the age of onset of hearing aid use and were negatively correlated with the percentage of lifetime hearing aid use in deaf subjects. These findings suggest that earlier and longer hearing aid use may

  5. Male Music Frogs Compete Vocally on the Basis of Temporal Sequence Rather Than Spatial Cues of Rival Calls

    Institute of Scientific and Technical Information of China (English)

    Fan JIANG; Guangzhan FANG; Fei XUE; Jianguo CUI; Steven E BRAUTH; Yezhong TANG

    2015-01-01

    Male-male vocal competition in anuran species may be influenced by cues related to the temporal sequence of male calls as well as by internal temporal, spectral and spatial ones. Nevertheless, the conditions under which each type of cue is important remain unclear. Since the salience of different cues could be reflected by dynamic properties of male-male competition under certain experimental manipulation, we investigated the effects of repeating playbacks of conspecific calls on male call production in the Emei music frog (Babina daunchina). In Babina, most males produce calls from nest burrows, which modify the spectral features of the calls. Females prefer calls produced from inside burrows, which are defined as highly sexually attractive (HSA), while those produced outside burrows are defined as having low sexual attractiveness (LSA). In this study HSA and LSA calls were broadcast either antiphonally or stereophonically through spatially separated speakers in which the temporal sequence and/or spatial position of the playbacks was either predictable or random. Results showed that most males consistently avoided producing advertisement calls overlapping the playback stimuli and generally produced calls competitively in advance of the playbacks. Furthermore males preferentially competed with the HSA calls when the sequence was predictable but competed equally with HSA and LSA calls if the sequence was random regardless of the availability of spatial cues, implying that males relied more on available sequence cues than spatial ones to remain competitive.

  6. Interference between postural control and spatial vs. non-spatial auditory reaction time tasks in older adults.

    Science.gov (United States)

    Fuhrman, Susan I; Redfern, Mark S; Jennings, J Richard; Furman, Joseph M

    2015-01-01

    This study investigated whether spatial aspects of an information processing task influence dual-task interference. Two groups (Older/Young) of healthy adults participated in dual-task experiments. Two auditory information processing tasks included a frequency discrimination choice reaction time task (non-spatial task) and a lateralization choice reaction time task (spatial task). Postural tasks included combinations of standing with eyes open or eyes closed on either a fixed floor or a sway-referenced floor. Reaction times and postural sway via center of pressure were recorded. Baseline measures of reaction time and sway were subtracted from the corresponding dual-task results to calculate reaction time task costs and postural task costs. Reaction time task cost increased with eye closure (p = 0.01), sway-referenced flooring (p visual-spatial interference may occur in older subjects when vision is used to maintain posture. PMID:26410669

  7. Comparison of Gated Audiovisual Speech Identification in Elderly Hearing Aid Users and Elderly Normal-Hearing Individuals: Effects of Adding Visual Cues to Auditory Speech Stimuli.

    Science.gov (United States)

    Moradi, Shahram; Lidestam, Björn; Rönnberg, Jerker

    2016-06-17

    The present study compared elderly hearing aid (EHA) users (n = 20) with elderly normal-hearing (ENH) listeners (n = 20) in terms of isolation points (IPs, the shortest time required for correct identification of a speech stimulus) and accuracy of audiovisual gated speech stimuli (consonants, words, and final words in highly and less predictable sentences) presented in silence. In addition, we compared the IPs of audiovisual speech stimuli from the present study with auditory ones extracted from a previous study, to determine the impact of the addition of visual cues. Both participant groups achieved ceiling levels in terms of accuracy in the audiovisual identification of gated speech stimuli; however, the EHA group needed longer IPs for the audiovisual identification of consonants and words. The benefit of adding visual cues to auditory speech stimuli was more evident in the EHA group, as audiovisual presentation significantly shortened the IPs for consonants, words, and final words in less predictable sentences; in the ENH group, audiovisual presentation only shortened the IPs for consonants and words. In conclusion, although the audiovisual benefit was greater for the EHA group, this group had inferior performance compared with the ENH group in terms of IPs when supportive semantic context was lacking. Consequently, EHA users needed the initial part of the audiovisual speech signal to be longer than did their counterparts with normal hearing to reach the same level of accuracy in the absence of a semantic context.

  8. Follow the Sound : Design of mobile spatial audio applications for pedestrian navigation

    OpenAIRE

    2012-01-01

    Auditory displays are slower than graphical user interfaces. We believe spatial audio can change that. Human listeners can localize the position of sound sources thanks to psychoacoustical cues. Spatial audio reproduces these cues over headphones to create virtual sound source positions. This spatial attribute of sound can be used to produce richer and more effective auditory displays. This work proposes a set of interaction design guidelines for the use of spatial audio displays i...
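    The psychoacoustical cues the abstract refers to are chiefly the interaural time difference (ITD) and interaural level difference (ILD). A crude headphone spatialization can be sketched by delaying and attenuating the far-ear channel; the Woodworth ITD formula, the head radius, and the broadband ILD approximation below are textbook assumptions, and real systems use measured head-related transfer functions (HRTFs) instead:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, an assumed average head radius

def itd_seconds(azimuth_deg):
    """Interaural time difference for a source azimuth (Woodworth model)."""
    az = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (az + np.sin(az))

def spatialize(mono, azimuth_deg, fs=44100, ild_db_max=6.0):
    """Crude headphone spatialization: delay and attenuate the far ear."""
    delay = int(np.round(abs(itd_seconds(azimuth_deg)) * fs))   # samples
    gain = 10 ** (-ild_db_max * abs(np.sin(np.radians(azimuth_deg))) / 20)
    far = np.concatenate([np.zeros(delay), mono]) * gain   # delayed, quieter
    near = np.concatenate([mono, np.zeros(delay)])         # pad to same length
    if azimuth_deg >= 0:          # source on the right: left ear is the far ear
        return np.stack([far, near], axis=0)
    return np.stack([near, far], axis=0)

tone = np.sin(2 * np.pi * 440 * np.arange(4410) / 44100)
stereo = spatialize(tone, 45)     # 440 Hz tone placed 45 degrees to the right
```

    Even this two-cue sketch conveys lateral position over headphones; elevation and front/back disambiguation additionally require the spectral filtering that HRTFs capture.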

  9. Cues, context, and long-term memory: the role of the retrosplenial cortex in spatial cognition

    Directory of Open Access Journals (Sweden)

    Adam M P Miller

    2014-08-01

    Full Text Available Spatial navigation requires representations of landmarks and other navigation cues. The retrosplenial cortex (RSC) is anatomically positioned between limbic areas important for memory formation, such as the hippocampus and the anterior thalamus, and cortical regions along the dorsal stream known to contribute importantly to long-term spatial representation, such as the posterior parietal cortex. Damage to the RSC severely impairs allocentric representations of the environment, including the ability to derive navigational information from landmarks. The specific deficits seen in tests of human and rodent navigation suggest that the RSC supports allocentric representation by processing the stable features of the environment and the spatial relationships among them. In addition to spatial cognition, the RSC plays a key role in contextual and episodic memory. The RSC also contributes importantly to the acquisition and consolidation of long-term spatial and contextual memory through its interactions with the hippocampus. Within this framework, the RSC plays a dual role as part of the feedforward network providing sensory and mnemonic input to the hippocampus and as a target of the hippocampal-dependent systems consolidation of long-term memory.

  10. The relationship between visual-spatial and auditory-verbal working memory span in Senegalese and Ugandan children.

    Directory of Open Access Journals (Sweden)

    Michael J Boivin

    Full Text Available BACKGROUND: Using the Kaufman Assessment Battery for Children (K-ABC) Conant et al. (1999) observed that visual and auditory working memory (WM) span were independent in both younger and older children from DR Congo, but related in older American children and in Lao children. The present study evaluated whether visual and auditory WM span were independent in Ugandan and Senegalese children. METHOD: In a linear regression analysis we used visual (Spatial Memory, Hand Movements) and auditory (Number Recall) WM along with education and physical development (weight/height) as predictors. The predicted variable in this analysis was Word Order, which is a verbal memory task that has both visual and auditory memory components. RESULTS: Both the younger (<8.5 yrs) and older (>8.5 yrs) Ugandan children had auditory memory span (Number Recall) that was strongly predictive of Word Order performance. For both the younger and older groups of Senegalese children, only visual WM span (Spatial Memory) was strongly predictive of Word Order. Number Recall was not significantly predictive of Word Order in either age group. CONCLUSIONS: It is possible that greater literacy from more schooling for the Ugandan age groups mediated their greater degree of interdependence between auditory and verbal WM. Our findings support those of Conant et al., who observed in their cross-cultural comparisons that stronger education seemed to enhance the dominance of the phonological-auditory processing loop for WM.

  11. Auditory and Visual Cues for Topic Maintenance with Persons Who Exhibit Dementia of Alzheimer’s Type

    Directory of Open Access Journals (Sweden)

    Amy F. Teten

    2015-01-01

    Full Text Available This study compared the effectiveness of auditory and visual redirections in facilitating topic coherence for persons with Dementia of Alzheimer’s Type (DAT). Five persons with moderate stage DAT engaged in conversation with the first author. Three topics related to activities of daily living (recreational activities, food, and grooming) were broached. Each topic was presented three times to each participant: once as a baseline condition, once with auditory redirection to topic, and once with visual redirection to topic. Transcripts of the interactions were scored for overall coherence. Condition was a significant factor in that the DAT participants exhibited better topic maintenance under visual and auditory conditions as opposed to baseline. In general, the performance of the participants was not affected by the topic, except for significantly higher overall coherence ratings for the visually redirected interactions dealing with the topic of food.

  12. Greater anterior cingulate activation and connectivity in response to visual and auditory high-calorie food cues in binge eating: Preliminary findings.

    Science.gov (United States)

    Geliebter, Allan; Benson, Leora; Pantazatos, Spiro P; Hirsch, Joy; Carnell, Susan

    2016-01-01

    Obese individuals show altered neural responses to high-calorie food cues. Individuals with binge eating [BE], who exhibit heightened impulsivity and emotionality, may show a related but distinct pattern of irregular neural responses. However, few neuroimaging studies have compared BE and non-BE groups. To examine neural responses to food cues in BE, 10 women with BE and 10 women without BE (non-BE) who were matched for obesity (5 obese and 5 lean in each group) underwent fMRI scanning during presentation of visual (picture) and auditory (spoken word) cues representing high energy density (ED) foods, low-ED foods, and non-foods. We then compared regional brain activation in BE vs. non-BE groups for high-ED vs. low-ED foods. To explore differences in functional connectivity, we also compared psychophysiologic interactions [PPI] with dorsal anterior cingulate cortex [dACC] for BE vs. non-BE groups. Region of interest (ROI) analyses revealed that the BE group showed more activation than the non-BE group in the dACC, with no activation differences in the striatum or orbitofrontal cortex [OFC]. Exploratory PPI analyses revealed a trend towards greater functional connectivity with dACC in the insula, cerebellum, and supramarginal gyrus in the BE vs. non-BE group. Our results suggest that women with BE show hyper-responsivity in the dACC as well as increased coupling with other brain regions when presented with high-ED cues. These differences are independent of body weight, and appear to be associated with the BE phenotype. PMID:26275334

  13. Influence of age, spatial memory, and ocular fixation on localization of auditory, visual, and bimodal targets by human subjects.

    Science.gov (United States)

    Dobreva, Marina S; O'Neill, William E; Paige, Gary D

    2012-12-01

    A common complaint of the elderly is difficulty identifying and localizing auditory and visual sources, particularly in competing background noise. Spatial errors in the elderly may pose challenges and even threats to self and others during everyday activities, such as localizing sounds in a crowded room or driving in traffic. In this study, we investigated the influence of aging, spatial memory, and ocular fixation on the localization of auditory, visual, and combined auditory-visual (bimodal) targets. Head-restrained young and elderly subjects localized targets in a dark, echo-attenuated room using a manual laser pointer. Localization accuracy and precision (repeatability) were quantified for both ongoing and transient (remembered) targets at response delays up to 10 s. Because eye movements bias auditory spatial perception, localization was assessed under target fixation (eyes free, pointer guided by foveal vision) and central fixation (eyes fixed straight ahead, pointer guided by peripheral vision) conditions. Spatial localization across the frontal field in young adults demonstrated (1) horizontal overshoot and vertical undershoot for ongoing auditory targets under target fixation conditions, but near-ideal horizontal localization with central fixation; (2) accurate and precise localization of ongoing visual targets guided by foveal vision under target fixation that degraded when guided by peripheral vision during central fixation; (3) overestimation in horizontal central space (±10°) of remembered auditory, visual, and bimodal targets with increasing response delay. In comparison with young adults, elderly subjects showed (1) worse precision in most paradigms, especially when localizing with peripheral vision under central fixation; (2) greatly impaired vertical localization of auditory and bimodal targets; (3) increased horizontal overshoot in the central field for remembered visual and bimodal targets across response delays; (4) greater vulnerability to

  14. Using spatial manipulation to examine interactions between visual and auditory encoding of pitch and time

    Directory of Open Access Journals (Sweden)

    Neil M McLachlan

    2010-12-01

    Full Text Available Music notations use both symbolic and spatial representation systems. Novice musicians do not have the training to associate symbolic information with musical identities, such as chords or rhythmic and melodic patterns. They provide an opportunity to explore the mechanisms underpinning multimodal learning when spatial encoding strategies of feature dimensions might be expected to dominate. In this study, we applied a range of transformations (such as time reversal) to short melodies and rhythms and asked novice musicians to identify them with or without the aid of notation. Performance using a purely spatial (graphic) notation was contrasted with the more symbolic, traditional western notation over a series of weekly sessions. The results showed learning effects for both notation types, but performance improved more for graphic notation. This points to greater compatibility of auditory and visual neural codes for novice musicians when using spatial notation, suggesting that pitch and time may be spatially encoded in multimodal associative memory. The findings also point to new strategies for training novice musicians.

  15. Trial-by-trial changes in a priori informational value of external cues and subjective expectancies in human auditory attention.

    Directory of Open Access Journals (Sweden)

    Antonio Arjona

    Full Text Available BACKGROUND: Preparatory activity based on a priori probabilities generated in previous trials and subjective expectancies would produce an attentional bias. However, preparation can be correct (valid) or incorrect (invalid) depending on the actual target stimulus. The alternation effect refers to the subjective expectancy that a target will not be repeated in the same position, causing RTs to increase if the target location is repeated. The present experiment, using Posner's central cue paradigm, tries to demonstrate that not only the credibility of the cue, but also the expectancy about the next position of the target, is updated on a trial-by-trial basis. Sequences of trials were analyzed. RESULTS: The results indicated an increase in RT benefits when sequences of two and three valid trials occurred. The analysis of errors indicated an increase in anticipatory behavior which grows as the number of valid trials increases. On the other hand, there was also an RT benefit when a trial was preceded by trials in which the position of the target changed with respect to the current trial (alternation effect). Sequences of two alternations or two repetitions were faster than sequences of trials in which a pattern of repetition or alternation was broken. CONCLUSIONS: Taken together, these results suggest that in Posner's central cue paradigm, and with regard to anticipatory activity, the credibility of the external cue and of the endogenously anticipated patterns of target location are constantly updated. The results suggest that Bayesian rules operate in the generation of anticipatory activity as a function of the previous trial's outcome, but also on biases or prior beliefs such as the "gambler's fallacy".
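The trial-by-trial updating of cue credibility described in this record can be illustrated with a minimal running-estimate model. This is a sketch under an assumed Beta-prior bookkeeping scheme, not the study's actual analysis; the function name and prior parameters are hypothetical.

```python
# Illustrative sketch (not the study's model): a simple Bayesian-style
# trial-by-trial update of the believed validity of a central cue.
def update_cue_credibility(trials, prior_valid=1.0, prior_invalid=1.0):
    """Return the estimated cue validity after each trial.

    trials: sequence of booleans (True = the cue was valid on that trial).
    Starts from a Beta(prior_valid, prior_invalid) belief and returns the
    posterior mean of validity after each observation.
    """
    a, b = prior_valid, prior_invalid
    estimates = []
    for valid in trials:
        if valid:
            a += 1
        else:
            b += 1
        estimates.append(a / (a + b))  # posterior mean of P(valid)
    return estimates

# A run of valid trials raises the expected validity (paralleling the RT
# benefits reported above); an invalid trial pulls the estimate back down.
print(update_cue_credibility([True, True, True, False]))
```

Under this toy model, each valid trial strengthens the preparatory bias toward the cued location, which is one simple way to capture the sequence effects the authors report.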

  16. The Effects of Cueing Temporal and Spatial Attention on Word Recognition in a Complex Listening Task in Hearing-Impaired Listeners

    OpenAIRE

    Gatehouse, Stuart; Akeroyd, Michael A.

    2008-01-01

    In a complex listening situation such as a multiperson conversation, the demands on an individual's attention are considerable: There will often be many sounds occurring simultaneously, with continual changes in source and direction. A laboratory analog of this was designed to measure the benefit that helping attention (by visual cueing) would have on word identification. These words were presented unpredictably but were sometimes cued with a temporal cue or a temporal-and-spatial cue. Two gr...

  17. Express attentional re-engagement but delayed entry into consciousness following invalid spatial cues in visual search.

    Directory of Open Access Journals (Sweden)

    Benoit Brisson

    Full Text Available BACKGROUND: In predictive spatial cueing studies, reaction times (RTs) are shorter for targets appearing at cued locations (valid trials) than at other locations (invalid trials). An increase in the amplitude of early P1 and/or N1 event-related potential (ERP) components is also present for items appearing at cued locations, reflecting early attentional sensory gain control mechanisms. However, it is still unknown at which stage in the processing stream these early amplitude effects are translated into latency effects. METHODOLOGY/PRINCIPAL FINDINGS: Here, we measured the latency of two ERP components, the N2pc and the sustained posterior contralateral negativity (SPCN), to evaluate whether visual selection (as indexed by the N2pc) and visual short-term memory processes (as indexed by the SPCN) are delayed in invalid trials compared to valid trials. The P1 was larger contralateral to the cued side, indicating that attention was deployed to the cued location prior to target onset. Despite these early amplitude effects, N2pc onset latency was unaffected by cue validity, indicating an express, quasi-instantaneous re-engagement of attention in invalid trials. In contrast, latency effects were observed for the SPCN, and these were correlated with the RT effect. CONCLUSIONS/SIGNIFICANCE: Results show that latency differences that could explain the RT cueing effects must occur after the visual selection processes giving rise to the N2pc, but at or before transfer into visual short-term memory, as reflected by the SPCN, at least in discrimination tasks in which the target is presented concurrently with at least one distractor. Given that the SPCN has previously been associated with conscious report, these results further show that entry into consciousness is delayed following invalid cues.
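Component latency effects like the SPCN delay reported in this record are commonly quantified with fractional-area latency (the time at which a waveform's cumulative area reaches a set fraction of its total area within a measurement window). The sketch below is illustrative only, with a toy waveform and window; it is not the study's analysis pipeline.

```python
# Illustrative sketch: fractional-area latency of an ERP component.
def fractional_area_latency(times_ms, amplitudes, fraction=0.5):
    """Time at which cumulative |amplitude| reaches `fraction` of the
    total rectified area within the measurement window."""
    areas = [abs(a) for a in amplitudes]
    total = sum(areas)
    cumulative = 0.0
    for t, area in zip(times_ms, areas):
        cumulative += area
        if cumulative >= fraction * total:
            return t
    return times_ms[-1]

# Toy negative-going component sampled every 10 ms in a 200-300 ms window:
times = list(range(200, 300, 10))
wave = [0, 0, -1, -3, -5, -5, -3, -1, 0, 0]
print(fractional_area_latency(times, wave))
```

Comparing this latency between valid and invalid trials, per subject, is one standard way to test whether a component such as the SPCN is delayed.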

  18. Characterizing spatial tuning functions of neurons in the auditory cortex of young and aged monkeys: A new perspective on old data.

    OpenAIRE

    James Engle; Gregg H Recanzone

    2013-01-01

    Age-related hearing deficits are a leading cause of disability among the aged. While some forms of hearing deficits are peripheral in origin, others are centrally mediated. One such deficit is the ability to localize sounds, a critical component for segregating different acoustic objects and events, which is dependent on the auditory cortex. Recent evidence indicates that in aged animals the normal sharpening of spatial tuning between neurons in primary auditory cortex to the caudal latera...

  19. Temporal asymmetries in auditory coding and perception reflect multi-layered nonlinearities.

    Science.gov (United States)

    Deneux, Thomas; Kempf, Alexandre; Daret, Aurélie; Ponsot, Emmanuel; Bathellier, Brice

    2016-01-01

    Sound recognition relies not only on spectral cues, but also on temporal cues, as demonstrated by the profound impact of time reversals on perception of common sounds. To address the coding principles underlying such auditory asymmetries, we recorded a large sample of auditory cortex neurons using two-photon calcium imaging in awake mice, while playing sounds ramping up or down in intensity. We observed clear asymmetries in cortical population responses, including stronger cortical activity for up-ramping sounds, which matches perceptual saliency assessments in mice and previous measures in humans. Analysis of cortical activity patterns revealed that auditory cortex implements a map of spatially clustered neuronal ensembles, detecting specific combinations of spectral and intensity modulation features. Comparing different models, we show that cortical responses result from multi-layered nonlinearities, which, contrary to standard receptive field models of auditory cortex function, build divergent representations of sounds with similar spectral content, but different temporal structure. PMID:27580932
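The up-/down-ramping stimuli described in this record can be sketched as linear amplitude envelopes applied to a tone. This is an illustrative reconstruction with assumed parameters (frequency, duration, sample rate), not the study's stimulus-generation code.

```python
# Illustrative sketch: a pure tone with a linear intensity ramp.
import math

def ramped_tone(freq_hz=440.0, dur_s=0.5, sr=16000, up=True):
    """Sine tone with a linear envelope (up=True: quiet -> loud)."""
    n = int(dur_s * sr)
    samples = []
    for i in range(n):
        env = i / (n - 1)              # linear envelope, 0 -> 1
        if not up:
            env = 1.0 - env            # down-ramp: time-reversed envelope
        samples.append(env * math.sin(2 * math.pi * freq_hz * i / sr))
    return samples

# Up- and down-ramps share spectral content but differ in temporal
# structure, which is exactly the contrast the study exploits.
up_ramp = ramped_tone(up=True)
down_ramp = ramped_tone(up=False)
```

Because the two stimuli are spectrally matched, any asymmetry in cortical responses must reflect sensitivity to temporal structure rather than frequency content.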

  20. Colorful Success: Preschoolers' Use of Perceptual Color Cues to Solve a Spatial Reasoning Problem

    Science.gov (United States)

    Joh, Amy S.; Spivey, Leigh A.

    2012-01-01

    Spatial reasoning, a crucial skill for everyday actions, develops gradually during the first several years of childhood. Previous studies have shown that perceptual information and problem solving strategies are critical for successful spatial reasoning in young children. Here, we sought to link these two factors by examining children's use of…

  1. Auditory attention in childhood and adolescence: An event-related potential study of spatial selective attention to one of two simultaneous stories

    Directory of Open Access Journals (Sweden)

    Christina M. Karns

    2015-06-01

    Full Text Available Auditory selective attention is a critical skill for goal-directed behavior, especially where noisy distractions may impede focusing attention. To better understand the developmental trajectory of auditory spatial selective attention in an acoustically complex environment, in the current study we measured auditory event-related potentials (ERPs across five age groups: 3–5 years; 10 years; 13 years; 16 years; and young adults. Using a naturalistic dichotic listening paradigm, we characterized the ERP morphology for nonlinguistic and linguistic auditory probes embedded in attended and unattended stories. We documented robust maturational changes in auditory evoked potentials that were specific to the types of probes. Furthermore, we found a remarkable interplay between age and attention-modulation of auditory evoked potentials in terms of morphology and latency from the early years of childhood through young adulthood. The results are consistent with the view that attention can operate across age groups by modulating the amplitude of maturing auditory early-latency evoked potentials or by invoking later endogenous attention processes. Development of these processes is not uniform for probes with different acoustic properties within our acoustically dense speech-based dichotic listening task. In light of the developmental differences we demonstrate, researchers conducting future attention studies of children and adolescents should be wary of combining analyses across diverse ages.

  2. Plasticity in the neural coding of auditory space in the mammalian brain

    Science.gov (United States)

    King, Andrew J.; Parsons, Carl H.; Moore, David R.

    2000-10-01

    Sound localization relies on the neural processing of monaural and binaural spatial cues that arise from the way sounds interact with the head and external ears. Neurophysiological studies of animals raised with abnormal sensory inputs show that the map of auditory space in the superior colliculus is shaped during development by both auditory and visual experience. An example of this plasticity is provided by monaural occlusion during infancy, which leads to compensatory changes in auditory spatial tuning that tend to preserve the alignment between the neural representations of visual and auditory space. Adaptive changes also take place in sound localization behavior, as demonstrated by the fact that ferrets raised and tested with one ear plugged learn to localize as accurately as control animals. In both cases, these adjustments may involve greater use of monaural spectral cues provided by the other ear. Although plasticity in the auditory space map seems to be restricted to development, adult ferrets show some recovery of sound localization behavior after long-term monaural occlusion. The capacity for behavioral adaptation is, however, task dependent, because auditory spatial acuity and binaural unmasking (a measure of the spatial contribution to the "cocktail party effect") are permanently impaired by chronically plugging one ear, both in infancy but especially in adulthood. Experience-induced plasticity allows the neural circuitry underlying sound localization to be customized to individual characteristics, such as the size and shape of the head and ears, and to compensate for natural conductive hearing losses, including those associated with middle ear disease in infancy.

  3. From ear to hand: the role of the auditory-motor loop in pointing to an auditory source

    Science.gov (United States)

    Boyer, Eric O.; Babayan, Bénédicte M.; Bevilacqua, Frédéric; Noisternig, Markus; Warusfel, Olivier; Roby-Brami, Agnes; Hanneton, Sylvain; Viaud-Delmon, Isabelle

    2013-01-01

    Studies of the nature of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements were studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand. To accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in heard hand position. Localization errors were exacerbated by short duration of target presentation but not modified by auditory feedback of hand position. Long duration of target presentation gave rise to a higher level of accuracy and was accompanied by early automatic head orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes of the acoustic cues due to changes in head orientation in order to process online motor control. How best to design informative acoustic feedback needs to be studied carefully to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space. PMID:23626532

  4. Keeping one's distance: the influence of spatial distance cues on affect and evaluation.

    Science.gov (United States)

    Williams, Lawrence E; Bargh, John A

    2008-03-01

    Current conceptualizations of psychological distance (e.g., construal-level theory) refer to the degree of overlap between the self and some other person, place, or point in time. We propose a complementary view in which perceptual and motor representations of physical distance influence people's thoughts and feelings without reference to the self, extending research and theory on the effects of distance into domains where construal-level theory is silent. Across four experiments, participants were primed with either spatial closeness or spatial distance by plotting an assigned set of points on a Cartesian coordinate plane. Compared with the closeness prime, the distance prime produced greater enjoyment of media depicting embarrassment (Study 1), less emotional distress from violent media (Study 2), lower estimates of the number of calories in unhealthy food (Study 3), and weaker reports of emotional attachments to family members and hometowns (Study 4). These results support a broader conceptualization of distance-mediated effects on judgment and affect.

  5. From repulsion to attraction: species- and spatial context-dependent threat sensitive response of the spider mite Tetranychus urticae to predatory mite cues

    Science.gov (United States)

    Fernández Ferrari, M. Celeste; Schausberger, Peter

    2013-06-01

    Prey perceiving predation risk commonly change their behavior to avoid predation. However, antipredator strategies are costly. Therefore, according to the threat-sensitive predator avoidance hypothesis, prey should match the intensity of their antipredator behaviors to the degree of threat, which may depend on the predator species and the spatial context. We assessed threat sensitivity of the two-spotted spider mite, Tetranychus urticae, to the cues of three predatory mites, Phytoseiulus persimilis, Neoseiulus californicus, and Amblyseius andersoni, posing different degrees of risk in two spatial contexts. We first conducted a no-choice test measuring oviposition and activity of T. urticae exposed to chemical traces of predators or traces plus predator eggs. Then, we tested the site preference of T. urticae in choice tests, using artificial cages and leaves. In the no-choice test, T. urticae deposited their first egg later in the presence of cues of P. persimilis than in the presence of cues of the other two predators or in cue absence, indicating interspecific threat sensitivity. T. urticae also laid fewer eggs in the presence of cues of P. persimilis and A. andersoni than of N. californicus and in cue absence. In the artificial cage test, the spider mites preferred the site with predator traces, whereas in the leaf test, they preferentially resided on leaves without traces. We argue that in a nonplant environment, chemical predator traces do not indicate a risk for T. urticae, and instead, these traces function as indirect habitat cues. The spider mites were attracted to these cues because they associated them with the existence of a nearby host plant.


  7. Spatial attention and reading ability: ERP correlates of flanker and cue-size effects in good and poor adult phonological decoders.

    Science.gov (United States)

    Matthews, Allison Jane; Martin, Frances Heritage

    2015-12-01

    To investigate facilitatory and inhibitory processes during selective attention among adults with good (n=17) and poor (n=14) phonological decoding skills, a go/nogo flanker task was completed while EEG was recorded. Participants responded to a middle target letter flanked by compatible or incompatible flankers. The target was surrounded by a small or large circular cue which was presented simultaneously or 500ms prior. Poor decoders showed a greater RT cost for incompatible stimuli preceded by large cues and less RT benefit for compatible stimuli. Poor decoders also showed reduced modulation of ERPs by cue-size at left hemisphere posterior sites (N1) and by flanker compatibility at right hemisphere posterior sites (N1) and frontal sites (N2), consistent with processing differences in fronto-parietal attention networks. These findings have potential implications for understanding the relationship between spatial attention and phonological decoding in dyslexia. PMID:26562794

  8. Crossmodal and incremental perception of audiovisual cues to emotional speech

    NARCIS (Netherlands)

    Barkhuysen, Pashiera; Krahmer, E.J.; Swerts, M.G.J.

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? …

  9. Crossmodal and Incremental Perception of Audiovisual Cues to Emotional Speech

    Science.gov (United States)

    Barkhuysen, Pashiera; Krahmer, Emiel; Swerts, Marc

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? Both experiments reported below are based on tests…

  10. The Effect of Auditory and Contextual Emotional Cues on the Ability to Recognise Facial Expressions of Emotion in Healthy Adult Aging

    OpenAIRE

    Duncan, Nikki

    2013-01-01

    Previous emotion recognition studies have suggested an age-related decline in the recognition of facial expressions of emotion. However, these studies often lack ecological validity and do not consider the multiple interacting sensory stimuli that are critical to realworld emotion recognition. In the current study, emotion recognition in everyday life was considered to comprise of the interaction between facial expressions, accompanied by an auditory expression and embedded in a situational c...

  11. Chronic exposure to broadband noise at moderate sound pressure levels spatially shifts tone-evoked responses in the rat auditory midbrain.

    Science.gov (United States)

    Lau, Condon; Pienkowski, Martin; Zhang, Jevin W; McPherson, Bradley; Wu, Ed X

    2015-11-15

    Noise-induced hearing disorders are a significant public health concern. One cause of such disorders is exposure to high sound pressure levels (SPLs) above 85 dBA for eight hours/day. High SPL exposures occur in occupational and recreational settings and affect a substantial proportion of the population. However, an even larger proportion is exposed to more moderate SPLs for longer durations. Therefore, there is significant need to better understand the impact of chronic, moderate SPL exposures on auditory processing, especially in the absence of hearing loss. In this study, we applied functional magnetic resonance imaging (fMRI) with tonal acoustic stimulation on an established broadband rat exposure model (65 dB SPL, 30 kHz low-pass, 60 days). The auditory midbrain response of exposed subjects to 7 kHz stimulation (within the exposure bandwidth) shifts dorsolaterally to regions that typically respond to lower stimulation frequencies. This shift is quantified by a region of interest analysis that shows that fMRI signals are higher in the dorsolateral midbrain of exposed subjects and in the ventromedial midbrain of control subjects. These results suggest that midbrain regions above the exposure bandwidth spatially expand due to exposure. This expansion shifts lower frequency regions dorsolaterally. Similar observations have previously been made in the rat auditory cortex. Therefore, moderate SPL exposures affect auditory processing at multiple levels, from the auditory cortex to the midbrain.
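A region-of-interest (ROI) group comparison like the one reported in this record can be sketched as follows. The signal values are hypothetical and the statistic is a plain Welch's t; this is not the study's actual fMRI pipeline.

```python
# Illustrative sketch (hypothetical data): comparing mean fMRI signal
# change between exposed and control groups within one midbrain ROI.
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical percent-signal-change values, one per subject:
dorsolateral_exposed = [1.8, 2.1, 1.9, 2.3]
dorsolateral_control = [1.1, 1.3, 1.0, 1.2]

# A large positive t in the dorsolateral ROI (and the reverse pattern in a
# ventromedial ROI) would mirror the spatial shift described above.
print(welch_t(dorsolateral_exposed, dorsolateral_control))
```

In practice such a statistic would be computed per voxel or per ROI and corrected for multiple comparisons; the sketch only shows the core contrast.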

  12. The role of diffusive architectural surfaces on auditory spatial discrimination in performance venues.

    Science.gov (United States)

    Robinson, Philip W; Pätynen, Jukka; Lokki, Tapio; Jang, Hyung Suk; Jeon, Jin Yong; Xiang, Ning

    2013-06-01

    In musical or theatrical performance, some venues allow listeners to individually localize and segregate individual performers, while others produce a well blended ensemble sound. The room acoustic conditions that make this possible, and the psycho-acoustic effects at work are not fully understood. This research utilizes auralizations from measured and simulated performance venues to investigate spatial discrimination of multiple acoustic sources in rooms. Signals were generated from measurements taken in a small theater, and listeners in the audience area were asked to distinguish pairs of speech sources on stage with various spatial separations. This experiment was repeated with the proscenium splay walls treated to be flat, diffusive, or absorptive. Similar experiments were conducted in a simulated hall, utilizing 11 early reflections with various characteristics, and measured late reverberation. The experiments reveal that discriminating the lateral arrangement of two sources is possible at narrower separation angles when reflections come from flat or absorptive rather than diffusive surfaces. PMID:23742348

  13. Impact of regularization of near field coding filters for 2D and 3D higher-order Ambisonics on auditory distance cues

    DEFF Research Database (Denmark)

    Favrot, Sylvain Emmanuel; Buchholz, Jörg

    2010-01-01

    about 500 Hz. However, since low-frequency ILD cues seem to be important for distance perception, a novel regularization function is proposed as AWW that can reproduce natural ILDs down to about 250 Hz, even in realistic playback environments. Using this regularization function, a listening test showed...... an improved distance perception performance for lateral sources compared to frontal ones. This improvement was greater for 3D reproduction than for 2D but slightly lower than for real sources. The distance of virtual nearby sources reproduced by NFC-HOA with the regularization function as AWW can be perceived......

  14. Two- to Eight-Month-Old Infants' Perception of Dynamic Auditory-Visual Spatial Colocation

    OpenAIRE

    Bremner, J. Gavin; Slater, Alan M.; Scott P Johnson; Mason, Ursula; Spring, Joanne; Bremner, Maggie E.

    2011-01-01

    From birth, infants detect associations between the locations of static visual objects and sounds they emit, but there is limited evidence regarding their sensitivity to the dynamic equivalent when a sound-emitting object moves. In 4 experiments involving thirty-six 2-month-olds, forty-eight 5-month-olds, and forty-eight 8-month-olds, we investigated infants' ability to process this form of spatial colocation. Whereas there was no evidence of spontaneous sensitivity, all age groups detected a...

  15. Divided multimodal attention: sensory trace and context coding strategies in spatially congruent auditory and visual presentation.

    Science.gov (United States)

    Kristjánsson, Tómas; Thorvaldsson, Tómas Páll; Kristjánsson, Arni

    2014-01-01

    Previous research involving both unimodal and multimodal studies suggests that single-response change detection is a capacity-free process while a discriminatory up or down identification is capacity-limited. The trace/context model assumes that this reflects different memory strategies rather than inherent differences between identification and detection. To perform such tasks, one of two strategies is used, a sensory trace or a context coding strategy, and if one is blocked, people will automatically use the other. A drawback to most preceding studies is that stimuli are presented at separate locations, creating the possibility of a spatial confound, which invites alternative interpretations of the results. We describe a series of experiments, investigating divided multimodal attention, without the spatial confound. The results challenge the trace/context model. Our critical experiment involved a gap before a change in volume and brightness, which according to the trace/context model blocks the sensory trace strategy, simultaneously with a roaming pedestal, which should block the context coding strategy. The results clearly show that people can use strategies other than sensory trace and context coding in the tasks and conditions of these experiments, necessitating changes to the trace/context model.

  16. Signaled two-way avoidance learning using electrical stimulation of the inferior colliculus as negative reinforcement: effects of visual and auditory cues as warning stimuli

    Directory of Open Access Journals (Sweden)

    A.C. Troncoso

    1998-03-01

    Full Text Available The inferior colliculus is a primary relay for the processing of auditory information in the brainstem. The inferior colliculus is also part of the so-called brain aversion system, as animals learn to switch off the electrical stimulation of this structure. The purpose of the present study was to determine whether associative learning occurs between aversion induced by electrical stimulation of the inferior colliculus and visual and auditory warning stimuli. Rats implanted with electrodes in the central nucleus of the inferior colliculus were placed inside an open field and thresholds for the escape response to electrical stimulation of the inferior colliculus were determined. The rats were then placed inside a shuttle-box and submitted to a two-way avoidance paradigm. Electrical stimulation of the inferior colliculus at the escape threshold (98.12 ± 6.15 µA, peak-to-peak) was used as negative reinforcement and light or tone as the warning stimulus. Each session consisted of 50 trials and was divided into two segments of 25 trials in order to determine the learning rate of the animals during the sessions. The rats learned to avoid the inferior colliculus stimulation when light was used as the warning stimulus (13.25 ± 0.60 s and 8.63 ± 0.93 s for latencies and 12.5 ± 2.04 and 19.62 ± 1.65 for frequencies in the first and second halves of the sessions, respectively; P < 0.05 in both cases). Taken together, the present results suggest that rats learn to avoid the inferior colliculus stimulation when light is used as the warning stimulus. However, this learning process does not occur when the neutral stimulus used is an acoustic one. Electrical stimulation of the inferior colliculus may disturb the signal transmission of the stimulus to be conditioned from the inferior colliculus to higher brain structures such as the amygdala.

  17. Auditory Display

    DEFF Research Database (Denmark)

    volume. The conference's topics include auditory exploration of data via sonification and audification; real time monitoring of multivariate data; sound in immersive interfaces and teleoperation; perceptual issues in auditory display; sound in generalized computer interfaces; technologies supporting...... auditory display creation; data handling for auditory display systems; applications of auditory display....

  18. Setting Goals to Switch between Tasks: Effect of Cue Transparency on Children's Cognitive Flexibility

    Science.gov (United States)

    Chevalier, Nicolas; Blaye, Agnes

    2009-01-01

    Three experiments examined the difficulty of translating cues into verbal representations of task goals by varying the degree of cue transparency (auditory transparent cues, visual transparent cues, visual arbitrary cues) in the Advanced Dimensional Change Card Sort, which requires switching between color- and shape-sorting rules on the basis of…

  19. The Wellcome Prize Lecture. A map of auditory space in the mammalian brain: neural computation and development.

    Science.gov (United States)

    King, A J

    1993-09-01

    The experiments described in this review have demonstrated that the SC contains a two-dimensional map of auditory space, which is synthesized within the brain using a combination of monaural and binaural localization cues. There is also an adaptive fusion of auditory and visual space in this midbrain nucleus, providing for a common access to the motor pathways that control orientation behaviour. This necessitates a highly plastic relationship between the visual and auditory systems, both during postnatal development and in adult life. Because of the independent mobility of different sense organs, gating mechanisms are incorporated into the auditory representation to provide up-to-date information about the spatial orientation of the eyes and ears. The SC therefore provides a valuable model system for studying a number of important issues in brain function, including the neural coding of sound location, the co-ordination of spatial information between different sensory systems, and the integration of sensory signals with motor outputs. PMID:8240794

  20. Eye Movements during Auditory Attention Predict Individual Differences in Dorsal Attention Network Activity

    Science.gov (United States)

    Braga, Rodrigo M.; Fu, Richard Z.; Seemungal, Barry M.; Wise, Richard J. S.; Leech, Robert

    2016-01-01

    The neural mechanisms supporting auditory attention are not fully understood. A dorsal frontoparietal network of brain regions is thought to mediate the spatial orienting of attention across all sensory modalities. Key parts of this network, the frontal eye fields (FEF) and the superior parietal lobes (SPL), contain retinotopic maps and elicit saccades when stimulated. This suggests that their recruitment during auditory attention might reflect crossmodal oculomotor processes; however, this has not been confirmed experimentally. Here we investigate whether task-evoked eye movements during an auditory task can predict the magnitude of activity within the dorsal frontoparietal network. A spatial and non-spatial listening task was used with on-line eye-tracking and functional magnetic resonance imaging (fMRI). No visual stimuli or cues were used. The auditory task elicited systematic eye movements, with saccade rate and gaze position predicting attentional engagement and the cued sound location, respectively. Activity associated with these separate aspects of evoked eye movements dissociated between the SPL and FEF. However, these observed eye movements could not account for all the activation in the frontoparietal network. Our results suggest that the recruitment of the SPL and FEF during attentive listening reflects, at least partly, overt crossmodal oculomotor processes during non-visual attention. Further work is needed to establish whether the network’s remaining contribution to auditory attention is through covert crossmodal processes, or is directly involved in the manipulation of auditory information. PMID:27242465

  1. Characterizing spatial tuning functions of neurons in the auditory cortex of young and aged monkeys: A new perspective on old data.

    Directory of Open Access Journals (Sweden)

    James eEngle

    2013-01-01

    Age-related hearing deficits are a leading cause of disability among the aged. While some forms of hearing deficits are peripheral in origin, others are centrally mediated. One such deficit is the ability to localize sounds, a critical component for segregating different acoustic objects and events, which is dependent on the auditory cortex. Recent evidence indicates that in aged animals the normal sharpening of spatial tuning from neurons in primary auditory cortex to the caudal lateral field does not occur as it does in younger animals. As a decrease in inhibition with aging is common in the ascending auditory system, it is possible that this lack of spatial tuning sharpening is due to a decrease in inhibition at different periods within the response. It is also possible that spatial tuning was decreased as a consequence of reduced inhibition at non-best locations. In this report we found that aged animals did have greater activity throughout the response period, but primarily during the onset of the response. This was most prominent at non-best directions, consistent with the hypothesis that inhibition is a primary mechanism to sharpen spatial tuning curves. We also noted that in aged animals the latency of the response was much shorter than in younger animals, consistent with a decrease in pre-onset inhibition. These results can be interpreted in the context of a failure of the timing and efficiency of feed-forward thalamo-cortical and cortico-cortical circuits in aged animals. Such a mechanism, if generalized across cortical areas, could play a major role in age-related cognitive decline.

  2. Auditory Processing Disorders

    Science.gov (United States)

    Auditory processing disorders (APDs) are referred to by many names: central auditory processing disorders, auditory perceptual disorders, and central auditory disorders. APDs ...

  3. A retroactive spatial cue improved VSTM capacity in mild cognitive impairment and medial temporal lobe amnesia but not in healthy older adults.

    Science.gov (United States)

    Newsome, Rachel N; Duarte, Audrey; Pun, Carson; Smith, Victoria M; Ferber, Susanne; Barense, Morgan D

    2015-10-01

    Visual short-term memory (VSTM) is a vital cognitive ability, connecting visual input with conscious awareness. VSTM performance declines with mild cognitive impairment (MCI) and medial temporal lobe (MTL) amnesia. Many studies have shown that providing a spatial retrospective cue ("retrocue") improves VSTM capacity estimates for healthy young adults. However, one study has demonstrated that older adults are unable to use a retrocue to inhibit irrelevant items from memory. It is unknown whether patients with MCI and MTL amnesia will be able to use a retrocue to benefit their memory. We administered a retrocue and a baseline (simultaneous cue, "simucue") task to young adults, older adults, MCI patients, and MTL cases. Consistent with previous findings, young adults showed a retrocue benefit, whereas healthy older adults did not. In contrast, both MCI patients and MTL cases showed a retrocue benefit--the use of a retrocue brought patient performance up to the level of age-matched controls. We speculate that the patients were able to use the spatial information from the retrocue to reduce interference and facilitate binding items to their locations. PMID:26300388

  4. Listeners' expectation of room acoustical parameters based on visual cues

    Science.gov (United States)

    Valente, Daniel L.

    Despite many studies investigating auditory spatial impressions in rooms, few have addressed the impact of simultaneous visual cues on localization and the perception of spaciousness. The current research presents an immersive audio-visual study, in which participants are instructed to make spatial congruency and quantity judgments in dynamic cross-modal environments. The results of these psychophysical tests suggest the importance of consilient audio-visual presentation to the legibility of an auditory scene. Several studies have looked into audio-visual interaction in room perception in recent years, but these studies rely on static images, speech signals, or photographs alone to represent the visual scene. Building on these studies, the aim is to propose a testing method that uses monochromatic compositing (blue-screen technique) to position a studio recording of a musical performance in a number of virtual acoustical environments and ask subjects to assess these environments. In the first experiment of the study, video footage was taken from five rooms varying in physical size from a small studio to a small performance hall. Participants were asked to perceptually align two distinct acoustical parameters---early-to-late reverberant energy ratio and reverberation time---of two solo musical performances in five contrasting visual environments according to their expectations of how the room should sound given its visual appearance. In the second experiment in the study, video footage shot from four different listening positions within a general-purpose space was coupled with sounds derived from measured binaural impulse responses (IRs). The relationship between the presented image, sound, and virtual receiver position was examined. It was found that many visual cues caused different perceived events of the acoustic environment. This included the visual attributes of the space in which the performance was located as well as the visual attributes of the performer

  5. Intestinal GPS: bile and bicarbonate control cyclic di-GMP to provide Vibrio cholerae spatial cues within the small intestine.

    Science.gov (United States)

    Koestler, Benjamin J; Waters, Christopher M

    2014-01-01

    The second messenger cyclic di-GMP (c-di-GMP) regulates numerous phenotypes in response to environmental stimuli to enable bacteria to transition between different lifestyles. Here we discuss our recent findings that the human pathogen Vibrio cholerae recognizes 2 host-specific signals, bile and bicarbonate, to regulate intracellular c-di-GMP. We have demonstrated that bile acids increase intracellular c-di-GMP to promote biofilm formation. We have also shown that this bile-mediated increase of intracellular c-di-GMP is negated by bicarbonate, and that this interaction is dependent on pH, suggesting that V. cholerae uses these 2 environmental cues to sense and adapt to its relative location in the small intestine. Increased intracellular c-di-GMP by bile is attributed to increased c-di-GMP synthesis by 3 diguanylate cyclases (DGCs) and decreased expression of one phosphodiesterase (PDE) in the presence of bile. The molecular mechanisms by which bile controls the activity of the 3 DGCs and the regulators of bile-mediated transcriptional repression of the PDE are not yet known. Moreover, the impact of varying concentrations of bile and bicarbonate at different locations within the small intestine and the response of V. cholerae to these cues remains unclear. The native microbiome and pharmaceuticals, such as omeprazole, can impact bile and pH within the small intestine, suggesting these are potential unappreciated factors that may alter V. cholerae pathogenesis. PMID:25621620

  6. Oscillations Go the Distance: Low-Frequency Human Hippocampal Oscillations Code Spatial Distance in the Absence of Sensory Cues during Teleportation.

    Science.gov (United States)

    Vass, Lindsay K; Copara, Milagros S; Seyal, Masud; Shahlaie, Kiarash; Farias, Sarah Tomaszewski; Shen, Peter Y; Ekstrom, Arne D

    2016-03-16

    Low-frequency (delta/theta band) hippocampal neural oscillations play prominent roles in computational models of spatial navigation, but their exact function remains unknown. Some theories propose they are primarily generated in response to sensorimotor processing, while others suggest a role in memory-related processing. We directly recorded hippocampal EEG activity in patients undergoing seizure monitoring while they explored a virtual environment containing teleporters. Critically, this manipulation allowed patients to experience movement through space in the absence of visual and self-motion cues. The prevalence and duration of low-frequency hippocampal oscillations were unchanged by this manipulation, indicating that sensorimotor processing was not required to elicit them during navigation. Furthermore, the frequency-wise pattern of oscillation prevalence during teleportation contained spatial information capable of classifying the distance teleported. These results demonstrate that movement-related sensory information is not required to drive spatially informative low-frequency hippocampal oscillations during navigation and suggest a specific function in memory-related spatial updating. PMID:26924436

  8. An ERP Study of Children's Visual Spatial Attention to Different Spatial Scaling Cues

    Institute of Scientific and Technical Information of China (English)

    孙延超; 李秀艳; 高卫星; 许桂春; 杨海英; 刘晓芹

    2011-01-01

    Objective: To study the event-related potential (ERP) characteristics of children's visual spatial attention under cues of different spatial extent. Methods: A "cue-target" visual experimental paradigm was adopted in which circles cued search ranges of different sizes; fourteen healthy children were tested, and early ERP components at the different levels of spatial attention were analyzed. Results: As the cued range decreased, reaction times became faster, posterior P1 and N1 amplitudes increased, and anterior P2 amplitudes decreased. Conclusion: In the early stage of spatial attention processing, children rely on the validity of the cue level to mobilize brain resources, whereas the later stage requires additional brain resources.

  9. The Difference in the Profile of Working Memory, Auditory Working Memory, and Spatial Working Memory between Drug, Stimulant, and Methadone Abusers and Normal People

    Directory of Open Access Journals (Sweden)

    Ahmad Alipour

    2015-06-01

    Objective: The present study was an attempt to examine the difference in the profile of working memory, auditory working memory, and spatial working memory between drug, stimulant, and methadone abusers and normal people. Method: This study was a causal-comparative one with between-group comparison methodology. All the individuals addicted to opiates, stimulants, and methadone who had referred to Khomeini treatment centers of the city from September 2013 to February 2014 constituted the statistical population of the study. The number of 154 abusers (54 drug abusers, 50 stimulant abusers, and 50 methadone abusers) and the number of 50 normal participants were chosen as the sample of the study by purposive sampling method. The participants responded to the Wechsler Memory Scale—third edition (WMS-III). Results: There was a significant difference between the normal group and drug, stimulant, and methadone abusers in terms of working memory, auditory working memory, and spatial working memory. Conclusion: Drug and stimulant use leads to sustained damage in cognitive processes such as working memory. However, research indicates that these cognitive processes will improve with the passage of time.

  10. The impact of anterior thalamic lesions on active and passive spatial learning in stimulus controlled environments: geometric cues and pattern arrangement.

    Science.gov (United States)

    Dumont, Julie R; Wright, Nicholas F; Pearce, John M; Aggleton, John P

    2014-04-01

    The anterior thalamic nuclei are vital for many spatial tasks. To determine more precisely their role, the present study modified the conventional Morris watermaze task. In each of 3 experiments, rats were repeatedly placed on a submerged platform in 1 corner (the 'correct' corner) of either a rectangular pool (Experiment 1) or a square pool with walls of different appearances (Experiments 2 and 3). The rats were then released into the pool for a first test trial in the absence of the platform. In Experiment 1, normal rats distinguished the 2 sets of corners in the rectangular pool by their geometric properties, preferring the correct corner and its diagonally opposite partner. Anterior thalamic lesions severely impaired this discrimination. In Experiments 2 and 3, normal rats typically swam directly to the correct corner of the square pool on the first test trial. Rats with anterior thalamic lesions, however, often failed to initially select the correct corner, taking more time to reach that location. Nevertheless, the lesioned rats still showed a subsequent preference for the correct corner. The same lesioned rats also showed no deficits in Experiments 2 and 3 when subsequently trained to swim to the correct corner over repeated trials. The findings show how the anterior thalamic nuclei contribute to multiple aspects of spatial processing. These thalamic nuclei may be required to distinguish relative dimensions (Experiment 1) as well as translate the appearance of spatial cues when viewed for the first time from different perspectives (Experiments 2, 3).

  11. Cue validity probability influences neural processing of targets.

    Science.gov (United States)

    Arjona, Antonio; Escudero, Miguel; Gómez, Carlos M

    2016-09-01

    The neural bases of the so-called Spatial Cueing Effect in a visuo-auditory version of the Central Cue Posner's Paradigm (CCPP) are analyzed by means of behavioral patterns (Reaction Times and Errors) and Event-Related Potentials (ERPs), namely the Contingent Negative Variation (CNV), N1, P2a, P2p, P3a, P3b and Negative Slow Wave (NSW). The present version consisted of three types of trial blocks with different validity/invalidity proportions: 50% valid - 50% invalid trials, 68% valid - 32% invalid trials and 86% valid - 14% invalid trials. Thus, ERPs can be analyzed as the proportion of valid trials per block increases. Behavioral (Reaction Times and Incorrect responses) and ERP (lateralized component of CNV, P2a, P3b and NSW) results showed a spatial cueing effect as the proportion of valid trials per block increased. Results suggest a brain activity modulation related to sensory-motor attention and working memory updating, in order to adapt to external unpredictable contingencies.

  12. Binding ‘when’ and ‘where’ impairs temporal, but not spatial recall in auditory and visual working memory

    Directory of Open Access Journals (Sweden)

    Franco eDelogu

    2012-03-01

    Information about where and when events happened seems naturally linked, but only a few studies have investigated whether and how the two are associated in working memory. We tested whether the location of items and their temporal order are jointly or independently encoded. We also verified whether spatio-temporal binding is influenced by the sensory modality of the items. Participants were requested to memorize the location and/or the serial order of five items (environmental sounds or pictures) sequentially presented from five different locations. Next, they were asked to recall either the item location or their order of presentation within the sequence. Attention during encoding was manipulated by contrasting blocks of trials in which participants were requested to encode only one feature with blocks of trials where they had to encode both features. Results show an interesting interaction between task and attention. Accuracy in serial order recall was affected by the simultaneous encoding of item location, whereas recall of item location was unaffected by the concurrent encoding of serial order. This asymmetric influence of attention on the two tasks was similar for the auditory and visual modalities. Together, these data indicate that item location is processed in a relatively automatic fashion, whereas maintaining serial order is more demanding in terms of attention. The remarkably analogous results for auditory and visual memory performance suggest that the binding of serial order and location in working memory is not modality-dependent, and may involve common intersensory mechanisms.

  13. Auditory and Visual Sensations

    CERN Document Server

    Ando, Yoichi

    2010-01-01

    Professor Yoichi Ando, acoustic architectural designer of the Kirishima International Concert Hall in Japan, presents a comprehensive rational-scientific approach to designing performance spaces. His theory is based on systematic psychoacoustical observations of spatial hearing and listener preferences, whose neuronal correlates are observed in the neurophysiology of the human brain. A correlation-based model of neuronal signal processing in the central auditory system is proposed in which temporal sensations (pitch, timbre, loudness, duration) are represented by an internal autocorrelation representation, and spatial sensations (sound location, size, diffuseness related to envelopment) are represented by an internal interaural crosscorrelation function. Together these two internal central auditory representations account for the basic auditory qualities that are relevant for listening to music and speech in indoor performance spaces. Observed psychological and neurophysiological commonalities between auditor...
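
    Ando's spatial sensations are summarized by the interaural cross-correlation function, commonly reduced to a single figure, the IACC: the peak of the normalized cross-correlation between the two ear signals within about ±1 ms of lag. A minimal numerical sketch (the function name and test signals are illustrative, not taken from the book):

```python
import numpy as np

def iacc(left, right, fs, max_lag_ms=1.0):
    """Interaural cross-correlation coefficient: peak of the normalized
    cross-correlation between the ear signals within +/-1 ms of lag.
    Values near 1 indicate a compact, well-localized auditory image;
    low values indicate a diffuse, enveloping sound field."""
    max_lag = int(fs * max_lag_ms / 1000)
    norm = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2))
    corr = np.correlate(left, right, mode="full") / norm
    center = len(right) - 1  # index of zero lag in the full correlation
    return float(np.max(corr[center - max_lag : center + max_lag + 1]))

fs = 48000
rng = np.random.default_rng(1)
same = rng.standard_normal(fs // 10)
other = rng.standard_normal(fs // 10)
print(iacc(same, same, fs))   # ≈ 1.0: fully coherent, compact image
print(iacc(same, other, fs))  # well below 1: decorrelated, diffuse field
```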

  14. The effect of rhythmic somatosensory cueing on gait in patients with Parkinson's disease.

    NARCIS (Netherlands)

    Wegen, E. van; Goede, C. de; Lim, I.; Rietberg, M.B.; Nieuwboer, A.; Willems, A.; Jones, D.; Rochester, L.; Hetherington, V.; Berendse, H.W.; Zijlmans, J.C.M.; Wolters, E.; Kwakkel, G.

    2006-01-01

    BACKGROUND AND AIMS: Gait and gait-related activities in patients with Parkinson's disease (PD) can be improved with rhythmic auditory cueing (e.g. a metronome). In the context of a large European study, a portable prototype cueing device was developed to provide an alternative for rhythmic auditory cueing.

  15. Superior temporal activation in response to dynamic audio-visual emotional cues

    OpenAIRE

    Robins, Diana L.; Hunyadi, Elinora; Schultz, Robert T.

    2008-01-01

    Perception of emotion is critical for successful social interaction, yet the neural mechanisms underlying the perception of dynamic, audiovisual emotional cues are poorly understood. Evidence from language and sensory paradigms suggests that the superior temporal sulcus and gyrus (STS/STG) play a key role in the integration of auditory and visual cues. Emotion perception research has focused on static facial cues; however, dynamic audiovisual (AV) cues mimic real-world social cues more accurately…

  16. The role of social cues in the deployment of spatial attention: Head-body relationships automatically activate directional spatial codes in a Simon task

    Directory of Open Access Journals (Sweden)

    Iwona ePomianowska

    2012-02-01

    The role of body orientation in the orienting and allocation of social attention was examined using an adapted Simon paradigm. Participants categorized the facial expression of forward-facing, computer-generated human figures by pressing one of two response keys, each located left or right of the observers’ body midline, while the orientation of the stimulus figure’s body (trunk, arms, and legs), which was the task-irrelevant feature of interest, was manipulated (oriented towards the left or right visual hemifield) with respect to the spatial location of the required response. We found that when the orientation of the body was compatible with the required response location, responses were slower relative to when body orientation was incompatible with the response location. This reverse compatibility effect suggests that body orientation is automatically processed into a directional spatial code, but that this code is based on an integration of head and body orientation within an allocentric-based frame of reference. Moreover, we argue that this code may be derived from the motion information implied in the image of a figure when head and body orientation are incongruent. Our results have implications for understanding the nature of the information that affects the allocation of attention for social orienting.

  17. Frequency band-importance functions for auditory and auditory-visual speech recognition

    Science.gov (United States)

    Grant, Ken W.

    2005-04-01

    In many everyday listening environments, speech communication involves the integration of both acoustic and visual speech cues. This is especially true in noisy and reverberant environments where the speech signal is highly degraded, or when the listener has a hearing impairment. Understanding the mechanisms involved in auditory-visual integration is a primary interest of this work. Of particular interest is whether listeners are able to allocate their attention to various frequency regions of the speech signal differently under auditory-visual conditions and auditory-alone conditions. For auditory speech recognition, the most important frequency regions tend to be around 1500-3000 Hz, corresponding roughly to important acoustic cues for place of articulation. The purpose of this study is to determine the most important frequency region under auditory-visual speech conditions. Frequency band-importance functions for auditory and auditory-visual conditions were obtained by having subjects identify speech tokens under conditions where the speech-to-noise ratio of different parts of the speech spectrum is independently and randomly varied on every trial. Point biserial correlations were computed for each separate spectral region and the normalized correlations are interpreted as weights indicating the importance of each region. Relations among frequency-importance functions for auditory and auditory-visual conditions will be discussed.
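
    The analysis described, often called the correlational method, can be sketched numerically: each band's speech-to-noise ratio is randomized on every trial, and the point-biserial correlation between a band's SNR and response correctness serves as that band's importance weight. The simulated listener below, with band 2 standing in for the 1500-3000 Hz region, is a made-up example, not Grant's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_bands = 4000, 4
# Each trial: every band independently presented at high (1) or low (0) SNR.
snr = rng.integers(0, 2, size=(n_trials, n_bands)).astype(float)

# Hypothetical listener whose accuracy depends mostly on band 2
# (standing in for the place-of-articulation region).
true_importance = np.array([0.10, 0.20, 0.60, 0.10])
p_correct = 0.25 + 0.60 * (snr @ true_importance)
correct = (rng.random(n_trials) < p_correct).astype(float)

# Point-biserial correlation is Pearson r between a binary variable
# (band SNR) and the response accuracy; normalize r into weights.
r = np.array([np.corrcoef(snr[:, b], correct)[0, 1] for b in range(n_bands)])
weights = r / r.sum()  # normalized band-importance function
print(weights.round(2))  # band 2 carries the largest weight
```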

  18. Exogenous spatial attention decreases audiovisual integration.

    Science.gov (United States)

    Van der Stoep, N; Van der Stigchel, S; Nijboer, T C W

    2015-02-01

    Multisensory integration (MSI) and spatial attention are both mechanisms through which the processing of sensory information can be facilitated. Studies on the interaction between spatial attention and MSI have mainly focused on the interaction between endogenous spatial attention and MSI. Most of these studies have shown that endogenously attending a multisensory target enhances MSI. It is currently unclear, however, whether and how exogenous spatial attention and MSI interact. In the current study, we investigated the interaction between these two important bottom-up processes in two experiments. In Experiment 1 the target location was task-relevant, and in Experiment 2 the target location was task-irrelevant. Valid or invalid exogenous auditory cues were presented before the onset of unimodal auditory, unimodal visual, and audiovisual targets. We observed reliable cueing effects and multisensory response enhancement in both experiments. To examine whether audiovisual integration was influenced by exogenous spatial attention, the amount of race model violation was compared between exogenously attended and unattended targets. In both Experiment 1 and Experiment 2, a decrease in MSI was observed when audiovisual targets were exogenously attended, compared to when they were not. The interaction between exogenous attention and MSI was less pronounced in Experiment 2. Therefore, our results indicate that exogenous attention diminishes MSI when spatial orienting is relevant. The results are discussed in terms of models of multisensory integration and attention. PMID:25341648
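
    The "race model violation" measure used here is commonly computed from Miller's race-model inequality: the cumulative RT distribution for audiovisual targets is compared against the summed unimodal distributions, and any excess indicates integration beyond mere statistical facilitation. A sketch with simulated reaction times (the distributions below are assumptions for illustration, not the study's data):

```python
import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, t_grid):
    """Miller's race-model inequality: for every t,
    P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
    Returns the signed difference on t_grid; positive values mean the
    audiovisual RTs are faster than any race of independent unimodal
    processes allows, i.e. evidence of multisensory integration."""
    def ecdf(rts, t):
        return np.mean(rts[:, None] <= t[None, :], axis=0)
    bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
    return ecdf(rt_av, t_grid) - bound

rng = np.random.default_rng(2)
rt_a = rng.normal(420, 40, 500)   # unimodal auditory RTs (ms), assumed
rt_v = rng.normal(440, 40, 500)   # unimodal visual RTs (ms), assumed
rt_av = rng.normal(330, 30, 500)  # audiovisual RTs, faster than either
t_grid = np.linspace(250, 550, 61)
violation = race_model_violation(rt_av, rt_a, rt_v, t_grid)
print(violation.max() > 0)  # True: the race model is violated
```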

  19. Improving target detection in visual search through the augmenting multi-sensory cues

    NARCIS (Netherlands)

    Hancock, P.A.; Mercado, J.E.; Merlo, J.; Erp, J.B.F. van

    2013-01-01

    The present experiment tested 60 individuals on a multiple screen, visual target detection task. Using a within-participant design, individuals received no-cue augmentation, an augmenting tactile cue alone, an augmenting auditory cue alone or both of the latter augmentations in combination. Results

  20. Mixed Messages: Illusory Durations Induced by Cue Combination

    Directory of Open Access Journals (Sweden)

    Craig Aaen-Stockdale

    2012-05-01

    Pairing a visual stimulus with a concurrent auditory stimulus of subtly longer or shorter duration expands or contracts the perceived duration of that visual stimulus, even when the observer is asked to ignore the irrelevant auditory component. Here we map out this effect and find a roughly linear relationship between the perceived duration of the visual component and the duration of the irrelevant auditory component. Beyond this ‘window of integration’ the obligatory combination of cues breaks down rather suddenly, at durations 0.2 log units longer or shorter than baseline. Conversely, a visual duration has virtually no effect on the perceived duration of a concurrently presented auditory duration. A model is presented based on obligatory combination of visual and auditory cues within a window defined by the respective JNDs of vision and audition.
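
    The mapping described, a linear pull of the perceived visual duration toward the auditory duration inside a roughly ±0.2 log-unit window and veridical perception outside it, can be captured by a toy piecewise model; the auditory weight below is an assumed parameter, not a value fitted by the authors:

```python
import math

def perceived_visual_duration(t_vis, t_aud, w_aud=0.7, window=0.2):
    """Toy model of the reported mapping (w_aud and the linear form are
    assumptions for illustration). Inside a +/-0.2 log-unit window the
    irrelevant auditory duration pulls the visual estimate linearly;
    beyond it, obligatory combination breaks down and the visual
    duration is perceived veridically."""
    if abs(math.log10(t_aud / t_vis)) <= window:
        return (1 - w_aud) * t_vis + w_aud * t_aud
    return t_vis

print(perceived_visual_duration(0.50, 0.60))  # ≈ 0.57: pulled toward audio
print(perceived_visual_duration(0.50, 1.00))  # 0.5: outside the window
```

    The asymmetry reported in the abstract (audition dominates vision for duration) is built in: the model only adjusts the visual estimate.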

  1. Auditory Neuropathy

    Science.gov (United States)

    ... field differ in their opinions about the potential benefits of hearing aids, cochlear implants, and other technologies for people with auditory neuropathy. Some professionals report that hearing aids and personal listening devices such as frequency modulation (FM) systems are ...

  2. Two- to eight-month-old infants’ perception of dynamic auditory-visual spatial co-location

    OpenAIRE

    Bremner, J. Gavin; Slater, Alan M.; Scott P Johnson; Mason, Uschi C.; Spring, Jo; Bremner, Maggie E.

    2011-01-01

    From birth, infants detect associations between the locations of static visual objects and sounds they emit, but there is limited evidence regarding their sensitivity to the dynamic equivalent when a sound-emitting object moves. In four experiments involving 36 2-month-olds, 48 5-month-olds and 48 8-month-olds, we investigated infants’ ability to process this form of spatial co-location. Whereas there was no evidence of spontaneous sensitivity, all age groups detected a dynamic co-location du...

  3. Self-affirmation in auditory persuasion

    NARCIS (Netherlands)

    Elbert, Sarah; Dijkstra, Arie

    2011-01-01

    Persuasive health information can be presented through an auditory channel. Curiously enough, the effect of voice cues in health persuasion has hardly been studied. Research concerning visual persuasive messages showed that self-affirmation results in a more open-minded reaction to threatening information.

  5. Dissociating temporal attention from spatial attention and motor response preparation: A high-density EEG study.

    Science.gov (United States)

    Faugeras, Frédéric; Naccache, Lionel

    2016-01-01

    Engagement of various forms of attention and response preparation determines behavioral performance during stimulus-response tasks. Many studies have explored the respective properties and neural signatures of each of these processes. However, very few experiments were conceived to explore their interaction. In the present work we used an auditory target detection task during which both temporal attention on the one hand, and spatial attention and motor response preparation on the other, could be explicitly cued. Both cueing effects speeded response times, and their effects were strictly additive. Target ERP analysis revealed modulations of N1 and P3 responses by these two forms of cueing. Cue-target interval analysis revealed two main effects paralleling behavior. First, a typical contingent negative variation (CNV), induced by the cue and resolved immediately after target onset, was found to be larger for temporal attention cueing than for spatial and motor response cueing. Second, a posterior and late cue-P3 complex showed the reverse profile. Analyses of lateralized readiness potentials (LRP) revealed both patterns of motor response inhibition and activation. Taken together, these results help to clarify and disentangle the respective effects on brain activity and behavior of temporal attention on the one hand, and of the combination of spatial attention and motor response preparation on the other. PMID:26433120

  6. Surface presentation of biochemical cues for stem cell expansion - Spatial distribution of growth factors and self-assembly of extracellular matrix

    Science.gov (United States)

    Liu, Xingyu

    Despite its great potential applications to stem cell technology and tissue engineering, matrix presentation of biochemical cues such as growth factors and extracellular matrix (ECM) components remains undefined. This is largely due to the difficulty of preserving the bioactivities of signaling molecules and of controlling the spatial distribution, cellular accessibility, molecular orientation and intermolecular assembly of the biochemical cues. This dissertation comprises two parts that focus on understanding the surface presentation of a growth factor and of ECM components, respectively, and addresses two fundamental questions in stem cell biology using two biomaterials platforms: How does the nanoscale distribution of a growth factor impact signaling activation and the cellular behaviors of adult neural stem cells? And how does ECM self-assembly impact human embryonic stem cell survival and proliferation? The first question was addressed by designing a novel quantitative platform that allows control of FGF-2 molecular presentation, locally as either monomers or clusters, when tethered to a polymeric substrate. This substrate-tethered FGF-2 enables a switch-like signaling activation in response to dose titration of FGF-2, in contrast to the continuous MAPK activation pattern elicited by soluble FGF-2. Consequently, cell proliferation and spreading were also consistent with this FGF-2 dose-response pattern. We demonstrated that the combination of FGF-2 concentration and its cluster size, rather than concentration alone, serves as the determinant of its biological effect on neural stem cells. The second part of this dissertation was inspired by the challenge that hESCs have extremely low clonal efficiency and that hESC survival is critically dependent on cell-substrate adhesion. We postulated that ECM integrity is a critical factor in preventing hESC anchorage-dependent apoptosis, and that the matrix for feeder-free culture needs to be properly

  7. Follow the sign! Top-down contingent attentional capture of masked arrow cues

    OpenAIRE

    Reuss, Heiko; Pohl, Carsten; Kiesel, Andrea; Kunde, Wilfried

    2011-01-01

    Arrow cues and other overlearned spatial symbols automatically orient attention according to their spatial meaning. This renders them similar to exogenous cues that occur at stimulus location. Exogenous cues trigger shifts of attention even when they are presented subliminally. Here, we investigate to what extent the mechanisms underlying the orienting of attention by exogenous cues and by arrow cues are comparable by analyzing the effects of visible and masked arrow cues on attention. In Exp...

  8. Characterization of auditory synaptic inputs to gerbil perirhinal cortex

    Directory of Open Access Journals (Sweden)

    Vibhakar C Kotak

    2015-08-01

    Full Text Available The representation of acoustic cues involves regions downstream from the auditory cortex (ACx). One such area, the perirhinal cortex (PRh), processes sensory signals containing mnemonic information. Therefore, our goal was to assess whether PRh receives auditory inputs from the auditory thalamus (MG) and ACx in an auditory thalamocortical brain slice preparation, and to characterize these afferent-driven synaptic properties. When the MG or ACx was electrically stimulated, synaptic responses were recorded from PRh neurons. Blockade of GABA-A receptors dramatically increased the amplitude of evoked excitatory potentials. Stimulation of the MG or ACx also evoked calcium transients in most PRh neurons. Separately, when Fluoro-Ruby was injected into the ACx in vivo, anterogradely labeled axons and terminals were observed in the PRh. Collectively, these data show that the PRh integrates auditory information from the MG and ACx and that auditory-driven inhibition dominates the postsynaptic responses in a non-sensory cortical region downstream from the auditory cortex.

  9. [Visual cues as a therapeutic tool in Parkinson's disease. A systematic review].

    Science.gov (United States)

    Muñoz-Hellín, Elena; Cano-de-la-Cuerda, Roberto; Miangolarra-Page, Juan Carlos

    2013-01-01

    Sensory stimuli, or sensory cues, are used as a therapeutic tool for improving gait disorders in Parkinson's disease patients, but most studies focus on auditory stimuli. The aim of this study was to conduct a systematic review of the use of visual cues for gait disorders, dual tasks during gait, freezing and the incidence of falls in patients with Parkinson's disease, and to derive therapeutic implications. We conducted a systematic review in the main databases (Cochrane Database of Systematic Reviews, TripDataBase, PubMed, Ovid MEDLINE, Ovid EMBASE and Physiotherapy Evidence Database) from 2005 to 2012, according to the recommendations of the Consolidated Standards of Reporting Trials, evaluating the quality of the included papers with the Downs & Black Quality Index. 21 articles were included in this systematic review (with a total of 892 participants), of variable methodological quality, achieving an average of 17.27 points on the Downs & Black Quality Index (range: 11-21). Visual cues improve spatio-temporal gait parameters and turn execution, and reduce the occurrence of freezing and falls in Parkinson's disease patients. Visual cues also appear to benefit dual tasks during gait, reducing the interference of the second task. Further studies are needed to determine the preferred type of stimulus for each stage of the disease.

  10. Effects of spatial configurations on the resolution of spatial working memory.

    Science.gov (United States)

    Mutluturk, Aysu; Boduroglu, Aysecan

    2014-11-01

    Recent research demonstrated that people represent spatial information configurally and that preservation of configural cues at retrieval helps memory for spatial locations (Boduroğlu & Shah, Memory & Cognition, 37(8), 1120-1131, 2009; Jiang, Olson, & Chun, Journal of Experimental Psychology: Learning, Memory, and Cognition, 26(3), 683-702, 2000). The present study investigated the effects of spatial configurations on the resolution of individual location representations. In an open-ended task, participants first studied a set of object locations (three or five locations). Then, in a test display where the available configural cues were manipulated, participants were asked to determine the original location of a target object whose color was auditorily cued. The difference between the reported location and the original location was taken as a measure of spatial resolution. In three experiments, we consistently observed that the resolution of spatial representations was facilitated by the preservation of spatial configurations at retrieval. We argue that participants may be using available configural cues in conjunction with a summary representation (e.g., the centroid) of the original display in the computation of target locations. PMID:24939236
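
    The error measure described in this abstract, the distance between a reported and a studied location, can be sketched in a few lines. The coordinates, display layout, and the centroid-anchored recall scheme below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical studied and reported locations (screen pixels).
studied  = np.array([[120.0, 80.0], [300.0, 210.0], [420.0, 95.0]])
reported = np.array([[128.0, 74.0], [291.0, 223.0], [435.0, 90.0]])

# Spatial resolution: Euclidean distance between report and original location.
errors = np.linalg.norm(reported - studied, axis=1)
print("per-target error (px):", np.round(errors, 1))

# One way configural cues could help: encode each location as an offset
# from the configuration's centroid and reconstruct from that summary.
centroid = studied.mean(axis=0)
offsets = studied - centroid          # configural representation
reconstructed = centroid + offsets    # recall via centroid + offsets
assert np.allclose(reconstructed, studied)
```

    With perfectly recalled offsets the reconstruction is exact; in a model of the experiment one would add noise to the offsets and to the centroid separately to see how preserving the configuration at test reduces the reported error.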

  11. Visual Cues, Verbal Cues and Child Development

    Science.gov (United States)

    Valentini, Nadia

    2004-01-01

    In this article, the author discusses two strategies--visual cues (modeling) and verbal cues (short, accurate phrases) which are related to teaching motor skills in maximizing learning in physical education classes. Both visual and verbal cues are strong influences in facilitating and promoting day-to-day learning. Both strategies reinforce…

  12. Non-predictive cueing improves accuracy judgments for voluntary and involuntary spatial and feature/shape attention independent of backward masking.

    OpenAIRE

    Pack, Weston David

    2013-01-01

    Many psychophysics investigations have implemented pre-cues to direct an observer's attention to a specific location or feature. There is controversy over the mechanisms of involuntary attention and whether perceptual or decision processes can enhance target detection and identification as measured by accuracy judgments. Through four main experiments, this dissertation research has indicated that both involuntary and voluntary attention improve target identification and localization accuracy ...

  13. Feeling music: integration of auditory and tactile inputs in musical meter perception.

    Directory of Open Access Journals (Sweden)

    Juan Huang

    Full Text Available Musicians often say that they not only hear, but also "feel" music. To explore the contribution of tactile information in "feeling" musical rhythm, we investigated the degree that auditory and tactile inputs are integrated in humans performing a musical meter recognition task. Subjects discriminated between two types of sequences, 'duple' (march-like rhythms) and 'triple' (waltz-like rhythms) presented in three conditions: 1) Unimodal inputs (auditory or tactile alone), 2) Various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts, and 3) Simultaneously presented bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70%-85%) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70%-90%) when all of the metrically important notes are assigned to one channel and is reduced to 60% when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90%). Performance drops dramatically when subjects were presented with incongruent auditory cues (10%), as opposed to incongruent tactile cues (60%), demonstrating that auditory input dominates meter perception. We believe that these results are the first demonstration of cross-modal sensory grouping between any two senses.

  14. Feeling music: integration of auditory and tactile inputs in musical meter perception.

    Science.gov (United States)

    Huang, Juan; Gamble, Darik; Sarnlertsophon, Kristine; Wang, Xiaoqin; Hsiao, Steven

    2012-01-01

    Musicians often say that they not only hear, but also "feel" music. To explore the contribution of tactile information in "feeling" musical rhythm, we investigated the degree that auditory and tactile inputs are integrated in humans performing a musical meter recognition task. Subjects discriminated between two types of sequences, 'duple' (march-like rhythms) and 'triple' (waltz-like rhythms) presented in three conditions: 1) Unimodal inputs (auditory or tactile alone), 2) Various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts, and 3) Simultaneously presented bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70%-85%) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70%-90%) when all of the metrically important notes are assigned to one channel and is reduced to 60% when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90%). Performance drops dramatically when subjects were presented with incongruent auditory cues (10%), as opposed to incongruent tactile cues (60%), demonstrating that auditory input dominates meter perception. We believe that these results are the first demonstration of cross-modal sensory grouping between any two senses. PMID:23119038

  15. The Spatial Contiguity Effect in Multimedia Learning: The Role of Cueing (邻近效应对多媒体学习中图文整合的影响：线索的作用)

    Institute of Scientific and Technical Information of China (English)

    王福兴; 段朝辉; 周宗奎; 陈珺

    2015-01-01

    Text and illustrations that are spatially integrated can improve learners' performance in multimedia learning. In addition, recent studies have shown that cues such as color highlighting, arrows, and bold typeface can guide learners' attention and improve learning outcomes. Researchers have speculated that placing picture and text close together shortens visual search time and reduces cognitive load, thereby enhancing learning; previous studies likewise showed that adding cues to learning materials guides learners' attention and promotes the organization and integration of new knowledge. But what are the specific processes underlying the contiguity effect? Do changes in picture-text layout and the addition of cues affect the allocation of attention? In this study, we expected contiguity and cueing to affect learners' attention allocation and, in turn, their performance on memory tests: integrated text and pictures with cues should attract more fixations and longer dwell time on task-relevant areas, and yield higher scores on retention and transfer tests. Fifty-one college students were recruited from Central China Normal University, and a computer-generated animation depicting the process of lightning formation was used as the experimental material, with red highlighting of text and pictures manipulated as the cue. Prospective participants first completed a demographic questionnaire that included a prior-knowledge questionnaire, used to ensure that they knew little about lightning formation. They were then randomized into four groups: the integrated text and pictures with cues, the integrated text

  16. Auditory temporal processing skills in musicians with dyslexia.

    Science.gov (United States)

    Bishop-Liebler, Paula; Welch, Graham; Huss, Martina; Thomson, Jennifer M; Goswami, Usha

    2014-08-01

    The core cognitive difficulty in developmental dyslexia involves phonological processing, but adults and children with dyslexia also have sensory impairments. Impairments in basic auditory processing show particular links with phonological impairments, and recent studies with dyslexic children across languages reveal a relationship between auditory temporal processing and sensitivity to rhythmic timing and speech rhythm. As rhythm is explicit in music, musical training might have a beneficial effect on the auditory perception of acoustic cues to rhythm in dyslexia. Here we took advantage of the presence of musicians with and without dyslexia in musical conservatoires, comparing their auditory temporal processing abilities with those of dyslexic non-musicians matched for cognitive ability. Musicians with dyslexia showed equivalent auditory sensitivity to musicians without dyslexia and also showed equivalent rhythm perception. The data support the view that extensive rhythmic experience initiated during childhood (here in the form of music training) can affect basic auditory processing skills which are found to be deficient in individuals with dyslexia.

  17. The role of temporal coherence in auditory stream segregation

    DEFF Research Database (Denmark)

    Christiansen, Simon Krogholt

    The ability to perceptually segregate concurrent sound sources and focus one’s attention on a single source at a time is essential for the ability to use acoustic information. While perceptual experiments have determined a range of acoustic cues that help facilitate auditory stream segregation...... of auditory processing, the role of auditory preprocessing and temporal coherence in auditory stream formation was evaluated. The computational model presented in this study assumes that auditory stream segregation occurs when sounds stimulate non-overlapping neural populations in a temporally incoherent...... on the stream segregation process was analysed. The model analysis showed that auditory frequency selectivity and physiological forward masking play a significant role in stream segregation based on frequency separation and tone rate. Secondly, the model analysis suggested that neural adaptation...

  18. Estimating the intended sound direction of the user: toward an auditory brain-computer interface using out-of-head sound localization.

    Directory of Open Access Journals (Sweden)

    Isao Nambu

    Full Text Available The auditory Brain-Computer Interface (BCI) using electroencephalograms (EEG) is a subject of intensive study. As a cue, auditory BCIs can deal with many of the characteristics of stimuli such as tone, pitch, and voices. Spatial information on auditory stimuli also provides useful information for a BCI. However, in a portable system, virtual auditory stimuli have to be presented spatially through earphones or headphones, instead of loudspeakers. We investigated the possibility of an auditory BCI using the out-of-head sound localization technique, which enables us to present virtual auditory stimuli to users from any direction, through earphones. The feasibility of a BCI using this technique was evaluated in an EEG oddball experiment and offline analysis. A virtual auditory stimulus was presented to the subject from one of six directions. Using a support vector machine, we were able to classify whether the subject attended the direction of a presented stimulus from EEG signals. The mean accuracy across subjects was 70.0% in the single-trial classification. When we used trial-averaged EEG signals as inputs to the classifier, the mean accuracy across seven subjects reached 89.5% (for 10-trial averaging). Further analysis showed that the P300 event-related potential responses from 200 to 500 ms in central and posterior regions of the brain contributed to the classification. In comparison with the results obtained from a loudspeaker experiment, we confirmed that stimulus presentation by out-of-head sound localization achieved similar event-related potential responses and classification performances. These results suggest that out-of-head sound localization enables us to provide a high-performance and loudspeaker-less portable BCI system.
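
    The trial-averaging plus support-vector-machine pipeline described here can be sketched on synthetic data. Everything below, array shapes, the injected P300-like response, and all parameters, is an illustrative assumption, not the study's recordings or exact classifier settings.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic EEG epochs: trials x channels x samples. "Attended" epochs get
# a small P300-like deflection 200-500 ms post-stimulus (samples 50-125 at
# a notional 250 Hz sampling rate).
n_trials, n_channels, n_samples = 120, 8, 150
X = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)     # 1 = attended direction, 0 = unattended
X[y == 1, :, 50:125] += 0.5          # injected event-related response

def average_trials(X, y, k):
    """Average every k same-class trials to raise SNR before classification."""
    Xa, ya = [], []
    for label in (0, 1):
        trials = X[y == label]
        for i in range(0, len(trials) - k + 1, k):
            Xa.append(trials[i:i + k].mean(axis=0))
            ya.append(label)
    return np.array(Xa), np.array(ya)

for k in (1, 10):                    # single-trial vs 10-trial averaging
    Xa, ya = average_trials(X, y, k)
    feats = Xa.reshape(len(Xa), -1)  # flatten channels x time into features
    acc = cross_val_score(SVC(kernel="linear"), feats, ya, cv=3).mean()
    print(f"{k:2d}-trial averaging: accuracy ~ {acc:.2f}")
```

    As in the abstract, averaging before classification trades the number of available examples for a cleaner event-related response per example, which is why accuracy rises with the number of averaged trials.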

  19. Neural Dynamics of Object-Based Multifocal Visual Spatial Attention and Priming: Object Cueing, Useful-Field-of-View, and Crowding

    Science.gov (United States)

    Foley, Nicholas C.; Grossberg, Stephen; Mingolla, Ennio

    2012-01-01

    How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued…

  20. Early Visual Deprivation Severely Compromises the Auditory Sense of Space in Congenitally Blind Children

    Science.gov (United States)

    Vercillo, Tiziana; Burr, David; Gori, Monica

    2016-01-01

    A recent study has shown that congenitally blind adults, who have never had visual experience, are impaired on an auditory spatial bisection task (Gori, Sandini, Martinoli, & Burr, 2014). In this study we investigated how thresholds for auditory spatial bisection and auditory discrimination develop with age in sighted and congenitally blind…

  1. Reduced Sensitivity to Slow-Rate Dynamic Auditory Information in Children with Dyslexia

    Science.gov (United States)

    Poelmans, Hanne; Luts, Heleen; Vandermosten, Maaike; Boets, Bart; Ghesquiere, Pol; Wouters, Jan

    2011-01-01

    The etiology of developmental dyslexia remains widely debated. An appealing theory postulates that the reading and spelling problems in individuals with dyslexia originate from reduced sensitivity to slow-rate dynamic auditory cues. This low-level auditory deficit is thought to provoke a cascade of effects, including inaccurate speech perception…

  2. Psychology of auditory perception.

    Science.gov (United States)

    Lotto, Andrew; Holt, Lori

    2011-09-01

    Audition is often treated as a 'secondary' sensory system behind vision in the study of cognitive science. In this review, we focus on three seemingly simple perceptual tasks to demonstrate the complexity of perceptual-cognitive processing involved in everyday audition. After providing a short overview of the characteristics of sound and their neural encoding, we present a description of the perceptual task of segregating multiple sound events that are mixed together in the signal reaching the ears. Then, we discuss the ability to localize the sound source in the environment. Finally, we provide some data and theory on how listeners categorize complex sounds, such as speech. In particular, we present research on how listeners weigh multiple acoustic cues in making a categorization decision. One conclusion of this review is that it is time for auditory cognitive science to be developed to match what has been done in vision in order for us to better understand how humans communicate with speech and music. WIREs Cogn Sci 2011, 2, 479-489. DOI: 10.1002/wcs.123. For further resources related to this article, please visit the WIREs website. PMID:26302301

  3. Auditory priming of frequency and temporal information: Effects of lateralized presentation

    OpenAIRE

    List, Alexandra; Justus, Timothy

    2007-01-01

    Asymmetric distribution of function between the cerebral hemispheres has been widely investigated in the auditory modality. The current approach borrows heavily from visual local-global research in an attempt to determine whether, as in vision, local-global auditory processing is lateralized. In vision, lateralized local-global processing likely relies on spatial frequency information. Drawing analogies between visual spatial frequency and auditory dimensions, two sets of auditory stimuli wer...

  4. Effects of an Auditory Lateralization Training in Children Suspected to Central Auditory Processing Disorder

    Science.gov (United States)

    Lotfi, Yones; Moosavi, Abdollah; Bakhshi, Enayatollah; Sadjedi, Hamed

    2016-01-01

    Background and Objectives Central auditory processing disorder [(C)APD] refers to a deficit in the processing of auditory stimuli in the nervous system that is not due to higher-order language or cognitive factors. One of the problems in children with (C)APD is spatial difficulty, which has been overlooked despite its significance. Localization is the auditory ability to detect sound sources in space, and it can help to differentiate desired speech from other simultaneous sound sources. The aim of this research was to investigate the effects of auditory lateralization training on speech perception in the presence of noise/competing signals in children suspected of (C)APD. Subjects and Methods In this analytical interventional study, 60 children suspected of (C)APD were selected based on multiple auditory processing assessment subtests. They were randomly divided into two groups: control (mean age 9.07) and training (mean age 9.00). The training program consisted of detecting and pointing to sound sources delivered with interaural time differences under headphones for 12 formal sessions (6 weeks). The spatial word recognition score (WRS) and the monaural selective auditory attention test (mSAAT) were used to follow the effects of the auditory lateralization training. Results In the training group, the mSAAT score and spatial WRS in noise improved significantly after the auditory lateralization training (p value ≤ 0.001). Conclusions Six weeks of auditory lateralization training significantly improved speech understanding in noise. Generalization of these results requires further research.
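
    The interaural time differences (ITDs) used to lateralize headphone stimuli in this kind of training can be sketched as follows. The head-radius value, Woodworth's far-field approximation, and the noise-burst stimulus are illustrative assumptions, not the study's actual stimulus set.

```python
import numpy as np

def itd_for_azimuth(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth's approximation of the interaural time difference (s)
    for a far source at the given azimuth (0 deg = straight ahead)."""
    theta = np.radians(azimuth_deg)
    return (head_radius / c) * (theta + np.sin(theta))

def lateralized_noise(azimuth_deg, dur=0.5, fs=44100):
    """Stereo noise burst whose ITD matches the requested azimuth.
    Positive azimuth -> right ear leads (left channel is delayed)."""
    itd = itd_for_azimuth(azimuth_deg)
    delay = int(round(abs(itd) * fs))
    mono = np.random.default_rng(0).normal(size=int(dur * fs))
    delayed = np.concatenate([np.zeros(delay), mono[:len(mono) - delay]])
    left, right = (delayed, mono) if itd >= 0 else (mono, delayed)
    return np.stack([left, right], axis=1)

stim = lateralized_noise(45.0)       # source 45 deg to the right
print(itd_for_azimuth(90.0) * 1e6)   # maximum ITD, microseconds (~656 us)
```

    Delivering such pairs over headphones and asking the child to point to the perceived side is, in essence, the detection-and-pointing task the training program describes.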

  5. Action Enhances Acoustic Cues for 3-D Target Localization by Echolocating Bats

    Science.gov (United States)

    Wohlgemuth, Melville J.

    2016-01-01

    Under natural conditions, animals encounter a barrage of sensory information from which they must select and interpret biologically relevant signals. Active sensing can facilitate this process by engaging motor systems in the sampling of sensory information. The echolocating bat serves as an excellent model to investigate the coupling between action and sensing because it adaptively controls both the acoustic signals used to probe the environment and movements to receive echoes at the auditory periphery. We report here that the echolocating bat controls the features of its sonar vocalizations in tandem with the positioning of the outer ears to maximize acoustic cues for target detection and localization. The bat’s adaptive control of sonar vocalizations and ear positioning occurs on a millisecond timescale to capture spatial information from arriving echoes, as well as on a longer timescale to track target movement. Our results demonstrate that purposeful control over sonar sound production and reception can serve to improve acoustic cues for localization tasks. This finding also highlights the general importance of movement to sensory processing across animal species. Finally, our discoveries point to important parallels between spatial perception by echolocation and vision. PMID:27608186

  6. Action Enhances Acoustic Cues for 3-D Target Localization by Echolocating Bats.

    Science.gov (United States)

    Wohlgemuth, Melville J; Kothari, Ninad B; Moss, Cynthia F

    2016-09-01

    Under natural conditions, animals encounter a barrage of sensory information from which they must select and interpret biologically relevant signals. Active sensing can facilitate this process by engaging motor systems in the sampling of sensory information. The echolocating bat serves as an excellent model to investigate the coupling between action and sensing because it adaptively controls both the acoustic signals used to probe the environment and movements to receive echoes at the auditory periphery. We report here that the echolocating bat controls the features of its sonar vocalizations in tandem with the positioning of the outer ears to maximize acoustic cues for target detection and localization. The bat's adaptive control of sonar vocalizations and ear positioning occurs on a millisecond timescale to capture spatial information from arriving echoes, as well as on a longer timescale to track target movement. Our results demonstrate that purposeful control over sonar sound production and reception can serve to improve acoustic cues for localization tasks. This finding also highlights the general importance of movement to sensory processing across animal species. Finally, our discoveries point to important parallels between spatial perception by echolocation and vision. PMID:27608186

  7. Categorical vowel perception enhances the effectiveness and generalization of auditory feedback in human-machine-interfaces.

    Directory of Open Access Journals (Sweden)

    Eric Larson

    Full Text Available Human-machine interface (HMI) designs offer the possibility of improving quality of life for patient populations as well as augmenting normal user function. Despite pragmatic benefits, auditory feedback for HMI control remains underutilized, in part due to observed limitations in effectiveness. The goal of this study was to determine the extent to which categorical speech perception could be used to improve an auditory HMI. Using surface electromyography, 24 healthy speakers of American English participated in 4 sessions to learn to control an HMI using auditory feedback (provided via vowel synthesis). Participants trained on 3 targets in sessions 1-3 and were tested on 3 novel targets in session 4. An "established categories with text cues" group of eight participants was trained and tested on auditory targets corresponding to standard American English vowels, using auditory and text target cues. An "established categories without text cues" group of eight participants was trained and tested on the same targets using only auditory cuing of target vowel identity. A "new categories" group of eight participants was trained and tested on targets that corresponded to vowel-like sounds not part of American English. Analyses of user performance revealed significant effects of session and group (established categories groups and the new categories group), and a trend for an interaction between session and group. Results suggest that auditory feedback can be effectively used for HMI operation when paired with established categorical (native vowel) targets with an unambiguous cue.

  8. Multiple cue use and integration in pigeons (Columba livia).

    Science.gov (United States)

    Legge, Eric L G; Madan, Christopher R; Spetch, Marcia L; Ludvig, Elliot A

    2016-05-01

    Encoding multiple cues can improve the accuracy and reliability of navigation and goal localization. Problems may arise, however, if one cue is displaced and provides information which conflicts with other cues. Here we investigated how pigeons cope with cue conflict by training them to locate a goal relative to two landmarks and then varying the amount of conflict between the landmarks. When the amount of conflict was small, pigeons tended to integrate both cues in their search patterns. When the amount of conflict was large, however, pigeons used information from both cues independently. This context-dependent strategy for resolving spatial cue conflict agrees with Bayes optimal calculations for using information from multiple sources. PMID:26908004
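
    The Bayes-optimal calculation this abstract refers to is precision-weighted averaging of the two landmark estimates. A minimal sketch follows; the thresholded switch from integration to reliance on a single cue is an illustrative stand-in for "using the cues independently", not the authors' exact model.

```python
def integrate(mu1, var1, mu2, var2):
    """Precision-weighted (Bayes-optimal) fusion of two Gaussian cues."""
    precision = 1.0 / var1 + 1.0 / var2
    mu = (mu1 / var1 + mu2 / var2) / precision
    return mu, 1.0 / precision

def locate(mu1, var1, mu2, var2, conflict_threshold):
    """Integrate the landmark estimates when the conflict is small;
    otherwise fall back to the more reliable cue alone (a simple
    stand-in for treating the cues independently)."""
    if abs(mu1 - mu2) <= conflict_threshold:
        return integrate(mu1, var1, mu2, var2)
    return (mu1, var1) if var1 <= var2 else (mu2, var2)

print(locate(0.0, 1.0, 2.0, 1.0, 5.0))   # small conflict -> fused (1.0, 0.5)
print(locate(0.0, 1.0, 20.0, 2.0, 5.0))  # large conflict -> single cue (0.0, 1.0)
```

    Note that fusion always reduces the variance below that of either cue alone, which is why integrating pays off only while the cues plausibly refer to the same goal location.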

  10. Development of kinesthetic-motor and auditory-motor representations in school-aged children.

    Science.gov (United States)

    Kagerer, Florian A; Clark, Jane E

    2015-07-01

    In two experiments using a center-out task, we investigated kinesthetic-motor and auditory-motor integrations in 5- to 12-year-old children and young adults. In experiment 1, participants moved a pen on a digitizing tablet from a starting position to one of three targets (visuo-motor condition), and then to one of four targets without visual feedback of the movement. In both conditions, we found that with increasing age, the children moved faster and straighter, and became less variable in their feedforward control. Higher control demands for movements toward the contralateral side were reflected in longer movement times and decreased spatial accuracy across all age groups. When feedforward control relies predominantly on kinesthesia, 7- to 10-year-old children were more variable, indicating difficulties in switching between feedforward and feedback control efficiently during that age. An inverse age progression was found for directional endpoint error; larger errors increasing with age likely reflect stronger functional lateralization for the dominant hand. In experiment 2, the same visuo-motor condition was followed by an auditory-motor condition in which participants had to move to acoustic targets (either white band or one-third octave noise). Since in the latter directional cues come exclusively from transcallosally mediated interaural time differences, we hypothesized that auditory-motor representations would show age effects. The results did not show a clear age effect, suggesting that corpus callosum functionality is sufficient in children to allow them to form accurate auditory-motor maps already at a young age.

  11. Auditory perception of a human walker.

    Science.gov (United States)

    Cottrell, David; Campbell, Megan E J

    2014-01-01

    When one hears footsteps in the hall, one is able to instantly recognise it as a person: this is an everyday example of auditory biological motion perception. Despite the familiarity of this experience, research into this phenomenon is in its infancy compared with visual biological motion perception. Here, two experiments explored sensitivity to, and recognition of, auditory stimuli of biological and nonbiological origin. We hypothesised that the cadence of a walker gives rise to a temporal pattern of impact sounds that facilitates the recognition of human motion from auditory stimuli alone. First a series of detection tasks compared sensitivity with three carefully matched impact sounds: footsteps, a ball bouncing, and drumbeats. Unexpectedly, participants were no more sensitive to footsteps than to impact sounds of nonbiological origin. In the second experiment participants made discriminations between pairs of the same stimuli, in a series of recognition tasks in which the temporal pattern of impact sounds was manipulated to be either that of a walker or the pattern more typical of the source event (a ball bouncing or a drumbeat). Under these conditions, there was evidence that both temporal and nontemporal cues were important in recognising these stimuli. It is proposed that the interval between footsteps, which reflects a walker's cadence, is a cue for the recognition of the sounds of a human walking.

  12. Auditory distance perception in humans : A summary of past and present research

    NARCIS (Netherlands)

    Zahorik, P.; Brungart, D.S.; Bronkhorst, A.W.

    2005-01-01

    Although auditory distance perception is a critical component of spatial hearing, it has received substantially less scientific attention than the directional aspects of auditory localization. Here we summarize current knowledge on auditory distance perception, with special emphasis on recent research.

  13. Nogo stimuli do not receive more attentional suppression or response inhibition than neutral stimuli: evidence from the N2pc, PD and N2 components in a spatial cueing paradigm

    Directory of Open Access Journals (Sweden)

    Barras, Caroline; Kerzel, Dirk

    2016-05-01

    It has been claimed that stimuli sharing the color of the nogo-target are suppressed because of the strong incentive to not process the nogo-target, but we failed to replicate this finding. Participants searched for a color singleton in the target display and indicated its shape when it was in the go color. If the color singleton in the target display was in the nogo color, they had to withhold the response. The target display was preceded by a cue display that also contained a color singleton (the cue). The cue was either in the color of the go or nogo target, or it was in an unrelated, neutral color. With cues in the go color, reaction times (RTs) were shorter when the cue appeared at the same location as the target compared to when it appeared at a different location. Also, electrophysiological recordings showed that an index of attentional selection, the N2pc, was elicited by go cues. Surprisingly, we failed to replicate cueing costs for cues in the nogo color that were originally reported by Anderson and Folk (2012). Consistently, we also failed to find an electrophysiological index of attentional suppression (the PD) for cues in the nogo color. Further, fronto-central ERPs to the cue display showed the same negativity for nogo and neutral stimuli relative to go stimuli, which is at odds with response inhibition and conflict monitoring accounts of the Nogo-N2. Thus, the modified cueing paradigm employed here provides little evidence that features associated with nogo-targets are suppressed at the level of attention or response selection. Rather, nogo-stimuli are efficiently ignored and attention is focused on features that require a response.

  15. Nogo Stimuli Do Not Receive More Attentional Suppression or Response Inhibition than Neutral Stimuli: Evidence from the N2pc, PD, and N2 Components in a Spatial Cueing Paradigm

    Science.gov (United States)

    Barras, Caroline; Kerzel, Dirk

    2016-01-01

    It has been claimed that stimuli sharing the color of the nogo-target are suppressed because of the strong incentive to not process the nogo-target, but we failed to replicate this finding. Participants searched for a color singleton in the target display and indicated its shape when it was in the go color. If the color singleton in the target display was in the nogo color, they had to withhold the response. The target display was preceded by a cue display that also contained a color singleton (the cue). The cue was either in the color of the go or nogo target, or it was in an unrelated, neutral color. With cues in the go color, reaction times were shorter when the cue appeared at the same location as the target compared to when it appeared at a different location. Also, electrophysiological recordings showed that an index of attentional selection, the N2pc, was elicited by go cues. Surprisingly, we failed to replicate cueing costs for cues in the nogo color that were originally reported by Anderson and Folk (2012). Consistently, we also failed to find an electrophysiological index of attentional suppression (the PD) for cues in the nogo color. Further, fronto-central event-related potentials to the cue display showed the same negativity for nogo and neutral stimuli relative to go stimuli, which is at odds with response inhibition and conflict monitoring accounts of the Nogo-N2. Thus, the modified cueing paradigm employed here provides little evidence that features associated with nogo-targets are suppressed at the level of attention or response selection. Rather, nogo-stimuli are efficiently ignored and attention is focused on features that require a response. PMID:27199858

  16. Age differences in visual-auditory self-motion perception during a simulated driving task

    Directory of Open Access Journals (Sweden)

    Robert Ramkhalawansingh

    2016-04-01

    Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion.

  17. Auditory imagery: empirical findings.

    Science.gov (United States)

    Hubbard, Timothy L

    2010-03-01

    The empirical literature on auditory imagery is reviewed. Data on (a) imagery for auditory features (pitch, timbre, loudness), (b) imagery for complex nonverbal auditory stimuli (musical contour, melody, harmony, tempo, notational audiation, environmental sounds), (c) imagery for verbal stimuli (speech, text, in dreams, interior monologue), (d) auditory imagery's relationship to perception and memory (detection, encoding, recall, mnemonic properties, phonological loop), and (e) individual differences in auditory imagery (in vividness, musical ability and experience, synesthesia, musical hallucinosis, schizophrenia, amusia) are considered. It is concluded that auditory imagery (a) preserves many structural and temporal properties of auditory stimuli, (b) can facilitate auditory discrimination but interfere with auditory detection, (c) involves many of the same brain areas as auditory perception, (d) is often but not necessarily influenced by subvocalization, (e) involves semantically interpreted information and expectancies, (f) involves depictive components and descriptive components, (g) can function as a mnemonic but is distinct from rehearsal, and (h) is related to musical ability and experience (although the mechanisms of that relationship are not clear). PMID:20192565

  18. Auditory and Visual Cues for Spatiotemporal Rhythm Reproduction

    DEFF Research Database (Denmark)

    Maculewicz, Justyna; Serafin, Stefania; Kofoed, Lise B.

    2013-01-01

    The goal of this experiment is to investigate the role of auditory and visual feedback in a rhythmic tapping task. Subjects had to tap with the finger following presented rhythms, which were divided into easy and difficult patterns. Specificity of the task was that participants had to take...

  19. Detecting anxiety and defensiveness from visual and auditory cues.

    Science.gov (United States)

    Harrigan, J A; Harrigan, K M; Sale, B A; Rosenthal, R

    1996-09-01

    Defensive individuals have been shown to differ from nondefensive individuals on a number of physiological and behavioral measures. We report two studies on observers' inferences of defensiveness, and the contribution of communication channels in the inference of defensiveness. Observers judged high and low state anxious segments of high and low trait anxious defensive and nondefensive individuals. Accurate assessments were made of (a) defensiveness, (b) state anxiety, and (c) trait anxiety: Individuals with higher levels of each variable were perceived as more anxious compared with the lower level. Effects for defensiveness and state anxiety were greater in audio-only segments, while effects for trait anxiety were greater in video-only segments. Inferences of defensiveness were greater at higher levels of state anxiety and trait anxiety. Low trait anxious defensive individuals were perceived as more anxious than truly low trait anxious individuals. Results for defensiveness and trait anxiety were replicated in Study 2, and observers' perceptions of state anxiety matched individuals' self-reports: Defensive individuals with maximal differences between high and low state anxiety were seen as more anxious in high state anxiety, while defensive individuals with minimal differences between high and low state anxiety were regarded as less anxious in high state anxiety. PMID:8776883

  20. Sensory cueing in the treatment of unilateral spatial neglect

    Institute of Scientific and Technical Information of China (English)

    杨永红; 王凤怡; 黄秋月; 左京京; 方乃权

    2015-01-01

    Objective: To investigate the effects of sensory cueing (SC) on unilateral spatial neglect after stroke. Methods: Five stroke survivors with unilateral spatial neglect underwent a tailored sensory cueing treatment (wearing a sensory cueing device 3 hours a day, 5 days a week for 2 weeks) in addition to their conventional rehabilitation. Two weeks before and one day before the treatment, and then one day, two weeks, and four weeks after the treatment, all five patients were assessed using the Hong Kong edition of the behavioral inattention test (BIT-C). Results: No significant changes were identified in the average BIT-C ratings at the two time points before the intervention. However, the average score had increased significantly only one day after the start of the intervention, with further significant improvement at each of the succeeding 2-week intervals. The greatest improvement was in finishing cancellation tasks, and the most severely affected patient showed the greatest improvement. Conclusion: Sensory cueing treatment may be useful and feasible in reducing unilateral spatial neglect for stroke survivors. However, randomized and controlled trials with larger samples are needed to further verify its effects.

  1. Persistent fluctuations in stride intervals under fractal auditory stimulation.

    Science.gov (United States)

    Marmelat, Vivien; Torre, Kjerstin; Beek, Peter J; Daffertshofer, Andreas

    2014-01-01

    Stride sequences of healthy gait are characterized by persistent long-range correlations, which become anti-persistent in the presence of an isochronous metronome. The latter phenomenon is of particular interest because auditory cueing is generally considered to reduce stride variability and may hence be beneficial for stabilizing gait. Complex systems tend to match their correlation structure when synchronizing. In gait training, can one capitalize on this tendency by using a fractal metronome rather than an isochronous one? We examined whether auditory cues with fractal variations in inter-beat intervals yield similar fractal inter-stride interval variability as isochronous auditory cueing in two complementary experiments. In Experiment 1, participants walked on a treadmill while being paced by either an isochronous or a fractal metronome with different variation strengths between beats in order to test whether participants managed to synchronize with a fractal metronome and to determine the necessary amount of variability for participants to switch from anti-persistent to persistent inter-stride intervals. Participants did synchronize with the metronome despite its fractal randomness. The corresponding coefficient of variation of inter-beat intervals was fixed in Experiment 2, in which participants walked on a treadmill while being paced by non-isochronous metronomes with different scaling exponents. As expected, inter-stride intervals showed persistent correlations similar to self-paced walking only when cueing contained persistent correlations. Our results open up a new window to optimize rhythmic auditory cueing for gait stabilization by integrating fractal fluctuations in the inter-beat intervals.
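A fractal metronome of the kind described here can be approximated by shaping noise in the frequency domain so that inter-beat intervals follow a 1/f^beta power law (beta = 1 gives persistent, "pink" fluctuations; beta = 0 gives an uncorrelated sequence). This is a minimal sketch of the general technique, not the authors' stimulus-generation code; the function name and parameter values are illustrative:

```python
import numpy as np

def fractal_intervals(n, beta=1.0, mean_ibi=1.0, cv=0.03, seed=0):
    """Generate n inter-beat intervals (seconds) whose fluctuations have
    an approximately 1/f^beta spectrum, with a given mean and coefficient
    of variation (cv)."""
    rng = np.random.default_rng(seed)
    # Build a random-phase spectrum with |X(f)| ~ f^(-beta/2)
    freqs = np.fft.rfftfreq(n, d=1.0)
    amplitude = np.ones_like(freqs)
    amplitude[1:] = freqs[1:] ** (-beta / 2)
    amplitude[0] = 0.0  # drop DC; the mean is set explicitly below
    phases = rng.uniform(0, 2 * np.pi, size=freqs.size)
    series = np.fft.irfft(amplitude * np.exp(1j * phases), n=n)
    # Normalize to zero mean, unit variance, then scale to the target CV
    series = (series - series.mean()) / series.std()
    return mean_ibi * (1.0 + cv * series)

ibis = fractal_intervals(1024, beta=1.0)
print(ibis.mean(), ibis.std() / ibis.mean())  # ~1.0 s mean, ~3% CV
```

Fixing `cv` across metronomes while varying `beta` corresponds to the manipulation in Experiment 2, where the variation strength was held constant and only the scaling exponent changed.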

  2. The contribution of dynamic visual cues to audiovisual speech perception.

    Science.gov (United States)

    Jaekl, Philip; Pesquita, Ana; Alsius, Agnes; Munhall, Kevin; Soto-Faraco, Salvador

    2015-08-01

    Seeing a speaker's facial gestures can significantly improve speech comprehension, especially in noisy environments. However, the nature of the visual information from the speaker's facial movements that is relevant for this enhancement is still unclear. Like auditory speech signals, visual speech signals unfold over time and contain both dynamic configural information and luminance-defined local motion cues; two information sources that are thought to engage anatomically and functionally separate visual systems. Whereas some past studies have highlighted the importance of local, luminance-defined motion cues in audiovisual speech perception, the contribution of dynamic configural information signalling changes in form over time has not yet been assessed. We therefore attempted to single out the contribution of dynamic configural information to audiovisual speech processing. To this end, we measured word identification performance in noise using unimodal auditory stimuli, and with audiovisual stimuli. In the audiovisual condition, speaking faces were presented as point light displays achieved via motion capture of the original talker. Point light displays could be isoluminant, to minimise the contribution of effective luminance-defined local motion information, or with added luminance contrast, allowing the combined effect of dynamic configural cues and local motion cues. Audiovisual enhancement was found in both the isoluminant and contrast-based luminance conditions compared to an auditory-only condition, demonstrating, for the first time, the specific contribution of dynamic configural cues to audiovisual speech improvement. These findings imply that globally processed changes in a speaker's facial shape contribute significantly towards the perception of articulatory gestures and the analysis of audiovisual speech.

  3. Shifts of attention in the early blind: an ERP study of attentional control processes in the absence of visual spatial information

    OpenAIRE

    Van Velzen, J.; Eardley, Alison F.; Forster, B.; Eimer, Martin

    2006-01-01

    To investigate the role of visual spatial information in the control of spatial attention, event-related brain potentials (ERPs) were recorded during a tactile attention task for a group of totally blind participants who were either congenitally blind or had lost vision during infancy, and for an age-matched, sighted control group who performed the task in the dark. Participants had to shift attention to the left or right hand (as indicated by an auditory cue presented at the start of each tr...

  4. Reactivity to nicotine cues over repeated cue reactivity sessions.

    Science.gov (United States)

    LaRowe, Steven D; Saladin, Michael E; Carpenter, Matthew J; Upadhyaya, Himanshu P

    2007-12-01

    The present study investigated whether reactivity to nicotine-related cues would attenuate across four experimental sessions held 1 week apart. Participants were nineteen non-treatment seeking, nicotine-dependent males. Cue reactivity sessions were performed in an outpatient research center using in vivo cues consisting of standardized smoking-related paraphernalia (e.g., cigarettes) and neutral comparison paraphernalia (e.g., pencils). Craving ratings were collected before and after both cue presentations while physiological measures (heart rate, skin conductance) were collected before and during the cue presentations. Although craving levels decreased across sessions, smoking-related cues consistently evoked significantly greater increases in craving relative to neutral cues over all four experimental sessions. Skin conductance was higher in response to smoking cues, though this effect was not as robust as that observed for craving. Results suggest that, under the described experimental parameters, craving can be reliably elicited over repeated cue reactivity sessions.

  5. Auditory localisation of conventional and electric cars : laboratory results and implications for cycling safety

    OpenAIRE

    Stelling-Konczak, A. & Hagenzieker, M.P.

    2016-01-01

    When driven at low speeds, cars operating in electric mode have been found to be quieter than conventional cars. As a result, the auditory cues which pedestrians and cyclists use to assess the presence, proximity and location of oncoming traffic may be reduced, posing a safety hazard. This laboratory study examined auditory localisation of conventional and electric cars including vehicle motion paths relevant for cycling activity. Participants (N = 65) in three age groups (16–18, 30–40 and 65–70...

  6. Auditory-motor learning influences auditory memory for music.

    Science.gov (United States)

    Brown, Rachel M; Palmer, Caroline

    2012-05-01

    In two experiments, we investigated how auditory-motor learning influences performers' memory for music. Skilled pianists learned novel melodies in four conditions: auditory only (listening), motor only (performing without sound), strongly coupled auditory-motor (normal performance), and weakly coupled auditory-motor (performing along with auditory recordings). Pianists' recognition of the learned melodies was better following auditory-only or auditory-motor (weakly coupled and strongly coupled) learning than following motor-only learning, and better following strongly coupled auditory-motor learning than following auditory-only learning. Auditory and motor imagery abilities modulated the learning effects: Pianists with high auditory imagery scores had better recognition following motor-only learning, suggesting that auditory imagery compensated for missing auditory feedback at the learning stage. Experiment 2 replicated the findings of Experiment 1 with melodies that contained greater variation in acoustic features. Melodies that were slower and less variable in tempo and intensity were remembered better following weakly coupled auditory-motor learning. These findings suggest that motor learning can aid performers' auditory recognition of music beyond auditory learning alone, and that motor learning is influenced by individual abilities in mental imagery and by variation in acoustic features. PMID:22271265

  7. On the relative contributions of multisensory integration and crossmodal exogenous spatial attention to multisensory response enhancement.

    Science.gov (United States)

    Van der Stoep, N; Spence, C; Nijboer, T C W; Van der Stigchel, S

    2015-11-01

    Two processes that can give rise to multisensory response enhancement (MRE) are multisensory integration (MSI) and crossmodal exogenous spatial attention. It is, however, currently unclear what the relative contribution of each of these is to MRE. We investigated this issue using two tasks that are generally assumed to measure MSI (a redundant target effect task) and crossmodal exogenous spatial attention (a spatial cueing task). One block of trials consisted of unimodal auditory and visual targets designed to provide a unimodal baseline. In two other blocks of trials, the participants were presented with spatially and temporally aligned and misaligned audiovisual (AV) targets (0, 50, 100, and 200ms SOA). In the integration block, the participants were instructed to respond to the onset of the first target stimulus that they detected (A or V). The instruction for the cueing block was to respond only to the onset of the visual targets. The targets could appear at one of three locations: left, center, and right. The participants were instructed to respond only to lateral targets. The results indicated that MRE was caused by MSI at 0ms SOA. At 50ms SOA, both crossmodal exogenous spatial attention and MSI contributed to the observed MRE, whereas the MRE observed at the 100 and 200ms SOAs was attributable to crossmodal exogenous spatial attention, alerting, and temporal preparation. These results therefore suggest that there may be a temporal window in which both MSI and exogenous crossmodal spatial attention can contribute to multisensory response enhancement.
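A standard way to ask whether redundant-target speedups exceed what a parallel race between unimodal channels can produce is Miller's race-model inequality, F_AV(t) ≤ F_A(t) + F_V(t). The sketch below is a generic illustration of that test, not the authors' analysis; the function name, probe quantiles, and example RTs are assumptions:

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av,
                         quantiles=np.linspace(0.05, 0.95, 19)):
    """Return F_AV(t) - [F_A(t) + F_V(t)] at probe times t taken from
    quantiles of the audiovisual RT distribution. Positive values mean
    redundant-target responses are faster than any race between the
    unimodal channels allows, implicating multisensory integration."""
    def cdf(sample, t):
        # Empirical CDF of `sample` evaluated at each probe time in t
        return np.mean(np.asarray(sample)[:, None] <= t, axis=0)

    t = np.quantile(rt_av, quantiles)
    bound = np.minimum(cdf(rt_a, t) + cdf(rt_v, t), 1.0)
    return cdf(rt_av, t) - bound

# Toy data: AV responses much faster than either unimodal condition
viol = race_model_violation([300, 320, 340], [310, 330, 350],
                            [250, 260, 270])
print((viol > 0).all())  # True: the race model is violated everywhere
```

In practice the inequality is usually tested only at the fastest quantiles, where violations are expected if integration (rather than statistical facilitation or attention) drives the enhancement.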

  8. Complex-tone pitch representations in the human auditory system

    DEFF Research Database (Denmark)

    Bianchi, Federica

    Understanding how the human auditory system processes the physical properties of an acoustical stimulus to give rise to a pitch percept is a fascinating aspect of hearing research. Since most natural sounds are harmonic complex tones, this work focused on the nature of the pitch-relevant cues that are necessary for the auditory system to retrieve the pitch of complex sounds. The existence of different pitch-coding mechanisms for low-numbered (spectrally resolved) and high-numbered (unresolved) harmonics was investigated by comparing pitch-discrimination performance across different cohorts of listeners, as well as the effect of musical training for pitch discrimination of complex tones with resolved and unresolved harmonics. Concerning the first topic, behavioral and modeling results in listeners with sensorineural hearing loss (SNHL) indicated that temporal envelope cues of complex tones...

  9. Viewpoint-independent contextual cueing effect

    Directory of Open Access Journals (Sweden)

    Taiga Tsuchiai

    2012-06-01

    We usually perceive things in our surroundings as unchanged despite viewpoint changes caused by self-motion. The visual system therefore must have a function to process objects independently of viewpoint. In this study, we examined whether viewpoint-independent spatial layout can be obtained implicitly. For this purpose, we used a contextual cueing effect, a learning effect of spatial layout in visual search displays known to be an implicit effect. We compared the transfer of the contextual cueing effect between cases with and without self-motion by using visual search displays for 3D objects, which changed according to the participant's assumed location for viewing the stimuli. The contextual cueing effect was obtained with self-motion but disappeared when the display changed without self-motion. This indicates that there is an implicit learning effect in spatial coordinates and suggests that the spatial representation of object layouts or scenes can be obtained and updated implicitly. We also showed that binocular disparity plays an important role in the layout representations.

  10. Measuring Auditory Selective Attention using Frequency Tagging

    Directory of Open Access Journals (Sweden)

    Hari M Bharadwaj

    2014-02-01

    Frequency tagging of sensory inputs (presenting stimuli that fluctuate periodically at rates to which the cortex can phase lock) has been used to study attentional modulation of neural responses to inputs in different sensory modalities. For visual inputs, the visual steady-state response (VSSR) at the frequency modulating an attended object is enhanced, while the VSSR to a distracting object is suppressed. In contrast, the effect of attention on the auditory steady-state response (ASSR) is inconsistent across studies. However, most auditory studies analyzed results at the sensor level or used only a small number of equivalent current dipoles to fit cortical responses. In addition, most studies of auditory spatial attention used dichotic stimuli (independent signals at the ears) rather than more natural, binaural stimuli. Here, we asked whether these methodological choices help explain discrepant results. Listeners attended to one of two competing speech streams, one simulated from the left and one from the right, that were modulated at different frequencies. Using distributed source modeling of magnetoencephalography results, we estimate how spatially directed attention modulates the ASSR in neural regions across the whole brain. Attention enhances the ASSR power at the frequency of the attended stream in the contralateral auditory cortex. The attended-stream modulation frequency also drives phase-locked responses in the left (but not right) precentral sulcus (lPCS), a region implicated in control of eye gaze and visual spatial attention. Importantly, this region shows no phase locking to the distracting stream, suggesting that the lPCS is engaged in an attention-specific manner. Modeling results that take account of the geometry and phases of the cortical sources phase locked to the two streams (including hemispheric asymmetry of lPCS activity) help partly explain why past ASSR studies of auditory spatial attention yield seemingly contradictory results.
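At its core, a frequency-tagging analysis reduces to estimating spectral power at each stream's modulation rate and comparing attended versus unattended tags. A minimal single-channel sketch (the tag frequencies, signal, and function name are synthetic assumptions for illustration, not the study's MEG pipeline):

```python
import numpy as np

def tagged_power(signal, fs, tag_freqs):
    """Return spectral power at each tagging frequency. Attention to a
    stream is expected to enhance power at that stream's tag rate."""
    n = len(signal)
    power = np.abs(np.fft.rfft(signal)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Pick the FFT bin nearest each tag frequency
    return {f: power[np.argmin(np.abs(freqs - f))] for f in tag_freqs}

# Synthetic recording: the 'attended' 37 Hz component is stronger
fs, dur = 1000, 2.0
t = np.arange(int(fs * dur)) / fs
sig = 2.0 * np.sin(2 * np.pi * 37 * t) + 0.5 * np.sin(2 * np.pi * 43 * t)
p = tagged_power(sig, fs, [37, 43])
print(p[37] > p[43])  # True
```

With a 2 s window the frequency resolution is 0.5 Hz, so both tag rates fall on exact FFT bins; in noisier data one would average power over repeated epochs or use phase-locked measures instead.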

  11. Auditory temporal processing skills in musicians with dyslexia.

    Science.gov (United States)

    Bishop-Liebler, Paula; Welch, Graham; Huss, Martina; Thomson, Jennifer M; Goswami, Usha

    2014-08-01

    The core cognitive difficulty in developmental dyslexia involves phonological processing, but adults and children with dyslexia also have sensory impairments. Impairments in basic auditory processing show particular links with phonological impairments, and recent studies with dyslexic children across languages reveal a relationship between auditory temporal processing and sensitivity to rhythmic timing and speech rhythm. As rhythm is explicit in music, musical training might have a beneficial effect on the auditory perception of acoustic cues to rhythm in dyslexia. Here we took advantage of the presence of musicians with and without dyslexia in musical conservatoires, comparing their auditory temporal processing abilities with those of dyslexic non-musicians matched for cognitive ability. Musicians with dyslexia showed equivalent auditory sensitivity to musicians without dyslexia and also showed equivalent rhythm perception. The data support the view that extensive rhythmic experience initiated during childhood (here in the form of music training) can affect basic auditory processing skills which are found to be deficient in individuals with dyslexia. PMID:25044949

  12. Auditory function in individuals within Leber's hereditary optic neuropathy pedigrees.

    Science.gov (United States)

    Rance, Gary; Kearns, Lisa S; Tan, Johanna; Gravina, Anthony; Rosenfeld, Lisa; Henley, Lauren; Carew, Peter; Graydon, Kelley; O'Hare, Fleur; Mackey, David A

    2012-03-01

    The aims of this study are to investigate whether auditory dysfunction is part of the spectrum of neurological abnormalities associated with Leber's hereditary optic neuropathy (LHON) and to determine the perceptual consequences of auditory neuropathy (AN) in affected listeners. Forty-eight subjects confirmed by genetic testing as having one of four mitochondrial mutations associated with LHON (mt11778, mtDNA14484, mtDNA14482 and mtDNA3460) participated. Thirty-two of these had lost vision, and 16 were asymptomatic at the point of data collection. While the majority of individuals showed normal sound detection, >25% (of both symptomatic and asymptomatic participants) showed electrophysiological evidence of AN with either absent or severely delayed auditory brainstem potentials. Abnormalities were observed for each of the mutations, but subjects with the mtDNA11778 type were the most affected. Auditory perception was also abnormal in both symptomatic and asymptomatic subjects, with >20% of cases showing impaired detection of auditory temporal (timing) cues and >30% showing abnormal speech perception both in quiet and in the presence of background noise. The findings of this study indicate that a relatively high proportion of individuals with the LHON genetic profile may suffer functional hearing difficulties due to neural abnormality in the central auditory pathways.

  13. The effects of rhythmic sensory cues on the temporal dynamics of human gait.

    Science.gov (United States)

    Sejdić, Ervin; Fu, Yingying; Pak, Alison; Fairley, Jillian A; Chau, Tom

    2012-01-01

    Walking is a complex, rhythmic task performed by the locomotor system. However, natural gait rhythms can be influenced by metronomic auditory stimuli, a phenomenon of particular interest in neurological rehabilitation. In this paper, we examined the effects of aural, visual and tactile rhythmic cues on the temporal dynamics associated with human gait. Data were collected from fifteen healthy adults in two sessions. Each session consisted of five 15-minute trials. In the first trial of each session, participants walked at their preferred walking speed. In subsequent trials, participants were asked to walk to a metronomic beat, provided through visual, aural, tactile or all three cues (simultaneously and in sync), the pace of which was set to the preferred walking speed of the first trial. Using the collected data, we extracted several parameters, including gait speed, mean stride interval, stride interval variability, scaling exponent and maximum Lyapunov exponent. The extracted parameters showed that rhythmic sensory cues affect the temporal dynamics of human gait. The auditory rhythmic cue had the greatest influence on the gait parameters, while the visual cue had no statistically significant effect on the scaling exponent. These results demonstrate that visual rhythmic cues could be considered as an alternative cueing modality in rehabilitation without concern of adversely altering the statistical persistence of walking.
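The scaling exponent reported for stride-interval series is commonly estimated with detrended fluctuation analysis (DFA). The sketch below runs first-order DFA on a synthetic stride-interval series; the data and window choices are invented for illustration and are not taken from the study.

```python
import numpy as np

def dfa_alpha(x):
    """Estimate the DFA scaling exponent of a 1-D series.

    Standard first-order DFA sketch: integrate the mean-centered series,
    split into non-overlapping windows, remove a linear trend per window,
    and regress log fluctuation on log window size.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    n = len(y)
    scales = np.unique(np.logspace(np.log10(4), np.log10(n // 4), 10).astype(int))
    flucts = []
    for s in scales:
        nseg = n // s
        segs = y[:nseg * s].reshape(nseg, s)
        tt = np.arange(s)
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(tt, seg, 1), tt)) ** 2))
               for seg in segs]
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Uncorrelated stride intervals (white noise) give alpha near 0.5;
# persistent gait dynamics typically yield alpha above 0.5.
white = np.random.default_rng(0).normal(1.1, 0.02, 1024)  # synthetic intervals (s)
print(dfa_alpha(white))
```

An exponent above 0.5 indicates the long-range statistical persistence that the cueing conditions may alter.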

  14. The effects of rhythmic sensory cues on the temporal dynamics of human gait.

    Directory of Open Access Journals (Sweden)

    Ervin Sejdić

    Full Text Available Walking is a complex, rhythmic task performed by the locomotor system. However, natural gait rhythms can be influenced by metronomic auditory stimuli, a phenomenon of particular interest in neurological rehabilitation. In this paper, we examined the effects of aural, visual and tactile rhythmic cues on the temporal dynamics associated with human gait. Data were collected from fifteen healthy adults in two sessions. Each session consisted of five 15-minute trials. In the first trial of each session, participants walked at their preferred walking speed. In subsequent trials, participants were asked to walk to a metronomic beat, provided through visual, aural, tactile or all three cues (simultaneously and in sync), the pace of which was set to the preferred walking speed of the first trial. Using the collected data, we extracted several parameters, including gait speed, mean stride interval, stride interval variability, scaling exponent and maximum Lyapunov exponent. The extracted parameters showed that rhythmic sensory cues affect the temporal dynamics of human gait. The auditory rhythmic cue had the greatest influence on the gait parameters, while the visual cue had no statistically significant effect on the scaling exponent. These results demonstrate that visual rhythmic cues could be considered as an alternative cueing modality in rehabilitation without concern of adversely altering the statistical persistence of walking.

  15. Analysis of Parallel and Transverse Visual Cues on the Gait of Individuals with Idiopathic Parkinson's Disease

    Science.gov (United States)

    de Melo Roiz, Roberta; Azevedo Cacho, Enio Walker; Cliquet, Alberto, Jr.; Barasnevicius Quagliato, Elizabeth Maria Aparecida

    2011-01-01

    Idiopathic Parkinson's disease (IPD) has been defined as a chronic progressive neurological disorder with characteristics that generate changes in gait pattern. Several studies have reported that appropriate external influences, such as visual or auditory cues may improve the gait pattern of patients with IPD. Therefore, the objective of this…

  16. Composition: Cue Wheel

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2014-01-01

    Cue Rondo is an open composition to be realised by improvising musicians. See more about my composition practice in the entry "Composition - General Introduction". This work is licensed under a Creative Commons "by-nc" License. You may for non-commercial purposes use and distribute it, performance...

  17. Multisensor image cueing (MUSIC)

    Science.gov (United States)

    Rodvold, David; Patterson, Tim J.

    2002-07-01

    There have been many years of research and development in the Automatic Target Recognition (ATR) community. This development has resulted in numerous algorithms to perform target detection automatically. The morphing of the ATR acronym to Aided Target Recognition provides a succinct commentary regarding the success of the automatic target recognition research. Now that the goal is aided recognition, many of the algorithms which were not able to provide autonomous recognition may now provide valuable assistance in cueing a human analyst where to look in the images under consideration. This paper describes the MUSIC system being developed for the US Air Force to provide multisensor image cueing. The tool works across multiple image phenomenologies and fuses the evidence across the set of available imagery. MUSIC is designed to work with a wide variety of sensors and platforms, and provide cueing to an image analyst in an information-rich environment. The paper concentrates on the current integration of algorithms into an extensible infrastructure to allow cueing in multiple image types.

  18. Neural coding and perception of pitch in the normal and impaired human auditory system

    DEFF Research Database (Denmark)

    Santurette, Sébastien

    2011-01-01

    Pitch is an important attribute of hearing that allows us to perceive the musical quality of sounds. Besides music perception, pitch contributes to speech communication, auditory grouping, and perceptual segregation of sound sources. In this work, several aspects of pitch perception in humans were...... that the use of spectral cues remained plausible. Simulations of auditory-nerve representations of the complex tones further suggested that a spectrotemporal mechanism combining precise timing information across auditory channels might best account for the behavioral data. Overall, this work provides insights...

  19. Cues for localization in the horizontal plane

    DEFF Research Database (Denmark)

    Jeppesen, Jakob; Møller, Henrik

    2005-01-01

    Spatial localization of sound is often described as unconscious evaluation of cues given by the interaural time difference (ITD) and the spectral information of the sound that reaches the two ears. Our present knowledge suggests the hypothesis that the ITD roughly determines the cone of the perce...... independently in HRTFs used for binaural synthesis. The ITD seems to be dominant for localization in the horizontal plane even when the spectral information is severely degraded....

  20. Auditory Responses of Infants

    Science.gov (United States)

    Watrous, Betty Springer; And Others

    1975-01-01

    Forty infants, 3- to 12-months-old, participated in a study designed to differentiate the auditory response characteristics of normally developing infants in the age ranges 3 - 5 months, 6 - 8 months, and 9 - 12 months. (Author)

  1. Using auditory-visual speech to probe the basis of noise-impaired consonant-vowel perception in dyslexia and auditory neuropathy

    Science.gov (United States)

    Ramirez, Joshua; Mann, Virginia

    2005-08-01

    Both dyslexics and auditory neuropathy (AN) subjects show inferior consonant-vowel (CV) perception in noise, relative to controls. To better understand these impairments, natural acoustic speech stimuli that were masked in speech-shaped noise at various intensities were presented to dyslexic, AN, and control subjects either in isolation or accompanied by visual articulatory cues. AN subjects were expected to benefit from the pairing of visual articulatory cues and auditory CV stimuli, provided that their speech perception impairment reflects a relatively peripheral auditory disorder. Assuming that dyslexia reflects a general impairment of speech processing rather than a disorder of audition, dyslexics were not expected to similarly benefit from an introduction of visual articulatory cues. The results revealed an increased effect of noise masking on the perception of isolated acoustic stimuli by both dyslexic and AN subjects. More importantly, dyslexics showed less effective use of visual articulatory cues in identifying masked speech stimuli and lower visual baseline performance relative to AN subjects and controls. Last, a significant positive correlation was found between reading ability and the ameliorating effect of visual articulatory cues on speech perception in noise. These results suggest that some reading impairments may stem from a central deficit of speech processing.

  2. [Central auditory prosthesis].

    Science.gov (United States)

    Lenarz, T; Lim, H; Joseph, G; Reuter, G; Lenarz, M

    2009-06-01

    Deaf patients with severe sensory hearing loss can benefit from a cochlear implant (CI), which stimulates the auditory nerve fibers. However, patients who do not have an intact auditory nerve cannot benefit from a CI. The majority of these patients are neurofibromatosis type 2 (NF2) patients who developed neural deafness due to growth or surgical removal of a bilateral acoustic neuroma. The only current solution is the auditory brainstem implant (ABI), which stimulates the surface of the cochlear nucleus in the brainstem. Although the ABI provides improvement in environmental awareness and lip-reading capabilities, only a few NF2 patients have achieved some limited open set speech perception. In the search for alternative procedures, our research group, in collaboration with Cochlear Ltd. (Australia), developed a human prototype auditory midbrain implant (AMI), which is designed to electrically stimulate the inferior colliculus (IC). The IC has potential as a new target for an auditory prosthesis, as it provides access to neural projections necessary for speech perception as well as a systematic map of spectral information. In this paper the present status of research and development in the field of central auditory prostheses is presented with respect to technology, surgical technique and hearing results, as well as the background concepts of the ABI and AMI. PMID:19517084

  3. Auditory reafferences: The influence of real-time feedback on movement control

    Directory of Open Access Journals (Sweden)

    Christian eKennel

    2015-01-01

    Full Text Available Auditory reafferences are real-time auditory products created by a person's own movements. Whereas the interdependency of action and perception is generally well studied, the auditory feedback channel and the influence of perceptual processes during movement execution remain largely unconsidered. We argue that movements have a rhythmic character that is closely connected to sound, making it possible to manipulate auditory reafferences online to understand their role in motor control. We examined whether step sounds, occurring as a by-product of running, have an influence on the performance of a complex movement task. Twenty participants completed a hurdling task in three auditory feedback conditions: a control condition with normal auditory feedback, a white noise condition in which sound was masked, and a delayed auditory feedback condition. Overall time and kinematic data were collected. Results show that delayed auditory feedback led to a significantly slower overall time and changed kinematic parameters. Our findings complement previous investigations in a natural movement situation with nonartificial auditory cues. Our results support the existing theoretical understanding of action–perception coupling and hold potential for applied work, where naturally occurring movement sounds can be implemented in the motor learning processes.

  4. Listeners use speaker identity to access representations of spatial perspective during online language comprehension.

    Science.gov (United States)

    Ryskin, Rachel A; Wang, Ranxiao Frances; Brown-Schmidt, Sarah

    2016-02-01

    Little is known about how listeners represent another person's spatial perspective during language processing (e.g., two people looking at a map from different angles). Can listeners use contextual cues such as speaker identity to access a representation of the interlocutor's spatial perspective? In two eye-tracking experiments, participants received auditory instructions to move objects around a screen from two randomly alternating spatial perspectives (45° vs. 315° or 135° vs. 225° rotations from the participant's viewpoint). Instructions were spoken either by one voice, where the speaker's perspective switched at random, or by two voices, where each speaker maintained one perspective. Analysis of participant eye-gaze showed that interpretation of the instructions improved when each viewpoint was associated with a different voice. These findings demonstrate that listeners can learn mappings between individual talkers and viewpoints, and use these mappings to guide online language processing.

  5. Mind your pricing cues.

    Science.gov (United States)

    Anderson, Eric; Simester, Duncan

    2003-09-01

    For most of the items they buy, consumers don't have an accurate sense of what the price should be. Ask them to guess how much a four-pack of 35-mm film costs, and you'll get a variety of wrong answers: Most people will underestimate; many will only shrug. Research shows that consumers' knowledge of the market is so far from perfect that it hardly deserves to be called knowledge at all. Yet people happily buy film and other products every day. Is this because they don't care what kind of deal they're getting? No. Remarkably, it's because they rely on retailers to tell them whether they're getting a good price. In subtle and not-so-subtle ways, retailers send signals to customers, telling them whether a given price is relatively high or low. In this article, the authors review several common pricing cues retailers use--"sale" signs, prices that end in 9, signpost items, and price-matching guarantees. They also offer some surprising facts about how--and how well--those cues work. For instance, the authors' tests with several mail-order catalogs reveal that including the word "sale" beside a price can increase demand by more than 50%. The practice of using a 9 at the end of a price to denote a bargain is so common, you'd think customers would be numb to it. Yet in a study the authors did involving a women's clothing catalog, they increased demand by a third just by changing the price of a dress from $34 to $39. Pricing cues are powerful tools for guiding customers' purchasing decisions, but they must be applied judiciously. Used inappropriately, the cues may breach customers' trust, reduce brand equity, and give rise to lawsuits. PMID:12964397

  6. Auditory localisation of conventional and electric cars : laboratory results and implications for cycling safety

    NARCIS (Netherlands)

    Stelling-Konczak, A.; Hagenzieker, M.P.; Commandeur, J.J.F.; Agterberg, M.J.H. & Wee, B. van

    2016-01-01

    When driven at low speeds, cars operating in electric mode have been found to be quieter than conventional cars. As a result, the auditory cues which pedestrians and cyclists use to assess the presence, proximity and location of oncoming traffic may be reduced, posing a safety hazard. This laboratory s

  7. Auditory localisation of conventional and electric cars : laboratory results and implications for cycling safety.

    NARCIS (Netherlands)

    Stelling-Konczak, A.; Hagenzieker, M.P.; Commandeur, J.J.F.; Agterberg, M.J.H. & Wee, B. van

    2016-01-01

    When driven at low speeds, cars operating in electric mode have been found to be quieter than conventional cars. As a result, the auditory cues which pedestrians and cyclists use to assess the presence, proximity and location of oncoming traffic may be reduced, posing a safety hazard. This laboratory s

  8. Auditory orientation in crickets: Pattern recognition controls reactive steering

    Science.gov (United States)

    Poulet, James F. A.; Hedwig, Berthold

    2005-10-01

    Many groups of insects are specialists in exploiting sensory cues to locate food resources or conspecifics. To achieve orientation, bees and ants analyze the polarization pattern of the sky, male moths orient along the females' odor plume, and cicadas, grasshoppers, and crickets use acoustic signals to locate singing conspecifics. In comparison with olfactory and visual orientation, where learning is involved, auditory processing underlying orientation in insects appears to be more hardwired and genetically determined. In each of these examples, however, orientation requires a recognition process identifying the crucial sensory pattern to interact with a localization process directing the animal's locomotor activity. Here, we characterize this interaction. Using a sensitive trackball system, we show that, during cricket auditory behavior, the recognition process that is tuned toward the species-specific song pattern controls the amplitude of auditory evoked steering responses. Females perform small reactive steering movements toward any sound patterns. Hearing the male's calling song increases the gain of auditory steering within 2-5 s, and the animals even steer toward nonattractive sound patterns inserted into the species-specific pattern. This gain control mechanism in the auditory-to-motor pathway allows crickets to pursue species-specific sound patterns temporarily corrupted by environmental factors and may reflect the organization of recognition and localization networks in insects. Keywords: localization, phonotaxis

  9. The Relationship between the Field-Shifting Phenomenon and Representational Coherence of Place Cells in CA1 and CA3 in a Cue-Altered Environment

    Science.gov (United States)

    Lee, Inah; Knierim, James J.

    2007-01-01

    Subfields of the hippocampus display differential dynamics in processing a spatial environment, especially when changes are introduced to the environment. Specifically, when familiar cues in the environment are spatially rearranged, place cells in the CA3 subfield tend to rotate with a particular set of cues (e.g., proximal cues), maintaining a…

  10. Effect of harmonicity on the detection of a signal in a complex masker and on spatial release from masking.

    Directory of Open Access Journals (Sweden)

    Astrid Klinge

    Full Text Available The amount of masking of sounds from one source (signals) by sounds from a competing source (maskers) heavily depends on the sound characteristics of the masker and the signal and on their relative spatial location. Numerous studies investigated the ability to detect a signal in a speech or a noise masker or the effect of spatial separation of signal and masker on the amount of masking, but there is a lack of studies investigating the combined effects of many cues on the masking, as is typical for natural listening situations. The current study, using free-field listening, systematically evaluates the combined effects of harmonicity and inharmonicity cues in multi-tone maskers and cues resulting from spatial separation of target signal and masker on the detection of a pure tone in a multi-tone or a noise masker. A linear binaural processing model was implemented to predict the masked thresholds in order to estimate whether the observed thresholds can be accounted for by energetic masking in the auditory periphery or whether other effects are involved. Thresholds were determined for combinations of two target frequencies (1 and 8 kHz), two spatial configurations (masker and target either co-located or spatially separated by 90 degrees azimuth), and five different masker types (four complex multi-tone stimuli, one noise masker). A spatial separation of target and masker resulted in a release from masking for all masker types. The amount of masking significantly depended on the masker type and frequency range. The various harmonic and inharmonic relations between target and masker or between components of the masker resulted in a complex pattern of increased or decreased masked thresholds in comparison to the predicted energetic masking. The results indicate that harmonicity cues affect the detectability of a tonal target in a complex masker.

  11. Effects of localized auditory information on visual target detection performance using a helmet-mounted display.

    Science.gov (United States)

    Nelson, W T; Hettinger, L J; Cunningham, J A; Brickman, B J; Haas, M W; McKinley, R L

    1998-09-01

    An experiment was conducted to evaluate the effects of localized auditory information on visual target detection performance. Visual targets were presented on either a wide field-of-view dome display or a helmet-mounted display and were accompanied by either localized, nonlocalized, or no auditory information. The addition of localized auditory information resulted in significant increases in target detection performance and significant reductions in workload ratings as compared with conditions in which auditory information was either nonlocalized or absent. Qualitative and quantitative analyses of participants' head motions revealed that the addition of localized auditory information resulted in extremely efficient and consistent search strategies. Implications for the development and design of multisensory virtual environments are discussed. Actual or potential applications of this research include the use of spatial auditory displays to augment visual information presented in helmet-mounted displays, thereby leading to increases in performance efficiency, reductions in physical and mental workload, and enhanced spatial awareness of objects in the environment.

  12. The Influence of Visual Cues on Sound Externalization

    DEFF Research Database (Denmark)

    Carvajal, Juan Camilo Gil; Santurette, Sébastien; Cubick, Jens;

    this is due to incongruent auditory cues between the recording and playback room during sound reproduction or to an expectation effect from the visual impression of the room. This study investigated the influence of a priori acoustic and visual knowledge of the playback room on sound externalization...... to the listener in rooms 2 and 3 than in room 1, with a larger effect in the reverberant than in the dry environment. In room 2, the perceived distance of the virtual sounds was more accurate in condition V than in conditions A and AV, where it was reduced. In room 3, differences in distance judgments between A...... the more reverberant the listening environment was. While the visual impression of the playback room did not affect perceived distance, visual cues helped resolve localization ambiguities and improved compactness perception....

  13. Cue conflicts in context

    DEFF Research Database (Denmark)

    Boeg Thomsen, Ditte; Poulsen, Mads

    2015-01-01

    preschoolers. However, object-first clauses may be context-sensitive structures, which are infelicitous in isolation. In a second act-out study we presented OVS clauses in supportive and unsupportive discourse contexts and in isolation and found that five-to-six-year-olds’ OVS comprehension was enhanced...... in discourse-pragmatically felicitous contexts. Our results extend previous findings of preschoolers’ sensitivity to discourse-contextual cues in sentence comprehension (Hurewitz, 2001; Song & Fisher, 2005) to the basic task of assigning agent and patient roles....

  14. Responses of mink to auditory stimuli: Prerequisites for applying the ‘cognitive bias’ approach

    DEFF Research Database (Denmark)

    Svendsen, Pernille Maj; Malmkvist, Jens; Halekoh, Ulrich;

    2012-01-01

    The aim of the study was to determine and validate prerequisites for applying a cognitive (judgement) bias approach to assessing welfare in farmed mink (Neovison vison). We investigated discrimination ability and associative learning ability using auditory cues. The mink (n = 15 females) were...... mink only showed habituation in experiment 2. Regardless of the frequency used (2 and 18 kHz), cues predicting the danger situation initially elicited slower responses compared to those predicting the safe situation but quickly became faster. Using auditory cues as discrimination stimuli for female...... farmed mink in a judgement bias approach would thus appear to be feasible. However, several specific issues are to be considered in order to successfully adapt a cognitive bias approach to mink, and these are discussed....

  15. A magnetorheological haptic cue accelerator for manual transmission vehicles

    International Nuclear Information System (INIS)

    This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed haptic cue device can transmit the optimal moment of gear shifting for manual transmission to a driver without requiring the driver's visual attention. As a first step to achieve this goal, an MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking into account spatial limitations, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured and its field-dependent torque and time response are experimentally evaluated. Then the manufactured MR haptic cue device is integrated with the accelerator pedal. A simple virtual vehicle emulating the operation of the engine of a passenger vehicle is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated, and control performance is experimentally evaluated and presented in the time domain
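The feed-forward control idea can be illustrated by inverting an assumed torque-current model of the MR brake. The affine model and every constant below are hypothetical illustration values, not the device's identified parameters.

```python
# Hypothetical feed-forward mapping from a desired haptic-cue torque to an
# MR-brake coil current. Assumes an affine field-dependent torque model
# T = T0 + K * I identified from bench measurements; T0, K, and I_MAX are
# invented illustration values.
T0 = 0.2     # off-state (friction) torque, N*m
K = 1.5      # torque gain, N*m per A
I_MAX = 2.0  # coil current limit, A

def feedforward_current(torque_desired):
    """Invert the torque model and clamp to the actuator's current range."""
    current = max(0.0, (torque_desired - T0) / K)
    return min(current, I_MAX)

print(round(feedforward_current(1.1), 3))  # → 0.6
```

In a real device the map would be identified from the measured field-dependent torque curve rather than assumed linear.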

  16. Binaural cues provide for a release from informational masking.

    Science.gov (United States)

    Tolnai, Sandra; Dolležal, Lena-Vanessa; Klump, Georg M

    2015-10-01

    Informational masking (IM) describes the insensitivity of detecting a change in sound features in a complex acoustical environment when such a change could easily be detected in the absence of distracting sounds. IM occurs because of the similarity between deviant sound and distracting sounds (so-called similarity-based IM) and/or stimulus uncertainty stemming from trial-to-trial variability (so-called uncertainty-based IM). IM can be abolished if similarity-based or uncertainty-based IM are minimized. Here, we modulated similarity-based IM using binaural cues. Standard/deviant tones and distracting tones were presented sequentially, and level-increment thresholds were measured. Deviant tones differed from standard tones by a higher sound level. Distracting tones covered a wide range of levels. Standard/deviant tones and distracting tones were characterized by their interaural time difference (ITD), interaural level difference (ILD), or both ITD and ILD. The larger the ITD or ILD was, the better similarity-based IM was overcome. If both interaural differences were applied to standard/deviant tones, the release from IM was larger than when either interaural difference was used. The results show that binaural cues are potent cues to abolish similarity-based IM and that the auditory system makes use of multiple available cues. PMID:26413722
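The two interaural cues manipulated above can be estimated from a stereo signal in a few lines: ITD as the lag of the interaural cross-correlation peak, and ILD as the broadband level ratio in dB. The delay and gain below are synthetic assumptions, not stimulus values from the study.

```python
import numpy as np

# Synthetic binaural signal: the right ear receives a delayed, attenuated
# copy of the left-ear noise. The 12-sample delay (~0.25 ms at 48 kHz) and
# 0.5 gain (~6 dB) are invented illustration values.
fs = 48000
rng = np.random.default_rng(2)
src = rng.normal(size=int(0.05 * fs))   # 50 ms of noise
delay, gain = 12, 0.5
left = src
right = np.roll(src, delay) * gain

# ITD: lag maximizing the interaural cross-correlation.
lags = np.arange(-64, 65)
xcorr = [np.sum(left * np.roll(right, -lag)) for lag in lags]
itd_samples = int(lags[np.argmax(xcorr)])

# ILD: broadband level difference in dB.
rms = lambda x: np.sqrt(np.mean(x ** 2))
ild_db = 20 * np.log10(rms(left) / rms(right))

print(itd_samples, round(ild_db, 1))  # → 12 6.0
```

Real listeners weight ITD and ILD differently across frequency, but this broadband estimate recovers both imposed cues.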

  17. The neglected neglect: auditory neglect.

    Science.gov (United States)

    Gokhale, Sankalp; Lahoti, Sourabh; Caplan, Louis R

    2013-08-01

    Whereas visual and somatosensory forms of neglect are commonly recognized by clinicians, auditory neglect is often not assessed and therefore neglected. The auditory cortical processing system can be functionally classified into 2 distinct pathways. These 2 distinct functional pathways deal with recognition of sound ("what" pathway) and the directional attributes of the sound ("where" pathway). Lesions of higher auditory pathways produce distinct clinical features. Clinical bedside evaluation of auditory neglect is often difficult because of coexisting neurological deficits and the binaural nature of auditory inputs. In addition, auditory neglect and auditory extinction may show varying degrees of overlap, which makes the assessment even harder. Shielding one ear from the other as well as separating the ear from space is therefore critical for accurate assessment of auditory neglect. This can be achieved by use of specialized auditory tests (dichotic tasks and sound localization tests) for accurate interpretation of deficits. Herein, we have reviewed auditory neglect with an emphasis on the functional anatomy, clinical evaluation, and basic principles of specialized auditory tests.

  18. The Power Cues (权力线索)

    Institute of Scientific and Technical Information of China (English)

    魏秋江

    2012-01-01

    Power cues are the various kinds of information on which people rely to judge power, and they can predict people's thinking and behavior. Besides directly influencing the perception of power in the form of visual and auditory stimuli, power cues can also influence power judgments indirectly, through people's mental representations of power in space and number. The specific effects of the various power cues remain disputed. Scholars have begun to address the verification, classification, and standardization of existing cues, to validate them from a physiological perspective, and to search for new power cues.

    Power cues are the internal and external stimuli that people utilize to judge the power of others and themselves. Recognizing people's power is a basic interaction in social and organizational life, which reduces the likelihood of conflict within and between groups and supports the effective assignment of resources. Recognizing power is also important for self-reinforcement and self-definition. Power cues are not only statements of targets' power but can also be used to predict people's minds and behaviors. Generally speaking, there are two kinds of encoding, visual and auditory, for the input information. Visual encoding includes appearance, such as the formation of the face, and behaviors, especially non-verbal behaviors, which often occur without conscious control but indicate people's power more exactly. Auditory encoding includes several parameters of sound, such as formant dispersion (Df), fundamental frequency (F0), variation in F0, intensity, and utterance duration. Some kinds of messages, such as semantic content, differ between the two channels and connect with power through higher levels of cognition. From these three viewpoints, more cues remain to be explored. Surprisingly, there is another odd factor, i.e., gender. Research related to it reveals a diversity of results, so gender is more of a moderator than a definite power cue, which calls for more attention to interaction effects. Besides, the mental representation of power, which involves mental simulation of space

  19. Early visual deprivation severely compromises the auditory sense of space in congenitally blind children.

    Science.gov (United States)

    Vercillo, Tiziana; Burr, David; Gori, Monica

    2016-06-01

    A recent study has shown that congenitally blind adults, who have never had visual experience, are impaired on an auditory spatial bisection task (Gori, Sandini, Martinoli, & Burr, 2014). In this study we investigated how thresholds for auditory spatial bisection and auditory discrimination develop with age in sighted and congenitally blind children (9 to 14 years old). Children performed 2 spatial tasks (minimum audible angle and space bisection) and 1 temporal task (temporal bisection). There was no impairment in the temporal task for blind children but, like adults, they showed severely compromised thresholds for spatial bisection. Interestingly, the blind children also showed lower precision in judging minimum audible angle. These results confirm the adult study and go on to suggest that even simpler auditory spatial tasks are compromised in children, and that this capacity recovers over time. (PsycINFO Database Record) PMID:27228448

  1. Representation of lateralization and tonotopy in primary versus secondary human auditory cortex

    NARCIS (Netherlands)

    Langers, Dave R. M.; Backes, Walter H.; van Dijk, Pim

    2007-01-01

    Functional MRI was performed to investigate differences in the basic functional organization of the primary and secondary auditory cortex regarding preferred stimulus lateralization and frequency. A modified sparse acquisition scheme was used to spatially map the characteristics of the auditory cortex …

  2. Auditory processing in the brainstem and audiovisual integration in humans studied with fMRI

    NARCIS (Netherlands)

    Slabu, Lavinia Mihaela

    2008-01-01

    Functional magnetic resonance imaging (fMRI) is a powerful technique because of its high spatial resolution and noninvasiveness. Applications of fMRI to the auditory pathway remain a challenge due to the intense acoustic scanner noise of approximately 110 dB SPL. The auditory system cons…

  3. Perception of aircraft Deviation Cues

    Science.gov (United States)

    Martin, Lynne; Azuma, Ronald; Fox, Jason; Verma, Savita; Lozito, Sandra

    2005-01-01

    To begin to address the need for new displays, required by a future airspace concept to support new roles that will be assigned to flight crews, a study of potentially informative display cues was undertaken. Two cues were tested on a simple plan display: aircraft trajectory and flight corridor. Of particular interest was the speed and accuracy with which participants could detect an aircraft deviating outside its flight corridor. Presence of the trajectory cue significantly reduced participants' reaction time to a deviation, while the flight corridor cue did not. Although the effect was not significant, the flight corridor cue appeared to be related to the accuracy of participants' judgments rather than to their speed. As this is the second of a series of studies, these issues will be addressed further in future studies.
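    The deviation the participants had to detect reduces to a geometric test: how far the aircraft symbol lies from the corridor centerline. A minimal sketch of such a check (the segment geometry and the 4 nmi half-width are hypothetical illustrations, not values from the study):

```python
import math

def lateral_deviation(pos, a, b):
    """Distance from aircraft position `pos` to the corridor centerline,
    modeled as the 2D plan-view segment a->b (units e.g. nautical miles)."""
    (px, py), (ax, ay), (bx, by) = pos, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))  # clamp projection onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

HALF_WIDTH = 4.0  # hypothetical corridor half-width, nmi
deviating = lateral_deviation((3.0, 5.0), (0.0, 0.0), (10.0, 0.0)) > HALF_WIDTH
print(deviating)  # True: 5 nmi off a corridor with 4 nmi half-width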

  4. Auditory pathways: anatomy and physiology.

    Science.gov (United States)

    Pickles, James O

    2015-01-01

    This chapter outlines the anatomy and physiology of the auditory pathways. After a brief analysis of the external, middle ears, and cochlea, the responses of auditory nerve fibers are described. The central nervous system is analyzed in more detail. A scheme is provided to help understand the complex and multiple auditory pathways running through the brainstem. The multiple pathways are based on the need to preserve accurate timing while extracting complex spectral patterns in the auditory input. The auditory nerve fibers branch to give two pathways, a ventral sound-localizing stream, and a dorsal mainly pattern recognition stream, which innervate the different divisions of the cochlear nucleus. The outputs of the two streams, with their two types of analysis, are progressively combined in the inferior colliculus and onwards, to produce the representation of what can be called the "auditory objects" in the external world. The progressive extraction of critical features in the auditory stimulus in the different levels of the central auditory system, from cochlear nucleus to auditory cortex, is described. In addition, the auditory centrifugal system, running from cortex in multiple stages to the organ of Corti of the cochlea, is described.

  5. Proportional spike-timing precision and firing reliability underlie efficient temporal processing of periodicity and envelope shape cues.

    Science.gov (United States)

    Zheng, Y; Escabí, M A

    2013-08-01

    Temporal sound cues are essential for sound recognition, pitch, rhythm, and timbre perception, yet how auditory neurons encode such cues is subject of ongoing debate. Rate coding theories propose that temporal sound features are represented by rate tuned modulation filters. However, overwhelming evidence also suggests that precise spike timing is an essential attribute of the neural code. Here we demonstrate that single neurons in the auditory midbrain employ a proportional code in which spike-timing precision and firing reliability covary with the sound envelope cues to provide an efficient representation of the stimulus. Spike-timing precision varied systematically with the timescale and shape of the sound envelope and yet was largely independent of the sound modulation frequency, a prominent cue for pitch. In contrast, spike-count reliability was strongly affected by the modulation frequency. Spike-timing precision extends from sub-millisecond for brief transient sounds up to tens of milliseconds for sounds with slow-varying envelope. Information theoretic analysis further confirms that spike-timing precision depends strongly on the sound envelope shape, while firing reliability was strongly affected by the sound modulation frequency. Both the information efficiency and total information were limited by the firing reliability and spike-timing precision in a manner that reflected the sound structure. This result supports a temporal coding strategy in the auditory midbrain where proportional changes in spike-timing precision and firing reliability can efficiently signal shape and periodicity temporal cues.

  6. Neural Representation of Concurrent Vowels in Macaque Primary Auditory Cortex.

    Science.gov (United States)

    Fishman, Yonatan I; Micheyl, Christophe; Steinschneider, Mitchell

    2016-01-01

    Successful speech perception in real-world environments requires that the auditory system segregate competing voices that overlap in frequency and time into separate streams. Vowels are major constituents of speech and are composed of frequencies (harmonics) that are integer multiples of a common fundamental frequency (F0). The pitch and identity of a vowel are determined by its F0 and spectral envelope (formant structure), respectively. When two spectrally overlapping vowels differing in F0 are presented concurrently, they can be readily perceived as two separate "auditory objects" with pitches at their respective F0s. A difference in pitch between two simultaneous vowels provides a powerful cue for their segregation, which in turn, facilitates their individual identification. The neural mechanisms underlying the segregation of concurrent vowels based on pitch differences are poorly understood. Here, we examine neural population responses in macaque primary auditory cortex (A1) to single and double concurrent vowels (/a/ and /i/) that differ in F0 such that they are heard as two separate auditory objects with distinct pitches. We find that neural population responses in A1 can resolve, via a rate-place code, lower harmonics of both single and double concurrent vowels. Furthermore, we show that the formant structures, and hence the identities, of single vowels can be reliably recovered from the neural representation of double concurrent vowels. We conclude that A1 contains sufficient spectral information to enable concurrent vowel segregation and identification by downstream cortical areas.
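    The rate-place account rests on the fact that the lower harmonics of two complexes with different F0s fall at distinct frequencies and can therefore be resolved along a frequency axis. A toy illustration with an FFT standing in for the tonotopic axis (the F0s, harmonic counts, and threshold are arbitrary choices, not the study's stimuli):

```python
import numpy as np

fs = 16000
t = np.arange(0, 0.5, 1 / fs)  # 0.5 s of signal

def harmonic_complex(f0, n_harmonics=10):
    """Sum of equal-amplitude harmonics at integer multiples of f0."""
    return sum(np.sin(2 * np.pi * f0 * k * t) for k in range(1, n_harmonics + 1))

# Two "vowels" with different F0s, presented concurrently.
f0_a, f0_b = 100.0, 126.0
mixture = harmonic_complex(f0_a) + harmonic_complex(f0_b)

spectrum = np.abs(np.fft.rfft(mixture))
freqs = np.fft.rfftfreq(len(mixture), 1 / fs)

def peak_near(f, tol=2.0):
    """Is there a prominent spectral peak within tol Hz of frequency f?"""
    band = (freqs > f - tol) & (freqs < f + tol)
    return spectrum[band].max() > spectrum.mean() * 10

# The low harmonics of both F0s are individually resolvable in the mixture.
print(all(peak_near(k * f0_a) for k in (1, 2, 3)))  # True
print(all(peak_near(k * f0_b) for k in (1, 2, 3)))  # True
```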

  7. Tuned with a Tune: Talker Normalization via General Auditory Processes.

    Science.gov (United States)

    Laing, Erika J C; Liu, Ran; Lotto, Andrew J; Holt, Lori L

    2012-01-01

    Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker's speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS) of a talker's speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences' LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by non-speech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization. PMID:22737140

  8. Tuned with a tune: Talker normalization via general auditory processes

    Directory of Open Access Journals (Sweden)

    Erika J C Laing

    2012-06-01

    Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker's speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS) of a talker's speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences' LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results, suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
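    The LTAS referred to in these records is simply a power spectrum averaged over a long stretch of signal. A sketch of how one might compute and compare LTAS for two "talkers" (here stood in for by noise with different spectral tilts; Welch's method is one common estimator, not necessarily the one the authors used):

```python
import numpy as np
from scipy.signal import welch, lfilter

def long_term_average_spectrum(x, fs, seg_len=1024):
    """LTAS estimate: power spectrum averaged over the whole signal
    via Welch's method, returned in dB (arbitrary reference)."""
    freqs, psd = welch(x, fs=fs, nperseg=seg_len)
    return freqs, 10.0 * np.log10(psd + 1e-12)

fs = 16000
rng = np.random.default_rng(0)
noise = rng.standard_normal(2 * fs)            # 2 s of white noise
talker_a = noise                               # flat LTAS
talker_b = lfilter([1.0], [1.0, -0.9], noise)  # low-pass: "darker" LTAS

fa, la = long_term_average_spectrum(talker_a, fs)
fb, lb = long_term_average_spectrum(talker_b, fs)

# Spectral tilt: mean level below 500 Hz minus mean level above 4 kHz.
low, high = fa < 500, fa > 4000
tilt_a = la[low].mean() - la[high].mean()
tilt_b = lb[low].mean() - lb[high].mean()
print(tilt_b > tilt_a)  # True: talker B's LTAS emphasizes low frequencies
```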

  9. Feasibility of external rhythmic cueing with the Google Glass for improving gait in people with Parkinson's disease.

    Science.gov (United States)

    Zhao, Yan; Nonnekes, Jorik; Storcken, Erik J M; Janssen, Sabine; van Wegen, Erwin E H; Bloem, Bastiaan R; Dorresteijn, Lucille D A; van Vugt, Jeroen P P; Heida, Tjitske; van Wezel, Richard J A

    2016-06-01

    New mobile technologies like smartglasses can deliver external cues that may improve gait in people with Parkinson's disease in their natural environment. However, the potential of these devices must first be assessed in controlled experiments. Therefore, we evaluated rhythmic visual and auditory cueing in a laboratory setting with a custom-made application for the Google Glass. Twelve participants (mean age = 66.8; mean disease duration = 13.6 years) were tested at end of dose. We compared several key gait parameters (walking speed, cadence, stride length, and stride length variability) and freezing of gait for three types of external cues (metronome, flashing light, and optic flow) and a control condition (no-cue). For all cueing conditions, the subjects completed several walking tasks of varying complexity. Seven inertial sensors attached to the feet, legs and pelvis captured motion data for gait analysis. Two experienced raters scored the presence and severity of freezing of gait using video recordings. User experience was evaluated through a semi-open interview. During cueing, a more stable gait pattern emerged, particularly on complicated walking courses; however, freezing of gait did not significantly decrease. The metronome was more effective than rhythmic visual cues and most preferred by the participants. Participants were overall positive about the usability of the Google Glass and willing to use it at home. Thus, smartglasses like the Google Glass could be used to provide personalized mobile cueing to support gait; however, in its current form, auditory cues seemed more effective than rhythmic visual cues.

  11. Owl monkeys (Aotus nigriceps and A. infulatus) follow routes instead of food-related cues during foraging in captivity.

    Directory of Open Access Journals (Sweden)

    Renata Souza da Costa

    Full Text Available Foraging at night imposes different challenges from those faced during daylight, including the reliability of sensory cues. Owl monkeys (Aotus spp. are ideal models among anthropoids to study the information used during foraging at low light levels because they are unique by having a nocturnal lifestyle. Six Aotus nigriceps and four A. infulatus individuals distributed into five enclosures were studied for testing their ability to rely on olfactory, visual, auditory, or spatial and quantitative information for locating food rewards and for evaluating the use of routes to navigate among five visually similar artificial feeding boxes mounted in each enclosure. During most experiments only a single box was baited with a food reward in each session. The baited box changed randomly throughout the experiment. In the spatial and quantitative information experiment there were two baited boxes varying in the amount of food provided. These baited boxes remained the same throughout the experiment. A total of 45 sessions (three sessions per night during 15 consecutive nights per enclosure was conducted in each experiment. Only one female showed a performance suggestive of learning of the usefulness of sight to locate the food reward in the visual information experiment. Subjects showed a chance performance in the remaining experiments. All owl monkeys showed a preference for one box or a subset of boxes to inspect upon the beginning of each experimental session and consistently followed individual routes among feeding boxes.

  12. Resizing Auditory Communities

    DEFF Research Database (Denmark)

    Kreutzfeldt, Jacob

    2012-01-01

    Heard through the ears of the Canadian composer and music teacher R. Murray Schafer, the ideal auditory community had the shape of a village. Schafer's work with the World Soundscape Project in the 1970s represents an attempt to interpret contemporary environments through musical and auditory … of sound as an active component in shaping urban environments. As urban conditions spread globally, new scales, shapes and forms of communities appear and call for new distinctions and models in the study and representation of sonic environments; particularly so, since urban environments are increasingly … presents some terminologies for mapping urban environments through their sonic configuration. Such probing into the practices of acoustic territorialisation may direct attention to some of the conflicting and disharmonious interests defining public inclusive domains. The paper investigates the concept …

  13. The influence of presentation method on auditory length perception

    DEFF Research Database (Denmark)

    Kirkwood, Brent Christopher

    2005-01-01

    Humans are capable of hearing the lengths of wooden rods dropped onto hard floors. In an attempt to understand the influence of the stimulus presentation method for testing this kind of everyday listening task, listener performance was compared for three presentation methods in an auditory length-estimation experiment. A comparison of the length-estimation accuracy for the three presentation methods indicates that the choice of presentation method is important for maintaining realism and for maintaining the acoustic cues utilized by listeners in perceiving length.

  15. Event-related potentials in response to 3-D auditory stimuli.

    Science.gov (United States)

    Fuchigami, Tatsuo; Okubo, Osami; Fujita, Yukihiko; Kohira, Ryutaro; Arakawa, Chikako; Endo, Ayumi; Haruyama, Wakako; Imai, Yuki; Mugishima, Hideo

    2009-09-01

    To evaluate auditory spatial cognitive function, age correlations for event-related potentials (ERPs) in response to auditory stimuli with a Doppler effect were studied in normal children. A sound with a Doppler effect is perceived as a moving audio image. A total of 99 normal subjects (age range, 4-21 years) were tested. In the task-relevant oddball paradigm, P300 and key-press reaction time were elicited using auditory stimuli (1000 Hz fixed and enlarged tones with a Doppler effect). From the age of 4 years, the P300 latency for the enlarged tone with a Doppler effect shortened more rapidly with age than did the P300 latency for tone-pips, and the latencies for the different conditions became similar towards the late teens. The P300 of auditory stimuli with a Doppler effect may be used to evaluate auditory spatial cognitive function in children.
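    The moving audio image in these stimuli relies on the Doppler effect: for a stationary listener, a source approaching along the line of sight at speed v is heard at frequency f' = f * c / (c - v). A minimal sketch (the 1000 Hz tone and 20 m/s speed are illustrative values, not the study's stimulus parameters):

```python
def doppler_shift(f_source, v_source, c=343.0):
    """Frequency heard by a stationary listener from a source moving at
    v_source m/s along the line of sight (positive = approaching)."""
    return f_source * c / (c - v_source)

print(round(doppler_shift(1000.0, 20.0)))   # approaching: pitch shifted up
print(round(doppler_shift(1000.0, -20.0)))  # receding: pitch shifted down
```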

  16. Reduced recruitment of orbitofrontal cortex to human social chemosensory cues in social anxiety

    OpenAIRE

    Zhou, Wen; HOU, PING; Zhou, Yuxiang; Chen, Denise

    2010-01-01

    Social anxiety refers to the prevalent and debilitating experience of fear and anxiety of being scrutinized in social situations. It originates from both learned (e.g. adverse social conditioning) and innate (e.g. shyness) factors. Research on social anxiety has traditionally focused on negative emotions induced by visual and auditory social cues in socially anxious clinical populations, and posits a dysfunctional orbitofrontal-amygdala circuit as a primary etiological mechanism. Yet as a tra...

  17. Semantic Framing of Speech : Emotional and Topical Cues in Perception of Poorly Specified Speech

    OpenAIRE

    Lidestam, Björn

    2003-01-01

    The general aim of this thesis was to test the effects of paralinguistic (emotional) and prior contextual (topical) cues on perception of poorly specified visual, auditory, and audiovisual speech. The specific purposes were to (1) examine if facially displayed emotions can facilitate speechreading performance; (2) to study the mechanism for such facilitation; (3) to map information-processing factors that are involved in processing of poorly specified speech; and (4) to present a comprehensiv...

  18. The owl’s cochlear nuclei process different sound localization cues

    OpenAIRE

    Konishi, Masakazu; Sullivan, W. Edward; Takahashi, Terry

    1985-01-01

    This paper discusses how the barn owl's brain stem auditory pathway is divided into two physiologically and anatomically segregated channels for separate processing of interaural phase and intensity cues for sound localization. The paper also points out the power of the "downstream" approach, by which the emergence of a higher-order neuron's stimulus selectivity can be traced through lower-order stations.
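    The interaural phase (time) cue processed by one of the owl's channels can be illustrated with a standard signal-processing estimate: cross-correlate the two ear signals and take the lag of the peak. This is a textbook sketch with made-up signals, not the owl's neural mechanism:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference (ITD) in seconds by
    cross-correlation; positive ITD means the left ear leads."""
    corr = np.correlate(right, left, mode="full")
    lags = np.arange(-(len(left) - 1), len(right))
    return lags[np.argmax(corr)] / fs

fs = 100_000                     # high sample rate to resolve fine ITDs
t = np.arange(0, 0.01, 1 / fs)   # 10 ms tone burst
tone = np.sin(2 * np.pi * 2000 * t)
delay = 25                       # samples = 250 microseconds
left = np.pad(tone, (0, delay))  # left ear hears the tone first
right = np.pad(tone, (delay, 0))
itd_us = estimate_itd(left, right, fs) * 1e6
print(itd_us)                    # close to 250 microseconds
```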

  19. Compression of auditory space during forward self-motion.

    Directory of Open Access Journals (Sweden)

    Wataru Teramoto

    BACKGROUND: Spatial inputs from the auditory periphery can change with movements of the head or whole body relative to the sound source. Nevertheless, humans perceive a stable auditory environment and react appropriately to a sound source. This suggests that the inputs are reinterpreted in the brain while being integrated with information on the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation. METHODOLOGY/PRINCIPAL FINDINGS: Participants were passively transported forward or backward at constant accelerations using a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented during the self-motion from one of the loudspeakers when the listener's physical coronal plane reached the location of one of the speakers (null point). In Experiments 1 and 2, the participants indicated whether the sound was presented forward or backward of their subjective coronal plane. The results showed that the sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion, and that the magnitude of the displacement increased with increasing acceleration. Experiment 3 investigated the structure of auditory space in the traveling direction during forward self-motion. The sounds were presented at various distances from the null point, and the participants indicated the perceived sound location by pointing with a rod. All the sounds that were actually located in the traveling direction were perceived as being biased towards the null point. CONCLUSIONS/SIGNIFICANCE: These results suggest a distortion of auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory spatial …

  20. The processing of visual and auditory information for reaching movements.

    Science.gov (United States)

    Glazebrook, Cheryl M; Welsh, Timothy N; Tremblay, Luc

    2016-09-01

    Presenting target and non-target information in different modalities influences target localization if the non-target is within the spatiotemporal limits of perceptual integration. When using auditory and visual stimuli, the influence of a visual non-target on auditory target localization is greater than the reverse. It is not known, however, whether or how such perceptual effects extend to goal-directed behaviours. To gain insight into how audio-visual stimuli are integrated for motor tasks, the kinematics of reaching movements towards visual or auditory targets with or without a non-target in the other modality were examined. When present, the simultaneously presented non-target could be spatially coincident, to the left, or to the right of the target. Results revealed that auditory non-targets did not influence reaching trajectories towards a visual target, whereas visual non-targets influenced trajectories towards an auditory target. Interestingly, the biases induced by visual non-targets were present early in the trajectory and persisted until movement end. Subsequent experimentation indicated that the magnitude of the biases was equivalent whether participants performed a perceptual or motor task, whereas variability was greater for the motor versus the perceptual tasks. We propose that visually induced trajectory biases were driven by the perceived mislocation of the auditory target, which in turn affected both the movement plan and subsequent control of the movement. Such findings provide further evidence of the dominant role visual information processing plays in encoding spatial locations as well as planning and executing reaching action, even when reaching towards auditory targets. PMID:26253323

  1. Learning auditory space: generalization and long-term effects.

    Directory of Open Access Journals (Sweden)

    Catarina Mendonça

    BACKGROUND: Previous findings have shown that humans can learn to localize with altered auditory space cues. Here we analyze such learning processes and their effects, up to one month later, on both localization accuracy and sound externalization. Subjects were trained and retested, focusing on the effects of stimulus type in learning, stimulus type in localization, stimulus position, previous experience, externalization levels, and time. METHOD: We trained listeners in azimuth and elevation discrimination in two experiments. Half participated in the azimuth experiment first and half in the elevation experiment first. In each experiment, half were trained with speech sounds and half with white noise. Retests were performed at several intervals: just after training and one hour, one day, one week and one month later. In a control condition, we tested the effect of systematic retesting over time with post-tests only after training and either one day, one week, or one month later. RESULTS: With training, all participants lowered their localization errors. This benefit was still present one month after training. Participants were more accurate in the second training phase, revealing an effect of previous experience on a different task. Training with white noise led to better results than training with speech sounds. Moreover, the training benefit generalized to untrained stimulus-position pairs. Throughout the post-tests, externalization levels increased. In the control condition, the long-term improvement in localization was not lower without additional contact with the trained sounds, but externalization levels were lower. CONCLUSION: Our findings suggest that humans adapt easily to altered auditory space cues and that such adaptation spreads to untrained positions and sound types. We propose that such learning depends on all available cues, but each cue type might be learned and retrieved differently. The process of localization learning is global, not limited to …

  2. Coding of auditory space

    OpenAIRE

    Konishi, Masakazu

    2003-01-01

    Behavioral, anatomical, and physiological approaches can be integrated in the study of sound localization in barn owls. Space representation in owls provides a useful example for discussion of place and ensemble coding. Selectivity for space is broad and ambiguous in low-order neurons. Parallel pathways for binaural cues and for different frequency bands converge on high-order space-specific neurons, which encode space more precisely. An ensemble of broadly tuned place-coding neurons may conv...

  3. Auditory temporal resolution and integration - stages of analyzing time-varying sounds

    DEFF Research Database (Denmark)

    Pedersen, Benjamin

    2007-01-01

    … much is still unknown about how temporal information is analyzed and represented in the auditory system. The PhD lecture concerns the topic of temporal processing in hearing, and the topic is approached via four different listening experiments designed to probe several aspects of temporal processing … Effects such as attention seem to play an important role in loudness integration, and further, it will be demonstrated that the auditory system can rely on temporal cues at a much finer level of detail than predicted by existing models (temporal details in the time range of 60 µs can …

  4. Auditory and non-auditory effects of noise on health

    NARCIS (Netherlands)

    Basner, M.; Babisch, W.; Davis, A.; Brink, M.; Clark, C.; Janssen, S.A.; Stansfeld, S.

    2013-01-01

    Noise is pervasive in everyday life and can cause both auditory and non-auditory health effects. Noise-induced hearing loss remains highly prevalent in occupational settings, and is increasingly caused by social noise exposure (e.g., through personal music players). Our understanding of molecular mec…

  5. Evaluation of multimodal ground cues

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Lecuyer, Anatole; Serafin, Stefania;

    2012-01-01

    This chapter presents an array of results on the perception of ground surfaces via multiple sensory modalities, with special attention to non-visual perceptual cues, notably those arising from audition and haptics, as well as interactions between them. It also reviews approaches to combining...... synthetic multimodal cues, from vision, haptics, and audition, in order to realize virtual experiences of walking on simulated ground surfaces or other features....

  6. Midbrain auditory selectivity to natural sounds.

    Science.gov (United States)

    Wohlgemuth, Melville J; Moss, Cynthia F

    2016-03-01

    This study investigated auditory stimulus selectivity in the midbrain superior colliculus (SC) of the echolocating bat, an animal that relies on hearing to guide its orienting behaviors. Multichannel, single-unit recordings were taken across laminae of the midbrain SC of the awake, passively listening big brown bat, Eptesicus fuscus. Species-specific frequency-modulated (FM) echolocation sound sequences with dynamic spectrotemporal features served as acoustic stimuli along with artificial sound sequences matched in bandwidth, amplitude, and duration but differing in spectrotemporal structure. Neurons in dorsal sensory regions of the bat SC responded selectively to elements within the FM sound sequences, whereas neurons in ventral sensorimotor regions showed broad response profiles to natural and artificial stimuli. Moreover, a generalized linear model (GLM) constructed on responses in the dorsal SC to artificial linear FM stimuli failed to predict responses to natural sounds and vice versa, but the GLM produced accurate response predictions in ventral SC neurons. This result suggests that auditory selectivity in the dorsal extent of the bat SC arises through nonlinear mechanisms, which extract species-specific sensory information. Importantly, auditory selectivity appeared only in responses to stimuli containing the natural statistics of acoustic signals used by the bat for spatial orientation (sonar vocalizations), offering support for the hypothesis that sensory selectivity enables rapid species-specific orienting behaviors. The results of this study are the first, to our knowledge, to show auditory spectrotemporal selectivity to natural stimuli in SC neurons and serve to inform a more general understanding of mechanisms guiding sensory selectivity for natural, goal-directed orienting behaviors.
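The GLM referred to above predicts a neuron's expected spike count from stimulus features through a linear filter followed by an exponential nonlinearity (the standard Poisson-GLM form). The sketch below shows only that prediction step; the weights, bias, and stimulus values are hypothetical illustrations, not the study's fitted parameters:

```python
import math

def glm_predict(stimulus, weights, bias):
    """Poisson GLM prediction: expected spike count = exp(w . s + b)."""
    drive = sum(w * s for w, s in zip(weights, stimulus)) + bias
    return math.exp(drive)

# Hypothetical spectrotemporal feature values for one stimulus frame
rate = glm_predict([0.2, 1.0, 0.0], [0.5, 1.2, -0.3], bias=-1.0)
```

A model of this linear-exponential form can only generalize from artificial FM sweeps to natural calls if the neuron's transformation is itself captured by a linear filter plus a fixed nonlinearity, which is why the cross-prediction failure in dorsal SC is taken as evidence for nonlinear mechanisms.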

  7. Combining cues while avoiding perceptual conflicts

    NARCIS (Netherlands)

    Hogervorst, M.A.; Brenner, E.

    2004-01-01

    A common assumption in cue combination models is that small discrepancies between cues are due to the limited resolution of the individual cues. Whenever this assumption holds, information from the separate cues can best be combined to give a single, more accurate estimate of the property of interest

  8. Human Perception of Ambiguous Inertial Motion Cues

    Science.gov (United States)

    Zhang, Guan-Lu

    2010-01-01

    Human daily activities on Earth involve motions that elicit both tilt and translation components of the head (i.e. gazing and locomotion). With otolith cues alone, tilt and translation can be ambiguous since both motions can potentially displace the otolithic membrane by the same magnitude and direction. Transitions between gravity environments (i.e. Earth, microgravity and lunar) have been demonstrated to alter the functions of the vestibular system and exacerbate the ambiguity between tilt and translational motion cues. Symptoms of motion sickness and spatial disorientation can impair human performance during critical mission phases. Specifically, Space Shuttle landing records show that particular cases of tilt-translation illusions have impaired the performance of seasoned commanders. This sensorimotor condition is one of many operational risks that may have dire implications on future human space exploration missions. The neural strategy with which the human central nervous system distinguishes ambiguous inertial motion cues remains the subject of intense research. A prevailing theory in the neuroscience field proposes that the human brain is able to formulate a neural internal model of ambiguous motion cues such that tilt and translation components can be perceptually decomposed in order to elicit the appropriate bodily response. The present work uses this theory, known as the GIF resolution hypothesis, as the framework for experimental hypothesis. Specifically, two novel motion paradigms are employed to validate the neural capacity of ambiguous inertial motion decomposition in ground-based human subjects. The experimental setup involves the Tilt-Translation Sled at the Neuroscience Laboratory of NASA JSC. This two degree-of-freedom motion system is able to tilt subjects in the pitch plane and translate the subject along the fore-aft axis. Perception data will be gathered through subject verbal reports. Preliminary analysis of perceptual data does not indicate that

  9. Partial Epilepsy with Auditory Features

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2004-07-01

    Full Text Available The clinical characteristics of 53 sporadic (S) cases of idiopathic partial epilepsy with auditory features (IPEAF) were analyzed and compared to previously reported familial (F) cases of autosomal dominant partial epilepsy with auditory features (ADPEAF) in a study at the University of Bologna, Italy.

  10. Effects of aging on peripheral and central auditory processing in rats.

    Science.gov (United States)

    Costa, Margarida; Lepore, Franco; Prévost, François; Guillemot, Jean-Paul

    2016-08-01

    Hearing loss is a hallmark sign in the elderly population. Decline in auditory perception provokes deficits in the ability to localize sound sources and reduces speech perception, particularly in noise. In addition to a loss of peripheral hearing sensitivity, changes in more complex central structures have also been demonstrated. Related to this, the present study examined the auditory directional maps in the deep layers of the superior colliculus of the rat. Hence, anesthetized Sprague-Dawley adult (10 months) and aged (22 months) rats underwent distortion product otoacoustic emission (DPOAE) testing to assess cochlear function. Then, auditory brainstem responses (ABRs) were assessed, followed by extracellular single-unit recordings to determine age-related effects on central auditory functions. DPOAE amplitude levels were decreased in aged rats although they were still present between 3.0 and 24.0 kHz. ABR level thresholds in aged rats were significantly elevated at an early (cochlear nucleus - wave II) stage in the auditory brainstem. In the superior colliculus, thresholds were increased and the tuning widths of the directional receptive fields were significantly wider. Moreover, no systematic directional spatial arrangement was present among the neurons of the aged rats, implying that the topographical organization of the auditory directional map was abolished. These results suggest that the deterioration of the auditory directional spatial map can, to some extent, be attributable to age-related dysfunction at more central, perceptual stages of auditory processing.

  11. Making the invisible visible: verbal but not visual cues enhance visual detection.

    Directory of Open Access Journals (Sweden)

    Gary Lupyan

    Full Text Available BACKGROUND: Can hearing a word change what one sees? Although visual sensitivity is known to be enhanced by attending to the location of the target, perceptual enhancements following cues to the identity of an object have been difficult to find. Here, we show that perceptual sensitivity is enhanced by verbal, but not visual cues. METHODOLOGY/PRINCIPAL FINDINGS: Participants completed an object detection task in which they made an object-presence or -absence decision to briefly-presented letters. Hearing the letter name prior to the detection task increased perceptual sensitivity (d'). A visual cue in the form of a preview of the to-be-detected letter did not. Follow-up experiments found that the auditory cuing effect was specific to validly cued stimuli. The magnitude of the cuing effect positively correlated with an individual measure of vividness of mental imagery; introducing uncertainty into the position of the stimulus did not reduce the magnitude of the cuing effect, but eliminated the correlation with mental imagery. CONCLUSIONS/SIGNIFICANCE: Hearing a word made otherwise invisible objects visible. Interestingly, seeing a preview of the target stimulus did not similarly enhance detection of the target. These results are compatible with an account in which auditory verbal labels modulate lower-level visual processing. The findings show that a verbal cue in the form of hearing a word can influence even the most elementary visual processing and inform our understanding of how language affects perception.
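The sensitivity measure used in this record, d', is the difference between the z-transformed hit rate and false-alarm rate of a yes/no detection task. A minimal sketch (the counts are hypothetical, and the 0.5 log-linear correction is one common convention, not necessarily the study's):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    Adding 0.5 to each cell (log-linear correction) keeps the
    z-transform finite when a raw rate would otherwise be 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts: a verbal cue raises hits at matched false alarms,
# so d' increases even if response bias is unchanged.
cued = d_prime(45, 5, 10, 40)
uncued = d_prime(35, 15, 10, 40)
```

Because d' separates sensitivity from response bias, an increase under verbal cuing indicates genuinely improved detection rather than a shift in willingness to respond "present."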

  12. The Effects of Age and Preoral Sensorimotor Cues on Anticipatory Mouth Movement During Swallowing

    Science.gov (United States)

    Moon, Jerald B.; Goodman, Shawn S.

    2016-01-01

    Purpose The aim of this study was to investigate the effects of preoral sensorimotor cues on anticipatory swallowing/eating-related mouth movements in older and younger adults. It was hypothesized that these cues are essential to timing anticipatory oral motor patterns, and these movements are delayed in older as compared with younger adults. Method Using a 2 × 2 repeated-measures design, eating-related lip, jaw, and hand movements were recorded from 24 healthy older (ages 70–85 years) and 24 healthy younger (ages 18–30 years) adults under 4 conditions: typical self-feeding, typical assisted feeding (proprioceptive loss), sensory-loss self-feeding (auditory and visual loss/degradation), and sensory-loss assisted feeding (loss/degradation of all cues). Results All participants demonstrated anticipatory mouth opening. The absence of proprioception delayed lip-lowering onset, and sensory loss more negatively affected offset. Given at least 1 preoral sensorimotor cue, older adults initiated movement earlier than younger adults. Conclusions Preoral sensorimotor information influences anticipatory swallowing/eating-related mouth movements, highlighting the importance of these cues. Earlier movement in older adults may be a compensation, facilitating safe swallowing given other age-related declines. Further research is needed to determine if the negative impact of cue removal may be further exacerbated in a nonhealthy system (e.g., presence of dysphagia or disease), potentially increasing swallowing- and eating-related risks. PMID:26540553

  13. The Perception of Auditory Motion.

    Science.gov (United States)

    Carlile, Simon; Leung, Johahn

    2016-01-01

    The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotational and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception. PMID:27094029

  14. Visual landmarks facilitate rodent spatial navigation in virtual reality environments

    OpenAIRE

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain areas. Virtual reality offers a unique approach to ask whether visual landmark cues alone are sufficient to improve performance in a spatial task. We ...

  15. Peripheral Auditory Mechanisms

    CERN Document Server

    Hall, J; Hubbard, A; Neely, S; Tubis, A

    1986-01-01

    How well can we model experimental observations of the peripheral auditory system? What theoretical predictions can we make that might be tested? It was with these questions in mind that we organized the 1985 Mechanics of Hearing Workshop, to bring together auditory researchers to compare models with experimental observations. The workshop forum was inspired by the very successful 1983 Mechanics of Hearing Workshop in Delft [1]. Boston University was chosen as the site of our meeting because of the Boston area's role as a center for hearing research in this country. We made a special effort at this meeting to attract students from around the world, because without students this field will not progress. Financial support for the workshop was provided in part by grant BNS-8412878 from the National Science Foundation. Modeling is a traditional strategy in science and plays an important role in the scientific method. Models are the bridge between theory and experiment. They test the assumptions made in experim...

  16. Unimodal and crossmodal gradients of spatial attention

    DEFF Research Database (Denmark)

    Föcker, J.; Hötting, K.; Gondan, Matthias;

    2010-01-01

    Behavioral and event-related potential (ERP) studies have shown that spatial attention is gradually distributed around the center of the attentional focus. The present study compared uni- and crossmodal gradients of spatial attention to investigate whether the orienting of auditory and visual...... spatial attention is based on modality-specific or supramodal representations of space. Auditory and visual stimuli were presented from five speaker locations positioned in the right hemifield. Participants had to attend to the innermost or outermost right position in order to detect either visual...... or auditory deviant stimuli. Detection rates and event-related potentials (ERPs) indicated that spatial attention is distributed as a gradient. Unimodal spatial ERP gradients correlated with the spatial resolution of the modality. Crossmodal spatial gradients were always broader than the corresponding...

  17. Enhanced representation of spectral contrasts in the primary auditory cortex

    Directory of Open Access Journals (Sweden)

    Nicolas Catz

    2013-06-01

    Full Text Available The role of early auditory processing may be to extract some elementary features from an acoustic mixture in order to organize the auditory scene. To accomplish this task, the central auditory system may rely on the fact that sensory objects are often composed of spectral edges, i.e. regions where the stimulus energy changes abruptly over frequency. The processing of acoustic stimuli may benefit from a mechanism enhancing the internal representation of spectral edges. While the visual system is thought to rely heavily on this mechanism (enhancing spatial edges), it is still unclear whether a related process plays a significant role in audition. We investigated the cortical representation of spectral edges, using acoustic stimuli composed of multi-tone pips whose time-averaged spectral envelope contained suppressed or enhanced regions. Importantly, the stimuli were designed such that neural response properties could be assessed as a function of stimulus frequency during stimulus presentation. Our results suggest that the representation of acoustic spectral edges is enhanced in the auditory cortex, and that this enhancement is sensitive to the characteristics of the spectral contrast profile, such as depth, sharpness and width. Spectral edges are maximally enhanced for sharp contrast and large depth. Cortical activity was also suppressed at frequencies within the suppressed region. Of note, the suppression of firing was larger at frequencies near the lower edge of the suppressed region than at the upper edge. Overall, the present study gives critical insights into the processing of spectral contrasts in the auditory system.
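Edge enhancement of this kind is classically modeled with lateral inhibition: each frequency channel's output is reduced by a fraction of its neighbors' activity, so responses overshoot just above a spectral edge and undershoot just below it. A toy sketch (the channel values and inhibition strength are illustrative only, not a model from the study):

```python
def edge_enhance(profile, inhibition=0.5):
    """Lateral-inhibition sketch: each channel's output is its input
    minus a fraction of its two neighbors' average, which sharpens
    transitions (spectral edges) while flattening uniform regions."""
    out = []
    for i, x in enumerate(profile):
        left = profile[i - 1] if i > 0 else x
        right = profile[i + 1] if i < len(profile) - 1 else x
        out.append(x - inhibition * (left + right) / 2)
    return out

# A step between two flat regions (a spectral edge) gains an undershoot
# below and an overshoot above the edge:
enhanced = edge_enhance([1, 1, 1, 3, 3, 3])  # [0.5, 0.5, 0.0, 2.0, 1.5, 1.5]
```

In the toy output, the dip at the last low channel and the peak at the first high channel exaggerate the edge relative to the flat interiors, the same qualitative signature the recordings report.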

  18. Cue-elicited reward-seeking requires extracellular signal-regulated kinase activation in the nucleus accumbens.

    Science.gov (United States)

    Shiflett, Michael W; Martini, Ross P; Mauna, Jocelyn C; Foster, Rebecca L; Peet, Eloise; Thiels, Edda

    2008-02-01

    The motivation to seek out rewards can come under the control of stimuli associated with reward delivery. The ability of cues to motivate reward-seeking behavior depends on the nucleus accumbens (NAcc). The molecular mechanisms in the NAcc that underlie the ability of a cue to motivate reward-seeking are not well understood. We examined whether extracellular signal-regulated kinase (ERK), an important intracellular signaling pathway in learning and memory, has a role in these motivational processes. We first examined p42 ERK (ERK2) activation in the NAcc after rats were trained to associate an auditory stimulus with food delivery and found that, as a consequence of training, presentation of the auditory cue itself was sufficient to increase ERK2 activation in the NAcc. To examine whether inhibition of ERK in the NAcc prevents cue-induced reward-seeking, we infused an inhibitor of ERK, U0126, into the NAcc before assessing rats' instrumental responding in the presence versus absence of the conditioned cue. We found that, whereas vehicle-infused rats showed increased instrumental responding during cue presentation, rats infused with U0126 showed a profound impairment in cue-induced instrumental responding. In contrast, intra-NAcc U0126 infusion had no effect on rats' food-reinforced instrumental responding or their ability to execute conditioned approach behavior. Our results demonstrate learning-related changes in ERK signaling in the NAcc, and that disruption of ERK activation in this structure interferes with the incentive-motivational effects of conditioned stimuli. The molecular mechanisms described here may have implications for cue-elicited drug craving after repeated exposure to drugs of abuse.

  19. Moving Objects in the Barn Owl's Auditory World.

    Science.gov (United States)

    Langemann, Ulrike; Krumm, Bianca; Liebner, Katharina; Beutelmann, Rainer; Klump, Georg M

    2016-01-01

    Barn owls are keen hunters of moving prey. They have evolved an auditory system with impressive anatomical and physiological specializations for localizing their prey. Here we present behavioural data on the owl's sensitivity for discriminating acoustic motion direction in azimuth that, for the first time, allow a direct comparison of neuronal and perceptual sensitivity for acoustic motion in the same model species. We trained two birds to report a change in motion direction within a series of repeating wideband noise stimuli. For any trial the starting point, motion direction, velocity (53-2400°/s), duration (30-225 ms) and angular range (12-72°) of the noise sweeps were randomized. Each test stimulus had a motion direction being opposite to that of the reference stimuli. Stimuli were presented in the frontal or the lateral auditory space. The angular extent of the motion had a large effect on the owl's discrimination sensitivity allowing a better discrimination for a larger angular range of the motion. In contrast, stimulus velocity or stimulus duration had a smaller, although significant effect. Overall there was no difference in the owls' behavioural performance between "inward" noise sweeps (moving from lateral to frontal) compared to "outward" noise sweeps (moving from frontal to lateral). The owls did, however, respond more often to stimuli with changing motion direction in the frontal compared to the lateral space. The results of the behavioural experiments are discussed in relation to the neuronal representation of motion cues in the barn owl auditory midbrain. PMID:27080662

  20. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers

    International Nuclear Information System (INIS)

    Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and 18FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. 
Studies to evaluate the clinical significance of methylphenidate's blunting of cue-induced limbic inhibition

  1. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers.

    Energy Technology Data Exchange (ETDEWEB)

    Volkow, N.D.; Wang, G.; Volkow, N.D.; Wang, G.-J.; Tomasi, D.; Telang, F.; Fowler, J.S.; Pradhan, K.; Jayne, M.; Logan, J.; Goldstein, R.Z.; Alia-Klein, N.; Wong, C.T.

    2010-07-01

    Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and {sup 18}FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. Studies to evaluate the clinical significance of methylphenidate's blunting of cue

  2. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers.

    Directory of Open Access Journals (Sweden)

    Nora D Volkow

    Full Text Available Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and (18)FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. Studies to evaluate the clinical significance of methylphenidate's blunting of cue-induced limbic

  3. The acoustic and perceptual cues affecting melody segregation for listeners with a cochlear implant.

    Directory of Open Access Journals (Sweden)

    Jeremy Marozeau

    2013-11-01

    Full Text Available Our ability to listen selectively to single sound sources in complex auditory environments is termed 'auditory stream segregation.' This ability is affected by peripheral disorders such as hearing loss, as well as by plasticity in central processing such as occurs with musical training. Brain plasticity induced by musical training can enhance the ability to segregate sound, leading to improvements in a variety of auditory abilities. The melody segregation ability of 12 cochlear-implant recipients was tested using a new method to determine the perceptual distance needed to segregate a simple 4-note melody from a background of interleaved random-pitch distractor notes. In experiment 1, participants rated the difficulty of segregating the melody from the distractor notes. Four physical properties of the distractor notes were changed. In experiment 2, listeners were asked to rate the dissimilarity between melody patterns whose notes differed on the four physical properties simultaneously. Multidimensional scaling analysis transformed the dissimilarity ratings into perceptual distances. Regression between physical and perceptual cues then derived the minimal perceptual distance needed to segregate the melody. The most efficient streaming cue for CI users was loudness. Compared with normal-hearing listeners without musical backgrounds, CI users needed a greater difference on the perceptual dimension correlated with the temporal envelope for stream segregation. No differences in streaming efficiency were found between the perceptual dimensions linked to the F0 and the spectral envelope. Combined with our previous results in normally-hearing musicians and non-musicians, the results show that differences in training as well as differences in peripheral auditory processing (hearing impairment) and the use of a hearing device influence the way that listeners use different acoustic cues for segregating interleaved musical streams.
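The final regression step in this record, relating rated difficulty to perceptual distance and inverting the fit at a difficulty threshold to recover the minimal distance for segregation, can be sketched with an ordinary least-squares line. All numbers below are hypothetical, not the study's data, and the threshold value is an assumption:

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical data: rated difficulty (1 = easy) falls as the
# MDS-derived perceptual distance between melody and distractors grows.
distance = [0.5, 1.0, 1.5, 2.0, 2.5]
difficulty = [4.8, 4.0, 3.1, 2.2, 1.4]
a, b = fit_line(distance, difficulty)
threshold = 2.5  # assumed difficulty rating at which segregation succeeds
min_distance = (threshold - b) / a
```

Inverting the fitted line at the threshold gives the minimal perceptual distance, which can then be compared across cue dimensions (loudness, temporal envelope, F0, spectral envelope) to rank their streaming efficiency.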

  4. Auditory short-term memory in the primate auditory cortex.

    Science.gov (United States)

    Scott, Brian H; Mishkin, Mortimer

    2016-06-01

    Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active 'working memory' bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive short-term memory (pSTM), is tractable to study in nonhuman primates, whose brain architecture and behavioral repertoire are comparable to our own. This review discusses recent advances in the behavioral and neurophysiological study of auditory memory with a focus on single-unit recordings from macaque monkeys performing delayed-match-to-sample (DMS) tasks. Monkeys appear to employ pSTM to solve these tasks, as evidenced by the impact of interfering stimuli on memory performance. In several regards, pSTM in monkeys resembles pitch memory in humans, and may engage similar neural mechanisms. Neural correlates of DMS performance have been observed throughout the auditory and prefrontal cortex, defining a network of areas supporting auditory STM with parallels to that supporting visual STM. These correlates include persistent neural firing, or a suppression of firing, during the delay period of the memory task, as well as suppression or (less commonly) enhancement of sensory responses when a sound is repeated as a 'match' stimulus. Auditory STM is supported by a distributed temporo-frontal network in which sensitivity to stimulus history is an intrinsic feature of auditory processing. This article is part of a Special Issue entitled SI: Auditory working memory. PMID:26541581

  6. Audio-visual speech cue combination.

    Directory of Open Access Journals (Sweden)

    Derek H Arnold

    Full Text Available BACKGROUND: Different sources of sensory information can interact, often shaping what we think we have seen or heard. This can enhance the precision of perceptual decisions relative to those made on the basis of a single source of information. From a computational perspective, there are multiple reasons why this might happen, and each predicts a different degree of enhanced precision. Relatively slight improvements can arise when perceptual decisions are made on the basis of multiple independent sensory estimates, as opposed to just one. These improvements can arise as a consequence of probability summation. Greater improvements can occur if two initially independent estimates are summated to form a single integrated code, especially if the summation is weighted in accordance with the variance associated with each independent estimate. This form of combination is often described as a Bayesian maximum likelihood estimate. Still greater improvements are possible if the two sources of information are encoded via a common physiological process. PRINCIPAL FINDINGS: Here we show that the provision of simultaneous audio and visual speech cues can result in substantial sensitivity improvements, relative to decisions based on a single sensory modality. The magnitude of the improvements is greater than can be predicted on the basis of either Bayesian maximum likelihood estimation or probability summation. CONCLUSION: Our data suggest that primary estimates of speech content are determined by a physiological process that takes input from both visual and auditory processing, resulting in greater sensitivity than would be possible if initially independent audio and visual estimates were formed and then subsequently combined.
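    The variance-weighted combination the abstract describes can be sketched numerically. This is a generic illustration of reliability-weighted (maximum-likelihood) fusion of two Gaussian cue estimates, not the authors' analysis; the function name and the example values are hypothetical. The fused variance it computes is the benchmark against which the reported improvements were larger.

    ```python
    import numpy as np

    def mle_fusion(est_a, var_a, est_v, var_v):
        """Reliability-weighted (maximum-likelihood) combination of two
        independent Gaussian cue estimates (auditory and visual)."""
        w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
        w_v = 1.0 - w_a
        fused = w_a * est_a + w_v * est_v
        # Fused variance is always <= min(var_a, var_v): precision improves.
        fused_var = (var_a * var_v) / (var_a + var_v)
        return fused, fused_var

    # Equal-reliability cues: fused variance is half of each single-cue variance.
    est, var = mle_fusion(est_a=1.0, var_a=4.0, est_v=3.0, var_v=4.0)
    # est == 2.0, var == 2.0
    ```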

  7. Auditory Neuropathy - A Case of Auditory Neuropathy after Hyperbilirubinemia

    Directory of Open Access Journals (Sweden)

    Maliheh Mazaher Yazdi

    2007-12-01

    Full Text Available Background and Aim: Auditory neuropathy is a hearing disorder in which peripheral hearing is normal, but the eighth nerve and brainstem are abnormal. By clinical definition, patients with this disorder have normal OAEs but an absent or severely abnormal ABR. Auditory neuropathy was first reported in the late 1970s, when different methods could identify the discrepancy between an absent ABR and a measurable hearing threshold. Speech understanding difficulties are worse than would be predicted from other tests of hearing function. Auditory neuropathy may also affect vestibular function. Case Report: This article presents electrophysiological and behavioral data from a case of auditory neuropathy in a child with normal hearing after hyperbilirubinemia, over a 5-year follow-up. Audiological findings demonstrate remarkable changes after multidisciplinary rehabilitation. Conclusion: Auditory neuropathy may involve damage to the inner hair cells, the specialized sensory cells in the inner ear that transmit information about sound through the nervous system to the brain. Other causes may include faulty connections between the inner hair cells and the nerve leading from the inner ear to the brain, or damage to that nerve itself. People with auditory neuropathy have OAE responses but an absent ABR, and hearing thresholds that can be permanent, worsen, or improve.

  8. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events

    Directory of Open Access Journals (Sweden)

    Jeroen eStekelenburg

    2012-05-01

    Full Text Available In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have already reported distinct neural correlates of temporal (when) versus phonetic/semantic (which) content in audiovisual integration. Here we examined the effect of visual prediction of auditory location (where) in audiovisual biological motion stimuli by varying the spatial congruency between the auditory and visual parts of the audiovisual stimulus. Visual stimuli were presented centrally, whereas auditory stimuli were presented either centrally or at 90° azimuth. Typical subadditive amplitude reductions (AV − V < A) were found for the auditory N1 and P2 in both spatially congruent and incongruent conditions. The new finding is that the N1 suppression was larger for spatially congruent stimuli. A very early audiovisual interaction was also found at 30-50 ms in the spatially congruent condition, while no effect of congruency was found on the suppression of the P2. This indicates that visual prediction of auditory location can be coded very early in auditory processing.

  9. Adaptive changes between cue abstraction and exemplar memory in a multiple-cue judgment task with continuous cues.

    Science.gov (United States)

    Karlsson, Linea; Juslin, Peter; Olsson, Henrik

    2007-12-01

    The majority of previous studies on multiple-cue judgment with continuous cues have involved comparisons between judgments and multiple linear regression models that integrated cues into a judgment. The authors present an experiment indicating that in a judgment task with additive combination of multiple continuous cues, people indeed displayed abstract knowledge of the cue criterion relations that was mentally integrated into a judgment, but in a task with multiplicative combination of continuous cues, people instead relied on retrieval of memory traces of similar judgment cases (exemplars). These results suggest that people may adopt qualitatively distinct forms of knowledge, depending on the structure of a multiple-cue judgment task. The authors discuss implications for theories of multiple-cue judgment. PMID:18229487
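    The contrast between the two knowledge forms can be made concrete with toy models: a linear-additive rule standing in for cue abstraction, and similarity-weighted retrieval of stored cases standing in for exemplar memory. This is an illustrative sketch in the spirit of such models, not the authors' implementation; all names and parameter values are hypothetical.

    ```python
    import numpy as np

    def additive_judgment(cues, weights, intercept=0.0):
        """Cue abstraction: judge by a learned linear-additive rule."""
        return intercept + float(np.dot(weights, cues))

    def exemplar_judgment(probe, exemplars, criteria, h=1.0):
        """Exemplar memory: similarity-weighted average of stored cases,
        with exponential similarity decay over cue-space distance."""
        sims = np.exp(-h * np.abs(exemplars - probe).sum(axis=1))
        return float(np.dot(sims, criteria) / sims.sum())

    exemplars = np.array([[0.0, 0.0], [1.0, 1.0]])  # stored judgment cases
    criteria = np.array([10.0, 20.0])               # their known criterion values
    print(additive_judgment([0.5, 0.5], weights=[5.0, 5.0], intercept=10.0))  # ~15.0
    print(exemplar_judgment(np.array([0.5, 0.5]), exemplars, criteria))       # ~15.0
    ```

    On this symmetric probe the two models agree; with multiplicative cue-criterion structure, the additive rule systematically misses while exemplar retrieval can still interpolate, which is the adaptive shift the abstract describes.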

  10. Acute stress switches spatial navigation strategy from egocentric to allocentric in a virtual Morris water maze.

    Science.gov (United States)

    van Gerven, Dustin J H; Ferguson, Thomas; Skelton, Ronald W

    2016-07-01

    Stress and stress hormones are known to influence the function of the hippocampus, a brain structure critical for cognitive-map-based, allocentric spatial navigation. The caudate nucleus, a brain structure critical for stimulus-response-based, egocentric navigation, is not as sensitive to stress. Evidence for this comes from rodent studies, which show that acute stress or stress hormones impair allocentric, but not egocentric, navigation. However, there have been few studies investigating the effect of acute stress on human spatial navigation, and the results of these have been equivocal. To date, no study has investigated whether acute stress can shift human navigational strategy selection between allocentric and egocentric navigation. The present study investigated this question by exposing participants to an acute psychological stressor (the Paced Auditory Serial Addition Task, PASAT) before testing navigational strategy selection in the Dual-Strategy Maze, a modified virtual Morris water maze. In the Dual-Strategy Maze, participants can choose to navigate using a constellation of extra-maze cues (allocentrically) or using a single cue proximal to the goal platform (egocentrically). Surprisingly, PASAT stress biased participants to solve the maze allocentrically significantly more, rather than less, often. These findings have implications for understanding the effects of acute stress on cognitive function in general, and the function of the hippocampus in particular. PMID:27174311

  12. Changes in auditory perceptions and cortex resulting from hearing recovery after extended congenital unilateral hearing loss

    Directory of Open Access Journals (Sweden)

    Jill B Firszt

    2013-12-01

    Full Text Available Monaural hearing induces auditory system reorganization. Imbalanced input also degrades time-intensity cues for sound localization and signal segregation for listening in noise. While there have been studies of bilateral auditory deprivation and later hearing restoration (e.g., with cochlear implants), less is known about unilateral auditory deprivation and subsequent hearing improvement. We investigated the effects of long-term congenital unilateral hearing loss on localization, speech understanding, and cortical organization following hearing recovery. Hearing in the congenitally affected ear of a 41-year-old female improved significantly after stapedotomy and reconstruction. Pre-operative hearing threshold levels showed unilateral, mixed, moderately severe to profound hearing loss. The contralateral ear had hearing threshold levels within normal limits. Testing was completed prior to surgery and again three and nine months afterward. Measurements included sound localization with intensity-roved stimuli and speech recognition in various noise conditions. We also evoked magnetic resonance signals with monaural stimulation of the unaffected ear. Activation magnitudes were determined in core, belt, and parabelt auditory cortex regions via an interrupted single-event design. Hearing improvement following 40 years of congenital unilateral hearing loss resulted in substantially improved sound localization and speech recognition in noise. Auditory cortex also reorganized. Contralateral auditory cortex responses were increased after hearing recovery, and the extent of activated cortex was bilateral, including a greater portion of the posterior superior temporal plane. Thus, prolonged predominantly monaural stimulation did not prevent auditory system changes consequent to restored binaural hearing.
Results support future research of unilateral auditory deprivation effects and plasticity, with consideration for length of deprivation, age at hearing correction, degree and type

  13. Auditory Processing Disorder in Children

    Science.gov (United States)

  14. Auditory Processing Disorder (For Parents)

    Science.gov (United States)

  15. Biases in Visual, Auditory, and Audiovisual Perception of Space.

    Directory of Open Access Journals (Sweden)

    Brian Odegaard

    2015-12-01

    Full Text Available Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) whether the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine the presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration not only

  16. Behavioral Cues of Interpersonal Warmth

    Science.gov (United States)

    Bayes, Marjorie A.

    1972-01-01

    The results of this study suggest, first, that interpersonal warmth does seem to be a personality dimension which can be reliably judged and, second, that it was possible to define and demonstrate the relevance of a number of behavioral cues for warmth. (Author)

  17. Optimal assessment of multiple cues

    NARCIS (Netherlands)

    Fawcett, TW; Johnstone, RA

    2003-01-01

    In a wide range of contexts from mate choice to foraging, animals are required to discriminate between alternative options on the basis of multiple cues. How should they best assess such complex multicomponent stimuli? Here, we construct a model to investigate this problem, focusing on a simple case

  18. Fractal Fluctuations in Human Walking: Comparison Between Auditory and Visually Guided Stepping.

    Science.gov (United States)

    Terrier, Philippe

    2016-09-01

    In human locomotion, sensorimotor synchronization of gait consists of the coordination of stepping with rhythmic auditory cues (auditory cueing, AC). AC changes the long-range correlations among consecutive strides (fractal dynamics) into anti-correlations. Visual cueing (VC) is the alignment of step lengths with marks on the floor. The effects of VC on the fluctuation structure of walking have not been investigated. Therefore, the objective was to compare the effects of AC and VC on the fluctuation pattern of basic spatiotemporal gait parameters. Thirty-six healthy individuals walked 3 × 500 strides on an instrumented treadmill with augmented reality capabilities. The conditions were no cueing (NC), AC, and VC. AC included an isochronous metronome. For VC, projected stepping stones were synchronized with the treadmill speed. Detrended fluctuation analysis assessed the correlation structure. The coefficient of variation (CV) was also assessed. The results showed that AC and VC similarly induced a strong anti-correlated pattern in the gait parameters. The CVs were similar between the NC and AC conditions but substantially higher in the VC condition. AC and VC probably mobilize similar motor control pathways and can be used alternatively in gait rehabilitation. However, the increased gait variability induced by VC should be considered. PMID:26903091
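    Detrended fluctuation analysis, used above to assess the correlation structure of stride series, can be sketched as follows. This is a generic first-order DFA, not the authors' exact pipeline; the window scales and the white-noise demo are illustrative (white noise should yield an exponent near 0.5, persistent series above 0.5, anti-correlated series below 0.5).

    ```python
    import numpy as np

    def dfa_alpha(x, scales=None):
        """First-order detrended fluctuation analysis.
        Returns the scaling exponent alpha (~0.5 uncorrelated,
        >0.5 persistent, <0.5 anti-correlated)."""
        x = np.asarray(x, dtype=float)
        y = np.cumsum(x - x.mean())  # integrated profile
        if scales is None:
            scales = np.unique(
                np.logspace(np.log10(8), np.log10(len(x) // 4), 12).astype(int))
        flucts = []
        for n in scales:
            n_seg = len(y) // n
            segs = y[: n_seg * n].reshape(n_seg, n)
            t = np.arange(n)
            rms = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)  # linear detrend per window
                rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
            flucts.append(np.mean(rms))
        # Slope of log-fluctuation vs. log-scale is the exponent alpha.
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    rng = np.random.default_rng(0)
    white = rng.standard_normal(512)
    print(dfa_alpha(white))  # close to 0.5 for uncorrelated noise
    ```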

  19. Measuring the performance of visual to auditory information conversion.

    Directory of Open Access Journals (Sweden)

    Shern Shiou Tan

    Full Text Available BACKGROUND: Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, the auditory cues generated from image sonification systems are still easier to learn and adapt to compared to other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly. METHODOLOGY: Performance is measured by both the interpretability and the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter-image distance (IID) and inter-sound distance (ISD), whereas information preservation is computed by applying Information Theory to measure the entropy of both visual and corresponding auditory signals. These measurements provide a basis and some insights on how the systems work. CONCLUSIONS: With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost effectively regain enough visual functions to allow them to lead secure and productive lives.
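    The two proposed measures can be sketched generically: a correlation between inter-image and inter-sound pairwise distances as an interpretability proxy, and a histogram-based Shannon entropy as an information measure. This is a hedged illustration of the general idea, not the authors' implementation; distance metrics, bin counts, and the toy data are assumptions.

    ```python
    import numpy as np

    def distance_correlation_score(images, sounds):
        """Interpretability proxy: Pearson correlation between inter-image
        distances (IID) and inter-sound distances (ISD) over all pairs."""
        n = len(images)
        iid, isd = [], []
        for i in range(n):
            for j in range(i + 1, n):
                iid.append(np.linalg.norm(images[i] - images[j]))
                isd.append(np.linalg.norm(sounds[i] - sounds[j]))
        return float(np.corrcoef(iid, isd)[0, 1])

    def shannon_entropy(signal, bins=16):
        """Information measure: entropy (bits) of a signal's amplitude histogram."""
        counts, _ = np.histogram(signal, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log2(p)).sum())

    imgs = [np.array([0.0]), np.array([1.0]), np.array([2.0])]
    snds = [np.array([0.0]), np.array([2.0]), np.array([4.0])]  # distances scaled by 2
    print(distance_correlation_score(imgs, snds))  # ~1.0: mapping preserves structure
    ```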

  20. A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies

    OpenAIRE

    Werner-Reiss, Uri; Jennifer M Groh

    2008-01-01

    Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, where single-unit recording studies have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivit...

  1. Cues for localization in the horizontal plane

    DEFF Research Database (Denmark)

    Jeppesen, Jakob; Møller, Henrik

    2005-01-01

    manipulated in HRTFs used for binaural synthesis of sound in the horizontal plane. The manipulation of cues resulted in HRTFs with cues ranging from correct combinations of spectral information and ITDs to combinations with severely conflicting cues. Both the ITD and the spectral information seem to be...

  2. Fragrances as Cues for Remembering Words

    Science.gov (United States)

    Eich, James Eric

    1978-01-01

    Results of this experiment suggest that specific encoding of a word is not a necessary condition for cue effectiveness. Results imply that the effect of a nominal fragrance cue arises through the mediation of a functional, implicitly generated semantic cue. (Author/SW)

  3. Cue salience influences the use of height cues in reorientation in pigeons (Columba livia).

    Science.gov (United States)

    Du, Yu; Mahdi, Nuha; Paul, Breanne; Spetch, Marcia L

    2016-07-01

    Although orienting ability has been examined with numerous types of cues, most research has focused only on cues from the horizontal plane. The current study investigated pigeons' use of wall height, a vertical cue, in an open-field task and compared it with their use of horizontal cues. Pigeons were trained to locate food in 2 diagonal corners of a rectangular enclosure with 2 opposite high walls as height cues. Before each trial, pigeons were rotated to disorient them. In training, pigeons could use either the horizontal cues from the rectangular enclosure or the height information from the walls to locate the food. In testing, the apparatus was modified to provide (a) horizontal cues only, (b) height cues only, and (c) both height and horizontal cues in conflict. In Experiment 1 the lower and high walls, respectively, were 40 and 80 cm, whereas in Experiment 2 they were made more perceptually salient by shortening them to 20 and 40 cm. Pigeons accurately located the goal corners with horizontal cues alone in both experiments, but they searched accurately with height cues alone only in Experiment 2. When the height cues conflicted with horizontal cues, pigeons preferred the horizontal cues over the height cues in Experiment 1 but not in Experiment 2, suggesting that perceptual salience influences the relative weighting of cues. (PsycINFO Database Record) PMID:27379717

  5. Humans as an animal model? : studies on cue interaction, occasion setting, and context dependency

    OpenAIRE

    Dibbets, Pauline

    2002-01-01

    The objective of the present thesis was to study human learning behaviour and to compare the results with those from animal learning studies. Three topics originating from animal learning research were examined: cue interaction, occasion setting, and context dependency. A series of experiments was first carried out to examine the influence of spatial position on cue-interaction effects in a predictive-learning task. Evidence that previously learned information about a stimulus can interact wi...

  6. Sex difference in cue strategy in a modified version of the Morris water task: correlations between brain and behaviour.

    Directory of Open Access Journals (Sweden)

    Robin J Keeley

    Full Text Available BACKGROUND: Sex differences in spatial memory function have been reported with mixed results in the literature, with some studies showing male advantages and others showing no differences. When considering the estrus cycle in females, results are mixed as to whether high or low circulating estradiol confers an advantage in spatial navigation tasks. Research involving humans and rodents has demonstrated that males preferentially employ Euclidean strategies and utilize geometric cues in order to spatially navigate, whereas females employ landmark strategies and cues in order to spatially navigate. METHODOLOGY/PRINCIPAL FINDINGS: This study used the water-based snowcone maze in order to assess male and female preference for landmark or geometric cues, with specific emphasis placed on the effects of estrus cycle phase for female rats. Performance and preference for the geometric cue were examined in relation to total hippocampal and hippocampal subregion (CA1&2, CA3, and dentate gyrus) volumes and entorhinal cortex thickness, in order to determine the relation between strategy, spatial performance, and brain area size. The study revealed that males outperformed females overall during training trials, relied on the geometric cue when the platform was moved, and showed significant correlations between entorhinal cortex thickness and spatial memory performance. No gross differences in behavioural performance were observed within females when accounting for cyclicity, and only total hippocampal volume was correlated with performance during the learning trials. CONCLUSIONS/SIGNIFICANCE: This study demonstrates the sex-specific use of cues and brain areas in a spatial learning task.

  7. The Effect of Visual Cues on Difficulty Ratings for Segregation of Musical Streams in Listeners with Impaired Hearing

    OpenAIRE

    Hamish Innes-Brown; Jeremy Marozeau; Peter Blamey

    2011-01-01

    BACKGROUND: Enjoyment of music is an important part of life that may be degraded for people with hearing impairments, especially those using cochlear implants. The ability to follow separate lines of melody is an important factor in music appreciation. This ability relies on effective auditory streaming, which is much reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues could reduce the subjectiv...

  8. Visual cues for data mining

    Science.gov (United States)

    Rogowitz, Bernice E.; Rabenhorst, David A.; Gerth, John A.; Kalin, Edward B.

    1996-04-01

    This paper describes a set of visual techniques, based on principles of human perception and cognition, which can help users analyze and develop intuitions about tabular data. Collections of tabular data are widely available, including, for example, multivariate time series data, customer satisfaction data, stock market performance data, multivariate profiles of companies and individuals, and scientific measurements. In our approach, we show how visual cues can help users perform a number of data mining tasks, including identifying correlations and interaction effects, finding clusters and understanding the semantics of cluster membership, identifying anomalies and outliers, and discovering multivariate relationships among variables. These cues are derived from psychological studies on perceptual organization, visual search, perceptual scaling, and color perception. These visual techniques are presented as a complement to the statistical and algorithmic methods more commonly associated with these tasks, and provide an interactive interface for the human analyst.

  9. Visual Landmarks Facilitate Rodent Spatial Navigation in Virtual Reality Environments

    Science.gov (United States)

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain…

  10. Neural Correlates of an Auditory Afterimage in Primary Auditory Cortex

    OpenAIRE

    Noreña, A. J.; Eggermont, J. J.

    2003-01-01

    The Zwicker tone (ZT) is defined as an auditory negative afterimage, perceived after the presentation of an appropriate inducer. Typically, a notched noise (NN) with a notch width of 1/2 octave induces a ZT with a pitch falling in the frequency range of the notch. The aim of the present study was to find potential neural correlates of the ZT in the primary auditory cortex of ketamine-anesthetized cats. Responses of multiunits were recorded simultaneously with two 8-electrode arrays during 1 s...

  11. Encoding audio motion: spatial impairment in early blind individuals

    OpenAIRE

    Finocchietti, Sara; Cappagli, Giulia; Gori, Monica

    2015-01-01

    The consequence of blindness on auditory spatial localization has been an interesting issue of research in the last decade providing mixed results. Enhanced auditory spatial skills in individuals with visual impairment have been reported by multiple studies, while some aspects of spatial hearing seem to be impaired in the absence of vision. In this study, the ability to encode the trajectory of a 2-dimensional sound motion, reproducing the complete movement, and reaching the correct end-point...

  12. Functional neuroanatomy of spatial sound processing in Alzheimer's disease.

    OpenAIRE

    Golden, HL; Agustus, JL; Nicholas, JM; Schott, JM; Crutch, SJ; L. Mancini; Warren, JD

    2016-01-01

    Deficits of auditory scene analysis accompany Alzheimer's disease (AD). However, the functional neuroanatomy of spatial sound processing has not been defined in AD. We addressed this using a "sparse" fMRI virtual auditory spatial paradigm in 14 patients with typical AD in relation to 16 healthy age-matched individuals. Sound stimulus sequences discretely varied perceived spatial location and pitch of the sound source in a factorial design. AD was associated with loss of differentiated cortica...

  13. Auditory Hallucinations in Acute Stroke

    Directory of Open Access Journals (Sweden)

    Yair Lampl

    2005-01-01

    Full Text Available Auditory hallucinations are uncommon phenomena which can be directly caused by acute stroke; they are mostly described after lesions of the brain stem and very rarely reported after cortical strokes. The purpose of this study was to determine the frequency of this phenomenon. In a cross-sectional study, 641 stroke patients were followed between 1996 and 2000. Each patient underwent comprehensive investigation and follow-up. Four patients were found to have auditory hallucinations after cortical stroke. All of these occurred after an ischemic lesion of the right temporal lobe. After no more than four months, all patients were symptom-free and without therapy. The fact that auditory hallucinations may be of cortical origin must be taken into consideration in the treatment of stroke patients. The phenomenon may be completely reversible after a couple of months.

  14. Perceiving speech in context: Compensation for contextual variability during acoustic cue encoding and categorization

    Science.gov (United States)

    Toscano, Joseph Christopher

    Several fundamental questions about speech perception concern how listeners understand spoken language despite considerable variability in speech sounds across different contexts (the problem of lack of invariance in speech). This contextual variability is caused by several factors, including differences between individual talkers' voices, variation in speaking rate, and effects of coarticulatory context. A number of models have been proposed to describe how the speech system handles differences across contexts. Critically, these models make different predictions about (1) whether contextual variability is handled at the level of acoustic cue encoding or categorization, (2) whether it is driven by feedback from category-level processes or interactions between cues, and (3) whether listeners discard fine-grained acoustic information to compensate for contextual variability. Separating the effects of cue- and category-level processing has been difficult because behavioral measures tap processes that occur well after initial cue encoding and are influenced by task demands and linguistic information. Recently, we have used the event-related brain potential (ERP) technique to examine cue encoding and online categorization. Specifically, we have looked at differences in the auditory N1 as a measure of acoustic cue encoding and the P3 as a measure of categorization. This allows us to examine multiple levels of processing during speech perception and can provide a useful tool for studying effects of contextual variability. Here, I apply this approach to determine the point in processing at which context has an effect on speech perception and to examine whether acoustic cues are encoded continuously. Several types of contextual variability (talker gender, speaking rate, and coarticulation), as well as several acoustic cues (voice onset time, formant frequencies, and bandwidths), are examined in a series of experiments. 
The results suggest that (1) at early stages of speech

  15. Adaptation in the auditory system: an overview

    OpenAIRE

    David ePérez-González; Malmierca, Manuel S.

    2014-01-01

    The early stages of the auditory system need to preserve the timing information of sounds in order to extract the basic features of acoustic stimuli. At the same time, different processes of neuronal adaptation occur at several levels to further process the auditory information. For instance, auditory nerve fiber responses already experience adaptation of their firing rates, a type of response that can be found in many other auditory nuclei and may be useful for emphasizing the onset of the s...

  16. Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex

    Science.gov (United States)

    Romanski, L. M.; Tian, B.; Fritz, J.; Mishkin, M.; Goldman-Rakic, P. S.; Rauschecker, J. P.

    2009-01-01

    ‘What’ and ‘where’ visual streams define ventrolateral object and dorsolateral spatial processing domains in the prefrontal cortex of nonhuman primates. We looked for similar streams for auditory–prefrontal connections in rhesus macaques by combining microelectrode recording with anatomical tract-tracing. Injection of multiple tracers into physiologically mapped regions AL, ML and CL of the auditory belt cortex revealed that anterior belt cortex was reciprocally connected with the frontal pole (area 10), rostral principal sulcus (area 46) and ventral prefrontal regions (areas 12 and 45), whereas the caudal belt was mainly connected with the caudal principal sulcus (area 46) and frontal eye fields (area 8a). Thus separate auditory streams originate in caudal and rostral auditory cortex and target spatial and non-spatial domains of the frontal lobe, respectively. PMID:10570492

  17. Quit interest influences smoking cue-reactivity.

    Science.gov (United States)

    Veilleux, Jennifer C; Skinner, Kayla D; Pollert, Garrett A

    2016-12-01

    Interest in quitting smoking is important to model in cue-reactivity studies, because the craving elicited by cue exposure likely requires different self-regulation efforts for smokers who are interested in quitting compared to those without any quit interest. The objective of the current study was to evaluate the role of quit interest in how cigarette cue exposure influences self-control efforts. Smokers interested in quitting (n=37) and smokers with no interest in quitting (n=53) were randomly assigned to a cigarette or neutral cue exposure task. Following the cue exposure, all participants completed two self-control tasks, a measure of risky gambling (the Iowa Gambling Task) and a cold pressor tolerance task. Results indicated that smokers interested in quitting had worse performance on the gambling task when exposed to a cigarette cue compared to neutral cue exposure. We also found that people interested in quitting tolerated the cold pressor task for a shorter amount of time than people not interested in quitting. Finally, we found that for people interested in quitting, exposure to a cigarette cue was associated with increased motivation to take steps toward decreasing use. Overall these results suggest that including quit interest in studies of cue reactivity is valuable, as quit interest influenced smoking cue-reactivity responses. PMID:27487082

  18. Sexual selection in the squirrel treefrog Hyla squirella: the role of multimodal cue assessment in female choice

    Science.gov (United States)

    Taylor, Ryan C.; Buchanan, Bryant W.; Doherty, Jessie L.

    2007-01-01

    Anuran amphibians have provided an excellent system for the study of animal communication and sexual selection. Studies of female mate choice in anurans, however, have focused almost exclusively on the role of auditory signals. In this study, we examined the effect of both auditory and visual cues on female choice in the squirrel treefrog. Our experiments used a two-choice protocol in which we varied male vocalization properties, visual cues, or both, to assess female preferences for the different cues. Females discriminated against high-frequency calls and expressed a strong preference for calls that contained more energy per unit time (faster call rate). Females expressed a preference for the visual stimulus of a model of a calling male when call properties at the two speakers were held the same. They also showed a significant attraction to a model possessing a relatively large lateral body stripe. These data indicate that visual cues do play a role in mate attraction in this nocturnal frog species. Furthermore, this study adds to a growing body of evidence that suggests that multimodal signals play an important role in sexual selection.

  19. The semantic representation of event information depends on the cue modality: an instance of meaning-based retrieval.

    Science.gov (United States)

    Karlsson, Kristina; Sikström, Sverker; Willander, Johan

    2013-01-01

    The semantic content, or the meaning, is the essence of autobiographical memories. In comparison to previous research, which has mainly focused on the phenomenological experience and the age distribution of retrieved events, the present study provides a novel view on the retrieval of event information by quantifying the information as semantic representations. We investigated the semantic representation of sensory cued autobiographical events and studied the modality hierarchy within the multimodal retrieval cues. The experiment comprised a cued recall task, where the participants were presented with visual, auditory, olfactory or multimodal retrieval cues and asked to recall autobiographical events. The results indicated that the three different unimodal retrieval cues generate significantly different semantic representations. Further, the auditory and the visual modalities contributed the most to the semantic representation of the multimodally retrieved events. Finally, the semantic representation of the multimodal condition could be described as a combination of the three unimodal conditions. In conclusion, these results suggest that the meaning of the retrieved event information depends on the modality of the retrieval cues.
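
    Quantifying recalled events as semantic representations, as described above, can be illustrated with a toy latent-semantic-analysis-style sketch. The term counts, the two-dimensional reduction, and the condition labels below are invented for illustration and are not the study's data or method:

```python
import numpy as np

def semantic_vectors(term_doc, k=2):
    """Toy LSA: reduce a term-by-document count matrix to k latent
    dimensions via SVD; rows of the result are document vectors."""
    u, s, vt = np.linalg.svd(term_doc, full_matrices=False)
    return (np.diag(s[:k]) @ vt[:k]).T  # one k-dim vector per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical term counts: rows = terms, columns = recall conditions
# (visual, auditory, olfactory, multimodal).
counts = np.array([
    [4, 1, 0, 3],   # "saw"
    [0, 5, 1, 2],   # "heard"
    [1, 0, 4, 1],   # "smelled"
    [2, 2, 2, 3],   # "remember"
], dtype=float)

vecs = semantic_vectors(counts)
# Similarity of each unimodal condition to the multimodal condition.
sims = [cosine(vecs[i], vecs[3]) for i in range(3)]
```

    Comparing unimodal vectors against the multimodal vector in this way is one simple means of asking whether the multimodal representation behaves like a combination of the unimodal ones.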

  20. Phonetic training with acoustic cue manipulations: A comparison of methods for teaching English /r/-/l/ to Japanese adults

    Science.gov (United States)

    Iverson, Paul; Hazan, Valerie; Bannister, Kerry

    2005-11-01

    Recent work [Iverson et al. (2003) Cognition, 87, B47-57] has suggested that Japanese adults have difficulty learning English /r/ and /l/ because they are overly sensitive to acoustic cues that are not reliable for /r/-/l/ categorization (e.g., F2 frequency). This study investigated whether cue weightings are altered by auditory training, and compared the effectiveness of different training techniques. Separate groups of subjects received High Variability Phonetic Training (natural words from multiple talkers), and 3 techniques in which the natural recordings were altered via signal processing (All Enhancement, with F3 contrast maximized and closure duration lengthened; Perceptual Fading, with F3 enhancement reduced during training; and Secondary Cue Variability, with variation in F2 and durations increased during training). The results demonstrated that all of the training techniques improved /r/-/l/ identification by Japanese listeners, but there were no differences between the techniques. Training also altered the use of secondary acoustic cues; listeners became biased to identify stimuli as English /l/ when the cues made them similar to the Japanese /r/ category, and reduced their use of secondary acoustic cues for stimuli that were dissimilar to Japanese /r/. The results suggest that both category assimilation and perceptual interference affect English /r/ and /l/ acquisition.
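
    Cue weightings of the kind discussed here (e.g., reliance on F2 vs. F3) are often estimated as coefficients of a logistic categorization model fit to responses. A minimal sketch on simulated data; the formant values, separations, and training settings are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated /r/-/l/ tokens: F3 is the reliable cue (well separated),
# F2 overlaps heavily. All values are illustrative, not measured data.
n = 200
labels = rng.integers(0, 2, n)                    # 0 = /r/, 1 = /l/
f3 = np.where(labels == 1, 2800.0, 1700.0) + rng.normal(0, 100, n)
f2 = 1200.0 + rng.normal(0, 300, n)               # uninformative cue
x = np.column_stack([np.ones(n),                  # intercept
                     (f2 - f2.mean()) / f2.std(),
                     (f3 - f3.mean()) / f3.std()])

# Plain gradient-ascent logistic regression; the fitted coefficients
# serve as estimated cue weights.
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(x @ w)))
    w += 0.1 * x.T @ (labels - p) / n

pred = (1.0 / (1.0 + np.exp(-(x @ w)))) > 0.5
accuracy = float((pred == labels).mean())
# |w[2]| (F3 weight) should dominate |w[1]| (F2 weight).
```

    A shift in listeners' reliance on a secondary cue after training would show up as a change in the relative magnitude of the fitted weights.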

  1. Cue predictability changes scaling in eye-movement fluctuations.

    Science.gov (United States)

    Wallot, Sebastian; Coey, Charles A; Richardson, Michael J

    2015-10-01

    Recent research has provided evidence for scaling-relations in eye-movement fluctuations, but not much is known about what these scaling relations imply about cognition or eye-movement control. Generally, scaling relations in behavioral and neurophysiological data have been interpreted as an indicator for the coordination of neurophysiological and cognitive processes. In this study, we investigated the effect of predictability in timing and gaze-direction on eye-movement fluctuations. Participants performed a simple eye-movement task, in which a visual cue prompted their gaze to different locations on a spatial layout, and the predictability about temporal and directional aspects of the cue were manipulated. The results showed that scaling exponents in eye-movements decreased with predictability and were related to the participants' perceived effort during the task. In relation to past research, these findings suggest that scaling exponents reflect a relative demand for voluntary control during task performance. PMID:26337612
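
    Scaling exponents of the kind reported here are commonly estimated with detrended fluctuation analysis (DFA). A minimal sketch, assuming simple first-order (linear) detrending; the window sizes and series length are arbitrary choices:

```python
import numpy as np

def dfa_exponent(series, min_win=8, max_win=128):
    """Estimate a scaling exponent via DFA: integrate the mean-removed
    series, detrend it linearly within windows of increasing size, and
    fit the log-log slope of RMS fluctuation against window size."""
    profile = np.cumsum(series - np.mean(series))
    sizes = np.unique(
        np.logspace(np.log10(min_win), np.log10(max_win), 12).astype(int))
    flucts = []
    for s in sizes:
        n_win = len(profile) // s
        f2 = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)          # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(sizes), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(1)
alpha_white = dfa_exponent(rng.normal(size=2000))  # white noise: near 0.5
```

    Exponents near 0.5 indicate uncorrelated fluctuations, while values approaching 1 indicate the long-range correlations that the study interprets in terms of voluntary control demands.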

  2. Different patterns of auditory cortex activation revealed by functional magnetic resonance imaging

    International Nuclear Information System (INIS)

    In the last few years, functional Magnetic Resonance Imaging (fMRI) has been widely accepted as an effective tool for mapping brain activities in both the sensorimotor and the cognitive field. The present work aims to assess the possibility of using fMRI methods to study the cortical response to different acoustic stimuli. Furthermore, we refer to recent data collected at Frankfurt University on the cortical pattern of auditory hallucinations. Healthy subjects showed broad bilateral activation, mostly located in the transverse gyrus of Heschl. The analysis of the cortical activation induced by different stimuli has pointed out a remarkable difference in the spatial and temporal features of the auditory cortex response to pulsed tones and pure tones. The activated areas during episodes of auditory hallucinations match the location of primary auditory cortex as defined in control measurements with the same patients and in the experiments on healthy subjects. (authors)

  3. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D.; Chait, Maria

    2016-01-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence—the coincidence of sound elements in and across time—is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals (“stochastic figure-ground”: SFG) comprised a sequence of brief broadband chords containing random pure tone components that vary from 1 chord to another. Occasional tone repetitions across chords are perceived as “figures” popping out of a stochastic “ground.” Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing from about 150 ms after figure onset, that reflect the emergence of the “figure” from the randomly varying “ground.” Neural sources underlying this bottom-up driven figure-ground segregation were localized to planum temporale, and the intraparietal sulcus, demonstrating that this area, outside the “classic” auditory system, is also involved in the early stages of auditory scene analysis. PMID:27325682
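
    The stochastic figure-ground (SFG) construction described above, random chords with occasional repeated components, can be sketched as follows. The chord duration, frequency pool, figure size, and figure onset below are illustrative choices, not the authors' exact parameters:

```python
import numpy as np

def sfg_stimulus(n_chords=20, chord_dur=0.05, fs=16000,
                 n_tones=10, fig_freqs=(600.0, 1000.0, 1700.0, 2800.0),
                 fig_onset=10, seed=0):
    """Stochastic figure-ground: each chord is a sum of random pure
    tones; from chord `fig_onset` onward, a fixed set of 'figure'
    frequencies repeats across chords and pops out of the ground."""
    rng = np.random.default_rng(seed)
    pool = np.geomspace(200.0, 7000.0, 60)        # log-spaced frequencies
    n = int(chord_dur * fs)
    t = np.arange(n) / fs
    ramp = np.hanning(20)
    chords = []
    for c in range(n_chords):
        freqs = list(rng.choice(pool, size=n_tones, replace=False))
        if c >= fig_onset:
            freqs += list(fig_freqs)              # coherent figure tones
        chord = sum(np.sin(2 * np.pi * f * t) for f in freqs)
        chord[:10] *= ramp[:10]                   # onset ramp
        chord[-10:] *= ramp[10:]                  # offset ramp
        chords.append(chord)
    stim = np.concatenate(chords)
    return stim / np.max(np.abs(stim))

stim = sfg_stimulus()
```

    Because the figure components repeat across chords while the ground components are redrawn every chord, the figure is defined only by temporal coherence, not by any spectral region.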

  5. Effect of post-training unilateral labyrinthectomy in a spatial orientation task by guinea pigs.

    Science.gov (United States)

    Chapuis, N; Krimm, M; de Waele, C; Vibert, N; Berthoz, A

    1992-11-15

    The effects of unilateral labyrinthectomy in guinea pigs were studied using an angular orientation task in an open field: running to a hidden goal oriented at 45 degrees with respect to the cephalocaudal axis of the animal placed in a starting box. The task was conducted in light but in a homogeneous environment, i.e. without visual, auditory, or olfactory cues indicating the location of the goal. A second group of animals performed a similar task of running to a hidden goal, but the location of the goal was indicated by a colored card. All animals were trained before the lesion and tested on their respective task for 1 month after the lesion. In the task conducted without conspicuous cues, animals were dramatically disturbed. In contrast, animals pretrained in the visually guided task were not impaired after the lesion. These results point out the important role of vestibular information in performing spatial tasks based on angular estimation: even though proprioceptive and visuokinesthetic information remained available, subjects seemed unable to maintain a correct angular trajectory. Since trajectories were not disturbed in the visually guided task, one can exclude the hypothesis that the deficit was due to a purely motor disturbance. PMID:1466778

  6. Watch out! Magnetoencephalographic evidence for early modulation of attention orienting by fearful gaze cueing.

    Directory of Open Access Journals (Sweden)

    Fanny Lachat

    Full Text Available Others' gaze and emotional facial expression are important cues for the process of attention orienting. Here, we investigated with magnetoencephalography (MEG) whether the combination of averted gaze and fearful expression may elicit a selectively early effect of attention orienting on the brain responses to targets. We used the direction of gaze of centrally presented fearful and happy faces as the spatial attention orienting cue in a Posner-like paradigm where the subjects had to detect a target checkerboard presented at gazed-at (valid trials) or non gazed-at (invalid trials) locations of the screen. We showed that the combination of averted gaze and fearful expression resulted in a very early attention orienting effect in the form of additional parietal activity between 55 and 70 ms for the valid versus invalid targets following fearful gaze cues. No such effect was obtained for the targets following happy gaze cues. This early cue-target validity effect selective of fearful gaze cues involved the left superior parietal region and the left lateral middle occipital region. These findings provide the first evidence for an effect of attention orienting induced by fearful gaze in the time range of C1. In doing so, they demonstrate the selective impact of combined gaze and fearful expression cues in the process of attention orienting.

  7. Tone-in-noise detection using envelope cues: comparison of signal-processing-based and physiological models.

    Science.gov (United States)

    Mao, Junwen; Carney, Laurel H

    2015-02-01

    Tone-in-noise detection tasks with reproducible noise maskers have been used to identify cues that listeners use to detect signals in noisy environments. Previous studies have shown that energy, envelope, and fine-structure cues are significantly correlated to listeners' performance for detection of a 500-Hz tone in noise. In this study, envelope cues were examined for both diotic and dichotic tone-in-noise detection using both stimulus-based signal processing and physiological models. For stimulus-based envelope cues, a modified envelope slope model was used for the diotic condition and the binaural slope of the interaural envelope difference model for the dichotic condition. Stimulus-based models do not include key nonlinear transformations in the auditory periphery such as compression, rate and dynamic range adaptation, and rate saturation, all of which affect the encoding of the stimulus envelope. For physiological envelope cues, stimuli were passed through models for the auditory nerve (AN), cochlear nucleus, and inferior colliculus (IC). The AN and cochlear nucleus models included appropriate modulation gain, another transformation of the stimulus envelope that is not typically included in stimulus-based models. A model IC cell was simulated with a linear band-pass modulation filter. The average discharge rate and response fluctuations of the model IC cell were compared to human performance. Previous studies have predicted a significant amount of the variance across reproducible noise maskers in listeners' detection using stimulus-based envelope cues. In this study, a physiological model that includes neural mechanisms that affect encoding of the stimulus envelope predicts a similar amount of the variance in listeners' performance across noise maskers. PMID:25266265
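
    A stimulus-based envelope cue in the spirit of the envelope slope model can be sketched as the mean absolute slope of the Hilbert envelope. The actual model in the study differs in detail (and the physiological models add peripheral nonlinearities), so treat this as illustrative only:

```python
import numpy as np

def hilbert_envelope(x):
    """Envelope via the analytic signal, computed with an FFT
    (the standard construction, here for even-length input)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0        # keep positive frequencies, doubled
    h[n // 2] = 1.0          # Nyquist bin for even n
    return np.abs(np.fft.ifft(spec * h))

def envelope_slope(x, fs):
    """Mean absolute slope of the envelope: a simple stimulus-based
    envelope cue in the spirit of the envelope slope model."""
    env = hilbert_envelope(x)
    return float(np.mean(np.abs(np.diff(env))) * fs)

# Illustrative AM tone: 500-Hz carrier, 20-Hz modulator.
fs = 8000
t = np.arange(int(0.5 * fs)) / fs
mod = 1.0 + 0.5 * np.sin(2 * np.pi * 20 * t)
x = mod * np.sin(2 * np.pi * 500 * t)
env = hilbert_envelope(x)
corr = float(np.corrcoef(env, mod)[0, 1])   # envelope tracks the modulator
```

    A detection model would compare such an envelope statistic between noise-alone and tone-plus-noise waveforms across reproducible maskers.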

  8. Moving to music: Effects of heard and imagined musical cues on movement-related brain activity

    Directory of Open Access Journals (Sweden)

    Rebecca S Schaefer

    2014-09-01

    Full Text Available Music is commonly used to facilitate or support movement, and increasingly used in movement rehabilitation. Additionally, there is some evidence to suggest that music imagery, which is reported to lead to brain signatures similar to music perception, may also assist movement. However, it is not yet known whether either imagined or musical cueing changes the way in which the motor system of the human brain is activated during simple movements. Here, functional Magnetic Resonance Imaging (fMRI) was used to compare neural activity during wrist flexions performed to either heard or imagined music with self-pacing of the same movement without any cueing. Focusing specifically on the motor network of the brain, analyses were performed within a mask of BA4, BA6, the basal ganglia (putamen, caudate and pallidum), the motor nuclei of the thalamus and the whole cerebellum. Results revealed that moving to music compared with self-paced movement resulted in significantly increased activation in left cerebellum VI. Moving to imagined music led to significantly more activation in pre-supplementary motor area (pre-SMA) and right globus pallidus, relative to self-paced movement. When the music and imagery cueing conditions were contrasted directly, movements in the music condition showed significantly more activity in left hemisphere cerebellum VII and right hemisphere and vermis of cerebellum IX, while the imagery condition revealed more significant activity in pre-SMA. These results suggest that cueing movement with actual or imagined music impacts upon engagement of motor network regions during the movement, and suggest that heard and imagined cues can modulate movement in subtly different ways. These results may have implications for the applicability of auditory cueing in movement rehabilitation for different patient populations.

  9. Using electrophysiology to demonstrate that cueing affects long-term memory storage over the short term.

    Science.gov (United States)

    Maxcey, Ashleigh M; Fukuda, Keisuke; Song, Won S; Woodman, Geoffrey F

    2015-10-01

    As researchers who study working memory, we often assume that participants keep a representation of an object in working memory when we present a cue that indicates that the object will be tested in a couple of seconds. This intuitively accounts for how well people can remember a cued object, relative to their memory for that same object presented without a cue. However, it is possible that this superior memory does not purely reflect storage of the cued object in working memory. We tested the hypothesis that cues presented during a stream of objects, followed by a short retention interval and immediate memory test, can change how information is handled by long-term memory. We tested this hypothesis by using a family of frontal event-related potentials believed to reflect long-term memory storage. We found that these frontal indices of long-term memory were sensitive to the task relevance of objects signaled by auditory cues, even when the objects repeated frequently, such that proactive interference was high. Our findings indicate the problematic nature of assuming process purity in the study of working memory, and demonstrate that frequent stimulus repetitions fail to isolate the role of working memory mechanisms.

  10. Spatial Language and Children’s Spatial Landmark Use

    Directory of Open Access Journals (Sweden)

    Amber A. Ankowski

    2012-01-01

    Full Text Available We examined how spatial language affected search behavior in a landmark spatial search task. In Experiment 1, two- to six-year-old children were trained to find a toy in the center of a square array of four identical landmarks. Children heard one of three spatial language cues once during the initial training trial (“here,” “in the middle,” “next to this one”). After search performance reached criterion, children received a probe test trial in which the landmark array was expanded. In Experiment 2, two- to four-year-old children participated in the search task and also completed a language comprehension task. Results revealed that children’s spatial language comprehension scores and spatial language cues heard during training trials were related to children’s performance in the search task.

  11. Quantifying attentional modulation of auditory-evoked cortical responses from single-trial electroencephalography

    Directory of Open Access Journals (Sweden)

    Inyong eChoi

    2013-04-01

    Full Text Available Selective auditory attention is essential for human listeners to be able to communicate in multi-source environments. Selective attention is known to modulate the neural representation of the auditory scene, boosting the representation of a target sound relative to the background, but the strength of this modulation, and the mechanisms contributing to it, are not well understood. Here, listeners performed a behavioral experiment demanding sustained, focused spatial auditory attention while we measured cortical responses using electroencephalography (EEG). We presented three concurrent melodic streams; listeners were asked to attend and analyze the melodic contour of one of the streams, randomly selected from trial to trial. In a control task, listeners heard the same sound mixtures, but performed the contour judgment task on a series of visual arrows, ignoring all auditory streams. We found that the cortical responses could be fit as weighted sum of event-related potentials evoked by the stimulus onsets in the competing streams. The weighting to a given stream was roughly 10 dB higher when it was attended compared to when another auditory stream was attended; during the visual task, the auditory gains were intermediate. We then used a template-matching classification scheme to classify single-trial EEG results. We found that in all subjects, we could determine which stream the subject was attending significantly better than by chance. By directly quantifying the effect of selective attention on auditory cortical responses, these results reveal that focused auditory attention both suppresses the response to an unattended stream and enhances the response to an attended stream. The single-trial classification results add to the growing body of literature suggesting that auditory attentional modulation is sufficiently robust that it could be used as a control mechanism in brain-computer interfaces.
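
    The template-matching classification scheme can be sketched as correlating each single trial with per-condition templates averaged from training trials. The simulated "ERPs" below are toy sinusoids, not real EEG, and the noise level is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

def make_trials(template, n_trials, noise_sd=1.0):
    """Simulated single-trial EEG: a condition template plus noise."""
    return template + rng.normal(0, noise_sd, (n_trials, len(template)))

t = np.linspace(0, 1, 200)
templates = [np.sin(2 * np.pi * 3 * t),        # "attend stream A" response
             np.sin(2 * np.pi * 5 * t)]        # "attend stream B" response

train_trials = [make_trials(tmpl, 40) for tmpl in templates]
test_trials = [make_trials(tmpl, 40) for tmpl in templates]

# Templates = mean of training trials; classify by highest correlation.
fitted = [tr.mean(axis=0) for tr in train_trials]

def classify(trial):
    return int(np.argmax([np.corrcoef(trial, f)[0, 1] for f in fitted]))

correct = sum(classify(trial) == label
              for label, trials in enumerate(test_trials)
              for trial in trials)
accuracy = correct / 80.0
```

    Above-chance accuracy on held-out trials is the criterion the study uses to argue that attentional modulation is robust enough for brain-computer interface control.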

  12. Visual cues for landmine detection

    Science.gov (United States)

    Staszewski, James J.; Davison, Alan D.; Tischuk, Julia A.; Dippel, David J.

    2007-04-01

    Can human vision supplement the information that handheld landmine detection equipment provides its operators to increase detection rates and reduce the hazard of the task? Contradictory viewpoints exist regarding the viability of visual detection of landmines. Assuming both positions are credible, this work aims to reconcile them by exploring the visual information produced by landmine burial and how any visible signatures change as a function of time in a natural environment. Its objective is to acquire objective, foundational knowledge on which training could be based and subsequently evaluated. A representative set of demilitarized landmines was buried at a field site with bare soil and vegetated surfaces using doctrinal procedures. High-resolution photographs of the ground surface were taken for approximately one month starting in April 2006. Photos taken immediately after burial show clearly visible surface signatures. Their features change with time and weather exposure, but the patterns they define persist, as photos taken a month later show. To exploit the perceptual sensitivity of expert observers, signature photos were shown to domain experts with instructions to identify the cues and patterns that defined the signatures. Analysis of experts' verbal descriptions identified a small set of easily communicable cues that characterize signatures and their changes over the duration of observation. Findings suggest that visual detection training is viable and has potential to enhance detection capabilities. The photos and descriptions generated offer materials for designing such training and testing its utility. Plans for investigating the generality of the findings, especially potential limiting conditions, are discussed.

  13. Sensory habituation of auditory receptor neurons: implications for sound localization.

    Science.gov (United States)

    Givois, V; Pollack, G S

    2000-09-01

    Auditory receptor neurons exhibit sensory habituation; their responses decline with repeated stimulation. We studied the effects of sensory habituation on the neural encoding of sound localization cues using crickets as a model system. In crickets, Teleogryllus oceanicus, sound localization is based on binaural comparison of stimulus intensity. There are two potential codes at the receptor-neuron level for interaural intensity difference: interaural difference in response strength, i.e. spike rate and/or count, and interaural difference in response latency. These are affected differently by sensory habituation. When crickets are stimulated with cricket-song-like trains of sound pulses, response strength declines for successive pulses in the train, and the decrease becomes more pronounced as the stimulus intensity increases. Response decrement is thus greater for receptors serving the ear ipsilateral to the sound source, where intensity is higher, resulting in a decrease in the interaural difference in response strength. Sensory habituation also affects response latency, which increases for responses to successive sound pulses in the stimulus train. The change in latency is independent of intensity, and thus is similar for receptors serving both ears. As a result, interaural latency difference is unaffected by sensory habituation and may be a more reliable cue for sound localization.
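
    The two candidate codes named above, the interaural difference in response strength (spike count) and in response latency (first-spike time), can be computed directly from paired spike-time arrays. The spike times below are toy values for illustration:

```python
import numpy as np

def interaural_cues(ipsi_spikes, contra_spikes):
    """Two candidate localization codes from paired spike-time arrays
    (in seconds): the interaural difference in response strength
    (spike count) and in response latency (first-spike time)."""
    count_diff = len(ipsi_spikes) - len(contra_spikes)
    latency_diff = float(np.min(contra_spikes) - np.min(ipsi_spikes))
    return count_diff, latency_diff

# Toy responses to one sound pulse: the ipsilateral ear (higher
# intensity) fires more spikes and earlier.
ipsi = np.array([0.0102, 0.0115, 0.0131, 0.0150, 0.0172])
contra = np.array([0.0126, 0.0148, 0.0175])

count_diff, latency_diff = interaural_cues(ipsi, contra)
```

    Sensory habituation, as described in the abstract, shrinks the count difference over successive pulses (because the stronger ipsilateral response decrements more) while leaving the latency difference essentially intact.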

  14. On the Role of Working Memory in Spatial Contextual Cueing

    Science.gov (United States)

    Travis, Susan L.; Mattingley, Jason B.; Dux, Paul E.

    2013-01-01

    The human visual system receives more information than can be consciously processed. To overcome this capacity limit, we employ attentional mechanisms to prioritize task-relevant (target) information over less relevant (distractor) information. Regularities in the environment can facilitate the allocation of attention, as demonstrated by the…

  15. Neural entrainment to rhythmically-presented auditory, visual and audio-visual speech in children

    Directory of Open Access Journals (Sweden)

    Alan James Power

    2012-07-01

    Full Text Available Auditory cortical oscillations have been proposed to play an important role in speech perception. It is suggested that the brain may take temporal ‘samples’ of information from the speech stream at different rates, phase-resetting ongoing oscillations so that they are aligned with similar frequency bands in the input (‘phase locking’). Information from these frequency bands is then bound together for speech perception. To date, there are no explorations of neural phase-locking and entrainment to speech input in children. However, it is clear from studies of language acquisition that infants use both visual speech information and auditory speech information in learning. In order to study neural entrainment to speech in typically-developing children, we use a rhythmic entrainment paradigm (underlying 2 Hz or delta rate) based on repetition of the syllable ba, presented in either the auditory modality alone, the visual modality alone, or as auditory-visual speech (via a talking head). To ensure attention to the task, children aged 13 years were asked to press a button as fast as possible when the ba stimulus violated the rhythm for each stream type. Rhythmic violation depended on delaying the occurrence of a ba in the isochronous stream. Neural entrainment was demonstrated for all stream types, and individual differences in standardized measures of language processing were related to auditory entrainment at the theta rate. Further, there was significant modulation of the preferred phase of auditory entrainment in the theta band when visual speech cues were present, indicating cross-modal phase resetting. The rhythmic entrainment paradigm developed here offers a method for exploring individual differences in oscillatory phase locking during development. In particular, a method for assessing neural entrainment and cross-modal phase resetting would be useful for exploring developmental learning difficulties thought to involve temporal sampling.
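
    Neural entrainment ('phase locking') of this kind is often quantified as an inter-trial phase-locking value at the stimulation rate. A minimal sketch on simulated trials; the sampling rate, trial length, and noise-free sinusoidal "responses" are illustrative assumptions, following the 2-Hz paradigm only loosely:

```python
import numpy as np

def plv(trials, fs, freq):
    """Inter-trial phase-locking value at `freq`: extract each trial's
    phase from the nearest FFT bin and measure phase consistency
    across trials (1 = perfectly locked, near 0 = random phase)."""
    n = trials.shape[1]
    k = int(round(freq * n / fs))            # FFT bin at target frequency
    phases = np.angle(np.fft.fft(trials, axis=1)[:, k])
    return float(np.abs(np.mean(np.exp(1j * phases))))

fs, n, n_trials = 200, 400, 100              # 2-s simulated trials
t = np.arange(n) / fs
rng = np.random.default_rng(3)

# Entrained: 2-Hz response with a fixed phase across trials;
# non-entrained: the same response with a random phase per trial.
locked = np.array([np.sin(2 * np.pi * 2 * t + 0.3)
                   for _ in range(n_trials)])
scattered = np.array([np.sin(2 * np.pi * 2 * t + rng.uniform(0, 2 * np.pi))
                      for _ in range(n_trials)])

plv_locked = plv(locked, fs, 2.0)
plv_scattered = plv(scattered, fs, 2.0)
```

    Cross-modal phase resetting of the sort reported here would appear as a shift in the preferred phase (the angle of the mean phase vector) when visual cues are added, rather than as a change in PLV alone.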

  16. Is the auditory evoked P2 response a biomarker of learning?

    Science.gov (United States)

    Tremblay, Kelly L; Ross, Bernhard; Inoue, Kayo; McClannahan, Katrina; Collet, Gregory

    2014-01-01

    Even though auditory training exercises for humans have been shown to improve certain perceptual skills of individuals with and without hearing loss, there is a lack of knowledge pertaining to which aspects of training are responsible for the perceptual gains, and which aspects of perception are changed. To better define how auditory training impacts brain and behavior, electroencephalography (EEG) and magnetoencephalography (MEG) have been used to determine the time course and coincidence of cortical modulations associated with different types of training. Here we focus on P1-N1-P2 auditory evoked responses (AEP), as there are consistent reports of gains in P2 amplitude following various types of auditory training experiences; including music and speech-sound training. The purpose of this experiment was to determine if the auditory evoked P2 response is a biomarker of learning. To do this, we taught native English speakers to identify a new pre-voiced temporal cue that is not used phonemically in the English language so that coinciding changes in evoked neural activity could be characterized. To differentiate possible effects of repeated stimulus exposure and a button-pushing task from learning itself, we examined modulations in brain activity in a group of participants who learned to identify the pre-voicing contrast and compared it to participants, matched in time, and stimulus exposure, that did not. The main finding was that the amplitude of the P2 auditory evoked response increased across repeated EEG sessions for all groups, regardless of any change in perceptual performance. What's more, these effects are retained for months. Changes in P2 amplitude were attributed to changes in neural activity associated with the acquisition process and not the learned outcome itself. A further finding was the expression of a late negativity (LN) wave 600-900 ms post-stimulus onset, post-training exclusively for the group that learned to identify the pre-voiced contrast
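
    P2 amplitude changes of the kind tracked across sessions here are typically measured as the mean amplitude of the trial-averaged ERP in a post-stimulus window (roughly 150-250 ms). A sketch with simulated epochs; the window, peak latency, gain values, and noise level are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def p2_amplitude(epochs, fs, window=(0.150, 0.250)):
    """Mean amplitude of the trial-averaged ERP in a P2-like window
    (window given in seconds after stimulus onset)."""
    erp = epochs.mean(axis=0)                 # average across trials
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    return float(erp[i0:i1].mean())

fs, n = 500, 300                              # 0.6-s epochs at 500 Hz
t = np.arange(n) / fs
rng = np.random.default_rng(4)

def session(p2_gain, n_trials=60):
    """Simulated session: a Gaussian P2-like peak at 200 ms whose
    amplitude grows across sessions, plus trial noise."""
    peak = p2_gain * np.exp(-((t - 0.2) ** 2) / (2 * 0.02 ** 2))
    return peak + rng.normal(0, 1.0, (n_trials, n))

# Growing P2 across three repeated sessions, as in the exposure effect.
amps = [p2_amplitude(session(g), fs) for g in (1.0, 1.5, 2.0)]
```

    The study's point is precisely that such session-over-session amplitude growth appeared in all groups, so the measure alone cannot separate exposure effects from learning.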

  17. Is the auditory evoked P2 response a biomarker of learning?

    Directory of Open Access Journals (Sweden)

    Kelly Tremblay

    2014-02-01

    Full Text Available Even though auditory training exercises for humans have been shown to improve certain perceptual skills of individuals with and without hearing loss, there is a lack of knowledge pertaining to which aspects of training are responsible for the perceptual gains, and which aspects of perception are changed. To better define how auditory training impacts brain and behavior, electroencephalography (EEG) and magnetoencephalography (MEG) have been used to determine the time course and coincidence of cortical modulations associated with different types of training. Here we focus on P1-N1-P2 auditory evoked responses (AEPs), as there are consistent reports of gains in P2 amplitude following various types of auditory training experiences, including music and speech-sound training. The purpose of this experiment was to determine if the auditory evoked P2 response is a biomarker of learning. To do this, we taught native English speakers to identify a new pre-voiced temporal cue that is not used phonemically in the English language so that coinciding changes in evoked neural activity could be characterized. To differentiate possible effects of repeated stimulus exposure and a button-pushing task from learning itself, we compared modulations in brain activity in a group of participants who learned to identify the pre-voicing contrast with those in participants, matched in time and stimulus exposure, who did not. The main finding was that the amplitude of the P2 auditory evoked response increased across repeated EEG sessions for all groups, regardless of any change in perceptual performance. What's more, these effects were retained for months. Changes in P2 amplitude were attributed to changes in neural activity associated with the acquisition process and not the learned outcome itself. A further finding was the expression of a late negativity (LN) wave 600-900 ms post-stimulus onset, post-training, exclusively for the group that learned to identify the pre-voiced contrast.

  18. Contribution of auditory working memory to speech understanding in mandarin-speaking cochlear implant users.

    Directory of Open Access Journals (Sweden)

    Duoduo Tao

    Full Text Available To investigate how auditory working memory relates to speech perception performance in Mandarin-speaking cochlear implant (CI) users, auditory working memory and speech perception were measured in Mandarin-speaking CI and normal-hearing (NH) participants. Working memory capacity was measured using forward digit span and backward digit span; working memory efficiency was measured using articulation rate. Speech perception was assessed with: (a) word-in-sentence recognition in quiet, (b) word-in-sentence recognition in speech-shaped steady noise at +5 dB signal-to-noise ratio, (c) Chinese disyllable recognition in quiet, and (d) Chinese lexical tone recognition in quiet. Self-reported school rank regarding performance in schoolwork was also collected. There was large inter-subject variability in auditory working memory and speech performance for CI participants. Working memory and speech performance were significantly poorer for CI than for NH participants. All three working memory measures were strongly correlated with each other for both CI and NH participants. Partial correlation analyses were performed on the CI data while controlling for demographic variables. Working memory efficiency was significantly correlated only with sentence recognition in quiet when working memory capacity was partialled out. Working memory capacity was correlated with disyllable recognition and school rank when efficiency was partialled out. There was no correlation between working memory and lexical tone recognition in the present CI participants. Mandarin-speaking CI users experience significant deficits in auditory working memory and speech performance compared with NH listeners. The present data suggest that auditory working memory may contribute to CI users' difficulties in speech understanding.
The present pattern of results with Mandarin-speaking CI users is consistent with previous auditory working memory studies with English-speaking CI users, suggesting that the lexical
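
    The partial-correlation analysis described in this abstract (relating working memory to speech scores while controlling for demographic variables) can be sketched as a residual-based computation. This is an illustrative reconstruction, not the authors' code; the variable names and simulated data below are hypothetical.

```python
import numpy as np

def partial_corr(x, y, controls):
    """Pearson correlation of x and y after regressing out `controls`
    (an n-by-k matrix) from both via ordinary least squares."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    Z = np.column_stack([np.ones(len(x)), np.asarray(controls, float)])
    # Residualize x and y on the control variables.
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical data: both scores share variance with a demographic variable.
rng = np.random.default_rng(0)
age = rng.normal(30, 8, 60)                # demographic control
wm = 0.5 * age + rng.normal(size=60)       # working-memory score
speech = 0.5 * age + rng.normal(size=60)   # speech-recognition score
print(partial_corr(wm, speech, age[:, None]))  # near zero once age is removed
```

    The raw correlation between the two scores is high only because both depend on the control variable; partialling it out removes that shared component, which is the logic behind the capacity-versus-efficiency contrasts reported above.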

  19. Conceptual priming for realistic auditory scenes and for auditory words.

    Science.gov (United States)

    Frey, Aline; Aramaki, Mitsuko; Besson, Mireille

    2014-02-01

    Two experiments were conducted using both behavioral and event-related brain potential (ERP) methods to examine conceptual priming effects for realistic auditory scenes and for auditory words. Prime and target sounds were presented in four stimulus combinations: Sound-Sound, Word-Sound, Sound-Word and Word-Word. Within each combination, targets were conceptually related to the prime, unrelated, or ambiguous. In Experiment 1, participants were asked to judge whether the primes and targets fit together (explicit task), and in Experiment 2 they had to decide whether the target was typical or ambiguous (implicit task). In both experiments and in all four stimulus combinations, reaction times were longer and/or error rates higher, and the N400 component was larger, for ambiguous targets than for conceptually related targets, thereby pointing to a common conceptual system for processing auditory scenes and linguistic stimuli in both explicit and implicit tasks. However, fine-grained analyses also revealed some differences between experiments and conditions in the scalp topography and duration of the priming effects, possibly reflecting differences in the integration of perceptual and cognitive attributes of linguistic and nonlinguistic sounds. These results have clear implications for the building of virtual environments that need to convey meaning without words. PMID:24378910

  20. Gaze in Visual Search Is Guided More Efficiently by Positive Cues than by Negative Cues.

    Directory of Open Access Journals (Sweden)

    Günter Kugler

    Full Text Available Visual search can be accelerated when properties of the target are known. Such knowledge allows the searcher to direct attention to items sharing these properties. Recent work indicates that information about properties of non-targets (i.e., negative cues) can also guide search. In the present study, we examine whether negative cues lead to different search behavior compared to positive cues. We asked observers to search for a target defined by a certain shape singleton (broken line among solid lines). Each line was embedded in a colored disk. In "positive cue" blocks, participants were informed about possible colors of the target item. In "negative cue" blocks, the participants were informed about colors that could not contain the target. Search displays were designed such that with both the positive and negative cues, the same number of items could potentially contain the broken line ("relevant items"). Thus, both cues were equally informative. We measured response times and eye movements. Participants exhibited longer response times when provided with negative cues compared to positive cues. Although negative cues did guide the eyes to relevant items, there were marked differences in eye movements. Negative cues resulted in smaller proportions of fixations on relevant items, longer duration of fixations and in higher rates of fixations per item as compared to positive cues. The effectiveness of both cue types, as measured by fixations on relevant items, increased over the course of each search. In sum, a negative color cue can guide attention to relevant items, but it is less efficient than a positive cue of the same informational value.

  1. Auditory and visual distance perception: The proximity-image effect revisited

    Science.gov (United States)

    Zahorik, Pavel

    2003-04-01

    The proximity-image effect [M. B. Gardner, J. Acoust. Soc. Am. 43, 163 (1968)] describes a phenomenon in which the apparent distance of an auditory target is determined by the distance of the nearest plausible visual target rather than by acoustic distance cues. Here this effect is examined using a single visual target (an un-energized loudspeaker) and invisible virtual sound sources. These sources were synthesized from binaural impulse-response measurements at distances ranging from 1 to 5 m (0.25-m steps) in the semi-reverberant room (7.7 m×4.2 m×2.7 m) in which the experiment was conducted. Listeners (n=11) were asked whether or not the auditory target appeared to be at the same distance as the visual target. Within a block of trials, the visual target was placed at a fixed distance of 1.5, 3, or 4.5 m, and the auditory target varied randomly from trial-to-trial over the sample of measurement distances. The resulting psychometric functions are consistent with the proximity-image effect, and can be predicted using a simple model of sensory integration and decision in which perceived auditory space is both compressed in distance and has lower resolution than perceived visual space. [Work supported by NIH-NEI.]
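
    The "compressed in distance" component of the model above is commonly formalized in the auditory-distance literature as a power function of physical distance with an exponent below 1. The abstract does not give the study's equations, so the following is only a hedged sketch: the parameter values and noise level are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical compressive mapping from physical to perceived auditory
# distance, r' = k * r**a with a < 1 (a common modeling choice; the
# specific parameters here are illustrative only).
k_true, a_true = 1.3, 0.5
r = np.arange(1.0, 5.25, 0.25)    # source distances, 1-5 m in 0.25-m steps
rng = np.random.default_rng(1)
perceived = k_true * r**a_true * np.exp(rng.normal(0, 0.02, r.size))

# Recover k and a by least squares in log-log coordinates:
# log r' = log k + a * log r.
A = np.column_stack([np.ones(r.size), np.log(r)])
log_k, a_hat = np.linalg.lstsq(A, np.log(perceived), rcond=None)[0]
print(np.exp(log_k), a_hat)       # estimates close to (1.3, 0.5)
```

    An exponent well below 1 means far sources are perceived as disproportionately near, which is exactly the asymmetry that lets a nearby visual target capture the apparent location of a farther sound source.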

  2. Children's recognition of emotions from vocal cues

    NARCIS (Netherlands)

    D.A. Sauter; C. Panattoni; F. Happé

    2013-01-01

    Emotional cues contain important information about the intentions and feelings of others. Despite a wealth of research into children's understanding of facial signals of emotions, little research has investigated the developmental trajectory of interpreting affective cues in the voice. In this study

  3. Guiding Attention by Cooperative Cues

    Institute of Scientific and Technical Information of China (English)

    KangWoo Lee

    2008-01-01

    A common assumption in visual attention is based on the rationale of "limited capacity of information processing". From this viewpoint there is little consideration of how different information channels or modules cooperate, because cells in processing stages are forced to compete for the limited resource. To examine the mechanism behind the cooperative behavior of information channels, a computational model of selective attention is implemented based on two hypotheses. Unlike the traditional view of visual attention, the cooperative behavior is assumed to be a dynamic integration process between bottom-up and top-down information. Furthermore, top-down information is assumed to provide a contextual cue during the selection process and to guide the attentional allocation among many bottom-up candidates. The results from a series of simulations with still and video images showed some interesting properties that could not be explained by the competitive aspect of selective attention alone.

  4. Psychophysiological responses to auditory change.

    Science.gov (United States)

    Chuen, Lorraine; Sears, David; McAdams, Stephen

    2016-06-01

    A comprehensive characterization of autonomic and somatic responding within the auditory domain is currently lacking. We studied whether simple types of auditory change that occur frequently during music listening could elicit measurable changes in heart rate, skin conductance, respiration rate, and facial motor activity. Participants heard a rhythmically isochronous sequence consisting of a repeated standard tone, followed by a repeated target tone that changed in pitch, timbre, duration, intensity, or tempo, or that deviated momentarily from rhythmic isochrony. Changes in all parameters produced increases in heart rate. Skin conductance response magnitude was affected by changes in timbre, intensity, and tempo. Respiratory rate was sensitive to deviations from isochrony. Our findings suggest that music researchers interpreting physiological responses as emotional indices should consider acoustic factors that may influence physiology in the absence of induced emotions. PMID:26927928

  5. Reality of auditory verbal hallucinations

    Science.gov (United States)

    Valkonen-Korhonen, Minna; Holi, Matti; Therman, Sebastian; Lehtonen, Johannes; Hari, Riitta

    2009-01-01

    Distortion of the sense of reality, actualized in delusions and hallucinations, is the key feature of psychosis, but the underlying neuronal correlates remain largely unknown. We studied 11 highly functioning subjects with schizophrenia or schizoaffective disorder while they rated the reality of auditory verbal hallucinations (AVH) during functional magnetic resonance imaging (fMRI). The subjective reality of AVH correlated strongly and specifically with the hallucination-related activation strength of the inferior frontal gyri (IFG), including Broca's language region. Furthermore, how real the experienced hallucination felt depended on the hallucination-related coupling between the IFG, the ventral striatum, the auditory cortex, the right posterior temporal lobe, and the cingulate cortex. Our findings suggest that the subjective reality of AVH is related to motor mechanisms of speech comprehension, with contributions from sensory and salience-detection-related brain regions as well as circuitries related to self-monitoring and the experience of agency. PMID:19620178

  6. Research of Visual and Auditory Stimulation Based on Environment Reset on Hemi-spatial Neglect in Patients with Cerebral Apoplexy

    Institute of Scientific and Technical Information of China (English)

    韩宇花; 陶希; 邓景贵; 刘佳; 宋涛; 何娟

    2014-01-01

    Objective: To explore the effect of visual and auditory stimulation based on environment reset on hemi-spatial neglect (HSN) in patients with cerebral apoplexy. Methods: A total of 49 patients with cerebral apoplexy combined with HSN treated between March 2010 and September 2012 were randomly divided into a control group (n=22) and an observation group (n=27). Conventional therapy was given to both groups. No special requirements were placed on the ward or rehabilitation environment of the control group, whereas the ward beds and rehabilitation environment of the observation group were reset. The degree of HSN was assessed with the line bisection (LB) and line cancellation (LC) tests; neurological impairment was evaluated with the National Institutes of Health Stroke Scale (NIHSS), and activities of daily living (ADL) were evaluated with the modified Barthel Index (MBI) before treatment and after 4 and 8 weeks of treatment. Results: At 4 and 8 weeks of treatment, LB, LC, and NIHSS scores in both groups were lower, and MBI scores higher, than before treatment (P<0.05). At 8 weeks, LB and NIHSS scores in both groups, and LC in the observation group, were lower than at 4 weeks, and MBI scores were higher; LB and LC in the observation group were lower, and MBI scores higher, than in the control group (P<0.05). Conclusion: Visual and auditory stimulation based on environment reset benefits patients with HSN after cerebral apoplexy and can improve ADL, although its effect on neurological impairment may be small.

  8. Cue abstraction and exemplar memory in categorization.

    Science.gov (United States)

    Juslin, Peter; Jones, Sari; Olsson, Henrik; Winman, Anders

    2003-09-01

    In this article, the authors compare 3 generic models of the cognitive processes in a categorization task. The cue abstraction model implies abstraction in training of explicit cue-criterion relations that are mentally integrated to form a judgment, the lexicographic heuristic uses only the most valid cue, and the exemplar-based model relies on retrieval of exemplars. The results from 2 experiments showed that, in lieu of the lexicographic heuristic, most participants spontaneously integrate cues. In contrast to single-system views, exemplar memory appeared to dominate when the feedback was poor, but when the feedback was rich enough to allow the participants to discern the task structure, it was exploited for abstraction of explicit cue-criterion relations. PMID:14516225
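
    The contrast drawn above between cue abstraction and exemplar memory can be made concrete with toy implementations: linear integration of explicit cue-criterion weights versus similarity-weighted retrieval of stored exemplars (a GCM-style rule). All values below are hypothetical and only illustrate the two decision rules, not the authors' fitted models.

```python
import numpy as np

def cue_abstraction(cues, weights, intercept=0.0):
    """Judgment as a weighted linear integration of explicit cue values."""
    return float(intercept + np.dot(cues, weights))

def exemplar_judgment(probe, exemplars, criteria, sensitivity=2.0):
    """Similarity-weighted average of stored exemplar criteria (GCM-style):
    similarity decays exponentially with city-block distance to the probe."""
    d = np.abs(exemplars - probe).sum(axis=1)
    s = np.exp(-sensitivity * d)
    return float(np.dot(s, criteria) / s.sum())

# Hypothetical training set: two binary cues and the criteria learned for them.
exemplars = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
criteria = np.array([10.0, 12.0, 14.0, 16.0])
probe = np.array([1.0, 1.0])

print(cue_abstraction(probe, weights=np.array([4.0, 2.0]), intercept=10.0))  # 16.0
print(exemplar_judgment(probe, exemplars, criteria))  # pulled toward the nearest exemplar's 16
```

    The lexicographic heuristic mentioned in the abstract would amount to reading off only the most valid cue (here, the first column) and ignoring the rest.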

  9. Kin-informative recognition cues in ants

    DEFF Research Database (Denmark)

    Nehring, Volker; Evison, Sophie E F; Santorelli, Lorenzo A;

    2011-01-01

    behaviour is thought to be rare in one of the classic examples of cooperation--social insect colonies--because the colony-level costs of individual selfishness select against cues that would allow workers to recognize their closest relatives. In accord with this, previous studies of wasps and ants have found little or no kin information in recognition cues. Here, we test the hypothesis that social insects do not have kin-informative recognition cues by investigating the recognition cues and relatedness of workers from four colonies of the ant Acromyrmex octospinosus. Contrary to the theoretical prediction, we show that the cuticular hydrocarbons of ant workers in all four colonies are informative enough to allow full-sisters to be distinguished from half-sisters with a high accuracy. These results contradict the hypothesis of non-heritable recognition cues and suggest that there is more potential...

  10. Auditory distraction and serial memory

    OpenAIRE

    Jones, D M; Hughes, Rob; Macken, W.J.

    2010-01-01

    One mental activity that is very vulnerable to auditory distraction is serial recall. This review of the contemporary findings relating to serial recall charts the key determinants of distraction. It is evident that there is one form of distraction that is a joint product of the cognitive characteristics of the task and of the obligatory cognitive processing of the sound. For sequences of sound, distraction appears to be an ineluctable product of similarity-of-process, specifically, the seria...

  12. Music As a Sacred Cue? Effects of Religious Music on Moral Behavior.

    Science.gov (United States)

    Lang, Martin; Mitkidis, Panagiotis; Kundt, Radek; Nichols, Aaron; Krajčíková, Lenka; Xygalatas, Dimitris

    2016-01-01

    Religion can have an important influence in moral decision-making, and religious reminders may deter people from unethical behavior. Previous research indicated that religious contexts may increase prosocial behavior and reduce cheating. However, the perceptual-behavioral link between religious contexts and decision-making lacks thorough scientific understanding. This study adds to the current literature by testing the effects of purely audial religious symbols (instrumental music) on moral behavior across three different sites: Mauritius, the Czech Republic, and the USA. Participants were exposed to one of three kinds of auditory stimuli (religious, secular, or white noise), and subsequently were given a chance to dishonestly report on solved mathematical equations in order to increase their monetary reward. The results showed cross-cultural differences in the effects of religious music on moral behavior, as well as a significant interaction between condition and religiosity across all sites, suggesting that religious participants were more influenced by the auditory religious stimuli than non-religious participants. We propose that religious music can function as a subtle cue associated with moral standards via cultural socialization and ritual participation. Such associative learning can charge music with specific meanings and create sacred cues that influence normative behavior. Our findings provide preliminary support for this view, which we hope further research will investigate more closely. PMID:27375515

  15. Auditory sequence analysis and phonological skill.

    Science.gov (United States)

    Grube, Manon; Kumar, Sukhbinder; Cooper, Freya E; Turton, Stuart; Griffiths, Timothy D

    2012-11-01

    This work tests the relationship between auditory and phonological skill in a non-selected cohort of 238 school students (age 11) with the specific hypothesis that sound-sequence analysis would be more relevant to phonological skill than the analysis of basic, single sounds. Auditory processing was assessed across the domains of pitch, time and timbre; a combination of six standard tests of literacy and language ability was used to assess phonological skill. A significant correlation between general auditory and phonological skill was demonstrated, plus a significant, specific correlation between measures of phonological skill and the auditory analysis of short sequences in pitch and time. The data support a limited but significant link between auditory and phonological ability with a specific role for sound-sequence analysis, and provide a possible new focus for auditory training strategies to aid language development in early adolescence. PMID:22951739

  16. Absence of cue-recruitment for extrinsic signals: sounds, spots, and swirling dots fail to influence perceived 3D rotation direction after training.

    Directory of Open Access Journals (Sweden)

    Anshul Jain

    Full Text Available The visual system can learn to use information in new ways to construct appearance. Thus, signals such as the location or translation direction of an ambiguously rotating wire frame cube, which are normally uninformative, can be learned as cues to determine the rotation direction. This perceptual learning occurs when the formerly uninformative signal is statistically associated with long-trusted visual cues (such as binocular disparity) that disambiguate appearance during training. In previous demonstrations, the newly learned cue was intrinsic to the perceived object, in that the signal was conveyed by the same image elements as the object itself. Here we used extrinsic new signals and observed no learning. We correlated three new signals with long-trusted cues in the rotating cube paradigm: one crossmodal (an auditory signal) and two within modality (visual). Cue recruitment did not occur in any of these conditions, either in single sessions or in ten sessions across as many days. These results suggest that the intrinsic/extrinsic distinction is important for the perceptual system in determining whether it can learn and use new information from the environment to construct appearance. Extrinsic cues do have perceptual effects (e.g. the "bounce-pass" illusion and McGurk effect), so we speculate that extrinsic signals must be recruited for perception, but only if certain conditions are met. These conditions might specify the age of the observer, the strength of the long-trusted cues, or the amount of exposure to the correlation.

  17. Synchronization and leadership in string quartet performance: a case study of auditory and visual cues

    Directory of Open Access Journals (Sweden)

    Renee Timmers

    2014-06-01

    Full Text Available Temporal coordination between members of a string quartet was investigated across repeated performances of an excerpt of Haydn's string quartet in G Major, Op. 77 No. 1. Cross-correlations between the players' interbeat intervals at different lags showed a unidirectional dependence of Viola on Violin I, and of Violin I on Cello. Bidirectional dependence was observed for the relationships between Violin II and Cello and between Violin II and Viola. The dependencies players reported for themselves after the performances reflected these measured dependencies more closely than the dependencies attributed to them by the other players, which instead showed the more typical leader-follower pattern in which Violin I leads. On the other hand, primary leadership from Violin I was observed in an analysis of the bow-speed characteristics preceding the first tone onset: the anticipatory movement of Violin I set the tempo of the excerpt. Taken together, the results show a more complex and differentiated pattern of dependencies than expected from a traditional role division of leadership, suggesting several avenues for further research.
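
    The lag-based cross-correlation analysis described above (whether one player's interbeat intervals predict another's on the following beat) can be sketched as follows. The interbeat-interval series here are simulated to show the logic, not the Haydn performance data; a high correlation at lag +1 of B against A suggests B is tracking A's previous beat.

```python
import numpy as np

def lagged_corr(a, b, lag):
    """Correlation between a[t] and b[t + lag]; a positive lag asks
    whether b follows a by `lag` beats."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    elif lag < 0:
        a, b = a[-lag:], b[:lag]
    return float(np.corrcoef(a, b)[0, 1])

# Simulated interbeat intervals: the follower copies the leader's last beat.
rng = np.random.default_rng(2)
leader = rng.normal(0.5, 0.02, 200)                       # intervals in seconds
follower = 0.8 * np.roll(leader, 1) + rng.normal(0, 0.005, 200)
follower[0] = 0.4                                         # first beat has no predecessor

print(lagged_corr(leader, follower, 1))   # high: follower tracks leader's previous beat
print(lagged_corr(leader, follower, -1))  # near zero: leader does not track follower
```

    Asymmetry between the two lags is what licenses the "unidirectional dependence" language in the abstract; roughly equal correlations at both lags would instead indicate bidirectional coupling.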

  18. Self-generated auditory feedback as a cue to support rhythmic motor stability

    DEFF Research Database (Denmark)

    Krupenia, Stas S.; Hoffmann, Pablo F.; Zalmanov, Hagar;

    2011-01-01

    A goal of the SKILLS project is to develop Virtual Reality (VR)-based training simulators for different application domains, one of which is juggling. Within this context, the value of multimodal VR environments for skill acquisition is investigated. In this study, we investigated whether it was n...

  19. Early blindness results in a degraded auditory map of space in the optic tectum of the barn owl.

    OpenAIRE

    Knudsen, E I

    1988-01-01

    The optic tectum of the barn owl (Tyto alba) contains a neural map of auditory space consisting of neurons that are sharply tuned for sound source location and organized precisely according to their spatial tuning. The importance of vision for the development of this auditory map was investigated by comparing space maps measured in normal owls with those measured in owls raised with both eyelids sutured closed. The results demonstrate that owls raised without sight, but with normal hearing, d...

  20. Effect of dual sensory loss on auditory localization: implications for intervention.

    Science.gov (United States)

    Simon, Helen J; Levitt, Harry

    2007-12-01

    Our sensory systems are remarkable in several respects. They are extremely sensitive, they each perform more than one function, and they interact in a complementary way, thereby providing a high degree of redundancy that is particularly helpful should one or more sensory systems be impaired. In this article, the problem of dual hearing and vision loss is addressed. A brief description is provided on the use of auditory cues in vision loss, the use of visual cues in hearing loss, and the additional difficulties encountered when both sensory systems are impaired. A major focus of this article is the use of sound localization by normal-hearing, hearing-impaired, and blind individuals and the special problem of sound localization in people with dual sensory loss. PMID:18003869

  1. Speech distortion measure based on auditory properties

    Institute of Scientific and Technical Information of China (English)

    CHEN Guo; HU Xiulin; ZHANG Yunyu; ZHU Yaoting

    2000-01-01

    The Perceptual Spectrum Distortion (PSD) measure, based on the auditory properties of human hearing, is presented to measure speech distortion. The PSD measure calculates the speech distortion distance by simulating human auditory properties and converting the short-time speech power spectrum to an auditory perceptual spectrum. Preliminary simulative experiments in comparison with the Itakura measure have been done. The results show that the PSD measure is a preferable speech distortion measure and more consistent with subjective assessment of speech quality.
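
    The PSD algorithm itself is not detailed in this record, but the pipeline it describes (short-time power spectrum pooled onto an auditory frequency scale, then compared between clean and distorted speech) can be sketched as follows. This is an illustrative toy, not the published measure; the Bark-scale formula, the equal-width band pooling, and the Euclidean distance are all assumptions:

```python
import numpy as np

def bark_scale(freq_hz):
    """Approximate Bark critical-band number for a frequency in Hz
    (a common Zwicker-style approximation; an assumption here)."""
    return 13.0 * np.arctan(0.00076 * freq_hz) + \
        3.5 * np.arctan((freq_hz / 7500.0) ** 2)

def perceptual_spectrum(frame, sample_rate, n_bands=20):
    """Convert one short-time frame to a crude 'auditory' spectrum:
    power spectrum pooled into equal-width Bark bands, log-compressed."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    barks = bark_scale(freqs)
    edges = np.linspace(barks[0], barks[-1], n_bands + 1)
    bands = np.array([
        spectrum[(barks >= lo) & (barks < hi)].sum() + 1e-12
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    return np.log(bands)

def spectral_distortion(frame_a, frame_b, sample_rate):
    """RMS distance between the two perceptual spectra."""
    pa = perceptual_spectrum(frame_a, sample_rate)
    pb = perceptual_spectrum(frame_b, sample_rate)
    return float(np.sqrt(np.mean((pa - pb) ** 2)))
```

    A frame compared with itself yields zero distortion, while added noise raises the distance, which is the qualitative behavior any such measure must have.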

  2. Auditory stimulation and cardiac autonomic regulation

    OpenAIRE

    Vitor E Valenti; Guida, Heraldo L.; Frizzo, Ana C F; Cardoso, Ana C. V.; Vanderlei, Luiz Carlos M; Luiz Carlos de Abreu

    2012-01-01

    Previous studies have already demonstrated that auditory stimulation with music influences the cardiovascular system. In this study, we described the relationship between musical auditory stimulation and heart rate variability. Searches were performed with the Medline, SciELO, Lilacs and Cochrane databases using the following keywords: "auditory stimulation", "autonomic nervous system", "music" and "heart rate variability". The selected studies indicated that there is a strong correlation bet...

  3. Mechanisms of Auditory Verbal Hallucination in Schizophrenia

    OpenAIRE

    Raymond eCho; Wayne eWu

    2013-01-01

    Recent work on the mechanisms underlying auditory verbal hallucination (AVH) has been heavily informed by self-monitoring accounts that postulate defects in an internal monitoring mechanism as the basis of AVH. A more neglected alternative is an account focusing on defects in auditory processing, namely a spontaneous activation account of auditory activity underlying AVH. Science is often aided by putting theories in competition. Accordingly, a discussion that systematically contrasts the two...

  4. Visual Cues Generated during Action Facilitate 14-Month-Old Infants' Mental Rotation

    Science.gov (United States)

    Antrilli, Nick K.; Wang, Su-hua

    2016-01-01

    Although action experience has been shown to enhance the development of spatial cognition, the mechanism underlying the effects of action is still unclear. The present research examined the role of visual cues generated during action in promoting infants' mental rotation. We sought to clarify the underlying mechanism by decoupling different…

  5. Auditory Training and Its Effects upon the Auditory Discrimination and Reading Readiness of Kindergarten Children.

    Science.gov (United States)

    Cullen, Minga Mustard

    The purpose of this investigation was to evaluate the effects of a systematic auditory training program on the auditory discrimination ability and reading readiness of 55 white, middle/upper middle class kindergarten students. Following pretesting with the "Wepman Auditory Discrimination Test,""The Clymer-Barrett Prereading Battery," and the…

  6. Effects of Methylphenidate (Ritalin) on Auditory Performance in Children with Attention and Auditory Processing Disorders.

    Science.gov (United States)

    Tillery, Kim L.; Katz, Jack; Keller, Warren D.

    2000-01-01

    A double-blind, placebo-controlled study examined effects of methylphenidate (Ritalin) on auditory processing in 32 children with both attention deficit hyperactivity disorder and central auditory processing (CAP) disorder. Analyses revealed that Ritalin did not have a significant effect on any of the central auditory processing measures, although…

  7. Seeing the song: left auditory structures may track auditory-visual dynamic alignment.

    Directory of Open Access Journals (Sweden)

    Julia A Mossbridge

    Full Text Available Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results appear to suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment.
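
    For readers unfamiliar with ASSRs: the response is conventionally quantified as the spectral amplitude at the stimulation ("tagging") frequency. The following is a minimal toy illustration on synthetic data, not the study's MEG pipeline; the 40 Hz tag, noise level, and recording length are assumptions:

```python
import numpy as np

def assr_amplitude(signal, sample_rate, tag_freq):
    """Amplitude of the steady-state response at the tagging frequency,
    read from the FFT bin closest to tag_freq."""
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(spectrum[np.argmin(np.abs(freqs - tag_freq))])

# Toy 'recording': a 40 Hz steady-state component buried in noise.
rng = np.random.default_rng(1)
sr, dur = 1000, 2.0
t = np.arange(int(sr * dur)) / sr
eeg = 1.0 * np.sin(2 * np.pi * 40 * t) + 0.5 * rng.normal(size=t.size)
```

    The amplitude at the tag frequency stands far above neighboring (noise-only) bins, which is what makes ASSRs a robust index of cortical tracking.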

  8. Central auditory function of deafness genes.

    Science.gov (United States)

    Willaredt, Marc A; Ebbers, Lena; Nothwang, Hans Gerd

    2014-06-01

    The highly variable benefit of hearing devices is a serious challenge in auditory rehabilitation. Various factors contribute to this phenomenon such as the diversity in ear defects, the different extent of auditory nerve hypoplasia, the age of intervention, and cognitive abilities. Recent analyses indicate that, in addition, central auditory functions of deafness genes have to be considered in this context. Since reduced neuronal activity acts as the common denominator in deafness, it is widely assumed that peripheral deafness influences development and function of the central auditory system in a stereotypical manner. However, functional characterization of transgenic mice with mutated deafness genes demonstrated gene-specific abnormalities in the central auditory system as well. A frequent function of deafness genes in the central auditory system is supported by a genome-wide expression study that revealed significant enrichment of these genes in the transcriptome of the auditory brainstem compared to the entire brain. Here, we will summarize current knowledge of the diverse central auditory functions of deafness genes. We furthermore propose the intimately interwoven gene regulatory networks governing development of the otic placode and the hindbrain as a mechanistic explanation for the widespread expression of these genes beyond the cochlea. We conclude that better knowledge of central auditory dysfunction caused by genetic alterations in deafness genes is required. In combination with improved genetic diagnostics becoming currently available through novel sequencing technologies, this information will likely contribute to better outcome prediction of hearing devices.

  9. The perception of speech modulation cues in lexical tones is guided by early language-specific experience

    Directory of Open Access Journals (Sweden)

    Laurianne eCabrera

    2015-08-01

    Full Text Available A number of studies showed that infants reorganize their perception of speech sounds according to their native language categories during their first year of life. Still, information is lacking about the contribution of basic auditory mechanisms to this process. This study aimed to evaluate when native language experience starts to noticeably affect the perceptual processing of basic acoustic cues (i.e., frequency-modulation (FM) and amplitude-modulation (AM) information) known to be crucial for speech perception in adults. The discrimination of a lexical-tone contrast (rising versus low) was assessed in 6- and 10-month-old infants learning either French or Mandarin using a visual habituation paradigm. The lexical tones were presented in two conditions designed to either keep intact or to severely degrade the FM and fine spectral cues needed to accurately perceive voice-pitch trajectory. A third condition was designed to assess the discrimination of the same voice-pitch trajectories using click trains containing only the FM cues related to the fundamental frequency (F0) in French- and Mandarin-learning 10-month-old infants. Results showed that the younger infants of both language groups and the Mandarin-learning 10-month-olds discriminated the intact lexical-tone contrast while French-learning 10-month-olds failed. However, only the French 10-month-olds discriminated degraded lexical tones when FM, and thus voice-pitch cues were reduced. Moreover, Mandarin-learning 10-month-olds were found to discriminate the pitch trajectories as presented in click trains better than French infants. Altogether, these results reveal that the perceptual reorganization occurring during the first year of life for lexical tones is coupled with changes in the auditory ability to use speech modulation cues.
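
    The AM/FM distinction used here can be made concrete with the analytic-signal decomposition commonly used in vocoder studies of this kind: the envelope of the analytic signal carries the AM cue, and its instantaneous frequency carries the FM cue. This sketch is illustrative, not the authors' processing chain; the FFT-based Hilbert construction mirrors what `scipy.signal.hilbert` computes:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert construction
    (zero negative frequencies, double positive ones)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def am_fm_decompose(x, sample_rate):
    """Split a narrow-band signal into its AM envelope and its
    instantaneous frequency in Hz (the FM cue)."""
    z = analytic_signal(x)
    am = np.abs(z)                                    # AM envelope
    phase = np.unwrap(np.angle(z))
    fm = np.diff(phase) * sample_rate / (2 * np.pi)   # inst. frequency
    return am, fm
```

    For a pure tone the envelope is flat and the instantaneous frequency equals the tone frequency; degrading the FM cue, as in the study's stimuli, amounts to replacing `fm` with a flattened trajectory while keeping `am`.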

  10. Functional neuroanatomy of spatial sound processing in Alzheimer's disease.

    Science.gov (United States)

    Golden, Hannah L; Agustus, Jennifer L; Nicholas, Jennifer M; Schott, Jonathan M; Crutch, Sebastian J; Mancini, Laura; Warren, Jason D

    2016-03-01

    Deficits of auditory scene analysis accompany Alzheimer's disease (AD). However, the functional neuroanatomy of spatial sound processing has not been defined in AD. We addressed this using a "sparse" fMRI virtual auditory spatial paradigm in 14 patients with typical AD in relation to 16 healthy age-matched individuals. Sound stimulus sequences discretely varied perceived spatial location and pitch of the sound source in a factorial design. AD was associated with loss of differentiated cortical profiles of auditory location and pitch processing at the prescribed threshold, and significant group differences were identified for processing auditory spatial variation in posterior cingulate cortex (controls > AD) and the interaction of pitch and spatial variation in posterior insula (AD > controls). These findings build on emerging evidence for altered brain mechanisms of auditory scene analysis and suggest complex dysfunction of network hubs governing the interface of internal milieu and external environment in AD. Auditory spatial processing may be a sensitive probe of this interface and contribute to characterization of brain network failure in AD and other neurodegenerative syndromes. PMID:26923412

  11. Auditory Brainstem Response to Complex Sounds Predicts Self-Reported Speech-in-Noise Performance

    Science.gov (United States)

    Anderson, Samira; Parbery-Clark, Alexandra; White-Schwoch, Travis; Kraus, Nina

    2013-01-01

    Purpose: To compare the ability of the auditory brainstem response to complex sounds (cABR) to predict subjective ratings of speech understanding in noise on the Speech, Spatial, and Qualities of Hearing Scale (SSQ; Gatehouse & Noble, 2004) relative to the predictive ability of the Quick Speech-in-Noise test (QuickSIN; Killion, Niquette,…

  12. Sequential assessment of prey through the use of multiple sensory cues by an eavesdropping bat

    Science.gov (United States)

    Page, Rachel A.; Schnelle, Tanja; Kalko, Elisabeth K. V.; Bunge, Thomas; Bernal, Ximena E.

    2012-06-01

    Predators are often confronted with a broad diversity of potential prey. They rely on cues associated with prey quality and palatability to optimize their hunting success and to avoid consuming toxic prey. Here, we investigate a predator's ability to assess prey cues during capture, handling, and consumption when confronted with conflicting information about prey quality. We used advertisement calls of a preferred prey item (the túngara frog) to attract fringe-lipped bats, Trachops cirrhosus, then offered palatable, poisonous, and chemically manipulated anurans as prey. Advertisement calls elicited an attack response, but as bats approached, they used additional sensory cues in a sequential manner to update their information about prey size and palatability. While both palatable and poisonous small anurans were readily captured, large poisonous toads were approached but not contacted, suggesting the use of echolocation for assessment of prey size at close range. Once prey was captured, bats used chemical cues to make final, post-capture decisions about whether to consume the prey. Bats dropped small, poisonous toads as well as palatable frogs coated in toad toxins either immediately or shortly after capture. Our study suggests that echolocation and chemical cues obtained at close range supplement information obtained from acoustic cues at long range. Updating information about prey quality minimizes the occurrence of costly errors and may be advantageous in tracking temporal and spatial fluctuations of prey and exploiting novel food sources. These findings emphasize the sequential, complex nature of prey assessment that may allow exploratory and flexible hunting behaviors.

  13. It Depends Who Is Watching You: 3-D Agent Cues Increase Fairness.

    Science.gov (United States)

    Krátký, Jan; McGraw, John J; Xygalatas, Dimitris; Mitkidis, Panagiotis; Reddish, Paul

    2016-01-01

    Laboratory and field studies have demonstrated that exposure to cues of intentional agents in the form of eyes can increase prosocial behavior. However, previous research mostly used 2-dimensional depictions as experimental stimuli. Thus far no study has examined the influence of the spatial properties of agency cues on this prosocial effect. To investigate the role of dimensionality of agency cues on fairness, 345 participants engaged in a decision-making task in a naturalistic setting. The experimental treatment included a 3-dimensional pseudo-realistic model of a human head and a 2-dimensional picture of the same object. The control stimuli consisted of a real plant and its 2-D image. Our results partly support the findings of previous studies that cues of intentional agents increase prosocial behavior. However, this effect was only found for the 3-D cues, suggesting that dimensionality is a critical variable in triggering these effects in real-world settings. Our research sheds light on a hitherto unexplored aspect of the effects of environmental cues and their morphological properties on decision-making. PMID:26859562

  14. It Depends Who Is Watching You: 3-D Agent Cues Increase Fairness.

    Directory of Open Access Journals (Sweden)

    Jan Krátký

    Full Text Available Laboratory and field studies have demonstrated that exposure to cues of intentional agents in the form of eyes can increase prosocial behavior. However, previous research mostly used 2-dimensional depictions as experimental stimuli. Thus far no study has examined the influence of the spatial properties of agency cues on this prosocial effect. To investigate the role of dimensionality of agency cues on fairness, 345 participants engaged in a decision-making task in a naturalistic setting. The experimental treatment included a 3-dimensional pseudo-realistic model of a human head and a 2-dimensional picture of the same object. The control stimuli consisted of a real plant and its 2-D image. Our results partly support the findings of previous studies that cues of intentional agents increase prosocial behavior. However, this effect was only found for the 3-D cues, suggesting that dimensionality is a critical variable in triggering these effects in real-world settings. Our research sheds light on a hitherto unexplored aspect of the effects of environmental cues and their morphological properties on decision-making.

  15. Rapid context-based identification of target sounds in an auditory scene

    Science.gov (United States)

    Gamble, Marissa L.; Woldorff, Marty G.

    2015-01-01

    To make sense of our dynamic and complex auditory environment, we must be able to parse the sensory input into usable parts and pick out relevant sounds from all the potentially distracting auditory information. While it is unclear exactly how we accomplish this difficult task, Gamble and Woldorff (2014) recently reported an ERP study of an auditory target-search task in a temporally and spatially distributed, rapidly presented, auditory scene. They reported an early, differential, bilateral activation (beginning ~60 ms) between feature-deviating Target stimuli and physically equivalent feature-deviating Nontargets, reflecting a rapid Target-detection process. This was followed shortly later (~130 ms) by the lateralized N2ac ERP activation, reflecting the focusing of auditory spatial attention toward the Target sound and paralleling attentional-shifting processes widely studied in vision. Here we directly examined the early, bilateral, Target-selective effect to better understand its nature and functional role. Participants listened to midline-presented sounds that included Target and Nontarget stimuli that were randomly either embedded in a brief rapid stream or presented alone. The results indicate that this early bilateral effect results from a template for the Target that utilizes its feature deviancy within a stream to enable rapid identification. Moreover, individual-differences analysis showed that the size of this effect was larger for subjects with faster response times. The findings support the hypothesis that our auditory attentional systems can implement and utilize a context-based relational template for a Target sound, making use of additional auditory information in the environment when needing to rapidly detect a relevant sound. PMID:25848684

  16. Action experience changes attention to kinematic cues

    Directory of Open Access Journals (Sweden)

    Courtney eFilippi

    2016-02-01

    Full Text Available The current study used remote corneal reflection eye-tracking to examine the relationship between motor experience and action anticipation in 13-month-old infants. To measure online anticipation of actions, infants watched videos where the actor's hand provided kinematic information (in its orientation) about the type of object that the actor was going to reach for. The actor's hand orientation either matched the orientation of a rod (congruent cue) or did not match the orientation of the rod (incongruent cue). To examine relations between motor experience and action anticipation, we used a 2 (reach first vs. observe first) x 2 (congruent kinematic cue vs. incongruent kinematic cue) between-subjects design. We show that 13-month-old infants in the observe-first condition spontaneously generate rapid online visual predictions to congruent hand orientation cues and do not visually anticipate when presented with incongruent cues. We further demonstrate that the speed with which these infants generate predictions to congruent motor cues is correlated with their own ability to pre-shape their hands. Finally, we demonstrate that following reaching experience, infants generate rapid predictions to both congruent and incongruent hand shape cues, suggesting that short-term experience changes attention to kinematics.

  17. When unreliable cues are good enough.

    Science.gov (United States)

    Donaldson-Matasci, Matina C; Bergstrom, Carl T; Lachmann, Michael

    2013-09-01

    In many species, nongenetic phenotypic variation helps mitigate risk associated with an uncertain environment. In some cases, developmental cues can be used to match phenotype to environment-a strategy known as predictive plasticity. When environmental conditions are entirely unpredictable, generating random phenotypic diversity may improve the long-term success of a lineage-a strategy known as diversified bet hedging. When partially reliable information is available, a well-adapted developmental strategy may strike a balance between the two strategies. We use information theory to analyze a model of development in an uncertain environment, where cue reliability is affected by variation both within and between generations. We show that within-generation variation in cues decreases the reliability of cues without affecting their fitness value. This transpires because the optimal balance of predictive plasticity and diversified bet hedging is unchanged. However, within-generation variation in cues does change the developmental mechanisms used to create that balance: developmental sensitivity to such cues not only helps match phenotype to environment but also creates phenotypic diversity that may be useful for hedging bets against environmental change. Understanding the adaptive role of developmental sensitivity thus depends on a proper assessment of both the predictive power and the structure of variation in environmental cues. PMID:23933723
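
    The balance between predictive plasticity and diversified bet hedging described above can be illustrated with a minimal two-environment model. This is a toy sketch, not the authors' information-theoretic analysis; the fitness payoffs and the grid search over strategies are assumptions. A lineage commits a fraction p of offspring to the phenotype indicated by a cue of given reliability, and long-run success is the expected log growth rate:

```python
import numpy as np

def long_run_growth(p, reliability, w_match=1.0, w_mismatch=0.1):
    """Expected log growth rate of a lineage committing a fraction p of
    offspring to the cue-indicated phenotype, in a two-environment world
    where matched phenotypes have fitness w_match and mismatched ones
    w_mismatch."""
    good = np.log(p * w_match + (1 - p) * w_mismatch)   # cue was right
    bad = np.log(p * w_mismatch + (1 - p) * w_match)    # cue was wrong
    return reliability * good + (1 - reliability) * bad

def best_fraction(reliability):
    """Grid-search the cue-following fraction that maximizes growth."""
    grid = np.linspace(0.0, 1.0, 1001)
    return float(grid[np.argmax(long_run_growth(grid, reliability))])
```

    A perfectly reliable cue favors pure predictive plasticity (p = 1), an uninformative cue favors an even bet-hedging split (p = 0.5), and intermediate reliability favors a mixture of the two strategies, which is the qualitative point of the record above.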

  18. Stimulation of the human auditory nerve with optical radiation

    Science.gov (United States)

    Fishman, Andrew; Winkler, Piotr; Mierzwinski, Jozef; Beuth, Wojciech; Izzo Matic, Agnella; Siedlecki, Zygmunt; Teudt, Ingo; Maier, Hannes; Richter, Claus-Peter

    2009-02-01

    A novel, spatially selective method to stimulate cranial nerves has been proposed: contact-free stimulation with optical radiation. The radiation source is an infrared pulsed laser. This case report is the first to show that optical stimulation of the auditory nerve is possible in humans. The ethical approach to conducting any measurements or tests in humans requires efficacy and safety studies in animals, which have been conducted in gerbils. This report represents the first step in a translational research project to initiate a paradigm shift in neural interfaces. A patient was selected who required surgical removal of a large meningioma angiomatum WHO I by a planned transcochlear approach. Prior to cochlear ablation by drilling and subsequent tumor resection, the cochlear nerve was stimulated with a pulsed infrared laser at low radiation energies. Stimulation with optical radiation evoked compound action potentials from the human auditory nerve. Stimulation of the auditory nerve with infrared laser pulses is possible in the human inner ear. The finding is an important step for translating results from animal experiments to humans and furthers the development of a novel interface that uses optical radiation to stimulate neurons. Additional measurements are required to optimize the stimulation parameters.

  19. Cues of maternal condition influence offspring selfishness.

    Directory of Open Access Journals (Sweden)

    Janine W Y Wong

    Full Text Available The evolution of parent-offspring communication was mostly studied from the perspective of parents responding to begging signals conveying information about offspring condition. Parents should respond to begging because of the differential fitness returns obtained from their investment in offspring that differ in condition. For analogous reasons, offspring should adjust their behavior to cues/signals of parental condition: parents that differ in condition pay differential costs of care and, hence, should provide different amounts of food. In this study, we experimentally tested in the European earwig (Forficula auricularia) if cues of maternal condition affect offspring behavior in terms of sibling cannibalism. We experimentally manipulated female condition by providing them with different amounts of food, kept nymph condition constant, allowed for nymph exposure to chemical maternal cues over extended time, quantified nymph survival (deaths being due to cannibalism) and extracted and analyzed the females' cuticular hydrocarbons (CHC). Nymph survival was significantly affected by chemical cues of maternal condition, and this effect depended on the timing of breeding. Cues of poor maternal condition enhanced nymph survival in early broods, but reduced nymph survival in late broods, and vice versa for cues of good condition. Furthermore, female condition affected the quantitative composition of their CHC profile which in turn predicted nymph survival patterns. Thus, earwig offspring are sensitive to chemical cues of maternal condition and nymphs from early and late broods show opposite reactions to the same chemical cues. Together with former evidence on maternal sensitivities to condition-dependent nymph chemical cues, our study shows context-dependent reciprocal information exchange about condition between earwig mothers and their offspring, potentially mediated by cuticular hydrocarbons.

  20. Cues of maternal condition influence offspring selfishness.

    Science.gov (United States)

    Wong, Janine W Y; Lucas, Christophe; Kölliker, Mathias

    2014-01-01

    The evolution of parent-offspring communication was mostly studied from the perspective of parents responding to begging signals conveying information about offspring condition. Parents should respond to begging because of the differential fitness returns obtained from their investment in offspring that differ in condition. For analogous reasons, offspring should adjust their behavior to cues/signals of parental condition: parents that differ in condition pay differential costs of care and, hence, should provide different amounts of food. In this study, we experimentally tested in the European earwig (Forficula auricularia) if cues of maternal condition affect offspring behavior in terms of sibling cannibalism. We experimentally manipulated female condition by providing them with different amounts of food, kept nymph condition constant, allowed for nymph exposure to chemical maternal cues over extended time, quantified nymph survival (deaths being due to cannibalism) and extracted and analyzed the females' cuticular hydrocarbons (CHC). Nymph survival was significantly affected by chemical cues of maternal condition, and this effect depended on the timing of breeding. Cues of poor maternal condition enhanced nymph survival in early broods, but reduced nymph survival in late broods, and vice versa for cues of good condition. Furthermore, female condition affected the quantitative composition of their CHC profile which in turn predicted nymph survival patterns. Thus, earwig offspring are sensitive to chemical cues of maternal condition and nymphs from early and late broods show opposite reactions to the same chemical cues. Together with former evidence on maternal sensitivities to condition-dependent nymph chemical cues, our study shows context-dependent reciprocal information exchange about condition between earwig mothers and their offspring, potentially mediated by cuticular hydrocarbons. PMID:24498046

  1. Relative saliency of pitch versus phonetic cues in infancy

    Science.gov (United States)

    Cardillo, Gina; Kuhl, Patricia; Sundara, Megha

    2005-09-01

    Infants in their first year are highly sensitive to different acoustic components of speech, including phonetic detail and pitch information. The present investigation examined whether relative sensitivity to these two dimensions changes during this period, as the infant acquires language-specific phonetic categories. If pitch and phonetic discrimination are hierarchical, then the relative salience of pitch and phonetic change may become reversed between 8 and 12 months of age. Thirty-two- and 47-week-old infants were tested using an auditory preference paradigm in which they first heard a recording of a person singing a 4-note song (i.e., ``go-bi-la-tu'') and were then presented with both the familiar and an unfamiliar, modified version of that song. Modifications were either a novel pitch order (keeping syllables constant) or a novel syllable order (keeping melody constant). Compared to the younger group, older infants were predicted to show greater relative sensitivity to syllable order than pitch order, in accordance with an increased tendency to attend to linguistically relevant information (phonetic patterns) as opposed to cues that are initially more salient (pitch patterns). Preliminary data show trends toward the predicted interaction, with preference patterns commensurate with previously reported data. [Work supported by the McDonnell Foundation and NIH.]

  2. Perceptual separation of transparent motion components: the interaction of motion, luminance and shape cues.

    Science.gov (United States)

    Meso, Andrew Isaac; Durant, Szonya; Zanker, Johannes M

    2013-09-01

    Transparency is perceived when two or more objects or surfaces can be separated by the visual system whilst they are presented in the same region of the visual field at the same time. This segmentation of distinct entities on the basis of overlapping local visual cues poses an interesting challenge for the understanding of cortical information processing. In psychophysical experiments, we studied stimuli that contained randomly positioned disc elements, moving at two different speeds in the same direction, to analyse the interaction of cues during the perception of motion transparency. The current work extends findings from previous experiments with sine wave luminance gratings which only vary in one spatial dimension. The reported experiments manipulate low-level cues, like differences in speed or luminance, and what are likely to be higher level cues such as the relative size of the elements or the superposition rules that govern overlapping regions. The mechanism responsible for separation appears to be mediated by combination of the relevant and available cues. Where perceived transparency is stronger, the neural representations of components are inferred to be more distinguishable from each other across what appear to be multiple cue dimensions. The disproportionately large effect on transparency strength of the type of superposition of the discs suggests that with this manipulation, there may be enhanced separation above what might be expected from the linear combination of low-level cues, in a process we term labelling. A mechanism for transparency perception consistent with the current results would require a minimum of three stages; in addition to the local motion detection and global pooling and separation of motion signals, findings suggest a powerful additional role of higher level separation cues. PMID:23831850

  3. Auditory hallucinations suppressed by etizolam in a patient with schizophrenia.

    Science.gov (United States)

    Benazzi, F; Mazzoli, M; Rossi, E

    1993-10-01

    A patient presented with a 15 year history of schizophrenia with auditory hallucinations. Though unresponsive to prolonged trials of neuroleptics, the auditory hallucinations disappeared with etizolam. PMID:7902201

  4. Auditory Association Cortex Lesions Impair Auditory Short-Term Memory in Monkeys

    Science.gov (United States)

    Colombo, Michael; D'Amato, Michael R.; Rodman, Hillary R.; Gross, Charles G.

    1990-01-01

    Monkeys that were trained to perform auditory and visual short-term memory tasks (delayed matching-to-sample) received lesions of the auditory association cortex in the superior temporal gyrus. Although visual memory was completely unaffected by the lesions, auditory memory was severely impaired. Despite this impairment, all monkeys could discriminate sounds closer in frequency than those used in the auditory memory task. This result suggests that the superior temporal cortex plays a role in auditory processing and retention similar to the role the inferior temporal cortex plays in visual processing and retention.

  5. Multisensory cueing for enhancing orientation information during flight.

    Science.gov (United States)

    Albery, William B

    2007-05-01

    The U.S. Air Force still regards spatial disorientation (SD) and loss of situational awareness (SA) as major contributing factors in operational Class A aircraft mishaps ($1M in aircraft loss and/or pilot fatality). Air Force Safety Agency data show 71 Class A SD mishaps from 1991-2004 in both fixed and rotary-wing aircraft. These mishaps resulted in 62 fatalities and an aircraft cost of over $2.0B. These losses account for 21% of the USAF's Class A mishaps during that 14-yr period. Even non-mishap SD events negatively impact aircrew performance and reduce mission effectiveness. A multisensory system has been developed called the Spatial Orientation Retention Device (SORD) to enhance the aircraft attitude information to the pilot. SORD incorporates multisensory aids including helmet mounted symbology and tactile and audio cues. SORD has been prototyped and demonstrated in the Air Force Research Laboratory at Wright-Patterson AFB, OH. The technology has now been transitioned to a Rotary Wing Brownout program. This paper discusses the development of SORD and a potential application, including an augmented cognition application. Unlike automatic ground collision avoidance systems, SORD does not take over the aircraft if a pre-set altitude is broached by the pilot; rather, SORD provides complementary attitude cues to the pilot via the tactile, audio, and visual systems that allow the pilot to continue flying through disorienting conditions.

  6. Narrow, duplicated internal auditory canal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, T. [Servico de Neurorradiologia, Hospital Garcia de Orta, Avenida Torrado da Silva, 2801-951, Almada (Portugal); Shayestehfar, B. [Department of Radiology, UCLA Oliveview School of Medicine, Los Angeles, California (United States); Lufkin, R. [Department of Radiology, UCLA School of Medicine, Los Angeles, California (United States)

    2003-05-01

    A narrow internal auditory canal (IAC) constitutes a relative contraindication to cochlear implantation because it is associated with aplasia or hypoplasia of the vestibulocochlear nerve or its cochlear branch. We report an unusual case of a narrow, duplicated IAC, divided by a bony septum into a superior relatively large portion and an inferior stenotic portion, in which we could identify only the facial nerve. This case adds support to the association between a narrow IAC and aplasia or hypoplasia of the vestibulocochlear nerve. The normal facial nerve argues against the hypothesis that the narrow IAC is the result of a primary bony defect which inhibits the growth of the vestibulocochlear nerve. (orig.)

  7. Auditory hallucinations in nonverbal quadriplegics.

    Science.gov (United States)

    Hamilton, J

    1985-11-01

    When a system for communicating with nonverbal, quadriplegic, institutionalized residents was developed, it was discovered that many were experiencing auditory hallucinations. Nine cases are presented in this study. The "voices" described have many similar characteristics, the primary one being that they give authoritarian commands that tell the residents how to behave and to which the residents feel compelled to respond. Both the relationship of this phenomenon to the theoretical work of Julian Jaynes and its effect on the lives of the residents are discussed.

  8. Autosomal recessive hereditary auditory neuropathy

    Institute of Scientific and Technical Information of China (English)

    王秋菊; 顾瑞; 曹菊阳

    2003-01-01

    Objectives: Auditory neuropathy (AN) is a sensorineural hearing disorder characterized by absent or abnormal auditory brainstem responses (ABRs) and normal cochlear outer hair cell function as measured by otoacoustic emissions (OAEs). Many risk factors are thought to be involved in its etiology and pathophysiology. Three Chinese pedigrees with familial AN are presented herein to demonstrate involvement of genetic factors in AN etiology. Methods: Probands of the above-mentioned pedigrees, who had been diagnosed with AN, were evaluated and followed up in the Department of Otolaryngology Head and Neck Surgery, China PLA General Hospital. Their family members were studied and the pedigree diagrams were established. History of illness, physical examination, pure tone audiometry, acoustic reflex, ABRs, and transient evoked and distortion-product otoacoustic emissions (TEOAEs and DPOAEs) were obtained from members of these families. DPOAE changes under the influence of contralateral sound stimuli were observed by presenting continuous white noise to the non-recording ear to examine the function of the auditory efferent system. Some subjects received a vestibular caloric test, computed tomography (CT) scan of the temporal bone, and electrocardiography (ECG) to exclude other possible neuropathy disorders. Results: In most affected subjects, hearing loss of various degrees and speech discrimination difficulties started at 10 to 16 years of age. Their audiological evaluation showed absence of acoustic reflexes and ABRs. As expected in AN, these subjects exhibited near normal cochlear outer hair cell function as shown in TEOAE and DPOAE recordings. Pure-tone audiometry revealed hearing loss ranging from mild to severe in these patients. Autosomal recessive inheritance patterns were observed in the three families. In Pedigrees I and II, two affected brothers were found respectively, while in Pedigree III, two sisters were affected. All the patients were otherwise normal without

  9. Design guidelines for the use of audio cues in computer interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.; Blattner, M.M.; Joy, K.I.; Greenberg, R.M.

    1985-07-01

    A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby using our sense of hearing in our communication with the computer. This allows our visual and auditory capacities to work in unison, leading to a more effective and efficient interpretation of information received from the computer than by sight alone. In this paper we examine earcons, which are audio cues used in the computer-user interface to provide information and feedback to the user about computer entities (these include messages and functions, as well as states and labels). The material in this paper is part of a larger study that recommends guidelines for the design and use of audio cues in the computer-user interface. The complete work examines the disciplines of music, psychology, communication theory, advertising, and psychoacoustics to discover how sound is utilized and analyzed in those areas. The resulting information is organized according to the theory of semiotics, the theory of signs, into the syntax, semantics, and pragmatics of communication by sound. Here we present design guidelines for the syntax of earcons. Earcons are constructed from motives: short sequences of notes with a specific rhythm and pitch, embellished by timbre, dynamics, and register. Compound earcons and family earcons are introduced; these are related motives that serve to identify a family of related cues. Examples of earcons are given.
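
As a rough illustration of the motive idea described in this record (and not part of the original report), the sketch below renders a two-note motive to a WAV file with the Python standard library. The two events ("deleted"/"created"), the pitches, and the file name are all hypothetical:

```python
import math
import struct
import wave

RATE = 44100  # samples per second

def tone(freq_hz, dur_s, volume=0.5):
    """Render one note of a motive as a sine wave (floats in [-volume, volume])."""
    n = round(RATE * dur_s)
    return [volume * math.sin(2 * math.pi * freq_hz * i / RATE) for i in range(n)]

def motive(notes):
    """A motive: a short sequence of (pitch in Hz, duration in s) pairs."""
    samples = []
    for freq_hz, dur_s in notes:
        samples += tone(freq_hz, dur_s)
    return samples

# Hypothetical earcons for two related events: same rhythm, mirrored pitch
# contour, so the cues read as members of one family.
deleted = motive([(660.0, 0.15), (440.0, 0.30)])   # falling two-note motive
created = motive([(440.0, 0.15), (660.0, 0.30)])   # rising counterpart

with wave.open("earcon_deleted.wav", "wb") as f:
    f.setnchannels(1)   # mono
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(RATE)
    f.writeframes(b"".join(struct.pack("<h", round(s * 32767)) for s in deleted))
```

Keeping the rhythm fixed while inverting the pitch contour is one way related cues can stay recognizable as a family, in the spirit of the compound and family earcons the record mentions.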

  10. Further Evidence of Auditory Extinction in Aphasia

    Science.gov (United States)

    Marshall, Rebecca Shisler; Basilakos, Alexandra; Love-Myers, Kim

    2013-01-01

    Purpose: Preliminary research ( Shisler, 2005) suggests that auditory extinction in individuals with aphasia (IWA) may be connected to binding and attention. In this study, the authors expanded on previous findings on auditory extinction to determine the source of extinction deficits in IWA. Method: Seventeen IWA (mean age = 53.19 years)…

  11. Mapping tonotopy in human auditory cortex

    NARCIS (Netherlands)

    van Dijk, Pim; Langers, Dave R M; Moore, BCJ; Patterson, RD; Winter, IM; Carlyon, RP; Gockel, HE

    2013-01-01

    Tonotopy is arguably the most prominent organizational principle in the auditory pathway. Nevertheless, the layout of tonotopic maps in humans is still debated. We present neuroimaging data that robustly identify multiple tonotopic maps in the bilateral auditory cortex. In contrast with some earlier

  12. Auditory Processing Disorder and Foreign Language Acquisition

    Science.gov (United States)

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  13. Effect of Auditory Constraints on Motor Learning Depends on Stage of Recovery Post Stroke

    Directory of Open Access Journals (Sweden)

    Viswanath eAluru

    2014-06-01

    Full Text Available In order to develop evidence-based rehabilitation protocols post stroke, one must first reconcile the vast heterogeneity in the post-stroke population and develop protocols to facilitate motor learning in the various subgroups. The main purpose of this study is to show that auditory constraints interact with the stage of recovery post stroke to influence motor learning. We characterized the stages of upper limb recovery using task-based kinematic measures in twenty subjects with chronic hemiparesis, and used a bimanual wrist extension task using a custom-made wrist trainer to facilitate learning of wrist extension in the paretic hand under four auditory conditions: (1) without auditory cueing; (2) to non-musical happy sounds; (3) to self-selected music; and (4) to a metronome beat set at a comfortable tempo. Two bimanual trials (15 s each) were followed by one unimanual trial with the paretic hand over six cycles under each condition. Clinical metrics, wrist and arm kinematics, and electromyographic activity were recorded. Hierarchical cluster analysis with the Mahalanobis metric based on baseline speed and extent of wrist movement stratified subjects into three distinct groups which reflected their stage of recovery: spastic paresis, spastic co-contraction, and minimal paresis. In spastic paresis, the metronome beat increased wrist extension, but also increased muscle co-activation across the wrist. In contrast, in spastic co-contraction, no auditory stimulation increased wrist extension and reduced co-activation. In minimal paresis, wrist extension did not improve under any condition. The results suggest that auditory task constraints interact with stage of recovery during motor learning after stroke, perhaps due to recruitment of distinct neural substrates over the course of recovery. The findings advance our understanding of the mechanisms of progression of motor recovery and lay the foundation for personalized treatment algorithms post stroke.

  14. Segregation and integration of auditory streams when listening to multi-part music.

    Directory of Open Access Journals (Sweden)

    Marie Ragert

    Full Text Available In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative role and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to make an assessment of the leader-follower relationship. We show that perceptually the relationship between parts is biased towards the conventional structural hierarchy in western music, in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of both cognitive load, as shown through difficulty ratings, and the interaction of the temporal and the structural relationship factors. Neurally, we see that the temporal relationship between parts, as one important cue for stream segregation, revealed distinct neural activity in the planum temporale. By contrast, integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus (IPS). These results support the hypothesis that the planum temporale and IPS are key structures underlying the mechanisms of segregation and integration of

  15. A comparative study of simple auditory reaction time in blind (congenitally and sighted subjects

    Directory of Open Access Journals (Sweden)

    Pritesh Hariprasad Gandhi

    2013-01-01

    Full Text Available Background: Reaction time is the time interval between the application of a stimulus and the appearance of an appropriate voluntary response by a subject. It involves stimulus processing, decision making, and response programming. Reaction time studies have been popular due to their implications in sports physiology. Reaction time has been widely studied as its practical implications may be of great consequence; e.g., a slower than normal reaction time while driving can have grave results. Objective: To study simple auditory reaction time in congenitally blind subjects and in age- and sex-matched sighted subjects, and to compare simple auditory reaction time between congenitally blind subjects and healthy control subjects. Materials and Methods: The study was carried out in two groups: the first of 50 congenitally blind subjects and the second of 50 healthy controls. Measurements were made on a Multiple Choice Reaction Time Apparatus, Inco Ambala Ltd. (accuracy ±0.001 s), with subjects in a sitting position, at Government Medical College and Hospital, Bhavnagar and at a Blind School, PNR campus, Bhavnagar, Gujarat, India. Observations/Results: Simple auditory reaction time in response to four different types of sound (horn, bell, ring, and whistle) was recorded in both groups. According to our study, there is no significant difference in reaction time between congenitally blind and normal healthy persons. Conclusion: Blind individuals commonly utilize tactile and auditory cues for information and orientation; their reliance on touch and audition, together with more practice in using these modalities to guide behavior, is often reflected in better performance of blind relative to sighted participants in tactile or auditory discrimination tasks, but there is no difference in reaction time between congenitally blind and sighted people.

  16. Effect of auditory constraints on motor performance depends on stage of recovery post-stroke.

    Science.gov (United States)

    Aluru, Viswanath; Lu, Ying; Leung, Alan; Verghese, Joe; Raghavan, Preeti

    2014-01-01

    In order to develop evidence-based rehabilitation protocols post-stroke, one must first reconcile the vast heterogeneity in the post-stroke population and develop protocols to facilitate motor learning in the various subgroups. The main purpose of this study is to show that auditory constraints interact with the stage of recovery post-stroke to influence motor learning. We characterized the stages of upper limb recovery using task-based kinematic measures in 20 subjects with chronic hemiparesis. We used a bimanual wrist extension task, performed with a custom-made wrist trainer, to facilitate learning of wrist extension in the paretic hand under four auditory conditions: (1) without auditory cueing; (2) to non-musical happy sounds; (3) to self-selected music; and (4) to a metronome beat set at a comfortable tempo. Two bimanual trials (15 s each) were followed by one unimanual trial with the paretic hand over six cycles under each condition. Clinical metrics, wrist and arm kinematics, and electromyographic activity were recorded. Hierarchical cluster analysis with the Mahalanobis metric based on baseline speed and extent of wrist movement stratified subjects into three distinct groups, which reflected their stage of recovery: spastic paresis, spastic co-contraction, and minimal paresis. In spastic paresis, the metronome beat increased wrist extension, but also increased muscle co-activation across the wrist. In contrast, in spastic co-contraction, no auditory stimulation increased wrist extension and reduced co-activation. In minimal paresis, wrist extension did not improve under any condition. The results suggest that auditory task constraints interact with stage of recovery during motor learning after stroke, perhaps due to recruitment of distinct neural substrates over the course of recovery. The findings advance our understanding of the mechanisms of progression of motor recovery and lay the foundation for personalized treatment algorithms post-stroke.
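
The stratification step described in this record (hierarchical clustering under the Mahalanobis metric, over baseline speed and extent of wrist movement) can be sketched with SciPy. The nine subjects and their values below are invented for illustration and are not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# Invented baseline kinematics for nine subjects: [speed, extent of wrist
# movement]. Illustrative values only, not the study's data.
X = np.array([
    [0.20, 5.0], [0.30, 7.0], [0.25, 6.0],     # low speed, small extent
    [0.90, 15.0], [1.00, 13.0], [0.95, 14.0],  # intermediate
    [1.60, 40.0], [1.70, 42.0], [1.65, 41.0],  # high speed, large extent
])

# Mahalanobis distances weight the two features by their inverse covariance,
# so correlated, differently scaled features are compared fairly.
VI = np.linalg.inv(np.cov(X.T))
D = pdist(X, metric="mahalanobis", VI=VI)

# Agglomerative (hierarchical) clustering, cut into three groups.
Z = linkage(D, method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
print(labels)
```

With data like these, the three-cluster cut recovers the three baseline groups; in the study this stratification is what allowed the auditory conditions to be compared per stage of recovery.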

  17. The Auditory-Visual Speech Benefit on Working Memory in Older Adults with Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Jana B. Frtusova

    2016-04-01

    Full Text Available This study examined the effect of auditory-visual (AV) speech stimuli on working memory in hearing-impaired participants (HIP) in comparison to age- and education-matched normal elderly controls (NEC). Participants completed a working memory n-back task (0- to 2-back) in which sequences of digits were presented in visual-only (i.e., speech-reading), auditory-only (A-only), and AV conditions. Auditory event-related potentials (ERP) were collected to assess the relationship between perceptual and working memory processing. The behavioural results showed that both groups were faster in the AV condition in comparison to the unisensory conditions. The ERP data showed perceptual facilitation in the AV condition, in the form of reduced amplitudes and latencies of the auditory N1 and/or P1 components, in the HIP group. Furthermore, a working memory ERP component, the P3, peaked earlier for both groups in the AV condition compared to the A-only condition. In general, the HIP group showed a more robust AV benefit; however, the NECs showed a dose-response relationship between perceptual facilitation and working memory improvement, especially for facilitation of processing speed. Two measures, reaction time and P3 amplitude, suggested that the presence of visual speech cues may have helped the HIP to counteract the demanding auditory processing, to the level that no group differences were evident during the AV modality despite lower performance during the A-only condition. Overall, this study provides support for the theory of an integrated perceptual-cognitive system. The practical significance of these findings is also discussed.

  18. The Auditory-Visual Speech Benefit on Working Memory in Older Adults with Hearing Impairment.

    Science.gov (United States)

    Frtusova, Jana B; Phillips, Natalie A

    2016-01-01

    This study examined the effect of auditory-visual (AV) speech stimuli on working memory in older adults with poorer-hearing (PH) in comparison to age- and education-matched older adults with better hearing (BH). Participants completed a working memory n-back task (0- to 2-back) in which sequences of digits were presented in visual-only (i.e., speech-reading), auditory-only (A-only), and AV conditions. Auditory event-related potentials (ERP) were collected to assess the relationship between perceptual and working memory processing. The behavioral results showed that both groups were faster in the AV condition in comparison to the unisensory conditions. The ERP data showed perceptual facilitation in the AV condition, in the form of reduced amplitudes and latencies of the auditory N1 and/or P1 components, in the PH group. Furthermore, a working memory ERP component, the P3, peaked earlier for both groups in the AV condition compared to the A-only condition. In general, the PH group showed a more robust AV benefit; however, the BH group showed a dose-response relationship between perceptual facilitation and working memory improvement, especially for facilitation of processing speed. Two measures, reaction time and P3 amplitude, suggested that the presence of visual speech cues may have helped the PH group to counteract the demanding auditory processing, to the level that no group differences were evident during the AV modality despite lower performance during the A-only condition. Overall, this study provides support for the theory of an integrated perceptual-cognitive system. The practical significance of these findings is also discussed. PMID:27148106

  19. Relating auditory attributes of multichannel sound to preference and to physical parameters

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2006-01-01

    Sound reproduced by multichannel systems is affected by many factors giving rise to various sensations, or auditory attributes. Relating specific attributes to overall preference and to physical measures of the sound field provides valuable information for a better understanding of the parameters...... within and between musical program materials, allowing for a careful generalization regarding the perception of spatial audio reproduction. Finally, a set of objective measures is derived from analysis of the sound field at the listening position in an attempt to predict the auditory attributes....

  20. Speech perception as complex auditory categorization

    Science.gov (United States)

    Holt, Lori L.

    2002-05-01

    Despite a long and rich history of categorization research in cognitive psychology, very little work has addressed the issue of complex auditory category formation. This is especially unfortunate because the general underlying cognitive and perceptual mechanisms that guide auditory category formation are of great importance to understanding speech perception. I will discuss a new methodological approach to examining complex auditory category formation that specifically addresses issues relevant to speech perception. This approach utilizes novel nonspeech sound stimuli to gain full experimental control over listeners' history of experience. As such, the course of learning is readily measurable. Results from this methodology indicate that the structure and formation of auditory categories are a function of the statistical input distributions of sound that listeners hear, aspects of the operating characteristics of the auditory system, and characteristics of the perceptual categorization system. These results have important implications for phonetic acquisition and speech perception.

  1. Gender differences in craving and cue reactivity to smoking and negative affect/stress cues.

    Science.gov (United States)

    Saladin, Michael E; Gray, Kevin M; Carpenter, Matthew J; LaRowe, Steven D; DeSantis, Stacia M; Upadhyaya, Himanshu P

    2012-01-01

    There is evidence that women may be less successful when attempting to quit smoking than men. One potential contributory cause of this gender difference is differential craving and stress reactivity to smoking- and negative affect/stress-related cues. The present human laboratory study investigated the effects of gender on reactivity to smoking and negative affect/stress cues by exposing nicotine dependent women (n = 37) and men (n = 53) smokers to two active cue types, each with an associated control cue: (1) in vivo smoking cues and in vivo neutral control cues, and (2) imagery-based negative affect/stress script and a neutral/relaxing control script. Both before and after each cue/script, participants provided subjective reports of smoking-related craving and affective reactions. Heart rate (HR) and skin conductance (SC) responses were also measured. Results indicated that participants reported greater craving and SC in response to smoking versus neutral cues and greater subjective stress in response to the negative affect/stress versus neutral/relaxing script. With respect to gender differences, women evidenced greater craving, stress and arousal ratings and lower valence ratings (greater negative emotion) in response to the negative affect/stressful script. While there were no gender differences in responses to smoking cues, women trended towards higher arousal ratings. Implications of the findings for treatment and tobacco-related morbidity and mortality are discussed.

  2. Perception of health from facial cues.

    Science.gov (United States)

    Henderson, Audrey J; Holzleitner, Iris J; Talamas, Sean N; Perrett, David I

    2016-05-01

    Impressions of health are integral to social interactions, yet poorly understood. A review of the literature reveals multiple facial characteristics that potentially act as cues to health judgements. The cues vary in their stability across time: structural shape cues including symmetry and sexual dimorphism alter slowly across the lifespan and have been found to have weak links to actual health, but show inconsistent effects on perceived health. Facial adiposity changes over a medium time course and is associated with both perceived and actual health. Skin colour alters over a short time and has strong effects on perceived health, yet links to health outcomes have barely been evaluated. Reviewing suggested an additional influence of demeanour as a perceptual cue to health. We, therefore, investigated the association of health judgements with multiple facial cues measured objectively from two-dimensional and three-dimensional facial images. We found evidence for independent contributions of face shape and skin colour cues to perceived health. Our empirical findings: (i) reinforce the role of skin yellowness; (ii) demonstrate the utility of global face shape measures of adiposity; and (iii) emphasize the role of affect in facial images with nominally neutral expression in impressions of health. PMID:27069057

  3. Cortical encoding and neurophysiological tracking of intensity and pitch cues signaling English stress patterns in native and nonnative speakers.

    Science.gov (United States)

    Chung, Wei-Lun; Bidelman, Gavin M

    2016-01-01

    We examined cross-language differences in neural encoding and tracking of intensity and pitch cues signaling English stress patterns. Auditory mismatch negativities (MMNs) were recorded in English and Mandarin listeners in response to contrastive English pseudowords whose primary stress occurred either on the first or second syllable (i.e., "nocTICity" vs. "NOCticity"). The contrastive syllable stress elicited two consecutive MMNs in both language groups, but English speakers demonstrated larger responses to stress patterns than Mandarin speakers. Correlations between the amplitude of ERPs and continuous changes in the running intensity and pitch of speech assessed how well each language group's brain activity tracked these salient acoustic features of lexical stress. We found that English speakers' neural responses tracked intensity changes in speech more closely than Mandarin speakers (higher brain-acoustic correlation). Findings demonstrate more robust and precise processing of English stress (intensity) patterns in early auditory cortical responses of native relative to nonnative speakers. PMID:27140864
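
At its core, the brain-acoustic correlation described in this record is a correlation between a continuous stimulus feature (e.g., running intensity) and the amplitude of the neural response. A minimal NumPy sketch with simulated signals (all values invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data at 100 Hz: the running intensity envelope of a speech
# stimulus, and an evoked response that partly tracks it.
fs = 100
t = np.arange(0, 2.0, 1.0 / fs)
intensity = 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)             # stimulus envelope
response = 0.8 * intensity + 0.2 * rng.standard_normal(t.size)  # "neural" trace

# Brain-acoustic correlation: Pearson r between the response amplitude and
# the continuous stimulus feature.
r = np.corrcoef(response, intensity)[0, 1]
print(f"brain-acoustic correlation r = {r:.2f}")
```

A higher r means the neural trace follows the stimulus feature more closely, which is the sense in which the English speakers' responses tracked intensity changes better than the Mandarin speakers'.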

  5. Active listening for spatial orientation in a complex auditory scene.

    Directory of Open Access Journals (Sweden)

    Cynthia F Moss

    2006-04-01

    Full Text Available To successfully negotiate a complex environment, an animal must control the timing of motor behaviors in coordination with dynamic sensory information. Here, we report on adaptive temporal control of vocal-motor behavior in an echolocating bat, Eptesicus fuscus, as it captured tethered insects close to background vegetation. Recordings of the bat's sonar vocalizations were synchronized with high-speed video images that were used to reconstruct the bat's three-dimensional flight path and the positions of target and vegetation. When the bat encountered the difficult task of taking insects as close as 10-20 cm from the vegetation, its behavior changed significantly from that under open room conditions. Its success rate decreased by about 50%, its time to initiate interception increased by a factor of ten, and its high repetition rate "terminal buzz" decreased in duration by a factor of three. Under all conditions, the bat produced prominent sonar "strobe groups," clusters of echolocation pulses with stable intervals. In the final stages of insect capture, the bat produced strobe groups at a higher incidence when the insect was positioned near clutter. Strobe groups occurred at all phases of the wingbeat (and inferred respiration) cycle, challenging the hypothesis of strict synchronization between respiration and sound production in echolocating bats. The results of this study provide a clear demonstration of temporal vocal-motor control that directly impacts the signals used for perception.

  6. Processing of spatial sounds in the impaired auditory system

    DEFF Research Database (Denmark)

    Arweiler, Iris

    information is not crucial. The results from an additional experiment demonstrated that the ER benefit was maintained with independent as well as with linked hearing aid compression. Overall, this work contributes to the understanding of ER processing in listeners with normal and impaired hearing and may have...... with an intelligibility-weighted “efficiency factor” which revealed that the spectral characteristics of the ER’s caused the reduced benefit. Hearing-impaired listeners were able to utilize the ER energy as effectively as normal-hearing listeners, most likely because binaural processing was not required...... for the integration of the ER’s with the DS. Different masker types were found to have an impact on the binaural processing of the overall speech signal but not on the processing of ER’s. Second, the influence of interaural level differences (ILD’s) on speech intelligibility was investigated with a hearing aid...

  7. THE EFFECTS OF SALICYLATE ON AUDITORY EVOKED POTENTIAL AMPLITUDE FROM THE AUDITORY CORTEX AND AUDITORY BRAINSTEM

    Institute of Scientific and Technical Information of China (English)

    Brian Sawka; SUN Wei

    2014-01-01

    Tinnitus has often been studied using salicylate in animal models, as salicylate can induce temporary hearing loss and tinnitus. Studies have recently observed enhancement of auditory evoked responses of the auditory cortex (AC) post salicylate treatment, which is also shown to be related to tinnitus-like behavior in rats. The aim of this study was to observe whether the enhancements of the AC post salicylate treatment are also present at structures in the brainstem. Four male Sprague Dawley rats with AC-implanted electrodes were tested for both AC and auditory brainstem response (ABR) recordings pre and post 250 mg/kg intraperitoneal injections of salicylate. The responses were recorded as the peak-to-trough amplitudes of P1-N1 (AC), ABR wave V, and ABR wave II. AC responses resulted in statistically significant enhancement of amplitude at 2 hours post salicylate with 90 dB stimuli tone bursts of 4, 8, 12, and 20 kHz. Wave V of ABR responses at 90 dB resulted in a statistically significant reduction of amplitude 2 hours post salicylate and a mean decrease of amplitude of 31% for 16 kHz. Wave II amplitudes at 2 hours post treatment were significantly reduced for 4, 12, and 20 kHz stimuli at 90 dB SPL. Our results suggest that the enhancement changes of the AC related to salicylate-induced tinnitus are generated superior to the level of the inferior colliculus and may originate in the AC.

  8. Temporal auditory processing in elders

    Directory of Open Access Journals (Sweden)

    Azzolini, Vanuza Conceição

    2010-03-01

    Full Text Available Introduction: In the process of aging, all structures of the organism change, affecting the quality of hearing and of comprehension. The hearing loss that occurs as a consequence of this process reduces communicative function and also leads to withdrawal from social relationships. Objective: To compare the performance in temporal auditory processing of elderly individuals with and without hearing loss. Method: The present study is a prospective, transversal, diagnostic field study. Twenty-one elders (16 women and 5 men), aged 60 to 81 years, were analyzed, divided into two groups: a group "without hearing loss" (n = 13), with normal auditory thresholds or hearing loss restricted to isolated frequencies, and a group "with hearing loss" (n = 8), with sensorineural hearing loss ranging in degree from mild to moderately severe. Both groups performed the pitch (PPS) and duration (DPS) pattern sequence tests, to evaluate the ability of temporal sequencing, and the Random Gap Detection Test (RGDT), to evaluate temporal resolution. Results: There was no statistically significant difference between the groups on the DPS and RGDT tests. The ability of temporal sequencing was significantly better in the group without hearing loss when evaluated by the PPS test in the "humming" condition, and this difference grew significantly with increasing age. Conclusion: There was no difference in temporal auditory processing in the comparison between the groups.

  9. Direct and Indirect Cues to Knowledge States during Word Learning

    Science.gov (United States)

    Saylor, Megan M.; Carroll, C. Brooke

    2009-01-01

    The present study investigated three-year-olds' sensitivity to direct and indirect cues to others' knowledge states for word learning purposes. Children were given either direct, physical cues to knowledge or indirect, verbal cues to knowledge. Preschoolers revealed a better ability to learn words from a speaker following direct, physical cues to…

  10. Attentional deployment in visual half-field tasks: the effect of cue position on word naming latency.

    Science.gov (United States)

    Lindell, Annukka K; Nicholls, Michael E R

    2003-11-01

    Divided visual-field research suggests that attentional factors may contribute to the left hemisphere's (LH) superiority for language processing. The LH's parallel recognition strategy, specialised for whole word encoding, is largely unaffected by the distribution of spatial attention. In contrast, the right hemisphere's (RH) serial, letter-by-letter strategy places far greater demands on attentional resources. By manipulating spatial attention, the present study gauged the effect of cueing the beginning vs. the end of the word on LH and RH naming latency. Results indicated no effect of cue position on LH performance, consistent with research indicating that the LH enjoys an attentional advantage, deploying attention in parallel across the stimulus. As anticipated, the RH showed a facilitatory effect of beginning cue, which draws spatial attention to the initial letter cluster, enabling efficient implementation of the RH's sequential strategy. These findings suggest that differences in attentional deployment contribute to hemispheric asymmetries for word recognition. PMID:14607163

  11. Effects of Auditory Rhythm and Music on Gait Disturbances in Parkinson's Disease.

    Science.gov (United States)

    Ashoori, Aidin; Eagleman, David M; Jankovic, Joseph

    2015-01-01

    Gait abnormalities, such as shuffling steps, start hesitation, and freezing, are common and often incapacitating symptoms of Parkinson's disease (PD) and other parkinsonian disorders. Pharmacological and surgical approaches have only limited efficacy in treating these gait disorders. Rhythmic auditory stimulation (RAS), such as playing marching music and dance therapy, has been shown to be a safe, inexpensive, and effective method for improving gait in PD patients. However, RAS that adapts to patients' movements may be more effective than the rigid, fixed-tempo RAS used in most studies. In addition to auditory cueing, immersive virtual reality technologies that utilize interactive computer-generated systems through wearable devices are increasingly used for improving brain-body interaction and sensory-motor integration. Using multisensory cues, these therapies may be particularly suitable for the treatment of parkinsonian freezing and other gait disorders. In this review, we examine the affected neurological circuits underlying gait and temporal processing in PD patients and summarize the current studies demonstrating the effects of RAS on improving these gait deficits. PMID:26617566

  12. Auditory enhancement of visual perception at threshold depends on visual abilities.

    Science.gov (United States)

    Caclin, Anne; Bouchet, Patrick; Djoulah, Farida; Pirat, Elodie; Pernier, Jacques; Giard, Marie-Hélène

    2011-06-17

    Whether or not multisensory interactions can improve detection thresholds, and thus widen the range of perceptible events is a long-standing debate. Here we revisit this question, by testing the influence of auditory stimuli on visual detection threshold, in subjects exhibiting a wide range of visual-only performance. Above the perceptual threshold, crossmodal interactions have indeed been reported to depend on the subject's performance when the modalities are presented in isolation. We thus tested normal-seeing subjects and short-sighted subjects wearing their usual glasses. We used a paradigm limiting potential shortcomings of previous studies: we chose a criterion-free threshold measurement procedure and precluded exogenous cueing effects by systematically presenting a visual cue whenever a visual target (a faint Gabor patch) might occur. Using this carefully controlled procedure, we found that concurrent sounds only improved visual detection thresholds in the sub-group of subjects exhibiting the poorest performance in the visual-only conditions. In these subjects, for oblique orientations of the visual stimuli (but not for vertical or horizontal targets), the auditory improvement was still present when visual detection was already helped with flanking visual stimuli generating a collinear facilitation effect. These findings highlight that crossmodal interactions are most efficient to improve perceptual performance when an isolated modality is deficient.

  13. Relationship between Sympathetic Skin Responses and Auditory Hypersensitivity to Different Auditory Stimuli.

    Science.gov (United States)

    Kato, Fumi; Iwanaga, Ryoichiro; Chono, Mami; Fujihara, Saori; Tokunaga, Akiko; Murata, Jun; Tanaka, Koji; Nakane, Hideyuki; Tanaka, Goro

    2014-07-01

    [Purpose] Auditory hypersensitivity has been widely reported in patients with autism spectrum disorders. However, the neurological background of auditory hypersensitivity is currently not clear. The present study examined the relationship between sympathetic nervous system responses and auditory hypersensitivity induced by different types of auditory stimuli. [Methods] We exposed 20 healthy young adults to six different types of auditory stimuli. The amounts of palmar sweating resulting from the auditory stimuli were compared between groups with (hypersensitive) and without (non-hypersensitive) auditory hypersensitivity. [Results] Although no group × type of stimulus × first stimulus interaction was observed for the extent of reaction, significant type of stimulus × first stimulus interaction was noted for the extent of reaction. For an 80 dB-6,000 Hz stimulus, the trends for palmar sweating differed between the groups. For the first stimulus, the variance became larger in the hypersensitive group than in the non-hypersensitive group. [Conclusion] Subjects who regularly felt excessive reactions to auditory stimuli tended to have excessive sympathetic responses to repeated loud noises compared with subjects who did not feel excessive reactions. People with auditory hypersensitivity may be classified into several subtypes depending on their reaction patterns to auditory stimuli.

  14. Effects of Verbal Cues versus Pictorial Cues on the Transfer of Stimulus Control for Children with Autism

    Science.gov (United States)

    West, Elizabeth Anne

    2008-01-01

    The author examined the transfer of stimulus control from instructor assistance to verbal cues and pictorial cues. The intent was to determine whether it is easier to transfer stimulus control to one form of cue or the other. No studies have conducted such comparisons to date; however, literature exists to suggest that visual cues may be…

  15. Spontaneous high-gamma band activity reflects functional organization of auditory cortex in the awake macaque.

    Science.gov (United States)

    Fukushima, Makoto; Saunders, Richard C; Leopold, David A; Mishkin, Mortimer; Averbeck, Bruno B

    2012-06-01

    In the absence of sensory stimuli, spontaneous activity in the brain has been shown to exhibit organization at multiple spatiotemporal scales. In the macaque auditory cortex, responses to acoustic stimuli are tonotopically organized within multiple, adjacent frequency maps aligned in a caudorostral direction on the supratemporal plane (STP) of the lateral sulcus. Here, we used chronic microelectrocorticography to investigate the correspondence between sensory maps and spontaneous neural fluctuations in the auditory cortex. We first mapped tonotopic organization across 96 electrodes spanning approximately two centimeters along the primary and higher auditory cortex. In separate sessions, we then observed that spontaneous activity at the same sites exhibited spatial covariation that reflected the tonotopic map of the STP. This observation demonstrates a close relationship between functional organization and spontaneous neural activity in the sensory cortex of the awake monkey. PMID:22681693

  16. Preschoolers' Learning of Brand Names from Visual Cues.

    OpenAIRE

    Macklin, M Carole

    1996-01-01

    This research addresses the question of how perceptual cues affect preschoolers' learning of brand names. It is found that when visual cues are provided in addition to brand names that are prior-associated in children's memory structures, children better remember the brand names. Although two cues (a picture and a color) improve memory over the imposition of a single cue, extensive visual cues may overtax young children's processing abilities. The study contributes to our understanding of how...

  17. Neural representation in the auditory midbrain of the envelope of vocalizations based on a peripheral ear model

    Directory of Open Access Journals (Sweden)

    Thilo eRode

    2013-10-01

    Full Text Available The auditory midbrain implant (AMI) consists of a single shank array (20 sites) for stimulation along the tonotopic axis of the central nucleus of the inferior colliculus (ICC) and has been safely implanted in deaf patients who cannot benefit from a cochlear implant (CI). The AMI improves lip-reading abilities and environmental awareness in the implanted patients. However, the AMI cannot achieve the high levels of speech perception possible with the CI. It appears the AMI can transmit sufficient spectral cues but with limited temporal cues required for speech understanding. Currently, the AMI uses a CI-based strategy, which was originally designed to stimulate each frequency region along the cochlea with amplitude-modulated pulse trains matching the envelope of the bandpass-filtered sound components. However, it is unclear if this type of stimulation with only a single site within each frequency lamina of the ICC can elicit sufficient temporal cues for speech perception. At least speech understanding in quiet is still possible with envelope cues as low as 50 Hz. Therefore, we investigated how ICC neurons follow the bandpass-filtered envelope structure of natural stimuli in ketamine-anesthetized guinea pigs. We identified a subset of ICC neurons that could closely follow the envelope structure (up to ~100 Hz) of a diverse set of species-specific calls, which was revealed by using a peripheral ear model to estimate the true bandpass-filtered envelopes observed by the brain. Although previous studies have suggested a complex neural transformation from the auditory nerve to the ICC, our data suggest that the brain maintains a robust temporal code in a subset of ICC neurons matching the envelope structure of natural stimuli. Clinically, these findings suggest that a CI-based strategy may still be effective for the AMI if the appropriate neurons are entrained to the envelope of the acoustic stimulus and can transmit sufficient temporal cues to higher

  18. An Auditory Model of Improved Adaptive ZCPA

    Directory of Open Access Journals (Sweden)

    Jinping Zhang

    2013-07-01

    Full Text Available An improved ZCPA auditory model with adaptability is proposed in this paper; the adaptive method designed for the ZCPA model is also suitable for other auditory models with an inner-hair-cell sub-model. The first step in the implementation of the proposed ZCPA model is to compute the inner product between the signal and complex Gammatone filters to obtain the important frequency components of the signal. Then, according to the result of the first step, the parameters of the basilar membrane sub-model and the frequency boxes are automatically adjusted, such as the number of basilar membrane filters, the center frequency and bandwidth of each basilar membrane filter, and the position of each frequency box. Finally, an auditory model is built, and the final output is the auditory spectrum. The results of numerical simulations and experiments show that the proposed model achieves accurate frequency selection and that its auditory spectrum is more distinct than that of the conventional ZCPA model. Moreover, the proposed model completely avoids the influence of the number of filters on the shape of the auditory spectrum present in the conventional ZCPA model, so the shape of the auditory spectrum is stable and the data quantity is small.
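
    The first stage of the model described above, taking the inner product of the signal with complex Gammatone filters to find its dominant frequency components, can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation; the 4th-order kernel, the 50 ms kernel duration, and the Glasberg-Moore ERB bandwidth formula are all assumptions.

```python
import cmath
import math

def gammatone_kernel(fc, fs, order=4, duration=0.05):
    """Complex gammatone kernel centred at fc (Hz):
    t^(n-1) * exp(-2*pi*b*t) * exp(j*2*pi*fc*t).

    The bandwidth b follows the Glasberg & Moore ERB-rate
    approximation (an assumption here, not from the paper).
    """
    b = 1.019 * (24.7 + 0.108 * fc)  # assumed ERB-based bandwidth in Hz
    kernel = []
    for k in range(int(duration * fs)):
        t = k / fs
        kernel.append(
            t ** (order - 1)
            * math.exp(-2 * math.pi * b * t)
            * cmath.exp(2j * math.pi * fc * t)
        )
    return kernel

def band_strength(signal, fc, fs):
    """|inner product| of a real signal with the complex kernel at fc.

    Using a complex kernel makes the measure insensitive to the phase
    of the input, so it reflects how much energy lies near fc.
    """
    kernel = gammatone_kernel(fc, fs)
    return abs(sum(s * h.conjugate() for s, h in zip(signal, kernel)))

# A 1 kHz tone excites the 1 kHz channel far more than the 4 kHz channel.
fs = 16000
tone = [math.sin(2 * math.pi * 1000 * k / fs) for k in range(800)]  # 50 ms
on_band = band_strength(tone, 1000, fs)
off_band = band_strength(tone, 4000, fs)
```

    In the full model, band strengths of this kind would drive the adaptive choice of the number of filters, their center frequencies, and their bandwidths.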

  19. Auditory Efferent System Modulates Mosquito Hearing.

    Science.gov (United States)

    Andrés, Marta; Seifert, Marvin; Spalthoff, Christian; Warren, Ben; Weiss, Lukas; Giraldo, Diego; Winkler, Margret; Pauls, Stephanie; Göpfert, Martin C

    2016-08-01

    The performance of vertebrate ears is controlled by auditory efferents that originate in the brain and innervate the ear, synapsing onto hair cell somata and auditory afferent fibers [1-3]. Efferent activity can provide protection from noise and facilitate the detection and discrimination of sound by modulating mechanical amplification by hair cells and transmitter release as well as auditory afferent action potential firing [1-3]. Insect auditory organs are thought to lack efferent control [4-7], but when we inspected mosquito ears, we obtained evidence for its existence. Antibodies against synaptic proteins recognized rows of bouton-like puncta running along the dendrites and axons of mosquito auditory sensory neurons. Electron microscopy identified synaptic and non-synaptic sites of vesicle release, and some of the innervating fibers co-labeled with somata in the CNS. Octopamine, GABA, and serotonin were identified as efferent neurotransmitters or neuromodulators that affect auditory frequency tuning, mechanical amplification, and sound-evoked potentials. Mosquito brains thus modulate mosquito ears, extending the use of auditory efferent systems from vertebrates to invertebrates and adding new levels of complexity to mosquito sound detection and communication. PMID:27476597

  20. Colliding Cues in Word Segmentation: The Role of Cue Strength and General Cognitive Processes

    Science.gov (United States)

    Weiss, Daniel J.; Gerfen, Chip; Mitchel, Aaron D.

    2010-01-01

    The process of word segmentation is flexible, with many strategies potentially available to learners. This experiment explores how segmentation cues interact, and whether successful resolution of cue competition is related to general executive functioning. Participants listened to artificial speech streams that contained both statistical and…

  1. Counterconditioning reduces cue-induced craving and actual cue-elicited consumption.

    NARCIS (Netherlands)

    D. van Gucht; F. Baeyens; D. Vansteenwegen; D. Hermans; T. Beckers

    2010-01-01

    Cue-induced craving is not easily reduced by an extinction or exposure procedure and may constitute an important route toward relapse in addictive behavior after treatment. In the present study, we investigated the effectiveness of counterconditioning as an alternative procedure to reduce cue-induced craving.

  2. Cues for Better Writing: Empirical Assessment of a Word Counter and Cueing Application's Effectiveness

    Science.gov (United States)

    Vijayasarathy, Leo R.; Gould, Susan Martin; Gould, Michael

    2015-01-01

    Written clarity and conciseness are desired by employers and emphasized in business communication courses. We developed and tested the efficacy of a cueing tool--Scribe Bene--to help students reduce their use of imprecise and ambiguous words and wordy phrases. Effectiveness was measured by comparing cue word usage between a treatment group given…

  3. The effect of visual cues on difficulty ratings for segregation of musical streams in listeners with impaired hearing.

    Directory of Open Access Journals (Sweden)

    Hamish Innes-Brown

    Full Text Available BACKGROUND: Enjoyment of music is an important part of life that may be degraded for people with hearing impairments, especially those using cochlear implants. The ability to follow separate lines of melody is an important factor in music appreciation. This ability relies on effective auditory streaming, which is much reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues could reduce the subjective difficulty of segregating a melody from interleaved background notes in normally hearing listeners, those using hearing aids, and those using cochlear implants. METHODOLOGY/PRINCIPAL FINDINGS: Normally hearing listeners (N = 20), hearing aid users (N = 10), and cochlear implant users (N = 11) were asked to rate the difficulty of segregating a repeating four-note melody from random interleaved distracter notes. The pitch of the background notes was gradually increased or decreased throughout blocks, providing a range of difficulty from easy (with a large pitch separation between melody and distracter) to impossible (with the melody and distracter completely overlapping). Visual cues were provided on half the blocks, and difficulty ratings for blocks with and without visual cues were compared between groups. Visual cues reduced the subjective difficulty of extracting the melody from the distracter notes for normally hearing listeners and cochlear implant users, but not hearing aid users. CONCLUSION/SIGNIFICANCE: Simple visual cues may improve the ability of cochlear implant users to segregate lines of music, thus potentially increasing their enjoyment of music. More research is needed to determine what type of acoustic cues to encode visually in order to optimise the benefits they may provide.

  4. Auditory functional magnetic resonance imaging in dogs – normalization and group analysis and the processing of pitch in the canine auditory pathways

    OpenAIRE

    Bach, Jan-Peter; Lüpke, Matthias; Dziallas, Peter; Wefstaedt, Patrick; Uppenkamp, Stefan; Seifert, Hermann; Nolte, Ingo

    2016-01-01

    Background Functional magnetic resonance imaging (fMRI) is an advanced and frequently used technique for studying brain functions in humans and increasingly so in animals. A key element of analyzing fMRI data is group analysis, for which valid spatial normalization is a prerequisite. In the current study we applied normalization and group analysis to a dataset from an auditory functional MRI experiment in anesthetized beagles. The stimulation paradigm used in the experiment was composed of si...

  5. Functional Neurochemistry of the Auditory System

    Directory of Open Access Journals (Sweden)

    Nourollah Agha Ebrahimi

    1993-03-01

    Full Text Available Functional neurochemistry is one of the fields of study of the auditory system that has seen outstanding development in recent years. Many of its findings have given not only basic auditory researchers but also clinicians new points of view in audiology. Here, we discuss the latest investigations in the functional neurochemistry of the auditory system, focusing this review mainly on research that raises hope for future clinical studies.

  6. Auditory Neuropathy/Dyssynchrony in Biotinidase Deficiency

    Science.gov (United States)

    Yaghini, Omid

    2016-01-01

    Biotinidase deficiency is an autosomal recessive disorder showing evidence of hearing loss and optic atrophy in addition to seizures, hypotonia, and ataxia. In the present study, a 2-year-old boy with biotinidase deficiency is presented who showed auditory neuropathy/auditory dys-synchrony (AN/AD) alongside the clinical symptoms. In this case, transient-evoked otoacoustic emissions showed bilaterally normal responses, representing normal function of the outer hair cells. In contrast, the acoustic reflex test showed absent reflexes bilaterally, and visual reinforcement audiometry and auditory brainstem responses indicated severe to profound hearing loss in both ears. These results suggest AN/AD in patients with biotinidase deficiency. PMID:27144235

  8. Auditory filters at low frequencies

    DEFF Research Database (Denmark)

    Orellana, Carlos Andrés Jurado; Pedersen, Christian Sejer; Møller, Henrik

    2009-01-01

    Prediction and assessment of low-frequency noise problems requires information about the auditory filter characteristics at low frequencies. Unfortunately, data at low frequencies are scarce, and practically no results have been published for frequencies below 100 Hz. Extrapolation of ERB results......-ear transfer function), the asymmetry of the auditory filter changed from steeper high-frequency slopes at 1000 Hz to steeper low-frequency slopes below 100 Hz. Increasing steepness at low frequencies of the middle-ear high-pass filter is thought to cause this effect. The dynamic range of the auditory filter...
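
    For context, the ERB values whose extrapolation this entry questions come from the standard Glasberg & Moore (1990) fit, which is supported by data only from roughly 100 Hz to 10 kHz; a minimal sketch:

```python
def erb_hz(fc_hz):
    """Equivalent rectangular bandwidth (Hz) of the auditory filter at
    centre frequency fc_hz, per the Glasberg & Moore (1990) fit.

    The fit is based on data from roughly 100 Hz to 10 kHz; applying it
    below 100 Hz is exactly the extrapolation cautioned against above.
    """
    return 24.7 * (4.37 * fc_hz / 1000.0 + 1.0)

# The ERB narrows toward low frequencies, approaching ~24.7 Hz as fc -> 0.
erb_1k = erb_hz(1000.0)   # ~132.6 Hz
erb_100 = erb_hz(100.0)   # ~35.5 Hz
```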

  9. Large cross-sectional study of presbycusis reveals rapid progressive decline in auditory temporal acuity.

    Science.gov (United States)

    Ozmeral, Erol J; Eddins, Ann C; Frisina, D Robert; Eddins, David A

    2016-07-01

    The auditory system relies on extraordinarily precise timing cues for the accurate perception of speech, music, and object identification. Epidemiological research has documented the age-related progressive decline in hearing sensitivity that is known to be a major health concern for the elderly. Although smaller investigations indicate that auditory temporal processing also declines with age, such measures have not been included in larger studies. Temporal gap detection thresholds (TGDTs; an index of auditory temporal resolution) measured in 1071 listeners (aged 18-98 years) were shown to decline at a minimum rate of 1.05 ms (15%) per decade. Age was a significant predictor of TGDT when controlling for audibility (partial correlation) and when restricting analyses to persons with normal-hearing sensitivity (n = 434). The TGDTs were significantly better for males (3.5 ms; 51%) than females when averaged across the life span. These results highlight the need for indices of temporal processing in diagnostics, as treatment targets, and as factors in models of aging. PMID:27255816
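
    The reported decline of at least 1.05 ms per decade implies a simple linear projection across the adult lifespan. The sketch below is purely illustrative: the 3.0 ms baseline at age 18 is an assumed placeholder, not a value reported by the study.

```python
def projected_tgdt_ms(age_years, baseline_ms=3.0, decline_ms_per_decade=1.05):
    """Linear projection of the temporal gap detection threshold (TGDT).

    baseline_ms at age 18 is a hypothetical placeholder; the study reports
    only the minimum rate of decline (1.05 ms, or 15%, per decade).
    """
    return baseline_ms + decline_ms_per_decade * (age_years - 18) / 10.0

# Projected thresholds at a few ages spanning the study's 18-98 year range.
projections = {age: round(projected_tgdt_ms(age), 2) for age in (18, 48, 78, 98)}
```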

  10. Classification across the senses: Auditory-visual cognitive performance in a California sea lion (Zalophus californianus)

    Science.gov (United States)

    Lindemann, Kristy L.; Reichmuth-Kastak, Colleen; Schusterman, Ronald J.

    2005-09-01

    The model of stimulus equivalence describes how perceptually dissimilar stimuli can become interrelated to form useful categories both within and between the sensory modalities. A recent experiment expanded upon prior work with a California sea lion by examining stimulus classification across the auditory and visual modalities. Acoustic stimuli were associated with an exemplar from one of two pre-existing visual classes in a matching-to-sample paradigm. After direct training of these associations, the sea lion showed spontaneous transfer of the new auditory stimuli to the remaining members of the visual classes. The sea lion's performance on this cross-modal equivalence task was similar to that shown by human subjects in studies of emergent word learning and reading comprehension. Current research with the same animal further examines how stimulus classes can be expanded across modalities. Fast-mapping techniques are used to rapidly establish new auditory-visual relationships between acoustic cues and multiple arbitrary visual stimuli. Collectively, this research illustrates complex cross-modal performances in a highly experienced subject and provides insight into how animals organize information from multiple sensory modalities into meaningful representations.

  12. Enhancing Manual Scan Registration Using Audio Cues

    Science.gov (United States)

    Ntsoko, T.; Sithole, G.

    2014-04-01

    Indoor mapping and modelling requires that acquired data be processed by editing, fusing, and formatting, amongst other operations. Currently, the user's manual interaction with the point cloud during processing is visual. Visual interaction has limitations, however. One way of dealing with these limitations is to augment point cloud processing with audio. Audio augmentation entails associating points of interest in the point cloud with audio objects. In coarse scan registration, reverberation, intensity, and frequency audio cues were exploited to help the user estimate the depth and occupancy of space of points of interest. Depth estimates were reliable when intensity and frequency were both used as depth cues, so coarse changes of depth could be estimated in this manner. The depth between surfaces can therefore be estimated with the aid of the audio objects. Sound reflections of an audio object provided reliable information about the object's surroundings in some instances. For a point or area of interest in the point cloud, these reflections can be used to determine the unseen events around that point or area. Other processing techniques could benefit from this approach while further information is estimated using other audio cues, such as binaural cues and head-related transfer functions; these could be used to estimate the positions of audio objects to aid in problems such as indoor navigation.

  13. Word segmentation with universal prosodic cues.

    Science.gov (United States)

    Endress, Ansgar D; Hauser, Marc D

    2010-09-01

    When listening to speech from one's native language, words seem to be well separated from one another, like beads on a string. When listening to a foreign language, in contrast, words seem almost impossible to extract, as if there was only one bead on the same string. This contrast reveals that there are language-specific cues to segmentation. The puzzle, however, is that infants must be endowed with a language-independent mechanism for segmentation, as they ultimately solve the segmentation problem for any native language. Here, we approach the acquisition problem by asking whether there are language-independent cues to segmentation that might be available to even adult learners who have already acquired a native language. We show that adult learners recognize words in connected speech when only prosodic cues to word-boundaries are given from languages unfamiliar to the participants. In both artificial and natural speech, adult English speakers, with no prior exposure to the test languages, readily recognized words in natural languages with critically different prosodic patterns, including French, Turkish and Hungarian. We suggest that, even though languages differ in their sound structures, they carry universal prosodic characteristics. Further, these language-invariant prosodic cues provide a universally accessible mechanism for finding words in connected speech. These cues may enable infants to start acquiring words in any language even before they are fine-tuned to the sound structure of their native language.

  14. Assessing the aging effect on auditory-verbal memory by Persian version of dichotic auditory verbal memory test

    Directory of Open Access Journals (Sweden)

    Zahra Shahidipour

    2014-01-01

    Conclusion: Based on the obtained results, a significant reduction in auditory memory was seen in the aged group, and the Persian version of the dichotic auditory-verbal memory test, like many other auditory-verbal memory tests, showed the effects of aging on auditory-verbal memory performance.

  15. AUDITORY CORTICAL PLASTICITY: DOES IT PROVIDE EVIDENCE FOR COGNITIVE PROCESSING IN THE AUDITORY CORTEX?

    OpenAIRE

    Irvine, Dexter R. F.

    2007-01-01

    The past 20 years have seen substantial changes in our view of the nature of the processing carried out in auditory cortex. Some processing of a cognitive nature, previously attributed to higher order “association” areas, is now considered to take place in auditory cortex itself. One argument adduced in support of this view is the evidence indicating a remarkable degree of plasticity in the auditory cortex of adult animals. Such plasticity has been demonstrated in a wide range of paradigms, i...

  16. Temporal visual cues aid speech recognition

    DEFF Research Database (Denmark)

    Zhou, Xiang; Ross, Lars; Lehn-Schiøler, Tue;

    2006-01-01

    BACKGROUND: It is well known that under noisy conditions, viewing a speaker's articulatory movements aids the recognition of spoken words. Conventionally it is thought that the visual input disambiguates otherwise confusing auditory input. HYPOTHESIS: In contrast, we hypothesize that it is the temporal synchronicity of the visual input that aids parsing of the auditory stream. More specifically, we expected that purely temporal information, which does not convey information such as place of articulation, may facilitate word recognition. METHODS: To test this prediction we used temporal features of audio to generate an artificial talking-face video and measured word recognition performance on simple monosyllabic words. RESULTS: When presenting words together with the artificial video we find that word recognition is improved over purely auditory presentation. The effect is significant (p...

  17. Topographical cues regulate the crosstalk between MSCs and macrophages

    Science.gov (United States)

    Vallés, Gema; Bensiamar, Fátima; Crespo, Lara; Arruebo, Manuel; Vilaboa, Nuria; Saldaña, Laura

    2015-01-01

    Implantation of scaffolds may elicit a host foreign-body response triggered by monocyte/macrophage lineage cells. Growing evidence suggests that the topographical cues of scaffolds play an important role in MSC functionality. In this work, we examined whether surface topographical features can regulate the paracrine interactions that MSCs establish with macrophages. Three-dimensional (3D) topography sensing drives MSCs into a spatial arrangement that stimulates the production of the anti-inflammatory proteins PGE2 and TSG-6. Compared to two-dimensional (2D) settings, the 3D arrangement of MSCs co-cultured with macrophages leads to a marked decrease in the secretion of soluble factors related to inflammation and chemotaxis, including IL-6 and MCP-1. Attenuation of MCP-1 secretion in 3D co-cultures correlates with a decrease in the accumulation of its mRNA levels in MSCs and macrophages. Using neutralizing antibodies, we identified that the interplay between PGE2, IL-6, TSG-6 and MCP-1 in the co-cultures is strongly influenced by the micro-architecture that supports MSCs. The local inflammatory milieu provided by 3D-arranged MSCs in co-cultures induces a decrease in monocyte migration compared to monolayer cells. This effect is partially mediated by reduced levels of IL-6 and MCP-1, proteins that up-regulate each other's secretion. Our findings highlight the importance of topographical cues in the soluble-factor-guided communication between MSCs and macrophages. PMID:25453943

  18. Highly effective photonic cue for repulsive axonal guidance.

    Directory of Open Access Journals (Sweden)

    Bryan J Black

    Full Text Available In vivo nerve repair requires not only the ability to regenerate damaged axons, but most importantly, the ability to guide developing or regenerating axons along paths that will result in functional connections. Furthermore, basic studies in neuroscience and neuro-electronic interface design require the ability to construct in vitro neural circuitry. Both these applications require the development of a noninvasive, highly effective tool for axonal growth-cone guidance. To date, a myriad of technologies have been introduced based on chemical, electrical, mechanical, and hybrid approaches (such as electro-chemical, optofluidic flow and photo-chemical methods). These methods are either lacking in desired spatial and temporal selectivity or require the introduction of invasive external factors. Within the last fifteen years, however, several attractive guidance techniques have been developed using purely light-based cues to achieve axonal guidance. Here, we report a novel, purely optical repulsive guidance technique that uses low-power, near-infrared light, and demonstrate the guidance of primary goldfish retinal ganglion cell axons through turns of up to 120 degrees and over distances of ∼90 µm.

  19. Memory for location and visual cues in white-eared hummingbirds Hylocharis leucotis

    Directory of Open Access Journals (Sweden)

    Guillermo PÉREZ, Carlos LARA, José VICCON-PALE, Martha SIGNORET-POILLON

    2011-08-01

    Full Text Available In nature hummingbirds face floral resources whose availability, quality and quantity can vary spatially and temporally. Thus, they must constantly make foraging decisions about which patches, plants and flowers to visit, partly as a function of the nectar reward. The uncertainty of these decisions would possibly be reduced if an individual could remember locations or use visual cues to avoid revisiting recently depleted flowers. In the present study, we carried out field experiments with white-eared hummingbirds Hylocharis leucotis to evaluate their use of locations or visual cues when foraging on natural flowers Penstemon roseus. We evaluated the use of spatial memory by observing birds while they were foraging between two plants and within a single plant. Our results showed that hummingbirds prefer to use location when foraging between two plants, but they also use visual cues to efficiently locate unvisited rewarded flowers when they feed on a single plant. However, in the absence of visual cues, in both experiments birds mainly used the location of previously visited flowers to make subsequent visits. Our data suggest that hummingbirds are capable of learning and of employing this flexibility depending on the environmental conditions faced and the information acquired in previous visits [Current Zoology 57 (4): 468–476, 2011].

  20. Coordinated sensor cueing for chemical plume detection

    Science.gov (United States)

    Abraham, Nathan J.; Jensenius, Andrea M.; Watkins, Adam S.; Hawthorne, R. Chad; Stepnitz, Brian J.

    2011-05-01

    This paper describes an organic data fusion and sensor cueing approach for Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. The Joint Warning and Reporting Network (JWARN) uses a hardware component referred to as the JWARN Component Interface Device (JCID). The Edgewood Chemical and Biological Center (ECBC) has developed a small-footprint, open-architecture solution for the JCID capability called JCID-on-a-Chip (JoaC). The JoaC program aims to reduce the cost and complexity of the JCID by shrinking the necessary functionality down to a small single-board computer. This effort focused on the development of a fusion and cueing algorithm organic to the JoaC hardware. By embedding this capability in the JoaC, sensors have the ability to receive and process cues from other sensors without the use of a complex and costly centralized infrastructure. Additionally, the JoaC software is hardware agnostic, as evidenced by its drop-in inclusion in two different system-on-a-chip platforms spanning Windows CE and LINUX environments. In this effort, a partnership among JPM-CA, JHU/APL, and ECBC, the authors implemented and demonstrated a new algorithm for cooperative detection and localization of a chemical agent plume. The experiment used a pair of mobile Joint Services Lightweight Standoff Chemical Agent Detector (JSLSCAD) units controlled by fusion and cueing algorithms hosted on a JoaC. The algorithms embedded in the JoaC enabled the two sensor systems to perform cross-cueing and to cooperatively form a higher-fidelity estimate of chemical releases by combining sensor readings. Additionally, by using the cross-cue information from the other sensor, each JSLSCAD could focus its search on smaller regions than those required by a single sensor system.

  1. In search of an auditory engram

    Science.gov (United States)

    Fritz, Jonathan; Mishkin, Mortimer; Saunders, Richard C.

    2005-01-01

    Monkeys trained preoperatively on a task designed to assess auditory recognition memory were impaired after removal of either the rostral superior temporal gyrus or the medial temporal lobe but were unaffected by lesions of the rhinal cortex. Behavioral analysis indicated that this result occurred because the monkeys did not or could not use long-term auditory recognition, and so depended instead on short-term working memory, which is unaffected by rhinal lesions. The findings suggest that monkeys may be unable to place representations of auditory stimuli into a long-term store and thus question whether the monkey's cerebral memory mechanisms in audition are intrinsically different from those in other sensory modalities. Furthermore, they raise the possibility that language is unique to humans not only because it depends on speech but also because it requires long-term auditory memory. PMID:15967995

  2. Effect of omega-3 on auditory system

    Directory of Open Access Journals (Sweden)

    Vida Rahimi

    2014-01-01

    Full Text Available Background and Aim: Omega-3 fatty acids have structural and biological roles in the body's various systems, and numerous studies have investigated them. The auditory system is affected as well. The aim of this article was to review research on the effect of omega-3 on the auditory system. Methods: We searched the Medline, Google Scholar, PubMed, Cochrane Library and SID search engines with the keywords "auditory" and "omega-3", and read textbooks on this subject published between 1970 and 2013. Conclusion: Both excess and deficient amounts of dietary omega-3 fatty acids can harm fetal and infant growth and the development of the brain and central nervous system, especially the auditory system. It is therefore important to determine the adequate dosage of omega-3.

  3. Auditory perception modulated by word reading.

    Science.gov (United States)

    Cao, Liyu; Klepp, Anne; Schnitzler, Alfons; Gross, Joachim; Biermann-Ruben, Katja

    2016-10-01

    Theories of embodied cognition positing that sensorimotor areas are indispensable during language comprehension are supported by neuroimaging and behavioural studies. Among others, the auditory system has been suggested to be important for understanding sound-related words (visually presented) and the motor system for action-related words. In this behavioural study, using a sound detection task embedded in a lexical decision task, we show that in participants with high lexical decision performance sound verbs improve auditory perception. The amount of modulation was correlated with lexical decision performance. Our study provides convergent behavioural evidence of auditory cortex involvement in word processing, supporting the view of embodied language comprehension concerning the auditory domain. PMID:27324193

  4. [Approaches to therapy of auditory agnosia].

    Science.gov (United States)

    Fechtelpeter, A; Göddenhenrich, S; Huber, W; Springer, L

    1990-01-01

    In a 41-year-old stroke patient with bitemporal brain damage, we found severe signs of auditory agnosia 6 months after onset. Recognition of environmental sounds was extremely impaired when tested in a multiple-choice sound-picture matching task, whereas auditory discrimination between sounds and picture identification by written names were almost undisturbed. In a therapy experiment, we tried to enhance sound recognition via semantic categorization and association, imitation of sounds, and analysis of auditory features, respectively. The stimulation of conscious auditory analysis proved to be increasingly effective over a 4-week period of therapy. We were able to show that the patient's improvement was not simply an effect of practice: it was stable and carried over to nontrained items.

  5. Auditory stimulation and cardiac autonomic regulation

    Directory of Open Access Journals (Sweden)

    Vitor E. Valenti

    2012-08-01

    Full Text Available Previous studies have already demonstrated that auditory stimulation with music influences the cardiovascular system. In this study, we described the relationship between musical auditory stimulation and heart rate variability. Searches were performed with the Medline, SciELO, Lilacs and Cochrane databases using the following keywords: "auditory stimulation", "autonomic nervous system", "music" and "heart rate variability". The selected studies indicated that there is a strong correlation between noise intensity and vagal-sympathetic balance. Additionally, it was reported that music therapy improved heart rate variability in anthracycline-treated breast cancer patients. It was hypothesized that dopamine release in the striatal system induced by pleasurable songs is involved in cardiac autonomic regulation. Musical auditory stimulation influences heart rate variability through a neural mechanism that is not well understood. Further studies are necessary to develop new therapies to treat cardiovascular disorders.

  6. Environment for Auditory Research Facility (EAR)

    Data.gov (United States)

    Federal Laboratory Consortium — EAR is an auditory perception and communication research center enabling state-of-the-art simulation of various indoor and outdoor acoustic environments. The heart...

  7. Material differences of auditory source retrieval:Evidence from event-related potential studies

    Institute of Scientific and Technical Information of China (English)

    NIE AiQing; GUO ChunYan; SHEN MoWei

    2008-01-01

    Two event-related potential experiments were conducted to investigate the temporal and spatial distributions of the old/new effects for an item recognition task and an auditory source retrieval task, using pictures and Chinese characters as stimuli respectively. During the study phase, stimuli were presented at the center of the screen while their names were simultaneously read out by either a female or a male voice; two tests were then performed separately. One test task was to differentiate the old items from the new ones; the other was to judge items that had been read out by a particular voice during the study phase as targets and all other items as non-targets. The results showed that the old/new effect of the auditory source retrieval task was more sustained over time than that of the item recognition task in both experiments, and the spatial distribution of the former effect was wider than that of the latter. Both experiments recorded a reliable old/new effect over the prefrontal cortex during the source retrieval task. However, there were some differences in the old/new effect for the auditory source retrieval task between pictures and Chinese characters, and LORETA source analysis indicated that these differences might be rooted in the temporal lobe. These findings demonstrate that the relationship between the old/new effects of the item recognition task and the auditory source retrieval task supports the dual-process model; the spatial and temporal distributions of the old/new effect elicited by the auditory source retrieval task are regulated by both the features of the experimental material and the perceptual attributes of the voice.

  8. Videotaped Modeling with and without Verbal Cues

    OpenAIRE

    Rowland, Amy Lee

    2004-01-01

    The purpose of this study was to investigate the use of videotaped modeling of a tennis skill with and without verbal cues. Eighteen female players from two NCAA Division III colleges served as the subjects for the study. The players were randomly assigned to one of two groups. Both groups viewed a modeling videotape which contained a 56-second clip of a female professional hitting forehand groundstrokes looped seven times. Group One's tape included verbal cues on balance, posture, an...

  9. Auditory sequence analysis and phonological skill

    OpenAIRE

    Grube, Manon; Kumar, Sukhbinder; Cooper, Freya E.; Turton, Stuart; Griffiths, Timothy D

    2012-01-01

    This work tests the relationship between auditory and phonological skill in a non-selected cohort of 238 school students (age 11) with the specific hypothesis that sound-sequence analysis would be more relevant to phonological skill than the analysis of basic, single sounds. Auditory processing was assessed across the domains of pitch, time and timbre; a combination of six standard tests of literacy and language ability was used to assess phonological skill. A significant correlation between ...

  10. Auditory memory function in expert chess players

    OpenAIRE

    Fattahi, Fariba; Geshani, Ahmad; Jafari, Zahra; Jalaie, Shohreh; Salman Mahini, Mona

    2015-01-01

    Background: Chess is a game that involves many aspects of high-level cognition such as memory, attention, focus and problem solving. Long-term practice of chess can improve cognitive performance and behavioral skills. Auditory memory, like other behavioral skills, can be influenced by the strengthening processes that follow long-term chess playing, because of shared processing pathways in the brain. The purpose of this study was to evaluate the auditory memory function of expert...

  11. Auditory brain-stem responses in syphilis.

    OpenAIRE

    Rosenhall, U; Roupe, G

    1981-01-01

    Analysis of auditory brain-stem electrical responses (BSER) provides an effective means of detecting lesions in the auditory pathways. In the present study the wave patterns were analysed in 11 patients with secondary or latent syphilis with no clinical symptoms referable to the central nervous system and in two patients with congenital syphilis and general paralysis. Decreased amplitudes and prolonged latencies occurred frequently in patients with secondary and with advanced syphilis. This ...

  12. Task-switching effects for visual and auditory pro- and antisaccades: evidence for a task-set inertia.

    Science.gov (United States)

    Heath, Matthew; Starrs, Faryn; Macpherson, Ewan; Weiler, Jeffrey

    2015-01-01

    The completion of an antisaccade delays the reaction time (RT) of a subsequent prosaccade; however, the converse switch does not influence RT. In accounting for this result, the task-set inertia hypothesis contends that antisaccades engender a persistent nonstandard task-set that delays the planning of a subsequent prosaccade. In contrast, the coordinate system transformation hypothesis asserts that the transformation required to construct a mirror-symmetrical target representation persistently inhibits prosaccade planning. The authors tested the latter hypothesis by examining switch-costs for pro- and antisaccades directed to visual (i.e., the stimuli used in previous work) and auditory targets. Notably, auditory cues are specified in a head-centered frame of reference prior to their conversion into the retinocentric coordinates necessary for saccade output. Thus, if the coordinate system transformation hypothesis is correct then auditory pro- and antisaccades should elicit a bidirectional switch-cost because each requires a coordinate transformation. RTs for visual and auditory modalities showed a reliable--and equivalent magnitude--prosaccade switch-cost. Moreover, performance (e.g., movement time) and kinematic (e.g., velocity) variables indicated the switch-cost was restricted to response planning. As such, results are incompatible with the coordinate system transformation hypothesis and therefore provide convergent evidence that a task-set inertia contributes to the prosaccade switch-cost.

  13. Auditory model inversion and its application

    Institute of Scientific and Technical Information of China (English)

    ZHAO Heming; WANG Yongqi; CHEN Xueqin

    2005-01-01

    Auditory models have been applied to several areas of speech signal processing and appear to be effective in performance. This paper presents the inverse transform of each stage of one widely used auditory model. First, it is necessary to invert the correlogram, reconstructing phase information by repeated iterations, in order to recover the auditory-nerve firing rate. The next step is to obtain the negative parts of the signal via the reverse of the HWR (half-wave rectification). Finally, the functions of the inner hair cell/synapse model and the Gammatone filters have to be inverted. Thus the whole auditory model inversion is achieved. An application to noisy speech enhancement based on the auditory model inversion algorithm is proposed. Many experiments show that this method is effective in reducing noise, especially when the SNR of the noisy speech is low, where it outperforms other methods. The auditory model inversion method given in this paper is therefore applicable to the speech enhancement field.
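    The iterative phase reconstruction this abstract describes, alternating between a magnitude representation and a time-domain estimate, is closely related to the classic Griffin-Lim procedure. The sketch below is illustrative only: it assumes a plain STFT magnitude (not the paper's correlogram stage) and SciPy; the function name griffin_lim is ours.

```python
import numpy as np
from scipy.signal import stft, istft

def griffin_lim(magnitude, n_iter=50, nperseg=256, seed=0):
    """Reconstruct a time signal from an STFT magnitude by iterating
    analysis/synthesis passes: on each pass the current time-domain
    estimate is re-analysed, its phase is kept, and the fixed target
    magnitude is restored (a generic Griffin-Lim sketch)."""
    rng = np.random.default_rng(seed)
    # Start from a random phase; the magnitude stays fixed throughout.
    phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, magnitude.shape))
    spec = magnitude * phase
    for _ in range(n_iter):
        _, x = istft(spec, nperseg=nperseg)           # back to time domain
        _, _, z = stft(x, nperseg=nperseg)            # re-analyse
        z = z[:, :magnitude.shape[1]]                 # no-op when shapes match
        spec = magnitude * np.exp(1j * np.angle(z))   # keep phase, restore magnitude
    _, x = istft(spec, nperseg=nperseg)
    return x
```

    With consistent STFT/ISTFT settings the spectral-magnitude error is non-increasing over iterations, which is the property the paper's repetitious-iteration step relies on.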

  14. Simple ears-flexible behavior: Information processing in the moth auditory pathway

    Institute of Scientific and Technical Information of China (English)

    Gerit PFUHL; Blanka KALINOVA; Irena VALTEROVA; Bente G.BERG

    2015-01-01

    Lepidoptera evolved tympanic ears in response to echolocating bats. Comparative studies have shown that moth ears evolved many times independently from chordotonal organs. With only 1 to 4 receptor cells, they are among the simplest hearing organs. The small number of receptors does not imply simplicity, in either behavior or the neural circuit. Behaviorally, the response to ultrasound is far from being a simple reflex. Moths' escape behavior is modulated by a variety of cues, especially pheromones, which can alter the auditory response. Neurally, the receptor cell(s) diverge onto many interneurons, enabling parallel processing and feature extraction. Ascending interneurons and sound-sensitive brain neurons innervate a neuropil in the ventrolateral protocerebrum. Further, recent electrophysiological data provide the first glimpses into how the acoustic response is modulated and how ultrasound influences the other senses. So far, the auditory pathway has been studied in noctuids. The findings agree well with common computational principles found in other insects. However, moth ears also show unique mechanical and neural adaptations. Here, we first describe the variety of moths' auditory behavior, especially the co-option of ultrasonic signals for intraspecific communication. Second, we describe the current knowledge of the neural pathway gained from noctuid moths. Finally, we argue that Galleriinae, which show both negative and positive phonotaxis, are an interesting model taxon for future electrophysiological studies of the auditory pathway and multimodal sensory integration, and so are ideally suited for the study of the evolution of behavioral mechanisms given a few receptors [Current Zoology 61 (2): 292–302, 2015].

  15. Cue reactivity in virtual reality: the role of context.

    Science.gov (United States)

    Paris, Megan M; Carter, Brian L; Traylor, Amy C; Bordnick, Patrick S; Day, Susan X; Armsworth, Mary W; Cinciripini, Paul M

    2011-07-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. As an alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method for eliciting and assessing cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, like cigarettes. This study examined the role of contextual cues in a VR environment in evoking craving. Smokers were exposed to a virtual convenience store devoid of any specific cigarette cues, followed by exposure to the same convenience store with specific cigarette cues added. Smokers reported increased craving following exposure to the virtual convenience store without specific cues, and significantly greater craving following the convenience store with cigarette cues added. However, the increased craving recorded after the second convenience store may have been due to the pre-exposure to the first. This study offers evidence that an environmental context where cigarette cues are normally present (but are not) elicits significant craving in the absence of specific cigarette cues. This finding suggests that VR may have stronger ecological validity than traditional cue-reactivity exposure methods by exposing smokers to the full range of cigarette-related environmental stimuli, in addition to specific cigarette cues, that smokers typically experience in their daily lives. PMID:21349649

  16. Effects of similarity on environmental context cueing.

    Science.gov (United States)

    Smith, Steven M; Handy, Justin D; Angello, Genna; Manzano, Isabel

    2014-01-01

    Three experiments examined the prediction that context cues which are similar to study contexts can facilitate episodic recall, even if those cues are never seen before the recall test. Environmental context cueing effects have typically produced such small effect sizes that influences of moderating factors, such as the similarity between encoding and retrieval contexts, would be difficult to observe experimentally. Videos of environmental contexts, however, can be used to produce powerful context-dependent memory effects, particularly when only one memory target is associated with each video context, intentional item-context encoding is encouraged, and free recall tests are used. Experiment 1 showed that a not previously viewed video of the study context provided an effective recall cue, although it was not as effective as the originally viewed video context. Experiments 2 and 3 showed that videos of environments that were conceptually similar to encoding contexts (e.g., both were videos of ball field games) also cued recall, but not as well if the encoding contexts were given specific labels (e.g., "home run") incompatible with test contexts (e.g., a soccer scene). A fourth experiment that used incidental item-context encoding showed that video context reinstatement has a robust effect on paired associate memory, indicating that the video context reinstatement effect does not depend on interactive item-context encoding or free recall testing. PMID:23721293

  17. Visual Cues and Listening Effort: Individual Variability

    Science.gov (United States)

    Picou, Erin M.; Ricketts, Todd A; Hornsby, Benjamin W. Y.

    2011-01-01

    Purpose: To investigate the effect of visual cues on listening effort as well as whether predictive variables such as working memory capacity (WMC) and lipreading ability affect the magnitude of listening effort. Method: Twenty participants with normal hearing were tested using a paired-associates recall task in 2 conditions (quiet and noise) and…

  19. Adaptive auditory feedback control of the production of formant trajectories in the Mandarin triphthong /iau/ and its pattern of generalization.

    Science.gov (United States)

    Cai, Shanqing; Ghosh, Satrajit S; Guenther, Frank H; Perkell, Joseph S

    2010-10-01

    In order to test whether auditory feedback is involved in the planning of complex articulatory gestures in time-varying phonemes, the current study examined native Mandarin speakers' responses to auditory perturbations of their auditory feedback of the trajectory of the first formant frequency during their production of the triphthong /iau/. On average, subjects adaptively adjusted their productions to partially compensate for the perturbations in auditory feedback. This result indicates that auditory feedback control of speech movements is not restricted to quasi-static gestures in monophthongs as found in previous studies, but also extends to time-varying gestures. To probe the internal structure of the mechanisms of auditory-motor transformations, the pattern of generalization of the adaptation learned on the triphthong /iau/ to other vowels with different temporal and spatial characteristics (produced only under masking noise) was tested. A broad but weak pattern of generalization was observed; the strength of the generalization diminished with increasing dissimilarity from /iau/. The details and implications of the pattern of generalization are examined and discussed in light of previous sensorimotor adaptation studies of both speech and limb motor control and a neurocomputational model of speech motor control. PMID:20968374

  20. Task-specific modulation of human auditory evoked responses in a delayed-match-to-sample task

    Directory of Open Access Journals (Sweden)

    Feng eRong

    2011-05-01

    Full Text Available In this study, we focus our investigation on task-specific cognitive modulation of early cortical auditory processing in the human cerebral cortex. During the experiments, we acquired whole-head magnetoencephalography (MEG) data while participants were performing an auditory delayed-match-to-sample (DMS) task and associated control tasks. Using a spatial-filtering beamformer technique to simultaneously estimate multiple source activities inside the human brain, we observed a significant DMS-specific suppression of the auditory evoked response to the second stimulus in a sound pair, with the center of the effect located in the vicinity of the left auditory cortex. For the right auditory cortex, a task-nonspecific suppression effect was observed in both the DMS and control tasks. Furthermore, coherence analysis revealed a beta-band (12~20 Hz) DMS-specific enhancement of the functional interaction between sources in the left auditory cortex and those in the left inferior frontal gyrus, which has been shown to be involved in short-term memory processing during the delay period of the DMS task. Our findings support the view that early evoked cortical responses to incoming acoustic stimuli can be modulated by task-specific cognitive functions by means of frontal-temporal functional interactions.

  1. Memory for location and visual cues in white-eared hummingbirds Hylocharis leucotis

    OpenAIRE

    Guillermo PÉREZ, Carlos LARA, José VICCON-PALE, Martha SIGNORET-POILLON

    2011-01-01

    In nature hummingbirds face floral resources whose availability, quality and quantity can vary spatially and temporally. Thus, they must constantly make foraging decisions about which patches, plants and flowers to visit, partly as a function of the nectar reward. The uncertainty of these decisions would possibly be reduced if an individual could remember locations or use visual cues to avoid revisiting recently depleted flowers. In the present study, we carried out field experiments with whi...

  2. The (un)clear effects of invalid retro-cues.

    Directory of Open Access Journals (Sweden)

    Marcel Gressmann

    2016-03-01

    Full Text Available Studies with the retro-cue paradigm have shown that validly cueing objects in visual working memory long after encoding can still benefit performance on subsequent change-detection tasks. With regard to the effects of invalid cues, the literature is less clear. Some studies reported costs, others did not. Here we revisit two recent studies that made interesting suggestions concerning invalid retro-cues: one suggested that costs only occur for larger set sizes, and another suggested that the inclusion of invalid retro-cues diminishes the retro-cue benefit. New data from one experiment and a reanalysis of published data are provided to address these conclusions. The new data clearly show costs (and benefits) that were independent of set size, and the reanalysis suggests no influence of the inclusion of invalid retro-cues on the retro-cue benefit. Thus, previous interpretations should be treated with some caution at present.

  3. Effect of auditory feedback differs according to side of hemiparesis: a comparative pilot study

    Directory of Open Access Journals (Sweden)

    Bensmail Djamel

    2009-12-01

    Full Text Available Abstract Background Following stroke, patients frequently demonstrate loss of motor control and function and altered kinematic parameters of reaching movements. Feedback is an essential component of rehabilitation, and auditory feedback of kinematic parameters may be a useful tool for rehabilitation of reaching movements at the impairment level. The aim of this study was to investigate the effect of two types of auditory feedback on the kinematics of reaching movements in hemiparetic stroke patients and to compare differences between patients with right (RHD) and left hemisphere damage (LHD). Methods 10 healthy controls, 8 stroke patients with LHD and 8 with RHD were included. Patient groups had similar levels of upper limb function. Two types of auditory feedback (spatial and simple) were developed and provided online during reaching movements to 9 targets in the workspace. Kinematics of the upper limb were recorded with an electromagnetic system. Kinematics were compared between groups (Mann-Whitney test) and the effect of auditory feedback on kinematics was tested within each patient group (Friedman test). Results In the patient groups, peak hand velocity was lower, the number of velocity peaks was higher and movements were more curved than in the healthy group. Despite having a similar clinical level, kinematics differed between LHD and RHD groups. Peak velocity was similar, but LHD patients had fewer velocity peaks and less curved movements than RHD patients. The addition of auditory feedback improved the curvature index in patients with RHD and worsened peak velocity, the number of velocity peaks and the curvature index in LHD patients. No difference between types of feedback was found in either patient group. Conclusion In stroke patients, side of lesion should be considered when examining arm reaching kinematics. Further studies are necessary to evaluate differences in responses to auditory feedback between patients with lesions in opposite

  4. Auditory-prefrontal axonal connectivity in the macaque cortex: quantitative assessment of processing streams.

    Science.gov (United States)

    Bezgin, Gleb; Rybacki, Konrad; van Opstal, A John; Bakker, Rembrandt; Shen, Kelly; Vakorin, Vasily A; McIntosh, Anthony R; Kötter, Rolf

    2014-08-01

    Primate sensory systems subserve complex neurocomputational functions. Consequently, these systems are organised anatomically in a distributed fashion, commonly linking areas to form specialised processing streams. Each stream is related to a specific function, as evidenced by studies of the visual cortex, which features rather prominent segregation into spatial and non-spatial domains. It has been hypothesised that other sensory systems, including the auditory system, are organised in a similar way at the cortical level. Recent studies offer rich qualitative evidence for the dual-stream hypothesis. Here we provide a new paradigm to quantitatively uncover these patterns in the auditory system, based on an analysis of multiple anatomical studies using multivariate techniques. As a test case, we also apply our assessment techniques to the more ubiquitously explored visual system. Importantly, the introduced framework opens the possibility for these techniques to be applied to other neural systems featuring a dichotomised organisation, such as language or music perception. PMID:24980416

  5. Emotional pictures and sounds: A review of multimodal interactions of emotion cues in multiple domains

    Directory of Open Access Journals (Sweden)

    Antje B M Gerdes

    2014-12-01

    Full Text Available In everyday life, multiple sensory channels jointly trigger emotional experiences, and one channel may alter processing in another channel. For example, seeing an emotional facial expression and hearing the voice's emotional tone will jointly create the emotional experience. This example, where auditory and visual input is related to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but can extend to much broader contexts including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended to consider an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging, electrophysiological, and peripheral physiological findings. Furthermore, we integrate these findings and identify similarities or differences. We conclude with suggestions for future research.

  6. Expression and function of scleraxis in the developing auditory system.

    Directory of Open Access Journals (Sweden)

    Zoe F Mann

    Full Text Available A study of genes expressed in the developing inner ear identified the bHLH transcription factor Scleraxis (Scx) in the developing cochlea. Previous work has demonstrated an essential role for Scx in the differentiation and development of tendons, ligaments and cells of chondrogenic lineage. Expression in the cochlea has been shown previously; however, the functional role for Scx in the cochlea is unknown. Using a Scx-GFP reporter mouse line we examined the spatial and temporal patterns of Scx expression in the developing cochlea between embryonic day 13.5 and postnatal day 25. Embryonically, Scx is expressed broadly throughout the cochlear duct and surrounding mesenchyme, and at postnatal ages becomes restricted to the inner hair cells and the interdental cells of the spiral limbus. Deletion of Scx results in hearing impairment, indicated by elevated auditory brainstem response (ABR) thresholds and diminished distortion product otoacoustic emission (DPOAE) amplitudes across a range of frequencies. No changes in either gross cochlear morphology or expression of the Scx target genes Col2A, Bmp4 or Sox9 were observed in Scx(-/-) mutants, suggesting that the auditory defects observed in these animals may be a result of unidentified Scx-dependent processes within the cochlea.

  7. Auditory sustained field responses to periodic noise

    Directory of Open Access Journals (Sweden)

    Keceli Sumru

    2012-01-01

    Full Text Available Abstract Background Auditory sustained responses have recently been suggested to reflect neural processing of speech sounds in the auditory cortex. As periodic fluctuations below the pitch range are important for speech perception, it is necessary to investigate how low-frequency periodic sounds are processed in the human auditory cortex. Auditory sustained responses have been shown to be sensitive to temporal regularity, but the relationship between the amplitudes of auditory evoked sustained responses and the repetition rates of auditory inputs remains elusive. As the temporal and spectral features of sounds enhance different components of sustained responses, previous studies with click trains and vowel stimuli presented diverging results. In order to investigate the effect of repetition rate on cortical responses, we analyzed the auditory sustained fields evoked by periodic and aperiodic noises using magnetoencephalography. Results Sustained fields were elicited by white noise and repeating frozen noise stimuli with repetition rates of 5, 10, 50, 200 and 500 Hz. The sustained field amplitudes were significantly larger for all the periodic stimuli than for white noise. Although the sustained field amplitudes showed a rising and falling pattern within the repetition rate range, the response amplitudes to the 5 Hz repetition rate were significantly larger than to 500 Hz. Conclusions The enhanced sustained field responses to periodic noises show that cortical sensitivity to periodic sounds is maintained for a wide range of repetition rates. Persistence of periodicity sensitivity below the pitch range suggests that, in addition to processing the fundamental frequency of voice, sustained field generators can also resolve low-frequency temporal modulations in the speech envelope.
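
The "repeating frozen noise" stimuli described above can be sketched in a few lines: draw one fixed noise segment whose length is the reciprocal of the repetition rate, then tile it for the stimulus duration. The parameters (sampling rate, duration, normalization) below are illustrative assumptions, not the authors' actual stimulus code.

```python
import numpy as np

def frozen_noise(rep_rate_hz, duration_s=1.0, fs=44100, seed=0):
    """Repeating frozen noise: a single fixed Gaussian-noise segment of
    length 1/rep_rate, tiled for the requested duration (a sketch)."""
    rng = np.random.default_rng(seed)
    seg_len = int(round(fs / rep_rate_hz))      # samples per repetition
    segment = rng.normal(size=seg_len)
    n_reps = int(np.ceil(duration_s * fs / seg_len))
    sound = np.tile(segment, n_reps)[: int(duration_s * fs)]
    return sound / np.abs(sound).max()          # normalize to +/-1

stim_5hz = frozen_noise(5)     # period = 200 ms, below the pitch range
stim_500hz = frozen_noise(500) # period = 2 ms, within the pitch range
print(stim_5hz.shape, stim_500hz.shape)
```

Setting the seed freezes the segment, so every repetition is sample-identical; white noise (the aperiodic control) would simply draw fresh samples throughout.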

  8. Beethoven's Last Piano Sonata and Those Who Follow Crocodiles: Cross-Domain Mappings of Auditory Pitch in a Musical Context

    Science.gov (United States)

    Eitan, Zohar; Timmers, Renee

    2010-01-01

    Though auditory pitch is customarily mapped in Western cultures onto spatial verticality (high-low), both anthropological reports and cognitive studies suggest that pitch may be mapped onto a wide variety of other domains. We collected a total of 35 pitch mappings and investigated in four experiments how these mappings are used and…

  9. Cue Reactivity in Virtual Reality: The Role of Context

    OpenAIRE

    Paris, Megan M.; Carter, Brian L.; Traylor, Amy C.; Bordnick, Patrick S.; Day, Susan X.; Armsworth, Mary W.; Cinciripini, Paul M.

    2011-01-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. As an alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method to elicit and assess cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, like ci...

  10. Reactivity to Cannabis Cues in Virtual Reality Environments†

    OpenAIRE

    Bordnick, Patrick S.; Copp, Hilary L.; Traylor, Amy; Graap, Ken M.; Carter, Brian L.; Walton, Alicia; Ferrer, Mirtha

    2009-01-01

    Virtual reality (VR) cue environments have been developed and successfully tested in nicotine, cocaine, and alcohol abusers. Aims in the current article include the development and testing of a novel VR cannabis cue reactivity assessment system. It was hypothesized that subjective craving levels and attention to cannabis cues would be higher in VR environments with cannabis cues compared to VR neutral environments. Twenty nontreatment-seeking current cannabis smokers participated in th...

  11. Extinction of Drug Cue Reactivity in Methamphetamine-Dependent Individuals

    OpenAIRE

    Price, Kimber L.; Saladin, Michael E.; Baker, Nathaniel L.; Tolliver, Bryan K.; DeSantis, Stacia M.; McRae-Clark, Aimee L.; Brady, Kathleen T.

    2010-01-01

    Conditioned responses to drug-related environmental cues (such as craving) play a critical role in relapse to drug use. Animal models demonstrate that repeated exposure to drug-associated cues in the absence of drug administration leads to the extinction of conditioned responses, but the few existing clinical trials focused on extinction of conditioned responses to drug-related cues in drug-dependent individuals show equivocal results. The current study examined drug-related cue reactivity an...

  12. The use of auditory and visual context in speech perception by listeners with normal hearing and listeners with cochlear implants

    Directory of Open Access Journals (Sweden)

    Matthew Winn

    2013-11-01

    Full Text Available There is a wide range of acoustic and visual variability across different talkers and different speaking contexts. Listeners with normal hearing accommodate that variability in ways that facilitate efficient perception, but it is not known whether listeners with cochlear implants can do the same. In this study, listeners with normal hearing (NH) and listeners with cochlear implants (CIs) were tested for accommodation to auditory and visual phonetic contexts created by gender-driven speech differences as well as vowel coarticulation and lip rounding in both consonants and vowels. Accommodation was measured as the shifting of perceptual boundaries between /s/ and /ʃ/ sounds in various contexts, as modeled by mixed-effects logistic regression. Owing to the spectral contrasts thought to underlie these context effects, CI listeners were predicted to perform poorly, but showed considerable success. Listeners with cochlear implants not only showed sensitivity to auditory cues to gender, they were also able to use visual cues to gender (i.e., faces) as a supplement or proxy for information in the acoustic domain, in a pattern that was not observed for listeners with normal hearing. Spectrally degraded stimuli heard by listeners with normal hearing generally did not elicit strong context effects, underscoring the limitations of noise vocoders and/or the importance of experience with electric hearing. Visual cues for consonant lip rounding and vowel lip rounding were perceived in a manner consistent with coarticulation and were generally used more heavily by listeners with CIs. Results suggest that listeners with cochlear implants are able to accommodate various sources of acoustic variability either by attending to appropriate acoustic cues or by inferring them via the visual signal.
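
The perceptual-boundary measure above can be illustrated with a simplified sketch: fit a logistic psychometric function to the proportion of /s/ responses along a spectral continuum in each context, and read off the 50% crossover point. This is not the study's mixed-effects model (which pools subjects and trials); the continuum steps and response proportions below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Psychometric function: P(respond /s/) along a spectral continuum;
    x0 is the category boundary (50% point), k the slope."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical proportions of /s/ responses along a 9-step continuum,
# in two visual-context conditions (unrounded vs rounded lips).
steps = np.arange(1, 10, dtype=float)
p_unrounded = np.array([0.02, 0.05, 0.10, 0.30, 0.55, 0.80, 0.93, 0.97, 0.99])
p_rounded   = np.array([0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97, 0.99, 1.00])

(b_unround, _), _ = curve_fit(logistic, steps, p_unrounded, p0=[5.0, 1.0])
(b_round, _), _ = curve_fit(logistic, steps, p_rounded, p0=[5.0, 1.0])

# A context effect shows up as a shift between the fitted boundaries.
print(f"boundary shift: {b_unround - b_round:.2f} continuum steps")
```

A per-condition fit like this captures the idea; the mixed-effects logistic regression used in the study additionally estimates per-listener random effects on intercept and slope.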

  13. Ambiguous Tilt and Translation Motion Cues after Space Flight and Otolith Assessment during Post-Flight Re-Adaptation

    Science.gov (United States)

    Wood, Scott J.; Clarke, A. H.; Harm, D. L.; Rupert, A. H.; Clement, G. R.

    2009-01-01

    Adaptive changes during space flight in how the brain integrates vestibular cues with other sensory information can lead to impaired movement coordination, vertigo, spatial disorientation and perceptual illusions following G-transitions. These studies are designed to examine both the physiological basis and operational implications of disorientation and tilt-translation disturbances following short-duration space flights.

  14. Extra-classical tuning predicts stimulus-dependent receptive fields in auditory neurons

    OpenAIRE

    Schneider, David M.; Woolley, Sarah M. N.

    2011-01-01

    The receptive fields of many sensory neurons are sensitive to statistical differences among classes of complex stimuli. For example, excitatory spectral bandwidths of midbrain auditory neurons and the spatial extent of cortical visual neurons differ during the processing of natural stimuli compared to the processing of artificial stimuli. Experimentally characterizing neuronal non-linearities that contribute to stimulus-dependent receptive fields is important for understanding how neurons res...

  15. The Effects of Overt and Covert Cues on Written Syntax.

    Science.gov (United States)

    Combs, Warren E.; Smith, William L.

    1980-01-01

    Experiments conducted with freshman composition students suggested that (1) the repeated use of a control stimulus passage does not result in increased syntactic complexity; (2) both overt and covert cues elicit more complex writing than do no-cue situations; and (3) the effect of overt cues seems to be retained, at least across a short duration.…

  16. Auditory and motor imagery modulate learning in music performance

    Science.gov (United States)

    Brown, Rachel M.; Palmer, Caroline

    2013-01-01

    Skilled performers such as athletes or musicians can improve their performance by imagining the actions or sensory outcomes associated with their skill. Performers vary widely in their auditory and motor imagery abilities, and these individual differences influence sensorimotor learning. It is unknown whether imagery abilities influence both memory encoding and retrieval. We examined how auditory and motor imagery abilities influence musicians' encoding (during Learning, as they practiced novel melodies), and retrieval (during Recall of those melodies). Pianists learned melodies by listening without performing (auditory learning) or performing without sound (motor learning); following Learning, pianists performed the melodies from memory with auditory feedback (Recall). During either Learning (Experiment 1) or Recall (Experiment 2), pianists experienced either auditory interference, motor interference, or no interference. Pitch accuracy (percentage of correct pitches produced) and temporal regularity (variability of quarter-note interonset intervals) were measured at Recall. Independent tests measured auditory and motor imagery skills. Pianists' pitch accuracy was higher following auditory learning than following motor learning and lower in motor interference conditions (Experiments 1 and 2). Both auditory and motor imagery skills improved pitch accuracy overall. Auditory imagery skills modulated pitch accuracy encoding (Experiment 1): Higher auditory imagery skill corresponded to higher pitch accuracy following auditory learning with auditory or motor interference, and following motor learning with motor or no interference. These findings suggest that auditory imagery abilities decrease vulnerability to interference and compensate for missing auditory feedback at encoding. Auditory imagery skills also influenced temporal regularity at retrieval (Experiment 2): Higher auditory imagery skill predicted greater temporal regularity during Recall in the presence of

  18. Effects of Caffeine on Auditory Brainstem Response

    Directory of Open Access Journals (Sweden)

    Saleheh Soleimanian

    2008-06-01

    Full Text Available Background and Aim: Blocking of adenosine receptors in the central nervous system by caffeine can increase the level of neurotransmitters such as glutamate. As adenosine receptors are present in almost all brain areas, including the central auditory pathway, caffeine may change conduction along this pathway. The purpose of this study was to evaluate the effects of caffeine on the latency and amplitude of the auditory brainstem response (ABR). Materials and Methods: In this clinical trial, 43 normal male students aged 18-25 years participated. The subjects consumed 0, 2 and 3 mg/kg body weight of caffeine in three different sessions. Auditory brainstem responses were recorded before and 30 minutes after caffeine consumption. The results were analyzed with Friedman and Wilcoxon tests to assess the effects of caffeine on the auditory brainstem response. Results: Compared to the control condition, the latencies of waves III and V and the I-V interpeak interval decreased significantly after 2 and 3 mg/kg caffeine consumption. Wave I latency decreased significantly after 3 mg/kg caffeine consumption (p<0.01). Conclusion: The increase in glutamate level resulting from adenosine receptor blocking brings about changes in conduction in the central auditory pathway.
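
The within-subject statistics described above (a Friedman test across the three doses, with Wilcoxon signed-rank follow-ups) can be sketched with scipy. The latency values below are simulated stand-ins, not the study's data; only the test structure mirrors the abstract.

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(1)
n = 43  # subjects, matching the abstract

# Hypothetical wave V latencies (ms) after 0, 2 and 3 mg/kg caffeine,
# with a small simulated latency decrease at the higher doses.
base = rng.normal(5.6, 0.2, n)
lat_0mg = base
lat_2mg = base - 0.08 + rng.normal(0, 0.03, n)
lat_3mg = base - 0.12 + rng.normal(0, 0.03, n)

# Omnibus test across the three repeated conditions
stat, p = friedmanchisquare(lat_0mg, lat_2mg, lat_3mg)
print(f"Friedman chi2={stat:.1f}, p={p:.3g}")

# Pairwise follow-up, as in the study's Wilcoxon comparisons
w, p_03 = wilcoxon(lat_0mg, lat_3mg)
print(f"0 vs 3 mg/kg: Wilcoxon W={w:.0f}, p={p_03:.3g}")
```

Both tests are nonparametric and paired, which is appropriate here because each subject contributes a latency under every dose.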

  19. The harmonic organization of auditory cortex.

    Science.gov (United States)

    Wang, Xiaoqin

    2013-01-01

    A fundamental structure of sounds encountered in the natural environment is harmonicity. Harmonicity is an essential component of music found in all cultures. It is also a unique feature of vocal communication sounds such as human speech and animal vocalizations. Harmonics in sounds are produced by a variety of acoustic generators and reflectors in the natural environment, including the vocal apparatuses of humans and other animal species as well as musical instruments of many types. We live in an acoustic world full of harmonicity. Given the widespread existence of harmonicity in many aspects of the hearing environment, it is natural to expect that it is reflected in the evolution and development of the auditory systems of both humans and animals, in particular the auditory cortex. Recent neuroimaging and neurophysiology experiments have identified regions of non-primary auditory cortex in humans and non-human primates that have selective responses to harmonic pitches. Accumulating evidence has also shown that neurons in many regions of the auditory cortex exhibit characteristic responses to harmonically related frequencies beyond the range of pitch. Together, these findings suggest that a fundamental organizational principle of auditory cortex is based on harmonicity. Such an organization likely plays an important role in music processing by the brain. It may also form the basis of the preference for particular classes of music and voice sounds. PMID:24381544

  20. Different mechanisms are responsible for dishabituation of electrophysiological auditory responses to a change in acoustic identity than to a change in stimulus location.

    Science.gov (United States)

    Smulders, Tom V; Jarvis, Erich D

    2013-11-01

    Repeated exposure to an auditory stimulus leads to habituation of the electrophysiological and immediate-early-gene (IEG) expression response in the auditory system. A novel auditory stimulus reinstates this response in a form of dishabituation. This has been interpreted as the start of new memory formation for this novel stimulus. Changes in the location of an otherwise identical auditory stimulus can also dishabituate the IEG expression response. This has been interpreted as an integration of stimulus identity and stimulus location into a single auditory object, encoded in the firing patterns of the auditory system. In this study, we further tested this hypothesis. Using chronic multi-electrode arrays to record multi-unit activity from the auditory system of awake and behaving zebra finches, we found that habituation occurs to repeated exposure to the same song and dishabituation with a novel song, similar to that described in head-fixed, restrained animals. A large proportion of recording sites also showed dishabituation when the same auditory stimulus was moved to a novel location. However, when the song was randomly moved among 8 interleaved locations, habituation occurred independently of the continuous changes in location. In contrast, when 8 different auditory stimuli were interleaved all from the same location, a separate habituation occurred to each stimulus. This result suggests that neuronal memories of the acoustic identity and spatial location are different, and that allocentric location of a stimulus is not encoded as part of the memory for an auditory object, while its acoustic properties are. We speculate that, instead, the dishabituation that occurs with a change from a stable location of a sound is due to the unexpectedness of the location change, and might be due to different underlying mechanisms than the dishabituation and separate habituations to different acoustic stimuli.

  1. Introspective responses to cues and motivation to reduce cigarette smoking influence state and behavioral responses to cue exposure.

    Science.gov (United States)

    Veilleux, Jennifer C; Skinner, Kayla D

    2016-09-01

    In the current study, we aimed to extend smoking cue-reactivity research by evaluating delay discounting as an outcome of cigarette cue exposure. We also separated introspection in response to cues (e.g., self-reporting craving and affect) from cue exposure alone, to determine if introspection changes behavioral responses to cigarette cues. Finally, we included measures of quit motivation and resistance to smoking to assess motivational influences on cue exposure. Smokers were invited to participate in an online cue-reactivity study. Participants were randomly assigned to view smoking images or neutral images, and were randomized to respond to cues with either craving and affect questions (e.g., introspection) or filler questions. Following cue exposure, participants completed a delay discounting task and then reported state affect, craving, and resistance to smoking, as well as an assessment of quit motivation. We found that after controlling for trait impulsivity, participants who introspected on craving and affect showed higher delay discounting, irrespective of cue type, but we found no effect of response condition on subsequent craving (e.g., craving reactivity). We also found that motivation to quit interacted with experimental conditions to predict state craving and state resistance to smoking. Although asking about craving during cue exposure did not increase later craving, it resulted in greater delaying of discounted rewards. Overall, our findings suggest the need to further assess the implications of introspection and motivation on behavioral outcomes of cue exposure. PMID:27115733
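
Delay discounting, the behavioral outcome used above, is conventionally summarized by fitting Mazur's hyperbolic model V = A / (1 + kD) to indifference points and reporting the discount rate k (steeper discounting means a larger k). The sketch below uses invented indifference points; it is a generic illustration of the measure, not the study's analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(delay_days, k):
    """Mazur's hyperbolic discounting: subjective value of a delayed
    $100 reward as a function of delay in days."""
    return 100.0 / (1.0 + k * delay_days)

# Hypothetical indifference points: immediate amounts judged equal in
# value to $100 delivered after each delay.
delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)
indiff = np.array([95, 80, 55, 35, 25, 15], dtype=float)

popt, _ = curve_fit(hyperbolic, delays, indiff, p0=[0.01])
k = popt[0]
print(f"fitted discount rate k = {k:.4f} per day")
```

In a cue-reactivity design, a higher fitted k after cue exposure (as in the introspection condition reported above) indicates stronger preference for smaller, sooner rewards.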

  3. The effect of background music in auditory health persuasion

    NARCIS (Netherlands)

    Elbert, Sarah; Dijkstra, Arie

    2013-01-01

    In auditory health persuasion, threatening information regarding health is communicated by voice only. One relevant context of auditory persuasion is the addition of background music. There are different mechanisms through which background music might influence persuasion, for example through mood (

  4. Auditory Discrimination Development through Vestibulo-Cochlear Stimulation.

    Science.gov (United States)

    Palmer, Lyelle L.

    1980-01-01

    Three types of vestibular activities (active, adaptive, and passively imposed) to improve auditory discrimination development are described and results of a study using the vestibular stimulation techniques with 20 Ss (average age 9) having abnormal auditory discrimination. (PHR)

  5. The role of visual spatial attention in audiovisual speech perception

    DEFF Research Database (Denmark)

    Andersen, Tobias; Tiippana, K.; Laarni, J.;

    2009-01-01

    Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect, in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive, but recent reports have challenged this view. Here we study the effect of visual spatial attention on the McGurk effect. By presenting a movie of two faces symmetrically displaced to each side of a central fixation point and dubbed with a single auditory speech track, we were able to discern the influences… integration did not change. Visual spatial attention was also able to select between the faces when lip reading. This suggests that visual spatial attention acts at the level of visual speech perception prior to audiovisual integration and that the effect propagates through audiovisual integration…

  6. Musical experience shapes top-down auditory mechanisms: evidence from masking and auditory attention performance.

    Science.gov (United States)

    Strait, Dana L; Kraus, Nina; Parbery-Clark, Alexandra; Ashley, Richard

    2010-03-01

    A growing body of research suggests that cognitive functions, such as attention and memory, drive perception by tuning sensory mechanisms to relevant acoustic features. Long-term musical experience also modulates lower-level auditory function, although the mechanisms by which this occurs remain uncertain. In order to tease apart the mechanisms that drive perceptual enhancements in musicians, we posed the question: do well-developed cognitive abilities fine-tune auditory perception in a top-down fashion? We administered a standardized battery of perceptual and cognitive tests to adult musicians and non-musicians, including tasks either more or less susceptible to cognitive control (e.g., backward versus simultaneous masking) and more or less dependent on auditory or visual processing (e.g., auditory versus visual attention). Outcomes indicate lower perceptual thresholds in musicians specifically for auditory tasks that correlate with cognitive abilities, such as backward masking and auditory attention. These enhancements were observed in the absence of group differences for the simultaneous masking and visual attention tasks. Our results suggest that long-term musical practice strengthens cognitive functions and that these functions benefit auditory skills. Musical training bolsters higher-level mechanisms that, when impaired, relate to language and literacy deficits. Thus, musical training may serve to lessen the impact of these deficits by strengthening the corticofugal system for hearing. PMID:20018234

  7. Effect of task-related continuous auditory feedback during learning of tracking motion exercises

    Directory of Open Access Journals (Sweden)

    Rosati Giulio

    2012-10-01

    Full Text Available Abstract Background This paper presents the results of a set of experiments in which we used continuous auditory feedback to augment motor training exercises. This feedback modality is mostly underexploited in current robotic rehabilitation systems, which usually implement only very basic auditory interfaces. Our hypothesis is that properly designed continuous auditory feedback could be used to represent temporal and spatial information that could, in turn, improve performance and motor learning. Methods We implemented three different experiments on healthy subjects, who were asked to track a target on a screen by moving an input device (controller) with their hand. Different visual and auditory feedback modalities were envisaged. The first experiment investigated whether continuous task-related auditory feedback can help improve performance to a greater extent than error-related audio feedback, or visual feedback alone. In the second experiment we used sensory substitution to compare different types of auditory feedback with equivalent visual feedback, in order to find out whether mapping the same information on a different sensory channel (the visual channel) yielded effects comparable to those gained in the first experiment. The final experiment applied a continuously changing visuomotor transformation between the controller and the screen and mapped kinematic information, computed in either coordinate system (controller or video), to the audio channel, in order to investigate which information was more relevant to the user. Results Task-related audio feedback significantly improved performance with respect to visual feedback alone, whilst error-related feedback did not. Secondly, performance in audio tasks was significantly better with respect to the equivalent sensory-substituted visual tasks. Finally, with respect to visual feedback alone, video-task-related sound feedback decreased the tracking error during the learning of a novel
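    The abstract does not specify how kinematic information was sonified; a common way to implement task-related continuous auditory feedback of this kind is to map a normalized kinematic variable (e.g., target speed) onto tone frequency on a logarithmic (musical) scale, so equal changes sound like equal pitch intervals. The sketch below is illustrative only; the function name and parameter values are assumptions, not taken from the study:

```python
def value_to_pitch(value, max_value=1.0, f_min=220.0, f_max=880.0):
    """Map a normalized task-related kinematic value (e.g., target speed)
    onto a tone frequency in Hz on a logarithmic scale, so that equal
    changes in the value correspond to equal musical intervals."""
    v = min(abs(value) / max_value, 1.0)   # clamp to [0, 1]
    return f_min * (f_max / f_min) ** v

# value 0.0 -> 220 Hz; value 0.5 -> 440 Hz (one octave up); value 1.0 -> 880 Hz
```

    In a tracking exercise, this frequency would be fed to a continuously updated oscillator, giving the subject an ongoing auditory estimate of the task variable without requiring visual attention.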

  8. Cooperative dynamics in auditory brain response

    CERN Document Server

    Kwapien, J; Liu, L C; Ioannides, A A

    1998-01-01

    Simultaneous estimates of the activity in the left and right auditory cortex of five normal human subjects were extracted from multichannel magnetoencephalography (MEG) recordings. Left, right, and binaural stimulation were used, in separate runs, for each subject. The resulting time series of left and right auditory cortex activity were analysed using the concept of mutual information. The analysis constitutes an objective method to address the nature of inter-hemispheric correlations in response to auditory stimulation. The results provide clear evidence for the occurrence of such correlations mediated by direct information transport, with clear laterality effects: as a rule, the contralateral hemisphere leads by 10-20 ms, as can be seen in the average signal. The strength of the inter-hemispheric coupling, which cannot be extracted from the average data, is found to be highly variable from subject to subject, but remarkably stable for each subject.
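    The abstract does not state which mutual-information estimator was used; a minimal sketch of the general approach, assuming a simple histogram-based estimator and a lag scan to detect which hemisphere leads (the function names and bin count are illustrative, not the authors' code), might look like this:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information (in bits) between two signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def lagged_mi(left, right, max_lag=20):
    """MI of the right-hemisphere series against lagged copies of the left;
    the lag maximizing MI suggests which series leads, and by how much."""
    return {lag: mutual_information(np.roll(left, lag), right)
            for lag in range(-max_lag, max_lag + 1)}
```

    Unlike the average signal, this coupling measure is computed trial by trial, which is why it can reveal subject-to-subject variability in coupling strength that averaging hides.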

  9. Applied research in auditory data representation

    Science.gov (United States)

    Frysinger, Steve P.

    1990-08-01

    A class of data displays, characterized generally as Auditory Data Representation, is described and motivated. This type of data representation takes advantage of the tremendous pattern recognition capability of the human auditory channel. Audible displays offer an alternative means of conveying quantitative data to the analyst to facilitate information extraction, and are successfully used alone and in conjunction with visual displays. The Auditory Data Representation literature is reviewed, along with elements of the allied fields of investigation, Psychoacoustics and Musical Perception. A methodology for applied research in this field, based upon the well-developed discipline of psychophysics, is elaborated using a recent experiment as a case study. This method permits objective evaluation of a data representation technique by comparing it to alternative displays for the pattern recognition task at hand. The psychophysical signal-to-noise threshold at constant pattern-recognition performance serves as the measure of display effectiveness.

  10. Are auditory percepts determined by experience?

    Science.gov (United States)

    Monson, Brian B; Han, Shui'Er; Purves, Dale

    2013-01-01

    Audition--what listeners hear--is generally studied in terms of the physical properties of sound stimuli and physiological properties of the auditory system. Based on recent work in vision, we here consider an alternative perspective that sensory percepts are based on past experience. In this framework, basic auditory qualities (e.g., loudness and pitch) are based on the frequency of occurrence of stimulus patterns in natural acoustic stimuli. To explore this concept of audition, we examined five well-documented psychophysical functions. The frequency of occurrence of acoustic patterns in a database of natural sound stimuli (speech) predicts some qualitative aspects of these functions, but with substantial quantitative discrepancies. This approach may offer a rationale for auditory phenomena that are difficult to explain in terms of the physical attributes of the stimuli as such.

  11. Are auditory percepts determined by experience?

    Directory of Open Access Journals (Sweden)

    Brian B Monson

    Full Text Available Audition--what listeners hear--is generally studied in terms of the physical properties of sound stimuli and physiological properties of the auditory system. Based on recent work in vision, we here consider an alternative perspective that sensory percepts are based on past experience. In this framework, basic auditory qualities (e.g., loudness and pitch) are based on the frequency of occurrence of stimulus patterns in natural acoustic stimuli. To explore this concept of audition, we examined five well-documented psychophysical functions. The frequency of occurrence of acoustic patterns in a database of natural sound stimuli (speech) predicts some qualitative aspects of these functions, but with substantial quantitative discrepancies. This approach may offer a rationale for auditory phenomena that are difficult to explain in terms of the physical attributes of the stimuli as such.

  12. Phonetic categorization in auditory word perception.

    Science.gov (United States)

    Ganong, W F

    1980-02-01

    To investigate the interaction in speech perception of auditory information and lexical knowledge (in particular, knowledge of which phonetic sequences are words), acoustic continua varying in voice onset time were constructed so that for each acoustic continuum, one of the two possible phonetic categorizations made a word and the other did not. For example, one continuum ranged between the word dash and the nonword tash; another used the nonword dask and the word task. In two experiments, subjects showed a significant lexical effect--that is, a tendency to make phonetic categorizations that make words. This lexical effect was greater at the phoneme boundary (where auditory information is ambiguous) than at the ends of the continua. Hence the lexical effect must arise at a stage of processing sensitive to both lexical knowledge and auditory information.

  13. Auditory temporal processes in the elderly

    Directory of Open Access Journals (Sweden)

    E. Ben-Artzi

    2011-03-01

    Full Text Available Several studies have reported age-related decline in auditory temporal resolution and in working memory. However, earlier studies did not provide evidence as to whether these declines reflect overall changes in the same mechanism, or age-related changes in two independent mechanisms. In the current study we examined whether the age-related declines in auditory temporal resolution and in working memory would remain significant even after controlling for their shared variance. Eighty-two participants, aged 21-82, performed the dichotic temporal order judgment task and the backward digit span task. The findings indicate that the age-related declines in auditory temporal resolution and in working memory are two independent processes.

  14. Brain dynamic mechanisms on the visual attention scale with Chinese characters cues

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The temporal dynamics evoked in the brain by the scale of visual attention, cued by Chinese characters, were studied by recording event-related potentials (ERPs). With the orientation of visual attention fixed, 14 healthy young participants performed a search task in which the search array was preceded by one of the Chinese character cues "大, 中, 小" (large, medium, small). 128-channel scalp ERPs were recorded to study the role that the scale of visual attention plays in visual spatial attention. The results showed no significant difference in the ERP components evoked by the three Chinese character cues except for the inferoposterior N2 latency. The target-evoked P2 and N2 amplitudes and latencies differed significantly across the large, medium, and small cues, while the P1 and N1 components did not. The results suggest that the processing of the scale of visual attention mainly involves the P2 and N2 components, whereas the P1 and N1 components are mainly related to the processing of visual orientation information.

  15. Scale Changes Provide an Alternative Cue For the Discrimination of Heading, But Not Object Motion.

    Science.gov (United States)

    Calabro, Finnegan J; Vaina, Lucia Maria

    2016-01-01

    BACKGROUND Understanding the dynamics of our surrounding environment is a task usually attributed to the detection of motion based on changes in luminance across space. Yet a number of other cues, both dynamic and static, have been shown to provide useful information about how we are moving and how objects around us move. One such cue, based on changes in spatial frequency, or scale, over time has been shown to convey motion in depth even in the absence of a coherent, motion-defined flow field (optic flow). MATERIAL AND METHODS 16 right-handed healthy observers (ages 18-28) participated in the behavioral experiments described in this study. Using analytical behavioral methods, we investigated the functional specificity of this cue by measuring the ability of observers to perform tasks of heading (direction of self-motion) and 3D trajectory discrimination on the basis of scale changes and optic flow. RESULTS Statistical analyses of performance on the test experiments in comparison to the control experiments suggest that while scale changes may be involved in the detection of heading, they are not correctly integrated with translational motion and thus do not support correct discrimination of 3D object trajectories. CONCLUSIONS These results have important implications for the type of visually guided navigation that can be performed by an observer blind to optic flow. Scale change is an important alternative cue for self-motion. PMID:27231114

  16. Cleaning MEG artifacts using external cues.

    Science.gov (United States)

    Tal, I; Abeles, M

    2013-07-15

    Recording brain activity with EEG, ECoG, MEG, or microelectrodes is prone to multiple artifacts. The main power line (mains line), video equipment, mechanical vibrations, and activities outside the brain are the most common sources of artifacts. MEG amplitudes are low, and even small artifacts distort recordings. In this study, we show how these artifacts can be efficiently removed by recording external cues during MEG recordings. These external cues are subsequently used to register the precise times or spectra of the artifacts. The results indicate that these procedures preserve both the spectra and the time-domain wave shapes of the neuromagnetic signal, while successfully reducing the contribution of the artifacts to the target signals without reducing the rank of the data.
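    The abstract does not give the authors' exact algorithm; two simple ways such externally registered artifact information is commonly exploited are (a) regressing each MEG channel on the recorded reference channels and subtracting the linearly coupled part, and (b) notching out the mains frequency identified from the reference spectrum. The sketch below illustrates both under those assumptions; function names and parameters are illustrative:

```python
import numpy as np

def remove_reference_artifact(meg, ref):
    """Regress each MEG channel on externally recorded reference (cue)
    channels and subtract the fitted part, removing the linearly coupled
    artifact. meg: (n_channels, n_samples); ref: (n_refs, n_samples)."""
    weights, *_ = np.linalg.lstsq(ref.T, meg.T, rcond=None)
    return meg - (ref.T @ weights).T

def notch_mains(signal, fs, mains_hz=50.0, width_hz=1.0):
    """Zero narrow FFT bands at the mains frequency and its first harmonic:
    a crude spectral notch that leaves the rest of the spectrum intact."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    for f0 in (mains_hz, 2.0 * mains_hz):
        spec[np.abs(freqs - f0) < width_hz] = 0.0
    return np.fft.irfft(spec, n=signal.size)
```

    Note that regression on reference channels, unlike component-rejection methods such as ICA or SSP, subtracts a fitted signal rather than projecting out a subspace, which is consistent with the abstract's point that the rank of the data is preserved.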

  17. Consumer attention to product health cues

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund

    different dairy products) were varied within subjects and the viewing task (free viewing, product healthfulness evaluation, purchase likelihood evaluation) was varied between subjects. As a follow-up, three more studies were carried out using verbal response measures to assess perceived product... was a 2x2 group mixed design manipulating health claims (absent, present) and taste claims (absent, present). Study 4 was a four-group between-subjects design manipulating food labels (a national organic label, EU organic label, a national keyhole label [indicating product healthfulness], a combination... of all labels). Results The only elements operating as health cues were the nutrition label and the organic label. The information cues used during purchase evaluation were the product category name and the nutrition label. Results also revealed that the probability a consumer will read the nutrition...

  18. What determines auditory distraction? On the roles of local auditory changes and expectation violations.

    Directory of Open Access Journals (Sweden)

    Jan P Röer

    Full Text Available Both the acoustic variability of a distractor sequence and the degree to which it violates expectations are important determinants of auditory distraction. In four experiments we examined the relative contribution of local auditory changes on the one hand and expectation violations on the other hand in the disruption of serial recall by irrelevant sound. We present evidence for a greater disruption by auditory sequences ending in unexpected steady-state distractor repetitions compared to auditory sequences with expected changing-state endings, even though the former contained fewer local changes. This effect was demonstrated with piano melodies (Experiment 1) and speech distractors (Experiment 2). Furthermore, it was replicated when the expectation violation occurred after the encoding of the target items (Experiment 3), indicating that the items' maintenance in short-term memory was disrupted by attentional capture and not their encoding. This seems to be primarily due to the violation of a model of the specific auditory distractor sequences, because the effect vanishes and even reverses when the experiment provides no opportunity to build up a specific neural model of the distractor sequence (Experiment 4). Nevertheless, the violation of abstract long-term knowledge about auditory regularities seems to cause a small and transient capture effect: disruption decreased markedly over the course of the experiments, indicating that participants habituated to the unexpected distractor repetitions across trials. The overall pattern of results adds to the growing literature that the degree to which auditory distractors violate situation-specific expectations is a more important determinant of auditory distraction than the degree to which a distractor sequence contains local auditory changes.

  19. Auditory Neuropathy Spectrum Disorder Masquerading as Social Anxiety

    OpenAIRE

    Behere, Rishikesh V.; Rao, Mukund G.; Mishra, Shree; Varambally, Shivarama; Nagarajarao, Shivashankar; Bangalore N Gangadhar

    2015-01-01

    The authors report a case of a 47-year-old man who presented with treatment-resistant anxiety disorder. Behavioral observation raised clinical suspicion of auditory neuropathy spectrum disorder. The presence of auditory neuropathy spectrum disorder was confirmed on audiological investigations. The patient was experiencing extreme symptoms of anxiety, which initially masked the underlying diagnosis of auditory neuropathy spectrum disorder. Challenges in diagnosis and treatment of auditory neuropathy spectrum disorder...

  20. ABR and auditory P300 findings in children with ADHD

    OpenAIRE

    Schochat Eliane; Scheuer Claudia Ines; Andrade Ênio Roberto de

    2002-01-01

    Auditory processing disorders (APD), also referred to as central auditory processing disorders (CAPD), and attention deficit hyperactivity disorders (ADHD) have become popular diagnostic entities for school-age children. A high incidence of ADHD comorbid with communication disorders and auditory processing disorder has been demonstrated. The aim of this study was to investigate ABR and P300 auditory evoked potentials in children with ADHD, in a double-blind study. Twenty-one children, ages bet...