WorldWideScience

Sample records for auditory spatial cues

  1. Cueing Visual Attention to Spatial Locations With Auditory Cues

    OpenAIRE

    Kean, Matthew; Crawford, Trevor J

    2008-01-01

    We investigated exogenous and endogenous orienting of visual attention to the spatial location of an auditory cue. In Experiment 1, significantly faster saccades were observed to visual targets appearing ipsilateral, compared to contralateral, to the peripherally-presented cue. This advantage was greatest in an 80% target-at-cue (TAC) condition but equivalent in 20% and 50% TAC conditions. In Experiment 2, participants maintained central fixation while making an elevation judgment of the pe...

  2. The Role of Auditory Cues in the Spatial Knowledge of Blind Individuals

    Science.gov (United States)

    Papadopoulos, Konstantinos; Papadimitriou, Kimon; Koutsoklenis, Athanasios

    2012-01-01

    The study presented here sought to explore the role of auditory cues in the spatial knowledge of blind individuals by examining the relation between the perceived auditory cues and the landscape of a given area and by investigating how blind individuals use auditory cues to create cognitive maps. The findings reveal that several auditory cues…

  3. A Dominance Hierarchy of Auditory Spatial Cues in Barn Owls

    OpenAIRE

    Witten, Ilana B.; Knudsen, Phyllis F.; Knudsen, Eric I.

    2010-01-01

    BACKGROUND: Barn owls integrate spatial information across frequency channels to localize sounds in space. METHODOLOGY/PRINCIPAL FINDINGS: We presented barn owls with synchronous sounds that contained different bands of frequencies (3-5 kHz and 7-9 kHz) from different locations in space. When the owls were confronted with the conflicting localization cues from two synchronous sounds of equal level, their orienting responses were dominated by one of the sounds: they oriented toward the locatio...

  4. Selective attention modulates human auditory brainstem responses: relative contributions of frequency and spatial cues.

    Directory of Open Access Journals (Sweden)

    Alexandre Lehmann

    Full Text Available Selective attention is the mechanism that allows focusing one's attention on a particular stimulus while filtering out a range of other stimuli, for instance, on a single conversation in a noisy room. Attending to one sound source rather than another changes activity in the human auditory cortex, but it is unclear whether attention to different acoustic features, such as voice pitch and speaker location, modulates subcortical activity. Studies using a dichotic listening paradigm indicated that auditory brainstem processing may be modulated by the direction of attention. We investigated whether endogenous selective attention to one of two speech signals affects amplitude and phase locking in auditory brainstem responses when the signals were either discriminable by frequency content alone, or by frequency content and spatial location. Frequency-following responses to the speech sounds were significantly modulated in both conditions. The modulation was specific to the task-relevant frequency band. The effect was stronger when both frequency and spatial information were available. Patterns of response were variable between participants, and were correlated with psychophysical discriminability of the stimuli, suggesting that the modulation was biologically relevant. Our results demonstrate that auditory brainstem responses are susceptible to efferent modulation related to behavioral goals. Furthermore, they suggest that mechanisms of selective attention actively shape activity at early subcortical processing stages according to task relevance and based on frequency and spatial cues.

  5. Cross-modal cueing in audiovisual spatial attention

    OpenAIRE

    Blurton, Steven Paul; Greenlee, Mark W.; Gondan, Matthias

    2015-01-01

    Visual processing is most effective at the location of our attentional focus. It has long been known that various spatial cues can direct visuospatial attention and influence the detection of auditory targets. Cross-modal cueing, however, seems to depend on the type of the visual cue: facilitation effects have been reported for endogenous visual cues while exogenous cues seem to be mostly ineffective. In three experiments, we investigated cueing effects on the processing of audiovisual signal...

  6. Spatial auditory processing in pinnipeds

    Science.gov (United States)

    Holt, Marla M.

    Given the biological importance of sound for a variety of activities, pinnipeds must be able to obtain spatial information about their surroundings through acoustic input in the absence of other sensory cues. The three chapters of this dissertation address spatial auditory processing capabilities of pinnipeds in air, given that these amphibious animals use acoustic signals for reproduction and survival on land. Two chapters are comparative lab-based studies that utilized psychophysical approaches conducted in an acoustic chamber. Chapter 1 addressed the frequency-dependent sound localization abilities at azimuth of three pinniped species (the harbor seal, Phoca vitulina, the California sea lion, Zalophus californianus, and the northern elephant seal, Mirounga angustirostris). While performances of the sea lion and harbor seal were consistent with the duplex theory of sound localization, the elephant seal, a low-frequency hearing specialist, showed a decreased ability to localize the highest frequencies tested. In Chapter 2, spatial release from masking (SRM), which occurs when a signal and masker are spatially separated, resulting in improved signal detectability relative to conditions in which they are co-located, was determined in a harbor seal and sea lion. Absolute and masked thresholds were measured at three frequencies and azimuths to determine the detection advantages afforded by this type of spatial auditory processing. Results showed that hearing sensitivity was enhanced by up to 19 and 12 dB in the harbor seal and sea lion, respectively, when the signal and masker were spatially separated. Chapter 3 was a field-based study that quantified both sender and receiver variables of the directional properties of male northern elephant seal calls produced within a communication system that serves to delineate dominance status. This included measuring call directivity patterns, observing male-male vocally-mediated interactions, and an acoustic playback study.

  7. Cross-modal cueing in audiovisual spatial attention

    DEFF Research Database (Denmark)

    Blurton, Steven Paul; Greenlee, Mark W.; Gondan, Matthias

    2015-01-01

    Visual processing is most effective at the location of our attentional focus. It has long been known that various spatial cues can direct visuospatial attention and influence the detection of auditory targets. Cross-modal cueing, however, seems to depend on the type of the visual cue: facilitation...... endogenous cues imply that the perception of multisensory signals is modulated by a single, supramodal system operating in a top-down manner (Experiment 1). In contrast, bottom-up control of attention, as observed in the exogenous cueing task of Experiment 2, mainly exerts its influence through modality...

  8. Auditory distance perception in humans: a review of cues, development, neuronal bases, and effects of sensory loss.

    Science.gov (United States)

    Kolarik, Andrew J; Moore, Brian C J; Zahorik, Pavel; Cirstea, Silvia; Pardhan, Shahina

    2016-02-01

    Auditory distance perception plays a major role in spatial awareness, enabling location of objects and avoidance of obstacles in the environment. However, it remains under-researched relative to studies of the directional aspect of sound localization. This review focuses on the following four aspects of auditory distance perception: cue processing, development, consequences of visual and auditory loss, and neurological bases. The several auditory distance cues vary in their effective ranges in peripersonal and extrapersonal space. The primary cues are sound level, reverberation, and frequency. Nonperceptual factors, including the importance of the auditory event to the listener, also can affect perceived distance. Basic internal representations of auditory distance emerge at approximately 6 months of age in humans. Although visual information plays an important role in calibrating auditory space, sensorimotor contingencies can be used for calibration when vision is unavailable. Blind individuals often manifest supranormal abilities to judge relative distance but show a deficit in absolute distance judgments. Following hearing loss, the use of auditory level as a distance cue remains robust, while the reverberation cue becomes less effective. Previous studies have not found evidence that hearing-aid processing affects perceived auditory distance. Studies investigating the brain areas involved in processing different acoustic distance cues are described. Finally, suggestions are given for further research on auditory distance perception, including broader investigation of how background noise and multiple sound sources affect perceived auditory distance for those with sensory loss. PMID:26590050
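    The primary level cue noted in this review can be made concrete with the free-field inverse-square law, under which sound level falls by about 6 dB for each doubling of source distance. A minimal sketch (the function name is illustrative, not from the review):

```python
import math

def level_drop_db(d_near: float, d_far: float) -> float:
    """Free-field level difference (dB) between two source distances,
    from the inverse-square law: level falls ~6 dB per doubling."""
    return 20 * math.log10(d_far / d_near)

# Doubling the distance (1 m -> 2 m) attenuates the level by ~6 dB.
print(round(level_drop_db(1.0, 2.0), 2))  # 6.02
```

    For example, moving a source from 1 m to 4 m attenuates it by roughly 12 dB, twice the per-doubling drop; in real rooms, reverberation reduces this dependence.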

  9. Retrosplenial Cortex Is Required for the Retrieval of Remote Memory for Auditory Cues

    Science.gov (United States)

    Todd, Travis P.; Mehlman, Max L.; Keene, Christopher S.; DeAngeli, Nicole E.; Bucci, David J.

    2016-01-01

    The retrosplenial cortex (RSC) has a well-established role in contextual and spatial learning and memory, consistent with its known connectivity with visuo-spatial association areas. In contrast, RSC appears to have little involvement with delay fear conditioning to an auditory cue. However, all previous studies have examined the contribution of…

  10. Similarity in Spatial Origin of Information Facilitates Cue Competition and Interference

    Science.gov (United States)

    Amundson, Jeffrey C.; Miller, Ralph R.

    2007-01-01

    Two lick suppression studies were conducted with water-deprived rats to investigate the influence of spatial similarity in cue interaction. Experiment 1 assessed the influence of similarity of the spatial origin of competing cues in a blocking procedure. Greater blocking was observed in the condition in which the auditory blocking cue and the…

  11. The plastic ear and perceptual relearning in auditory spatial perception.

    Science.gov (United States)

    Carlile, Simon

    2014-01-01

    The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear molds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10-60 days) performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localization, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear molds or through virtual auditory space stimulation using non-individualized spectral cues. The work with ear molds demonstrates that a relatively short period of training involving audio-motor feedback (5-10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide spatial cues but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses. PMID:25147497

  12. Tactile feedback improves auditory spatial localization

    OpenAIRE

    Gori, Monica; Vercillo, Tiziana; Sandini, Giulio; Burr, David

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds b...

  13. Tactile feedback improves auditory spatial localization

    OpenAIRE

    Monica Gori; Tiziana Vercillo; Giulio Sandini; David Burr

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial-bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds b...

  14. The plastic ear and perceptual relearning in auditory spatial perception.

    Directory of Open Access Journals (Sweden)

    Simon Carlile

    2014-08-01

    Full Text Available The auditory system of adult listeners has been shown to accommodate to altered spectral cues to sound location, which presumably provides the basis for recalibration to changes in the shape of the ear over a lifetime. Here we review the role of auditory and non-auditory inputs to the perception of sound location and consider a range of recent experiments looking at the role of non-auditory inputs in the process of accommodation to these altered spectral cues. A number of studies have used small ear moulds to modify the spectral cues, resulting in significant degradation in localization performance. Following chronic exposure (10-60 days) performance recovers to some extent, and recent work has demonstrated that this occurs for both audio-visual and audio-only regions of space. This begs the question as to the teacher signal for this remarkable functional plasticity in the adult nervous system. Following a brief review of the influence of the motor state in auditory localisation, we consider the potential role of auditory-motor learning in the perceptual recalibration of the spectral cues. Several recent studies have considered how multi-modal and sensory-motor feedback might influence accommodation to altered spectral cues produced by ear moulds or through virtual auditory space stimulation using non-individualised spectral cues. The work with ear moulds demonstrates that a relatively short period of training involving sensory-motor feedback (5-10 days) significantly improved both the rate and extent of accommodation to altered spectral cues. This has significant implications not only for the mechanisms by which this complex sensory information is encoded to provide a spatial code but also for adaptive training to altered auditory inputs. The review concludes by considering the implications for rehabilitative training with hearing aids and cochlear prostheses.

  15. Auditory rhythmic cueing in movement rehabilitation: findings and possible mechanisms

    Science.gov (United States)

    Schaefer, Rebecca S.

    2014-01-01

    Moving to music is intuitive and spontaneous, and music is widely used to support movement, most commonly during exercise. Auditory cues are increasingly also used in the rehabilitation of disordered movement, by aligning actions to sounds such as a metronome or music. Here, the effect of rhythmic auditory cueing on movement is discussed and representative findings of cued movement rehabilitation are considered for several movement disorders, specifically post-stroke motor impairment, Parkinson's disease and Huntington's disease. There are multiple explanations for the efficacy of cued movement practice. Potentially relevant, non-mutually exclusive mechanisms include the acceleration of learning; qualitatively different motor learning owing to an auditory context; effects of increased temporal skills through rhythmic practices and motivational aspects of musical rhythm. Further considerations of rehabilitation paradigm efficacy focus on specific movement disorders, intervention methods and complexity of the auditory cues. Although clinical interventions using rhythmic auditory cueing do not show consistently positive results, it is argued that internal mechanisms of temporal prediction and tracking are crucial, and further research may inform rehabilitation practice to increase intervention efficacy. PMID:25385780

  16. Auditory spatial localization: Developmental delay in children with visual impairments.

    Science.gov (United States)

    Cappagli, Giulia; Gori, Monica

    2016-01-01

    For individuals with visual impairments, auditory spatial localization is one of the most important features to navigate in the environment. Many works suggest that blind adults show similar or even enhanced performance for localization of auditory cues compared to sighted adults (Collignon, Voss, Lassonde, & Lepore, 2009). To date, the investigation of auditory spatial localization in children with visual impairments has provided contrasting results. Here we report, for the first time, that contrary to visually impaired adults, children with low vision or total blindness show a significant impairment in the localization of static sounds. These results suggest that simple auditory spatial tasks are compromised in children, and that this capacity recovers over time. PMID:27002960

  17. Visual–auditory spatial processing in auditory cortical neurons

    OpenAIRE

    Bizley, Jennifer K.; King, Andrew J

    2008-01-01

    Neurons responsive to visual stimulation have now been described in the auditory cortex of various species, but their functions are largely unknown. Here we investigate the auditory and visual spatial sensitivity of neurons recorded in 5 different primary and non-primary auditory cortical areas of the ferret. We quantified the spatial tuning of neurons by measuring the responses to stimuli presented across a range of azimuthal positions and calculating the mutual information (MI) between the ...

  18. Spatial audition in a static virtual environment: the role of auditory-visual interaction

    Directory of Open Access Journals (Sweden)

    Isabelle Viaud-Delmon

    2009-04-01

    Full Text Available The integration of the auditory modality in virtual reality environments is known to promote the sensations of immersion and presence. However, it is also known from psychophysics studies that auditory-visual interaction obeys complex rules and that multisensory conflicts may disrupt the participant's engagement with the presented virtual scene. It is thus important to measure the accuracy of the auditory spatial cues reproduced by the auditory display and their consistency with the spatial visual cues. This study evaluates auditory localization performances under various unimodal and auditory-visual bimodal conditions in a virtual reality (VR) setup using a stereoscopic display and binaural reproduction over headphones in static conditions. The auditory localization performances observed in the present study are in line with those reported in real conditions, suggesting that VR gives rise to consistent auditory and visual spatial cues. These results validate the use of VR for future psychophysics experiments with auditory and visual stimuli. They also emphasize the importance of a spatially accurate auditory and visual rendering for VR setups.

  19. The effect of exogenous spatial attention on auditory information processing.

    OpenAIRE

    Kanai, Kenichi; Ikeda, Kazuo; Tayama, Tadayuki

    2007-01-01

    This study investigated the effect of exogenous spatial attention on auditory information processing. In Experiments 1, 2 and 3, temporal order judgment tasks were performed to examine the effect. In Experiment 1 and 2, a cue tone was presented to either the left or right ear, followed by sequential presentation of two target tones. The subjects judged the order of presentation of the target tones. The results showed that subjects heard both tones simultaneously when the target tone, which wa...

  20. Tactile feedback improves auditory spatial localization.

    Science.gov (United States)

    Gori, Monica; Vercillo, Tiziana; Sandini, Giulio; Burr, David

    2014-01-01

    Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds before and after training, either with tactile feedback, verbal feedback, or no feedback. Audio thresholds were first measured with a spatial bisection task: subjects judged whether the second sound of a three sound sequence was spatially closer to the first or the third sound. The tactile feedback group underwent two audio-tactile feedback sessions of 100 trials, where each auditory trial was followed by the same spatial sequence played on the subject's forearm; auditory spatial bisection thresholds were evaluated after each session. In the verbal feedback condition, the positions of the sounds were verbally reported to the subject after each feedback trial. The no feedback group did the same sequence of trials, with no feedback. Performance improved significantly only after audio-tactile feedback. The results suggest that direct tactile feedback interacts with the auditory spatial localization system, possibly by a process of cross-sensory recalibration. Control tests with the subject rotated suggested that this effect occurs only when the tactile and acoustic sequences are spatially congruent. Our results suggest that the tactile system can be used to recalibrate the auditory sense of space. These results encourage the possibility of designing rehabilitation programs to help blind persons establish a robust auditory sense of space, through training with the tactile modality. PMID:25368587
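    Thresholds in a spatial bisection task of this kind are commonly read off a psychometric function fitted to the binary judgments; a sketch assuming a cumulative-Gaussian form (illustrative, not the authors' analysis code):

```python
from math import erf, sqrt

def p_closer_to_third(x: float, mu: float, sigma: float) -> float:
    """Cumulative-Gaussian psychometric function: probability of judging
    the second sound spatially closer to the third, as a function of its
    position x. mu is the bisection point (PSE); sigma is the threshold."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

# At the point of subjective equality the listener responds at chance.
print(p_closer_to_third(0.0, 0.0, 5.0))  # 0.5
```

    Under this convention, the threshold sigma is the positional change that moves performance from 50% to about 84% correct, so feedback-driven improvement appears as a smaller fitted sigma.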

  21. Tactile feedback improves auditory spatial localization

    Directory of Open Access Journals (Sweden)

    Monica Gori

    2014-10-01

    Full Text Available Our recent studies suggest that congenitally blind adults have severely impaired thresholds in an auditory spatial-bisection task, pointing to the importance of vision in constructing complex auditory spatial maps (Gori et al., 2014). To explore strategies that may improve the auditory spatial sense in visually impaired people, we investigated the impact of tactile feedback on spatial auditory localization in 48 blindfolded sighted subjects. We measured auditory spatial bisection thresholds before and after training, either with tactile feedback, verbal feedback or no feedback. Audio thresholds were first measured with a spatial bisection task: subjects judged whether the second sound of a three sound sequence was spatially closer to the first or the third sound. The tactile-feedback group underwent two audio-tactile feedback sessions of 100 trials, where each auditory trial was followed by the same spatial sequence played on the subject’s forearm; auditory spatial bisection thresholds were evaluated after each session. In the verbal-feedback condition, the positions of the sounds were verbally reported to the subject after each feedback trial. The no-feedback group did the same sequence of trials, with no feedback. Performance improved significantly only after audio-tactile feedback. The results suggest that direct tactile feedback interacts with the auditory spatial localization system, possibly by a process of cross-sensory recalibration. Control tests with the subject rotated suggested that this effect occurs only when the tactile and acoustic sequences are spatially coherent. Our results suggest that the tactile system can be used to recalibrate the auditory sense of space. These results encourage the possibility of designing rehabilitation programs to help blind persons establish a robust auditory sense of space, through training with the tactile modality.

  22. Auditory Cues Used for Wayfinding in Urban Environments by Individuals with Visual Impairments

    Science.gov (United States)

    Koutsoklenis, Athanasios; Papadopoulos, Konstantinos

    2011-01-01

    The study presented here examined which auditory cues individuals with visual impairments use more frequently and consider to be the most important for wayfinding in urban environments. It also investigated the ways in which these individuals use the most significant auditory cues. (Contains 1 table and 3 figures.)

  23. A review on auditory space adaptations to altered head-related cues

    Directory of Open Access Journals (Sweden)

    Catarina Mendonça

    2014-07-01

    Full Text Available In this article we present a review of current literature on adaptations to altered head-related auditory localization cues. Localization cues can be altered through ear blocks, ear molds, electronic hearing devices and altered head-related transfer functions. Three main methods have been used to induce auditory space adaptation: sound exposure, training with feedback, and explicit training. Adaptations induced by training, rather than exposure, are consistently faster. Studies on localization with altered head-related cues have reported poor initial localization, but improved accuracy and discriminability with training. Also, studies that displaced the auditory space by altering cue values reported adaptations in perceived source position to compensate for such displacements. Auditory space adaptations can last for a few months even without further contact with the learned cues. In most studies, localization with the subject’s own unaltered cues remained intact despite the adaptation to a second set of cues. Generalization is observed from trained to untrained sound source positions, but there is mixed evidence regarding cross-frequency generalization. Multiple brain areas might be involved in auditory space adaptation processes, but the auditory cortex may play a critical role. Auditory space plasticity may involve context-dependent cue reweighting.

  24. A lateralized auditory evoked potential elicited when auditory objects are defined by spatial motion.

    Science.gov (United States)

    Butcher, Andrew; Govenlock, Stanley W; Tata, Matthew S

    2011-02-01

    Scene analysis involves the process of segmenting a field of overlapping objects from each other and from the background. It is a fundamental stage of perception in both vision and hearing. The auditory system encodes complex cues that allow listeners to find boundaries between sequential objects, even when no gap of silence exists between them. In this sense, object perception in hearing is similar to perceiving visual objects defined by isoluminant color, motion or binocular disparity. Motion is one such cue: when a moving sound abruptly disappears from one location and instantly reappears somewhere else, the listener perceives two sequential auditory objects. Smooth reversals of motion direction do not produce this segmentation. We investigated the brain electrical responses evoked by this spatial segmentation cue and compared them to the familiar auditory evoked potential elicited by sound onsets. Segmentation events evoke a pattern of negative and positive deflections that are unlike those evoked by onsets. We identified a negative component in the waveform - the Lateralized Object-Related Negativity - generated by the hemisphere contralateral to the side on which the new sound appears. The relationship between this component and similar components found in related paradigms is considered. PMID:21056097

  25. Guiding a Person with Blindness and Intellectual Disability in Indoor Travel with Fewer Auditory Cues.

    Science.gov (United States)

    Lancioni, Giulio E.; O'Reilly, Mark F.; Oliva, Doretta; Bracalente, Sandro

    1998-01-01

    This study assessed the possibility of guiding a person with blindness and intellectual disability during indoor travel with fewer auditory cues. Results indicated that infrequent presentation of the cues and the provision of extra cues in case of errors maintained high levels of independent moves, albeit of increased duration. (Author/CR)

  26. Robust auditory localization using probabilistic inference and coherence-based weighting of interaural cues.

    Science.gov (United States)

    Kayser, Hendrik; Hohmann, Volker; Ewert, Stephan D; Kollmeier, Birger; Anemüller, Jörn

    2015-11-01

    Robust sound source localization is performed by the human auditory system even in challenging acoustic conditions and in previously unencountered, complex scenarios. Here a computational binaural localization model is proposed that possesses mechanisms for handling of corrupted or unreliable localization cues and generalization across different acoustic situations. Central to the model is the use of interaural coherence, measured as interaural vector strength (IVS), to dynamically weight the importance of observed interaural phase (IPD) and level (ILD) differences in frequency bands up to 1.4 kHz. This is accomplished through formulation of a probabilistic model in which the ILD and IPD distributions pertaining to a specific source location are dependent on observed interaural coherence. Bayesian computation of the direction-of-arrival probability map naturally leads to coherence-weighted integration of location cues across frequency and time. Results confirm the model's validity through statistical analyses of interaural parameter values. Simulated localization experiments show that even data points with low reliability (i.e., low IVS) can be exploited to enhance localization performance. A temporal integration length of at least 200 ms is required to gain a benefit; this is in accordance with previous psychoacoustic findings on temporal integration of spatial cues in the human auditory system. PMID:26627742
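    The coherence weighting described in this abstract can be caricatured as a sum of per-band log-likelihoods over candidate directions, each scaled by that band's interaural vector strength; a toy sketch in which the von Mises-style likelihood, array shapes, and names are assumptions rather than the published model:

```python
import numpy as np

def doa_probability(ipd_obs, ipd_templates, ivs, kappa=5.0):
    """Toy coherence-weighted direction-of-arrival map.
    ipd_obs:       observed IPD per frequency band, shape (bands,)
    ipd_templates: expected IPD per band for each azimuth, shape (bands, n_az)
    ivs:           interaural vector strength per band in [0, 1]; low-coherence
                   bands contribute less to the integrated map."""
    ll = kappa * np.cos(ipd_obs[:, None] - ipd_templates)  # von Mises-style log-lik.
    ll_sum = (np.asarray(ivs)[:, None] * ll).sum(axis=0)   # coherence-weighted sum
    p = np.exp(ll_sum - ll_sum.max())                      # normalize over azimuths
    return p / p.sum()

# Example: 3 bands, 5 candidate azimuths; the observation matches azimuth 2.
templates = np.linspace(-1.0, 1.0, 5)[None, :] * np.ones((3, 1))
p = doa_probability(templates[:, 2], templates, ivs=[1.0, 0.8, 0.2])
print(int(np.argmax(p)))  # 2
```

    Extending the weighted sum over successive time frames gives the temporal integration the study examines; the reported benefit required at least ~200 ms of such accumulation.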

  27. Listener orientation and spatial judgments of elevated auditory percepts

    Science.gov (United States)

    Parks, Anthony J.

    How do listener head rotations affect auditory perception of elevation? This investigation addresses this question in the hopes that perceptual judgments of elevated auditory percepts may be more thoroughly understood in terms of the dynamic listening cues engendered by listener head rotations, and that this phenomenon can be psychophysically and computationally modeled. Two listening tests were conducted and a psychophysical model was constructed to this end. The first listening test prompted listeners to detect an elevated auditory event produced by a virtual noise source orbiting the median plane via 24-channel ambisonic spatialization. Head rotations were tracked using computer vision algorithms facilitated by camera tracking. The data were used to construct a dichotomous criteria model using a factorial binary logistic regression model. The second auditory test investigated the validity of the historically supported frequency dependence of auditory elevation perception using narrow-band noise for continuous and brief stimuli with fixed and free-head-rotation conditions. The data were used to construct a multinomial logistic regression model to predict categorical judgments of above, below, and behind. Finally, in light of the psychophysical data found from the above studies, a functional model of elevation perception for point sources along the cone of confusion was constructed using physiologically inspired signal processing methods along with top-down processing utilizing principles of memory and orientation. The model is evaluated using white noise bursts for 42 subjects' head-related transfer functions. The investigation concludes with study limitations, possible implications, and speculation on future research trajectories.
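    At prediction time, a multinomial logistic regression like the one used here for categorical elevation judgments reduces to a softmax over linear scores per category; a minimal sketch in which the weights are illustrative placeholders for fitted coefficients:

```python
import numpy as np

LABELS = ("above", "below", "behind")

def predict_judgment(features, W, b):
    """Multinomial-logistic (softmax) prediction of a categorical elevation
    judgment. features: (n_features,); W: (n_features, 3); b: (3,)."""
    z = features @ W + b                 # linear score per category
    p = np.exp(z - z.max())
    p = p / p.sum()                      # softmax -> category probabilities
    return LABELS[int(np.argmax(p))], p

# Illustrative weights: the first feature pushes toward "above".
W = np.array([[2.0, -1.0, -1.0],
              [0.0,  1.0, -1.0]])
label, probs = predict_judgment(np.array([1.5, 0.2]), W, np.zeros(3))
print(label)  # above
```

    In the study itself, the predictors would be listening-condition variables (e.g. stimulus band and head-rotation condition) rather than these placeholder features.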

  8. A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images

    OpenAIRE

    Varnet, Léo; Knoblauch, Kenneth; Serniclaes, Willy; Meunier, Fanny; Hoen, Michel

    2015-01-01

    Although there is a large consensus regarding the involvement of specific acoustic cues in speech perception, the precise mechanisms underlying the transformation from continuous acoustical properties into discrete perceptual units remains undetermined. This gap in knowledge is partially due to the lack of a turnkey solution for isolating critical speech cues from natural stimuli. In this paper, we describe a psychoacoustic imaging method known as the Auditory Classification Image technique t...
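
The core reverse-correlation idea behind a classification image can be sketched as follows. This is a simplified difference-of-averages version with a simulated observer; the published Auditory Classification Image technique fits a penalized GLM rather than this raw difference:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_bins = 5000, 20

# Simulated observer whose yes/no decisions hinge on one critical
# time-frequency bin (bin 7) of the noise field
true_template = np.zeros(n_bins)
true_template[7] = 1.0
noise = rng.normal(size=(n_trials, n_bins))
resp = (noise @ true_template + rng.normal(scale=0.5, size=n_trials)) > 0

# Classification image: mean noise field on "yes" trials minus "no" trials;
# the critical bin pops out while irrelevant bins average to ~0
cimg = noise[resp].mean(axis=0) - noise[~resp].mean(axis=0)
print(np.argmax(np.abs(cimg)))  # -> 7, the bin the observer relied on
```

Correlating trial-by-trial noise with the listener's responses is what lets the method isolate which acoustic regions actually drive perception, without hand-picking cues in advance.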

  9. Auditory Verbal Cues Alter the Perceived Flavor of Beverages and Ease of Swallowing: A Psychometric and Electrophysiological Analysis

    OpenAIRE

    Aya Nakamura; Satoshi Imaizumi

    2013-01-01

    We investigated the possible effects of auditory verbal cues on flavor perception and swallow physiology for younger and elder participants. Apple juice, aojiru (grass) juice, and water were ingested with or without auditory verbal cues. Flavor perception and ease of swallowing were measured using a visual analog scale and swallow physiology by surface electromyography and cervical auscultation. The auditory verbal cues had significant positive effects on flavor and ease of swallowing as well...

  11. Modeling Auditory-Haptic Interface Cues from an Analog Multi-line Telephone

    Science.gov (United States)

    Begault, Durand R.; Anderson, Mark R.; Bittner, Rachael M.

    2012-01-01

    The Western Electric Company produced a multi-line telephone during the 1940s-1970s using a six-button interface design that provided robust tactile, haptic and auditory cues regarding the "state" of the communication system. This multi-line telephone was used as a model for a trade study comparison of two interfaces: a touchscreen interface (iPad) versus a pressure-sensitive strain gauge button interface (Phidget USB interface controllers). The experiment and its results are detailed in the authors' AES 133rd convention paper "Multimodal Information Management: Evaluation of Auditory and Haptic Cues for NextGen Communication Displays". This Engineering Brief describes how the interface logic, visual indications, and auditory cues of the original telephone were synthesized using MAX/MSP, including the logic for line selection, line hold, and priority line activation.
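
The line-selection logic described (hold on switching, priority preemption) can be sketched as a small state machine. This is illustrative Python, not the MAX/MSP patch; the class and method names are invented:

```python
class MultiLinePhone:
    """Toy state machine for six-button multi-line logic. Each line is
    'idle', 'active', or 'hold'. Selecting a new line places the currently
    active line on hold; an incoming priority call preempts the selection."""

    def __init__(self, n_lines=6, priority_line=0):
        self.state = ["idle"] * n_lines
        self.priority = priority_line
        self.selected = None

    def select(self, line):
        # Switching away from an active line automatically holds it
        if self.selected is not None and self.state[self.selected] == "active":
            self.state[self.selected] = "hold"
        self.state[line] = "active"
        self.selected = line

    def hold(self):
        if self.selected is not None:
            self.state[self.selected] = "hold"
            self.selected = None

    def incoming_priority(self):
        # Priority line activation preempts whatever is selected
        self.select(self.priority)
```

In the original telephone each of these state transitions was accompanied by distinct tactile and auditory feedback, which is what the trade study's interfaces tried to reproduce.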

  12. Sonic morphology: Aesthetic dimensional auditory spatial awareness

    Science.gov (United States)

    Whitehouse, Martha M.

    The sound and ceramic sculpture installation, " Skirting the Edge: Experiences in Sound & Form," is an integration of art and science demonstrating the concept of sonic morphology. "Sonic morphology" is herein defined as aesthetic three-dimensional auditory spatial awareness. The exhibition explicates my empirical phenomenal observations that sound has a three-dimensional form. Composed of ceramic sculptures that allude to different social and physical situations, coupled with sound compositions that enhance and create a three-dimensional auditory and visual aesthetic experience (see accompanying DVD), the exhibition supports the research question, "What is the relationship between sound and form?" Precisely how people aurally experience three-dimensional space involves an integration of spatial properties, auditory perception, individual history, and cultural mores. People also utilize environmental sound events as a guide in social situations and in remembering their personal history, as well as a guide in moving through space. Aesthetically, sound affects the fascination, meaning, and attention one has within a particular space. Sonic morphology brings art forms such as a movie, video, sound composition, and musical performance into the cognitive scope by generating meaning from the link between the visual and auditory senses. This research examined sonic morphology as an extension of musique concrete, sound as object, originating in Pierre Schaeffer's work in the 1940s. Pointing, as John Cage did, to the corporeal three-dimensional experience of "all sound," I composed works that took their total form only through the perceiver-participant's participation in the exhibition. While contemporary artist Alvin Lucier creates artworks that draw attention to making sound visible, "Skirting the Edge" engages the perceiver-participant visually and aurally, leading to recognition of sonic morphology.

  13. Auditory cueing in Parkinson's patients with freezing of gait. What matters most: Action-relevance or cue-continuity?

    Science.gov (United States)

    Young, William R; Shreve, Lauren; Quinn, Emma Jane; Craig, Cathy; Bronte-Stewart, Helen

    2016-07-01

    Gait disturbances are a common feature of Parkinson's disease, one of the most severe being freezing of gait. Sensory cueing is a common method used to facilitate stepping in people with Parkinson's. Recent work has shown that, compared to walking to a metronome, Parkinson's patients without freezing of gait (nFOG) showed reduced gait variability when imitating recorded sounds of footsteps made on gravel. However, it is not known if these benefits are realised through the continuity of the acoustic information or the action-relevance. Furthermore, no study has examined if these benefits extend to PD with freezing of gait. We prepared four different auditory cues (varying in action-relevance and acoustic continuity) and asked 19 Parkinson's patients (10 nFOG, 9 with freezing of gait (FOG)) to step in place to each cue. Results showed a superiority of action-relevant cues (regardless of cue-continuity) for inducing reductions in Step coefficient of variation (CV). Acoustic continuity was associated with a significant reduction in Swing CV. Neither cue-continuity nor action-relevance was independently sufficient to increase the time spent stepping before freezing. However, combining both attributes in the same cue did yield significant improvements. This study demonstrates the potential of using action-sounds as sensory cues for Parkinson's patients with freezing of gait. We suggest that the improvements shown might be considered audio-motor 'priming' (i.e., listening to the sounds of footsteps will engage sensorimotor circuitry relevant to the production of that same action, thus effectively bypassing the defective basal ganglia). PMID:27163397
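
The outcome measure in such studies, the coefficient of variation (CV) of step timing, is simply the step-interval standard deviation normalized by its mean. A minimal sketch, assuming step onset times as input:

```python
import numpy as np

def step_cv(step_onsets_s):
    """Coefficient of variation (%) of step intervals: 100 * SD / mean.
    Lower CV = steadier stepping, the direction of improvement reported
    for action-relevant cues above."""
    intervals = np.diff(step_onsets_s)
    return 100.0 * intervals.std(ddof=1) / intervals.mean()
```

Perfectly regular stepping gives a CV of 0%; the reported cueing benefits correspond to this number decreasing relative to uncued walking.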

  14. Auditory Verbal Cues Alter the Perceived Flavor of Beverages and Ease of Swallowing: A Psychometric and Electrophysiological Analysis

    Directory of Open Access Journals (Sweden)

    Aya Nakamura

    2013-01-01

    Full Text Available We investigated the possible effects of auditory verbal cues on flavor perception and swallow physiology for younger and elder participants. Apple juice, aojiru (grass) juice, and water were ingested with or without auditory verbal cues. Flavor perception and ease of swallowing were measured using a visual analog scale and swallow physiology by surface electromyography and cervical auscultation. The auditory verbal cues had significant positive effects on flavor and ease of swallowing as well as on swallow physiology. The taste score and the ease of swallowing score significantly increased when the participant’s anticipation was primed by accurate auditory verbal cues. There was no significant effect of auditory verbal cues on distaste score. Regardless of age, the maximum suprahyoid muscle activity significantly decreased when a beverage was ingested without auditory verbal cues. The interval between the onset of swallowing sounds and the peak timing point of the infrahyoid muscle activity significantly shortened when the anticipation induced by the cue was contradicted in the elderly participant group. These results suggest that auditory verbal cues can improve the perceived flavor of beverages and swallow physiology.

  15. Volume Attenuation and High Frequency Loss as Auditory Depth Cues in Stereoscopic 3D Cinema

    Science.gov (United States)

    Manolas, Christos; Pauletto, Sandra

    2014-09-01

    Assisted by the technological advances of the past decades, stereoscopic 3D (S3D) cinema is currently in the process of being established as a mainstream form of entertainment. The main focus of this collaborative effort is placed on the creation of immersive S3D visuals. However, with few exceptions, little attention has been given so far to the potential effect of the soundtrack on such environments. The potential of sound both as a means to enhance the impact of the S3D visual information and to expand the S3D cinematic world beyond the boundaries of the visuals is large. This article reports on our research into the possibilities of using auditory depth cues within the soundtrack as a means of affecting the perception of depth within cinematic S3D scenes. We study two main distance-related auditory cues: high-end frequency loss and overall volume attenuation. A series of experiments explored the effectiveness of these auditory cues. Results, although not conclusive, indicate that the studied auditory cues can influence the audience judgement of depth in cinematic 3D scenes, sometimes in unexpected ways. We conclude that 3D filmmaking can benefit from further studies on the effectiveness of specific sound design techniques to enhance S3D cinema.
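
Both depth cues studied above can be prototyped with a gain that falls with distance and a one-pole low-pass whose cutoff falls with distance. The attenuation and cutoff laws below are assumptions for illustration, not the article's stimulus parameters:

```python
import numpy as np

def apply_depth_cues(signal, distance_m, fs=48000, ref_m=1.0):
    """Render auditory depth with the two cues studied above:
    inverse-distance volume attenuation (-6 dB per doubling) and
    high-frequency loss via a one-pole low-pass whose cutoff falls
    with distance (assumed law: 20 kHz at the reference distance)."""
    signal = np.asarray(signal, dtype=float)
    distance = max(distance_m, ref_m)
    gain = ref_m / distance
    cutoff_hz = 20000.0 / distance                 # assumed cutoff law
    alpha = np.exp(-2 * np.pi * cutoff_hz / fs)    # one-pole coefficient
    out = np.empty_like(signal)
    y = 0.0
    for i, x in enumerate(signal * gain):
        y = (1 - alpha) * x + alpha * y            # y[n] = (1-a)x[n] + a y[n-1]
        out[i] = y
    return out
```

Applying such a chain to a dialogue or effects stem is one way a sound designer could nudge a source "deeper" into an S3D scene, which is the manipulation the experiments evaluate.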

  16. Auditory gist: recognition of very short sounds from timbre cues.

    Science.gov (United States)

    Suied, Clara; Agus, Trevor R; Thorpe, Simon J; Mesgarani, Nima; Pressnitzer, Daniel

    2014-03-01

    Sounds such as the voice or musical instruments can be recognized on the basis of timbre alone. Here, sound recognition was investigated with severely reduced timbre cues. Short snippets of naturally recorded sounds were extracted from a large corpus. Listeners were asked to report a target category (e.g., sung voices) among other sounds (e.g., musical instruments). All sound categories covered the same pitch range, so the task had to be solved on timbre cues alone. The minimum duration for which performance was above chance was found to be short, on the order of a few milliseconds, with the best performance for voice targets. Performance was independent of pitch and was maintained when stimuli contained less than a full waveform cycle. Recognition was not generally better when the sound snippets were time-aligned with the sound onset compared to when they were extracted with a random starting time. Finally, performance did not depend on feedback or training, suggesting that the cues used by listeners in the artificial gating task were similar to those relevant for longer, more familiar sounds. The results show that timbre cues for sound recognition are available at a variety of time scales, including very short ones. PMID:24606276

  17. Deceptive Body Movements Reverse Spatial Cueing in Soccer

    OpenAIRE

    Wright, MJ; Jackson, RC

    2014-01-01

    The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants ...

  18. Auditory cues increase the hippocampal response to unimodal virtual reality.

    Science.gov (United States)

    Andreano, Joseph; Liang, Kevin; Kong, Lingjun; Hubbard, David; Wiederhold, Brenda K; Wiederhold, Mark D

    2009-06-01

    Previous research suggests that the effectiveness of virtual reality exposure therapy should increase as the experience becomes more immersive. However, the neural mechanisms underlying the experience of immersion are not yet well understood. To address this question, neural activity during exposure to two virtual worlds was measured by functional magnetic resonance imaging (fMRI). Two levels of immersion were used: unimodal (video only) and multimodal (video plus audio). The results indicated increased activity in both auditory and visual sensory cortices during multimodal presentation. Additionally, multimodal presentation elicited increased activity in the hippocampus, a region well known to be involved in learning and memory. The implications of this finding for exposure therapy are discussed. PMID:19500000

  19. An exploration of spatial auditory BCI paradigms with different sounds: music notes versus beeps.

    Science.gov (United States)

    Huang, Minqiang; Daly, Ian; Jin, Jing; Zhang, Yu; Wang, Xingyu; Cichocki, Andrzej

    2016-06-01

    Visual brain-computer interfaces (BCIs) are not suitable for people who cannot reliably maintain their eye gaze. Considering that this group usually maintains audition, an auditory based BCI may be a good choice for them. In this paper, we explore two auditory patterns: (1) a pattern utilizing symmetrical spatial cues with multiple frequency beeps [called the high low medium (HLM) pattern], and (2) a pattern utilizing non-symmetrical spatial cues with six tones derived from the diatonic scale [called the diatonic scale (DS) pattern]. These two patterns are compared to each other in terms of accuracy to determine which auditory pattern is better. The HLM pattern uses three different frequency beeps and has a symmetrical spatial distribution. The DS pattern uses six spoken stimuli, which are six notes solmizated as "do", "re", "mi", "fa", "sol" and "la", and derived from the diatonic scale. These six sounds are distributed to six, spatially distributed, speakers. Thus, we compare a BCI paradigm using beeps with another BCI paradigm using tones on the diatonic scale, when the stimuli are spatially distributed. Although no significant differences are found between the ERPs, the HLM pattern performs better than the DS pattern: the online accuracy achieved with the HLM pattern is significantly higher than that achieved with the DS pattern (p = 0.0028). PMID:27275376

  20. Tactile Cueing as a Gravitational Substitute for Spatial Navigation During Parabolic Flight

    Science.gov (United States)

    Montgomery, K. L.; Beaton, K. H.; Barba, J. M.; Cackler, J. M.; Son, J. H.; Horsfield, S. P.; Wood, S. J.

    2010-01-01

    INTRODUCTION: Spatial navigation requires an accurate awareness of orientation in your environment. The purpose of this experiment was to examine how spatial awareness was impaired with changing gravitational cues during parabolic flight, and the extent to which vibrotactile feedback of orientation could be used to help improve performance. METHODS: Six subjects were restrained in a chair tilted relative to the plane floor, and placed at random positions during the start of the microgravity phase. Subjects reported their orientation using verbal reports, and used a hand-held controller to point to a desired target location presented using a virtual reality video mask. This task was repeated with and without constant tactile cueing of "down" direction using a belt of 8 tactors placed around the mid-torso. Control measures were obtained during ground testing using both upright and tilted conditions. RESULTS: Perceptual estimates of orientation and pointing accuracy were impaired during microgravity or during rotation about an upright axis in 1g. The amount of error was proportional to the amount of chair displacement. Perceptual errors were reduced during movement about a tilted axis on earth. CONCLUSIONS: Reduced perceptual errors during tilts in 1g indicate the importance of otolith and somatosensory cues for maintaining spatial awareness. Tactile cueing may improve navigation in operational environments or clinical populations, providing a non-visual non-auditory feedback of orientation or desired direction heading.

  1. Deceptive body movements reverse spatial cueing in soccer.

    Science.gov (United States)

    Wright, Michael J; Jackson, Robin C

    2014-01-01

    The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants were divided into higher-skilled (HS) and lower-skilled (LS) groups according to soccer experience. In experiment 1, accuracy on full video clips was higher than on point-light but results followed the same overall pattern. Both HS and LS groups correctly identified direction on normal moves at all occlusion levels. For deceptive moves, LS participants were significantly worse than chance and HS participants were somewhat more accurate but nevertheless substantially impaired. In experiment 2, point-light clips were used to cue a lateral target. HS and LS groups showed faster reaction times to targets that were congruent with the direction of normal turns, and to targets incongruent with the direction of deceptive turns. The reversed cueing by deceptive moves coincided with earlier kinematic events than cueing by normal moves. It is concluded that the body kinematics of soccer players generate spatial cueing effects when viewed from an opponent's perspective. This could create a reaction time advantage when anticipating the direction of a normal move. A deceptive move is designed to turn this cueing advantage into a disadvantage. Acting on the basis of advance information, the presence of deceptive moves primes responses in the wrong direction, which may be only partly mitigated by delaying a response until veridical cues emerge. PMID:25100444

  2. Deceptive body movements reverse spatial cueing in soccer.

    Directory of Open Access Journals (Sweden)

    Michael J Wright

    Full Text Available The purpose of the experiments was to analyse the spatial cueing effects of the movements of soccer players executing normal and deceptive (step-over) turns with the ball. Stimuli comprised normal resolution or point-light video clips of soccer players dribbling a football towards the observer then turning right or left with the ball. Clips were curtailed before or on the turn (-160, -80, 0 or +80 ms) to examine the time course of direction prediction and spatial cueing effects. Participants were divided into higher-skilled (HS) and lower-skilled (LS) groups according to soccer experience. In experiment 1, accuracy on full video clips was higher than on point-light but results followed the same overall pattern. Both HS and LS groups correctly identified direction on normal moves at all occlusion levels. For deceptive moves, LS participants were significantly worse than chance and HS participants were somewhat more accurate but nevertheless substantially impaired. In experiment 2, point-light clips were used to cue a lateral target. HS and LS groups showed faster reaction times to targets that were congruent with the direction of normal turns, and to targets incongruent with the direction of deceptive turns. The reversed cueing by deceptive moves coincided with earlier kinematic events than cueing by normal moves. It is concluded that the body kinematics of soccer players generate spatial cueing effects when viewed from an opponent's perspective. This could create a reaction time advantage when anticipating the direction of a normal move. A deceptive move is designed to turn this cueing advantage into a disadvantage. Acting on the basis of advance information, the presence of deceptive moves primes responses in the wrong direction, which may be only partly mitigated by delaying a response until veridical cues emerge.

  3. Auditory and Visual Cues for Spatiotemporal Rhythm Reproduction

    DEFF Research Database (Denmark)

    Maculewicz, Justyna; Serafin, Stefania; Kofoed, Lise B.

    2013-01-01

    The goal of this experiment is to investigate the role of auditory and visual feedback in a rhythmic tapping task. Subjects had to tap with the finger following presented rhythms, which were divided into easy and difficult patterns. Specificity of the task was that participants had to take into...... temporal accuracy is better reproduced with vision, while spatial accuracy is better reproduced with audition. Our focus was placed also on strength of pressure of tapping answers. We supposed that it could differ between modalities and also between different types of the stimuli characteristics. Results...

  4. Facilitation of learning spatial relations among locations by visual cues: generality across spatial configurations.

    Science.gov (United States)

    Sturz, Bradley R; Kelly, Debbie M; Brown, Michael F

    2010-03-01

    Spatial pattern learning permits the learning of the location of objects in space relative to each other without reference to discrete visual landmarks or environmental geometry. In the present experiment, we investigated conditions that facilitate spatial pattern learning. Specifically, human participants searched in a real environment or interactive 3-D computer-generated virtual environment open-field search task for four hidden goal locations arranged in a diamond configuration located in a 5 x 5 matrix of raised bins. Participants were randomly assigned to one of three groups: Pattern Only, Landmark + Pattern, or Cues + Pattern. All participants experienced a Training phase followed by a Testing phase. Visual cues were coincident with the goal locations during Training only in the Cues + Pattern group whereas a single visual cue at a non-goal location maintained a consistent spatial relationship with the goal locations during Training only in the Landmark + Pattern group. All groups were then tested in the absence of visual cues. Results in both environments indicated that participants in all three groups learned the spatial configuration of goal locations. The presence of the visual cues during Training facilitated acquisition of the task for the Landmark + Pattern and Cues + Pattern groups compared to the Pattern Only group. During Testing the Landmark + Pattern and Cues + Pattern groups did not differ when their respective visual cues were removed. Furthermore, during Testing the performance of these two groups was superior to the Pattern Only group. Results generalize prior research to a different configuration of spatial locations, isolate spatial pattern learning as the process facilitated by visual cues, and indicate that the facilitation of learning spatial relations among locations by visual cues does not require coincident visual cues. PMID:19777275

  5. Covert Auditory Spatial Orienting: An Evaluation of the Spatial Relevance Hypothesis

    Science.gov (United States)

    Roberts, Katherine L.; Summerfield, A. Quentin; Hall, Deborah A.

    2009-01-01

    The spatial relevance hypothesis (J. J. McDonald & L. M. Ward, 1999) proposes that covert auditory spatial orienting can only be beneficial to auditory processing when task stimuli are encoded spatially. We present a series of experiments that evaluate 2 key aspects of the hypothesis: (a) that "reflexive activation of location-sensitive neurons is…

  6. Interface Design Implications for Recalling the Spatial Configuration of Virtual Auditory Environments

    Science.gov (United States)

    McMullen, Kyla A.

    Although the concept of virtual spatial audio has existed for almost twenty-five years, only in the past fifteen years has modern computing technology enabled the real-time processing needed to deliver high-precision spatial audio. Furthermore, the concept of virtually walking through an auditory environment did not exist. The applications of such an interface have numerous potential uses. Spatial audio has the potential to be used in various manners ranging from enhancing sounds delivered in virtual gaming worlds to conveying spatial locations in real-time emergency response systems. To incorporate this technology in real-world systems, various concerns should be addressed. First, to widely incorporate spatial audio into real-world systems, head-related transfer functions (HRTFs) must be inexpensively created for each user. The present study further investigated an HRTF subjective selection procedure previously developed within our research group. Users discriminated auditory cues to subjectively select their preferred HRTF from a publicly available database. Next, the issue of training to find virtual sources was addressed. Listeners participated in a localization training experiment using their selected HRTFs. The training procedure was created from the characterization of successful search strategies in prior auditory search experiments. Search accuracy significantly improved after listeners performed the training procedure. Next, in the investigation of auditory spatial memory, listeners completed three search and recall tasks with differing recall methods. Recall accuracy significantly decreased in tasks that required the storage of sound source configurations in memory. To assess the impacts of practical scenarios, the present work assessed the performance effects of: signal uncertainty, visual augmentation, and different attenuation modeling. Fortunately, source uncertainty did not affect listeners' ability to recall or identify sound sources. 
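
Once a listener has selected an HRTF, rendering a source at one of its measured directions reduces to convolving the mono signal with the left- and right-ear head-related impulse responses. This is the standard binaural-rendering operation; the study's own audio chain is not detailed here:

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Binaural rendering: convolve a mono source with the left/right
    head-related impulse responses (HRIRs) of the chosen HRTF set.
    Returns a (2, N) stereo array for headphone playback."""
    return np.stack([np.convolve(mono, hrir_left),
                     np.convolve(mono, hrir_right)])
```

Moving a virtual source then amounts to crossfading between HRIR pairs for neighboring directions, which is what makes real-time "walking through" an auditory environment computationally demanding.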

  7. Can Tau-guided auditory cues help control of movement in Parkinson's Disease patients ?

    OpenAIRE

    Curran, Isabel

    2006-01-01

    Based on the theory that the movement disturbances seen in Parkinson’s disease are caused by the lack of an intrinsic tau-guide (Lee et al, 1999), and drawing from knowledge of the role of the basal ganglia and its pathophysiology in Parkinson’s disease, an experiment was designed to investigate the use of auditory cues in alleviating the symptoms of the disease. Four Parkinson’s disease patients carried out 3 simple writing tasks under an un-cued and an externally cued conditi...

  8. Placebo effects induced by auditory cues decrease parkinsonian rigidity in patients with subthalamic stimulation.

    Science.gov (United States)

    Rätsep, Tõnu; Asser, Toomas

    2016-03-15

    Placebo effects are the consequence of an interaction between an organism and its surroundings and may be influenced by cues from the environment. Our study was designed to analyze if conditioned auditory cues could trigger placebo effects and affect parkinsonian rigidity as measured by viscoelastic properties of skeletal muscles in patients treated with subthalamic stimulation. We found that after being repeatedly associated with the effect of deep brain stimulation on rigidity, a common dial phone signal itself was able to reduce the mean values of viscoelastic stiffness in the placebo stage (368.8±50.4Nm(-1)) as compared to the stimulation-off conditions (383.7±61.2Nm(-1)) (q=4.18; p<0.05), indicating placebo effects affecting the clinical status of the patients. PMID:26706890

  9. Central Auditory Processing of Temporal and Spectral-Variance Cues in Cochlear Implant Listeners.

    Directory of Open Access Journals (Sweden)

    Carol Q Pham

    Full Text Available Cochlear implant (CI) listeners have difficulty understanding speech in complex listening environments. This deficit is thought to be largely due to peripheral encoding problems arising from current spread, which results in wide peripheral filters. In normal hearing (NH) listeners, central processing contributes to segregation of speech from competing sounds. We tested the hypothesis that basic central processing abilities are retained in post-lingually deaf CI listeners, but processing is hampered by degraded input from the periphery. In eight CI listeners, we measured auditory nerve compound action potentials to characterize peripheral filters. Then, we measured psychophysical detection thresholds in the presence of multi-electrode maskers placed either inside (peripheral masking) or outside (central masking) the peripheral filter. This was intended to distinguish peripheral from central contributions to signal detection. Introduction of temporal asynchrony between the signal and masker improved signal detection in both peripheral and central masking conditions for all CI listeners. Randomly varying components of the masker created spectral-variance cues, which seemed to benefit only two out of eight CI listeners. Contrastingly, the spectral-variance cues improved signal detection in all five NH listeners who listened to our CI simulation. Together these results indicate that widened peripheral filters significantly hamper central processing of spectral-variance cues but not of temporal cues in post-lingually deaf CI listeners. As indicated by two CI listeners in our study, however, post-lingually deaf CI listeners may retain some central processing abilities similar to NH listeners.

  10. Computational characterization of visually-induced auditory spatial adaptation

    Directory of Open Access Journals (Sweden)

    David R Wozny

    2011-11-01

    Full Text Available Recent research investigating the principles governing human perception has provided increasing evidence for probabilistic inference in human perception. For example, human auditory and visual localization judgments closely resemble those of a Bayesian causal inference observer, where the underlying causal structure of the stimuli is inferred based on both the available sensory evidence and prior knowledge. However, most previous studies have focused on characterization of perceptual inference within a static environment, and therefore little is known about how this inference process changes when observers are exposed to a new environment. In this study we aimed to computationally characterize the change in auditory spatial perception induced by repeated auditory-visual spatial conflict, known as the ventriloquist aftereffect. In theory, this change could reflect a shift in the auditory sensory representations (i.e., a shift in the auditory likelihood distribution), a decrease in the precision of the auditory estimates (i.e., an increase in the spread of the likelihood distribution), a shift in the auditory bias (i.e., a shift in the prior distribution), an increase or decrease in the strength of the auditory bias (i.e., the spread of the prior distribution), or a combination of these. By quantitatively estimating the parameters of the perceptual process for each individual observer using a Bayesian causal inference model, we found that the shift in the perceived locations after exposure was associated with a shift in the mean of the auditory likelihood functions in the direction of the experienced visual offset. The results suggest that repeated exposure to a fixed auditory-visual discrepancy is attributed by the nervous system to sensory representation error and, as a result, the sensory map of space is recalibrated to correct the error.
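
The Bayesian causal inference observer referenced above can be sketched as follows, using the standard Körding-style formulation with Gaussian likelihoods and a zero-mean spatial prior. The parameter values here are illustrative, not the fitted values from the study:

```python
import numpy as np

def p_common_cause(xa, xv, sa=4.0, sv=1.0, sp=20.0, pc=0.5):
    """Posterior probability that auditory (xa) and visual (xv) samples
    share one cause, under Gaussian likelihoods (SDs sa, sv) and a
    N(0, sp^2) prior over source location."""
    var1 = sa**2 * sv**2 + sa**2 * sp**2 + sv**2 * sp**2
    like1 = np.exp(-0.5 * ((xa - xv)**2 * sp**2 + xa**2 * sv**2
                           + xv**2 * sa**2) / var1) / (2 * np.pi * np.sqrt(var1))
    like2 = (np.exp(-0.5 * xa**2 / (sa**2 + sp**2))
             / np.sqrt(2 * np.pi * (sa**2 + sp**2))
             * np.exp(-0.5 * xv**2 / (sv**2 + sp**2))
             / np.sqrt(2 * np.pi * (sv**2 + sp**2)))
    return pc * like1 / (pc * like1 + (1 - pc) * like2)

def auditory_estimate(xa, xv, sa=4.0, sv=1.0, sp=20.0, pc=0.5):
    """Model-averaged auditory location: the fused (precision-weighted)
    estimate when a common cause is likely, the auditory-alone estimate
    otherwise."""
    p1 = p_common_cause(xa, xv, sa, sv, sp, pc)
    wa, wv, wp = sa**-2, sv**-2, sp**-2
    fused = (wa * xa + wv * xv) / (wa + wv + wp)
    alone = wa * xa / (wa + wp)
    return p1 * fused + (1 - p1) * alone
```

A small audio-visual discrepancy pulls the auditory estimate toward the visual location while a large one mostly leaves it alone; the study's contribution is showing that, after repeated exposure, it is the mean of the auditory likelihood in this model that shifts.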

  11. Spatial processing in the auditory cortex of the macaque monkey

    Science.gov (United States)

    Recanzone, Gregg H.

    2000-10-01

    The patterns of cortico-cortical and cortico-thalamic connections of auditory cortical areas in the rhesus monkey have led to the hypothesis that acoustic information is processed in series and in parallel in the primate auditory cortex. Recent physiological experiments in the behaving monkey indicate that the response properties of neurons in different cortical areas are both functionally distinct from each other, which is indicative of parallel processing, and functionally similar to each other, which is indicative of serial processing. Thus, auditory cortical processing may be similar to the serial and parallel "what" and "where" processing by the primate visual cortex. If "where" information is serially processed in the primate auditory cortex, neurons in cortical areas along this pathway should have progressively better spatial tuning properties. This prediction is supported by recent experiments that have shown that neurons in the caudomedial field have better spatial tuning properties than neurons in the primary auditory cortex. Neurons in the caudomedial field are also better than primary auditory cortex neurons at predicting the sound localization ability across different stimulus frequencies and bandwidths in both azimuth and elevation. These data support the hypothesis that the primate auditory cortex processes acoustic information in a serial and parallel manner and suggest that this may be a general cortical mechanism for sensory perception.

  12. Confusing what you heard with what you did: False action-memories from auditory cues.

    Science.gov (United States)

    Lindner, Isabel; Henkel, Linda A

    2015-12-01

    Creating a mental image of one's own performance, observing someone else performing an action, and viewing a photograph of a completed action can all lead to the illusory recollection that one has performed this action. While there are fundamental differences in the nature of these three processes, they are aligned by the fact that they involve primarily or solely the visual modality. According to the source-monitoring framework, the corresponding visual memory traces can later be mistakenly attributed to self-performance. However, when people perform actions, they engage not only vision but also other modalities, such as the auditory and tactile systems. The present study focused on the role of audition in the creation of false beliefs about performing an action and explored whether auditory cues alone, in the absence of any visual cues, can induce false beliefs and memories for actions. After performing a series of simple actions, participants listened to the sound of someone performing various actions, watched someone perform the actions, or simultaneously both heard and saw someone perform them. Some of these actions had been performed earlier by the participants and others were new. A later source-memory test revealed that all three types of processing (hearing, seeing, or hearing plus seeing someone perform the actions) led to comparable increases in false claims of having performed actions oneself. The potential mechanisms underlying false action-memories from sound and vision are discussed. PMID:25925600

  13. Task-dependent calibration of auditory spatial perception through environmental visual observation

    OpenAIRE

    Luca Brayda

    2015-01-01

    Visual information is paramount to space perception. Vision influences auditory space estimation. Many studies show that simultaneous visual and auditory cues improve the precision of the final multisensory estimate. However, the amount or temporal extent of visual information that is sufficient to influence auditory perception is still unknown. It is therefore interesting to know if vision can improve auditory precision through a short-term environmental observation preceding the audio tas...

  14. Attention to sound improves auditory reliability in audio-tactile spatial optimal integration

    Directory of Open Access Journals (Sweden)

    Tiziana eVercillo

    2015-05-01

    Full Text Available The role of attention in multisensory processing is still poorly understood. In particular, it is unclear whether directing attention toward a sensory cue dynamically reweights cue reliability during integration of multiple sensory signals. In this study, we investigated the impact of attention on combining audio-tactile signals in an optimal fashion. We used the Maximum Likelihood Estimation (MLE) model to predict audio-tactile spatial localization on the body surface. We developed a new audio-tactile device composed of several small units, each consisting of a speaker and a tactile vibrator independently controllable by external software. We tested subjects in an attentional and a non-attentional condition. In the attention experiment, participants performed a dual-task paradigm: they were required to evaluate the duration of a sound while performing an audio-tactile spatial task. Three unisensory or multisensory stimuli (conflicting or non-conflicting sounds and vibrations) arranged along the horizontal axis were presented sequentially. In the primary task, subjects had to evaluate the position of the second stimulus (the probe) with respect to the others (a space bisection task). In the secondary task, they had to report occasional changes in the duration of the second auditory stimulus. In the non-attentional task, participants performed only the primary task (space bisection). Our results showed enhanced auditory precision (and auditory weights) in the auditory attentional condition with respect to the control non-attentional condition. Interestingly, in both conditions the multisensory results are well predicted by the MLE model. The results of this study support the idea that modality-specific attention modulates multisensory integration.
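The MLE prediction referenced in this record has a standard closed form: each cue is weighted by its inverse variance, and the fused estimate is more precise than either cue alone. A minimal sketch (all sigma values below are hypothetical, not the study's data):

```python
import numpy as np

def mle_combine(mu_a, sigma_a, mu_t, sigma_t):
    """Optimal audio-tactile combination: cue weights are proportional to
    inverse variances; the fused variance is below either unisensory one."""
    w_a = sigma_t**2 / (sigma_a**2 + sigma_t**2)   # auditory weight
    mu = w_a * mu_a + (1 - w_a) * mu_t
    sigma = np.sqrt(sigma_a**2 * sigma_t**2 / (sigma_a**2 + sigma_t**2))
    return mu, sigma, w_a

# Baseline: sound is less reliable than touch (hypothetical values).
_, s_base, w_base = mle_combine(mu_a=0.0, sigma_a=4.0, mu_t=0.0, sigma_t=2.0)

# Attending to the sound improves auditory precision, which raises both
# the auditory weight and the precision of the fused estimate.
_, s_attn, w_attn = mle_combine(mu_a=0.0, sigma_a=2.5, mu_t=0.0, sigma_t=2.0)
```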

  15. Nonlinear dynamics of human locomotion: effects of rhythmic auditory cueing on local dynamic stability

    Directory of Open Access Journals (Sweden)

    Philippe eTerrier

    2013-09-01

    Full Text Available It has been observed that time series of gait parameters (stride length (SL), stride time (ST), and stride speed (SS)) exhibit long-term persistence and fractal-like properties. Synchronizing steps with rhythmic auditory stimuli modifies the persistent fluctuation pattern to anti-persistence. Another nonlinear method estimates the degree of resilience of gait control to small perturbations, i.e., the local dynamic stability (LDS). The method makes use of the maximal Lyapunov exponent, which estimates how fast a nonlinear system embedded in a reconstructed state space (attractor) diverges after an infinitesimal perturbation. We propose to use an instrumented treadmill to simultaneously measure basic gait parameters (time series of SL, ST, and SS, from which the statistical persistence among consecutive strides can be assessed) and the trajectory of the center of pressure (from which the LDS can be estimated). In 20 healthy participants, the response to rhythmic auditory cueing (RAC) of LDS and of statistical persistence (assessed with detrended fluctuation analysis (DFA)) was compared. By analyzing the divergence curves, we observed that long-term LDS (computed as the inverse of the average logarithmic rate of divergence between the 4th and the 10th strides downstream from nearest neighbors in the reconstructed attractor) was strongly enhanced (relative change +47%). That is likely the indication of a more dampened dynamics. The change in short-term LDS (divergence over one step) was smaller (+3%). DFA results (scaling exponents) confirmed an anti-persistent pattern in ST, SL, and SS. Long-term LDS (but not short-term LDS) and scaling exponents exhibited a significant correlation between them (r = 0.7). Both phenomena probably result from the more conscious/voluntary gait control that is required by RAC. We suggest that LDS and statistical persistence should be used to evaluate the efficiency of cueing therapy in patients with neurological gait disorders.
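Detrended fluctuation analysis itself is compact enough to sketch. Below is a textbook first-order DFA implementation (the standard algorithm, not the authors' code); the scaling exponent separates anti-persistent (alpha < 0.5), uncorrelated (alpha near 0.5), and persistent (alpha > 0.5) stride-parameter series:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """First-order DFA scaling exponent alpha of a 1-D series.
    alpha < 0.5: anti-persistent; ~0.5: uncorrelated; > 0.5: persistent."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        resid = []
        for i in range(n_seg):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear detrend
            resid.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(resid)))      # RMS fluctuation F(n)
    # alpha is the slope of log F(n) against log n
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

# Sanity check: white noise should give alpha close to 0.5.
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(4096))
```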

  16. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson's Disease.

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E; Soriano, Christina T

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson's disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson's have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered forms (tango). In all of these dance forms, specific movement patterns are initially learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previously reported benefits of dance for people with Parkinson's disease (PD). The method relies primarily on improvisational verbal auditory cueing, with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. Here, we present an introduction to the proposed method, detailing its methodological specifics and pointing to future directions. The viewpoint advances an embodied cognitive approach that has ecological validity in helping PPD meet the changing demands of daily living. PMID:26925029

  17. Listenmee and Listenmee smartphone application: synchronizing walking to rhythmic auditory cues to improve gait in Parkinson's disease.

    Science.gov (United States)

    Lopez, William Omar Contreras; Higuera, Carlos Andres Escalante; Fonoff, Erich Talamoni; Souza, Carolina de Oliveira; Albicker, Ulrich; Martinez, Jairo Alberto Espinoza

    2014-10-01

    Evidence supports the use of rhythmic external auditory signals to improve gait in PD patients (Arias & Cudeiro, 2008; Kenyon & Thaut, 2000; McIntosh, Rice, & Thaut, 1994; McIntosh et al., 1997; Morris, Iansek, & Matyas, 1994; Thaut, McIntosh, & Rice, 1997; Suteerawattananon, Morris, Etnyre, Jankovic, & Protas, 2004; Willems, Nieuwboer, Chavert, & Desloovere, 2006). However, few prototypes are available for daily use and, to our knowledge, none utilize a smartphone application allowing individualized sounds and cadence. Therefore, we analyzed the effects on gait of Listenmee®, an intelligent glasses system with a portable auditory device, and present its smartphone application, the Listenmee app®, offering over 100 different sounds and an adjustable metronome to individualize the cueing rate, as well as its smartwatch, whose accelerometer detects the magnitude and direction of acceleration and tracks calorie count, sleep patterns, step count, and daily distances. The present study included patients with idiopathic PD who presented gait disturbances, including freezing. Auditory rhythmic cues were delivered through Listenmee®. Performance was analyzed in a motion and gait analysis laboratory. The results revealed significant improvements in gait performance across three major dependent variables: walking speed by 38.1%, cadence by 28.1%, and stride length by 44.5%. Our findings suggest that auditory cueing through Listenmee® may significantly enhance gait performance. Further studies are needed to elucidate the potential role and maximize the benefits of these portable devices. PMID:25215623

  18. Spatial organization of tettigoniid auditory receptors: insights from neuronal tracing.

    Science.gov (United States)

    Strauß, Johannes; Lehmann, Gerlind U C; Lehmann, Arne W; Lakes-Harlan, Reinhard

    2012-11-01

    The auditory sense organ of Tettigoniidae (Insecta, Orthoptera) is located in the foreleg tibia and consists of scolopidial sensilla which form a row termed the crista acustica. The crista acustica is associated with the tympana and the auditory trachea. This ear is a highly ordered, tonotopic sensory system. Where the neuroanatomy of the crista acustica has been documented for several species, the most distal somata and dendrites of receptor neurons have occasionally been described as forming an alternating or double row. We investigated the spatial arrangement of receptor cell bodies and dendrites by retrograde tracing with cobalt chloride solution. In the six tettigoniid species studied, distal receptor neurons are consistently arranged in double rows of somata rather than a linear sequence. This arrangement of neurons is shown to affect 30-50% of the overall auditory receptors. No strict correlation of somata positions between the anterio-posterior and dorso-ventral axes was evident within the distal crista acustica. Dendrites of distal receptors occasionally also occur in a double row, or are even massed without clear order. Thus, a substantial part of the auditory receptors can deviate from a strictly straight organization into a more complex morphology. The linear organization of dendrites is therefore not a morphological criterion that allows hearing organs to be distinguished in all species from nonhearing sense organs serially homologous to ears. The crowded arrangement of both receptor somata and dendrites may result from functional constraints relating to frequency discrimination, or from developmental constraints of auditory morphogenesis in postembryonic development. PMID:22807283

  19. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander eJones

    2015-01-01

    Full Text Available Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes which have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct their covert spatial attention to the left or right, and respond as quickly as possible to a target. The lateralized target (visual or auditory) was then presented at the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps. The target was presented in or out of sync (early or late) with the rhythmic cue. The results showed participants were faster responding to spatially attended compared to unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions. Responses were faster to targets presented in sync with the rhythm compared to when they appeared too early in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced the temporal expectancy in the other modality, suggesting temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting these two processes largely influenced

  20. Mechanisms of spatial and non-spatial auditory selective attention

    OpenAIRE

    Paltoglou, Aspasia Eleni

    2009-01-01

    Selective attention is a crucial function that encompasses all perceptual modalities and which enables us to focus on the behaviorally relevant information and ignore the rest. The main goal of the thesis is to test well-established hypotheses about the mechanisms of visual selective attention in the auditory domain using behavioral and neuroimaging methods. Two fMRI studies (Experiments 1 and 2) test the hypothesis of feature-specific attentional enhancement. This hypothesis states that ...

  1. Natural auditory scene statistics shapes human spatial hearing.

    Science.gov (United States)

    Parise, Cesare V; Knorre, Katharina; Ernst, Marc O

    2014-04-22

    Human perception, cognition, and action are laced with seemingly arbitrary mappings. In particular, sound has a strong spatial connotation: Sounds are high and low, melodies rise and fall, and pitch systematically biases perceived sound elevation. The origins of such mappings are unknown. Are they the result of physiological constraints, do they reflect natural environmental statistics, or are they truly arbitrary? We recorded natural sounds from the environment, analyzed the elevation-dependent filtering of the outer ear, and measured frequency-dependent biases in human sound localization. We find that auditory scene statistics reveals a clear mapping between frequency and elevation. Perhaps more interestingly, this natural statistical mapping is tightly mirrored in both ear-filtering properties and in perceived sound location. This suggests that both sound localization behavior and ear anatomy are fine-tuned to the statistics of natural auditory scenes, likely providing the basis for the spatial connotation of human hearing. PMID:24711409

  2. Speed on the dance floor: Auditory and visual cues for musical tempo.

    Science.gov (United States)

    London, Justin; Burger, Birgitta; Thompson, Marc; Toiviainen, Petri

    2016-02-01

    Musical tempo is most strongly associated with the rate of the beat or "tactus," which may be defined as the most prominent rhythmic periodicity present in the music, typically in a range of 1.67-2 Hz. However, other factors such as rhythmic density, mean rhythmic inter-onset interval, metrical (accentual) structure, and rhythmic complexity can affect perceived tempo (Drake, Gros, & Penel, 1999; London, 2011). Visual information can also give rise to a perceived beat/tempo (Iversen et al., 2015), and auditory and visual temporal cues can interact and mutually influence each other (Soto-Faraco & Kingstone, 2004; Spence, 2015). A five-part experiment was performed to assess the integration of auditory and visual information in judgments of musical tempo. Participants rated the speed of six classic R&B songs on a seven-point scale while observing an animated figure dancing to them. Participants were presented with original and time-stretched (±5%) versions of each song in audio-only, audio+video (A+V), and video-only conditions. In some videos the animations were of spontaneous movements to the different time-stretched versions of each song, and in other videos the animations were of "vigorous" versus "relaxed" interpretations of the same auditory stimulus. Two main results were observed. First, in all conditions with audio, even though participants were able to correctly rank the original vs. time-stretched versions of each song, a song-specific tempo-anchoring effect was observed, such that sped-up versions of slower songs were judged to be faster than slowed-down versions of faster songs, even when their objective beat rates were the same. Second, when viewing a vigorous dancing figure in the A+V condition, participants gave faster tempo ratings than from the audio alone or when viewing the same audio with a relaxed dancing figure. The implications of this illusory tempo percept for cross-modal sensory integration and

  3. Processing of spatial sounds in the impaired auditory system

    DEFF Research Database (Denmark)

    Arweiler, Iris

    of two such cues on speech intelligibility was studied. First, the benefit from early reflections (ER’s) in a room was determined using a virtual auditory environment. ER’s were found to be useful for speech intelligibility, but to a smaller extent than the direct sound (DS). The benefit was … quantified with an intelligibility-weighted “efficiency factor” which revealed that the spectral characteristics of the ER’s caused the reduced benefit. Hearing-impaired listeners were able to utilize the ER energy as effectively as normal-hearing listeners, most likely because binaural processing was not … intelligibility, the exact ILD information is not crucial. The results from an additional experiment demonstrated that the ER benefit was maintained with independent as well as with linked hearing aid compression. Overall, this work contributes to the understanding of ER processing in listeners with normal and …

  4. Auditory spatial perception dynamically realigns with changing eye position.

    Science.gov (United States)

    Razavi, Babak; O'Neill, William E; Paige, Gary D

    2007-09-19

    Audition and vision both form spatial maps of the environment in the brain, and their congruency requires alignment and calibration. Because audition is referenced to the head and vision is referenced to movable eyes, the brain must accurately account for eye position to maintain alignment between the two modalities as well as perceptual space constancy. Changes in eye position are known to variably, but inconsistently, shift sound localization, suggesting subtle shortcomings in the accuracy or use of eye position signals. We systematically and directly quantified sound localization across a broad spatial range and over time after changes in eye position. A sustained fixation task addressed the spatial (steady-state) attributes of eye position-dependent effects on sound localization. Subjects continuously fixated visual reference spots straight ahead (center), to the left (20 degrees), or to the right (20 degrees) of the midline in separate sessions while localizing auditory targets using a laser pointer guided by peripheral vision. An alternating fixation task focused on the temporal (dynamic) aspects of auditory spatial shifts after changes in eye position. Localization proceeded as in sustained fixation, except that eye position alternated between the three fixation references over multiple epochs, each lasting minutes. Auditory space shifted by approximately 40% toward the new eye position and dynamically over several minutes. We propose that this spatial shift reflects an adaptation mechanism for aligning the "straight-ahead" of perceived sensory-motor maps, particularly during early childhood when normal ocular alignment is achieved, but also resolving challenges to normal spatial perception throughout life. PMID:17881531
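The reported dynamics (a shift of roughly 40% of the eye-position offset, accumulating over several minutes) can be sketched as an exponential approach to a fractional realignment. The 0.4 gain comes from this record; the time constant and sampling below are purely hypothetical:

```python
import math

def auditory_shift(eye_offset_deg, t_min, gain=0.4, tau_min=2.0):
    """Perceived auditory-space shift t_min minutes after an eye-position
    change: exponential approach to gain * eye_offset. The ~0.4 gain is
    from the record; tau_min is a hypothetical time constant."""
    return gain * eye_offset_deg * (1.0 - math.exp(-t_min / tau_min))

# Shift after a 20-degree fixation change, sampled once per minute.
shifts = [auditory_shift(20.0, t) for t in range(0, 11)]
```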

  5. The neural circuitry underlying the executive control of auditory spatial attention

    OpenAIRE

    Wu, C-T; Weissman, D.H.; Roberts, K. C.; Woldorff, M.G.

    2007-01-01

    Although a fronto-parietal network has consistently been implicated in the control of visual spatial attention, the network that guides spatial attention in the auditory domain is not yet clearly understood. To investigate this issue, we measured brain activity using functional magnetic resonance imaging while participants performed a cued auditory spatial attention task. We found that cued orienting of auditory spatial attention activated a medial-superior distributed fronto-parietal network...

  6. Quadri-stability of a spatially ambiguous auditory illusion

    Directory of Open Access Journals (Sweden)

    Constance May Bainbridge

    2015-01-01

    Full Text Available In addition to vision, audition plays an important role in sound localization in our world. One way we estimate the motion of an auditory object moving towards or away from us is from changes in volume intensity. However, the human auditory system has unequally distributed spatial resolution, including difficulty distinguishing sounds in front of versus behind the listener. Here, we introduce a novel quadri-stable illusion, the Transverse-and-Bounce Auditory Illusion, which combines front-back confusion with changes in the volume level of a nonspatial sound to create ambiguous percepts of an object approaching and withdrawing from the listener. The sound can be perceived as traveling transversely from front to back or back to front, or as bouncing to remain exclusively in front of or behind the observer. Here we demonstrate how human listeners experience this illusory phenomenon by comparing ambiguous and unambiguous stimuli for each of the four possible motion percepts. When asked to rate their confidence in perceiving each sound’s motion, participants reported equal confidence for the illusory and unambiguous stimuli. Participants perceived all four illusory motion percepts, and could not distinguish the illusion from the unambiguous stimuli. These results show that this illusion is effectively quadri-stable. In a second experiment, the illusory stimulus was looped continuously in headphones while participants identified its perceived path of motion to test properties of perceptual switching, locking, and biases. Participants were biased towards perceiving transverse compared to bouncing paths, and they became perceptually locked into alternating between front-to-back and back-to-front percepts, perhaps reflecting how auditory objects commonly move in the real world. This multi-stable auditory illusion opens opportunities for studying the perceptual, cognitive, and neural representation of objects in motion, as well as exploring multimodal perceptual

  7. Hand proximity facilitates spatial discrimination of auditory tones

    Directory of Open Access Journals (Sweden)

    Philip eTseng

    2014-06-01

    Full Text Available The effect of hand proximity on vision and visual attention has been well documented. In this study we tested whether such effect(s) would also be present in the auditory modality. With hands placed either near or away from the audio sources, participants performed an auditory-spatial discrimination task (Exp 1: left or right side), a pitch discrimination task (Exp 2: high, med, or low tone), and a spatial-plus-pitch discrimination task (Exp 3: left or right; high, med, or low tone). In Exp 1, when hands were away from the audio source, participants consistently responded faster with their right hand regardless of stimulus location. This right hand advantage, however, disappeared in the hands-near condition because of a significant improvement in the left hand’s reaction time. No effect of hand proximity was found in Exp 2 or 3, where a choice reaction time task requiring pitch discrimination was used. Together, these results suggest that the effect of hand proximity is not exclusive to vision alone, but is also present in audition, though in a much weaker form. Most important, these findings provide evidence from auditory attention that supports the multimodal account originally raised by Reed et al. in 2006.

  8. The role of vowel perceptual cues in compensatory responses to perturbations of speech auditory feedback

    OpenAIRE

    Reilly, Kevin J.; Dougherty, Kathleen E.

    2013-01-01

    The perturbation of acoustic features in a speaker's auditory feedback elicits rapid compensatory responses that demonstrate the importance of auditory feedback for control of speech output. The current study investigated whether responses to a perturbation of speech auditory feedback vary depending on the importance of the perturbed feature to perception of the vowel being produced. Auditory feedback of speakers' first formant frequency (F1) was shifted upward by 130 mels in randomly selecte...

  9. The role of different cues in the brain mechanism on visual spatial attention

    Institute of Scientific and Technical Information of China (English)

    SONG Weiqun; LUO Yuejia; CHI Song; JI Xunming; LING Feng; ZHAO Lun; WANG Maobin; SHI Jiannong

    2006-01-01

    The visual spatial attention mechanism in the brain was studied in 16 young subjects through a precue-target visual search paradigm using the event-related potential (ERP) technique, with attentive ranges cued by Chinese character cues and region cues of different scales. The results showed that the response time for Chinese character cues was much longer than that for region cues, especially for small region cues. With exterior interferences, target stimuli were recognized much more quickly under region cues than under Chinese character cues. Compared with region cues, targets under Chinese character cues led to an increase of the posterior P1, a decrease of the N1, and an increase of the P2. It should also be noted that the differences between region cues and Chinese character cues were affected by the interference type. Under exterior interferences, no significant difference was found between region cues and Chinese character cues; however, this was not the case under interior interferences. Considering the difference between the exterior and interior interferences, we conclude that, as the difficulty of target recognition increases, there is an obvious difference in the consumption of anterior frontal resources by target stimuli under the two kinds of cues.

  10. Auditory Spatial Coding Flexibly Recruits Anterior, but Not Posterior, Visuotopic Parietal Cortex.

    Science.gov (United States)

    Michalka, Samantha W; Rosen, Maya L; Kong, Lingqiang; Shinn-Cunningham, Barbara G; Somers, David C

    2016-03-01

    Audition and vision both convey spatial information about the environment, but much less is known about mechanisms of auditory spatial cognition than visual spatial cognition. Human cortex contains >20 visuospatial map representations but no reported auditory spatial maps. The intraparietal sulcus (IPS) contains several of these visuospatial maps, which support visuospatial attention and short-term memory (STM). Neuroimaging studies also demonstrate that parietal cortex is activated during auditory spatial attention and working memory tasks, but prior work has not demonstrated that auditory activation occurs within visual spatial maps in parietal cortex. Here, we report both cognitive and anatomical distinctions in the auditory recruitment of visuotopically mapped regions within the superior parietal lobule. An auditory spatial STM task recruited anterior visuotopic maps (IPS2-4, SPL1), but an auditory temporal STM task with equivalent stimuli failed to drive these regions significantly. Behavioral and eye-tracking measures rule out task difficulty and eye movement explanations. Neither auditory task recruited posterior regions IPS0 or IPS1, which appear to be exclusively visual. These findings support the hypothesis of multisensory spatial processing in the anterior, but not posterior, superior parietal lobule and demonstrate that recruitment of these maps depends on auditory task demands. PMID:26656996

  11. The influence of acoustic reflections from diffusive architectural surfaces on spatial auditory perception

    Science.gov (United States)

    Robinson, Philip W.

    This thesis addresses the effect of reflections from diffusive architectural surfaces on the perception of echoes and on auditory spatial resolution. Diffusive architectural surfaces play an important role in performance venue design for architectural expression and proper sound distribution. Extensive research has been devoted to the prediction and measurement of the spatial dispersion. However, previous psychoacoustic research on perception of reflections and the precedence effect has focused on specular reflections. This study compares the echo threshold of specular reflections against those for reflections from realistic architectural surfaces, and against synthesized reflections that isolate individual qualities of reflections from diffusive surfaces, namely temporal dispersion and spectral coloration. In particular, the activation of the precedence effect, as indicated by the echo threshold, is measured. Perceptual tests are conducted with direct sound, and simulated or measured reflections with varying temporal dispersion. The threshold for reflections from diffusive architectural surfaces is found to be comparable to that of a specular reflection of similar energy rather than similar amplitude. This is surprising because the amplitude of the dispersed reflection is highly attenuated, and onset cues are reduced. This effect indicates that the auditory system is integrating reflection response energy dispersed over many milliseconds into a single stream. Studies on the effect of a single diffuse reflection are then extended to a full architectural enclosure with various surface properties. This research utilizes auralizations from measured and simulated performance venues to investigate spatial discrimination of multiple acoustic sources in rooms. It is found that discriminating the lateral arrangement of two sources is possible at narrower separation angles when reflections come from flat rather than diffusive surfaces. Additionally, subjective impressions are

  12. Spatial scale of motion segmentation from speed cues

    Science.gov (United States)

    Mestre, D. R.; Masson, G. S.; Stone, L. S.

    2001-01-01

    For the accurate perception of multiple, potentially overlapping, surfaces or objects, the visual system must distinguish different local motion vectors and selectively integrate similar motion vectors over space to segment the retinal image properly. We recently showed that large differences in speed are required to yield a percept of motion transparency. In the present study, to investigate the spatial scale of motion segmentation from speed cues alone, we measured the speed-segmentation threshold (the minimum speed difference required for 75% performance accuracy) for 'corrugated' random-dot patterns, i.e. patterns in which dots with two different speeds were alternately placed in adjacent bars of variable width. In a first experiment, we found that, at large bar widths, a smaller speed difference was required to segment and perceive the corrugated pattern of moving dots, while at small bar widths, a larger speed difference was required to segment the two speeds and perceive two transparent surfaces of moving dots. Both the perceptual and segmentation performance transitions occurred at a bar width of around 0.4 degrees. In a second experiment, speed-segmentation thresholds were found to increase sharply when dots with different speeds were paired within a local pooling area. The critical pairing distance was about 0.2 degrees in the fovea and increased linearly with stimulus eccentricity. However, across the range of eccentricities tested (up to 15 degrees), the critical pairing distance did not change much and remained close to the receptive field size of neurons within the primate primary visual cortex. In a third experiment, increasing dot density changed the relationship between speed-segmentation thresholds and bar width. Thresholds decreased for large bar widths, but increased for small bar widths. All of these results are well fit by a simple stochastic model, which estimates the probabilities of having identical or different motion vectors within a
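The closing sentence of this abstract refers to a stochastic model of motion-vector pairing within a local pooling area. The paper's actual model is not reproduced here, but its core intuition, that narrower bars make it more likely that two dots inside one pooling area carry different speeds, can be sketched with a Monte Carlo estimate (all parameter values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def p_mixed_pair(bar_width, pool_radius=0.2, n=100_000, extent=8.0):
    """Monte Carlo estimate of the probability that two dots lying within
    one local pooling area carry different speeds, for a corrugated pattern
    whose alternating bars have the given width (degrees of visual angle)."""
    x1 = rng.uniform(0, extent, n)                       # first dot position
    x2 = x1 + rng.uniform(-pool_radius, pool_radius, n)  # neighbour in the pool
    speed1 = np.floor(x1 / bar_width) % 2                # 0/1 = slow/fast bar
    speed2 = np.floor(x2 / bar_width) % 2
    return float(np.mean(speed1 != speed2))

for w in (0.1, 0.4, 1.6):
    print(f"bar width {w:4.1f} deg: P(mixed pair) ~ {p_mixed_pair(w):.2f}")
```

As the bar width shrinks toward the pooling radius, mixed-speed pairs dominate, consistent with the higher segmentation thresholds reported for small bar widths.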

  13. The Modulation of Exogenous Spatial Cueing on Spatial Stroop Interference: Evidence of a Set for "Cue-Target Event Segregation"

    Science.gov (United States)

    Funes, Maria Jesus; Lupianez, Juan; Milliken, Bruce

    2008-01-01

    Two experiments are reported that test whether the modulation of exogenous cuing effects by the presence of a distractor at the location opposite the target (altering the time course of cueing effects, Lupianez et al., 1999, 2001) is due to the fast reorienting of attention or to a set for preventing the integration of the cue and the target…

  14. Auditory Spatial Coding Flexibly Recruits Anterior, but Not Posterior, Visuotopic Parietal Cortex

    OpenAIRE

    Michalka, Samantha W.; Rosen, Maya L.; Kong, Lingqiang; Shinn-Cunningham, Barbara G.; Somers, David C.

    2015-01-01

    Audition and vision both convey spatial information about the environment, but much less is known about mechanisms of auditory spatial cognition than visual spatial cognition. Human cortex contains >20 visuospatial map representations but no reported auditory spatial maps. The intraparietal sulcus (IPS) contains several of these visuospatial maps, which support visuospatial attention and short-term memory (STM). Neuroimaging studies also demonstrate that parietal cortex is activated during au...

  15. The Effect of Tactile Cues on Auditory Stream Segregation Ability of Musicians and Nonmusicians

    DEFF Research Database (Denmark)

    Slater, Kyle D.; Marozeau, Jeremy

    2016-01-01

    the random melody. Tactile cues were applied to the listener’s fingers on half of the blocks. Results showed that tactile cues can significantly improve the melodic segregation ability in both musician and nonmusician groups in challenging listening conditions. Overall, the musician group performance...

  16. Learning to Match Auditory and Visual Speech Cues: Social Influences on Acquisition of Phonological Categories

    Science.gov (United States)

    Altvater-Mackensen, Nicole; Grossmann, Tobias

    2015-01-01

    Infants' language exposure largely involves face-to-face interactions providing acoustic and visual speech cues but also social cues that might foster language learning. Yet, both audiovisual speech information and social information have so far received little attention in research on infants' early language development. Using a preferential…

  17. Within-hemifield posture changes affect tactile–visual exogenous spatial cueing without spatial precision, especially in the dark

    OpenAIRE

    Kennett, Steffan; Driver, Jon

    2014-01-01

    We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual–tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual...

  18. Increased Variability and Asymmetric Expansion of the Hippocampal Spatial Representation in a Distal Cue-Dependent Memory Task.

    Science.gov (United States)

    Park, Seong-Beom; Lee, Inah

    2016-08-01

    Place cells in the hippocampus fire at specific positions in space, and distal cues in the environment play critical roles in determining the spatial firing patterns of place cells. Many studies have shown that place fields are influenced by distal cues in foraging animals. However, it is largely unknown whether distal-cue-dependent changes in place fields appear in different ways in a memory task if distal cues bear direct significance to achieving goals. We investigated this possibility in this study. Rats were trained to choose different spatial positions in a radial arm in association with distal cue configurations formed by visual cue sets attached to movable curtains around the apparatus. The animals were initially trained to associate readily discernible distal cue configurations (0° vs. 80° angular separation between distal cue sets) with different food-well positions and then later experienced ambiguous cue configurations (14° and 66°) intermixed with the original cue configurations. Rats showed no difficulty in transferring the associated memory formed for the original cue configurations when similar cue configurations were presented. Place field positions remained at the same locations across different cue configurations, whereas stability and coherence of spatial firing patterns were significantly disrupted when ambiguous cue configurations were introduced. Furthermore, the spatial representation was extended backward and skewed more negatively at the population level when processing ambiguous cue configurations, compared with when processing the original cue configurations only. This effect was more salient for large cue-separation conditions than for small cue-separation conditions. No significant rate remapping was observed across distal cue configurations. These findings suggest that place cells in the hippocampus dynamically change their detailed firing characteristics in response to a modified cue environment and that some of the firing

  19. Within-hemifield posture changes affect tactile-visual exogenous spatial cueing without spatial precision, especially in the dark.

    Science.gov (United States)

    Kennett, Steffan; Driver, Jon

    2014-05-01

    We investigated the effects of seen and unseen within-hemifield posture changes on crossmodal visual-tactile links in covert spatial attention. In all experiments, a spatially nonpredictive tactile cue was presented to the left or the right hand, with the two hands placed symmetrically across the midline. Shortly after a tactile cue, a visual target appeared at one of two eccentricities within either of the hemifields. For half of the trial blocks, the hands were aligned with the inner visual target locations, and for the remainder, the hands were aligned with the outer target locations. In Experiments 1 and 2, the inner and outer eccentricities were 17.5° and 52.5°, respectively. In Experiment 1, the arms were completely covered, and visual up-down judgments were better when on the same side as the preceding tactile cue. Cueing effects were not significantly affected by hand or target alignment. In Experiment 2, the arms were in view, and now some target responses were affected by cue alignment: Cueing for outer targets was only significant when the hands were aligned with them. In Experiment 3, we tested whether any unseen posture changes could alter the cueing effects, by widely separating the inner and outer target eccentricities (now 10° and 86°). In this case, hand alignment did affect some of the cueing effects: Cueing for outer targets was now only significant when the hands were in the outer position. Although these results confirm that proprioception can, in some cases, influence tactile-visual links in exogenous spatial attention, they also show that spatial precision is severely limited, especially when posture is unseen. PMID:24470256

  20. Multimodal information Management: Evaluation of Auditory and Haptic Cues for NextGen Communication Displays

    Science.gov (United States)

    Begault, Durand R.; Bittner, Rachel M.; Anderson, Mark R.

    2012-01-01

    Auditory communication displays within the NextGen data link system may use multiple synthetic speech messages replacing traditional ATC and company communications. The design of an interface for selecting amongst multiple incoming messages can impact both performance (time to select, audit and release a message) and preference. Two design factors were evaluated: physical pressure-sensitive switches versus flat panel "virtual switches", and the presence or absence of auditory feedback from switch contact. Performance with stimuli using physical switches was 1.2 s faster than virtual switches (2.0 s vs. 3.2 s); auditory feedback provided a 0.54 s performance advantage (2.33 s vs. 2.87 s). There was no interaction between these variables. Preference data were highly correlated with performance.

  1. Auditory spatial resolution in horizontal, vertical, and diagonal planes

    Science.gov (United States)

    Grantham, D. Wesley; Hornsby, Benjamin W. Y.; Erpenbeck, Eric A.

    2003-08-01

    Minimum audible angle (MAA) and minimum audible movement angle (MAMA) thresholds were measured for stimuli in horizontal, vertical, and diagonal (60°) planes. A pseudovirtual technique was employed in which signals were recorded through KEMAR's ears and played back to subjects through insert earphones. Thresholds were obtained for wideband, high-pass, and low-pass noises. Only 6 of 20 subjects obtained wideband vertical-plane MAAs less than 10°, and only these 6 subjects were retained for the complete study. For all three filter conditions thresholds were lowest in the horizontal plane, slightly (but significantly) higher in the diagonal plane, and highest for the vertical plane. These results were similar in magnitude and pattern to those reported by Perrott and Saberi [J. Acoust. Soc. Am. 87, 1728-1731 (1990)] and Saberi and Perrott [J. Acoust. Soc. Am. 88, 2639-2644 (1990)], except that these investigators generally found that thresholds for diagonal planes were as good as those for the horizontal plane. The present results are consistent with the hypothesis that diagonal-plane performance is based on independent contributions from a horizontal-plane system (sensitive to interaural differences) and a vertical-plane system (sensitive to pinna-based spectral changes). Measurements of the stimuli recorded through KEMAR indicated that sources presented from diagonal planes can produce larger interaural level differences (ILDs) in certain frequency regions than would be expected based on the horizontal projection of the trajectory. Such frequency-specific ILD cues may underlie the very good performance reported in previous studies for diagonal spatial resolution. Subjects in the present study could apparently not take advantage of these cues in the diagonal-plane condition, possibly because they did not externalize the images to their appropriate positions in space or possibly because of the absence of a patterned visual field.

  2. Attention Cueing and Activity Equally Reduce False Alarm Rate in Visual-Auditory Associative Learning through Improving Memory

    Science.gov (United States)

    Haghgoo, Hojjat Allah; Azizi, Solmaz; Nili Ahmadabadi, Majid

    2016-01-01

    In our daily life, we continually exploit already learned multisensory associations and form new ones when facing novel situations. Improving our associative learning results in higher cognitive capabilities. We experimentally and computationally studied the learning performance of healthy subjects in a visual-auditory sensory associative learning task across active learning, attention cueing learning, and passive learning modes. According to our results, the learning mode had no significant effect on learning association of congruent pairs. In addition, subjects’ performance in learning congruent samples was not correlated with their vigilance score. Nevertheless, vigilance score was significantly correlated with the learning performance of the non-congruent pairs. Moreover, in the last block of the passive learning mode, subjects made significantly more mistakes in taking non-congruent pairs as associated and consciously reported lower confidence. These results indicate that attention and activity equally enhanced visual-auditory associative learning for non-congruent pairs, while false alarm rate in the passive learning mode did not decrease after the second block. We investigated the cause of higher false alarm rate in the passive learning mode by using a computational model, composed of a reinforcement learning module and a memory-decay module. The results suggest that the higher rate of memory decay is the source of making more mistakes and reporting lower confidence in non-congruent pairs in the passive learning mode. PMID:27314235

  3. Two Persons with Multiple Disabilities Use Orientation Technology with Auditory Cues to Manage Simple Indoor Traveling

    Science.gov (United States)

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Campodonico, Francesca; Oliva, Doretta

    2010-01-01

    This study was an effort to extend the evaluation of orientation technology for promoting independent indoor traveling in persons with multiple disabilities. Two participants (adults) were included, who were to travel to activity destinations within occupational settings. The orientation system involved (a) cueing sources only at the destinations…

  4. Verbal auditory cueing of improvisational dance: A proposed method for training agency in Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Glenna Batson

    2016-02-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson’s disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson’s have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered (tango). In all of these dance forms, specific movement patterns initially are learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previously reported benefits of dance for people with PD. The method relies primarily on improvisational verbal auditory cueing (VAC) with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility, but also impair spontaneity of thought and action. Dance improvisation trains spontaneity of thought, fostering open and immediate interpretation of verbally delivered movement cues. Here we present an introduction to a proposed method, detailing its methodological specifics, and pointing to future directions. The viewpoint advances an embodied cognitive approach that has eco-validity in helping PPD meet the changing demands of daily living.

  5. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson’s Disease

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E.; Soriano, Christina T.

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson’s disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson’s have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered (tango). In all of these dance forms, specific movement patterns initially are learned through repetition and performed in time to music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previously reported benefits of dance for people with Parkinson’s disease (PD). The method relies primarily on improvisational verbal auditory cueing with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. Here, we present an introduction to a proposed method, detailing its methodological specifics, and pointing to future directions. The viewpoint advances an embodied cognitive approach that has eco-validity in helping PPD meet the changing demands of daily living. PMID:26925029

  6. Psychophysical Responses Comparison in Spatial Visual, Audiovisual, and Auditory BCI-Spelling Paradigms

    OpenAIRE

    Chang, Moonjeong; Nishikawa, Nozomu; Cai, Zhenyu; Makino, Shoji; Rutkowski, Tomasz M.

    2012-01-01

    The paper presents a pilot study conducted with spatial visual, audiovisual and auditory brain-computer-interface (BCI) based speller paradigms. The psychophysical experiments are conducted with healthy subjects in order to evaluate the difficulty and possible response-accuracy variability. We also present preliminary EEG results in offline BCI mode. The obtained results validate the thesis that the spatial auditory-only paradigm performs as well as the traditional visual and audiovisual speller B...

  7. Objective Fidelity Evaluation in Multisensory Virtual Environments: Auditory Cue Fidelity in Flight Simulation

    OpenAIRE

    Meyer, Georg F.; Wong, Li Ting; Timson, Emma; Perfect, Philip; Mark D. White

    2012-01-01

    We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models to quantify human performance in simulated environments that have emerged from research in multisensory perception provide a framework for the objective evaluation of the contribution of individual cues to h...

  8. The Effect of Attentional Cueing and Spatial Uncertainty in Visual Field Testing

    Science.gov (United States)

    Phu, Jack; Kalloniatis, Michael; Khuu, Sieu K.

    2016-01-01

    Purpose To determine the effect of reducing spatial uncertainty by attentional cueing on contrast sensitivity at a range of spatial locations and with different stimulus sizes. Methods Six observers underwent perimetric testing with the Humphrey Visual Field Analyzer (HFA) full threshold paradigm, and the output thresholds were compared to conditions where stimulus location was verbally cued to the observer. We varied the number of points cued, the eccentric and spatial location, and stimulus size (Goldmann size I, III and V). Subsequently, four observers underwent laboratory-based psychophysical testing on a custom computer program using Method of Constant Stimuli to determine the frequency-of-seeing (FOS) curves with similar variables. Results We found that attentional cueing increased contrast sensitivity when measured using the HFA. We report a difference of approximately 2 dB with size I at peripheral and mid-peripheral testing locations. For size III, cueing had a greater effect for points presented in the periphery than in the mid-periphery. There was an exponential decay of the effect of cueing with increasing number of elements cued. Cueing a size V stimulus led to no change. FOS curves generated from laboratory-based psychophysical testing confirmed an increase in contrast detection sensitivity under the same conditions. We found that the FOS curve steepened when spatial uncertainty was reduced. Conclusion We show that attentional cueing increases contrast sensitivity when using a size I or size III test stimulus on the HFA when up to 8 points are cued but not when a size V stimulus is cued. We show that this cueing also alters the slope of the FOS curve. This suggests that at least 8 points should be used to minimise potential attentional factors that may affect measurement of contrast sensitivity in the visual field. PMID:26937972
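A frequency-of-seeing curve of the kind described above is typically summarized by fitting a sigmoid to the proportion of "seen" responses at each stimulus level, with the 50% point as the threshold and the slope as a measure of response variability. A minimal maximum-likelihood fit, using invented data rather than values from the study:

```python
import numpy as np

# Invented frequency-of-seeing data: stimulus intensity (dB) and the number
# of 'seen' responses out of 20 presentations at each level.
intensity = np.array([20., 22., 24., 26., 28., 30.])
seen      = np.array([ 2,   5,  11,  17,  19,  20])
n_trials  = 20

def logistic(x, thresh, slope):
    """Psychometric function: probability of 'seen' at intensity x."""
    return 1.0 / (1.0 + np.exp(-(x - thresh) / slope))

# Maximum-likelihood fit by brute-force grid search (no optimizer needed).
best_nll = np.inf
for t in np.linspace(18, 32, 281):
    for s in np.linspace(0.2, 5.0, 97):
        p = np.clip(logistic(intensity, t, s), 1e-9, 1 - 1e-9)
        nll = -np.sum(seen * np.log(p) + (n_trials - seen) * np.log(1 - p))
        if nll < best_nll:
            best_nll, thresh, slope = nll, t, s

print(f"50% threshold ~ {thresh:.1f} dB, slope ~ {slope:.2f} dB")
```

A steeper fitted curve (smaller slope parameter) corresponds to the reduced spatial uncertainty reported under cueing.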

  9. Perception of auditory, visual, and egocentric spatial alignment adapts differently to changes in eye position.

    Science.gov (United States)

    Cui, Qi N; Razavi, Babak; O'Neill, William E; Paige, Gary D

    2010-02-01

    Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (high-pass and low-pass noise) related to specific auditory spatial channels, 3) matched by a shift in the perceived straight-ahead (PSA), and 4) accompanied by a spatial shift for visual and/or bimodal (visual and auditory) targets. Subjects were tested in a dark echo-attenuated chamber with their heads fixed facing a cylindrical screen, behind which a mobile speaker/LED presented targets across the frontal field. Subjects fixated alternating reference spots (0, +/-20 degrees) horizontally or vertically while either localizing targets or indicating PSA using a laser pointer. Results showed that the spatial shift induced by ocular eccentricity is 1) preserved for auditory targets without a visual fixation reference, 2) generalized for all frequency bands, and thus all auditory spatial channels, 3) paralleled by a shift in PSA, and 4) restricted to auditory space. Findings are consistent with a set-point control strategy by which eye position governs multimodal spatial alignment. The phenomenon is robust for auditory space and egocentric perception, and highlights the importance of controlling for eye position in the examination of spatial perception and behavior. PMID:19846626

  10. A dynamic model of how feature cues guide spatial attention

    OpenAIRE

    Hamker, Fred H.

    2004-01-01

    We will describe a computational model of attention which explains the guidance of spatial attention by feedback within a distributed network. We hypothesize that feedback within the ventral pathway transfers the target template from prefrontal areas into intermediate areas like V4. The oculomotor circuit consisting of FEF, LIP and superior colliculus picks up this distributed activity and provides a continuous spatial reentry signal from premotor cells. In order to test this hypothesis, we s...

  11. Flexible spatial perspective-taking: Conversational partners weigh multiple cues in collaborative tasks

    Directory of Open Access Journals (Sweden)

    Alexia Galati

    2013-09-01

    Research on spatial perspective-taking often focuses on the cognitive processes of isolated individuals as they adopt or maintain imagined perspectives. Collaborative studies of spatial perspective-taking typically examine speakers’ linguistic choices, while overlooking their underlying processes and representations. We review evidence from two collaborative experiments that examine the contribution of social and representational cues to spatial perspective choices in both language and the organization of spatial memory. Across experiments, speakers organized their memory representations according to the convergence of various cues. When layouts were randomly configured and did not afford intrinsic cues, speakers encoded their partner’s viewpoint in memory, if available, but did not use it as an organizing direction. On the other hand, when the layout afforded an intrinsic structure, speakers organized their spatial memories according to the person-centered perspective reinforced by the layout’s structure. Similarly, in descriptions, speakers considered multiple cues, whether available a priori or at the interaction. They used partner-centered expressions more frequently (e.g., “to your right”) when the partner’s viewpoint was misaligned by a small offset or coincided with the layout’s structure. Conversely, they used egocentric expressions more frequently when their own viewpoint coincided with the intrinsic structure or when the partner was misaligned by a computationally difficult, oblique offset. Based on these findings we advocate for a framework for flexible perspective-taking: people weigh multiple cues (including social ones) to make attributions about the relative difficulty of perspective-taking for each partner, and adapt behavior to minimize their collective effort. This framework is not specialized for spatial reasoning but instead emerges from the same principles and memory-dependent processes that govern perspective

  12. Processing of spatial sounds in human auditory cortex during visual, discrimination and 2-back tasks

    Directory of Open Access Journals (Sweden)

    Teemu Rinne

    2014-07-01

    Previous imaging studies on the brain mechanisms of spatial hearing have mainly focused on sounds varying in the horizontal plane. In this study, we compared activations in human auditory cortex (AC) and adjacent inferior parietal lobule (IPL) to sounds varying in horizontal location, distance, or space (i.e., different rooms). In order to investigate both stimulus-dependent and task-dependent activations, these sounds were presented during visual discrimination, auditory discrimination, and auditory 2-back memory tasks. Consistent with previous studies, activations in AC were modulated by the auditory tasks. During both auditory and visual tasks, activations in AC were stronger to sounds varying in horizontal location than along other feature dimensions. However, in IPL, this enhancement was detected only during auditory tasks. Based on these results, we argue that IPL is not primarily involved in stimulus-level spatial analysis but that it may represent such information for more general processing when relevant to an active auditory task.

  13. Self-Generated Auditory Feedback as a Cue to Support Rhythmic Motor Stability

    Directory of Open Access Journals (Sweden)

    Daniel Gopher

    2011-12-01

    A goal of the SKILLS project is to develop Virtual Reality (VR)-based training simulators for different application domains, one of which is juggling. Within this context the value of multimodal VR environments for skill acquisition is investigated. In this study, we investigated whether it was necessary to render the sounds of virtual balls hitting virtual hands within the juggling training simulator. First, we recorded sounds at the jugglers’ ears and found the sound of ball hitting hands to be audible. Second, we asked 24 jugglers to juggle under normal conditions (Audible) or while listening to pink noise intended to mask the juggling sounds (Inaudible). We found that although the jugglers themselves reported no difference in their juggling across these two conditions, external juggling experts rated rhythmic stability worse in the Inaudible condition than in the Audible condition. This result suggests that auditory information should be rendered in the VR juggling training simulator.

  14. Hierarchical and serial processing in the spatial auditory cortical pathway is degraded by natural aging

    OpenAIRE

    Juarez-Salinas, Dina L.; Engle, James R.; Navarro, Xochi O.; Gregg H Recanzone

    2010-01-01

    The compromised abilities to localize sounds and to understand speech are two hallmark deficits in aged individuals. The auditory cortex is necessary for these processes, yet we know little about how normal aging affects these early cortical fields. In this study, we recorded the spatial tuning of single neurons in primary (area A1) and secondary (area CL) auditory cortical areas in young and aged alert rhesus macaques. We found that the neurons of aged animals had greater spontaneous and dri...

  15. Separate Mechanisms Recruited by Exogenous and Endogenous Spatial Cues: Evidence from a Spatial Stroop Paradigm

    Science.gov (United States)

    Funes, Maria Jesus; Lupianez, Juan; Milliken, Bruce

    2007-01-01

    The present experiments tested whether endogenous and exogenous cues produce separate effects on target processing. In Experiment 1, participants discriminated whether an arrow presented left or right of fixation pointed to the left or right. For 1 group, the arrow was preceded by a peripheral noninformative cue. For the other group, the arrow was…

  16. Encoding and retrieval of landmark-related spatial cues during navigation: an fMRI study.

    Science.gov (United States)

    Wegman, Joost; Tyborowska, Anna; Janzen, Gabriele

    2014-07-01

    To successfully navigate, humans can use different cues from their surroundings. Learning locations in an environment can be supported by parallel subsystems in the hippocampus and the striatum. We used fMRI to look at differences in the use of object-related spatial cues while 47 participants actively navigated in an open-field virtual environment. In each trial, participants navigated toward a target object. During encoding, three positional cues (columns) with directional cues (shadows) were available. During retrieval, the removed target had to be replaced while either two objects without shadows (objects trial) or one object with a shadow (shadow trial) were available. Participants were informed in blocks about which type of retrieval trial was most likely to occur, thereby modulating expectations of having to rely on a single landmark or on a configuration of landmarks. We investigated how the spatial learning systems in the hippocampus and caudate nucleus were involved in these landmark-based encoding and retrieval processes. Landmark configurations can create a geometry similar to boundaries in an environment. It was found that the hippocampus was involved in encoding when relying on configurations of landmarks, whereas the caudate nucleus was involved in encoding when relying on single landmarks. This might suggest that the observed hippocampal activation for configurations of objects is linked to a spatial representation observed with environmental boundaries. Retrieval based on configurations of landmarks activated regions associated with the spatial updating of object locations for reorientation. When only a single landmark was available during retrieval, regions associated with updating the location of oneself were activated. There was also evidence that good between-participant performance was predicted by right hippocampal activation. This study therefore sheds light on how the brain deals with changing demands on spatial processing related purely

  17. Spatially valid proprioceptive cues improve the detection of a visual stimulus

    DEFF Research Database (Denmark)

    Jackson, Carl P T; Miall, R Chris; Balslev, Daniela

    2010-01-01

    Vision and proprioception are the main sensory modalities that convey hand location and direction of movement. Fusion of these sensory signals into a single robust percept is now well documented. However, it is not known whether these modalities also interact in the spatial allocation of attention…, which has been demonstrated for other modality pairings. The aim of this study was to test whether proprioceptive signals can spatially cue a visual target to improve its detection. Participants were instructed to use a planar manipulandum in a forward reaching action and determine during this movement… whether a near-threshold visual target appeared at either of two lateral positions. The target presentation was followed by a masking stimulus, which made its possible location unambiguous, but not its presence. Proprioceptive cues were given by applying a brief lateral force to the participant's arm…

  18. The Use of Spatialized Speech in Auditory Interfaces for Computer Users Who Are Visually Impaired

    Science.gov (United States)

    Sodnik, Jaka; Jakus, Grega; Tomazic, Saso

    2012-01-01

    Introduction: This article reports on a study that explored the benefits and drawbacks of using spatially positioned synthesized speech in auditory interfaces for computer users who are visually impaired (that is, are blind or have low vision). The study was a practical application of such systems: an enhanced word processing application compared…

  19. Geometric Cues, Reference Frames, and the Equivalence of Experienced-Aligned and Novel-Aligned Views in Human Spatial Memory

    Science.gov (United States)

    Kelly, Jonathan W.; Sjolund, Lori A.; Sturz, Bradley R.

    2013-01-01

    Spatial memories are often organized around reference frames, and environmental shape provides a salient cue to reference frame selection. To date, however, the environmental cues responsible for influencing reference frame selection remain relatively unknown. To connect research on reference frame selection with that on orientation via…

  20. Perception of Auditory, Visual, and Egocentric Spatial Alignment Adapts Differently to Changes in Eye Position

    OpenAIRE

    Cui, Qi N; Razavi, Babak; O'Neill, William E.; Paige, Gary D.

    2009-01-01

    Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (...

  1. Effects of spatially correlated acoustic-tactile information on judgments of auditory circular direction

    Science.gov (United States)

    Cohen, Annabel J.; Lamothe, M. J. Reina; Toms, Ian D.; Fleming, Richard A. G.

    2002-05-01

    Cohen, Lamothe, Fleming, MacIsaac, and Lamoureux [J. Acoust. Soc. Am. 109, 2460 (2001)] reported that proximity governed circular direction judgments (clockwise/counterclockwise) of two successive tones emanating from all pairs of 12 speakers located at 30-degree intervals around a listener's head. Many listeners appeared to experience systematic front-back confusion. Diametrically opposed locations (180 degrees: theoretically ambiguous direction) produced a direction bias pattern resembling Deutsch's tritone paradox [Deutsch, Kuyper, and Fisher, Music Percept. 5, 79-92 (1987)]. In Experiment 1 of the present study, the circular direction task was conducted in the tactile domain using 12 circumcranial points of vibration. For all 5 participants, proximity governed direction (without front-back confusion), and a simple clockwise bias was shown for 180-degree pairs. Experiment 2 tested 9 new participants in one unimodal auditory condition and two bimodal auditory-tactile conditions (spatially correlated/spatially uncorrelated). Correlated auditory-tactile information eliminated front-back confusion for 8 participants and replaced the "paradoxical" bias for 180-degree pairs with the clockwise bias. Thus, spatially correlated audio-tactile location information improves the veridical representation of 360-degree acoustic space, and modality-specific principles are implicated by the unique circular direction bias patterns for 180-degree pairs in the separate auditory and tactile modalities. [Work supported by NSERC.]

  2. The Relationship between Visual-Spatial and Auditory-Verbal Working Memory Span in Senegalese and Ugandan Children

    OpenAIRE

    Michael J Boivin; Paul Bangirana; Rebecca C Shaffer

    2010-01-01

    BACKGROUND: Using the Kaufman Assessment Battery for Children (K-ABC) Conant et al. (1999) observed that visual and auditory working memory (WM) span were independent in both younger and older children from DR Congo, but related in older American children and in Lao children. The present study evaluated whether visual and auditory WM span were independent in Ugandan and Senegalese children. METHOD: In a linear regression analysis we used visual (Spatial Memory, Hand Movements) and auditory (N...

  3. Sub-second temporal processing : effects of modality and spatial change on brief visual and auditory time judgments

    OpenAIRE

    Retsa, Chryssoula

    2013-01-01

    The present thesis set out to investigate how sensory modality and spatial presentation influence visual and auditory duration judgments in the millisecond range. The effects of modality and spatial location were explored by considering right and left side presentations of mixed or blocked visual and auditory stimuli. Several studies have shown that perceived duration of a stimulus can be affected by various extra-temporal factors such as modality and spatial position. Audit...

  4. Interference between postural control and spatial vs. non-spatial auditory reaction time tasks in older adults.

    Science.gov (United States)

    Fuhrman, Susan I; Redfern, Mark S; Jennings, J Richard; Furman, Joseph M

    2015-01-01

    This study investigated whether spatial aspects of an information processing task influence dual-task interference. Two groups (older/young) of healthy adults participated in dual-task experiments. Two auditory information processing tasks included a frequency discrimination choice reaction time task (non-spatial task) and a lateralization choice reaction time task (spatial task). Postural tasks included combinations of standing with eyes open or eyes closed on either a fixed floor or a sway-referenced floor. Reaction times and postural sway via center of pressure were recorded. Baseline measures of reaction time and sway were subtracted from the corresponding dual-task results to calculate reaction time task costs and postural task costs. Reaction time task cost increased with eye closure (p = 0.01) and with sway-referenced flooring (p …); … visual-spatial interference may occur in older subjects when vision is used to maintain posture. PMID:26410669

  5. Follow the Sound : Design of mobile spatial audio applications for pedestrian navigation

    OpenAIRE

    2012-01-01

    Auditory displays are slower than graphical user interfaces. We believe spatial audio can change that. Human perception can localize the position of sound sources thanks to psychoacoustic cues. Spatial audio reproduces these cues to render virtual sound source positions over headphones. The spatial attribute of sound can be used to produce richer and more effective auditory displays. In this work, a set of interaction design guidelines is proposed for the use of spatial audio displays i...
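    As a loose illustration of the psychoacoustic cues this record refers to (not code from the cited work), a headphone display can approximate a virtual source position by delaying and attenuating the signal at the far ear, i.e., synthesizing an interaural time difference (ITD) and interaural level difference (ILD). The maximum-ITD and maximum-ILD constants below are rough assumed values, not measured HRTF data, which a real spatial-audio display would use:

    ```python
    import math

    def render_virtual_source(mono, azimuth_deg, fs=44100,
                              max_itd_s=0.00066, max_ild_db=6.0):
        """Illustrative binaural panning sketch: place a mono signal at a
        virtual azimuth by delaying and attenuating the far ear.
        azimuth_deg: 0 = straight ahead, positive = to the listener's right.
        max_itd_s (~660 us) and max_ild_db (6 dB) are assumed constants."""
        lateral = abs(math.sin(math.radians(azimuth_deg)))
        itd_samples = int(round(lateral * max_itd_s * fs))
        far_gain = 10 ** (-lateral * max_ild_db / 20)
        near = list(mono) + [0.0] * itd_samples   # pad to equal length
        far = [0.0] * itd_samples + [s * far_gain for s in mono]
        if azimuth_deg >= 0:      # source on the right: left ear is far
            return far, near       # (left channel, right channel)
        return near, far

    # A click placed hard right: the left channel is delayed and quieter.
    left, right = render_virtual_source([1.0, 0.5, 0.25], azimuth_deg=90)
    ```

    Real systems convolve with head-related transfer functions instead of this delay-and-gain approximation, but the underlying cues are the same ones the abstract describes.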

  6. Cues, context, and long-term memory: the role of the retrosplenial cortex in spatial cognition

    Directory of Open Access Journals (Sweden)

    Adam M P Miller

    2014-08-01

    Spatial navigation requires representations of landmarks and other navigation cues. The retrosplenial cortex (RSC is anatomically positioned between limbic areas important for memory formation, such as the hippocampus and the anterior thalamus, and cortical regions along the dorsal stream known to contribute importantly to long-term spatial representation, such as the posterior parietal cortex. Damage to the RSC severely impairs allocentric representations of the environment, including the ability to derive navigational information from landmarks. The specific deficits seen in tests of human and rodent navigation suggest that the RSC supports allocentric representation by processing the stable features of the environment and the spatial relationships among them. In addition to spatial cognition, the RSC plays a key role in contextual and episodic memory. The RSC also contributes importantly to the acquisition and consolidation of long-term spatial and contextual memory through its interactions with the hippocampus. Within this framework, the RSC plays a dual role as part of the feedforward network providing sensory and mnemonic input to the hippocampus and as a target of the hippocampal-dependent systems consolidation of long-term memory.

  7. So Close to a Deal: Spatial-Distance Cues Influence Economic Decision-Making in a Social Context.

    Science.gov (United States)

    Fatfouta, Ramzi; Schulreich, Stefan; Meshi, Dar; Heekeren, Hauke

    2015-01-01

    Social distance (i.e., the degree of closeness to another person) affects the way humans perceive and respond to fairness during financial negotiations. Feeling close to someone enhances the acceptance of monetary offers. Here, we explored whether this effect also extends to the spatial domain. Specifically, using an iterated version of the Ultimatum Game in a within-subject design, we investigated whether different visual spatial-distance cues result in different rates of acceptance of otherwise identical monetary offers. Study 1 found that participants accepted significantly more offers when they were cued with spatial closeness than when they were cued with spatial distance. Study 2 replicated this effect using identical procedures but different spatial-distance cues in an independent sample. Importantly, our results could not be explained by feelings of social closeness. Our results demonstrate that mere perceptions of spatial closeness produce analogous, but independent, effects to those of social closeness. PMID:26287528

  8. The relationship between visual-spatial and auditory-verbal working memory span in Senegalese and Ugandan children.

    Directory of Open Access Journals (Sweden)

    Michael J Boivin

    BACKGROUND: Using the Kaufman Assessment Battery for Children (K-ABC), Conant et al. (1999) observed that visual and auditory working memory (WM) span were independent in both younger and older children from DR Congo, but related in older American children and in Lao children. The present study evaluated whether visual and auditory WM span were independent in Ugandan and Senegalese children. METHOD: In a linear regression analysis we used visual (Spatial Memory, Hand Movements) and auditory (Number Recall) WM along with education and physical development (weight/height) as predictors. The predicted variable in this analysis was Word Order, which is a verbal memory task that has both visual and auditory memory components. RESULTS: Both the younger (<8.5 yrs) and older (>8.5 yrs) Ugandan children had auditory memory span (Number Recall) that was strongly predictive of Word Order performance. For both the younger and older groups of Senegalese children, only visual WM span (Spatial Memory) was strongly predictive of Word Order. Number Recall was not significantly predictive of Word Order in either age group. CONCLUSIONS: It is possible that greater literacy from more schooling for the Ugandan age groups mediated their greater degree of interdependence between auditory and verbal WM. Our findings support those of Conant et al., who observed in their cross-cultural comparisons that stronger education seemed to enhance the dominance of the phonological-auditory processing loop for WM.

  9. Influence of age, spatial memory, and ocular fixation on localization of auditory, visual, and bimodal targets by human subjects.

    Science.gov (United States)

    Dobreva, Marina S; O'Neill, William E; Paige, Gary D

    2012-12-01

    A common complaint of the elderly is difficulty identifying and localizing auditory and visual sources, particularly in competing background noise. Spatial errors in the elderly may pose challenges and even threats to self and others during everyday activities, such as localizing sounds in a crowded room or driving in traffic. In this study, we investigated the influence of aging, spatial memory, and ocular fixation on the localization of auditory, visual, and combined auditory-visual (bimodal) targets. Head-restrained young and elderly subjects localized targets in a dark, echo-attenuated room using a manual laser pointer. Localization accuracy and precision (repeatability) were quantified for both ongoing and transient (remembered) targets at response delays up to 10 s. Because eye movements bias auditory spatial perception, localization was assessed under target fixation (eyes free, pointer guided by foveal vision) and central fixation (eyes fixed straight ahead, pointer guided by peripheral vision) conditions. Spatial localization across the frontal field in young adults demonstrated (1) horizontal overshoot and vertical undershoot for ongoing auditory targets under target fixation conditions, but near-ideal horizontal localization with central fixation; (2) accurate and precise localization of ongoing visual targets guided by foveal vision under target fixation that degraded when guided by peripheral vision during central fixation; (3) overestimation in horizontal central space (±10°) of remembered auditory, visual, and bimodal targets with increasing response delay. In comparison with young adults, elderly subjects showed (1) worse precision in most paradigms, especially when localizing with peripheral vision under central fixation; (2) greatly impaired vertical localization of auditory and bimodal targets; (3) increased horizontal overshoot in the central field for remembered visual and bimodal targets across response delays; (4) greater vulnerability to

  10. Trial-by-trial changes in a priori informational value of external cues and subjective expectancies in human auditory attention.

    Directory of Open Access Journals (Sweden)

    Antonio Arjona

    BACKGROUND: Preparatory activity based on a priori probabilities generated in previous trials and on subjective expectancies would produce an attentional bias. However, preparation can be correct (valid) or incorrect (invalid) depending on the actual target stimulus. The alternation effect refers to the subjective expectancy that a target will not be repeated in the same position, causing RTs to increase if the target location is repeated. The present experiment, using Posner's central cue paradigm, tries to demonstrate that not only the credibility of the cue but also the expectancy about the next position of the target are changed on a trial-by-trial basis. Sequences of trials were analyzed. RESULTS: The results indicated an increase in RT benefits when sequences of two and three valid trials occurred. The analysis of errors indicated an increase in anticipatory behavior which grows as the number of valid trials increases. On the other hand, there was also an RT benefit when a trial was preceded by trials in which the position of the target changed with respect to the current trial (alternation effect). Sequences of two alternations or two repetitions were faster than sequences of trials in which a pattern of repetition or alternation was broken. CONCLUSIONS: Taken together, these results suggest that in Posner's central cue paradigm, and with regard to anticipatory activity, the credibility of the external cue and of the endogenously anticipated patterns of target location are constantly updated. The results suggest that Bayesian rules operate in the generation of anticipatory activity as a function of the previous trial's outcome, but also on biases or prior beliefs such as the "gambler's fallacy".

  11. Using spatial manipulation to examine interactions between visual and auditory encoding of pitch and time

    Directory of Open Access Journals (Sweden)

    Neil M McLachlan

    2010-12-01

    Music notations use both symbolic and spatial representation systems. Novice musicians do not have the training to associate symbolic information with musical identities, such as chords or rhythmic and melodic patterns. They provide an opportunity to explore the mechanisms underpinning multimodal learning when spatial encoding strategies of feature dimensions might be expected to dominate. In this study, we applied a range of transformations (such as time reversal) to short melodies and rhythms and asked novice musicians to identify them with or without the aid of notation. Performance using a purely spatial (graphic) notation was contrasted with the more symbolic, traditional Western notation over a series of weekly sessions. The results showed learning effects for both notation types, but performance improved more for graphic notation. This points to greater compatibility of auditory and visual neural codes for novice musicians when using spatial notation, suggesting that pitch and time may be spatially encoded in multimodal associative memory. The findings also point to new strategies for training novice musicians.

  12. The Effects of Cueing Temporal and Spatial Attention on Word Recognition in a Complex Listening Task in Hearing-Impaired Listeners

    OpenAIRE

    Gatehouse, Stuart; Akeroyd, Michael A.

    2008-01-01

    In a complex listening situation such as a multiperson conversation, the demands on an individual's attention are considerable: There will often be many sounds occurring simultaneously, with continual changes in source and direction. A laboratory analog of this was designed to measure the benefit that helping attention (by visual cueing) would have on word identification. These words were presented unpredictably but were sometimes cued with a temporal cue or a temporal-and-spatial cue. Two gr...

  13. Auditory Perceptual and Visual-Spatial Characteristics of Gaze-Evoked Tinnitus

    Directory of Open Access Journals (Sweden)

    Jamileh Fattahi

    1996-09-01

    Auditory perceptual and visual-spatial characteristics of subjective tinnitus evoked by eye gaze were studied in two adult human subjects. This uncommon form of tinnitus occurred approximately 4-6 weeks following neurosurgery for gross total excision of space-occupying lesions of the cerebellopontine angle, and hearing was lost in the operated ear. In both cases, the gaze-evoked tinnitus was characterized as being tonal in nature, with pitch and loudness percepts remaining constant as long as the same horizontal or vertical eye directions were maintained. Tinnitus was absent when the eyes were in a neutral, head-referenced position with subjects looking straight ahead. The results and implications of ophthalmological, standard and modified visual field assessment, pure-tone audiometric assessment, spontaneous otoacoustic emission testing, and detailed psychophysical assessment of pitch and loudness are discussed.

  14. Musical metaphors: evidence for a spatial grounding of non-literal sentences describing auditory events.

    Science.gov (United States)

    Wolter, Sibylla; Dudschig, Carolin; de la Vega, Irmgard; Kaup, Barbara

    2015-03-01

    This study investigated whether the spatial terms high and low, when used in sentence contexts implying a non-literal interpretation, trigger similar spatial associations as would have been expected from the literal meaning of the words. In three experiments, participants read sentences describing either a high or a low auditory event (e.g., The soprano sings a high aria vs. The pianist plays a low note). In all experiments, participants were asked to judge (yes/no) whether the sentences were meaningful by means of up/down (Experiments 1 and 2) or left/right (Experiment 3) key press responses. Contrary to previous studies reporting that metaphorical language understanding differs from literal language understanding with regard to simulation effects, the results show compatibility effects between sentence-implied pitch height and response location. The results are in line with grounded models of language comprehension proposing that sensory motor experiences are elicited when processing literal as well as non-literal sentences. PMID:25443988

  15. The shape of ears to come: dynamic coding of auditory space.

    Science.gov (United States)

    King, A J.; Schnupp, J W.H.; Doubell, T P.

    2001-06-01

    In order to pinpoint the location of a sound source, we make use of a variety of spatial cues that arise from the direction-dependent manner in which sounds interact with the head, torso and external ears. Accurate sound localization relies on the neural discrimination of tiny differences in the values of these cues and requires that the brain circuits involved be calibrated to the cues experienced by each individual. There is growing evidence that the capacity for recalibrating auditory localization continues well into adult life. Many details of how the brain represents auditory space and of how those representations are shaped by learning and experience remain elusive. However, it is becoming increasingly clear that the task of processing auditory spatial information is distributed over different regions of the brain, some working hierarchically, others independently and in parallel, and each apparently using different strategies for encoding sound source location. PMID:11390297

  16. Manipulating the stride length/stride velocity relationship of walking using a treadmill and rhythmic auditory cueing in non-disabled older individuals. A short-term feasibility study.

    OpenAIRE

    Eikema, Diderik Jan A.; Forrester, Larry W.; Whitall, Jill

    2014-01-01

    One target for rehabilitating locomotor disorders in older adults is to increase mobility by improving walking velocity. Combining rhythmic auditory cueing (RAC) and treadmill training permits the study of the stride length/stride velocity ratio (SL/SV), often reduced in those with mobility deficits. We investigated the use of RAC to increase velocity by manipulating the SL/SV ratio in older adults. Nine participants (6 female; age: 61.1 ± 8.8 yrs.) walked overground on a gait mat at preferre...

  17. Temporal asymmetries in auditory coding and perception reflect multi-layered nonlinearities.

    Science.gov (United States)

    Deneux, Thomas; Kempf, Alexandre; Daret, Aurélie; Ponsot, Emmanuel; Bathellier, Brice

    2016-01-01

    Sound recognition relies not only on spectral cues, but also on temporal cues, as demonstrated by the profound impact of time reversals on perception of common sounds. To address the coding principles underlying such auditory asymmetries, we recorded a large sample of auditory cortex neurons using two-photon calcium imaging in awake mice, while playing sounds ramping up or down in intensity. We observed clear asymmetries in cortical population responses, including stronger cortical activity for up-ramping sounds, which matches perceptual saliency assessments in mice and previous measures in humans. Analysis of cortical activity patterns revealed that auditory cortex implements a map of spatially clustered neuronal ensembles, detecting specific combinations of spectral and intensity modulation features. Comparing different models, we show that cortical responses result from multi-layered nonlinearities, which, contrary to standard receptive field models of auditory cortex function, build divergent representations of sounds with similar spectral content, but different temporal structure. PMID:27580932

  18. Colorful Success: Preschoolers' Use of Perceptual Color Cues to Solve a Spatial Reasoning Problem

    Science.gov (United States)

    Joh, Amy S.; Spivey, Leigh A.

    2012-01-01

    Spatial reasoning, a crucial skill for everyday actions, develops gradually during the first several years of childhood. Previous studies have shown that perceptual information and problem solving strategies are critical for successful spatial reasoning in young children. Here, we sought to link these two factors by examining children's use of…

  19. Characterizing spatial tuning functions of neurons in the auditory cortex of young and aged monkeys: A new perspective on old data.

    OpenAIRE

    James Engle; Gregg H Recanzone

    2013-01-01

    Age-related hearing deficits are a leading cause of disability among the aged. While some forms of hearing deficits are peripheral in origin, others are centrally mediated. One such deficit is the ability to localize sounds, a critical component for segregating different acoustic objects and events, which is dependent on the auditory cortex. Recent evidence indicates that in aged animals the normal sharpening of spatial tuning between neurons in primary auditory cortex to the caudal latera...

  20. Plasticity in the neural coding of auditory space in the mammalian brain

    Science.gov (United States)

    King, Andrew J.; Parsons, Carl H.; Moore, David R.

    2000-10-01

    Sound localization relies on the neural processing of monaural and binaural spatial cues that arise from the way sounds interact with the head and external ears. Neurophysiological studies of animals raised with abnormal sensory inputs show that the map of auditory space in the superior colliculus is shaped during development by both auditory and visual experience. An example of this plasticity is provided by monaural occlusion during infancy, which leads to compensatory changes in auditory spatial tuning that tend to preserve the alignment between the neural representations of visual and auditory space. Adaptive changes also take place in sound localization behavior, as demonstrated by the fact that ferrets raised and tested with one ear plugged learn to localize as accurately as control animals. In both cases, these adjustments may involve greater use of monaural spectral cues provided by the other ear. Although plasticity in the auditory space map seems to be restricted to development, adult ferrets show some recovery of sound localization behavior after long-term monaural occlusion. The capacity for behavioral adaptation is, however, task dependent, because auditory spatial acuity and binaural unmasking (a measure of the spatial contribution to the "cocktail party effect") are permanently impaired by chronically plugging one ear, both in infancy but especially in adulthood. Experience-induced plasticity allows the neural circuitry underlying sound localization to be customized to individual characteristics, such as the size and shape of the head and ears, and to compensate for natural conductive hearing losses, including those associated with middle ear disease in infancy.
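    The binaural cues this record builds on are well characterized quantitatively. As an illustrative aside (not code from the cited paper), the interaural time difference (ITD), the dominant binaural cue at low frequencies, can be approximated with Woodworth's spherical-head model, here assuming a typical head radius of about 8.75 cm:

    ```python
    import math

    def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
        """Approximate the interaural time difference (seconds) for a
        far-field source using Woodworth's spherical-head model:
            ITD = (r / c) * (sin(theta) + theta)
        where theta is the source azimuth in radians (0 = straight ahead).
        Head radius and speed of sound are typical assumed values."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

    # ITD grows from zero at the midline to a maximum at 90 degrees azimuth,
    # on the order of a few hundred microseconds for a human-sized head.
    for az in (0, 30, 60, 90):
        print(f"{az:3d} deg -> {woodworth_itd(az) * 1e6:6.1f} us")
    ```

    Plugging a different head radius into the same formula illustrates the point made above: the cue values depend on individual head geometry, which is why the neural circuitry must be calibrated to each listener.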

  1. From repulsion to attraction: species- and spatial context-dependent threat sensitive response of the spider mite Tetranychus urticae to predatory mite cues

    Science.gov (United States)

    Fernández Ferrari, M. Celeste; Schausberger, Peter

    2013-06-01

    Prey perceiving predation risk commonly change their behavior to avoid predation. However, antipredator strategies are costly. Therefore, according to the threat-sensitive predator avoidance hypothesis, prey should match the intensity of their antipredator behaviors to the degree of threat, which may depend on the predator species and the spatial context. We assessed threat sensitivity of the two-spotted spider mite, Tetranychus urticae, to the cues of three predatory mites, Phytoseiulus persimilis, Neoseiulus californicus, and Amblyseius andersoni, posing different degrees of risk in two spatial contexts. We first conducted a no-choice test measuring oviposition and activity of T. urticae exposed to chemical traces of predators or traces plus predator eggs. Then, we tested the site preference of T. urticae in choice tests, using artificial cages and leaves. In the no-choice test, T. urticae deposited their first egg later in the presence of cues of P. persimilis than with cues of the other two predators or in their absence, indicating interspecific threat sensitivity. T. urticae also laid fewer eggs in the presence of cues of P. persimilis and A. andersoni than with cues of N. californicus or in the absence of cues. In the artificial cage test, the spider mites preferred the site with predator traces, whereas in the leaf test, they preferentially resided on leaves without traces. We argue that in a nonplant environment, chemical predator traces do not indicate a risk for T. urticae; instead, these traces function as indirect habitat cues. The spider mites were attracted to these cues because they associated them with the existence of a nearby host plant.

  2. From ear to hand: the role of the auditory-motor loop in pointing to an auditory source

    Science.gov (United States)

    Boyer, Eric O.; Babayan, Bénédicte M.; Bevilacqua, Frédéric; Noisternig, Markus; Warusfel, Olivier; Roby-Brami, Agnes; Hanneton, Sylvain; Viaud-Delmon, Isabelle

    2013-01-01

    Studies of the nature of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements were studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand. To accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in heard hand position. Localization errors were exacerbated by short duration of target presentation but not modified by auditory feedback of hand position. Long duration of target presentation gave rise to a higher level of accuracy and was accompanied by early automatic head orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes in the acoustic cues due to changes in head orientation in order to support online motor control. How to design informative acoustic feedback needs to be carefully studied to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space. PMID:23626532

  3. Crossmodal and incremental perception of audiovisual cues to emotional speech

    NARCIS (Netherlands)

    Barkhuysen, Pashiera; Krahmer, E.J.; Swerts, M.G.J.

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? B

  4. Crossmodal and Incremental Perception of Audiovisual Cues to Emotional Speech

    Science.gov (United States)

    Barkhuysen, Pashiera; Krahmer, Emiel; Swerts, Marc

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? Both experiments reported below are based on tests…

  5. Spatial attention and reading ability: ERP correlates of flanker and cue-size effects in good and poor adult phonological decoders.

    Science.gov (United States)

    Matthews, Allison Jane; Martin, Frances Heritage

    2015-12-01

    To investigate facilitatory and inhibitory processes during selective attention among adults with good (n=17) and poor (n=14) phonological decoding skills, a go/nogo flanker task was completed while EEG was recorded. Participants responded to a middle target letter flanked by compatible or incompatible flankers. The target was surrounded by a small or large circular cue presented either simultaneously or 500 ms prior. Poor decoders showed a greater RT cost for incompatible stimuli preceded by large cues and less RT benefit for compatible stimuli. Poor decoders also showed reduced modulation of ERPs by cue size at left-hemisphere posterior sites (N1) and by flanker compatibility at right-hemisphere posterior sites (N1) and frontal sites (N2), consistent with processing differences in fronto-parietal attention networks. These findings have potential implications for understanding the relationship between spatial attention and phonological decoding in dyslexia. PMID:26562794

  6. The Effect of Auditory and Contextual Emotional Cues on the Ability to Recognise Facial Expressions of Emotion in Healthy Adult Aging

    OpenAIRE

    Duncan, Nikki

    2013-01-01

    Previous emotion recognition studies have suggested an age-related decline in the recognition of facial expressions of emotion. However, these studies often lack ecological validity and do not consider the multiple interacting sensory stimuli that are critical to real-world emotion recognition. In the current study, emotion recognition in everyday life was considered to comprise the interaction between facial expressions, accompanied by an auditory expression and embedded in a situational c...

  7. Setting Goals to Switch between Tasks: Effect of Cue Transparency on Children's Cognitive Flexibility

    Science.gov (United States)

    Chevalier, Nicolas; Blaye, Agnes

    2009-01-01

    Three experiments examined the difficulty of translating cues into verbal representations of task goals by varying the degree of cue transparency (auditory transparent cues, visual transparent cues, visual arbitrary cues) in the Advanced Dimensional Change Card Sort, which requires switching between color- and shape-sorting rules on the basis of…

  8. Signaled two-way avoidance learning using electrical stimulation of the inferior colliculus as negative reinforcement: effects of visual and auditory cues as warning stimuli

    Directory of Open Access Journals (Sweden)

    A.C. Troncoso

    1998-03-01

    The inferior colliculus is a primary relay for the processing of auditory information in the brainstem. It is also part of the so-called brain aversion system, as animals learn to switch off electrical stimulation of this structure. The purpose of the present study was to determine whether associative learning occurs between the aversion induced by electrical stimulation of the inferior colliculus and visual or auditory warning stimuli. Rats implanted with electrodes in the central nucleus of the inferior colliculus were placed inside an open field, and thresholds for the escape response to electrical stimulation of the inferior colliculus were determined. The rats were then placed inside a shuttle-box and submitted to a two-way avoidance paradigm. Electrical stimulation of the inferior colliculus at the escape threshold (98.12 ± 6.15 µA, peak-to-peak) was used as negative reinforcement, with light or tone as the warning stimulus. Each session consisted of 50 trials and was divided into two segments of 25 trials in order to determine the learning rate of the animals within the sessions. The rats learned to avoid the inferior colliculus stimulation when light was used as the warning stimulus (13.25 ± 0.60 s and 8.63 ± 0.93 s for latencies, and 12.5 ± 2.04 and 19.62 ± 1.65 for frequencies, in the first and second halves of the sessions, respectively; P < 0.05 in both cases). Taken together, the present results suggest that rats learn to avoid the inferior colliculus stimulation when light is used as the warning stimulus. However, this learning does not occur when the neutral stimulus is an acoustic one. Electrical stimulation of the inferior colliculus may disturb transmission of the to-be-conditioned stimulus signal from the inferior colliculus to higher brain structures such as the amygdala.

  9. Development of visuo-auditory integration in space and time

    Directory of Open Access Journals (Sweden)

    Monica Gori

    2012-09-01

    Adults integrate multisensory information optimally (e.g., Ernst & Banks, 2002), whereas children are unable to integrate multisensory visual-haptic cues until 8-10 years of age (e.g., Gori, Del Viva, Sandini, & Burr, 2008). Before that age, strong unisensory dominance is present for size and orientation visual-haptic judgments, possibly reflecting a process of cross-sensory calibration between modalities. It is widely recognized that audition dominates time perception, while vision dominates space perception. If the cross-sensory calibration process is necessary for development, then the auditory modality should calibrate vision in a bimodal temporal task, and the visual modality should calibrate audition in a bimodal spatial task. Here we measured visual-auditory integration in both the temporal and the spatial domains, reproducing for the spatial task a child-friendly version of the ventriloquist stimuli used by Alais and Burr (2004) and for the temporal task a child-friendly version of the stimulus used by Burr, Banks and Morrone (2009). Unimodal and bimodal (conflicting or non-conflicting) audio-visual thresholds and PSEs were measured and compared with the Bayesian predictions. In the temporal domain, we found that in both children and adults, audition dominates the bimodal visuo-auditory task in both perceived time and precision thresholds. By contrast, in the visual-auditory spatial task, children younger than 12 years of age show clear visual dominance (on PSEs) and bimodal thresholds higher than the Bayesian prediction. Only in the adult group do bimodal thresholds become optimal. In agreement with previous studies, our results suggest that adult-like visual-auditory behaviour also develops late. Interestingly, the visual dominance for space and the auditory dominance for time might suggest that vision calibrates audition in the spatial visuo-audio task and that audition calibrates vision in the temporal visuo-audio task.
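
The "Bayesian prediction" for bimodal thresholds referenced above follows from maximum-likelihood cue combination: each cue is weighted by its inverse variance, and the predicted bimodal threshold is lower than either unimodal threshold. A minimal Python sketch; the numeric thresholds are illustrative assumptions, not values from the study:

```python
import numpy as np

def mle_combination(sigma_a, sigma_v, est_a, est_v):
    """Maximum-likelihood (Bayesian) combination of two independent cues.

    Weights are inversely proportional to each cue's variance; the combined
    estimate is at least as precise as the better single cue.
    """
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)   # auditory weight
    w_v = 1.0 - w_a                                # visual weight
    combined_est = w_a * est_a + w_v * est_v
    combined_sigma = np.sqrt((sigma_a**2 * sigma_v**2) /
                             (sigma_a**2 + sigma_v**2))
    return combined_est, combined_sigma

# Hypothetical unimodal thresholds (degrees): vision more precise than
# audition, as in a spatial (ventriloquist-style) task.
est, sigma = mle_combination(sigma_a=8.0, sigma_v=4.0, est_a=2.0, est_v=0.0)
```

A bimodal threshold above `combined_sigma` (as found here for children) indicates sub-optimal, non-Bayesian integration.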

  10. Hippocampal-dependent memory in the plus-maze discriminative avoidance task: The role of spatial cues and CA1 activity.

    Science.gov (United States)

    Leão, Anderson H F F; Medeiros, André M; Apolinário, Gênedy K S; Cabral, Alícia; Ribeiro, Alessandra M; Barbosa, Flávio F; Silva, Regina H

    2016-05-01

    The plus-maze discriminative avoidance task (PMDAT) has been used to investigate interactions between aversive memory and an anxiety-like response in rodents. Suitable performance in this task depends on the activity of the basolateral amygdala, similar to other aversive-based memory tasks. However, the role of spatial cues and hippocampal-dependent learning in the performance of PMDAT remains unknown. Here, we investigated the role of proximal and distal cues in the retrieval of this task. Animals tested under misplaced proximal cues had diminished performance, and animals tested under both misplaced proximal cues and absent distal cues could not discriminate the aversive arm. We also assessed the role of the dorsal hippocampus (CA1) in this aversive memory task. Temporary bilateral inactivation of dorsal CA1 was conducted with muscimol (0.05μg, 0.1μg, and 0.2μg) prior to the training session. While the acquisition of the task was not altered, muscimol impaired the performance in the test session and reduced the anxiety-like response in the training session. We also performed a spreading analysis of a fluorophore-conjugated muscimol to confirm selective inhibition of CA1. In conclusion, both distal and proximal cues are required to retrieve the task, with the latter being more relevant to spatial orientation. Dorsal CA1 activity is also required for aversive memory formation in this task, and interfered with the anxiety-like response as well. Importantly, both effects were detected by different parameters in the same paradigm, endorsing the previous findings of independent assessment of aversive memory and anxiety-like behavior in the PMDAT. Taken together, these findings suggest that the PMDAT probably requires an integration of multiple systems for memory formation, resembling an episodic-like memory rather than a pure conditioning behavior. Furthermore, the concomitant and independent assessment of emotionality and memory in rodents is relevant to elucidate

  11. The role of diffusive architectural surfaces on auditory spatial discrimination in performance venues.

    Science.gov (United States)

    Robinson, Philip W; Pätynen, Jukka; Lokki, Tapio; Jang, Hyung Suk; Jeon, Jin Yong; Xiang, Ning

    2013-06-01

    In musical or theatrical performance, some venues allow listeners to individually localize and segregate individual performers, while others produce a well blended ensemble sound. The room acoustic conditions that make this possible, and the psycho-acoustic effects at work are not fully understood. This research utilizes auralizations from measured and simulated performance venues to investigate spatial discrimination of multiple acoustic sources in rooms. Signals were generated from measurements taken in a small theater, and listeners in the audience area were asked to distinguish pairs of speech sources on stage with various spatial separations. This experiment was repeated with the proscenium splay walls treated to be flat, diffusive, or absorptive. Similar experiments were conducted in a simulated hall, utilizing 11 early reflections with various characteristics, and measured late reverberation. The experiments reveal that discriminating the lateral arrangement of two sources is possible at narrower separation angles when reflections come from flat or absorptive rather than diffusive surfaces. PMID:23742348

  12. The Wellcome Prize Lecture. A map of auditory space in the mammalian brain: neural computation and development.

    Science.gov (United States)

    King, A J

    1993-09-01

    The experiments described in this review have demonstrated that the SC contains a two-dimensional map of auditory space, which is synthesized within the brain using a combination of monaural and binaural localization cues. There is also an adaptive fusion of auditory and visual space in this midbrain nucleus, providing for a common access to the motor pathways that control orientation behaviour. This necessitates a highly plastic relationship between the visual and auditory systems, both during postnatal development and in adult life. Because of the independent mobility of different sense organs, gating mechanisms are incorporated into the auditory representation to provide up-to-date information about the spatial orientation of the eyes and ears. The SC therefore provides a valuable model system for studying a number of important issues in brain function, including the neural coding of sound location, the co-ordination of spatial information between different sensory systems, and the integration of sensory signals with motor outputs. PMID:8240794
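
Among the binaural localization cues from which such maps are synthesized is the interaural time difference (ITD). As an illustration (not part of the review itself), Woodworth's classic spherical-head approximation relates ITD to source azimuth; the head radius and speed of sound below are assumed typical values:

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average adult head radius
SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature

def itd_woodworth(azimuth_deg):
    """Interaural time difference (seconds) for a distant source,
    using Woodworth's spherical-head approximation:
        ITD = (a / c) * (theta + sin(theta))
    Valid for azimuths between -90 and +90 degrees (0 = straight ahead).
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# ITD grows monotonically with azimuth, reaching roughly 0.65 ms at 90 degrees
```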

  13. Two- to Eight-Month-Old Infants' Perception of Dynamic Auditory-Visual Spatial Colocation

    OpenAIRE

    Bremner, J. Gavin; Slater, Alan M.; Scott P Johnson; Mason, Ursula; Spring, Joanne; Bremner, Maggie E.

    2011-01-01

    From birth, infants detect associations between the locations of static visual objects and sounds they emit, but there is limited evidence regarding their sensitivity to the dynamic equivalent when a sound-emitting object moves. In 4 experiments involving thirty-six 2-month-olds, forty-eight 5-month-olds, and forty-eight 8-month-olds, we investigated infants' ability to process this form of spatial colocation. Whereas there was no evidence of spontaneous sensitivity, all age groups detected a...

  14. Eye Movements during Auditory Attention Predict Individual Differences in Dorsal Attention Network Activity

    Science.gov (United States)

    Braga, Rodrigo M.; Fu, Richard Z.; Seemungal, Barry M.; Wise, Richard J. S.; Leech, Robert

    2016-01-01

    The neural mechanisms supporting auditory attention are not fully understood. A dorsal frontoparietal network of brain regions is thought to mediate the spatial orienting of attention across all sensory modalities. Key parts of this network, the frontal eye fields (FEF) and the superior parietal lobes (SPL), contain retinotopic maps and elicit saccades when stimulated. This suggests that their recruitment during auditory attention might reflect crossmodal oculomotor processes; however, this has not been confirmed experimentally. Here we investigate whether task-evoked eye movements during an auditory task can predict the magnitude of activity within the dorsal frontoparietal network. A spatial and non-spatial listening task was used with on-line eye-tracking and functional magnetic resonance imaging (fMRI). No visual stimuli or cues were used. The auditory task elicited systematic eye movements, with saccade rate and gaze position predicting attentional engagement and the cued sound location, respectively. Activity associated with these separate aspects of evoked eye movements dissociated between the SPL and FEF. However, these observed eye movements could not account for all the activation in the frontoparietal network. Our results suggest that the recruitment of the SPL and FEF during attentive listening reflects, at least partly, overt crossmodal oculomotor processes during non-visual attention. Further work is needed to establish whether the network's remaining contribution to auditory attention is through covert crossmodal processes, or is directly involved in the manipulation of auditory information. PMID:27242465

  15. Auditory Processing Disorders

    Science.gov (United States)

    Auditory processing disorders (APDs) are referred to by many names: central auditory processing disorders, auditory perceptual disorders, and central auditory disorders. APDs ...

  16. Characterizing spatial tuning functions of neurons in the auditory cortex of young and aged monkeys: A new perspective on old data.

    Directory of Open Access Journals (Sweden)

    James eEngle

    2013-01-01

    Age-related hearing deficits are a leading cause of disability among the aged. While some forms of hearing deficit are peripheral in origin, others are centrally mediated. One such deficit concerns the ability to localize sounds, a critical component for segregating different acoustic objects and events, which depends on the auditory cortex. Recent evidence indicates that in aged animals the normal sharpening of spatial tuning from neurons in primary auditory cortex to the caudal lateral field does not occur as it does in younger animals. As a decrease in inhibition with aging is common in the ascending auditory system, it is possible that this lack of sharpening is due to a decrease in inhibition at different periods within the response, or to reduced inhibition at non-best locations. In this report we found that aged animals did have greater activity throughout the response period, but primarily during the onset of the response. This was most prominent at non-best directions, consistent with the hypothesis that inhibition is a primary mechanism for sharpening spatial tuning curves. We also noted that in aged animals the latency of the response was much shorter than in younger animals, consistent with a decrease in pre-onset inhibition. These results can be interpreted in the context of a failure of the timing and efficiency of feed-forward thalamo-cortical and cortico-cortical circuits in aged animals. Such a mechanism, if generalized across cortical areas, could play a major role in age-related cognitive decline.
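
Spatial tuning sharpness of the kind compared above is often reduced to a single number. One common summary (used here for illustration; not necessarily the authors' specific measure) is the resultant vector of the tuning curve, whose length indexes how concentrated firing is around the best direction:

```python
import numpy as np

def tuning_vector(rates, directions_deg):
    """Summarize a spatial tuning curve by its resultant vector:
    the angle gives the preferred direction, and the length (0 to 1)
    gives sharpness: 1 = all spikes at one direction, 0 = flat/untuned."""
    th = np.radians(directions_deg)
    z = np.sum(rates * np.exp(1j * th)) / np.sum(rates)
    return np.degrees(np.angle(z)) % 360, np.abs(z)

# Illustrative firing rates over 8 speaker directions, peaked at 90 degrees
dirs = np.arange(0, 360, 45)
sharp = np.where(dirs == 90, 10.0, 1.0)
pref, strength = tuning_vector(sharp, dirs)
```

On this measure, the broader onset responses reported for aged animals would show up as a shorter resultant vector at the same preferred direction.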

  17. Intestinal GPS: bile and bicarbonate control cyclic di-GMP to provide Vibrio cholerae spatial cues within the small intestine.

    Science.gov (United States)

    Koestler, Benjamin J; Waters, Christopher M

    2014-01-01

    The second messenger cyclic di-GMP (c-di-GMP) regulates numerous phenotypes in response to environmental stimuli to enable bacteria to transition between different lifestyles. Here we discuss our recent findings that the human pathogen Vibrio cholerae recognizes 2 host-specific signals, bile and bicarbonate, to regulate intracellular c-di-GMP. We have demonstrated that bile acids increase intracellular c-di-GMP to promote biofilm formation. We have also shown that this bile-mediated increase of intracellular c-di-GMP is negated by bicarbonate, and that this interaction is dependent on pH, suggesting that V. cholerae uses these 2 environmental cues to sense and adapt to its relative location in the small intestine. Increased intracellular c-di-GMP by bile is attributed to increased c-di-GMP synthesis by 3 diguanylate cyclases (DGCs) and decreased expression of one phosphodiesterase (PDE) in the presence of bile. The molecular mechanisms by which bile controls the activity of the 3 DGCs and the regulators of bile-mediated transcriptional repression of the PDE are not yet known. Moreover, the impact of varying concentrations of bile and bicarbonate at different locations within the small intestine and the response of V. cholerae to these cues remains unclear. The native microbiome and pharmaceuticals, such as omeprazole, can impact bile and pH within the small intestine, suggesting these are potential unappreciated factors that may alter V. cholerae pathogenesis. PMID:25621620

  18. Oscillations Go the Distance: Low-Frequency Human Hippocampal Oscillations Code Spatial Distance in the Absence of Sensory Cues during Teleportation.

    Science.gov (United States)

    Vass, Lindsay K; Copara, Milagros S; Seyal, Masud; Shahlaie, Kiarash; Farias, Sarah Tomaszewski; Shen, Peter Y; Ekstrom, Arne D

    2016-03-16

    Low-frequency (delta/theta band) hippocampal neural oscillations play prominent roles in computational models of spatial navigation, but their exact function remains unknown. Some theories propose they are primarily generated in response to sensorimotor processing, while others suggest a role in memory-related processing. We directly recorded hippocampal EEG activity in patients undergoing seizure monitoring while they explored a virtual environment containing teleporters. Critically, this manipulation allowed patients to experience movement through space in the absence of visual and self-motion cues. The prevalence and duration of low-frequency hippocampal oscillations were unchanged by this manipulation, indicating that sensorimotor processing was not required to elicit them during navigation. Furthermore, the frequency-wise pattern of oscillation prevalence during teleportation contained spatial information capable of classifying the distance teleported. These results demonstrate that movement-related sensory information is not required to drive spatially informative low-frequency hippocampal oscillations during navigation and suggest a specific function in memory-related spatial updating. PMID:26924436

  19. An ERP Study of Children's Visual Spatial Attention to Different Spatial Scaling Cues

    Institute of Scientific and Technical Information of China (English)

    孙延超; 李秀艳; 高卫星; 许桂春; 杨海英; 刘晓芹

    2011-01-01

    Objective: To characterize children's event-related potentials (ERPs) during visual spatial attention to cues of different spatial extent. Methods: A "cue-target" visual paradigm was adopted in which circles of different sizes cued the extent of the search area; fourteen healthy children were tested, and the early ERP components associated with the different levels of spatial attention were analyzed. Results: As the cued area decreased, reaction times became faster, posterior P1 and N1 amplitudes increased, and anterior P2 amplitudes decreased. Conclusion: In the early stage of spatial attention processing, children rely on the validity of the cue scale to mobilize brain resources, whereas the later stage requires additional resources.

  20. Auditory Display

    DEFF Research Database (Denmark)

    volume. The conference's topics include auditory exploration of data via sonification and audification; real-time monitoring of multivariate data; sound in immersive interfaces and teleoperation; perceptual issues in auditory display; sound in generalized computer interfaces; technologies supporting...

  1. Superior temporal activation in response to dynamic audio-visual emotional cues

    OpenAIRE

    Robins, Diana L.; Hunyadi, Elinora; Schultz, Robert T.

    2008-01-01

    Perception of emotion is critical for successful social interaction, yet the neural mechanisms underlying the perception of dynamic, audiovisual emotional cues are poorly understood. Evidence from language and sensory paradigms suggests that the superior temporal sulcus and gyrus (STS/STG) play a key role in the integration of auditory and visual cues. Emotion perception research has focused on static facial cues; however, dynamic audiovisual (AV) cues mimic real-world social cues more accura...

  2. Auditory agnosia.

    Science.gov (United States)

    Slevc, L Robert; Shell, Alison R

    2015-01-01

    Auditory agnosia refers to impairments in sound perception and identification despite intact hearing, cognitive functioning, and language abilities (reading, writing, and speaking). Auditory agnosia can be general, affecting all types of sound perception, or can be (relatively) specific to a particular domain. Verbal auditory agnosia (also known as (pure) word deafness) refers to deficits specific to speech processing, environmental sound agnosia refers to difficulties confined to non-speech environmental sounds, and amusia refers to deficits confined to music. These deficits can be apperceptive, affecting basic perceptual processes, or associative, affecting the relation of a perceived auditory object to its meaning. This chapter discusses what is known about the behavioral symptoms and lesion correlates of these different types of auditory agnosia (focusing especially on verbal auditory agnosia), evidence for the role of a rapid temporal processing deficit in some aspects of auditory agnosia, and the few attempts to treat the perceptual deficits associated with auditory agnosia. A clear picture of auditory agnosia has been slow to emerge, hampered by the considerable heterogeneity in behavioral deficits, associated brain damage, and variable assessments across cases. Despite this lack of clarity, these striking deficits in complex sound processing continue to inform our understanding of auditory perception and cognition. PMID:25726291

  3. Auditory and Visual Sensations

    CERN Document Server

    Ando, Yoichi

    2010-01-01

    Professor Yoichi Ando, acoustic architectural designer of the Kirishima International Concert Hall in Japan, presents a comprehensive rational-scientific approach to designing performance spaces. His theory is based on systematic psychoacoustical observations of spatial hearing and listener preferences, whose neuronal correlates are observed in the neurophysiology of the human brain. A correlation-based model of neuronal signal processing in the central auditory system is proposed in which temporal sensations (pitch, timbre, loudness, duration) are represented by an internal autocorrelation representation, and spatial sensations (sound location, size, diffuseness related to envelopment) are represented by an internal interaural crosscorrelation function. Together these two internal central auditory representations account for the basic auditory qualities that are relevant for listening to music and speech in indoor performance spaces. Observed psychological and neurophysiological commonalities between auditor...
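
The interaural crosscorrelation function central to Ando's model of spatial sensations is commonly summarized by the IACC: the maximum of the normalized cross-correlation between the two ear signals over lags of roughly ±1 ms. A rough sketch (the signal and parameters below are illustrative assumptions):

```python
import numpy as np

def iacc(left, right, fs, max_lag_ms=1.0):
    """Interaural cross-correlation coefficient: the maximum absolute value
    of the normalized cross-correlation between the ear signals over
    interaural delays of +/- max_lag_ms milliseconds."""
    max_lag = int(fs * max_lag_ms / 1000.0)
    norm = np.sqrt(np.sum(left**2) * np.sum(right**2))
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            c = np.sum(left[lag:] * right[:len(right) - lag])
        else:
            c = np.sum(left[:lag] * right[-lag:])
        best = max(best, abs(c) / norm)
    return best

fs = 48000
rng = np.random.default_rng(0)
sig = rng.standard_normal(fs)          # 1 s of noise as a stand-in signal
# Identical signals at both ears give IACC = 1 (a compact, focused image);
# decorrelated ear signals give IACC near 0 (diffuse, enveloping sound).
coherent = iacc(sig, sig, fs)
```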

  4. Flexible cue use in food-caching birds.

    Science.gov (United States)

    LaDage, Lara D; Roth, Timothy C; Fox, Rebecca A; Pravosudov, Vladimir V

    2009-05-01

    An animal's memory may be limited in capacity, which may result in competition among available memory cues. If such competition exists, natural selection may favor prioritization of different memory cues based on cue reliability and on associated differences in the environment and life history. Food-caching birds store numerous food items and appear to rely on memory to retrieve caches. Previous studies suggested that caching species should always prioritize spatial cues over non-spatial cues when both are available, because non-spatial cues may be unreliable in a changing environment; however, it remains unclear whether non-spatial cues should always be ignored when spatial cues are available. We tested whether mountain chickadees (Poecile gambeli), a food-caching species, prioritize memory for spatial cues over color cues when relocating previously found food in an associative learning task. In training trials, birds were exposed to food in a feeder where both spatial location and color were associated. During subsequent unrewarded test trials, color was dissociated from spatial location. Chickadees showed a significant pattern of inspecting feeders associated with correct color first, prior to visiting correct spatial locations. Our findings argue against the hypothesis that the memory of spatial cues should always take priority over any non-spatial cues, including color cues, in food-caching species, because in our experiment mountain chickadees chose color over spatial cues. Our results thus suggest that caching species may be more flexible in cue use than previously thought, possibly dependent upon the environment and complexity of available cues. PMID:19050946

  5. The role of social cues in the deployment of spatial attention: Head-body relationships automatically activate directional spatial codes in a Simon task

    Directory of Open Access Journals (Sweden)

    Iwona ePomianowska

    2012-02-01

    The role of body orientation in the orienting and allocation of social attention was examined using an adapted Simon paradigm. Participants categorized the facial expression of forward-facing, computer-generated human figures by pressing one of two response keys, each located left or right of the observer's body midline, while the orientation of the stimulus figure's body (trunk, arms, and legs), the task-irrelevant feature of interest, was manipulated (oriented towards the left or right visual hemifield) with respect to the spatial location of the required response. We found that when the orientation of the body was compatible with the required response location, responses were slower than when body orientation was incompatible with the response location. This reverse compatibility effect suggests that body orientation is automatically processed into a directional spatial code, but that this code is based on an integration of head and body orientation within an allocentric frame of reference. Moreover, we argue that this code may be derived from the motion information implied in the image of a figure whose head and body orientation are incongruent. Our results have implications for understanding the nature of the information that affects the allocation of attention for social orienting.

  6. Frequency band-importance functions for auditory and auditory-visual speech recognition

    Science.gov (United States)

    Grant, Ken W.

    2005-04-01

    In many everyday listening environments, speech communication involves the integration of both acoustic and visual speech cues. This is especially true in noisy and reverberant environments where the speech signal is highly degraded, or when the listener has a hearing impairment. Understanding the mechanisms involved in auditory-visual integration is a primary interest of this work. Of particular interest is whether listeners are able to allocate their attention to various frequency regions of the speech signal differently under auditory-visual conditions and auditory-alone conditions. For auditory speech recognition, the most important frequency regions tend to be around 1500-3000 Hz, corresponding roughly to important acoustic cues for place of articulation. The purpose of this study is to determine the most important frequency region under auditory-visual speech conditions. Frequency band-importance functions for auditory and auditory-visual conditions were obtained by having subjects identify speech tokens under conditions where the speech-to-noise ratio of different parts of the speech spectrum is independently and randomly varied on every trial. Point biserial correlations were computed for each separate spectral region and the normalized correlations are interpreted as weights indicating the importance of each region. Relations among frequency-importance functions for auditory and auditory-visual conditions will be discussed.
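
The correlational method described above can be sketched directly: correlate each band's trial-by-trial SNR with the binary recognition score. The simulated experiment below is an illustrative assumption (only band 1 drives accuracy), not the study's data:

```python
import numpy as np

def band_importance(snr_per_band, correct):
    """Per-band importance weights from trial-by-trial data.

    snr_per_band: (n_trials, n_bands) array of the random SNRs applied to
    each spectral band on each trial.
    correct: (n_trials,) array of 0/1 recognition scores.
    The point-biserial correlation (Pearson r with a binary variable)
    between a band's SNR and the score indexes how strongly that band
    drove performance; weights are normalized to sum to 1.
    """
    correct = np.asarray(correct, dtype=float)
    r = np.array([np.corrcoef(snr_per_band[:, b], correct)[0, 1]
                  for b in range(snr_per_band.shape[1])])
    r = np.clip(r, 0, None)          # negative correlations -> zero weight
    return r / r.sum()

# Simulated experiment: band 1 determines accuracy, band 0 is irrelevant.
rng = np.random.default_rng(1)
snrs = rng.uniform(-10, 10, size=(2000, 2))
p_correct = 1 / (1 + np.exp(-0.5 * snrs[:, 1]))
correct = rng.random(2000) < p_correct
weights = band_importance(snrs, correct)
```

Comparing such weight profiles between auditory-alone and auditory-visual conditions reveals whether visible speech shifts which frequency regions listeners rely on.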

  7. Exogenous spatial attention decreases audiovisual integration.

    Science.gov (United States)

    Van der Stoep, N; Van der Stigchel, S; Nijboer, T C W

    2015-02-01

    Multisensory integration (MSI) and spatial attention are both mechanisms through which the processing of sensory information can be facilitated. Studies on the interaction between spatial attention and MSI have mainly focused on the interaction between endogenous spatial attention and MSI. Most of these studies have shown that endogenously attending a multisensory target enhances MSI. It is currently unclear, however, whether and how exogenous spatial attention and MSI interact. In the current study, we investigated the interaction between these two important bottom-up processes in two experiments. In Experiment 1 the target location was task-relevant, and in Experiment 2 the target location was task-irrelevant. Valid or invalid exogenous auditory cues were presented before the onset of unimodal auditory, unimodal visual, and audiovisual targets. We observed reliable cueing effects and multisensory response enhancement in both experiments. To examine whether audiovisual integration was influenced by exogenous spatial attention, the amount of race model violation was compared between exogenously attended and unattended targets. In both Experiment 1 and Experiment 2, a decrease in MSI was observed when audiovisual targets were exogenously attended, compared to when they were not. The interaction between exogenous attention and MSI was less pronounced in Experiment 2. Therefore, our results indicate that exogenous attention diminishes MSI when spatial orienting is relevant. The results are discussed in terms of models of multisensory integration and attention. PMID:25341648
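
The "race model violation" measure used above to detect integration beyond statistical facilitation compares the audiovisual RT distribution with the bound given by Miller's race-model inequality, F_AV(t) <= F_A(t) + F_V(t). A minimal sketch with simulated RTs (the distributions are assumed, not taken from the study):

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, probs=np.arange(0.05, 1.0, 0.05)):
    """Test Miller's race-model inequality: F_AV(t) <= F_A(t) + F_V(t).

    Evaluates the bound (sum of the two unimodal empirical CDFs, capped
    at 1) at the audiovisual RT quantiles. Positive return values at the
    fast quantiles indicate multisensory integration beyond what a race
    between independent unimodal processes can produce.
    """
    ts = np.quantile(rt_av, probs)
    f_a = np.searchsorted(np.sort(rt_a), ts, side="right") / len(rt_a)
    f_v = np.searchsorted(np.sort(rt_v), ts, side="right") / len(rt_v)
    bound = np.minimum(f_a + f_v, 1.0)
    # The empirical AV CDF at its own quantiles is (approximately) probs.
    return probs - bound

# Simulated RTs (ms): the audiovisual condition is faster than either
# unimodal condition, producing violations at the fast quantiles.
rng = np.random.default_rng(2)
rt_a = rng.normal(420, 50, 500)
rt_v = rng.normal(400, 50, 500)
rt_av = rng.normal(330, 40, 500)
violation = race_model_violation(rt_a, rt_v, rt_av)
```

Comparing the amount of violation between exogenously cued and uncued targets, as in the study, then quantifies how attention modulates MSI.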

  8. Fractal Fluctuations in Human Walking: Comparison between Auditory and Visually Guided Stepping

    OpenAIRE

    Terrier, Philippe

    2015-01-01

    In human locomotion, sensorimotor synchronization of gait consists of the coordination of stepping with rhythmic auditory cues (auditory cueing, AC). AC changes the long-range correlations among consecutive strides (fractal dynamics) into anti-correlations. Visual cueing (VC) is the alignment of step lengths with marks on the floor. The effects of VC on the fluctuation structure of walking have not been investigated. Therefore, the objective was to compare the effects of AC and VC on the fluc...

  9. Follow the sign! Top-down contingent attentional capture of masked arrow cues

    OpenAIRE

    Reuss, Heiko; Pohl, Carsten; Kiesel, Andrea; Kunde, Wilfried

    2011-01-01

    Arrow cues and other overlearned spatial symbols automatically orient attention according to their spatial meaning. This renders them similar to exogenous cues that occur at stimulus location. Exogenous cues trigger shifts of attention even when they are presented subliminally. Here, we investigate to what extent the mechanisms underlying the orienting of attention by exogenous cues and by arrow cues are comparable by analyzing the effects of visible and masked arrow cues on attention. In Exp...

  10. Auditory Neuropathy

    Science.gov (United States)

    ... field differ in their opinions about the potential benefits of hearing aids, cochlear implants, and other technologies for people with auditory neuropathy. Some professionals report that hearing aids and personal listening devices such as frequency modulation (FM) systems are ...

  11. Dissociating temporal attention from spatial attention and motor response preparation: A high-density EEG study.

    Science.gov (United States)

    Faugeras, Frédéric; Naccache, Lionel

    2016-01-01

    Engagement of various forms of attention and response preparation determines behavioral performance during stimulus-response tasks. Many studies explored the respective properties and neural signatures of each of these processes. However, very few experiments were conceived to explore their interaction. In the present work we used an auditory target detection task during which both temporal attention on the one side, and spatial attention and motor response preparation on the other side could be explicitly cued. Both cueing effects speeded response times, and showed strictly additive effects. Target ERP analysis revealed modulations of N1 and P3 responses by these two forms of cueing. Cue-target interval analysis revealed two main effects paralleling behavior. First, a typical contingent negative variation (CNV), induced by the cue and resolved immediately after target onset, was found larger for temporal attention cueing than for spatial and motor response cueing. Second, a posterior and late cue-P3 complex showed the reverse profile. Analyses of lateralized readiness potentials (LRP) revealed both patterns of motor response inhibition and activation. Taken together these results help to clarify and disentangle the respective effects of temporal attention on the one hand, and of the combination of spatial attention and motor response preparation on the other hand on brain activity and behavior. PMID:26433120

  12. Surface presentation of biochemical cues for stem cell expansion - Spatial distribution of growth factors and self-assembly of extracellular matrix

    Science.gov (United States)

    Liu, Xingyu

    Despite its great potential applications to stem cell technology and tissue engineering, matrix presentation of biochemical cues such as growth factors and extracellular matrix (ECM) components remains undefined. This is largely due to the difficulty in preserving the bioactivities of signaling molecules and in controlling the spatial distribution, cellular accessibility, molecular orientation, and intermolecular assembly of the biochemical cues. This dissertation comprises two parts that focus on understanding the surface presentation of a growth factor and of ECM components, respectively. It addresses two fundamental questions in stem cell biology using two biomaterials platforms. How does the nanoscale distribution of a growth factor impact signaling activation and cellular behaviors of adult neural stem cells? How does ECM self-assembly impact human embryonic stem cell survival and proliferation? The first question was addressed by the design of a novel quantitative platform that allows FGF-2 to be presented locally as either monomers or clusters when tethered to a polymeric substrate. This substrate-tethered FGF-2 enables switch-like signaling activation in response to dose titration of FGF-2, in contrast to the continuous MAPK activation pattern elicited by soluble FGF-2. Consequently, cell proliferation and spreading were also consistent with this FGF-2 dose-response pattern. We demonstrated that the combination of FGF-2 concentration and its cluster size, rather than concentration alone, governs its biological effect on neural stem cells. The second part of this dissertation was inspired by the challenge that hESCs have extremely low clonal efficiency and that hESC survival is critically dependent on cell-substrate adhesion. We postulated that ECM integrity is a critical factor in preventing hESC anchorage-dependent apoptosis, and that the matrix for feeder-free culture needs to be properly

  13. Self-affirmation in auditory persuasion

    NARCIS (Netherlands)

    Elbert, Sarah; Dijkstra, Arie

    2011-01-01

    Persuasive health information can be presented through an auditory channel. Curiously enough, the effect of voice cues in health persuasion has hardly been studied. Research concerning visual persuasive messages showed that self-affirmation results in a more open-minded reaction to threatening infor

  14. Influence of auditory and audiovisual stimuli on the right-left prevalence effect

    DEFF Research Database (Denmark)

    Vu, Kim-Phuong L; Minakata, Katsumi; Ngo, Mary Kim

    2014-01-01

    vertical coding through use of the spatial-musical association of response codes (SMARC) effect, where pitch is coded in terms of height in space. In Experiment 1, we found a larger right-left prevalence effect for unimodal auditory than visual stimuli. Neutral, non-pitch coded, audiovisual stimuli did not...... result in cross-modal facilitation, but did show evidence of visual dominance. The right-left prevalence effect was eliminated in the presence of SMARC audiovisual stimuli, but the effect influenced horizontal rather than vertical coding. Experiment 2 showed that the influence of the pitch dimension was...... not in terms of influencing response selection on a trial-to-trial basis, but in terms of altering the salience of the task environment. Taken together, these findings indicate that in the absence of salient vertical cues, auditory and audiovisual stimuli tend to be coded along the horizontal...

  15. Two- to eight-month-old infants’ perception of dynamic auditory-visual spatial co-location

    OpenAIRE

    Bremner, J. Gavin; Slater, Alan M.; Scott P Johnson; Mason, Uschi C.; Spring, Jo; Bremner, Maggie E.

    2011-01-01

    From birth, infants detect associations between the locations of static visual objects and sounds they emit, but there is limited evidence regarding their sensitivity to the dynamic equivalent when a sound-emitting object moves. In four experiments involving 36 2-month-olds, 48 5-month-olds and 48 8-month-olds, we investigated infants’ ability to process this form of spatial co-location. Whereas there was no evidence of spontaneous sensitivity, all age groups detected a dynamic co-location du...

  16. Characterization of auditory synaptic inputs to gerbil perirhinal cortex

    Directory of Open Access Journals (Sweden)

    Vibhakar C Kotak

    2015-08-01

    Full Text Available The representation of acoustic cues involves regions downstream from the auditory cortex (ACx). One such area, the perirhinal cortex (PRh), processes sensory signals containing mnemonic information. Therefore, our goal was to assess whether PRh receives auditory inputs from the auditory thalamus (MG) and ACx in an auditory thalamocortical brain slice preparation, and to characterize these afferent-driven synaptic properties. When the MG or ACx was electrically stimulated, synaptic responses were recorded from PRh neurons. Blockade of GABA-A receptors dramatically increased the amplitude of evoked excitatory potentials. Stimulation of the MG or ACx also evoked calcium transients in most PRh neurons. Separately, when Fluoro-Ruby was injected in ACx in vivo, anterogradely labeled axons and terminals were observed in the PRh. Collectively, these data show that the PRh integrates auditory information from the MG and ACx and that auditory-driven inhibition dominates the postsynaptic responses in a non-sensory cortical region downstream from the auditory cortex.

  17. Visual Cues, Verbal Cues and Child Development

    Science.gov (United States)

    Valentini, Nadia

    2004-01-01

    In this article, the author discusses two strategies--visual cues (modeling) and verbal cues (short, accurate phrases) which are related to teaching motor skills in maximizing learning in physical education classes. Both visual and verbal cues are strong influences in facilitating and promoting day-to-day learning. Both strategies reinforce…

  18. Behavioural and neural correlates of auditory attention

    OpenAIRE

    Roberts, Katherine Leonie

    2005-01-01

    The auditory attention skills of alerting, orienting, and executive control were assessed using behavioural and neuroimaging techniques. Initially, an auditory analogue of the visual attention network test (ANT) (Fan, McCandliss, Sommer, Raz, & Posner, 2002) was created and tested alongside the visual ANT in a group of 40 healthy subjects. The results from this study showed similarities between auditory and visual spatial orienting. An fMRI study was conducted to investigate whether the simil...

  19. Virtual adult ears reveal the roles of acoustical factors and experience in auditory space map development.

    Science.gov (United States)

    Campbell, Robert A A; King, Andrew J; Nodal, Fernando R; Schnupp, Jan W H; Carlile, Simon; Doubell, Timothy P

    2008-11-01

    Auditory neurons in the superior colliculus (SC) respond preferentially to sounds from restricted directions to form a map of auditory space. The development of this representation is shaped by sensory experience, but little is known about the relative contribution of peripheral and central factors to the emergence of adult responses. By recording from the SC of anesthetized ferrets at different age points, we show that the map matures gradually after birth; the spatial receptive fields (SRFs) become more sharply tuned and topographic order emerges by the end of the second postnatal month. Principal components analysis of the head-related transfer function revealed that the time course of map development is mirrored by the maturation of the spatial cues generated by the growing head and external ears. However, using virtual acoustic space stimuli, we show that these acoustical changes are not by themselves responsible for the emergence of SC map topography. Presenting stimuli to infant ferrets through virtual adult ears did not improve the order in the representation of sound azimuth in the SC. But by using linear discriminant analysis to compare different response properties across age, we found that the SRFs of infant neurons nevertheless became more adult-like when stimuli were delivered through virtual adult ears. Hence, although the emergence of auditory topography is likely to depend on refinements in neural circuitry, maturation of the structure of the SRFs (particularly their spatial extent) can be largely accounted for by changes in the acoustics associated with growth of the head and ears. PMID:18987192

  20. Effects of spatial configurations on the resolution of spatial working memory.

    Science.gov (United States)

    Mutluturk, Aysu; Boduroglu, Aysecan

    2014-11-01

    Recent research demonstrated that people represent spatial information configurally and that preservation of configural cues at retrieval helps memory for spatial locations (Boduroğlu & Shah, Memory & Cognition, 37(8), 1120-1131, 2009; Jiang, Olson, & Chun, Journal of Experimental Psychology: Learning, Memory, and Cognition, 26(3), 683-702, 2000). The present study investigated the effects of spatial configurations on the resolution of individual location representations. In an open-ended task, participants first studied a set of object locations (three or five locations). Then, in a test display where the available configural cues were manipulated, participants were asked to determine the original location of a target object whose color was auditorially cued. The difference between the reported location and the original location was taken as a measure of spatial resolution. In three experiments, we consistently observed that the resolution of spatial representations was facilitated by the preservation of spatial configurations at retrieval. We argue that participants may be using available configural cues in conjunction with a summary representation (e.g., the centroid) of the original display in the computation of target locations. PMID:24939236

  1. Non-predictive cueing improves accuracy judgments for voluntary and involuntary spatial and feature/shape attention independent of backward masking.

    OpenAIRE

    Pack, Weston David

    2013-01-01

    Many psychophysics investigations have implemented pre-cues to direct an observer's attention to a specific location or feature. There is controversy over the mechanisms of involuntary attention and whether perceptual or decision processes can enhance target detection and identification as measured by accuracy judgments. Through four main experiments, this dissertation research has indicated that both involuntary and voluntary attention improve target identification and localization accuracy ...

  2. The Spatial Contiguity Effect in Multimedia Learning: The Role of Cueing (邻近效应对多媒体学习中图文整合的影响：线索的作用)

    Institute of Scientific and Technical Information of China (English)

    王福兴; 段朝辉; 周宗奎; 陈珺

    2015-01-01

    Text and illustrations that are spatially integrated can improve learners' performance in multimedia learning. In addition, recent studies have shown that cues, e.g. highlighting with color, arrows, or bold typeface, can guide learners' attention and improve learning outcomes. Researchers speculate that placing picture and text close to each other shortens visual search time and reduces cognitive load, thereby enhancing learning. Previous studies also showed that adding cues to learning materials can guide learners' attention and promote the organization and integration of new knowledge. But what are the specific processes underlying the contiguity effect? Do changes in the picture-text layout and the addition of cues affect the allocation of attention? In this study, we expected that contiguity and cueing would affect learners' attention allocation and, in turn, memory test performance. Consequently, integrated text and pictures with cues should attract more fixations and longer dwell time on the task-related areas, and yield higher scores on the retention and transfer tests. Fifty-one college students were recruited from Central China Normal University as participants, and a computer-generated animation depicting the process of lightning formation was used as the experimental material, with red highlighting of text and pictures manipulated as the cue. First, a demographic questionnaire including a prior-knowledge questionnaire was administered to all prospective participants, to ensure that those selected knew little about lightning formation. They were then randomized into four groups, as follows: the integrated text picture with cues, the integrated text

  3. Feeling music: integration of auditory and tactile inputs in musical meter perception.

    Directory of Open Access Journals (Sweden)

    Juan Huang

    Full Text Available Musicians often say that they not only hear, but also "feel" music. To explore the contribution of tactile information in "feeling" musical rhythm, we investigated the degree to which auditory and tactile inputs are integrated in humans performing a musical meter recognition task. Subjects discriminated between two types of sequences, 'duple' (march-like) rhythms and 'triple' (waltz-like) rhythms, presented in three conditions: (1) unimodal inputs (auditory or tactile alone); (2) various combinations of bimodal inputs, where sequences were distributed between the auditory and tactile channels such that a single channel did not produce coherent meter percepts; and (3) simultaneously presented bimodal inputs where the two channels contained congruent or incongruent meter cues. We first show that meter is perceived similarly well (70%-85%) when tactile or auditory cues are presented alone. We next show in the bimodal experiments that auditory and tactile cues are integrated to produce coherent meter percepts. Performance is high (70%-90%) when all of the metrically important notes are assigned to one channel and is reduced to 60% when half of these notes are assigned to one channel. When the important notes are presented simultaneously to both channels, congruent cues enhance meter recognition (90%). Performance drops dramatically when subjects were presented with incongruent auditory cues (10%), as opposed to incongruent tactile cues (60%), demonstrating that auditory input dominates meter perception. We believe that these results are the first demonstration of cross-modal sensory grouping between any two senses.

  4. Behavioral correlates of auditory streaming in rhesus macaques

    OpenAIRE

    Christison-Lagay, Kate L.; Cohen, Yale E.

    2013-01-01

    Perceptual representations of auditory stimuli (i.e., sounds) are derived from the auditory system’s ability to segregate and group the spectral, temporal, and spatial features of auditory stimuli—a process called “auditory scene analysis”. Psychophysical studies have identified several of the principles and mechanisms that underlie a listener’s ability to segregate and group acoustic stimuli. One important psychophysical task that has illuminated many of these principles and mechanisms is th...

  5. Psychology of auditory perception.

    Science.gov (United States)

    Lotto, Andrew; Holt, Lori

    2011-09-01

    Audition is often treated as a 'secondary' sensory system behind vision in the study of cognitive science. In this review, we focus on three seemingly simple perceptual tasks to demonstrate the complexity of perceptual-cognitive processing involved in everyday audition. After providing a short overview of the characteristics of sound and their neural encoding, we present a description of the perceptual task of segregating multiple sound events that are mixed together in the signal reaching the ears. Then, we discuss the ability to localize the sound source in the environment. Finally, we provide some data and theory on how listeners categorize complex sounds, such as speech. In particular, we present research on how listeners weigh multiple acoustic cues in making a categorization decision. One conclusion of this review is that it is time for auditory cognitive science to be developed to match what has been done in vision in order for us to better understand how humans communicate with speech and music. WIREs Cogn Sci 2011, 2, 479-489. DOI: 10.1002/wcs.123. For further resources related to this article, please visit the WIREs website. PMID:26302301

  6. Perception and psychological evaluation for visual and auditory environment based on the correlation mechanisms

    Science.gov (United States)

    Fujii, Kenji

    2002-06-01

    This dissertation introduces the correlation mechanism as a model of processing in visual perception. It is well established that the correlation mechanism is effective for describing subjective attributes in auditory perception. The main result is that the correlation mechanism can be applied to processing in temporal and spatial vision, as well as in audition. (1) A psychophysical experiment was performed on subjective flicker rates for complex waveforms. A remarkable result is that the phenomenon of the missing fundamental is found in temporal vision, analogous to auditory pitch perception. This implies the existence of a correlation mechanism in the visual system. (2) For spatial vision, autocorrelation analysis provides useful measures for describing three primary perceptual properties of visual texture: contrast, coarseness, and regularity. Another experiment showed that the degree of regularity is a salient cue for texture preference judgments. (3) In addition, the autocorrelation function (ACF) and interaural cross-correlation function (IACF) were applied to the analysis of the temporal and spatial properties of environmental noise. It was confirmed that the acoustical properties of aircraft noise and traffic noise are well described. These analyses provided useful parameters, extracted from the ACF and IACF, for assessing subjective annoyance due to noise. Thesis advisor: Yoichi Ando. Copies of this thesis, written in English, can be obtained from Junko Atagi, 6813 Mosonou, Saijo-cho, Higashi-Hiroshima 739-0024, Japan. E-mail address: atagi@urban.ne.jp.
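    The "missing fundamental" phenomenon mentioned above has a classic autocorrelation account in audition, which can be sketched as follows. The harmonics and sample rate here are illustrative assumptions, not the dissertation's stimuli: the ACF of a 400 Hz + 600 Hz complex peaks at the period of the absent 200 Hz fundamental.

```python
import numpy as np

fs = 10_000                      # sample rate (Hz), assumed
t = np.arange(fs) / fs           # 1 s of signal
# Harmonics of 200 Hz with the 200 Hz fundamental itself absent.
x = np.sin(2 * np.pi * 400 * t) + np.sin(2 * np.pi * 600 * t)

# Short-lag autocorrelation over candidate periods.
lags = np.arange(10, 80)
acf = np.array([np.mean(x[:-k] * x[k:]) for k in lags])

best = lags[np.argmax(acf)]      # lag of the first strong ACF peak
pitch = fs / best                # period -> perceived pitch
print(pitch)                     # 200.0 Hz: the missing fundamental
```

The same correlation computation applied to flicker waveforms is what motivates the dissertation's claim of an analogous mechanism in temporal vision.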

  7. Multiple cue use and integration in pigeons (Columba livia).

    Science.gov (United States)

    Legge, Eric L G; Madan, Christopher R; Spetch, Marcia L; Ludvig, Elliot A

    2016-05-01

    Encoding multiple cues can improve the accuracy and reliability of navigation and goal localization. Problems may arise, however, if one cue is displaced and provides information which conflicts with other cues. Here we investigated how pigeons cope with cue conflict by training them to locate a goal relative to two landmarks and then varying the amount of conflict between the landmarks. When the amount of conflict was small, pigeons tended to integrate both cues in their search patterns. When the amount of conflict was large, however, pigeons used information from both cues independently. This context-dependent strategy for resolving spatial cue conflict agrees with Bayes optimal calculations for using information from multiple sources. PMID:26908004
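    The Bayes-optimal calculation referred to above can be sketched for the small-conflict case. This is the textbook precision-weighted average of two Gaussian cues with made-up landmark estimates, not the study's fitted model.

```python
def fuse(mu1, var1, mu2, var2):
    """Precision-weighted (Bayes-optimal) combination of two Gaussian cues."""
    w1, w2 = 1 / var1, 1 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    var = 1 / (w1 + w2)
    return mu, var

# Two landmarks suggest slightly conflicting goal locations (cm, assumed).
mu, var = fuse(10.0, 4.0, 14.0, 4.0)
print(mu, var)  # 12.0 2.0 -- equally reliable cues average, and
                # the fused estimate is less uncertain than either alone
```

Under large conflict, an optimal observer instead infers that the cues arose from separate causes and stops averaging, which matches the pigeons' switch to using the cues independently.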

  8. Discovering Structure in Auditory Input: Evidence from Williams Syndrome

    Science.gov (United States)

    Elsabbagh, Mayada; Cohen, Henri; Karmiloff-Smith, Annette

    2010-01-01

    We examined auditory perception in Williams syndrome by investigating strategies used in organizing sound patterns into coherent units. In Experiment 1, we investigated the streaming of sound sequences into perceptual units, on the basis of pitch cues, in a group of children and adults with Williams syndrome compared to typical controls. We showed…

  9. Auditory priming of frequency and temporal information: Effects of lateralized presentation

    OpenAIRE

    List, Alexandra; Justus, Timothy

    2007-01-01

    Asymmetric distribution of function between the cerebral hemispheres has been widely investigated in the auditory modality. The current approach borrows heavily from visual local-global research in an attempt to determine whether, as in vision, local-global auditory processing is lateralized. In vision, lateralized local-global processing likely relies on spatial frequency information. Drawing analogies between visual spatial frequency and auditory dimensions, two sets of auditory stimuli wer...

  10. Categorical vowel perception enhances the effectiveness and generalization of auditory feedback in human-machine-interfaces.

    Directory of Open Access Journals (Sweden)

    Eric Larson

    Full Text Available Human-machine interface (HMI) designs offer the possibility of improving quality of life for patient populations as well as augmenting normal user function. Despite its pragmatic benefits, auditory feedback remains underutilized for HMI control, in part due to observed limitations in effectiveness. The goal of this study was to determine the extent to which categorical speech perception could be used to improve an auditory HMI. Using surface electromyography, 24 healthy speakers of American English participated in 4 sessions to learn to control an HMI using auditory feedback (provided via vowel synthesis). Participants trained on 3 targets in sessions 1-3 and were tested on 3 novel targets in session 4. An "established categories with text cues" group of eight participants was trained and tested on auditory targets corresponding to standard American English vowels using auditory and text target cues. An "established categories without text cues" group of eight participants was trained and tested on the same targets using only auditory cuing of target vowel identity. A "new categories" group of eight participants was trained and tested on targets that corresponded to vowel-like sounds not part of American English. Analyses of user performance revealed significant effects of session and group (the established categories groups versus the new categories group), and a trend for an interaction between session and group. Results suggest that auditory feedback can be effectively used for HMI operation when paired with established categorical (native) vowel targets with an unambiguous cue.

  11. Effects of an Auditory Lateralization Training in Children Suspected to Central Auditory Processing Disorder

    Science.gov (United States)

    Lotfi, Yones; Moosavi, Abdollah; Bakhshi, Enayatollah; Sadjedi, Hamed

    2016-01-01

    Background and Objectives Central auditory processing disorder [(C)APD] refers to a deficit in the processing of auditory stimuli in the nervous system that is not due to higher-order language or cognitive factors. One of the problems in children with (C)APD is spatial difficulty, which has been overlooked despite its significance. Localization is the auditory ability to detect sound sources in space, and it can help to differentiate the desired speech from other simultaneous sound sources. The aim of this research was to investigate the effects of auditory lateralization training on speech perception in the presence of noise/competing signals in children suspected of (C)APD. Subjects and Methods In this analytical interventional study, 60 children suspected of (C)APD were selected based on multiple auditory processing assessment subtests. They were randomly divided into two groups: a control group (mean age 9.07) and a training group (mean age 9.00). The training program consisted of detecting and pointing to sound sources delivered with interaural time differences under headphones for 12 formal sessions (6 weeks). Spatial word recognition score (WRS) and the monaural selective auditory attention test (mSAAT) were used to follow the effects of the auditory lateralization training. Results This study showed that in the training group, the mSAAT score and spatial WRS in noise (p value≤0.001) improved significantly after the auditory lateralization training. Conclusions We used auditory lateralization training for 6 weeks and showed that it can significantly improve speech understanding in noise. Generalization of these results requires further research.

  12. Auditory Connections and Functions of Prefrontal Cortex

    Directory of Open Access Journals (Sweden)

    Bethany Plakke

    2014-07-01

    Full Text Available The functional auditory system extends from the ears to the frontal lobes, with successively more complex functions occurring as one ascends the hierarchy of the nervous system. Several areas of the frontal lobe receive afferents from both early and late auditory processing regions within the temporal lobe. Afferents from the early part of the cortical auditory system, the auditory belt cortex, which are presumed to carry information regarding auditory features of sounds, project to only a few prefrontal regions and are most dense in the ventrolateral prefrontal cortex (VLPFC). In contrast, projections from the parabelt and the rostral superior temporal gyrus (STG) most likely convey more complex information and target a larger, widespread region of the prefrontal cortex. Neuronal responses reflect these anatomical projections, as some prefrontal neurons exhibit responses to features in acoustic stimuli while other neurons display task-related responses. For example, recording studies in non-human primates indicate that VLPFC is responsive to complex sounds, including vocalizations, and that VLPFC neurons in area 12/47 respond to sounds with similar acoustic morphology. In contrast, neuronal responses during auditory working memory involve a wider region of the prefrontal cortex. In humans, the frontal lobe is involved in auditory detection, discrimination, and working memory. Past research suggests that dorsal and ventral subregions of the prefrontal cortex process different types of information, with dorsal cortex processing spatial/visual information and ventral cortex processing non-spatial/auditory information. While this is apparent in the non-human primate and in some neuroimaging studies, most research in humans indicates that specific task conditions, stimuli, or previous experience may bias the recruitment of specific prefrontal regions, suggesting a more flexible role for the frontal lobe during auditory cognition.

  13. Interaural timing cues do not contribute to the map of space in the ferret superior colliculus: a virtual acoustic space study.

    Science.gov (United States)

    Campbell, Robert A A; Doubell, Timothy P; Nodal, Fernando R; Schnupp, Jan W H; King, Andrew J

    2006-01-01

    In this study, we used individualized virtual acoustic space (VAS) stimuli to investigate the representation of auditory space in the superior colliculus (SC) of anesthetized ferrets. The VAS stimuli were generated by convolving broadband noise bursts with each animal's own head-related transfer function and presented over earphones. Comparison of the amplitude spectra of the free-field and VAS signals and of the spatial receptive fields of neurons recorded in the inferior colliculus with each form of stimulation confirmed that the VAS provided an accurate simulation of sounds presented in the free field. Units recorded in the deeper layers of the SC responded predominantly to virtual sound directions within the contralateral hemifield. In most cases, increasing the sound level resulted in stronger spike discharges and broader spatial receptive fields. However, the preferred sound directions, as defined by the direction of the centroid vector, remained largely unchanged across different levels and, as observed in previous free-field studies, varied topographically in azimuth along the rostrocaudal axis of the SC. We also examined the contribution of interaural time differences (ITDs) to map topography by digitally manipulating the VAS stimuli so that ITDs were held constant while allowing other spatial cues to vary naturally. The response properties of the majority of units, including centroid direction, remained unchanged with fixed ITDs, indicating that sensitivity to this cue is not responsible for tuning to different sound directions. These results are consistent with previous data suggesting that sensitivity to interaural level differences and spectral cues provides the basis for the map of auditory space in the mammalian SC. PMID:16162823
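    The interaural time difference (ITD) cue manipulated in the study is, operationally, the lag that maximizes the interaural cross-correlation. A toy estimate with a synthetic noise burst can be sketched as follows; the sample rate and imposed delay are assumed values, not the paper's VAS stimuli.

```python
import numpy as np

fs = 48_000                         # sample rate (Hz), assumed
rng = np.random.default_rng(1)
left = rng.normal(size=fs // 10)    # 100 ms broadband noise burst

itd_samples = 12                    # ~250 us delay at 48 kHz, assumed
right = np.roll(left, itd_samples)  # right-ear signal lags the left

# Interaural cross-correlation: the lag of the peak estimates the ITD.
max_lag = 48
lags = np.arange(-max_lag, max_lag + 1)
xcorr = [np.dot(left, np.roll(right, -k)) for k in lags]
est = lags[int(np.argmax(xcorr))]
print(est / fs * 1e6)               # estimated ITD in microseconds
```

Holding this peak lag constant across virtual source directions, while letting level and spectral cues vary naturally, is the kind of digital manipulation the study used to isolate the contribution of ITDs.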

  14. Efficacy of the LiSN & Learn Auditory Training Software: randomized blinded controlled study

    Directory of Open Access Journals (Sweden)

    Sharon Cameron

    2012-01-01

    Background: Children with a spatial processing disorder (SPD) require a more favorable signal-to-noise ratio in the classroom because they have difficulty perceiving sound source location cues. Previous research has shown that a novel training program - LiSN & Learn - employing spatialized sound overcomes this deficit. Here we investigate whether improvements in spatial processing ability are specific to the LiSN & Learn training program. Materials and methods: Participants were ten children (aged between 6;0 [years;months] and 9;9) with normal peripheral hearing who were diagnosed as having SPD using the Listening in Spatialized Noise - Sentences test (LiSN-S). In a blinded controlled study, the participants were randomly allocated to train with either the LiSN & Learn or another auditory training program - Earobics - for approximately 15 minutes per day for twelve weeks. Results: There was a significant improvement post-training on the conditions of the LiSN-S that evaluate spatial processing ability for the LiSN & Learn group (p=0.03 to 0.0008, η²=0.75 to 0.95, n=5), but not for the Earobics group (p=0.5 to 0.7, η²=0.1 to 0.04, n=5). Results from questionnaires completed by the participants and their parents and teachers revealed that improvements in real-world listening performance post-training were greater in the LiSN & Learn group than in the Earobics group. Conclusions: LiSN & Learn training improved binaural processing ability in children with SPD, enhancing their ability to understand speech in noise. Exposure to non-spatialized auditory training does not produce similar outcomes, emphasizing the importance of deficit-specific remediation.

  15. Efficacy of the LiSN & Learn auditory training software: randomized blinded controlled study

    Directory of Open Access Journals (Sweden)

    Sharon Cameron

    2012-09-01

    Children with a spatial processing disorder (SPD) require a more favorable signal-to-noise ratio in the classroom because they have difficulty perceiving sound source location cues. Previous research has shown that a novel training program - LiSN & Learn - employing spatialized sound overcomes this deficit. Here we investigate whether improvements in spatial processing ability are specific to the LiSN & Learn training program. Participants were ten children (aged between 6;0 [years;months] and 9;9) with normal peripheral hearing who were diagnosed as having SPD using the Listening in Spatialized Noise - Sentences test (LiSN-S). In a blinded controlled study, the participants were randomly allocated to train with either the LiSN & Learn or another auditory training program - Earobics - for approximately 15 min per day for twelve weeks. There was a significant improvement post-training on the conditions of the LiSN-S that evaluate spatial processing ability for the LiSN & Learn group (P=0.03 to 0.0008, η²=0.75 to 0.95, n=5), but not for the Earobics group (P=0.5 to 0.7, η²=0.1 to 0.04, n=5). Results from questionnaires completed by the participants and their parents and teachers revealed that improvements in real-world listening performance post-training were greater in the LiSN & Learn group than in the Earobics group. LiSN & Learn training improved binaural processing ability in children with SPD, enhancing their ability to understand speech in noise. Exposure to non-spatialized auditory training does not produce similar outcomes, emphasizing the importance of deficit-specific remediation.

  16. Object Cueing System For Infrared Images

    Science.gov (United States)

    Ranganath, H. S.; McIngvale, Pat; Speigle, Scott

    1987-09-01

    This paper considers the design of an object cueing system as a rule-based expert system. The architecture is modular and the control strategy permits dynamic scheduling of tasks. In this approach, results of several algorithms and many object recognition heuristics are combined to achieve better performance levels. The importance of spatial knowledge representation is also discussed.

  17. Nogo stimuli do not receive more attentional suppression or response inhibition than neutral stimuli: evidence from the N2pc, PD and N2 components in a spatial cueing paradigm

    Directory of Open Access Journals (Sweden)

    Caroline Barras

    2016-05-01

    It has been claimed that stimuli sharing the color of the nogo-target are suppressed because of the strong incentive to not process the nogo-target, but we failed to replicate this finding. Participants searched for a color singleton in the target display and indicated its shape when it was in the go color. If the color singleton in the target display was in the nogo color, they had to withhold the response. The target display was preceded by a cue display that also contained a color singleton (the cue). The cue was either in the color of the go or nogo target, or it was in an unrelated, neutral color. With cues in the go color, reaction times (RTs) were shorter when the cue appeared at the same location as the target compared to when it appeared at a different location. Also, electrophysiological recordings showed that an index of attentional selection, the N2pc, was elicited by go cues. Surprisingly, we failed to replicate cueing costs for cues in the nogo color that were originally reported by Anderson and Folk (2012). Consistently, we also failed to find an electrophysiological index of attentional suppression (the PD) for cues in the nogo color. Further, fronto-central ERPs to the cue display showed the same negativity for nogo and neutral stimuli relative to go stimuli, which is at odds with response inhibition and conflict monitoring accounts of the Nogo-N2. Thus, the modified cueing paradigm employed here provides little evidence that features associated with nogo-targets are suppressed at the level of attention or response selection. Rather, nogo-stimuli are efficiently ignored and attention is focused on features that require a response.

  18. How far away is the next basket of eggs? Spatial memory and perceived cues shape aggregation patterns in a leaf beetle.

    Science.gov (United States)

    Stephan, Jörg G; Stenberg, Johan A; Björkman, Christer

    2015-04-01

    Gregarious organisms need to handle the trade-off between increasing food competition and the positive effects of group living, and this is particularly important for ovipositing females. We hypothesized that insect females consider how many conspecifics previously visited a host plant. In a no-choice assay, we show that the gregarious blue willow leaf beetle (Phratora vulgatissima) laid the most eggs and the largest clutches on plants where a sequence of few individual females was released, compared to plants where one or many different females were repeatedly released. Therefore, this species is more sensitive to the indirectly perceived number of conspecifics than the directly perceived number of eggs on a plant. We further hypothesized that females adjust their own intra-plant egg clutch distribution to that of conspecifics and discovered a new behavioral component, i.e., the modulation of distances between clutches. Females adjusted these distances in ways indicating the use of spatial memory, because the largest distance increases were observed on plants with their own clutches, compared to plants with clutches from conspecifics. However, adjustment of aggregation level and distance between clutches occurred only on a suitable, and not on an unsuitable, Salix genotype. We conclude that both behaviors should reduce competition between sibling and non-sibling larvae. PMID:26230012

  19. Cue conflicts in context

    DEFF Research Database (Denmark)

    Boeg Thomsen, Ditte; Poulsen, Mads

    2015-01-01

    When learning their first language, children develop strategies for assigning semantic roles to sentence structures, depending on morphosyntactic cues such as case and word order. Traditionally, comprehension experiments have presented transitive clauses in isolation, and crosslinguistically chil...

  20. Age differences in visual-auditory self-motion perception during a simulated driving task

    Directory of Open Access Journals (Sweden)

    Robert Ramkhalawansingh

    2016-04-01

    Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e. optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e. engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion.

  1. Detecting anxiety and defensiveness from visual and auditory cues.

    Science.gov (United States)

    Harrigan, J A; Harrigan, K M; Sale, B A; Rosenthal, R

    1996-09-01

    Defensive individuals have been shown to differ from nondefensive individuals on a number of physiological and behavioral measures. We report two studies on observers' inferences of defensiveness, and the contribution of communication channels in the inference of defensiveness. Observers judged high and low state anxious segments of high and low trait anxious defensive and nondefensive individuals. Accurate assessments were made of (a) defensiveness, (b) state anxiety, and (c) trait anxiety: Individuals with higher levels of each variable were perceived as more anxious than those with lower levels. Effects for defensiveness and state anxiety were greater in audio-only segments, while effects for trait anxiety were greater in video-only segments. Inferences of defensiveness were greater at higher levels of state anxiety and trait anxiety. Low trait anxious defensive individuals were perceived as more anxious than truly low trait anxious individuals. Results for defensiveness and trait anxiety were replicated in Study 2, and observers' perceptions of state anxiety matched individuals' self-reports: Defensive individuals with maximal differences between high and low state anxiety were seen as more anxious in high state anxiety, while defensive individuals with minimal differences between high and low state anxiety were regarded as less anxious in high state anxiety. PMID:8776883

  2. Cannabis cue reactivity and craving among never, infrequent and heavy cannabis users.

    Science.gov (United States)

    Henry, Erika A; Kaye, Jesse T; Bryan, Angela D; Hutchison, Kent E; Ito, Tiffany A

    2014-04-01

    Substance cue reactivity is theorized as having a significant role in addiction processes, promoting compulsive patterns of drug-seeking and drug-taking behavior. However, research extending this phenomenon to cannabis has been limited. To that end, the goal of the current work was to examine the relationship between cannabis cue reactivity and craving in a sample of 353 participants varying in self-reported cannabis use. Participants completed a visual oddball task whereby neutral, exercise, and cannabis cue images were presented, and a neutral auditory oddball task while event-related brain potentials (ERPs) were recorded. Consistent with past research, greater cannabis use was associated with greater reactivity to cannabis images, as reflected in the P300 component of the ERP, but not to neutral auditory oddball cues. The latter indicates the specificity of cue reactivity differences as a function of substance-related cues and not generalized cue reactivity. Additionally, cannabis cue reactivity was significantly related to self-reported cannabis craving as well as problems associated with cannabis use. Implications for cannabis use and addiction more generally are discussed. PMID:24264815

  3. Auditory imagery: empirical findings.

    Science.gov (United States)

    Hubbard, Timothy L

    2010-03-01

    The empirical literature on auditory imagery is reviewed. Data on (a) imagery for auditory features (pitch, timbre, loudness), (b) imagery for complex nonverbal auditory stimuli (musical contour, melody, harmony, tempo, notational audiation, environmental sounds), (c) imagery for verbal stimuli (speech, text, in dreams, interior monologue), (d) auditory imagery's relationship to perception and memory (detection, encoding, recall, mnemonic properties, phonological loop), and (e) individual differences in auditory imagery (in vividness, musical ability and experience, synesthesia, musical hallucinosis, schizophrenia, amusia) are considered. It is concluded that auditory imagery (a) preserves many structural and temporal properties of auditory stimuli, (b) can facilitate auditory discrimination but interfere with auditory detection, (c) involves many of the same brain areas as auditory perception, (d) is often but not necessarily influenced by subvocalization, (e) involves semantically interpreted information and expectancies, (f) involves depictive components and descriptive components, (g) can function as a mnemonic but is distinct from rehearsal, and (h) is related to musical ability and experience (although the mechanisms of that relationship are not clear). PMID:20192565

  4. Persistent fluctuations in stride intervals under fractal auditory stimulation.

    Science.gov (United States)

    Marmelat, Vivien; Torre, Kjerstin; Beek, Peter J; Daffertshofer, Andreas

    2014-01-01

    Stride sequences of healthy gait are characterized by persistent long-range correlations, which become anti-persistent in the presence of an isochronous metronome. The latter phenomenon is of particular interest because auditory cueing is generally considered to reduce stride variability and may hence be beneficial for stabilizing gait. Complex systems tend to match their correlation structure when synchronizing. In gait training, can one capitalize on this tendency by using a fractal metronome rather than an isochronous one? We examined whether auditory cues with fractal variations in inter-beat intervals yield similar fractal inter-stride interval variability as isochronous auditory cueing in two complementary experiments. In Experiment 1, participants walked on a treadmill while being paced by either an isochronous or a fractal metronome with different variation strengths between beats in order to test whether participants managed to synchronize with a fractal metronome and to determine the necessary amount of variability for participants to switch from anti-persistent to persistent inter-stride intervals. Participants did synchronize with the metronome despite its fractal randomness. The corresponding coefficient of variation of inter-beat intervals was fixed in Experiment 2, in which participants walked on a treadmill while being paced by non-isochronous metronomes with different scaling exponents. As expected, inter-stride intervals showed persistent correlations similar to self-paced walking only when cueing contained persistent correlations. Our results open up a new window to optimize rhythmic auditory cueing for gait stabilization by integrating fractal fluctuations in the inter-beat intervals. PMID:24651455
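    A fractal metronome of the kind described above can be approximated by spectral synthesis: shape the amplitude spectrum of Gaussian noise as 1/f^(β/2), randomize the phases, and invert, so the resulting inter-beat intervals carry persistent long-range correlations (β≈1) at a fixed mean and coefficient of variation. The sketch below is illustrative; the function name and parameter values are assumptions, not those of the study.

```python
import numpy as np

def fractal_intervals(n, beta=1.0, mean_ibi=1.0, cv=0.03, seed=1):
    """Generate n inter-beat intervals whose fluctuations follow a
    1/f**beta power spectrum (beta=0: white noise; beta=1: persistent
    '1/f' fluctuations), with fixed mean and coefficient of variation."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)            # shape the spectrum
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    series = np.fft.irfft(amp * np.exp(1j * phases), n)
    series = (series - series.mean()) / series.std()  # zero mean, unit SD
    return mean_ibi * (1.0 + cv * series)             # impose mean and CV

ibis = fractal_intervals(1024, beta=1.0)  # persistent inter-beat intervals
onsets = np.cumsum(ibis)                  # metronome beat times in seconds
```

    With beta set to 0 the same routine yields an uncorrelated (white-noise) metronome, which makes it easy to compare conditions at an identical coefficient of variation, as in Experiment 2.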

  5. The acoustical cues to sound location in the Guinea pig (cavia porcellus)

    OpenAIRE

    Greene, Nathanial T; Anbuhl, Kelsey L; Williams, Whitney; Tollin, Daniel J.

    2014-01-01

    There are three main acoustical cues to sound location, each attributable to space-and frequency-dependent filtering of the propagating sound waves by the outer ears, head, and torso: Interaural differences in time (ITD) and level (ILD) as well as monaural spectral shape cues. While the guinea pig has been a common model for studying the anatomy, physiology, and behavior of binaural and spatial hearing, extensive measurements of their available acoustical cues are lacking. Here, these cues we...
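    For a first-order sense of the ITD cue mentioned above, a rigid spherical head gives Woodworth's classic approximation, ITD = (r/c)(θ + sin θ) for azimuth θ. The sketch below is purely illustrative: the ~1.5 cm head radius is an assumed, guinea-pig-sized value, not a measurement from this paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air

def woodworth_itd(azimuth_deg, head_radius=0.015):
    """Approximate interaural time difference (in seconds) for a rigid
    spherical head, via Woodworth's formula: ITD = (r/c)(theta + sin(theta))."""
    theta = np.radians(azimuth_deg)
    return (head_radius / SPEED_OF_SOUND) * (theta + np.sin(theta))

# An assumed ~1.5 cm head radius yields maximum ITDs far smaller than
# the ~660 us available to an adult human head.
print(round(woodworth_itd(90) * 1e6))  # microseconds at 90 deg azimuth → 112
```

    The small maximum ITD is one reason smaller-headed species are thought to lean more heavily on interaural level differences and spectral cues.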

  6. Shifts of attention in the early blind: an ERP study of attentional control processes in the absence of visual spatial information

    OpenAIRE

    Van Velzen, J.; Eardley, Alison F.; Forster, B.; Eimer, Martin

    2006-01-01

    To investigate the role of visual spatial information in the control of spatial attention, event-related brain potentials (ERPs) were recorded during a tactile attention task for a group of totally blind participants who were either congenitally blind or had lost vision during infancy, and for an age-matched, sighted control group who performed the task in the dark. Participants had to shift attention to the left or right hand (as indicated by an auditory cue presented at the start of each tr...

  7. Viewpoint-independent contextual cueing effect

    Directory of Open Access Journals (Sweden)

    Taiga Tsuchiai

    2012-06-01

    We usually perceive things in our surroundings as unchanged despite viewpoint changes caused by self-motion. The visual system therefore must have a function to process objects independently of viewpoint. In this study, we examined whether viewpoint-independent spatial layout can be obtained implicitly. For this purpose, we used a contextual cueing effect, a learning effect of spatial layout in visual search displays known to be an implicit effect. We compared the transfer of the contextual cueing effect between cases with and without self-motion by using visual search displays for 3D objects, which changed according to the participant's assumed location for viewing the stimuli. The contextual cueing effect was obtained with self-motion but disappeared when the display changed without self-motion. This indicates that there is an implicit learning effect in spatial coordinates and suggests that the spatial representation of object layouts or scenes can be obtained and updated implicitly. We also showed that binocular disparity plays an important role in the layout representations.

  8. Analysis of Parallel and Transverse Visual Cues on the Gait of Individuals with Idiopathic Parkinson's Disease

    Science.gov (United States)

    de Melo Roiz, Roberta; Azevedo Cacho, Enio Walker; Cliquet, Alberto, Jr.; Barasnevicius Quagliato, Elizabeth Maria Aparecida

    2011-01-01

    Idiopathic Parkinson's disease (IPD) has been defined as a chronic progressive neurological disorder with characteristics that generate changes in gait pattern. Several studies have reported that appropriate external influences, such as visual or auditory cues may improve the gait pattern of patients with IPD. Therefore, the objective of this…

  9. Auditory localisation of conventional and electric cars : laboratory results and implications for cycling safety

    OpenAIRE

    Stelling-Konczak, A. & Hagenzieker, M.P.

    2016-01-01

    When driven at low speeds, cars operating in electric mode have been found to be quieter than conventional cars. As a result, the auditory cues which pedestrians and cyclists use to assess the presence, proximity and location of oncoming traffic may be reduced, posing a safety hazard. This laboratory study examined auditory localisation of conventional and electric cars including vehicle motion paths relevant for cycling activity. Participants (N = 65) in three age groups (16–18, 30–40 and 65–70...

  10. Multisensor image cueing (MUSIC)

    Science.gov (United States)

    Rodvold, David; Patterson, Tim J.

    2002-07-01

    There have been many years of research and development in the Automatic Target Recognition (ATR) community. This development has resulted in numerous algorithms to perform target detection automatically. The morphing of the ATR acronym to Aided Target Recognition provides a succinct commentary regarding the success of the automatic target recognition research. Now that the goal is aided recognition, many of the algorithms which were not able to provide autonomous recognition may now provide valuable assistance in cueing a human analyst where to look in the images under consideration. This paper describes the MUSIC system being developed for the US Air Force to provide multisensor image cueing. The tool works across multiple image phenomenologies and fuses the evidence across the set of available imagery. MUSIC is designed to work with a wide variety of sensors and platforms, and provide cueing to an image analyst in an information-rich environment. The paper concentrates on the current integration of algorithms into an extensible infrastructure to allow cueing in multiple image types.

  11. Composition: Cue Wheel

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2014-01-01

    Cue Rondo is an open composition to be realised by improvising musicians. See more about my composition practise in the entry "Composition - General Introduction". This work is licensed under a Creative Commons "by-nc" License. You may for non-commercial purposes use and distribute it, performance...

  12. Auditory-olfactory integration: congruent or pleasant sounds amplify odor pleasantness.

    Science.gov (United States)

    Seo, Han-Seok; Hummel, Thomas

    2011-03-01

    Even though we often perceive odors while hearing auditory stimuli, surprisingly little is known about auditory-olfactory integration. This study aimed to investigate the influence of auditory cues on ratings of odor intensity and/or pleasantness, with a focus on 2 factors: "congruency" (Experiment 1) and the "halo/horns effect" of auditory pleasantness (Experiment 2). First, in Experiment 1, participants were presented with congruent, incongruent, or neutral sounds before and during the presentation of odor. Participants rated the odors as being more pleasant while listening to a congruent sound than while listening to an incongruent sound. In Experiment 2, participants received pleasant or unpleasant sounds before and during the presentation of either a pleasant or unpleasant odor. The hedonic valence of the sounds transferred to the odors, irrespective of the hedonic tone of the odor itself. The more the participants liked the preceding sound, the more pleasant the subsequent odor became. In contrast, the ratings of odor intensity appeared to be little or not at all influenced by the congruency or hedonic valence of the auditory cue. In conclusion, the present study for the first time provides an empirical demonstration that auditory cues can modulate odor pleasantness. PMID:21163913

  13. Cues for localization in the horizontal plane

    DEFF Research Database (Denmark)

    Jeppesen, Jakob; Møller, Henrik

    2005-01-01

    Spatial localization of sound is often described as unconscious evaluation of cues given by the interaural time difference (ITD) and the spectral information of the sound that reaches the two ears. Our present knowledge suggests the hypothesis that the ITD roughly determines the cone of the perce...... independently in HRTFs used for binaural synthesis. The ITD seems to be dominant for localization in the horizontal plane even when the spectral information is severely degraded....

  14. Auditory-motor learning influences auditory memory for music.

    Science.gov (United States)

    Brown, Rachel M; Palmer, Caroline

    2012-05-01

    In two experiments, we investigated how auditory-motor learning influences performers' memory for music. Skilled pianists learned novel melodies in four conditions: auditory only (listening), motor only (performing without sound), strongly coupled auditory-motor (normal performance), and weakly coupled auditory-motor (performing along with auditory recordings). Pianists' recognition of the learned melodies was better following auditory-only or auditory-motor (weakly coupled and strongly coupled) learning than following motor-only learning, and better following strongly coupled auditory-motor learning than following auditory-only learning. Auditory and motor imagery abilities modulated the learning effects: Pianists with high auditory imagery scores had better recognition following motor-only learning, suggesting that auditory imagery compensated for missing auditory feedback at the learning stage. Experiment 2 replicated the findings of Experiment 1 with melodies that contained greater variation in acoustic features. Melodies that were slower and less variable in tempo and intensity were remembered better following weakly coupled auditory-motor learning. These findings suggest that motor learning can aid performers' auditory recognition of music beyond auditory learning alone, and that motor learning is influenced by individual abilities in mental imagery and by variation in acoustic features. PMID:22271265

  15. Auditory temporal processing skills in musicians with dyslexia.

    Science.gov (United States)

    Bishop-Liebler, Paula; Welch, Graham; Huss, Martina; Thomson, Jennifer M; Goswami, Usha

    2014-08-01

    The core cognitive difficulty in developmental dyslexia involves phonological processing, but adults and children with dyslexia also have sensory impairments. Impairments in basic auditory processing show particular links with phonological impairments, and recent studies with dyslexic children across languages reveal a relationship between auditory temporal processing and sensitivity to rhythmic timing and speech rhythm. As rhythm is explicit in music, musical training might have a beneficial effect on the auditory perception of acoustic cues to rhythm in dyslexia. Here we took advantage of the presence of musicians with and without dyslexia in musical conservatoires, comparing their auditory temporal processing abilities with those of dyslexic non-musicians matched for cognitive ability. Musicians with dyslexia showed equivalent auditory sensitivity to musicians without dyslexia and also showed equivalent rhythm perception. The data support the view that extensive rhythmic experience initiated during childhood (here in the form of music training) can affect basic auditory processing skills which are found to be deficient in individuals with dyslexia. PMID:25044949

  16. Timing the events of directional cueing.

    Science.gov (United States)

    Girardi, Giovanna; Antonucci, Gabriella; Nico, Daniele

    2015-11-01

    To explore the role of temporal context on voluntary orienting of attention, we submitted healthy participants to a spatial cueing task in which cue-target stimulus onset asynchronies (SOAs) were organized according to two parameters: range and central value. Three ranges of SOAs organized around two central SOA values were presented to six groups of participants. Results showed a complex pattern of responses in terms of spatial validity (faster responses to correctly cued targets) and preparatory effect (faster responses at longer SOAs). Responses to validly and neutrally cued targets were affected by the increase in SOA duration if the difference between the longer and shorter SOAs was large. On the contrary, responses to invalidly cued targets did not vary according to SOA manipulations. The observed pattern of cueing effects does not fit the typical description of spatial attention working as a mandatory disengaging-shifting-engaging routine. In contrast, the results rather suggest a mechanism based on the interaction between context-sensitive top-down processes and bottom-up attentional processes. PMID:25468210

  17. Using auditory-visual speech to probe the basis of noise-impaired consonant-vowel perception in dyslexia and auditory neuropathy

    Science.gov (United States)

    Ramirez, Joshua; Mann, Virginia

    2005-08-01

    Both dyslexics and auditory neuropathy (AN) subjects show inferior consonant-vowel (CV) perception in noise, relative to controls. To better understand these impairments, natural acoustic speech stimuli that were masked in speech-shaped noise at various intensities were presented to dyslexic, AN, and control subjects either in isolation or accompanied by visual articulatory cues. AN subjects were expected to benefit from the pairing of visual articulatory cues and auditory CV stimuli, provided that their speech perception impairment reflects a relatively peripheral auditory disorder. Assuming that dyslexia reflects a general impairment of speech processing rather than a disorder of audition, dyslexics were not expected to similarly benefit from an introduction of visual articulatory cues. The results revealed an increased effect of noise masking on the perception of isolated acoustic stimuli by both dyslexic and AN subjects. More importantly, dyslexics showed less effective use of visual articulatory cues in identifying masked speech stimuli and lower visual baseline performance relative to AN subjects and controls. Last, a significant positive correlation was found between reading ability and the ameliorating effect of visual articulatory cues on speech perception in noise. These results suggest that some reading impairments may stem from a central deficit of speech processing.

  18. Mind your pricing cues.

    Science.gov (United States)

    Anderson, Eric; Simester, Duncan

    2003-09-01

    For most of the items they buy, consumers don't have an accurate sense of what the price should be. Ask them to guess how much a four-pack of 35-mm film costs, and you'll get a variety of wrong answers: Most people will underestimate; many will only shrug. Research shows that consumers' knowledge of the market is so far from perfect that it hardly deserves to be called knowledge at all. Yet people happily buy film and other products every day. Is this because they don't care what kind of deal they're getting? No. Remarkably, it's because they rely on retailers to tell them whether they're getting a good price. In subtle and not-so-subtle ways, retailers send signals to customers, telling them whether a given price is relatively high or low. In this article, the authors review several common pricing cues retailers use--"sale" signs, prices that end in 9, signpost items, and price-matching guarantees. They also offer some surprising facts about how--and how well--those cues work. For instance, the authors' tests with several mail-order catalogs reveal that including the word "sale" beside a price can increase demand by more than 50%. The practice of using a 9 at the end of a price to denote a bargain is so common, you'd think customers would be numb to it. Yet in a study the authors did involving a women's clothing catalog, they increased demand by a third just by changing the price of a dress from $34 to $39. Pricing cues are powerful tools for guiding customers' purchasing decisions, but they must be applied judiciously. Used inappropriately, the cues may breach customers' trust, reduce brand equity, and give rise to lawsuits. PMID:12964397

  19. Auditory Integration Training

    Directory of Open Access Journals (Sweden)

    Zahra Jafari

    2002-07-01

    Auditory integration training (AIT) is a hearing enhancement training process for sensory input anomalies found in individuals with autism, attention deficit hyperactive disorder, dyslexia, hyperactivity, learning disability, language impairments, pervasive developmental disorder, central auditory processing disorder, attention deficit disorder, depression, and hyperacute hearing. AIT, recently introduced in the United States, has received much notice of late following the release of The Sound of a Miracle, by Annabel Stehli. In her book, Mrs. Stehli describes before-and-after auditory integration training experiences with her daughter, who was diagnosed at age four as having autism.

  20. Neural coding and perception of pitch in the normal and impaired human auditory system

    DEFF Research Database (Denmark)

    Santurette, Sébastien

    2011-01-01

    Pitch is an important attribute of hearing that allows us to perceive the musical quality of sounds. Besides music perception, pitch contributes to speech communication, auditory grouping, and perceptual segregation of sound sources. In this work, several aspects of pitch perception in humans were...... for a variety of basic auditory tasks, indicating that it may be a crucial measure to consider for hearing-loss characterization. In contrast to hearing-impaired listeners, adults with dyslexia showed no deficits in binaural pitch perception, suggesting intact low-level auditory mechanisms. The second part...... that the use of spectral cues remained plausible. Simulations of auditory-nerve representations of the complex tones further suggested that a spectrotemporal mechanism combining precise timing information across auditory channels might best account for the behavioral data. Overall, this work provides insights...

  1. Feasibility of external rhythmic cueing with the Google Glass for improving gait in people with Parkinson’s disease

    OpenAIRE

    Zhao, Yan; Nonnekes, Jorik; Storcken, Erik J.M.; Janssen, Sabine; van Wegen, Erwin E.H.; Bloem, Bastiaan R.; Dorresteijn, Lucille D.A.; van Vugt, Jeroen P.P.; Heida, Tjitske; van Wezel, Richard J.A.

    2016-01-01

    New mobile technologies like smartglasses can deliver external cues that may improve gait in people with Parkinson’s disease in their natural environment. However, the potential of these devices must first be assessed in controlled experiments. Therefore, we evaluated rhythmic visual and auditory cueing in a laboratory setting with a custom-made application for the Google Glass. Twelve participants (mean age = 66.8; mean disease duration = 13.6 years) were tested at end of dose. We compared s...

  2. Auditory reafferences: The influence of real-time feedback on movement control

    Directory of Open Access Journals (Sweden)

    Christian eKennel

    2015-01-01

    Full Text Available Auditory reafferences are real-time auditory products created by a person’s own movements. Whereas the interdependency of action and perception is generally well studied, the auditory feedback channel and the influence of perceptual processes during movement execution remain largely unconsidered. We argue that movements have a rhythmic character that is closely connected to sound, making it possible to manipulate auditory reafferences online to understand their role in motor control. We examined if step sounds, occurring as a by-product of running, have an influence on the performance of a complex movement task. Twenty participants completed a hurdling task in three auditory feedback conditions: a control condition with normal auditory feedback, a white noise condition in which sound was masked, and a delayed auditory feedback condition. Overall time and kinematic data were collected. Results show that delayed auditory feedback led to a significantly slower overall time and changed kinematic parameters. Our findings complement previous investigations in a natural movement situation with nonartificial auditory cues. Our results support the existing theoretical understanding of action–perception coupling and hold potential for applied work, where naturally occurring movement sounds can be implemented in the motor learning processes.

  3. Only pre-cueing but no retro-cueing effects emerge with masked arrow cues.

    Science.gov (United States)

    Janczyk, Markus; Reuss, Heiko

    2016-05-01

    The impact of masked stimulation on cognitive control processes is investigated with much interest. In many cases, masked stimulation suffices to initiate and employ control processes. Shifts of attention either happen in the external environment or internally, for example, in working memory. In the former, even masked cues (i.e., cues that are presented for a period too short to allow strategic use) were shown efficient for shifting attention to particular locations in pre-cue paradigms. Internal attention shifting can be investigated using retro-cues: long after encoding, a valid cue indicates the location to-be-tested via change detection, and this improves performance (retro-cue effect). In the present experiment, participants performed in both a pre- and a retro-cue task with masked and normally presented cues. While the masked cues benefitted performance in the pre-cue task, they did not in the retro-cue task. These results inform about limits of masked stimulation. PMID:26998561

  4. Overriding auditory attentional capture

    OpenAIRE

    Dalton, Polly; Lavie, Nilli

    2007-01-01

    Attentional capture by color singletons during shape search can be eliminated when the target is not a feature singleton (Bacon & Egeth, 1994). This suggests that a "singleton detection" search strategy must be adopted for attentional capture to occur. Here we find similar effects on auditory attentional capture. Irrelevant high-intensity singletons interfered with an auditory search task when the target itself was also a feature singleton. However, singleton interference was eliminated when ...

  5. [Central auditory prosthesis].

    Science.gov (United States)

    Lenarz, T; Lim, H; Joseph, G; Reuter, G; Lenarz, M

    2009-06-01

    Deaf patients with severe sensory hearing loss can benefit from a cochlear implant (CI), which stimulates the auditory nerve fibers. However, patients who do not have an intact auditory nerve cannot benefit from a CI. The majority of these patients are neurofibromatosis type 2 (NF2) patients who developed neural deafness due to growth or surgical removal of a bilateral acoustic neuroma. The only current solution is the auditory brainstem implant (ABI), which stimulates the surface of the cochlear nucleus in the brainstem. Although the ABI provides improvement in environmental awareness and lip-reading capabilities, only a few NF2 patients have achieved some limited open set speech perception. In the search for alternative procedures our research group in collaboration with Cochlear Ltd. (Australia) developed a human prototype auditory midbrain implant (AMI), which is designed to electrically stimulate the inferior colliculus (IC). The IC has the potential as a new target for an auditory prosthesis as it provides access to neural projections necessary for speech perception as well as a systematic map of spectral information. In this paper the present status of research and development in the field of central auditory prostheses is presented with respect to technology, surgical technique and hearing results as well as the background concepts of ABI and AMI. PMID:19517084

  6. Auditory localisation of conventional and electric cars : laboratory results and implications for cycling safety

    NARCIS (Netherlands)

    Stelling-Konczak, A.; Hagenzieker, M.P.; Commandeur, J.J.F.; Agterberg, M.J.H.; van Wee, B.

    2016-01-01

    When driven at low speeds, cars operating in electric mode have been found to be quieter than conventional cars. As a result, the auditory cues which pedestrians and cyclists use to assess the presence, proximity and location of oncoming traffic may be reduced, posing a safety hazard. This laboratory s

  7. A magnetorheological haptic cue accelerator for manual transmission vehicles

    International Nuclear Information System (INIS)

    This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed haptic cue device can transmit the optimal moment of gear shifting for manual transmission to a driver without requiring the driver's visual attention. As a first step to achieve this goal, an MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking into account spatial limitations, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured and its field-dependent torque and time response are experimentally evaluated. Then the manufactured MR haptic cue device is integrated with the accelerator pedal. A simple virtual vehicle emulating the operation of the engine of a passenger vehicle is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated and control performances are experimentally evaluated and presented in the time domain.

  8. Binaural cues provide for a release from informational masking.

    Science.gov (United States)

    Tolnai, Sandra; Dolležal, Lena-Vanessa; Klump, Georg M

    2015-10-01

    Informational masking (IM) describes the insensitivity of detecting a change in sound features in a complex acoustical environment when such a change could easily be detected in the absence of distracting sounds. IM occurs because of the similarity between deviant sound and distracting sounds (so-called similarity-based IM) and/or stimulus uncertainty stemming from trial-to-trial variability (so-called uncertainty-based IM). IM can be abolished if similarity-based or uncertainty-based IM are minimized. Here, we modulated similarity-based IM using binaural cues. Standard/deviant tones and distracting tones were presented sequentially, and level-increment thresholds were measured. Deviant tones differed from standard tones by a higher sound level. Distracting tones covered a wide range of levels. Standard/deviant tones and distracting tones were characterized by their interaural time difference (ITD), interaural level difference (ILD), or both ITD and ILD. The larger the ITD or ILD was, the better similarity-based IM was overcome. If both interaural differences were applied to standard/deviant tones, the release from IM was larger than when either interaural difference was used. The results show that binaural cues are potent cues to abolish similarity-based IM and that the auditory system makes use of multiple available cues. PMID:26413722
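
    The two interaural cues applied here can be illustrated numerically. Below is a minimal sketch (not from the paper; the delay, attenuation, and test signal are invented for illustration) that estimates the ITD of a stereo pair by cross-correlation and the ILD from the RMS level ratio:

    ```python
    import numpy as np

    def binaural_cues(left, right, fs):
        """Estimate the interaural time difference (ITD, seconds) via
        cross-correlation, and the interaural level difference (ILD, dB)
        via the RMS level ratio of the two ear signals."""
        corr = np.correlate(left, right, mode="full")
        # Positive lag: features in `left` occur later than in `right`.
        lag = np.argmax(corr) - (len(right) - 1)
        itd = lag / fs
        rms = lambda x: np.sqrt(np.mean(x ** 2))
        ild = 20 * np.log10(rms(left) / rms(right))
        return itd, ild

    # Synthetic stereo pair: the right-ear signal lags by 10 samples
    # and is attenuated by 6 dB (both values arbitrary).
    fs = 44100
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(2000)
    left = noise
    right = 0.5 * np.concatenate([np.zeros(10), noise[:-10]])

    itd, ild = binaural_cues(left, right, fs)
    ```

    For this synthetic pair, the estimator recovers a lag of -10 samples (the left signal leads, so the ITD is -10/fs s) and an ILD of roughly +6 dB.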

  9. The acoustical cues to sound location in the rat: Measurements of directional transfer functions

    OpenAIRE

    Koka, Kanthaiah; Read, Heather L.; Tollin, Daniel J.

    2008-01-01

    The acoustical cues for sound location are generated by spatial- and frequency-dependent filtering of propagating sound waves by the head and external ears. Although rats have been a common model system for anatomy, physiology, and psychophysics of localization, there have been few studies of the acoustical cues available to rats. Here, directional transfer functions (DTFs), the directional components of the head-related transfer functions, were measured in six adult rats. The cues to locatio...

  10. The Power Cues (权力线索)

    Institute of Scientific and Technical Information of China (English)

    魏秋江

    2012-01-01

    Power cues are the various kinds of information that people rely on when judging power; they can predict people's thinking and behavior. Besides directly influencing the perception of power in the form of visual and auditory stimuli, power cues can also shape power judgments indirectly via people's mental representations of power in space and number. The specific effects of the various power cues remain contested. Scholars have begun to address the verification, classification, and standardization of existing cues, to validate them from a physiological perspective, and to search for new power cues. Power cues are the internal and external stimuli that people utilize to judge the power of others and themselves. Recognizing people's power is a basic part of social and organizational life: it reduces the likelihood of conflicts within and between groups and allows resources to be assigned effectively. Recognizing power is also important for self-reinforcement and self-definition. Power cues are not only statements of targets' power; they can also be used to predict people's minds and behaviors. Generally speaking, there are two kinds of encoding for the input information, visual and auditory. Visual encoding includes appearance, such as the shape of the face, and behaviors, especially non-verbal behaviors, which usually occur without conscious control but indicate people's power all the more exactly. Auditory encoding includes several parameters of sound, such as formant dispersion (Df), fundamental frequency (F0), variation in F0, intensity, and utterance duration. Some kinds of messages, such as semantic content, differ across the two channels and connect with power at a higher level of cognition. In these three viewpoints, more cues remain to be explored. Surprisingly, there is another odd factor: gender. Research on it reveals a diversity of results, so gender is more of a moderator than a definite power cue, which calls for more attention to interaction effects. Besides, the mental representation of power, which involves mental simulation of space

  11. Overriding auditory attentional capture.

    Science.gov (United States)

    Dalton, Polly; Lavie, Nilli

    2007-02-01

    Attentional capture by color singletons during shape search can be eliminated when the target is not a feature singleton (Bacon & Egeth, 1994). This suggests that a "singleton detection" search strategy must be adopted for attentional capture to occur. Here we find similar effects on auditory attentional capture. Irrelevant high-intensity singletons interfered with an auditory search task when the target itself was also a feature singleton. However, singleton interference was eliminated when the target was not a singleton (i.e., when nontargets were made heterogeneous, or when more than one target sound was presented). These results suggest that auditory attentional capture depends on the observer's attentional set, as does visual attentional capture. The suggestion that hearing might act as an early warning system that would always be tuned to unexpected unique stimuli must therefore be modified to accommodate these strategy-dependent capture effects. PMID:17557587

  12. Perception of aircraft Deviation Cues

    Science.gov (United States)

    Martin, Lynne; Azuma, Ronald; Fox, Jason; Verma, Savita; Lozito, Sandra

    2005-01-01

    To begin to address the need for new displays, required by a future airspace concept to support new roles that will be assigned to flight crews, a study of potentially informative display cues was undertaken. Two cues were tested on a simple plan display - aircraft trajectory and flight corridor. Of particular interest was the speed and accuracy with which participants could detect an aircraft deviating outside its flight corridor. Presence of the trajectory cue significantly reduced participant reaction time to a deviation while the flight corridor cue did not. Although non-significant, the flight corridor cue seemed to have a relationship with the accuracy of participants' judgments rather than their speed. As this is the second of a series of studies, these issues will be addressed further in future studies.

  13. Resizing Auditory Communities

    DEFF Research Database (Denmark)

    Kreutzfeldt, Jacob

    2012-01-01

    Heard through the ears of the Canadian composer and music teacher R. Murray Schafer, the ideal auditory community had the shape of a village. Schafer's work with the World Soundscape Project in the 70s represents an attempt to interpret contemporary environments through musical and auditory...... of sound as an active component in shaping urban environments. As urban conditions spread globally, new scales, shapes and forms of communities appear and call for new distinctions and models in the study and representation of sonic environments. Particularly so, since urban environments

  14. How temporal cues can aid colour constancy

    Science.gov (United States)

    Foster, David H.; Amano, Kinjiro; Nascimento, Sérgio M. C.

    2007-01-01

    Colour constancy assessed by asymmetric simultaneous colour matching usually reveals limited levels of performance in the unadapted eye. Yet observers can readily discriminate illuminant changes on a scene from changes in the spectral reflectances of the surfaces making up the scene. This ability is probably based on judgements of relational colour constancy, in turn based on the physical stability of spatial ratios of cone excitations under illuminant changes. Evidence is presented suggesting that the ability to detect violations in relational colour constancy depends on temporal transient cues. Because colour constancy and relational colour constancy are closely connected, it should be possible to improve estimates of colour constancy by introducing similar transient cues into the matching task. To test this hypothesis, an experiment was performed in which observers made surface-colour matches between patterns presented in the same position in an alternating sequence with period 2 s or, as a control, presented simultaneously, side-by-side. The degree of constancy was significantly higher for sequential presentation, reaching 87% for matches averaged over 20 observers. Temporal cues may offer a useful source of information for making colour-constancy judgements. PMID:17515948

  15. Feasibility of external rhythmic cueing with the Google Glass for improving gait in people with Parkinson's disease.

    Science.gov (United States)

    Zhao, Yan; Nonnekes, Jorik; Storcken, Erik J M; Janssen, Sabine; van Wegen, Erwin E H; Bloem, Bastiaan R; Dorresteijn, Lucille D A; van Vugt, Jeroen P P; Heida, Tjitske; van Wezel, Richard J A

    2016-06-01

    New mobile technologies like smartglasses can deliver external cues that may improve gait in people with Parkinson's disease in their natural environment. However, the potential of these devices must first be assessed in controlled experiments. Therefore, we evaluated rhythmic visual and auditory cueing in a laboratory setting with a custom-made application for the Google Glass. Twelve participants (mean age = 66.8; mean disease duration = 13.6 years) were tested at end of dose. We compared several key gait parameters (walking speed, cadence, stride length, and stride length variability) and freezing of gait for three types of external cues (metronome, flashing light, and optic flow) and a control condition (no-cue). For all cueing conditions, the subjects completed several walking tasks of varying complexity. Seven inertial sensors attached to the feet, legs and pelvis captured motion data for gait analysis. Two experienced raters scored the presence and severity of freezing of gait using video recordings. User experience was evaluated through a semi-open interview. During cueing, a more stable gait pattern emerged, particularly on complicated walking courses; however, freezing of gait did not significantly decrease. The metronome was more effective than rhythmic visual cues and most preferred by the participants. Participants were overall positive about the usability of the Google Glass and willing to use it at home. Thus, smartglasses like the Google Glass could be used to provide personalized mobile cueing to support gait; however, in its current form, auditory cues seemed more effective than rhythmic visual cues. PMID:27113598
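
    The auditory cue that proved most effective here amounts to a fixed-tempo metronome. For illustration, such a click track can be synthesized in a few lines; the sketch below is not taken from the study's Google Glass application, and the tempo, click duration, and sample rate are arbitrary choices:

    ```python
    import numpy as np

    def click_track(bpm, duration_s, fs=44100, click_ms=10, freq=1000.0):
        """Synthesize a metronome: short Hann-windowed tone bursts at a fixed tempo."""
        out = np.zeros(int(duration_s * fs))
        period = int(round(60.0 / bpm * fs))    # samples between click onsets
        n_click = int(click_ms / 1000.0 * fs)   # samples per click
        t = np.arange(n_click) / fs
        click = np.sin(2 * np.pi * freq * t) * np.hanning(n_click)
        for start in range(0, len(out) - n_click, period):
            out[start:start + n_click] += click
        return out

    # A 5-second cue at 110 clicks/min, a plausible target stepping cadence
    cue = click_track(bpm=110, duration_s=5.0)
    ```

    The returned array can be written to a WAV file or streamed to an audio device; one click per step is the usual mapping for rhythmic auditory cueing.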

  16. Smell facilitates auditory contagious yawning in stranger rats.

    Science.gov (United States)

    Moyaho, Alejandro; Rivas-Zamudio, Xaman; Ugarte, Araceli; Eguibar, José R; Valencia, Jaime

    2015-01-01

    Most vertebrates yawn in situations ranging from relaxation to tension, but only humans and other primate species that show mental state attribution skills have been convincingly shown to display yawn contagion. Whether complex forms of empathy are necessary for yawn contagion to occur is still unclear. As empathy is a phylogenetically continuous trait, simple forms of empathy, such as emotional contagion, might be sufficient for non-primate species to show contagious yawning. In this study, we exposed pairs of male rats selected for high yawning to each other through a perforated wall and found that olfactory cues stimulated yawning, whereas visual cues inhibited it. Unexpectedly, cage-mate rats failed to show yawn contagion, although they did show correlated emotional reactivity. In contrast, stranger rats showed auditory contagious yawning and greater rates of smell-facilitated auditory contagious yawning, although they did not show correlated emotional reactivity. Strikingly, they did not show contagious yawning to rats from a low-yawning strain. These findings indicate that contagious yawning may be a widespread trait amongst vertebrates and that mechanisms other than empathy may be involved. We suggest that a communicatory function of yawning may be the mechanism responsible for yawn contagion in rats, as contagiousness was strain-specific and increased with olfactory cues, which are involved in mutual recognition. PMID:25156806

  17. Prefrontal activity predicts monkeys' decisions during an auditory category task

    Directory of Open Access Journals (Sweden)

    Jung Hoon Lee

    2009-06-01

    Full Text Available The neural correlates that relate auditory categorization to aspects of goal-directed behavior, such as decision-making, are not well understood. Since the prefrontal cortex plays an important role in executive function and the categorization of auditory objects, we hypothesized that neural activity in the prefrontal cortex (PFC) should predict an animal's behavioral reports (decisions) during a category task. To test this hypothesis, we analyzed PFC activity that was recorded while monkeys categorized human spoken words (Russ et al., 2008b). We found that activity in the ventrolateral PFC, on average, correlated better with the monkeys' choices than with the auditory stimuli. This finding demonstrates a direct link between PFC activity and behavioral choices during a non-spatial auditory task.

  18. Early visual deprivation severely compromises the auditory sense of space in congenitally blind children.

    Science.gov (United States)

    Vercillo, Tiziana; Burr, David; Gori, Monica

    2016-06-01

    A recent study has shown that congenitally blind adults, who have never had visual experience, are impaired on an auditory spatial bisection task (Gori, Sandini, Martinoli, & Burr, 2014). In this study we investigated how thresholds for auditory spatial bisection and auditory discrimination develop with age in sighted and congenitally blind children (9 to 14 years old). Children performed 2 spatial tasks (minimum audible angle and space bisection) and 1 temporal task (temporal bisection). There was no impairment in the temporal task for blind children but, like adults, they showed severely compromised thresholds for spatial bisection. Interestingly, the blind children also showed lower precision in judging minimum audible angle. These results confirm the adult study and go on to suggest that even simpler auditory spatial tasks are compromised in children, and that this capacity recovers over time. (PsycINFO Database Record). PMID:27228448

  19. Semantic Framing of Speech : Emotional and Topical Cues in Perception of Poorly Specified Speech

    OpenAIRE

    Lidestam, Björn

    2003-01-01

    The general aim of this thesis was to test the effects of paralinguistic (emotional) and prior contextual (topical) cues on perception of poorly specified visual, auditory, and audiovisual speech. The specific purposes were to (1) examine if facially displayed emotions can facilitate speechreading performance; (2) study the mechanism for such facilitation; (3) map information-processing factors that are involved in processing of poorly specified speech; and (4) present a comprehensiv...

  20. The owl’s cochlear nuclei process different sound localization cues

    OpenAIRE

    Konishi, Masakazu; Sullivan, W. Edward; Takahashi, Terry

    1985-01-01

    This paper discusses how the barn owl’s brain stem auditory pathway is divided into two physiologically and anatomically segregated channels for separate processing of interaural phase and intensity cues for sound localization. The paper also points out the power of the ‘‘downstream’’ approach by which the emergence of a higher‐order neuron’s stimulus selectivity can be traced through lower‐order stations.

  1. Reduced recruitment of orbitofrontal cortex to human social chemosensory cues in social anxiety

    OpenAIRE

    Zhou, Wen; HOU, PING; Zhou, Yuxiang; Chen, Denise

    2010-01-01

    Social anxiety refers to the prevalent and debilitating experience of fear and anxiety of being scrutinized in social situations. It originates from both learned (e.g. adverse social conditioning) and innate (e.g. shyness) factors. Research on social anxiety has traditionally focused on negative emotions induced by visual and auditory social cues in socially anxious clinical populations, and posits a dysfunctional orbitofrontal-amygdala circuit as a primary etiological mechanism. Yet as a tra...

  2. At-home training with closed-loop augmented-reality cueing device for improving gait in patients with Parkinson disease

    OpenAIRE

    Rakesh Shukla, PhD; Alok Kumar Dwivedi, PhD; Yoram Baram, PhD; Alberto J. Espay, MD, MSc; Maureen Gartner, RN, MEd; Laura Gaines, BA, CCRC; Andrew P. Duker, MD; Fredy J. Revilla, MD

    2010-01-01

    Shuffling and freezing while walking can impair function in patients with Parkinson disease (PD). Open-loop devices that provide fixed-velocity visual or auditory cues can improve gait but may be unreliable or exacerbate freezing of gait in some patients. We examined the efficacy of a closed-loop, accelerometer-driven, wearable, visual-auditory cueing device in 13 patients with PD with off-state gait impairment at baseline and after 2 weeks of twice daily (30 minute duration) at-home use. We ...

  3. Tuned with a Tune: Talker Normalization via General Auditory Processes.

    Science.gov (United States)

    Laing, Erika J C; Liu, Ran; Lotto, Andrew J; Holt, Lori L

    2012-01-01

    Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker's speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS) of a talker's speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences' LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by non-speech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization. PMID:22737140

  4. Tuned with a tune: Talker normalization via general auditory processes

    Directory of Open Access Journals (Sweden)

    Erika J C Laing

    2012-06-01

    Full Text Available Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker’s speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS) of a talker’s speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences’ LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
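
    The LTAS referred to above is simply a magnitude spectrum averaged over the whole utterance. As a rough numpy-only sketch (the frame length, hop, and toy signal below are invented; the studies themselves used synthesized speech), one common way to compute it:

    ```python
    import numpy as np

    def ltas(signal, fs, frame_len=1024, hop=512):
        """Long-term average spectrum: Hann-windowed magnitude spectra
        averaged across overlapping frames, returned in dB."""
        win = np.hanning(frame_len)
        frames = np.array([signal[i:i + frame_len] * win
                           for i in range(0, len(signal) - frame_len + 1, hop)])
        mag = np.abs(np.fft.rfft(frames, axis=1))
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
        return freqs, 20 * np.log10(np.mean(mag, axis=0) + 1e-12)

    # Toy "talker": energy concentrated near 500 Hz plus a weaker 3 kHz component
    fs = 16000
    t = np.arange(fs) / fs  # 1 second
    sig = np.sin(2 * np.pi * 500 * t) + 0.3 * np.sin(2 * np.pi * 3000 * t)
    freqs, spectrum_db = ltas(sig, fs)
    peak_hz = freqs[np.argmax(spectrum_db)]
    ```

    For this toy signal the LTAS peaks near 500 Hz; comparing two talkers' LTAS within a specific frequency band is the kind of contrast the study manipulated.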

  5. A Transient Auditory Signal Shifts the Perceived Offset Position of a Moving Visual Object

    Directory of Open Access Journals (Sweden)

    Sung-En Chien

    2013-02-01

    Full Text Available Information received from different sensory modalities profoundly influences human perception. For example, changes in the auditory flutter rate induce changes in the apparent flicker rate of a flashing light (Shipley, 1964). In the present study, we investigated whether auditory information would affect the perceived offset position of a moving object. In Experiment 1, a visual object moved toward the center of the computer screen and disappeared abruptly. A transient auditory signal was presented at different times relative to the moment when the object disappeared. The results showed that if the auditory signal was presented before the abrupt offset of the moving object, the perceived final position was shifted backward, implying that the perceived offset position was affected by the transient auditory information. In Experiment 2, we presented the transient auditory signal to either the left or the right ear. The results showed that the perceived offset shifted backward more strongly when the auditory signal was presented to the same side from which the moving object originated. In Experiment 3, we found that the perceived timing of the visual offset was not affected by the spatial relation between the auditory signal and the visual offset. The present results are interpreted as indicating that an auditory signal may influence the offset position of a moving object through both spatial and temporal processes.

  6. A transient auditory signal shifts the perceived offset position of a moving visual object.

    Science.gov (United States)

    Chien, Sung-En; Ono, Fuminori; Watanabe, Katsumi

    2013-01-01

    Information received from different sensory modalities profoundly influences human perception. For example, changes in the auditory flutter rate induce changes in the apparent flicker rate of a flashing light (Shipley, 1964). In the present study, we investigated whether auditory information would affect the perceived offset position of a moving object. In Experiment 1, a visual object moved toward the center of the computer screen and disappeared abruptly. A transient auditory signal was presented at different times relative to the moment when the object disappeared. The results showed that if the auditory signal was presented before the abrupt offset of the moving object, the perceived final position was shifted backward, implying that the perceived visual offset position was affected by the transient auditory information. In Experiment 2, we presented the transient auditory signal to either the left or the right ear. The results showed that the perceived visual offset shifted backward more strongly when the auditory signal was presented to the same side from which the moving object originated. In Experiment 3, we found that the perceived timing of the visual offset was not affected by the spatial relation between the auditory signal and the visual offset. The present results are interpreted as indicating that an auditory signal may influence the offset position of a moving object through both spatial and temporal processes. PMID:23439729

  7. Electrically evoked hearing perception by functional neurostimulation of the central auditory system.

    Science.gov (United States)

    Tatagiba, M; Gharabaghi, A

    2005-01-01

    Perceptional benefits and potential risks of electrical stimulation of the central auditory system are constantly changing due to ongoing developments and technical modifications. Therefore, we would like to introduce current treatment protocols and strategies that might have an impact on functional results of auditory brainstem implants (ABI) in profoundly deaf patients. Patients with bilateral tumours as a result of neurofibromatosis type 2 with complete dysfunction of the eighth cranial nerves are the most frequent candidates for auditory brainstem implants. Worldwide, about 300 patients have already received an ABI through a translabyrinthine or suboccipital approach supported by multimodality electrophysiological monitoring. Patient selection is based on disease course, clinical signs, audiological, radiological and psycho-social criteria. The ABI provides the patients with access to auditory information such as environmental sound awareness together with distinct hearing cues in speech. In addition, this device markedly improves speech reception in combination with lip-reading. Nonetheless, there is only limited open-set speech understanding. Results of hearing function are correlated with electrode design, number of activated electrodes, speech processing strategies, duration of pre-existing deafness and extent of brainstem deformation. Functional neurostimulation of the central auditory system by a brainstem implant is a safe and beneficial procedure, which may considerably improve the quality of life in patients suffering from deafness due to bilateral retrocochlear lesions. The auditory outcome may be improved by a new generation of microelectrodes capable of penetrating the surface of the brainstem to access more directly the auditory neurons. PMID:15986735

  8. Auditory Learning. Dimensions in Early Learning Series.

    Science.gov (United States)

    Zigmond, Naomi K.; Cicci, Regina

    The monograph discusses the psycho-physiological operations for processing of auditory information, the structure and function of the ear, the development of auditory processes from fetal responses through discrimination, language comprehension, auditory memory, and auditory processes related to written language. Disorders of auditory learning…

  9. The auditory characteristics of children with inner auditory canal stenosis.

    Science.gov (United States)

    Ai, Yu; Xu, Lei; Li, Li; Li, Jianfeng; Luo, Jianfen; Wang, Mingming; Fan, Zhaomin; Wang, Haibo

    2016-07-01

    Conclusions: This study shows that the prevalence of auditory neuropathy spectrum disorder (ANSD) in children with inner auditory canal (IAC) stenosis is much higher than in those without IAC stenosis, regardless of whether they have other inner ear anomalies. In addition, the auditory characteristics of ANSD with IAC stenosis are significantly different from those of ANSD without any middle and inner ear malformations. Objectives: To describe the auditory characteristics in children with IAC stenosis and to examine whether a narrow inner auditory canal is associated with ANSD. Method: A total of 21 children with inner auditory canal stenosis participated in this study. A series of auditory tests was administered. A comparative study was also conducted on the auditory characteristics of ANSD, based on whether the children had isolated IAC stenosis. Results: Wave V in the auditory brainstem response (ABR) was absent in all the patients, while a cochlear microphonic (CM) response was detected in 81.1% of ears with stenotic IAC. Sixteen of 19 (84.2%) ears with isolated IAC stenosis had a CM response present on ABR waveforms. There was no significant difference in ANSD characteristics between the children with and without isolated IAC stenosis. PMID:26981851

  10. Learning auditory space: generalization and long-term effects.

    Directory of Open Access Journals (Sweden)

    Catarina Mendonça

    Full Text Available BACKGROUND: Previous findings have shown that humans can learn to localize with altered auditory space cues. Here we analyze such learning processes and their effects up to one month on both localization accuracy and sound externalization. Subjects were trained and retested, focusing on the effects of stimulus type in learning, stimulus type in localization, stimulus position, previous experience, externalization levels, and time. METHOD: We trained listeners in azimuth and elevation discrimination in two experiments. Half participated in the azimuth experiment first and half in the elevation first. In each experiment, half were trained in speech sounds and half in white noise. Retests were performed at several time intervals: just after training and one hour, one day, one week and one month later. In a control condition, we tested the effect of systematic retesting over time with post-tests only after training and either one day, one week, or one month later. RESULTS: With training all participants lowered their localization errors. This benefit was still present one month after training. Participants were more accurate in the second training phase, revealing an effect of previous experience on a different task. Training with white noise led to better results than training with speech sounds. Moreover, the training benefit generalized to untrained stimulus-position pairs. Throughout the post-tests externalization levels increased. In the control condition the long-term localization improvement was not lower without additional contact with the trained sounds, but externalization levels were lower. CONCLUSION: Our findings suggest that humans adapt easily to altered auditory space cues and that such adaptation spreads to untrained positions and sound types. We propose that such learning depends on all available cues, but each cue type might be learned and retrieved differently. The process of localization learning is global, not limited to

  11. Human Perception of Ambiguous Inertial Motion Cues

    Science.gov (United States)

    Zhang, Guan-Lu

    2010-01-01

    Human daily activities on Earth involve motions that elicit both tilt and translation components of the head (i.e. gazing and locomotion). With otolith cues alone, tilt and translation can be ambiguous since both motions can potentially displace the otolithic membrane by the same magnitude and direction. Transitions between gravity environments (i.e. Earth, microgravity and lunar) have been demonstrated to alter the functions of the vestibular system and exacerbate the ambiguity between tilt and translational motion cues. Symptoms of motion sickness and spatial disorientation can impair human performance during critical mission phases. Specifically, Space Shuttle landing records show that particular cases of tilt-translation illusions have impaired the performance of seasoned commanders. This sensorimotor condition is one of many operational risks that may have dire implications on future human space exploration missions. The neural strategy with which the human central nervous system distinguishes ambiguous inertial motion cues remains the subject of intense research. A prevailing theory in the neuroscience field proposes that the human brain is able to formulate a neural internal model of ambiguous motion cues such that tilt and translation components can be perceptually decomposed in order to elicit the appropriate bodily response. The present work uses this theory, known as the GIF resolution hypothesis, as the framework for the experimental hypothesis. Specifically, two novel motion paradigms are employed to validate the neural capacity for ambiguous inertial motion decomposition in ground-based human subjects. The experimental setup involves the Tilt-Translation Sled at the Neuroscience Laboratory of NASA JSC. This two degree-of-freedom motion system is able to tilt subjects in the pitch plane and translate them along the fore-aft axis. Perception data will be gathered through subject verbal reports. Preliminary analysis of perceptual data does not indicate that

  12. Making the invisible visible: verbal but not visual cues enhance visual detection.

    Directory of Open Access Journals (Sweden)

    Gary Lupyan

    Full Text Available BACKGROUND: Can hearing a word change what one sees? Although visual sensitivity is known to be enhanced by attending to the location of the target, perceptual enhancements following cues to the identity of an object have been difficult to find. Here, we show that perceptual sensitivity is enhanced by verbal, but not visual, cues. METHODOLOGY/PRINCIPAL FINDINGS: Participants completed an object detection task in which they made an object-presence or -absence decision to briefly-presented letters. Hearing the letter name prior to the detection task increased perceptual sensitivity (d'). A visual cue in the form of a preview of the to-be-detected letter did not. Follow-up experiments found that the auditory cuing effect was specific to validly cued stimuli. The magnitude of the cuing effect positively correlated with an individual measure of vividness of mental imagery; introducing uncertainty into the position of the stimulus did not reduce the magnitude of the cuing effect, but eliminated the correlation with mental imagery. CONCLUSIONS/SIGNIFICANCE: Hearing a word made otherwise invisible objects visible. Interestingly, seeing a preview of the target stimulus did not similarly enhance detection of the target. These results are compatible with an account in which auditory verbal labels modulate lower-level visual processing. The findings show that a verbal cue in the form of hearing a word can influence even the most elementary visual processing and inform our understanding of how language affects perception.
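    The sensitivity index d' reported in this record is the standard signal-detection measure: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch (the rates below are hypothetical, not values from the study):

    ```python
    from statistics import NormalDist

    def d_prime(hit_rate, fa_rate, eps=1e-4):
        """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
        # Clamp rates away from 0 and 1 so the inverse normal CDF stays finite.
        hit_rate = min(max(hit_rate, eps), 1 - eps)
        fa_rate = min(max(fa_rate, eps), 1 - eps)
        z = NormalDist().inv_cdf
        return z(hit_rate) - z(fa_rate)

    # Hypothetical rates: a valid verbal cue raises hits at a fixed false-alarm rate.
    print(d_prime(0.80, 0.20))  # ≈ 1.68
    print(d_prime(0.65, 0.20))  # lower sensitivity without the cue
    ```

    Because d' separates sensitivity from response bias, a cue that merely made participants more liberal would raise hits and false alarms together and leave d' unchanged.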

  13. Ratchetaxis: Long-Range Directed Cell Migration by Local Cues.

    Science.gov (United States)

    Caballero, David; Comelles, Jordi; Piel, Matthieu; Voituriez, Raphaël; Riveline, Daniel

    2015-12-01

    Directed cell migration is usually thought to depend on the presence of long-range gradients of either chemoattractants or physical properties such as stiffness or adhesion. However, in vivo, chemical or mechanical gradients have not systematically been observed. Here we review recent in vitro experiments, which show that other types of spatial guidance cues can bias cell motility. Experiments introducing local geometrical or mechanical anisotropy into the cell environment, such as adhesive/topographical microratchets or tilted micropillars, show that local and periodic external cues can direct cell motion. Together with modeling, these experiments suggest that cell motility can be viewed as a stochastic phenomenon, which can be biased by various types of local cues, leading to directional migration. PMID:26615123

  14. Compression of auditory space during forward self-motion.

    Directory of Open Access Journals (Sweden)

    Wataru Teramoto

    Full Text Available BACKGROUND: Spatial inputs from the auditory periphery can be changed with movements of the head or whole body relative to the sound source. Nevertheless, humans can perceive a stable auditory environment and appropriately react to a sound source. This suggests that the inputs are reinterpreted in the brain, while being integrated with information on the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation. METHODOLOGY/PRINCIPAL FINDINGS: Participants were passively transported forward/backward at constant accelerations using a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented during the self-motion from one of the loudspeakers when the listener's physical coronal plane reached the location of one of the speakers (null point). In Experiments 1 and 2, the participants indicated in which direction the sound was presented, forward or backward relative to their subjective coronal plane. The results showed that the sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion and that the magnitude of the displacement increased with increasing acceleration. Experiment 3 investigated the structure of the auditory space in the traveling direction during forward self-motion. The sounds were presented at various distances from the null point. The participants indicated the perceived sound location by pointing a rod. All the sounds that were actually located in the traveling direction were perceived as being biased towards the null point. CONCLUSIONS/SIGNIFICANCE: These results suggest a distortion of the auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory spatial

  15. The processing of visual and auditory information for reaching movements.

    Science.gov (United States)

    Glazebrook, Cheryl M; Welsh, Timothy N; Tremblay, Luc

    2016-09-01

    Presenting target and non-target information in different modalities influences target localization if the non-target is within the spatiotemporal limits of perceptual integration. When using auditory and visual stimuli, the influence of a visual non-target on auditory target localization is greater than the reverse. It is not known, however, whether or how such perceptual effects extend to goal-directed behaviours. To gain insight into how audio-visual stimuli are integrated for motor tasks, the kinematics of reaching movements towards visual or auditory targets with or without a non-target in the other modality were examined. When present, the simultaneously presented non-target could be spatially coincident, to the left, or to the right of the target. Results revealed that auditory non-targets did not influence reaching trajectories towards a visual target, whereas visual non-targets influenced trajectories towards an auditory target. Interestingly, the biases induced by visual non-targets were present early in the trajectory and persisted until movement end. Subsequent experimentation indicated that the magnitude of the biases was equivalent whether participants performed a perceptual or motor task, whereas variability was greater for the motor versus the perceptual tasks. We propose that visually induced trajectory biases were driven by the perceived mislocation of the auditory target, which in turn affected both the movement plan and subsequent control of the movement. Such findings provide further evidence of the dominant role visual information processing plays in encoding spatial locations as well as planning and executing reaching action, even when reaching towards auditory targets. PMID:26253323

  16. Coding of auditory space

    OpenAIRE

    Konishi, Masakazu

    2003-01-01

    Behavioral, anatomical, and physiological approaches can be integrated in the study of sound localization in barn owls. Space representation in owls provides a useful example for discussion of place and ensemble coding. Selectivity for space is broad and ambiguous in low-order neurons. Parallel pathways for binaural cues and for different frequency bands converge on high-order space-specific neurons, which encode space more precisely. An ensemble of broadly tuned place-coding neurons may conv...
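    The binaural cues mentioned here are interaural time and level differences. For a rough sense of the time cue's scale, a common back-of-the-envelope model is the Woodworth spherical-head approximation; the sketch below uses human-scale parameters (9 cm head radius, not owl anatomy) purely for illustration:

    ```python
    import math

    def woodworth_itd(azimuth_deg, head_radius_m=0.09, speed_of_sound=343.0):
        """Woodworth approximation of the interaural time difference (seconds)
        for a spherical head: ITD = (a / c) * (theta + sin(theta))."""
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

    for az in (0, 30, 60, 90):
        print(f"{az:3d} deg -> {woodworth_itd(az) * 1e6:6.1f} microseconds")
    ```

    At 90 degrees azimuth this gives about 0.67 ms for a human-sized head; a barn owl's smaller head yields proportionally smaller ITDs, which is one reason owls also exploit interaural level differences across frequency bands.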

  17. Auditory Discrimination and Auditory Sensory Behaviours in Autism Spectrum Disorders

    Science.gov (United States)

    Jones, Catherine R. G.; Happe, Francesca; Baird, Gillian; Simonoff, Emily; Marsden, Anita J. S.; Tregay, Jenifer; Phillips, Rebecca J.; Goswami, Usha; Thomson, Jennifer M.; Charman, Tony

    2009-01-01

    It has been hypothesised that auditory processing may be enhanced in autism spectrum disorders (ASD). We tested auditory discrimination ability in 72 adolescents with ASD (39 childhood autism; 33 other ASD) and 57 IQ and age-matched controls, assessing their capacity for successful discrimination of the frequency, intensity and duration…

  18. Auditory and non-auditory effects of noise on health

    NARCIS (Netherlands)

    Basner, M.; Babisch, W.; Davis, A.; Brink, M.; Clark, C.; Janssen, S.A.; Stansfeld, S.

    2013-01-01

    Noise is pervasive in everyday life and can cause both auditory and non-auditory health effects. Noise-induced hearing loss remains highly prevalent in occupational settings, and is increasingly caused by social noise exposure (eg, through personal music players). Our understanding of molecular mec

  19. Hypermnesia: the role of multiple retrieval cues.

    Science.gov (United States)

    Otani, H; Widner, R L; Whiteman, H L; St Louis, J P

    1999-09-01

    We demonstrate that encoding multiple cues enhances hypermnesia. College students were presented with 36 (Experiment 1) or 60 (Experiments 2 and 3) sets of words and were asked to encode the sets under single- or multiple-cue conditions. In the single-cue conditions, each set consisted of a cue and a target. In the multiple-cue conditions, each set consisted of three cues and a target. Following the presentation of the word sets, the participants received either three cued recall tests (Experiments 1 and 2) or three free recall tests (Experiment 3). With this manipulation, we observed greater hypermnesia in the multiple-cue conditions than in the single-cue conditions. Furthermore, the greater hypermnesic recall resulted from increased reminiscence rather than reduced intertest forgetting. The present findings support the hypothesis that the availability of multiple retrieval cues plays an important role in hypermnesia. PMID:10540821
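    The record's distinction between reminiscence (items gained across successive tests) and intertest forgetting (items lost) reduces to set arithmetic over recall protocols. A minimal sketch with made-up word lists, not data from the study:

    ```python
    def reminiscence_and_forgetting(earlier_test, later_test):
        """Count items gained (reminiscence) and items lost (intertest
        forgetting) between two successive recall tests."""
        t1, t2 = set(earlier_test), set(later_test)
        return len(t2 - t1), len(t1 - t2)

    gained, lost = reminiscence_and_forgetting({"apple", "chair", "river"},
                                               {"apple", "river", "cloud", "stone"})
    print(gained, lost)  # 2 1 -> net recall rises by one item: hypermnesia
    ```

    Hypermnesia occurs when gains exceed losses across tests, which is exactly the pattern the multiple-cue conditions amplified.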

  20. Visual landmarks facilitate rodent spatial navigation in virtual reality environments

    OpenAIRE

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain areas. Virtual reality offers a unique approach to ask whether visual landmark cues alone are sufficient to improve performance in a spatial task. We ...

  1. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers.

    Directory of Open Access Journals (Sweden)

    Nora D Volkow

    Full Text Available Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and (18)FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. Studies to evaluate the clinical significance of methylphenidate's blunting of cue-induced limbic

  2. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers.

    Energy Technology Data Exchange (ETDEWEB)

    Volkow, N.D.; Wang, G.; Volkow, N.D.; Wang, G.-J.; Tomasi, D.; Telang, F.; Fowler, J.S.; Pradhan, K.; Jayne, M.; Logan, J.; Goldstein, R.Z.; Alia-Klein, N.; Wong, C.T.

    2010-07-01

    Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and {sup 18}FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. Studies to evaluate the clinical significance of methylphenidate's blunting of cue

  3. Methylphenidate attenuates limbic brain inhibition after cocaine-cues exposure in cocaine abusers

    International Nuclear Information System (INIS)

    Dopamine (phasic release) is implicated in conditioned responses. Imaging studies in cocaine abusers show decreases in striatal dopamine levels, which we hypothesize may enhance conditioned responses since tonic dopamine levels modulate phasic dopamine release. To test this we assessed the effects of increasing tonic dopamine levels (using oral methylphenidate) on brain activation induced by cocaine-cues in cocaine abusers. Brain metabolism (marker of brain function) was measured with PET and 18FDG in 24 active cocaine abusers tested four times; twice watching a Neutral video (nature scenes) and twice watching a Cocaine-cues video; each video was preceded once by placebo and once by methylphenidate (20 mg). The Cocaine-cues video increased craving to the same extent with placebo (68%) and with methylphenidate (64%). In contrast, SPM analysis of metabolic images revealed that differences between Neutral versus Cocaine-cues conditions were greater with placebo than methylphenidate; whereas with placebo the Cocaine-cues decreased metabolism (p<0.005) in left limbic regions (insula, orbitofrontal, accumbens) and right parahippocampus, with methylphenidate it only decreased in auditory and visual regions, which also occurred with placebo. Decreases in metabolism in these regions were not associated with craving; in contrast the voxel-wise SPM analysis identified significant correlations with craving in anterior orbitofrontal cortex (p<0.005), amygdala, striatum and middle insula (p<0.05). This suggests that methylphenidate's attenuation of brain reactivity to Cocaine-cues is distinct from that involved in craving. Cocaine-cues decreased metabolism in limbic regions (reflects activity over 30 minutes), which contrasts with activations reported by fMRI studies (reflects activity over 2-5 minutes) that may reflect long-lasting limbic inhibition following activation. 
Studies to evaluate the clinical significance of methylphenidate's blunting of cue-induced limbic inhibition

  4. Hypermnesia using auditory input.

    Science.gov (United States)

    Allen, J

    1992-07-01

    The author investigated whether hypermnesia would occur with auditory input. In addition, the author examined the effects of subjects' knowledge that they would later be asked to recall the stimuli. Two groups of 26 subjects each were given three successive recall trials after they listened to an audiotape of 59 high-imagery nouns. The subjects in the uninformed group were not told that they would later be asked to remember the words; those in the informed group were. Hypermnesia was evident, but only in the uninformed group. PMID:1447564

  5. Partial Epilepsy with Auditory Features

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2004-07-01

    Full Text Available The clinical characteristics of 53 sporadic (S) cases of idiopathic partial epilepsy with auditory features (IPEAF) were analyzed and compared to previously reported familial (F) cases of autosomal dominant partial epilepsy with auditory features (ADPEAF) in a study at the University of Bologna, Italy.

  6. The acoustic and perceptual cues affecting melody segregation for listeners with a cochlear implant.

    Directory of Open Access Journals (Sweden)

    Jeremy Marozeau

    2013-11-01

    Full Text Available Our ability to listen selectively to single sound sources in complex auditory environments is termed ‘auditory stream segregation.’ This ability is affected by peripheral disorders such as hearing loss, as well as plasticity in central processing such as occurs with musical training. Brain plasticity induced by musical training can enhance the ability to segregate sound, leading to improvements in a variety of auditory abilities. The melody segregation ability of 12 cochlear-implant recipients was tested using a new method to determine the perceptual distance needed to segregate a simple 4-note melody from a background of interleaved random-pitch distractor notes. In experiment 1, participants rated the difficulty of segregating the melody from distracter notes. Four physical properties of the distracter notes were changed. In experiment 2, listeners were asked to rate the dissimilarity between melody patterns whose notes differed on the four physical properties simultaneously. Multidimensional scaling analysis transformed the dissimilarity ratings into perceptual distances. Regression between physical and perceptual cues then derived the minimal perceptual distance needed to segregate the melody. The most efficient streaming cue for CI users was loudness. For the normal hearing listeners without musical backgrounds, a greater difference on the perceptual dimension correlated to the temporal envelope is needed for stream segregation in CI users. No differences in streaming efficiency were found between the perceptual dimensions linked to the F0 and the spectral envelope. Combined with our previous results in normally-hearing musicians and non-musicians, the results show that differences in training as well as differences in peripheral auditory processing (hearing impairment and the use of a hearing device) influences the way that listeners use different acoustic cues for segregating interleaved musical streams.

  7. The acoustic and perceptual cues affecting melody segregation for listeners with a cochlear implant.

    Science.gov (United States)

    Marozeau, Jeremy; Innes-Brown, Hamish; Blamey, Peter J

    2013-01-01

    Our ability to listen selectively to single sound sources in complex auditory environments is termed "auditory stream segregation." This ability is affected by peripheral disorders such as hearing loss, as well as plasticity in central processing such as occurs with musical training. Brain plasticity induced by musical training can enhance the ability to segregate sound, leading to improvements in a variety of auditory abilities. The melody segregation ability of 12 cochlear-implant recipients was tested using a new method to determine the perceptual distance needed to segregate a simple 4-note melody from a background of interleaved random-pitch distractor notes. In experiment 1, participants rated the difficulty of segregating the melody from distracter notes. Four physical properties of the distracter notes were changed. In experiment 2, listeners were asked to rate the dissimilarity between melody patterns whose notes differed on the four physical properties simultaneously. Multidimensional scaling analysis transformed the dissimilarity ratings into perceptual distances. Regression between physical and perceptual cues then derived the minimal perceptual distance needed to segregate the melody. The most efficient streaming cue for CI users was loudness. For the normal hearing listeners without musical backgrounds, a greater difference on the perceptual dimension correlated to the temporal envelope is needed for stream segregation in CI users. No differences in streaming efficiency were found between the perceptual dimensions linked to the F0 and the spectral envelope. Combined with our previous results in normally-hearing musicians and non-musicians, the results show that differences in training as well as differences in peripheral auditory processing (hearing impairment and the use of a hearing device) influences the way that listeners use different acoustic cues for segregating interleaved musical streams. PMID:24223563
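    The analysis pipeline this record describes (dissimilarity ratings, then multidimensional scaling into perceptual distances) can be sketched with classical (Torgerson) MDS in plain NumPy; the dissimilarity matrix below is invented for illustration and is not data from the study:

    ```python
    import numpy as np

    def classical_mds(D, k=2):
        """Classical (Torgerson) MDS: embed an n x n dissimilarity matrix
        into k dimensions via double-centering and eigendecomposition."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
        B = -0.5 * J @ (D ** 2) @ J           # double-centered squared dissimilarities
        vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
        idx = np.argsort(vals)[::-1][:k]      # keep the k largest
        scale = np.sqrt(np.clip(vals[idx], 0.0, None))
        return vecs[:, idx] * scale           # n x k perceptual coordinates

    # Hypothetical dissimilarity ratings between four melody patterns.
    D = np.array([[0.0, 2.1, 3.0, 4.2],
                  [2.1, 0.0, 1.8, 3.1],
                  [3.0, 1.8, 0.0, 2.0],
                  [4.2, 3.1, 2.0, 0.0]])
    X = classical_mds(D)
    print(X.shape)  # (4, 2)
    ```

    Distances between rows of X approximate the rated dissimilarities; regressing these coordinates on the physical cue values would then yield the perceptual distance per physical unit, as in the study's final step.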

  8. Evidence for auditory localization ability in the turtle.

    Science.gov (United States)

    Lenhardt, M L

    1981-10-01

    Evidence is presented that the semiaquatic turtle Chrysemys scripta and the terrestrial turtle Terrapene carolina major can detect the direction of a tone within their sensitive area of hearing. It is further suggested that not only can these species respond behaviorally to sound without extensive manipulation, but they can also use limited hearing in a problem-solving situation of maze learning. Adult emydid turtles (5 C. scripta, 3 T. carolina) learned a Y-maze with a 500-c/s signal to an invisible open goal box to avoid bright light. All animals performed above chance levels, but it required over 240 trials on the average to reach 60%-correct performance. Computations suggest that binaural cues used by mammals would not be adequately encoded by the primitive auditory systems of the species studied. It is further suggested that these turtles use bone conduction by coupling their ears to the substrate to hear vibrations in the immediate area. This would appear to be a carryover from the ancient reptile stem stock. The poor middle-ear impedance system relegates air-borne sound processing to be a somewhat insensitive limited low-pass system, depending heavily on monaural cues derived from head scanning. Vocal output in these species appears to be spectrally imbalanced with their auditory sensitivity. The role of species-specific vocal signalling is unclear from the present data. PMID:7186502

  9. The Perception of Auditory Motion.

    Science.gov (United States)

    Carlile, Simon; Leung, Johahn

    2016-01-01

    The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative change in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception. PMID:27094029

  10. The Influence of Visual Cues on Sound Externalization

    DEFF Research Database (Denmark)

    Carvajal, Juan Camilo Gil; Santurette, Sébastien; Cubick, Jens; Dau, Torsten

    due to incongruent auditory cues between the recording and playback room during sound reproduction or to an expectation effect from the visual impression of the room. This study investigated the influence of a priori acoustic and visual knowledge of the playback room on sound externalization.......Methods: Eighteen naïve listeners rated the externalization of virtual stimuli in terms of perceived distance, azimuthal localization, and compactness in three rooms: 1) a standard IEC listening room, 2) a small reverberant room, and 3) a large dry room. Before testing, individual BRIRs were recorded in room 1......, V, and AV conditions were much less pronounced. In contrast to distance, localization and compactness judgments were largely room independent, although localization judgments were less accurate and compactness ratings less consistent in conditions V and A than in condition VA. Conclusion: A mismatch...

  11. Responses of mink to auditory stimuli: Prerequisites for applying the ‘cognitive bias’ approach

    DEFF Research Database (Denmark)

    Svendsen, Pernille Maj; Malmkvist, Jens; Halekoh, Ulrich;

    2012-01-01

    The aim of the study was to determine and validate prerequisites for applying a cognitive (judgement) bias approach to assessing welfare in farmed mink (Neovison vison). We investigated discrimination ability and associative learning ability using auditory cues. The mink (n = 15 females) were...... farmed mink in a judgement bias approach would thus appear to be feasible. However several specific issues are to be considered in order to successfully adapt a cognitive bias approach to mink, and these are discussed....

  12. Peripheral Auditory Mechanisms

    CERN Document Server

    Hall, J; Hubbard, A; Neely, S; Tubis, A

    1986-01-01

    How well can we model experimental observations of the peripheral auditory system? What theoretical predictions can we make that might be tested? It was with these questions in mind that we organized the 1985 Mechanics of Hearing Workshop, to bring together auditory researchers to compare models with experimental observations. The workshop forum was inspired by the very successful 1983 Mechanics of Hearing Workshop in Delft [1]. Boston University was chosen as the site of our meeting because of the Boston area's role as a center for hearing research in this country. We made a special effort at this meeting to attract students from around the world, because without students this field will not progress. Financial support for the workshop was provided in part by grant BNS-8412878 from the National Science Foundation. Modeling is a traditional strategy in science and plays an important role in the scientific method. Models are the bridge between theory and experiment. They test the assumptions made in experim...

  13. Visually directed pointing as a function of target distance, direction, and available cues.

    Science.gov (United States)

    Foley, J. M.; Held, R.

    1972-01-01

    In pointing at visual targets without sight of the hand, large errors occur. There is a tendency to overreach targets, and this tendency is much greater (about 25 cm) when convergence is the only cue to distance than when there are many cues (2 to 11 cm). Angular errors of up to 10 deg also occur. These tend to be to the side opposite the sighting eye, when the favored hand is used. The variance of the pointing response with convergence alone is reduced by approximately half with the introduction of several spatial cues. These results are interpreted as indicating that, for a target within the reach of the arm and with convergence alone as a cue, the depth signal produced by the visual system corresponds to a greater distance than that produced when many cues are available. The results are also consistent with the hypothesis that perceived direction tends to approximate direction from the sighting eye.

  14. Evaluation of multimodal ground cues

    DEFF Research Database (Denmark)

    Nordahl, Rolf; Lecuyer, Anatole; Serafin, Stefania; Turchet, Luca; Papetti, Stefano; Fontana, Federico; Visell, Yon

    This chapter presents an array of results on the perception of ground surfaces via multiple sensory modalities, with special attention to non-visual perceptual cues, notably those arising from audition and haptics, as well as interactions between them. It also reviews approaches to combining...

  15. Moving Objects in the Barn Owl's Auditory World.

    Science.gov (United States)

    Langemann, Ulrike; Krumm, Bianca; Liebner, Katharina; Beutelmann, Rainer; Klump, Georg M

    2016-01-01

    Barn owls are keen hunters of moving prey. They have evolved an auditory system with impressive anatomical and physiological specializations for localizing their prey. Here we present behavioural data on the owl's sensitivity for discriminating acoustic motion direction in azimuth that, for the first time, allow a direct comparison of neuronal and perceptual sensitivity for acoustic motion in the same model species. We trained two birds to report a change in motion direction within a series of repeating wideband noise stimuli. For any trial the starting point, motion direction, velocity (53-2400°/s), duration (30-225 ms) and angular range (12-72°) of the noise sweeps were randomized. Each test stimulus had a motion direction being opposite to that of the reference stimuli. Stimuli were presented in the frontal or the lateral auditory space. The angular extent of the motion had a large effect on the owl's discrimination sensitivity allowing a better discrimination for a larger angular range of the motion. In contrast, stimulus velocity or stimulus duration had a smaller, although significant effect. Overall there was no difference in the owls' behavioural performance between "inward" noise sweeps (moving from lateral to frontal) compared to "outward" noise sweeps (moving from frontal to lateral). The owls did, however, respond more often to stimuli with changing motion direction in the frontal compared to the lateral space. The results of the behavioural experiments are discussed in relation to the neuronal representation of motion cues in the barn owl auditory midbrain. PMID:27080662

  16. The antagonism of ghrelin alters the appetitive response to learned cues associated with food.

    Science.gov (United States)

    Dailey, Megan J; Moran, Timothy H; Holland, Peter C; Johnson, Alexander W

    2016-04-15

    The rapid increase in obesity may be partly mediated by an increase in the exposure to cues for food. Food-paired cues play a role in food procurement and intake under conditions of satiety. The mechanism by which this occurs requires characterization, but may involve ghrelin. This orexigenic peptide alters the response to food-paired conditioned stimuli, and neural responses to food images in reward nuclei. Therefore, we tested whether a ghrelin receptor antagonist alters the influence of food-paired cues on the performance of instrumental responses that earn food and the consumption of food itself using tests of Pavlovian-to-instrumental transfer (PIT) and cue potentiated feeding (CPF), respectively. Food-deprived rats received Pavlovian conditioning where an auditory cue was paired with delivery of sucrose solution followed by instrumental conditioning to lever press for sucrose. Following training, rats were given ad libitum access to chow. On test day, rats were injected with the ghrelin receptor antagonist GHRP-6 [D-Lys3] and then tested for PIT or CPF. Disrupting ghrelin signaling enhanced expression of PIT. In addition, GHRP-6 [D-Lys3] impaired the initiation of feeding behavior in CPF without influencing overall intake of sucrose. Finally, in PIT tested rats, enhanced FOS immunoreactivity was revealed following the antagonist in regions thought to underlie PIT; however, the antagonist had no effect on FOS immunoreactivity in CPF tested rats. PMID:26802728

  17. Cues for localization in the horizontal plane

    DEFF Research Database (Denmark)

    Jeppesen, Jakob; Møller, Henrik

    2005-01-01

    manipulated in HRTFs used for binaural synthesis of sound in the horizontal plane. The manipulation of cues resulted in HRTFs with cues ranging from correct combinations of spectral information and ITDs to combinations with severely conflicting cues. Both the ITD and the spectral information seem to be...

  18. Fragrances as Cues for Remembering Words

    Science.gov (United States)

    Eich, James Eric

    1978-01-01

    Results of this experiment suggest that specific encoding of a word is not a necessary condition for cue effectiveness. Results imply that the effect of a nominal fragrance cue arises through the mediation of a functional, implicitly generated semantic cue. (Author/SW)

  19. Cue salience influences the use of height cues in reorientation in pigeons (Columba livia).

    Science.gov (United States)

    Du, Yu; Mahdi, Nuha; Paul, Breanne; Spetch, Marcia L

    2016-07-01

    Although orienting ability has been examined with numerous types of cues, most research has focused only on cues from the horizontal plane. The current study investigated pigeons' use of wall height, a vertical cue, in an open-field task and compared it with their use of horizontal cues. Pigeons were trained to locate food in 2 diagonal corners of a rectangular enclosure with 2 opposite high walls as height cues. Before each trial, pigeons were rotated to disorient them. In training, pigeons could use either the horizontal cues from the rectangular enclosure or the height information from the walls to locate the food. In testing, the apparatus was modified to provide (a) horizontal cues only, (b) height cues only, and (c) both height and horizontal cues in conflict. In Experiment 1 the low and high walls, respectively, were 40 and 80 cm, whereas in Experiment 2 they were made more perceptually salient by shortening them to 20 and 40 cm. Pigeons accurately located the goal corners with horizontal cues alone in both experiments, but they searched accurately with height cues alone only in Experiment 2. When the height cues conflicted with horizontal cues, pigeons preferred the horizontal cues over the height cues in Experiment 1 but not in Experiment 2, suggesting that perceptual salience influences the relative weighting of cues. (PsycINFO Database Record) PMID:27379717

  20. Enhanced representation of spectral contrasts in the primary auditory cortex

    Directory of Open Access Journals (Sweden)

    Nicolas eCatz

    2013-06-01

    The role of early auditory processing may be to extract some elementary features from an acoustic mixture in order to organize the auditory scene. To accomplish this task, the central auditory system may rely on the fact that sensory objects are often composed of spectral edges, i.e., regions where the stimulus energy changes abruptly over frequency. The processing of acoustic stimuli may benefit from a mechanism enhancing the internal representation of spectral edges. While the visual system is thought to rely heavily on this mechanism (enhancing spatial edges), it is still unclear whether a related process plays a significant role in audition. We investigated the cortical representation of spectral edges, using acoustic stimuli composed of multi-tone pips whose time-averaged spectral envelope contained suppressed or enhanced regions. Importantly, the stimuli were designed such that neural response properties could be assessed as a function of stimulus frequency during stimulus presentation. Our results suggest that the representation of acoustic spectral edges is enhanced in the auditory cortex, and that this enhancement is sensitive to the characteristics of the spectral contrast profile, such as depth, sharpness, and width. Spectral edges are maximally enhanced for sharp contrast and large depth. Cortical activity was also suppressed at frequencies within the suppressed region. Of note, the suppression of firing was larger at frequencies near the lower edge of the suppressed region than at the upper edge. Overall, the present study gives critical insights into the processing of spectral contrasts in the auditory system.
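
    Edge enhancement of this kind is reminiscent of lateral inhibition, which can be sketched as a center-surround filter across frequency channels. The step-shaped spectral profile and the kernel weights below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def lateral_inhibition(profile, inhibition=0.5):
    """Center-surround filtering across frequency channels: each channel
    is excited by its own input and inhibited by its two neighbours
    (zero energy is assumed beyond the boundaries)."""
    kernel = np.array([-inhibition, 1 + 2 * inhibition, -inhibition])
    return np.convolve(profile, kernel, mode="same")

# Spectral envelope with a suppressed middle band (abrupt spectral edges)
profile = np.array([1., 1., 1., 1., 0., 0., 0., 0., 1., 1., 1., 1.])
out = lateral_inhibition(profile)
print(out)
```

    Flat regions pass through unchanged, while the channels flanking each spectral edge overshoot and undershoot, i.e., the edge representation is enhanced relative to the interior.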

  1. Flexible echolocation behavior of trawling bats during approach of continuous or transient prey cues.

    Science.gov (United States)

    Ubernickel, Kirstin; Tschapka, Marco; Kalko, Elisabeth K V

    2013-01-01

    Trawling bats use echolocation not only to detect and classify acoustically continuous cues originating from insects at and above water surfaces, but also to detect small water-dwelling prey items breaking the water surface for a very short time, producing only transient cues to be perceived acoustically. Generally, bats need to adjust their echolocation behavior to the specific task at hand, and because of the diversity of prey cues they use in hunting, trawling bats should be highly flexible in their echolocation behavior. We studied the adaptations in the behavior of Noctilio leporinus when approaching either a continuous cue or a transient cue that disappeared during the approach of the bat. Normally the bats reacted by dipping their feet in the water at the cue location. We found that the bats typically started to adapt their calling behavior at approximately 410 ms before prey contact in continuous cue trials, but were also able to adapt their approach behavior to stimulus onsets as short as 177 ms before contact, within a minimum reaction time of 50.9 ms in response to transient cues. In both tasks the approach phase ended between 32 and 53 ms before prey contact. Call emission always continued after the end of the approach phase until around prey contact. In some failed capture attempts, call emission did not cease at all after prey contact. The bats probably used spatial memory to dip at the original location of the transient cue after its disappearance. The duration of the pointed dips was significantly longer in transient cue trials than in continuous cue trials. Our results suggest that trawling bats possess the ability to modify their generally rather stereotyped echolocation behavior during approaches within very short reaction times depending on the sensory information available. PMID:23675352

  2. Acute stress switches spatial navigation strategy from egocentric to allocentric in a virtual Morris water maze.

    Science.gov (United States)

    van Gerven, Dustin J H; Ferguson, Thomas; Skelton, Ronald W

    2016-07-01

    Stress and stress hormones are known to influence the function of the hippocampus, a brain structure critical for cognitive-map-based, allocentric spatial navigation. The caudate nucleus, a brain structure critical for stimulus-response-based, egocentric navigation, is not as sensitive to stress. Evidence for this comes from rodent studies, which show that acute stress or stress hormones impair allocentric, but not egocentric navigation. However, there have been few studies investigating the effect of acute stress on human spatial navigation, and the results of these have been equivocal. To date, no study has investigated whether acute stress can shift human navigational strategy selection between allocentric and egocentric navigation. The present study investigated this question by exposing participants to an acute psychological stressor (the Paced Auditory Serial Addition Task, PASAT), before testing navigational strategy selection in the Dual-Strategy Maze, a modified virtual Morris water maze. In the Dual-Strategy Maze, participants can choose to navigate using a constellation of extra-maze cues (allocentrically) or using a single cue proximal to the goal platform (egocentrically). Surprisingly, PASAT stress biased participants to solve the maze allocentrically significantly more, rather than less, often. These findings have implications for understanding the effects of acute stress on cognitive function in general, and the function of the hippocampus in particular. PMID:27174311

  3. Auditory perspective taking.

    Science.gov (United States)

    Martinson, Eric; Brock, Derek

    2013-06-01

    Effective communication with a mobile robot using speech is a difficult problem even when you can control the auditory scene. Robot self-noise or ego noise, echoes and reverberation, and human interference are all common sources of decreased intelligibility. Moreover, in real-world settings, these problems are routinely aggravated by a variety of sources of background noise. Military scenarios can be punctuated by high decibel noise from materiel and weaponry that would easily overwhelm a robot's normal speaking volume. Moreover, in nonmilitary settings, fans, computers, alarms, and transportation noise can cause enough interference to make a traditional speech interface unusable. This work presents and evaluates a prototype robotic interface that uses perspective taking to estimate the effectiveness of its own speech presentation and takes steps to improve intelligibility for human listeners. PMID:23096077

  4. Flexible echolocation behaviour of trawling bats during approach of continuous or transient prey cues

    Directory of Open Access Journals (Sweden)

    Kirstin Übernickel

    2013-05-01

    We studied the adaptations in the behaviour of Noctilio leporinus when approaching either a continuous cue or a transient cue that disappeared during the approach of the bat. Normally the bats reacted by dipping their feet in the water at the cue location. We found that the bats typically started to adapt their calling behaviour at approximately 410 ms before prey contact in continuous cue trials, but were also able to adapt their approach behaviour to stimuli onsets as short as 177 ms before contact, within a minimum reaction time of 50.9 ms in response to transient cues. In both tasks the approach phase ended between 32 and 53 ms before prey contact. Call emission always continued after the end of the approach phase until around prey contact. In some failed capture attempts, call emission did not cease at all after prey contact. Probably bats used spatial memory to dip at the original location of the transient cue after its disappearance. The duration of the pointed dips was significantly longer in transient cue trials than in continuous cue trials. Our results suggest that trawling bats possess the ability to modify their generally rather stereotyped echolocation behaviour during approaches within very short reaction times depending on the sensory information.

  5. Unimodal and crossmodal gradients of spatial attention

    DEFF Research Database (Denmark)

    Föcker, J.; Hötting, K.; Gondan, Matthias;

    2010-01-01

    Behavioral and event-related potential (ERP) studies have shown that spatial attention is gradually distributed around the center of the attentional focus. The present study compared uni- and crossmodal gradients of spatial attention to investigate whether the orienting of auditory and visual...... spatial attention is based on modality specific or supramodal representations of space. Auditory and visual stimuli were presented from five speaker locations positioned in the right hemifield. Participants had to attend to the innermost or outmost right position in order to detect either visual or...... auditory deviant stimuli. Detection rates and event-related potentials (ERPs) indicated that spatial attention is distributed as a gradient. Unimodal spatial ERP gradients correlated with the spatial resolution of the modality. Crossmodal spatial gradients were always broader than the corresponding...

  6. Auditory short-term memory in the primate auditory cortex.

    Science.gov (United States)

    Scott, Brian H; Mishkin, Mortimer

    2016-06-01

    Sounds are fleeting, and assembling the sequence of inputs at the ear into a coherent percept requires auditory memory across various time scales. Auditory short-term memory comprises at least two components: an active 'working memory' bolstered by rehearsal, and a sensory trace that may be passively retained. Working memory relies on representations recalled from long-term memory, and their rehearsal may require phonological mechanisms unique to humans. The sensory component, passive short-term memory (pSTM), is tractable to study in nonhuman primates, whose brain architecture and behavioral repertoire are comparable to our own. This review discusses recent advances in the behavioral and neurophysiological study of auditory memory with a focus on single-unit recordings from macaque monkeys performing delayed-match-to-sample (DMS) tasks. Monkeys appear to employ pSTM to solve these tasks, as evidenced by the impact of interfering stimuli on memory performance. In several regards, pSTM in monkeys resembles pitch memory in humans, and may engage similar neural mechanisms. Neural correlates of DMS performance have been observed throughout the auditory and prefrontal cortex, defining a network of areas supporting auditory STM with parallels to that supporting visual STM. These correlates include persistent neural firing, or a suppression of firing, during the delay period of the memory task, as well as suppression or (less commonly) enhancement of sensory responses when a sound is repeated as a 'match' stimulus. Auditory STM is supported by a distributed temporo-frontal network in which sensitivity to stimulus history is an intrinsic feature of auditory processing. This article is part of a Special Issue entitled SI: Auditory working memory. PMID:26541581

  7. Auditory and non-auditory effects of noise on health

    OpenAIRE

    Basner, Mathias; Babisch, Wolfgang; Davis, Adrian; Brink, Mark; Clark, Charlotte; Janssen, Sabine; Stansfeld, Stephen

    2013-01-01

    Noise is pervasive in everyday life and can cause both auditory and non-auditory health effects. Noise-induced hearing loss remains highly prevalent in occupational settings, and is increasingly caused by social noise exposure (eg, through personal music players). Our understanding of molecular mechanisms involved in noise-induced hair-cell and nerve damage has substantially increased, and preventive and therapeutic drugs will probably become available within 10 years. Evidence of the non-aud...

  8. Fractal Fluctuations in Human Walking: Comparison Between Auditory and Visually Guided Stepping.

    Science.gov (United States)

    Terrier, Philippe

    2016-09-01

    In human locomotion, sensorimotor synchronization of gait consists of the coordination of stepping with rhythmic auditory cues (auditory cueing, AC). AC changes the long-range correlations among consecutive strides (fractal dynamics) into anti-correlations. Visual cueing (VC) is the alignment of step lengths with marks on the floor. The effects of VC on the fluctuation structure of walking have not been investigated. Therefore, the objective was to compare the effects of AC and VC on the fluctuation pattern of basic spatiotemporal gait parameters. Thirty-six healthy individuals walked 3 × 500 strides on an instrumented treadmill with augmented reality capabilities. The conditions were no cueing (NC), AC, and VC. AC included an isochronous metronome. For VC, projected stepping stones were synchronized with the treadmill speed. Detrended fluctuation analysis assessed the correlation structure. The coefficient of variation (CV) was also assessed. The results showed that AC and VC similarly induced a strong anti-correlated pattern in the gait parameters. The CVs were similar between the NC and AC conditions but substantially higher in the VC condition. AC and VC probably mobilize similar motor control pathways and can be used alternatively in gait rehabilitation. However, the increased gait variability induced by VC should be considered. PMID:26903091
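
    The detrended fluctuation analysis used in this study can be sketched as follows: integrate the mean-removed series, detrend it linearly within windows of increasing size, and take the slope of the log-log relation between window size and RMS fluctuation. The synthetic white-noise series here stands in for real stride-interval data.

```python
import numpy as np

def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
    """Detrended fluctuation analysis: returns the scaling exponent alpha,
    the slope of log F(n) versus log n. alpha ~ 0.5 for uncorrelated noise,
    ~1.0 for 1/f (persistent) fluctuations, < 0.5 for anti-correlations."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        segments = y[: m * n].reshape(m, n)       # non-overlapping windows
        t = np.arange(n)
        sq = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)          # linear detrend per window
            sq.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq)))            # RMS fluctuation at scale n
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(0)
alpha = dfa_alpha(rng.standard_normal(4096))
print(round(alpha, 2))  # white noise: expect a value near 0.5
```

    On stride-time series, an alpha near 1.0 indicates the long-range correlations of unconstrained walking, while cueing (AC or VC) pushes alpha below 0.5, the anti-correlated pattern reported above.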

  9. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events

    Directory of Open Access Journals (Sweden)

    Jeroen eStekelenburg

    2012-05-01

    In many natural audiovisual events (e.g., a clap of the two hands), the visual signal precedes the sound and thus allows observers to predict when, where, and which sound will occur. Previous studies have already reported that there are distinct neural correlates of temporal (when) versus phonetic/semantic (which) content on audiovisual integration. Here we examined the effect of visual prediction of auditory location (where) in audiovisual biological motion stimuli by varying the spatial congruency between the auditory and visual parts of the audiovisual stimulus. Visual stimuli were presented centrally, whereas auditory stimuli were presented either centrally or at 90° azimuth. Typical subadditive amplitude reductions (AV – V < A) were found for the auditory N1 and P2 for spatially congruent and incongruent conditions. The new finding is that the N1 suppression was larger for spatially congruent stimuli. A very early audiovisual interaction was also found at 30-50 ms in the spatially congruent condition, while no effect of congruency was found on the suppression of the P2. This indicates that visual prediction of auditory location can be coded very early in auditory processing.

  10. Humans as an animal model? : studies on cue interaction, occasion setting, and context dependency

    OpenAIRE

    Dibbets, Pauline

    2002-01-01

    The objective of the present thesis was to study human learning behaviour and to compare the results with those from animal learning studies. Three topics originating from animal learning research were examined: cue interaction, occasion setting, and context dependency. A series of experiments was first carried out to examine the influence of spatial position on cue-interaction effects in a predictive-learning task. Evidence that previously learned information about a stimulus can interact wi...

  11. The Effect of Visual Cues on Difficulty Ratings for Segregation of Musical Streams in Listeners with Impaired Hearing

    OpenAIRE

    Hamish Innes-Brown; Jeremy Marozeau; Peter Blamey

    2011-01-01

    BACKGROUND: Enjoyment of music is an important part of life that may be degraded for people with hearing impairments, especially those using cochlear implants. The ability to follow separate lines of melody is an important factor in music appreciation. This ability relies on effective auditory streaming, which is much reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues could reduce the subjectiv...

  12. Visual cues for data mining

    Science.gov (United States)

    Rogowitz, Bernice E.; Rabenhorst, David A.; Gerth, John A.; Kalin, Edward B.

    1996-04-01

    This paper describes a set of visual techniques, based on principles of human perception and cognition, which can help users analyze and develop intuitions about tabular data. Collections of tabular data are widely available, including, for example, multivariate time series data, customer satisfaction data, stock market performance data, multivariate profiles of companies and individuals, and scientific measurements. In our approach, we show how visual cues can help users perform a number of data mining tasks, including identifying correlations and interaction effects, finding clusters and understanding the semantics of cluster membership, identifying anomalies and outliers, and discovering multivariate relationships among variables. These cues are derived from psychological studies on perceptual organization, visual search, perceptual scaling, and color perception. These visual techniques are presented as a complement to the statistical and algorithmic methods more commonly associated with these tasks, and provide an interactive interface for the human analyst.

  13. Auditory Processing Disorder in Children

    Science.gov (United States)


  14. Auditory Processing Disorder (For Parents)

    Science.gov (United States)

    ... and school. A positive, realistic attitude and healthy self-esteem in a child with APD can work wonders. And kids with APD can go on to ...

  15. Sex difference in cue strategy in a modified version of the Morris water task: correlations between brain and behaviour.

    Directory of Open Access Journals (Sweden)

    Robin J Keeley

    BACKGROUND: Sex differences in spatial memory function have been reported with mixed results in the literature, with some studies showing male advantages and others showing no differences. When considering the estrus cycle in females, results are mixed as to whether high or low circulating estradiol results in an advantage in spatial navigation tasks. Research involving humans and rodents has demonstrated that males preferentially employ Euclidean strategies and utilize geometric cues in order to navigate spatially, whereas females employ landmark strategies and cues. METHODOLOGY/PRINCIPAL FINDINGS: This study used the water-based snowcone maze in order to assess male and female preference for landmark or geometric cues, with specific emphasis placed on the effects of estrus cycle phase for female rats. Performance and preference for the geometric cue were examined in relation to total hippocampal and hippocampal subregion (CA1&2, CA3, and dentate gyrus) volumes and entorhinal cortex thickness, in order to determine the relation between strategy, spatial performance, and brain area size. The study revealed that males outperformed females overall during training trials, relied on the geometric cue when the platform was moved, and showed significant correlations between entorhinal cortex thickness and spatial memory performance. No gross differences in behavioural performance were observed within females when accounting for cyclicity, and only total hippocampal volume was correlated with performance during the learning trials. CONCLUSIONS/SIGNIFICANCE: This study demonstrates the sex-specific use of cues and brain areas in a spatial learning task.

  16. Biases in Visual, Auditory, and Audiovisual Perception of Space.

    Directory of Open Access Journals (Sweden)

    Brian Odegaard

    2015-12-01

    Full Text Available Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine the presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact or cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration not only
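    The Bayesian Causal Inference framework mentioned in this abstract can be sketched computationally. The following is a minimal illustration of the core quantity in such models, the posterior probability that a visual and an auditory measurement arose from a single common cause, under the standard Gaussian formulation; it is not the authors' implementation, and the parameter values are arbitrary assumptions.

```python
import math

def bci_posterior_common(xv, xa, sigma_v=1.0, sigma_a=3.0,
                         sigma_p=10.0, p_common=0.5):
    """Posterior probability that a visual measurement xv and an
    auditory measurement xa share a single common cause, under
    Gaussian likelihoods and a zero-mean Gaussian spatial prior."""
    # Likelihood of (xv, xa) under one shared source, after
    # integrating out the source location over the spatial prior.
    var_sum = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
               + sigma_a**2 * sigma_p**2)
    num = ((xv - xa)**2 * sigma_p**2 + xv**2 * sigma_a**2
           + xa**2 * sigma_v**2)
    like_common = (math.exp(-0.5 * num / var_sum)
                   / (2 * math.pi * math.sqrt(var_sum)))

    # Likelihood under two independent sources.
    def like_single(x, sigma):
        v = sigma**2 + sigma_p**2
        return math.exp(-0.5 * x**2 / v) / math.sqrt(2 * math.pi * v)

    like_indep = like_single(xv, sigma_v) * like_single(xa, sigma_a)
    # Bayes' rule over the two causal structures.
    return (like_common * p_common
            / (like_common * p_common + like_indep * (1 - p_common)))
```

    Nearby cues yield a high common-cause posterior (favoring integration), while widely separated cues drive it toward zero (favoring segregation), which is how such models capture the reduced, but nonzero, cross-modal bias reported above.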

  17. Multi-Cue Pedestrian Recognition

    OpenAIRE

    Munder, Stefan

    2007-01-01

    This thesis addresses the problem of detecting complex, deformable objects in an arbitrary, cluttered environment in sequences of video images. Often, no single best technique exists for such a challenging problem, as different approaches possess different characteristics with regard to detection accuracy, processing speed, or the kind of errors made. Therefore, multi-cue approaches are pursued in this thesis. By combining multiple detection methods, each utilizing a different aspect of the v...

  18. Measuring the performance of visual to auditory information conversion.

    Directory of Open Access Journals (Sweden)

    Shern Shiou Tan

    Full Text Available BACKGROUND: Visual to auditory conversion systems have been in existence for several decades. Besides being among the front runners in providing visual capabilities to blind users, the auditory cues generated from image sonification systems are still easier to learn and adapt to compared to other similar techniques. Other advantages include low cost, easy customizability, and universality. However, every system developed so far has its own set of strengths and weaknesses. In order to improve these systems further, we propose an automated and quantitative method to measure the performance of such systems. With these quantitative measurements, it is possible to gauge the relative strengths and weaknesses of different systems and rank the systems accordingly. METHODOLOGY: Performance is measured by both the interpretability and the information preservation of visual to auditory conversions. Interpretability is measured by computing the correlation of inter-image distance (IID) and inter-sound distance (ISD), whereas information preservation is computed by applying information theory to measure the entropy of both visual and corresponding auditory signals. These measurements provide a basis and some insights on how the systems work. CONCLUSIONS: With an automated interpretability measure as a standard, more image sonification systems can be developed, compared, and then improved. Even though the measure does not test systems as thoroughly as carefully designed psychological experiments, a quantitative measurement like the one proposed here can compare systems to a certain degree without incurring much cost. Underlying this research is the hope that a major breakthrough in image sonification systems will allow blind users to cost-effectively regain enough visual function to lead secure and productive lives.
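    The two measures described in this abstract, IID/ISD correlation and signal entropy, can be sketched as follows. This is an illustrative reading rather than the authors' code: `img_dist` and `snd_dist` are caller-supplied distance functions, and the histogram-based entropy estimator is an assumption.

```python
import itertools
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def interpretability(images, sounds, img_dist, snd_dist):
    """Correlate inter-image distances (IID) with inter-sound
    distances (ISD) over all item pairs; higher = more interpretable."""
    pairs = list(itertools.combinations(range(len(images)), 2))
    iid = [img_dist(images[i], images[j]) for i, j in pairs]
    isd = [snd_dist(sounds[i], sounds[j]) for i, j in pairs]
    return pearson(iid, isd)

def entropy(signal, bins=16):
    """Shannon entropy (bits) of a 1-D signal's value histogram."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0   # guard against a constant signal
    counts = [0] * bins
    for v in signal:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum(c / n * math.log2(c / n) for c in counts if c)
```

    A conversion that maps perceptually similar images to similar sounds scores near 1 on the interpretability measure, while a conversion that collapses many distinct images onto similar sounds also shows low entropy in the auditory signal, losing information.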

  19. Memory retrieval in response to partial cues requires NMDA receptor-dependent neurotransmission in the medial prefrontal cortex.

    Science.gov (United States)

    Jo, Yong Sang; Choi, June-Seek

    2014-03-01

    The medial prefrontal cortex (mPFC) has been suggested to play a crucial role in retrieving detailed contextual information about a previous learning episode in response to a single retrieval cue. However, few studies have investigated the neurochemical mechanisms that mediate the prefrontal retrieval process. In the current study, we examined whether N-methyl-D-aspartate receptors (NMDARs) in the mPFC were necessary for retrieval of a well-learned spatial location on the basis of partial or degraded spatial cues. Rats were initially trained to find a hidden platform in the Morris water maze using four extramaze cues in the surrounding environment. Their retrieval performance was subsequently tested under different cue conditions. Infusions of DL-2-amino-5-phosphonovaleric acid (APV), an NMDAR antagonist, significantly disrupted memory retrieval when three of the original cues were removed. By contrast, APV injections into the mPFC did not affect animals' retrieval performance when the original cues were presented or when three novel landmarks were added alongside the original cues. These results indicate that prefrontal NMDARs are required for memory retrieval when allocentric spatial information is degraded. NMDAR-dependent neurotransmission in the mPFC may facilitate an active retrieval process to reactivate complete contextual representations associated with partial retrieval cues. PMID:24269352

  20. Perceiving speech in context: Compensation for contextual variability during acoustic cue encoding and categorization

    Science.gov (United States)

    Toscano, Joseph Christopher

    Several fundamental questions about speech perception concern how listeners understand spoken language despite considerable variability in speech sounds across different contexts (the problem of lack of invariance in speech). This contextual variability is caused by several factors, including differences between individual talkers' voices, variation in speaking rate, and effects of coarticulatory context. A number of models have been proposed to describe how the speech system handles differences across contexts. Critically, these models make different predictions about (1) whether contextual variability is handled at the level of acoustic cue encoding or categorization, (2) whether it is driven by feedback from category-level processes or interactions between cues, and (3) whether listeners discard fine-grained acoustic information to compensate for contextual variability. Separating the effects of cue- and category-level processing has been difficult because behavioral measures tap processes that occur well after initial cue encoding and are influenced by task demands and linguistic information. Recently, we have used the event-related brain potential (ERP) technique to examine cue encoding and online categorization. Specifically, we have looked at differences in the auditory N1 as a measure of acoustic cue encoding and the P3 as a measure of categorization. This allows us to examine multiple levels of processing during speech perception and can provide a useful tool for studying effects of contextual variability. Here, I apply this approach to determine the point in processing at which context has an effect on speech perception and to examine whether acoustic cues are encoded continuously. Several types of contextual variability (talker gender, speaking rate, and coarticulation), as well as several acoustic cues (voice onset time, formant frequencies, and bandwidths), are examined in a series of experiments. The results suggest that (1) at early stages of speech

  1. Virtual adult ears reveal the roles of acoustical factors and experience in auditory space map development

    OpenAIRE

    Campbell, Robert A. A.; King, Andrew J; Nodal, Fernando R.; Schnupp, Jan W. H.; Carlile, Simon; Doubell, Timothy P.

    2008-01-01

    Auditory neurons in the superior colliculus (SC) respond preferentially to sounds from restricted directions to form a map of auditory space. The development of this representation is shaped by sensory experience, but little is known about the relative contribution of peripheral and central factors to the emergence of adult responses. By recording from the SC of anesthetized ferrets at different age points, we show that the map matures gradually after birth; the spatial receptive fields (SRFs...

  2. A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies

    OpenAIRE

    Werner-Reiss, Uri; Jennifer M Groh

    2008-01-01

    Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monkeys and other animals, where single-unit recording studies have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivit...

  3. Quit interest influences smoking cue-reactivity.

    Science.gov (United States)

    Veilleux, Jennifer C; Skinner, Kayla D; Pollert, Garrett A

    2016-12-01

    Interest in quitting smoking is important to model in cue-reactivity studies, because the craving elicited by cue exposure likely requires different self-regulation efforts for smokers who are interested in quitting compared to those without any quit interest. The objective of the current study was to evaluate the role of quit interest in how cigarette cue exposure influences self-control efforts. Smokers interested in quitting (n=37) and smokers with no interest in quitting (n=53) were randomly assigned to a cigarette or neutral cue exposure task. Following the cue exposure, all participants completed two self-control tasks, a measure of risky gambling (the Iowa Gambling Task) and a cold pressor tolerance task. Results indicated that smokers interested in quitting had worse performance on the gambling task when exposed to a cigarette cue compared to neutral cue exposure. We also found that people interested in quitting tolerated the cold pressor task for a shorter amount of time than people not interested in quitting. Finally, we found that for people interested in quitting, exposure to a cigarette cue was associated with increased motivation to take steps toward decreasing use. Overall these results suggest that including quit interest in studies of cue reactivity is valuable, as quit interest influenced smoking cue-reactivity responses. PMID:27487082

  4. Attentional demands influence vocal compensations to pitch errors heard in auditory feedback.

    Science.gov (United States)

    Tumber, Anupreet K; Scheerer, Nichole E; Jones, Jeffery A

    2014-01-01

    Auditory feedback is required to maintain fluent speech. At present, it is unclear how attention modulates auditory feedback processing during ongoing speech. In this event-related potential (ERP) study, participants vocalized /a/ while they heard their vocal pitch suddenly shifted downward a ½ semitone in both single- and dual-task conditions. During the single-task condition, participants passively viewed a visual stream for cues to start and stop vocalizing. In the dual-task condition, participants vocalized while they identified target stimuli in a visual stream of letters. The presentation rate of the visual stimuli was manipulated in the dual-task condition in order to produce a low, intermediate, and high attentional load. Visual target identification accuracy was lowest in the high attentional load condition, indicating that attentional load was successfully manipulated. Results further showed that participants who were exposed to the single-task condition, prior to the dual-task condition, produced larger vocal compensations during the single-task condition. Thus, when participants' attention was divided, less attention was available for the monitoring of their auditory feedback, resulting in smaller compensatory vocal responses. However, P1-N1-P2 ERP responses were not affected by divided attention, suggesting that the effect of attentional load was not on the auditory processing of pitch-altered feedback; instead, it interfered with the integration of auditory and motor information, or with motor control itself. PMID:25303649

  5. Sexual selection in the squirrel treefrog Hyla squirella: the role of multimodal cue assessment in female choice

    Science.gov (United States)

    Taylor, Ryan C.; Buchanan, Bryant W.; Doherty, Jessie L.

    2007-01-01

    Anuran amphibians have provided an excellent system for the study of animal communication and sexual selection. Studies of female mate choice in anurans, however, have focused almost exclusively on the role of auditory signals. In this study, we examined the effect of both auditory and visual cues on female choice in the squirrel treefrog. Our experiments used a two-choice protocol in which we varied male vocalization properties, visual cues, or both, to assess female preferences for the different cues. Females discriminated against high-frequency calls and expressed a strong preference for calls that contained more energy per unit time (faster call rate). Females expressed a preference for the visual stimulus of a model of a calling male when call properties at the two speakers were held the same. They also showed a significant attraction to a model possessing a relatively large lateral body stripe. These data indicate that visual cues do play a role in mate attraction in this nocturnal frog species. Furthermore, this study adds to a growing body of evidence that suggests that multimodal signals play an important role in sexual selection.

  6. The role of visual spatial attention in audiovisual speech perception

    DEFF Research Database (Denmark)

    Andersen, Tobias; Tiippana, K.; Laarni, J.;

    2009-01-01

    Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive, but recent reports have challenged this view. Here we study the effect of visual spatial attention on the McGurk effect. By presenting a movie of two faces symmetrically displaced to each side of a central fixation point and dubbed with a single auditory speech track, we were able to discern the influences from each of the faces and from the voice on the auditory speech percept. We found that directing visual spatial attention towards a face increased the influence of that face on auditory perception. However, the influence of the voice on auditory perception did not change, suggesting that audiovisual...

  7. Cue predictability changes scaling in eye-movement fluctuations.

    Science.gov (United States)

    Wallot, Sebastian; Coey, Charles A; Richardson, Michael J

    2015-10-01

    Recent research has provided evidence for scaling relations in eye-movement fluctuations, but not much is known about what these scaling relations imply about cognition or eye-movement control. Generally, scaling relations in behavioral and neurophysiological data have been interpreted as an indicator of the coordination of neurophysiological and cognitive processes. In this study, we investigated the effect of predictability in timing and gaze direction on eye-movement fluctuations. Participants performed a simple eye-movement task, in which a visual cue prompted their gaze to different locations on a spatial layout, and the predictability of temporal and directional aspects of the cue was manipulated. The results showed that scaling exponents in eye movements decreased with predictability and were related to the participants' perceived effort during the task. In relation to past research, these findings suggest that scaling exponents reflect a relative demand for voluntary control during task performance. PMID:26337612
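    Scaling exponents of the kind discussed in this abstract are commonly estimated with detrended fluctuation analysis (DFA); the abstract does not name the estimator, so DFA here is an assumption for illustration. A minimal first-order DFA sketch:

```python
import math

def dfa(series, scales):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha, the slope of log F(n) versus log n."""
    mean = sum(series) / len(series)
    # Integrated (cumulative-sum) profile of the demeaned series.
    profile, s = [], 0.0
    for x in series:
        s += x - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in scales:
        rss, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # Least-squares linear detrend within each window.
            tm = (n - 1) / 2
            ym = sum(seg) / n
            denom = sum((t - tm) ** 2 for t in range(n))
            slope = sum((t - tm) * (y - ym)
                        for t, y in enumerate(seg)) / denom
            rss += sum((y - (ym + slope * (t - tm))) ** 2
                       for t, y in enumerate(seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(rss / count)))
    # Slope of the log-log fluctuation plot is the scaling exponent.
    mx = sum(log_n) / len(log_n)
    my = sum(log_f) / len(log_f)
    return (sum((a - mx) * (b - my) for a, b in zip(log_n, log_f))
            / sum((a - mx) ** 2 for a in log_n))
```

    For white noise the exponent is near 0.5 (uncorrelated fluctuations); values approaching 1 indicate the long-range correlations that the scaling interpretation above is concerned with.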

  8. Auditory and non-auditory effects of noise on health.

    Science.gov (United States)

    Basner, Mathias; Babisch, Wolfgang; Davis, Adrian; Brink, Mark; Clark, Charlotte; Janssen, Sabine; Stansfeld, Stephen

    2014-04-12

    Noise is pervasive in everyday life and can cause both auditory and non-auditory health effects. Noise-induced hearing loss remains highly prevalent in occupational settings, and is increasingly caused by social noise exposure (eg, through personal music players). Our understanding of molecular mechanisms involved in noise-induced hair-cell and nerve damage has substantially increased, and preventive and therapeutic drugs will probably become available within 10 years. Evidence of the non-auditory effects of environmental noise exposure on public health is growing. Observational and experimental studies have shown that noise exposure leads to annoyance, disturbs sleep and causes daytime sleepiness, affects patient outcomes and staff performance in hospitals, increases the occurrence of hypertension and cardiovascular disease, and impairs cognitive performance in schoolchildren. In this Review, we stress the importance of adequate noise prevention and mitigation strategies for public health. PMID:24183105

  9. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Directory of Open Access Journals (Sweden)

    Veema Lodhia

    2014-02-01

    Full Text Available Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN) in the auditory event-related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention-dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG) was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically-developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400). These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  10. Individual differences in auditory abilities.

    Science.gov (United States)

    Kidd, Gary R; Watson, Charles S; Gygi, Brian

    2007-07-01

    Performance on 19 auditory discrimination and identification tasks was measured for 340 listeners with normal hearing. Test stimuli included single tones, sequences of tones, amplitude-modulated and rippled noise, temporal gaps, speech, and environmental sounds. Principal components analysis and structural equation modeling of the data support the existence of a general auditory ability and four specific auditory abilities. The specific abilities are (1) loudness and duration (overall energy) discrimination; (2) sensitivity to temporal envelope variation; (3) identification of highly familiar sounds (speech and nonspeech); and (4) discrimination of unfamiliar simple and complex spectral and temporal patterns. Examination of Scholastic Aptitude Test (SAT) scores for a large subset of the population revealed little or no association between general or specific auditory abilities and general intellectual ability. The findings provide a basis for research to further specify the nature of the auditory abilities. Of particular interest are results suggestive of a familiar sound recognition (FSR) ability, apparently specialized for sound recognition on the basis of limited or distorted information. This FSR ability is independent of normal variation in spectral-temporal acuity and of general intellectual ability. PMID:17614500

  11. Moving to music: Effects of heard and imagined musical cues on movement-related brain activity

    Directory of Open Access Journals (Sweden)

    Rebecca S Schaefer

    2014-09-01

    Full Text Available Music is commonly used to facilitate or support movement, and increasingly used in movement rehabilitation. Additionally, there is some evidence to suggest that music imagery, which is reported to lead to brain signatures similar to music perception, may also assist movement. However, it is not yet known whether either imagined or musical cueing changes the way in which the motor system of the human brain is activated during simple movements. Here, functional magnetic resonance imaging (fMRI) was used to compare neural activity during wrist flexions performed to either heard or imagined music with self-pacing of the same movement without any cueing. Focusing specifically on the motor network of the brain, analyses were performed within a mask of BA4, BA6, the basal ganglia (putamen, caudate, and pallidum), the motor nuclei of the thalamus, and the whole cerebellum. Results revealed that moving to music compared with self-paced movement resulted in significantly increased activation in left cerebellum VI. Moving to imagined music led to significantly more activation in pre-supplementary motor area (pre-SMA) and right globus pallidus, relative to self-paced movement. When the music and imagery cueing conditions were contrasted directly, movements in the music condition showed significantly more activity in left-hemisphere cerebellum VII and the right hemisphere and vermis of cerebellum IX, while the imagery condition revealed significantly more activity in pre-SMA. These results suggest that cueing movement with actual or imagined music impacts engagement of motor network regions during the movement, and that heard and imagined cues can modulate movement in subtly different ways. These results may have implications for the applicability of auditory cueing in movement rehabilitation for different patient populations.

  12. Neural Correlates of an Auditory Afterimage in Primary Auditory Cortex

    OpenAIRE

    Noreña, A. J.; Eggermont, J. J.

    2003-01-01

    The Zwicker tone (ZT) is defined as an auditory negative afterimage, perceived after the presentation of an appropriate inducer. Typically, a notched noise (NN) with a notch width of 1/2 octave induces a ZT with a pitch falling in the frequency range of the notch. The aim of the present study was to find potential neural correlates of the ZT in the primary auditory cortex of ketamine-anesthetized cats. Responses of multiunits were recorded simultaneously with two 8-electrode arrays during 1 s...

  13. Kin-informative recognition cues in ants

    DEFF Research Database (Denmark)

    Nehring, Volker; Evison, Sophie E F; Santorelli, Lorenzo A;

    2011-01-01

    ... found little or no kin information in recognition cues. Here, we test the hypothesis that social insects do not have kin-informative recognition cues by investigating the recognition cues and relatedness of workers from four colonies of the ant Acromyrmex octospinosus. Contrary to the theoretical prediction, we show that the cuticular hydrocarbons of ant workers in all four colonies are informative enough to allow full-sisters to be distinguished from half-sisters with a high accuracy. These results contradict the hypothesis of non-heritable recognition cues and suggest that there is more potential...

  14. Visual cues for landmine detection

    Science.gov (United States)

    Staszewski, James J.; Davison, Alan D.; Tischuk, Julia A.; Dippel, David J.

    2007-04-01

    Can human vision supplement the information that handheld landmine detection equipment provides its operators to increase detection rates and reduce the hazard of the task? Contradictory viewpoints exist regarding the viability of visual detection of landmines. Assuming both positions are credible, this work aims to reconcile them by exploring the visual information produced by landmine burial and how any visible signatures change as a function of time in a natural environment. Its objective is to acquire objective, foundational knowledge on which training could be based and subsequently evaluated. A representative set of demilitarized landmines were buried at a field site with bare soil and vegetated surfaces using doctrinal procedures. High-resolution photographs of the ground surface were taken for approximately one month starting in April 2006. Photos taken immediately after burial show clearly visible surface signatures. Their features change with time and weather exposure, but the patterns they define persist, as photos taken a month later show. To exploit the perceptual sensitivity of expert observers, signature photos were shown to domain experts with instructions to identify the cues and patterns that defined the signatures. Analysis of the experts' verbal descriptions identified a small set of easily communicable cues that characterize signatures and their changes over the duration of observation. Findings suggest that visual detection training is viable and has potential to enhance detection capabilities. The photos and descriptions generated offer materials for designing such training and testing its utility. Plans for investigating the generality of the findings, especially potential limiting conditions, are discussed.

  15. Auditory Hallucinations in Acute Stroke

    Directory of Open Access Journals (Sweden)

    Yair Lampl

    2005-01-01

    Full Text Available Auditory hallucinations are uncommon phenomena which can be directly caused by acute stroke, mostly described after lesions of the brain stem and very rarely reported after cortical strokes. The purpose of this study is to determine the frequency of this phenomenon. In a cross-sectional study, 641 stroke patients were followed between 1996 and 2000. Each patient underwent comprehensive investigation and follow-up. Four patients were found to have auditory hallucinations after cortical stroke; all cases occurred after an ischemic lesion of the right temporal lobe. After no more than four months, all patients were symptom-free and without therapy. The fact that auditory hallucinations may be of cortical origin must be taken into consideration in the treatment of stroke patients. The phenomenon may be completely reversible after a couple of months.

  16. Adaptation in the auditory system: an overview

    OpenAIRE

    David ePérez-González; Malmierca, Manuel S.

    2014-01-01

    The early stages of the auditory system need to preserve the timing information of sounds in order to extract the basic features of acoustic stimuli. At the same time, different processes of neuronal adaptation occur at several levels to further process the auditory information. For instance, auditory nerve fiber responses already experience adaptation of their firing rates, a type of response that can be found in many other auditory nuclei and may be useful for emphasizing the onset of the s...

  17. Functional neuroanatomy of spatial sound processing in Alzheimer's disease.

    OpenAIRE

    Golden, HL; Agustus, JL; Nicholas, JM; Schott, JM; Crutch, SJ; L. Mancini; Warren, JD

    2016-01-01

    Deficits of auditory scene analysis accompany Alzheimer's disease (AD). However, the functional neuroanatomy of spatial sound processing has not been defined in AD. We addressed this using a "sparse" fMRI virtual auditory spatial paradigm in 14 patients with typical AD in relation to 16 healthy age-matched individuals. Sound stimulus sequences discretely varied perceived spatial location and pitch of the sound source in a factorial design. AD was associated with loss of differentiated cortica...

  18. Encoding audio motion: spatial impairment in early blind individuals

    OpenAIRE

    Finocchietti, Sara; Cappagli, Giulia; Gori, Monica

    2015-01-01

    The consequence of blindness on auditory spatial localization has been an interesting issue of research in the last decade providing mixed results. Enhanced auditory spatial skills in individuals with visual impairment have been reported by multiple studies, while some aspects of spatial hearing seem to be impaired in the absence of vision. In this study, the ability to encode the trajectory of a 2-dimensional sound motion, reproducing the complete movement, and reaching the correct end-point...

  19. Contour identification with pitch and loudness cues using cochlear implants.

    Science.gov (United States)

    Luo, Xin; Masterson, Megan E; Wu, Ching-Chih

    2014-01-01

    Different from speech, pitch and loudness cues may or may not co-vary in music. Cochlear implant (CI) users with poor pitch perception may use loudness contour cues more than normal-hearing (NH) listeners. Contour identification was tested in CI users and NH listeners; the five-note contours contained either pitch cues alone, loudness cues alone, or both. Results showed that NH listeners' contour identification was better with pitch cues than with loudness cues; CI users performed similarly with either cue. When pitch and loudness cues were co-varied, CI performance significantly improved, suggesting that CI users were able to integrate the two cues. PMID:24437857

  20. Cueing Animations: Dynamic Signaling Aids Information Extraction and Comprehension

    Science.gov (United States)

    Boucheix, Jean-Michel; Lowe, Richard K.; Putri, Dian K.; Groff, Jonathan

    2013-01-01

    The effectiveness of animations containing two novel forms of animation cueing that target relations between event units rather than individual entities was compared with that of animations containing conventional entity-based cueing or no cues. These relational event unit cues ("progressive path" and "local coordinated" cues) were specifically…

  1. Dual streams of auditory afferents target multiple domains in the primate prefrontal cortex

    Science.gov (United States)

    Romanski, L. M.; Tian, B.; Fritz, J.; Mishkin, M.; Goldman-Rakic, P. S.; Rauschecker, J. P.

    2009-01-01

    ‘What’ and ‘where’ visual streams define ventrolateral object and dorsolateral spatial processing domains in the prefrontal cortex of nonhuman primates. We looked for similar streams for auditory–prefrontal connections in rhesus macaques by combining microelectrode recording with anatomical tract-tracing. Injection of multiple tracers into physiologically mapped regions AL, ML and CL of the auditory belt cortex revealed that anterior belt cortex was reciprocally connected with the frontal pole (area 10), rostral principal sulcus (area 46) and ventral prefrontal regions (areas 12 and 45), whereas the caudal belt was mainly connected with the caudal principal sulcus (area 46) and frontal eye fields (area 8a). Thus separate auditory streams originate in caudal and rostral auditory cortex and target spatial and non-spatial domains of the frontal lobe, respectively. PMID:10570492

  2. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence.

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D; Chait, Maria

    2016-09-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence, the coincidence of sound elements in and across time, is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals ("stochastic figure-ground": SFG) comprised a sequence of brief broadband chords containing random pure-tone components that varied from one chord to another. Occasional tone repetitions across chords are perceived as "figures" popping out of a stochastic "ground." Magnetoencephalography (MEG) measurement in naïve, distracted human subjects revealed robust evoked responses, commencing about 150 ms after figure onset, that reflect the emergence of the "figure" from the randomly varying "ground." Neural sources underlying this bottom-up driven figure-ground segregation were localized to the planum temporale and the intraparietal sulcus, demonstrating that the latter area, outside the "classic" auditory system, is also involved in the early stages of auditory scene analysis. PMID:27325682
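The stochastic figure-ground construction described in this abstract can be sketched in a few lines of numpy: random pure-tone chords form the "ground," and a fixed subset of tones repeated across chords forms the "figure." All parameters below (sample rate, chord duration, frequency pool, component counts) are illustrative assumptions, not the study's stimulus values.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 16000                      # sample rate in Hz (illustrative)
chord_len = int(fs * 0.05)      # 50-ms chords
t = np.arange(chord_len) / fs
freq_pool = np.linspace(300.0, 4000.0, 40)             # candidate pure-tone components
figure = rng.choice(freq_pool, size=4, replace=False)  # tones repeated across chords

chords = []
for i in range(20):
    ground = rng.choice(freq_pool, size=8, replace=False)           # fresh random "ground"
    freqs = ground if i < 10 else np.concatenate([ground, figure])  # "figure" enters mid-way
    chord = sum(np.sin(2 * np.pi * f * t) for f in freqs) / len(freqs)
    chords.append(chord)
stimulus = np.concatenate(chords)  # the repeated components pop out perceptually as the figure
```

Because each chord is normalized by its number of components, the waveform stays within [-1, 1] and can be written to a sound file directly.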

  3. Effect of post-training unilateral labyrinthectomy in a spatial orientation task by guinea pigs.

    Science.gov (United States)

    Chapuis, N; Krimm, M; de Waele, C; Vibert, N; Berthoz, A

    1992-11-15

    The effects of unilateral labyrinthectomy in guinea pigs were studied with an angular orientation task that consisted, in an open field, of running to a hidden goal oriented at 45 degrees with respect to the cephalocaudal axis of the animal placed in a starting box. The task was conducted in light but in a homogeneous environment, i.e. without visual, auditory or olfactory cues indicating the location of the goal. A second group of animals was submitted to a similar task of running to a hidden goal, but the place of the goal was indicated by a colored card. All the animals were trained before the lesion and tested on their respective task for 1 month after the lesion. In the task conducted without conspicuous cues, animals were dramatically disturbed. In contrast, animals pretrained in the visually guided task were not impaired after the lesion. These results point out the important role of vestibular information in performing spatial tasks based on angular estimation: even though proprioceptive and visuokinesthetic information remained available, subjects seemed unable to maintain a correct angular trajectory. Since trajectories were not disturbed in the visually guided task, the hypothesis that the deficit was purely motor can be excluded. PMID:1466778

  4. Gaze in Visual Search Is Guided More Efficiently by Positive Cues than by Negative Cues.

    Directory of Open Access Journals (Sweden)

    Günter Kugler

    Full Text Available Visual search can be accelerated when properties of the target are known. Such knowledge allows the searcher to direct attention to items sharing these properties. Recent work indicates that information about properties of non-targets (i.e., negative cues) can also guide search. In the present study, we examine whether negative cues lead to different search behavior compared to positive cues. We asked observers to search for a target defined by a certain shape singleton (broken line among solid lines). Each line was embedded in a colored disk. In "positive cue" blocks, participants were informed about possible colors of the target item. In "negative cue" blocks, participants were informed about colors of disks that could not contain the target. Search displays were designed such that with both the positive and negative cues, the same number of items could potentially contain the broken line ("relevant items"). Thus, both cues were equally informative. We measured response times and eye movements. Participants exhibited longer response times when provided with negative cues compared to positive cues. Although negative cues did guide the eyes to relevant items, there were marked differences in eye movements. Negative cues resulted in smaller proportions of fixations on relevant items, longer fixation durations, and higher rates of fixations per item compared to positive cues. The effectiveness of both cue types, as measured by fixations on relevant items, increased over the course of each search. In sum, a negative color cue can guide attention to relevant items, but it is less efficient than a positive cue of the same informational value.

  5. Children's recognition of emotions from vocal cues

    NARCIS (Netherlands)

    D.A. Sauter; C. Panattoni; F. Happé

    2013-01-01

    Emotional cues contain important information about the intentions and feelings of others. Despite a wealth of research into children's understanding of facial signals of emotions, little research has investigated the developmental trajectory of interpreting affective cues in the voice. In this study

  6. Stereotyping in Fear of Success Cues.

    Science.gov (United States)

    Juran, Shelley

    Prior studies suggest that sex-role stereotypes influence responses to Horner's fear of success cue. This study investigates stereotypes about both sex roles and achievement settings. One hundred sixty college males and females wrote stories to different cues, then rated the masculinity-femininity of their story characters. Both "John" and "Anne"…

  7. Different patterns of auditory cortex activation revealed by functional magnetic resonance imaging

    International Nuclear Information System (INIS)

    In the last few years, functional Magnetic Resonance Imaging (fMRI) has been widely accepted as an effective tool for mapping brain activities in both the sensorimotor and the cognitive fields. The present work aims to assess the possibility of using fMRI methods to study the cortical response to different acoustic stimuli. Furthermore, we refer to recent data collected at Frankfurt University on the cortical pattern of auditory hallucinations. Healthy subjects showed broad bilateral activation, mostly located in the transverse gyrus of Heschl. The analysis of the cortical activation induced by different stimuli has pointed out a remarkable difference in the spatial and temporal features of the auditory cortex response to pulsed tones and pure tones. The activated areas during episodes of auditory hallucinations match the location of primary auditory cortex as defined in control measurements with the same patients and in the experiments on healthy subjects. (authors)

  8. Guiding Attention by Cooperative Cues

    Institute of Scientific and Technical Information of China (English)

    KangWoo Lee

    2008-01-01

    A common assumption in visual attention is based on the rationale of "limited capacity of information processing". From this viewpoint there is little consideration of how different information channels or modules cooperate, because cells in processing stages are forced to compete for the limited resource. To examine the mechanism behind the cooperative behavior of information channels, a computational model of selective attention is implemented based on two hypotheses. Unlike the traditional view of visual attention, the cooperative behavior is assumed to be a dynamic integration process between bottom-up and top-down information. Furthermore, top-down information is assumed to provide a contextual cue during the selection process and to guide attentional allocation among many bottom-up candidates. The results from a series of simulations with still and video images showed some interesting properties that could not be explained by the competitive aspect of selective attention alone.

  9. The effects of verbal cueing on implicit hand maps.

    Science.gov (United States)

    Mattioni, Stefania; Longo, Matthew R

    2014-11-01

    The use of position sense to perceive the external spatial location of the body requires that immediate proprioceptive afferent signals be combined with stored representations of body size and shape. Longo and Haggard (2010) developed a method to isolate and measure this representation in which participants judge the location of several landmarks on their occluded hand. The relative location of judgements is used to construct a perceptual map of hand shape. Studies using this paradigm have revealed large, and highly stereotyped, distortions of the hand, which is represented as wider than it actually is and with shortened fingers. Previous studies using this paradigm have cued participants to respond by giving verbal labels of the knuckles and fingertips. A recent study has shown differential effects of verbal and tactile cueing of localisation judgements about bodily landmarks (Cardinali et al., 2011). The present study therefore investigated implicit hand maps measured through localisation judgements made in response to verbal labels and to tactile stimuli applied to the same landmarks. The characteristic set of distortions of hand size and shape was clearly apparent in both conditions, indicating that the distortions reported previously are not an artefact of the use of verbal cues. However, there were also differences in the magnitude of distortions between conditions, suggesting that the use of verbal cues may alter the representation of the body underlying position sense. PMID:25305592

  10. Spatial Language and Children’s Spatial Landmark Use

    Directory of Open Access Journals (Sweden)

    Amber A. Ankowski

    2012-01-01

    Full Text Available We examined how spatial language affected search behavior in a landmark spatial search task. In Experiment 1, two- to six-year-old children were trained to find a toy in the center of a square array of four identical landmarks. Children heard one of three spatial language cues once during the initial training trial (“here,” “in the middle,” “next to this one”). After search performance reached criterion, children received a probe test trial in which the landmark array was expanded. In Experiment 2, two- to four-year-old children participated in the search task and also completed a language comprehension task. Results revealed that children’s spatial language comprehension scores and spatial language cues heard during training trials were related to children’s performance in the search task.

  11. Music As a Sacred Cue? Effects of Religious Music on Moral Behavior

    Science.gov (United States)

    Lang, Martin; Mitkidis, Panagiotis; Kundt, Radek; Nichols, Aaron; Krajčíková, Lenka; Xygalatas, Dimitris

    2016-01-01

    Religion can have an important influence in moral decision-making, and religious reminders may deter people from unethical behavior. Previous research indicated that religious contexts may increase prosocial behavior and reduce cheating. However, the perceptual-behavioral link between religious contexts and decision-making lacks thorough scientific understanding. This study adds to the current literature by testing the effects of purely audial religious symbols (instrumental music) on moral behavior across three different sites: Mauritius, the Czech Republic, and the USA. Participants were exposed to one of three kinds of auditory stimuli (religious, secular, or white noise), and subsequently were given a chance to dishonestly report on solved mathematical equations in order to increase their monetary reward. The results showed cross-cultural differences in the effects of religious music on moral behavior, as well as a significant interaction between condition and religiosity across all sites, suggesting that religious participants were more influenced by the auditory religious stimuli than non-religious participants. We propose that religious music can function as a subtle cue associated with moral standards via cultural socialization and ritual participation. Such associative learning can charge music with specific meanings and create sacred cues that influence normative behavior. Our findings provide preliminary support for this view, which we hope further research will investigate more closely. PMID:27375515

  13. Contour identification with pitch and loudness cues using cochlear implants

    OpenAIRE

    Luo, Xin; Masterson, Megan E.; Wu, Ching-Chih

    2013-01-01

    Unlike in speech, pitch and loudness cues may or may not co-vary in music. Cochlear implant (CI) users with poor pitch perception may rely on loudness contour cues more than normal-hearing (NH) listeners do. Contour identification was tested in CI users and NH listeners; the five-note contours contained either pitch cues alone, loudness cues alone, or both. Results showed that NH listeners' contour identification was better with pitch cues than with loudness cues; CI users performed similarly with either cue. When pitch and loudness cues were co-varied, CI performance significantly improved, suggesting that CI users were able to integrate the two cues.

  14. Neural entrainment to rhythmically-presented auditory, visual and audio-visual speech in children

    Directory of Open Access Journals (Sweden)

    Alan James Power

    2012-07-01

    Full Text Available Auditory cortical oscillations have been proposed to play an important role in speech perception. It is suggested that the brain may take temporal ‘samples’ of information from the speech stream at different rates, phase-resetting ongoing oscillations so that they are aligned with similar frequency bands in the input (‘phase locking’). Information from these frequency bands is then bound together for speech perception. To date, there are no explorations of neural phase-locking and entrainment to speech input in children. However, it is clear from studies of language acquisition that infants use both visual speech information and auditory speech information in learning. In order to study neural entrainment to speech in typically-developing children, we use a rhythmic entrainment paradigm (underlying 2 Hz or delta rate) based on repetition of the syllable ba, presented in either the auditory modality alone, the visual modality alone, or as auditory-visual speech (via a talking head). To ensure attention to the task, children aged 13 years were asked to press a button as fast as possible when the ba stimulus violated the rhythm for each stream type. Rhythmic violation depended on delaying the occurrence of a ba in the isochronous stream. Neural entrainment was demonstrated for all stream types, and individual differences in standardized measures of language processing were related to auditory entrainment at the theta rate. Further, there was significant modulation of the preferred phase of auditory entrainment in the theta band when visual speech cues were present, indicating cross-modal phase resetting. The rhythmic entrainment paradigm developed here offers a method for exploring individual differences in oscillatory phase locking during development. In particular, a method for assessing neural entrainment and cross-modal phase resetting would be useful for exploring developmental learning difficulties thought to involve temporal sampling.
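Entrainment of the kind described in this abstract is commonly quantified as inter-trial phase coherence at the stimulation rate: the phase of each trial's response at 2 Hz is extracted, and the length of the mean unit phasor across trials measures phase consistency. A minimal numpy sketch, with all parameters (sampling rate, trial count, noise level) invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, dur, f_stim = 100, 5.0, 2.0    # sample rate (Hz), trial length (s), 2 Hz syllable rate
t = np.arange(int(fs * dur)) / fs

# Simulated EEG trials: a 2 Hz component with a consistent phase, buried in noise
trials = np.array([np.cos(2 * np.pi * f_stim * t + 0.3) + rng.normal(0, 1, t.size)
                   for _ in range(30)])

# Complex Fourier coefficient of each trial at the stimulus frequency
coef = trials @ np.exp(-2j * np.pi * f_stim * t)

# Inter-trial coherence: length of the mean unit phasor, in [0, 1];
# values near 1 indicate consistent phase alignment (entrainment)
itc = float(np.abs(np.mean(coef / np.abs(coef))))
```

With phase-locked trials as simulated here, `itc` comes out close to 1; randomizing the phase across trials would drive it toward 0.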

  15. Is the auditory evoked P2 response a biomarker of learning?

    Science.gov (United States)

    Tremblay, Kelly L; Ross, Bernhard; Inoue, Kayo; McClannahan, Katrina; Collet, Gregory

    2014-01-01

    Even though auditory training exercises for humans have been shown to improve certain perceptual skills of individuals with and without hearing loss, there is a lack of knowledge pertaining to which aspects of training are responsible for the perceptual gains, and which aspects of perception are changed. To better define how auditory training impacts brain and behavior, electroencephalography (EEG) and magnetoencephalography (MEG) have been used to determine the time course and coincidence of cortical modulations associated with different types of training. Here we focus on P1-N1-P2 auditory evoked responses (AEP), as there are consistent reports of gains in P2 amplitude following various types of auditory training experiences, including music and speech-sound training. The purpose of this experiment was to determine if the auditory evoked P2 response is a biomarker of learning. To do this, we taught native English speakers to identify a new pre-voiced temporal cue that is not used phonemically in the English language so that coinciding changes in evoked neural activity could be characterized. To differentiate possible effects of repeated stimulus exposure and a button-pushing task from learning itself, we examined modulations in brain activity in a group of participants who learned to identify the pre-voicing contrast and compared it to participants, matched in time, and stimulus exposure, that did not. The main finding was that the amplitude of the P2 auditory evoked response increased across repeated EEG sessions for all groups, regardless of any change in perceptual performance. What's more, these effects are retained for months. Changes in P2 amplitude were attributed to changes in neural activity associated with the acquisition process and not the learned outcome itself. A further finding was the expression of a late negativity (LN) wave 600-900 ms post-stimulus onset, post-training, exclusively for the group that learned to identify the pre-voiced contrast.

  16. Is the auditory evoked P2 response a biomarker of learning?

    Directory of Open Access Journals (Sweden)

    Kelly eTremblay

    2014-02-01

    Full Text Available Even though auditory training exercises for humans have been shown to improve certain perceptual skills of individuals with and without hearing loss, there is a lack of knowledge pertaining to which aspects of training are responsible for the perceptual gains, and which aspects of perception are changed. To better define how auditory training impacts brain and behavior, electroencephalography and magnetoencephalography have been used to determine the time course and coincidence of cortical modulations associated with different types of training. Here we focus on P1-N1-P2 auditory evoked responses (AEP), as there are consistent reports of gains in P2 amplitude following various types of auditory training experiences, including music and speech-sound training. The purpose of this experiment was to determine if the auditory evoked P2 response is a biomarker of learning. To do this, we taught native English speakers to identify a new pre-voiced temporal cue that is not used phonemically in the English language so that coinciding changes in evoked neural activity could be characterized. To differentiate possible effects of repeated stimulus exposure and a button-pushing task from learning itself, we examined modulations in brain activity in a group of participants who learned to identify the pre-voicing contrast and compared it to participants, matched in time, and stimulus exposure, that did not. The main finding was that the amplitude of the P2 auditory evoked response increased across repeated EEG sessions for all groups, regardless of any change in perceptual performance. What’s more, these effects were retained for months. Changes in P2 amplitude were attributed to changes in neural activity associated with the acquisition process and not the learned outcome itself. A further finding was the expression of a late negativity (LN) wave 600-900 ms post-stimulus onset, post-training, exclusively for the group that learned to identify the pre-voiced contrast.

  17. The perception of speech modulation cues in lexical tones is guided by early language-specific experience

    Directory of Open Access Journals (Sweden)

    Laurianne eCabrera

    2015-08-01

    Full Text Available A number of studies have shown that infants reorganize their perception of speech sounds according to their native language categories during their first year of life. Still, information is lacking about the contribution of basic auditory mechanisms to this process. This study aimed to evaluate when native language experience starts to noticeably affect the perceptual processing of basic acoustic cues (i.e., frequency-modulation (FM) and amplitude-modulation (AM) information) known to be crucial for speech perception in adults. The discrimination of a lexical-tone contrast (rising versus low) was assessed in 6- and 10-month-old infants learning either French or Mandarin using a visual habituation paradigm. The lexical tones were presented in two conditions designed either to keep intact or to severely degrade the FM and fine spectral cues needed to accurately perceive voice-pitch trajectory. A third condition was designed to assess the discrimination of the same voice-pitch trajectories using click trains containing only the FM cues related to the fundamental frequency (F0) in French- and Mandarin-learning 10-month-old infants. Results showed that the younger infants of both language groups and the Mandarin-learning 10-month-olds discriminated the intact lexical-tone contrast while French-learning 10-month-olds failed. However, only the French 10-month-olds discriminated degraded lexical tones when FM, and thus voice-pitch, cues were reduced. Moreover, Mandarin-learning 10-month-olds were found to discriminate the pitch trajectories as presented in click trains better than French infants. Altogether, these results reveal that the perceptual reorganization occurring during the first year of life for lexical tones is coupled with changes in the auditory ability to use speech modulation cues.

  18. Sequential assessment of prey through the use of multiple sensory cues by an eavesdropping bat

    Science.gov (United States)

    Page, Rachel A.; Schnelle, Tanja; Kalko, Elisabeth K. V.; Bunge, Thomas; Bernal, Ximena E.

    2012-06-01

    Predators are often confronted with a broad diversity of potential prey. They rely on cues associated with prey quality and palatability to optimize their hunting success and to avoid consuming toxic prey. Here, we investigate a predator's ability to assess prey cues during capture, handling, and consumption when confronted with conflicting information about prey quality. We used advertisement calls of a preferred prey item (the túngara frog) to attract fringe-lipped bats, Trachops cirrhosus, then offered palatable, poisonous, and chemically manipulated anurans as prey. Advertisement calls elicited an attack response, but as bats approached, they used additional sensory cues in a sequential manner to update their information about prey size and palatability. While both palatable and poisonous small anurans were readily captured, large poisonous toads were approached but not contacted, suggesting the use of echolocation for assessment of prey size at close range. Once prey was captured, bats used chemical cues to make final, post-capture decisions about whether to consume the prey. Bats dropped small, poisonous toads as well as palatable frogs coated in toad toxins either immediately or shortly after capture. Our study suggests that echolocation and chemical cues obtained at close range supplement information obtained from acoustic cues at long range. Updating information about prey quality minimizes the occurrence of costly errors and may be advantageous in tracking temporal and spatial fluctuations of prey and exploiting novel food sources. These findings emphasize the sequential, complex nature of prey assessment that may allow exploratory and flexible hunting behaviors.

  19. It Depends Who Is Watching You: 3-D Agent Cues Increase Fairness.

    Science.gov (United States)

    Krátký, Jan; McGraw, John J; Xygalatas, Dimitris; Mitkidis, Panagiotis; Reddish, Paul

    2016-01-01

    Laboratory and field studies have demonstrated that exposure to cues of intentional agents in the form of eyes can increase prosocial behavior. However, previous research mostly used 2-dimensional depictions as experimental stimuli. Thus far no study has examined the influence of the spatial properties of agency cues on this prosocial effect. To investigate the role of dimensionality of agency cues on fairness, 345 participants engaged in a decision-making task in a naturalistic setting. The experimental treatment included a 3-dimensional pseudo-realistic model of a human head and a 2-dimensional picture of the same object. The control stimuli consisted of a real plant and its 2-D image. Our results partly support the findings of previous studies that cues of intentional agents increase prosocial behavior. However, this effect was only found for the 3-D cues, suggesting that dimensionality is a critical variable in triggering these effects in real-world settings. Our research sheds light on a hitherto unexplored aspect of the effects of environmental cues and their morphological properties on decision-making. PMID:26859562

  20. It Depends Who Is Watching You: 3-D Agent Cues Increase Fairness.

    Directory of Open Access Journals (Sweden)

    Jan Krátký

    Full Text Available Laboratory and field studies have demonstrated that exposure to cues of intentional agents in the form of eyes can increase prosocial behavior. However, previous research mostly used 2-dimensional depictions as experimental stimuli. Thus far no study has examined the influence of the spatial properties of agency cues on this prosocial effect. To investigate the role of dimensionality of agency cues on fairness, 345 participants engaged in a decision-making task in a naturalistic setting. The experimental treatment included a 3-dimensional pseudo-realistic model of a human head and a 2-dimensional picture of the same object. The control stimuli consisted of a real plant and its 2-D image. Our results partly support the findings of previous studies that cues of intentional agents increase prosocial behavior. However, this effect was only found for the 3-D cues, suggesting that dimensionality is a critical variable in triggering these effects in real-world settings. Our research sheds light on a hitherto unexplored aspect of the effects of environmental cues and their morphological properties on decision-making.

  1. Action experience changes attention to kinematic cues

    Directory of Open Access Journals (Sweden)

    Courtney eFilippi

    2016-02-01

    Full Text Available The current study used remote corneal-reflection eye-tracking to examine the relationship between motor experience and action anticipation in 13-month-old infants. To measure online anticipation of actions, infants watched videos where the actor’s hand provided kinematic information (in its orientation) about the type of object that the actor was going to reach for. The actor’s hand orientation either matched the orientation of a rod (congruent cue) or did not match the orientation of the rod (incongruent cue). To examine relations between motor experience and action anticipation, we used a 2 (reach first vs. observe first) x 2 (congruent kinematic cue vs. incongruent kinematic cue) between-subjects design. We show that 13-month-old infants in the observe-first condition spontaneously generate rapid online visual predictions to congruent hand orientation cues and do not visually anticipate when presented with incongruent cues. We further demonstrate that the speed with which these infants generate predictions to congruent motor cues is correlated with their own ability to pre-shape their hands. Finally, we demonstrate that following reaching experience, infants generate rapid predictions to both congruent and incongruent hand shape cues, suggesting that short-term experience changes attention to kinematics.

  2. Neural Representation of Concurrent Vowels in Macaque Primary Auditory Cortex

    Science.gov (United States)

    Micheyl, Christophe; Steinschneider, Mitchell

    2016-01-01

    Abstract Successful speech perception in real-world environments requires that the auditory system segregate competing voices that overlap in frequency and time into separate streams. Vowels are major constituents of speech and are comprised of frequencies (harmonics) that are integer multiples of a common fundamental frequency (F0). The pitch and identity of a vowel are determined by its F0 and spectral envelope (formant structure), respectively. When two spectrally overlapping vowels differing in F0 are presented concurrently, they can be readily perceived as two separate “auditory objects” with pitches at their respective F0s. A difference in pitch between two simultaneous vowels provides a powerful cue for their segregation, which in turn, facilitates their individual identification. The neural mechanisms underlying the segregation of concurrent vowels based on pitch differences are poorly understood. Here, we examine neural population responses in macaque primary auditory cortex (A1) to single and double concurrent vowels (/a/ and /i/) that differ in F0 such that they are heard as two separate auditory objects with distinct pitches. We find that neural population responses in A1 can resolve, via a rate-place code, lower harmonics of both single and double concurrent vowels. Furthermore, we show that the formant structures, and hence the identities, of single vowels can be reliably recovered from the neural representation of double concurrent vowels. We conclude that A1 contains sufficient spectral information to enable concurrent vowel segregation and identification by downstream cortical areas.
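The rate-place argument in this abstract rests on the fact that two harmonic series with different F0s produce interleaved, individually resolvable low-order components. A small numpy sketch makes this concrete: two crude "vowels" (harmonic complexes) with different F0s are summed, and the lower harmonics of both remain separate spectral peaks. The F0s and harmonic counts below are invented for illustration, not the study's stimuli.

```python
import numpy as np

fs, dur = 16000, 0.5
t = np.arange(int(fs * dur)) / fs
f0_a, f0_b = 100.0, 126.0        # two F0s roughly four semitones apart (illustrative)

def harmonic_complex(f0, n_harm=10):
    """Sum of the first n_harm integer multiples of f0: a crude vowel-like tone."""
    return sum(np.sin(2 * np.pi * f0 * h * t) for h in range(1, n_harm + 1))

double_vowel = harmonic_complex(f0_a) + harmonic_complex(f0_b)
spec = np.abs(np.fft.rfft(double_vowel))
freqs = np.fft.rfftfreq(double_vowel.size, 1 / fs)

def has_peak(f, tol=2.0):
    """True if the spectrum near f clearly exceeds the average spectral level."""
    band = (freqs >= f - tol) & (freqs <= f + tol)
    return spec[band].max() > 10 * spec.mean()

# Lower harmonics of both sources show up as distinct, resolvable peaks
resolved = all(has_peak(f0 * h) for f0 in (f0_a, f0_b) for h in (1, 2, 3))
```

A place code that resolves these peaks, as the A1 population responses do, carries enough information to recover each vowel's F0 and spectral envelope downstream.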

  3. Auditory and visual distance perception: The proximity-image effect revisited

    Science.gov (United States)

    Zahorik, Pavel

    2003-04-01

    The proximity-image effect [M. B. Gardner, J. Acoust. Soc. Am. 43, 163 (1968)] describes a phenomenon in which the apparent distance of an auditory target is determined by the distance of the nearest plausible visual target rather than by acoustic distance cues. Here this effect is examined using a single visual target (an un-energized loudspeaker) and invisible virtual sound sources. These sources were synthesized from binaural impulse-response measurements at distances ranging from 1 to 5 m (0.25-m steps) in the semi-reverberant room (7.7 m×4.2 m×2.7 m) in which the experiment was conducted. Listeners (n=11) were asked whether or not the auditory target appeared to be at the same distance as the visual target. Within a block of trials, the visual target was placed at a fixed distance of 1.5, 3, or 4.5 m, and the auditory target varied randomly from trial to trial over the sample of measurement distances. The resulting psychometric functions are consistent with the proximity-image effect, and can be predicted using a simple model of sensory integration and decision in which perceived auditory space is both compressed in distance and has lower resolution than perceived visual space. [Work supported by NIH-NEI.]

  4. Auditory lateralisation deficits in neglect patients.

    Science.gov (United States)

    Guilbert, Alma; Clément, Sylvain; Senouci, Latifa; Pontzeele, Sylvain; Martin, Yves; Moroni, Christine

    2016-05-01

    Although visual deficits due to unilateral spatial neglect (USN) have been frequently described in the literature, fewer studies have examined directional hearing impairment in USN. The aim of this study was to explore sound lateralisation deficits in USN. Using a paradigm inspired by Tanaka et al. (1999), interaural time differences (ITD) were presented over headphones to give the illusion of a leftward or a rightward movement of sound. Participants were asked to respond "right" or "left" as soon as possible to indicate whether they heard the sound moving to the right or to the left side of the auditory space. We additionally adopted a single-case method to analyse the performance of 15 patients with right-hemisphere (RH) stroke and added two additional measures to assess sound lateralisation separately on the left side and on the right side. We included 15 patients with RH stroke (5 with a severe USN, 5 with a mild USN and 5 without USN) and 11 healthy age-matched participants. We expected to replicate findings of abnormal sound lateralisation in USN. However, although a sound lateralisation deficit was observed in USN, two different deficit profiles were identified. Namely, patients with a severe USN seemed to have left sound lateralisation impairment whereas patients with a mild USN seemed to be more influenced by a systematic bias in auditory representation with respect to the body meridian axis (egocentric deviation). This latter profile was unexpected as sounds were manipulated with ITD and, thus, would not be perceived as coming from a source external to the head. Future studies should use this paradigm in order to better understand these two distinct profiles. PMID:27018451
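
    The ITD manipulation this paradigm relies on can be sketched as follows (our illustration under textbook assumptions, not the authors' code; the head-radius value is a common average, and `itd_seconds` uses Woodworth's spherical-head approximation):

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m; assumed average head radius

def itd_seconds(azimuth_deg):
    """Woodworth's spherical-head approximation of the interaural
    time difference for a source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def apply_itd(signal, itd_samples):
    """Return (left, right) channels; delaying the right channel makes
    the sound lateralise toward the leading (left) ear."""
    pad = [0.0] * itd_samples
    return signal + pad, pad + signal

left, right = apply_itd([1.0, 0.5], 2)
print(left)   # -> [1.0, 0.5, 0.0, 0.0]
print(right)  # -> [0.0, 0.0, 1.0, 0.5]
```

    At a 44.1-kHz sampling rate, `itd_seconds(90)` (about 0.66 ms) corresponds to roughly 29 samples of delay; sweeping the delay smoothly from one sign to the other produces the leftward or rightward movement illusion described in the abstract.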

  5. Visual Cues Generated during Action Facilitate 14-Month-Old Infants' Mental Rotation

    Science.gov (United States)

    Antrilli, Nick K.; Wang, Su-hua

    2016-01-01

    Although action experience has been shown to enhance the development of spatial cognition, the mechanism underlying the effects of action is still unclear. The present research examined the role of visual cues generated during action in promoting infants' mental rotation. We sought to clarify the underlying mechanism by decoupling different…

  6. Effects of an auditory biofeedback device on plantar pressure in patients with chronic ankle instability.

    Science.gov (United States)

    Donovan, Luke; Feger, Mark A; Hart, Joseph M; Saliba, Susan; Park, Joseph; Hertel, Jay

    2016-02-01

    Chronic ankle instability (CAI) patients have been shown to have increased lateral column plantar pressure throughout the stance phase of gait. To date, traditional CAI rehabilitation programs have been unable to alter gait. We developed an in-shoe auditory biofeedback device that elicits an audible cue when an excessive amount of pressure is applied to a sensor. This study determined whether using this device could decrease lateral plantar pressure in participants with CAI and alter surface electromyography (sEMG) amplitudes (anterior tibialis, peroneus longus, medial gastrocnemius, and gluteus medius). Ten CAI patients completed baseline treadmill walking while in-shoe plantar pressures and sEMG were measured (baseline condition). Next, the device was placed into the shoe and set to a threshold that would elicit an audible cue during each step of the participant's normal gait. Then, participants were instructed to walk in a manner that would not trigger the audible cue, while plantar pressure and sEMG measures were recorded (auditory feedback (AUD FB) condition). Compared to baseline, there was a statistically significant reduction in peak pressure in the lateral midfoot-forefoot and central forefoot during the AUD FB condition. In addition, there were increases in peroneus longus and medial gastrocnemius sEMG amplitudes 200 ms post-initial contact during the AUD FB condition. The use of this auditory biofeedback device resulted in decreased plantar pressure in the lateral column of the foot during treadmill walking in CAI patients and may have been caused by the increase in sEMG activation of the peroneus longus. PMID:27004629
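
    The device's triggering logic, as described, reduces to a threshold test on the pressure sensor. A minimal sketch (function names, units, and values are our assumptions, not the device's firmware):

```python
def audible_cue(pressure_trace, threshold):
    """Return the sample indices at which the cue would sound,
    i.e. wherever sensor pressure exceeds the calibrated threshold."""
    return [i for i, p in enumerate(pressure_trace) if p > threshold]

# Toy stance-phase pressure trace (arbitrary units).
trace = [10, 45, 80, 120, 95, 60, 20]
print(audible_cue(trace, 100))  # -> [3]: only the peak triggers the cue
```

    Calibrating `threshold` so the cue fires on every step of normal gait, then asking the patient to walk silently, is what drives pressure off the lateral column in the protocol above.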

  7. Cues of maternal condition influence offspring selfishness.

    Directory of Open Access Journals (Sweden)

    Janine W Y Wong

    Full Text Available The evolution of parent-offspring communication was mostly studied from the perspective of parents responding to begging signals conveying information about offspring condition. Parents should respond to begging because of the differential fitness returns obtained from their investment in offspring that differ in condition. For analogous reasons, offspring should adjust their behavior to cues/signals of parental condition: parents that differ in condition pay differential costs of care and, hence, should provide different amounts of food. In this study, we experimentally tested in the European earwig (Forficula auricularia) if cues of maternal condition affect offspring behavior in terms of sibling cannibalism. We experimentally manipulated female condition by providing them with different amounts of food, kept nymph condition constant, allowed for nymph exposure to chemical maternal cues over extended time, quantified nymph survival (deaths being due to cannibalism) and extracted and analyzed the females' cuticular hydrocarbons (CHC). Nymph survival was significantly affected by chemical cues of maternal condition, and this effect depended on the timing of breeding. Cues of poor maternal condition enhanced nymph survival in early broods, but reduced nymph survival in late broods, and vice versa for cues of good condition. Furthermore, female condition affected the quantitative composition of their CHC profile which in turn predicted nymph survival patterns. Thus, earwig offspring are sensitive to chemical cues of maternal condition and nymphs from early and late broods show opposite reactions to the same chemical cues. Together with former evidence on maternal sensitivities to condition-dependent nymph chemical cues, our study shows context-dependent reciprocal information exchange about condition between earwig mothers and their offspring, potentially mediated by cuticular hydrocarbons.

  8. Cues of maternal condition influence offspring selfishness.

    Science.gov (United States)

    Wong, Janine W Y; Lucas, Christophe; Kölliker, Mathias

    2014-01-01

    The evolution of parent-offspring communication was mostly studied from the perspective of parents responding to begging signals conveying information about offspring condition. Parents should respond to begging because of the differential fitness returns obtained from their investment in offspring that differ in condition. For analogous reasons, offspring should adjust their behavior to cues/signals of parental condition: parents that differ in condition pay differential costs of care and, hence, should provide different amounts of food. In this study, we experimentally tested in the European earwig (Forficula auricularia) if cues of maternal condition affect offspring behavior in terms of sibling cannibalism. We experimentally manipulated female condition by providing them with different amounts of food, kept nymph condition constant, allowed for nymph exposure to chemical maternal cues over extended time, quantified nymph survival (deaths being due to cannibalism) and extracted and analyzed the females' cuticular hydrocarbons (CHC). Nymph survival was significantly affected by chemical cues of maternal condition, and this effect depended on the timing of breeding. Cues of poor maternal condition enhanced nymph survival in early broods, but reduced nymph survival in late broods, and vice versa for cues of good condition. Furthermore, female condition affected the quantitative composition of their CHC profile which in turn predicted nymph survival patterns. Thus, earwig offspring are sensitive to chemical cues of maternal condition and nymphs from early and late broods show opposite reactions to the same chemical cues. Together with former evidence on maternal sensitivities to condition-dependent nymph chemical cues, our study shows context-dependent reciprocal information exchange about condition between earwig mothers and their offspring, potentially mediated by cuticular hydrocarbons. PMID:24498046

  9. Adaptation in the auditory system: an overview

    Directory of Open Access Journals (Sweden)

    David Pérez-González

    2014-02-01

    Full Text Available The early stages of the auditory system need to preserve the timing information of sounds in order to extract the basic features of acoustic stimuli. At the same time, different processes of neuronal adaptation occur at several levels to further process the auditory information. For instance, auditory nerve fiber responses already experience adaptation of their firing rates, a type of response that can be found in many other auditory nuclei and may be useful for emphasizing the onset of the stimuli. However, it is at higher levels in the auditory hierarchy where more sophisticated types of neuronal processing take place. One example is stimulus-specific adaptation, in which neurons adapt to frequent, repetitive stimuli but maintain their responsiveness to stimuli with different physical characteristics, a distinct kind of processing that may play a role in change and deviance detection. In the auditory cortex, adaptation takes more elaborate forms, and contributes to the processing of complex sequences, auditory scene analysis and attention. Here we review the multiple types of adaptation that occur in the auditory system, which are part of the pool of resources that the neurons employ to process the auditory scene, and are critical to a proper understanding of the neuronal mechanisms that govern auditory perception.

  10. Conceptual priming for realistic auditory scenes and for auditory words.

    Science.gov (United States)

    Frey, Aline; Aramaki, Mitsuko; Besson, Mireille

    2014-02-01

    Two experiments were conducted using both behavioral and Event-Related brain Potentials methods to examine conceptual priming effects for realistic auditory scenes and for auditory words. Prime and target sounds were presented in four stimulus combinations: Sound-Sound, Word-Sound, Sound-Word and Word-Word. Within each combination, targets were conceptually related to the prime, unrelated or ambiguous. In Experiment 1, participants were asked to judge whether the primes and targets fit together (explicit task) and in Experiment 2 they had to decide whether the target was typical or ambiguous (implicit task). In both experiments and in the four stimulus combinations, reaction times and/or error rates were longer/higher and the N400 component was larger to ambiguous targets than to conceptually related targets, thereby pointing to a common conceptual system for processing auditory scenes and linguistic stimuli in both explicit and implicit tasks. However, fine-grained analyses also revealed some differences between experiments and conditions in scalp topography and duration of the priming effects possibly reflecting differences in the integration of perceptual and cognitive attributes of linguistic and nonlinguistic sounds. These results have clear implications for the building-up of virtual environments that need to convey meaning without words. PMID:24378910

  11. Self-generated auditory feedback as a cue to support rhythmic motor stability

    DEFF Research Database (Denmark)

    Krupenia, Stas S.; Hoffmann, Pablo F.; Zalmanov, Hagar;

    2011-01-01

    necessary to render the sounds of virtual balls hitting virtual hands within the juggling training simulator. First, we recorded sounds at the jugglers’ ears and found the sound of ball hitting hands to be audible. Second, we asked 24 jugglers to juggle under normal conditions (Audible) or while listening...

  12. Synchronization and leadership in string quartet performance: a case study of auditory and visual cues

    Directory of Open Access Journals (Sweden)

    Renee eTimmers

    2014-06-01

    Full Text Available Temporal coordination between members of a string quartet was investigated across repeated performances of an excerpt of Haydn’s string quartet in G Major, Op. 77 No. 1. Cross-correlations between interbeat intervals of performances at different lags showed a unidirectional dependence of Viola on Violin I, and of Violin I on Cello. Bidirectional dependence was observed for the relationships between Violin II and Cello and Violin II and Viola. Own-reported dependencies after the performances reflected these measured dependencies more closely than dependencies of players reported by the other players, which instead showed more typical leader-follower patterns in which Violin I leads. On the other hand, primary leadership from Violin I was observed in an analysis of the bow speed characteristics preceding the first tone onset. The anticipatory movement of Violin I set the tempo of the excerpt. Taken together the results show a more complex and differentiated pattern of dependencies than expected from a traditional role division of leadership suggesting several avenues for further research.
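
    The lagged cross-correlation analysis described above can be sketched as follows (an illustrative reconstruction, not the authors' code; the toy interbeat-interval data are invented). If player A's intervals at lag 1 predict player B's, B is following A:

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return float(np.corrcoef(x, y)[0, 1])

# Toy data: the follower copies the leader's interbeat interval one beat later.
leader = [0.50, 0.52, 0.48, 0.55, 0.50, 0.53, 0.47]
follower = [0.50] + leader[:-1]
print(round(lagged_corr(leader, follower, 1), 3))
```

    A strong positive correlation at positive lag only (and not at negative lag) indicates unidirectional dependence; significant correlations in both directions, as between Violin II and Cello above, indicate bidirectional coupling.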

  13. Blocking Spatial Navigation Across Environments That Have a Different Shape

    OpenAIRE

    Buckley, Matthew G.; Smith, Alastair D; Haselgrove, Mark

    2015-01-01

    According to the geometric module hypothesis, organisms encode a global representation of the space in which they navigate, and this representation is not prone to interference from other cues. A number of studies, however, have shown that both human and non-human animals can navigate on the basis of local geometric cues provided by the shape of an environment. According to the model of spatial learning proposed by Miller and Shettleworth (2007, 2008), geometric cues compete for associative s...

  14. Effect of dual sensory loss on auditory localization: implications for intervention.

    Science.gov (United States)

    Simon, Helen J; Levitt, Harry

    2007-12-01

    Our sensory systems are remarkable in several respects. They are extremely sensitive, they each perform more than one function, and they interact in a complementary way, thereby providing a high degree of redundancy that is particularly helpful should one or more sensory systems be impaired. In this article, the problem of dual hearing and vision loss is addressed. A brief description is provided on the use of auditory cues in vision loss, the use of visual cues in hearing loss, and the additional difficulties encountered when both sensory systems are impaired. A major focus of this article is the use of sound localization by normal hearing, hearing impaired, and blind individuals and the special problem of sound localization in people with dual sensory loss. PMID:18003869

  15. Psychophysiological responses to auditory change.

    Science.gov (United States)

    Chuen, Lorraine; Sears, David; McAdams, Stephen

    2016-06-01

    A comprehensive characterization of autonomic and somatic responding within the auditory domain is currently lacking. We studied whether simple types of auditory change that occur frequently during music listening could elicit measurable changes in heart rate, skin conductance, respiration rate, and facial motor activity. Participants heard a rhythmically isochronous sequence consisting of a repeated standard tone, followed by a repeated target tone that changed in pitch, timbre, duration, intensity, or tempo, or that deviated momentarily from rhythmic isochrony. Changes in all parameters produced increases in heart rate. Skin conductance response magnitude was affected by changes in timbre, intensity, and tempo. Respiratory rate was sensitive to deviations from isochrony. Our findings suggest that music researchers interpreting physiological responses as emotional indices should consider acoustic factors that may influence physiology in the absence of induced emotions. PMID:26927928

  16. Auditory distraction and serial memory

    OpenAIRE

    Jones, D M; Hughes, Rob; Macken, W.J.

    2010-01-01

    One mental activity that is very vulnerable to auditory distraction is serial recall. This review of the contemporary findings relating to serial recall charts the key determinants of distraction. It is evident that there is one form of distraction that is a joint product of the cognitive characteristics of the task and of the obligatory cognitive processing of the sound. For sequences of sound, distraction appears to be an ineluctable product of similarity-of-process, specifically, the seria...

  17. Research of Visual and Auditory Stimulation Based on Environment Reset on Hemi-spatial Neglect in Patients with Cerebral Apoplexy

    Institute of Scientific and Technical Information of China (English)

    韩宇花; 陶希; 邓景贵; 刘佳; 宋涛; 何娟

    2014-01-01

    Objective: To explore the effect of visual and auditory stimulation based on environment reset on hemi-spatial neglect (HSN) in patients with cerebral apoplexy. Methods: Forty-nine stroke patients with HSN admitted between March 2010 and September 2012 were randomly divided into an observation group (n=27) and a control group (n=22). Both groups received conventional therapy. No special requirements were placed on the ward or rehabilitation environment of the control group, whereas the ward beds and rehabilitation environment of the observation group were reset. The degree of HSN was assessed with line bisection (LB) and line cancellation (LC) tests, neurological deficit with the National Institutes of Health Stroke Scale (NIHSS), and activities of daily living (ADL) with the modified Barthel Index (MBI) before treatment and after 4 and 8 weeks of treatment. Results: At 4 and 8 weeks, LB, LC and NIHSS scores in both groups were lower than before treatment, and MBI scores were higher (P<0.05). At 8 weeks, LB and NIHSS scores in both groups, and LC scores in the observation group, were lower than at 4 weeks, and MBI scores were higher; LB and LC scores were lower, and MBI scores higher, in the observation group than in the control group (P<0.05). Conclusion: Visual and auditory stimulation based on environment reset benefits stroke patients with HSN and can improve ADL, but may have little effect on neurological deficit.

  18. Auditory learning: a developmental method.

    Science.gov (United States)

    Zhang, Yilu; Weng, Juyang; Hwang, Wey-Shiuan

    2005-05-01

    Motivated by the human autonomous development process from infancy to adulthood, we have built a robot that develops its cognitive and behavioral skills through real-time interactions with the environment. We call such a robot a developmental robot. In this paper, we present the theory and the architecture to implement a developmental robot and discuss the related techniques that address an array of challenging technical issues. As an application, experimental results on a real robot, self-organizing, autonomous, incremental learner (SAIL), are presented with emphasis on its audition perception and audition-related action generation. In particular, the SAIL robot conducts auditory learning from unsegmented and unlabeled speech streams without any prior knowledge about the auditory signals, such as the designated language or the phoneme models. Nor are the actions that the robot is expected to perform available before learning starts. SAIL learns the auditory commands and the desired actions from physical contacts with the environment including the trainers. PMID:15940990

  19. The Perceptual Cues that Reshape Expert Reasoning

    Science.gov (United States)

    Harré, Michael; Bossomaier, Terry; Snyder, Allan

    2012-07-01

    The earliest stages in our perception of the world have a subtle but powerful influence on later thought processes; they provide the contextual cues within which our thoughts are framed and they adapt to many different environments throughout our lives. Understanding the changes in these cues is crucial to understanding how our perceptual ability develops, but these changes are often difficult to quantify in sufficiently complex tasks where objective measures of development are available. Here we simulate perceptual learning using neural networks and demonstrate fundamental changes in these cues as a function of skill. These cues are cognitively grouped together to form perceptual templates that enable rapid 'whole scene' categorisation of complex stimuli. Such categories reduce the computational load on our capacity-limited thought processes, they inform our higher cognitive processes and they suggest a framework of perceptual pre-processing that captures the central role of perception in expertise.

  20. Auditory sequence analysis and phonological skill.

    Science.gov (United States)

    Grube, Manon; Kumar, Sukhbinder; Cooper, Freya E; Turton, Stuart; Griffiths, Timothy D

    2012-11-01

    This work tests the relationship between auditory and phonological skill in a non-selected cohort of 238 school students (age 11) with the specific hypothesis that sound-sequence analysis would be more relevant to phonological skill than the analysis of basic, single sounds. Auditory processing was assessed across the domains of pitch, time and timbre; a combination of six standard tests of literacy and language ability was used to assess phonological skill. A significant correlation between general auditory and phonological skill was demonstrated, plus a significant, specific correlation between measures of phonological skill and the auditory analysis of short sequences in pitch and time. The data support a limited but significant link between auditory and phonological ability with a specific role for sound-sequence analysis, and provide a possible new focus for auditory training strategies to aid language development in early adolescence. PMID:22951739

  1. Poor vigilance affects attentional orienting triggered by central uninformative gaze and arrow cues.

    Science.gov (United States)

    Marotta, Andrea; Martella, Diana; Maccari, Lisa; Sebastiani, Mara; Casagrande, Maria

    2014-11-01

    Behavioural and neuroimaging studies have shown that poor vigilance (PV) due to sleep deprivation (SD) negatively affects exogenously cued selective attention. In the current study, we assessed the impact of PV due to both partial SD and night-time hours on reflexive attentional orienting triggered by central un-informative eye-gaze and arrow cues. Subjective mood and interference performance in the emotional Stroop task were also investigated. Twenty healthy participants performed spatial cueing tasks using a central directional arrow or eye-gaze as a cue to orient attention. The target was a word written in different coloured inks. The participant's task was to identify the colour of the ink while ignoring the semantic content of the word (with negative or neutral emotional valence). The experiment took place on 2 days. On the first day, each participant performed a 10-min training session of the spatial cueing task. On the second day, half of the participants performed the task once at 4:30 p.m. (BSL) and once at 6:30 a.m. (PV), whereas the other half performed the task in the reversed order. Results showed that mean reaction times on the spatial cueing tasks were worsened by PV, although the gaze paradigm was more resistant to this effect than the arrow paradigm. Moreover, PV negatively affected attentional orienting triggered by both central un-informative gaze and arrow cues. Finally, prolonged wakefulness affected self-reported mood but did not influence interference control in the emotional Stroop task. PMID:24718933

  2. Early blindness results in a degraded auditory map of space in the optic tectum of the barn owl.

    OpenAIRE

    Knudsen, E I

    1988-01-01

    The optic tectum of the barn owl (Tyto alba) contains a neural map of auditory space consisting of neurons that are sharply tuned for sound source location and organized precisely according to their spatial tuning. The importance of vision for the development of this auditory map was investigated by comparing space maps measured in normal owls with those measured in owls raised with both eyelids sutured closed. The results demonstrate that owls raised without sight, but with normal hearing, d...

  3. Design guidelines for the use of audio cues in computer interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.; Blattner, M.M.; Joy, K.I.; Greenberg, R.M.

    1985-07-01

    A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby using our sense of "hearing" in our communication with the computer. This allows our visual and auditory capacities to work in unison, leading to a more effective and efficient interpretation of information received from the computer than by sight alone. In this paper we examine earcons, which are audio cues, used in the computer-user interface to provide information and feedback to the user about computer entities (these include messages and functions, as well as states and labels). The material in this paper is part of a larger study that recommends guidelines for the design and use of audio cues in the computer-user interface. The complete work examines the disciplines of music, psychology, communication theory, advertising, and psychoacoustics to discover how sound is utilized and analyzed in those areas. The resulting information is organized according to the theory of semiotics, the theory of signs, into the syntax, semantics, and pragmatics of communication by sound. Here we present design guidelines for the syntax of earcons. Earcons are constructed from motives, short sequences of notes with a specific rhythm and pitch, embellished by timbre, dynamics, and register. Compound earcons and family earcons are introduced. These are related motives that serve to identify a family of related cues. Examples of earcons are given.
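
    A motive in the sense described above can be sketched as a short data structure (all names and pitch values here are illustrative, not taken from the guidelines): a fixed note sequence whose family members are derived by varying register, dynamics, or timbre while the rhythm and contour stay recognizable.

```python
# A motive as (frequency in Hz, duration in beats) pairs.
ERROR_MOTIVE = [(440.0, 0.5), (330.0, 0.5), (220.0, 1.0)]  # falling contour

def transpose(motive, ratio):
    """Shift a motive to another register, preserving rhythm and contour."""
    return [(f * ratio, d) for f, d in motive]

# A family member: the same motive an octave higher, e.g. for warnings,
# so related cues share an audible "family resemblance".
WARNING_MOTIVE = transpose(ERROR_MOTIVE, 2.0)
print(WARNING_MOTIVE[0])  # -> (880.0, 0.5)
```

    Keeping rhythm and contour fixed while varying register is one way a listener can recognize two earcons as belonging to the same family of cues.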

  4. Development of Receiver Stimulator for Auditory Prosthesis

    OpenAIRE

    K. Raja Kumar; P. Seetha Ramaiah

    2010-01-01

    The Auditory Prosthesis (AP) is an electronic device that can provide hearing sensations to people who are profoundly deaf by stimulating the auditory nerve with an electric current via an array of electrodes, allowing them to understand speech. The AP system consists of two hardware functional units: a Body-Worn Speech Processor (BWSP) and a Receiver Stimulator. The prototype model of the Receiver Stimulator for Auditory Prosthesis (RSAP) consists of a Speech Data Decoder, DAC, ADC, constant...

  5. Auditory stimulation and cardiac autonomic regulation

    OpenAIRE

    Vitor E Valenti; Guida, Heraldo L.; Frizzo, Ana C F; Cardoso, Ana C. V.; Vanderlei, Luiz Carlos M; Luiz Carlos de Abreu

    2012-01-01

    Previous studies have already demonstrated that auditory stimulation with music influences the cardiovascular system. In this study, we described the relationship between musical auditory stimulation and heart rate variability. Searches were performed with the Medline, SciELO, Lilacs and Cochrane databases using the following keywords: "auditory stimulation", "autonomic nervous system", "music" and "heart rate variability". The selected studies indicated that there is a strong correlation bet...

  6. Perception of health from facial cues.

    Science.gov (United States)

    Henderson, Audrey J; Holzleitner, Iris J; Talamas, Sean N; Perrett, David I

    2016-05-01

    Impressions of health are integral to social interactions, yet poorly understood. A review of the literature reveals multiple facial characteristics that potentially act as cues to health judgements. The cues vary in their stability across time: structural shape cues including symmetry and sexual dimorphism alter slowly across the lifespan and have been found to have weak links to actual health, but show inconsistent effects on perceived health. Facial adiposity changes over a medium time course and is associated with both perceived and actual health. Skin colour alters over a short time and has strong effects on perceived health, yet links to health outcomes have barely been evaluated. Reviewing suggested an additional influence of demeanour as a perceptual cue to health. We, therefore, investigated the association of health judgements with multiple facial cues measured objectively from two-dimensional and three-dimensional facial images. We found evidence for independent contributions of face shape and skin colour cues to perceived health. Our empirical findings: (i) reinforce the role of skin yellowness; (ii) demonstrate the utility of global face shape measures of adiposity; and (iii) emphasize the role of affect in facial images with nominally neutral expression in impressions of health. PMID:27069057

  7. Effects of Methylphenidate (Ritalin) on Auditory Performance in Children with Attention and Auditory Processing Disorders.

    Science.gov (United States)

    Tillery, Kim L.; Katz, Jack; Keller, Warren D.

    2000-01-01

    A double-blind, placebo-controlled study examined effects of methylphenidate (Ritalin) on auditory processing in 32 children with both attention deficit hyperactivity disorder and central auditory processing (CAP) disorder. Analyses revealed that Ritalin did not have a significant effect on any of the central auditory processing measures, although…

  8. Seeing the song: left auditory structures may track auditory-visual dynamic alignment.

    Directory of Open Access Journals (Sweden)

    Julia A Mossbridge

    Full Text Available Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results appear to suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment.

  9. Seeing the song: left auditory structures may track auditory-visual dynamic alignment.

    Science.gov (United States)

    Mossbridge, Julia A; Grabowecky, Marcia; Suzuki, Satoru

    2013-01-01

    Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results appear to suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment. PMID:24194873

  10. Direct and Indirect Cues to Knowledge States during Word Learning

    Science.gov (United States)

    Saylor, Megan M.; Carroll, C. Brooke

    2009-01-01

    The present study investigated three-year-olds' sensitivity to direct and indirect cues to others' knowledge states for word learning purposes. Children were given either direct, physical cues to knowledge or indirect, verbal cues to knowledge. Preschoolers revealed a better ability to learn words from a speaker following direct, physical cues to…

  11. Corticofugal modulation of peripheral auditory responses

    Directory of Open Access Journals (Sweden)

    Paul Hinckley Delano

    2015-09-01

Full Text Available The auditory efferent system originates in the auditory cortex and projects to the medial geniculate body, inferior colliculus, cochlear nucleus and superior olivary complex, reaching the cochlea through olivocochlear fibers. This unique neuronal network is organized in several afferent-efferent feedback loops including: the (i) colliculo-thalamic-cortico-collicular, (ii) cortico-(collicular)-olivocochlear and (iii) cortico-(collicular)-cochlear nucleus pathways. Recent experiments demonstrate that blocking ongoing auditory-cortex activity with pharmacological and physical methods modulates the amplitude of cochlear potentials. In addition, auditory-cortex microstimulation independently modulates cochlear sensitivity and the strength of the olivocochlear reflex. In this mini-review, anatomical and physiological evidence supporting the presence of a functional efferent network from the auditory cortex to the cochlear receptor is presented. Special emphasis is given to the corticofugal effects on initial auditory processing, that is, on cochlear nucleus, auditory nerve and cochlear responses. A working model of three parallel pathways from the auditory cortex to the cochlea and auditory nerve is proposed.

  12. Corticofugal modulation of peripheral auditory responses.

    Science.gov (United States)

    Terreros, Gonzalo; Delano, Paul H

    2015-01-01

    The auditory efferent system originates in the auditory cortex and projects to the medial geniculate body (MGB), inferior colliculus (IC), cochlear nucleus (CN) and superior olivary complex (SOC) reaching the cochlea through olivocochlear (OC) fibers. This unique neuronal network is organized in several afferent-efferent feedback loops including: the (i) colliculo-thalamic-cortico-collicular; (ii) cortico-(collicular)-OC; and (iii) cortico-(collicular)-CN pathways. Recent experiments demonstrate that blocking ongoing auditory-cortex activity with pharmacological and physical methods modulates the amplitude of cochlear potentials. In addition, auditory-cortex microstimulation independently modulates cochlear sensitivity and the strength of the OC reflex. In this mini-review, anatomical and physiological evidence supporting the presence of a functional efferent network from the auditory cortex to the cochlear receptor is presented. Special emphasis is given to the corticofugal effects on initial auditory processing, that is, on CN, auditory nerve and cochlear responses. A working model of three parallel pathways from the auditory cortex to the cochlea and auditory nerve is proposed. PMID:26483647

  13. Rapid context-based identification of target sounds in an auditory scene

    Science.gov (United States)

    Gamble, Marissa L.; Woldorff, Marty G.

    2015-01-01

    To make sense of our dynamic and complex auditory environment, we must be able to parse the sensory input into usable parts and pick out relevant sounds from all the potentially distracting auditory information. While it is unclear exactly how we accomplish this difficult task, Gamble and Woldorff (2014) recently reported an ERP study of an auditory target-search task in a temporally and spatially distributed, rapidly presented, auditory scene. They reported an early, differential, bilateral activation (beginning ~60 ms) between feature-deviating Target stimuli and physically equivalent feature-deviating Nontargets, reflecting a rapid Target-detection process. This was followed shortly later (~130 ms) by the lateralized N2ac ERP activation, reflecting the focusing of auditory spatial attention toward the Target sound and paralleling attentional-shifting processes widely studied in vision. Here we directly examined the early, bilateral, Target-selective effect to better understand its nature and functional role. Participants listened to midline-presented sounds that included Target and Nontarget stimuli that were randomly either embedded in a brief rapid stream or presented alone. The results indicate that this early bilateral effect results from a template for the Target that utilizes its feature deviancy within a stream to enable rapid identification. Moreover, individual-differences analysis showed that the size of this effect was larger for subjects with faster response times. The findings support the hypothesis that our auditory attentional systems can implement and utilize a context-based relational template for a Target sound, making use of additional auditory information in the environment when needing to rapidly detect a relevant sound. PMID:25848684

  14. Auditory Brainstem Response to Complex Sounds Predicts Self-Reported Speech-in-Noise Performance

    Science.gov (United States)

    Anderson, Samira; Parbery-Clark, Alexandra; White-Schwoch, Travis; Kraus, Nina

    2013-01-01

    Purpose: To compare the ability of the auditory brainstem response to complex sounds (cABR) to predict subjective ratings of speech understanding in noise on the Speech, Spatial, and Qualities of Hearing Scale (SSQ; Gatehouse & Noble, 2004) relative to the predictive ability of the Quick Speech-in-Noise test (QuickSIN; Killion, Niquette,…

  15. Local field potential correlates of auditory working memory in primate dorsal temporal pole.

    Science.gov (United States)

    Bigelow, James; Ng, Chi-Wing; Poremba, Amy

    2016-06-01

    Dorsal temporal pole (dTP) is a cortical region at the rostral end of the superior temporal gyrus that forms part of the ventral auditory object processing pathway. Anatomical connections with frontal and medial temporal areas, as well as a recent single-unit recording study, suggest this area may be an important part of the network underlying auditory working memory (WM). To further elucidate the role of dTP in auditory WM, local field potentials (LFPs) were recorded from the left dTP region of two rhesus macaques during an auditory delayed matching-to-sample (DMS) task. Sample and test sounds were separated by a 5-s retention interval, and a behavioral response was required only if the sounds were identical (match trials). Sensitivity of auditory evoked responses in dTP to behavioral significance and context was further tested by passively presenting the sounds used as auditory WM memoranda both before and after the DMS task. Average evoked potentials (AEPs) for all cue types and phases of the experiment comprised two small-amplitude early onset components (N20, P40), followed by two broad, large-amplitude components occupying the remainder of the stimulus period (N120, P300), after which a final set of components were observed following stimulus offset (N80OFF, P170OFF). During the DMS task, the peak amplitude and/or latency of several of these components depended on whether the sound was presented as the sample or test, and whether the test matched the sample. Significant differences were also observed among the DMS task and passive exposure conditions. Comparing memory-related effects in the LFP signal with those obtained in the spiking data raises the possibility some memory-related activity in dTP may be locally produced and actively generated. The results highlight the involvement of dTP in auditory stimulus identification and recognition and its sensitivity to the behavioral significance of sounds in different contexts. This article is part of a Special

  16. Functional neuroanatomy of spatial sound processing in Alzheimer's disease.

    Science.gov (United States)

    Golden, Hannah L; Agustus, Jennifer L; Nicholas, Jennifer M; Schott, Jonathan M; Crutch, Sebastian J; Mancini, Laura; Warren, Jason D

    2016-03-01

    Deficits of auditory scene analysis accompany Alzheimer's disease (AD). However, the functional neuroanatomy of spatial sound processing has not been defined in AD. We addressed this using a "sparse" fMRI virtual auditory spatial paradigm in 14 patients with typical AD in relation to 16 healthy age-matched individuals. Sound stimulus sequences discretely varied perceived spatial location and pitch of the sound source in a factorial design. AD was associated with loss of differentiated cortical profiles of auditory location and pitch processing at the prescribed threshold, and significant group differences were identified for processing auditory spatial variation in posterior cingulate cortex (controls > AD) and the interaction of pitch and spatial variation in posterior insula (AD > controls). These findings build on emerging evidence for altered brain mechanisms of auditory scene analysis and suggest complex dysfunction of network hubs governing the interface of internal milieu and external environment in AD. Auditory spatial processing may be a sensitive probe of this interface and contribute to characterization of brain network failure in AD and other neurodegenerative syndromes. PMID:26923412

  17. Stimulation of the human auditory nerve with optical radiation

    Science.gov (United States)

    Fishman, Andrew; Winkler, Piotr; Mierzwinski, Jozef; Beuth, Wojciech; Izzo Matic, Agnella; Siedlecki, Zygmunt; Teudt, Ingo; Maier, Hannes; Richter, Claus-Peter

    2009-02-01

A novel, spatially selective method to stimulate cranial nerves has been proposed: contact-free stimulation with optical radiation, using a pulsed infrared laser as the radiation source. This case report is the first to show that optical stimulation of the auditory nerve is possible in humans. The ethical approach to conducting any measurements or tests in humans requires efficacy and safety studies in animals, which have been conducted in gerbils. This report represents the first step in a translational research project to initiate a paradigm shift in neural interfaces. A patient was selected who required surgical removal of a large meningioma angiomatum (WHO grade I) by a planned transcochlear approach. Prior to cochlear ablation by drilling and subsequent tumor resection, the cochlear nerve was stimulated with a pulsed infrared laser at low radiation energies. Stimulation with optical radiation evoked compound action potentials from the human auditory nerve. Stimulation of the auditory nerve with infrared laser pulses is thus possible in the human inner ear. The finding is an important step in translating results from animal experiments to humans and furthers the development of a novel interface that uses optical radiation to stimulate neurons. Additional measurements are required to optimize the stimulation parameters.

  18. Effects of Verbal Cues versus Pictorial Cues on the Transfer of Stimulus Control for Children with Autism

    Science.gov (United States)

    West, Elizabeth Anne

    2008-01-01

    The author examined the transfer of stimulus control from instructor assistance to verbal cues and pictorial cues. The intent was to determine whether it is easier to transfer stimulus control to one form of cue or the other. No studies have conducted such comparisons to date; however, literature exists to suggest that visual cues may be…

  19. Preschoolers' Learning of Brand Names from Visual Cues.

    OpenAIRE

    Macklin, M Carole

    1996-01-01

    This research addresses the question of how perceptual cues affect preschoolers' learning of brand names. It is found that when visual cues are provided in addition to brand names that are prior-associated in children's memory structures, children better remember the brand names. Although two cues (a picture and a color) improve memory over the imposition of a single cue, extensive visual cues may overtax young children's processing abilities. The study contributes to our understanding of how...

  20. Auditory hallucinations suppressed by etizolam in a patient with schizophrenia.

    Science.gov (United States)

    Benazzi, F; Mazzoli, M; Rossi, E

    1993-10-01

    A patient presented with a 15 year history of schizophrenia with auditory hallucinations. Though unresponsive to prolonged trials of neuroleptics, the auditory hallucinations disappeared with etizolam. PMID:7902201

  1. Temporal visual cues aid speech recognition

    DEFF Research Database (Denmark)

    Zhou, Xiang; Ross, Lars; Lehn-Schiøler, Tue;

    2006-01-01

BACKGROUND: It is well known that under noisy conditions, viewing a speaker's articulatory movement aids the recognition of spoken words. Conventionally it is thought that the visual input disambiguates otherwise confusing auditory input. HYPOTHESIS: In contrast, we hypothesize that it is the temporal synchronicity of the visual input that aids parsing of the auditory stream. More specifically, we expected that purely temporal information, which does not convey information such as place of articulation, may facilitate word recognition. METHODS: To test this prediction we used temporal features of audio to generate an artificial talking-face video and measured word recognition performance on simple monosyllabic words. RESULTS: When presenting words together with the artificial video we find that word recognition is improved over purely auditory presentation. The effect is significant (p...

  2. Auditory Association Cortex Lesions Impair Auditory Short-Term Memory in Monkeys

    Science.gov (United States)

    Colombo, Michael; D'Amato, Michael R.; Rodman, Hillary R.; Gross, Charles G.

    1990-01-01

    Monkeys that were trained to perform auditory and visual short-term memory tasks (delayed matching-to-sample) received lesions of the auditory association cortex in the superior temporal gyrus. Although visual memory was completely unaffected by the lesions, auditory memory was severely impaired. Despite this impairment, all monkeys could discriminate sounds closer in frequency than those used in the auditory memory task. This result suggests that the superior temporal cortex plays a role in auditory processing and retention similar to the role the inferior temporal cortex plays in visual processing and retention.

  3. Solving small spaces: investigating the use of landmark cues in brown capuchins (Cebus apella).

    Science.gov (United States)

    Hughes, Kelly D; Mullo, Enma; Santos, Laurie R

    2013-09-01

Some researchers have recently argued that humans may be unusual among primates in preferring to use landmark information when reasoning about some kinds of spatial problems. Some have explained this phenomenon by positing that our species' tendency to prefer landmarks stems from a human-unique trait: language. Here, we test this hypothesis, that preferring to use landmarks to solve such tasks is related to language ability, by exploring landmark use in a spatial task in one non-human primate, the brown capuchin monkey (Cebus apella). We presented our subjects with the rotational displacement task, in which subjects attempt to relocate a reward hidden within an array of hiding locations that is subsequently rotated to a new position. Over several experiments, we varied the availability and the salience of a landmark cue within the array. Specifically, we varied (1) visual access to the array during rotation, (2) the type of landmark, (3) the consistency of the landmark qualities, and (4) the amount of exposure to the landmark. Across Experiments 1 through 4, capuchins did not successfully use landmark cues, suggesting that non-linguistic primates may not spontaneously use landmarks to solve some spatial problems, as in the case of this small-scale dynamic spatial task. Importantly, we also observed that capuchins demonstrated some capacity to learn to use landmarks in Experiment 4, suggesting that non-linguistic creatures may be able to use some landmark cues in similar spatial tasks with extensive training. PMID:23430144

  4. The capture and recreation of 3D auditory scenes

    Science.gov (United States)

    Li, Zhiyun

The main goal of this research is to develop the theory and implement practical tools (in both software and hardware) for the capture and recreation of 3D auditory scenes. Our research is expected to have applications in virtual reality, telepresence, film, music, video games, auditory user interfaces, and sound-based surveillance. The first part of our research is concerned with sound capture via a spherical microphone array. The advantage of this array is that it can be steered digitally toward any 3D direction with the same beampattern. We develop design methodologies to achieve flexible microphone layouts, optimal beampattern approximation, and robustness constraints. We also design novel hemispherical and circular microphone array layouts for more spatially constrained auditory scenes. Using the captured audio, we then propose a unified and simple approach for recreating the scene by exploiting the reciprocity principle that holds between the two processes. Our approach makes the system easy to build and practical. Using this approach, we can capture the 3D sound field with a spherical microphone array and recreate it using a spherical loudspeaker array, ensuring that the recreated sound field matches the recorded field up to a high order of spherical harmonics. For some regular or semi-regular microphone layouts, we design an efficient parallel implementation of the multi-directional spherical beamformer by using the rotational symmetries of the beampattern and of the spherical microphone array. This can be implemented in either software or hardware and easily adapted for other regular or semi-regular layouts of microphones. In addition, we extend this approach to a headphone-based system. Design examples and simulation results are presented to verify our algorithms. Prototypes were built and tested in real-world auditory scenes.
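The capture stage described above rests on steering a microphone array toward arbitrary directions. A minimal delay-and-sum sketch of that idea follows; the geometry, sample rate, and all function names are illustrative assumptions for this sketch, not the layouts or algorithms from the dissertation:

```python
import numpy as np

def sphere_points(n, radius=0.05):
    """Quasi-uniform microphone positions on a sphere (Fibonacci spiral)."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i           # golden-angle increments
    z = 1.0 - 2.0 * (i + 0.5) / n
    r = np.sqrt(1.0 - z ** 2)
    return radius * np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def steering_delays(mics, direction, c=343.0):
    """Per-microphone arrival delays (s) for a far-field plane wave."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    return mics @ d / c                               # projection onto arrival axis

def delay_and_sum(signals, delays, fs):
    """Align channels by their steering delays (nearest sample) and average."""
    shifts = np.round(delays * fs).astype(int)
    out = np.zeros(signals.shape[1])
    for ch, s in zip(signals, shifts):
        out += np.roll(ch, -s)
    return out / len(signals)

fs, f0 = 48_000, 1_000.0
mics = sphere_points(32)                              # 32 mics, 5 cm radius (assumed)
look = np.array([1.0, 0.0, 0.0])

# Simulate a 1 kHz plane wave arriving from the look direction.
t = np.arange(2048) / fs
delays = steering_delays(mics, look)
signals = np.stack([np.sin(2 * np.pi * f0 * (t - d)) for d in delays])

# Steering at the true direction re-aligns the channels coherently,
# so the averaged output keeps nearly the full unit amplitude.
beam = delay_and_sum(signals, delays, fs)
print(np.max(np.abs(beam[256:-256])))
```

A real implementation would use fractional-delay filters or spherical-harmonic (eigenbeam) weights rather than nearest-sample shifts; the harmonic formulation is what gives the same beampattern in every look direction.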

  5. Effect of Auditory Constraints on Motor Learning Depends on Stage of Recovery Post Stroke

    Directory of Open Access Journals (Sweden)

    Viswanath eAluru

    2014-06-01

Full Text Available In order to develop evidence-based rehabilitation protocols post stroke, one must first reconcile the vast heterogeneity in the post-stroke population and develop protocols to facilitate motor learning in the various subgroups. The main purpose of this study is to show that auditory constraints interact with the stage of recovery post stroke to influence motor learning. We characterized the stages of upper limb recovery using task-based kinematic measures in twenty subjects with chronic hemiparesis, and used a bimanual wrist extension task using a custom-made wrist trainer to facilitate learning of wrist extension in the paretic hand under four auditory conditions: (1) without auditory cueing; (2) to non-musical happy sounds; (3) to self-selected music; and (4) to a metronome beat set at a comfortable tempo. Two bimanual trials (15 s each) were followed by one unimanual trial with the paretic hand over six cycles under each condition. Clinical metrics, wrist and arm kinematics and electromyographic activity were recorded. Hierarchical cluster analysis with the Mahalanobis metric based on baseline speed and extent of wrist movement stratified subjects into three distinct groups which reflected their stage of recovery: spastic paresis, spastic co-contraction, and minimal paresis. In spastic paresis, the metronome beat increased wrist extension, but also increased muscle co-activation across the wrist. In contrast, in spastic co-contraction, no auditory stimulation increased wrist extension and reduced co-activation. In minimal paresis, wrist extension did not improve under any condition. The results suggest that auditory task constraints interact with stage of recovery during motor learning after stroke, perhaps due to recruitment of distinct neural substrates over the course of recovery. The findings advance our understanding of the mechanisms of progression of motor recovery and lay the foundation for personalized treatment algorithms post stroke.

  6. Segregation and integration of auditory streams when listening to multi-part music.

    Directory of Open Access Journals (Sweden)

    Marie Ragert

Full Text Available In our daily lives, auditory stream segregation allows us to differentiate concurrent sound sources and to make sense of the scene we are experiencing. However, a combination of segregation and the concurrent integration of auditory streams is necessary in order to analyze the relationship between streams and thus perceive a coherent auditory scene. The present functional magnetic resonance imaging study investigates the relative role and neural underpinnings of these listening strategies in multi-part musical stimuli. We compare a real human performance of a piano duet and a synthetic stimulus of the same duet in a prioritized integrative attention paradigm that required the simultaneous segregation and integration of auditory streams. In so doing, we manipulate the degree to which the attended part of the duet led either structurally (attend melody vs. attend accompaniment) or temporally (asynchronies vs. no asynchronies between parts), and thus the relative contributions of integration and segregation used to make an assessment of the leader-follower relationship. We show that perceptually the relationship between parts is biased towards the conventional structural hierarchy in western music, in which the melody generally dominates (leads) the accompaniment. Moreover, the assessment varies as a function of both cognitive load, as shown through difficulty ratings, and the interaction of the temporal and the structural relationship factors. Neurally, we see that the temporal relationship between parts, as one important cue for stream segregation, revealed distinct neural activity in the planum temporale. By contrast, integration used when listening to both the temporally separated performance stimulus and the temporally fused synthetic stimulus resulted in activation of the intraparietal sulcus. These results support the hypothesis that the planum temporale and IPS are key structures underlying the mechanisms of segregation and integration of

  7. Effect of auditory constraints on motor performance depends on stage of recovery post-stroke.

    Science.gov (United States)

    Aluru, Viswanath; Lu, Ying; Leung, Alan; Verghese, Joe; Raghavan, Preeti

    2014-01-01

    In order to develop evidence-based rehabilitation protocols post-stroke, one must first reconcile the vast heterogeneity in the post-stroke population and develop protocols to facilitate motor learning in the various subgroups. The main purpose of this study is to show that auditory constraints interact with the stage of recovery post-stroke to influence motor learning. We characterized the stages of upper limb recovery using task-based kinematic measures in 20 subjects with chronic hemiparesis. We used a bimanual wrist extension task, performed with a custom-made wrist trainer, to facilitate learning of wrist extension in the paretic hand under four auditory conditions: (1) without auditory cueing; (2) to non-musical happy sounds; (3) to self-selected music; and (4) to a metronome beat set at a comfortable tempo. Two bimanual trials (15 s each) were followed by one unimanual trial with the paretic hand over six cycles under each condition. Clinical metrics, wrist and arm kinematics, and electromyographic activity were recorded. Hierarchical cluster analysis with the Mahalanobis metric based on baseline speed and extent of wrist movement stratified subjects into three distinct groups, which reflected their stage of recovery: spastic paresis, spastic co-contraction, and minimal paresis. In spastic paresis, the metronome beat increased wrist extension, but also increased muscle co-activation across the wrist. In contrast, in spastic co-contraction, no auditory stimulation increased wrist extension and reduced co-activation. In minimal paresis, wrist extension did not improve under any condition. The results suggest that auditory task constraints interact with stage of recovery during motor learning after stroke, perhaps due to recruitment of distinct neural substrates over the course of recovery. The findings advance our understanding of the mechanisms of progression of motor recovery and lay the foundation for personalized treatment algorithms post-stroke. PMID
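The stratification step above, hierarchical clustering on two baseline kinematic features under the Mahalanobis metric, can be sketched as follows. The feature values and group centers are synthetic stand-ins chosen for illustration, not the study's measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Synthetic (baseline speed, movement extent) pairs for 20 subjects,
# drawn around three assumed group centers.
features = np.vstack([
    rng.normal([0.2, 0.8], 0.05, size=(7, 2)),   # e.g., spastic paresis
    rng.normal([0.5, 0.3], 0.05, size=(7, 2)),   # e.g., spastic co-contraction
    rng.normal([0.9, 0.9], 0.05, size=(6, 2)),   # e.g., minimal paresis
])

# The Mahalanobis metric scales distances by the pooled inverse covariance,
# so the two features contribute comparably despite different units/spreads.
VI = np.linalg.inv(np.cov(features.T))
dists = pdist(features, metric="mahalanobis", VI=VI)
tree = linkage(dists, method="average")
groups = fcluster(tree, t=3, criterion="maxclust")
print(np.bincount(groups)[1:])   # subjects per recovered cluster
```

The recovered clusters would then serve as the unit of analysis for the four auditory conditions; with real data one would also inspect the covariance estimate before inverting it.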

  8. A comparative study of simple auditory reaction time in blind (congenitally and sighted subjects

    Directory of Open Access Journals (Sweden)

    Pritesh Hariprasad Gandhi

    2013-01-01

Full Text Available Background: Reaction time is the time interval between the application of a stimulus and the appearance of an appropriate voluntary response by a subject. It involves stimulus processing, decision making, and response programming. Reaction time studies have been popular due to their implications in sports physiology. Reaction time has been widely studied as its practical implications may be of great consequence; e.g., a slower than normal reaction time while driving can have grave results. Objective: To study simple auditory reaction time in congenitally blind subjects and in age- and sex-matched sighted subjects, and to compare simple auditory reaction time between congenitally blind subjects and healthy control subjects. Materials and Methods: The study was carried out in two groups: the first of 50 congenitally blind subjects and the second of 50 healthy controls. It was carried out on the Multiple Choice Reaction Time Apparatus, Inco Ambala Ltd. (accuracy ±0.001 s), with subjects in a sitting position, at Government Medical College and Hospital, Bhavnagar and at a Blind School, PNR campus, Bhavnagar, Gujarat, India. Observations/Results: Simple auditory reaction time in response to four different types of sound (horn, bell, ring, and whistle) was recorded in both groups. According to our study, there is no significant difference in reaction time between congenitally blind and normal healthy persons. Conclusion: Blind individuals commonly utilize tactual and auditory cues for information and orientation, and their reliance on touch and audition, together with more practice in using these modalities to guide behavior, is often reflected in better performance of blind relative to sighted participants in tactile or auditory discrimination tasks; however, there is no difference in reaction time between congenitally blind and sighted people.

  9. The Auditory-Visual Speech Benefit on Working Memory in Older Adults with Hearing Impairment.

    Science.gov (United States)

    Frtusova, Jana B; Phillips, Natalie A

    2016-01-01

    This study examined the effect of auditory-visual (AV) speech stimuli on working memory in older adults with poorer-hearing (PH) in comparison to age- and education-matched older adults with better hearing (BH). Participants completed a working memory n-back task (0- to 2-back) in which sequences of digits were presented in visual-only (i.e., speech-reading), auditory-only (A-only), and AV conditions. Auditory event-related potentials (ERP) were collected to assess the relationship between perceptual and working memory processing. The behavioral results showed that both groups were faster in the AV condition in comparison to the unisensory conditions. The ERP data showed perceptual facilitation in the AV condition, in the form of reduced amplitudes and latencies of the auditory N1 and/or P1 components, in the PH group. Furthermore, a working memory ERP component, the P3, peaked earlier for both groups in the AV condition compared to the A-only condition. In general, the PH group showed a more robust AV benefit; however, the BH group showed a dose-response relationship between perceptual facilitation and working memory improvement, especially for facilitation of processing speed. Two measures, reaction time and P3 amplitude, suggested that the presence of visual speech cues may have helped the PH group to counteract the demanding auditory processing, to the level that no group differences were evident during the AV modality despite lower performance during the A-only condition. Overall, this study provides support for the theory of an integrated perceptual-cognitive system. The practical significance of these findings is also discussed. PMID:27148106

  10. The Auditory-Visual Speech Benefit on Working Memory in Older Adults with Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Jana B. Frtusova

    2016-04-01

Full Text Available This study examined the effect of auditory-visual (AV) speech stimuli on working memory in hearing-impaired participants (HIP) in comparison to age- and education-matched normal elderly controls (NEC). Participants completed a working memory n-back task (0- to 2-back) in which sequences of digits were presented in visual-only (i.e., speech-reading), auditory-only (A-only), and AV conditions. Auditory event-related potentials (ERP) were collected to assess the relationship between perceptual and working memory processing. The behavioural results showed that both groups were faster in the AV condition in comparison to the unisensory conditions. The ERP data showed perceptual facilitation in the AV condition, in the form of reduced amplitudes and latencies of the auditory N1 and/or P1 components, in the HIP group. Furthermore, a working memory ERP component, the P3, peaked earlier for both groups in the AV condition compared to the A-only condition. In general, the HIP group showed a more robust AV benefit; however, the NECs showed a dose-response relationship between perceptual facilitation and working memory improvement, especially for facilitation of processing speed. Two measures, reaction time and P3 amplitude, suggested that the presence of visual speech cues may have helped the HIP to counteract the demanding auditory processing, to the level that no group differences were evident during the AV modality despite lower performance during the A-only condition. Overall, this study provides support for the theory of an integrated perceptual-cognitive system. The practical significance of these findings is also discussed.

  11. Colliding Cues in Word Segmentation: The Role of Cue Strength and General Cognitive Processes

    Science.gov (United States)

    Weiss, Daniel J.; Gerfen, Chip; Mitchel, Aaron D.

    2010-01-01

    The process of word segmentation is flexible, with many strategies potentially available to learners. This experiment explores how segmentation cues interact, and whether successful resolution of cue competition is related to general executive functioning. Participants listened to artificial speech streams that contained both statistical and…

  12. Counterconditioning reduces cue-induced craving and actual cue-elicited consumption.

    NARCIS (Netherlands)

    D. van Gucht; F. Baeyens; D. Vansteenwegen; D. Hermans; T. Beckers

    2010-01-01

    Cue-induced craving is not easily reduced by an extinction or exposure procedure and may constitute an important route toward relapse in addictive behavior after treatment. In the present study, we investigated the effectiveness of counterconditioning as an alternative procedure to reduce cue-induce

  13. Cues for Better Writing: Empirical Assessment of a Word Counter and Cueing Application's Effectiveness

    Science.gov (United States)

    Vijayasarathy, Leo R.; Gould, Susan Martin; Gould, Michael

    2015-01-01

    Written clarity and conciseness are desired by employers and emphasized in business communication courses. We developed and tested the efficacy of a cueing tool--Scribe Bene--to help students reduce their use of imprecise and ambiguous words and wordy phrases. Effectiveness was measured by comparing cue word usage between a treatment group given…

  14. Mapping procedures can produce non-centered auditory images in bilateral cochlear implantees

    OpenAIRE

    Goupell, Matthew J.; Kan, Alan; Litovsky, Ruth Y.

    2013-01-01

    Good localization accuracy depends on an auditory spatial map that provides consistent binaural information across frequency and level. This study investigated whether mapping bilateral cochlear implants (CIs) independently contributes to distorted perceptual spatial maps. In a meta-analysis, interaural level differences necessary to perceptually center sound images were calculated for 127 pitch-matched pairs of electrodes; many needed large current adjustments to be perceptually centered. In...

  15. The effect of visual cues on difficulty ratings for segregation of musical streams in listeners with impaired hearing.

    Directory of Open Access Journals (Sweden)

    Hamish Innes-Brown

    Full Text Available BACKGROUND: Enjoyment of music is an important part of life that may be degraded for people with hearing impairments, especially those using cochlear implants. The ability to follow separate lines of melody is an important factor in music appreciation. This ability relies on effective auditory streaming, which is much reduced in people with hearing impairment, contributing to difficulties in music appreciation. The aim of this study was to assess whether visual cues could reduce the subjective difficulty of segregating a melody from interleaved background notes in normally hearing listeners, those using hearing aids, and those using cochlear implants. METHODOLOGY/PRINCIPAL FINDINGS: Normally hearing listeners (N = 20, hearing aid users (N = 10, and cochlear implant users (N = 11 were asked to rate the difficulty of segregating a repeating four-note melody from random interleaved distracter notes. The pitch of the background notes was gradually increased or decreased throughout blocks, providing a range of difficulty from easy (with a large pitch separation between melody and distracter to impossible (with the melody and distracter completely overlapping. Visual cues were provided on half the blocks, and difficulty ratings for blocks with and without visual cues were compared between groups. Visual cues reduced the subjective difficulty of extracting the melody from the distracter notes for normally hearing listeners and cochlear implant users, but not hearing aid users. CONCLUSION/SIGNIFICANCE: Simple visual cues may improve the ability of cochlear implant users to segregate lines of music, thus potentially increasing their enjoyment of music. More research is needed to determine what type of acoustic cues to encode visually in order to optimise the benefits they may provide.

  16. Narrow, duplicated internal auditory canal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, T. [Servico de Neurorradiologia, Hospital Garcia de Orta, Avenida Torrado da Silva, 2801-951, Almada (Portugal); Shayestehfar, B. [Department of Radiology, UCLA Oliveview School of Medicine, Los Angeles, California (United States); Lufkin, R. [Department of Radiology, UCLA School of Medicine, Los Angeles, California (United States)

    2003-05-01

    A narrow internal auditory canal (IAC) constitutes a relative contraindication to cochlear implantation because it is associated with aplasia or hypoplasia of the vestibulocochlear nerve or its cochlear branch. We report an unusual case of a narrow, duplicated IAC, divided by a bony septum into a superior relatively large portion and an inferior stenotic portion, in which we could identify only the facial nerve. This case adds support to the association between a narrow IAC and aplasia or hypoplasia of the vestibulocochlear nerve. The normal facial nerve argues against the hypothesis that the narrow IAC is the result of a primary bony defect which inhibits the growth of the vestibulocochlear nerve. (orig.)

  17. Auditory brainstem response in dolphins.

    OpenAIRE

    Ridgway, S. H.; Bullock, T H; Carder, D.A.; Seeley, R L; Woods, D.; Galambos, R

    1981-01-01

    We recorded the auditory brainstem response (ABR) in four dolphins (Tursiops truncatus and Delphinus delphis). The ABR evoked by clicks consists of seven waves within 10 msec; two waves often contain dual peaks. The main waves can be identified with those of humans and laboratory mammals; in spite of a much longer path, the latencies of the peaks are almost identical to those of the rat. The dolphin ABR waves increase in latency as the intensity of a sound decreases by only 4 microseconds/dec...

  18. Auditory Processing Disorder and Foreign Language Acquisition

    Science.gov (United States)

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  19. Deciphering faces: quantifiable visual cues to weight.

    Science.gov (United States)

    Coetzee, Vinet; Chen, Jingying; Perrett, David I; Stephen, Ian D

    2010-01-01

    Body weight plays a crucial role in mate choice, as weight is related to both attractiveness and health. People are quite accurate at judging weight in faces, but the cues used to make these judgments have not been defined. This study consisted of two parts. First, we wanted to identify quantifiable facial cues that are related to body weight, as defined by body mass index (BMI). Second, we wanted to test whether people use these cues to judge weight. In study 1, we recruited two groups of Caucasian and two groups of African participants, determined their BMI and measured their 2-D facial images for: width-to-height ratio, perimeter-to-area ratio, and cheek-to-jaw-width ratio. All three measures were significantly related to BMI in males, while the width-to-height and cheek-to-jaw-width ratios were significantly related to BMI in females. In study 2, these images were rated for perceived weight by Caucasian observers. We showed that these observers use all three cues to judge weight in African and Caucasian faces of both sexes. These three facial cues, width-to-height ratio, perimeter-to-area ratio, and cheek-to-jaw-width ratio, are therefore not only related to actual weight but provide a basis for perceptual attributes as well. PMID:20301846
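
    The three cues are plain geometric ratios. A minimal sketch of computing them, assuming a traced face outline and landmark widths as inputs (the function and its arguments are illustrative, not the authors' measurement pipeline):

```python
import math

def facial_weight_cues(face_width, face_height, cheek_width, jaw_width, outline):
    """Compute the three weight-cue ratios from hypothetical landmark inputs.

    `outline` is a closed polygon of (x, y) points tracing the face.
    """
    n = len(outline)
    # Perimeter: sum of edge lengths around the closed outline.
    perim = sum(math.dist(outline[i], outline[(i + 1) % n]) for i in range(n))
    # Area via the shoelace formula.
    area = 0.5 * abs(sum(
        outline[i][0] * outline[(i + 1) % n][1]
        - outline[(i + 1) % n][0] * outline[i][1]
        for i in range(n)))
    return {
        "width_to_height": face_width / face_height,
        "perimeter_to_area": perim / area,
        "cheek_to_jaw_width": cheek_width / jaw_width,
    }

square = [(0, 0), (4, 0), (4, 4), (0, 4)]  # toy outline for checking
cues = facial_weight_cues(3.0, 4.0, 2.6, 2.0, square)
```

    A rounder, fuller face raises the width-to-height and cheek-to-jaw ratios while lowering the perimeter-to-area ratio, which is why all three track BMI.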

  20. Enhancing Manual Scan Registration Using Audio Cues

    Science.gov (United States)

    Ntsoko, T.; Sithole, G.

    2014-04-01

    Indoor mapping and modelling requires that acquired data be processed by editing, fusing, and formatting, among other operations. Currently, the user's manual interaction with the point cloud (data) during processing is visual, and visual interaction has limitations. One way of dealing with these limitations is to augment point cloud processing with audio. Audio augmentation entails associating points of interest in the point cloud with audio objects. In coarse scan registration, reverberation, intensity, and frequency audio cues were exploited to help the user estimate the depth and occupancy of space of points of interest. Depth was estimated reliably when intensity and frequency were used together as depth cues, allowing coarse changes of depth to be judged; the depth between surfaces can therefore be estimated with the aid of the audio objects. Sound reflections of an audio object provided reliable information about the object's surroundings in some instances. For a point or area of interest in the point cloud, these reflections can be used to infer unseen events around that point or area. Other processing techniques could benefit from this, while other information is estimated using additional audio cues such as binaural cues and head-related transfer functions, which could aid position estimation of audio objects for problems such as indoor navigation.
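
    The intensity and frequency depth cues described above can be sketched as a simple mapping from a point's depth to audio parameters. All constants here are assumptions for illustration, not values from the paper:

```python
def depth_to_audio_cue(depth, d_min=0.5, d_max=20.0,
                       f_near=880.0, f_far=110.0):
    """Map a point's depth (metres) to a (frequency, gain) audio cue.

    Nearer points sound higher-pitched and louder; the ranges and
    frequencies are hypothetical.
    """
    # Clamp depth into the supported range, then normalise to [0, 1].
    d = min(max(depth, d_min), d_max)
    t = (d - d_min) / (d_max - d_min)
    frequency = f_near + t * (f_far - f_near)  # Hz, falls with distance
    gain = 1.0 / (1.0 + d)                     # inverse-distance attenuation
    return frequency, gain
```

    Playing such cues for two points of interest lets the user compare their depths by ear, which is how a combined intensity-plus-frequency cue can convey coarse depth differences.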

  1. Scene-based contextual cueing in pigeons.

    Science.gov (United States)

    Wasserman, Edward A; Teng, Yuejia; Brooks, Daniel I

    2014-10-01

    Repeated pairings of a particular visual context with a specific location of a target stimulus facilitate target search in humans. We explored an animal model of such contextual cueing. Pigeons had to peck a target, which could appear in 1 of 4 locations on color photographs of real-world scenes. On half of the trials, each of 4 scenes was consistently paired with 1 of 4 possible target locations; on the other half of the trials, each of 4 different scenes was randomly paired with the same 4 possible target locations. In Experiments 1 and 2, pigeons exhibited robust contextual cueing when the context preceded the target by 1 s to 8 s, with reaction times to the target being shorter on predictive-scene trials than on random-scene trials. Pigeons also responded more frequently during the delay on predictive-scene trials than on random-scene trials; indeed, during the delay on predictive-scene trials, pigeons predominately pecked toward the location of the upcoming target, suggesting that attentional guidance contributes to contextual cueing. In Experiment 3, involving left-right and top-bottom scene reversals, pigeons exhibited stronger control by global than by local scene cues. These results attest to the robustness and associative basis of contextual cueing in pigeons. PMID:25546098

  2. Relating auditory attributes of multichannel sound to preference and to physical parameters

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2006-01-01

    Sound reproduced by multichannel systems is affected by many factors giving rise to various sensations, or auditory attributes. Relating specific attributes to overall preference and to physical measures of the sound field provides valuable information for a better understanding of the parameters...... within and between musical program materials, allowing for a careful generalization regarding the perception of spatial audio reproduction. Finally, a set of objective measures is derived from analysis of the sound field at the listening position in an attempt to predict the auditory attributes....

  3. Effects of Auditory Rhythm and Music on Gait Disturbances in Parkinson's Disease.

    Science.gov (United States)

    Ashoori, Aidin; Eagleman, David M; Jankovic, Joseph

    2015-01-01

    Gait abnormalities, such as shuffling steps, start hesitation, and freezing, are common and often incapacitating symptoms of Parkinson's disease (PD) and other parkinsonian disorders. Pharmacological and surgical approaches have only limited efficacy in treating these gait disorders. Rhythmic auditory stimulation (RAS), such as playing marching music and dance therapy, has been shown to be a safe, inexpensive, and effective method for improving gait in PD patients. However, RAS that adapts to patients' movements may be more effective than the rigid, fixed-tempo RAS used in most studies. In addition to auditory cueing, immersive virtual reality technologies that utilize interactive computer-generated systems through wearable devices are increasingly used for improving brain-body interaction and sensory-motor integration. Using multisensory cues, these therapies may be particularly suitable for the treatment of parkinsonian freezing and other gait disorders. In this review, we examine the affected neurological circuits underlying gait and temporal processing in PD patients and summarize the current studies demonstrating the effects of RAS on improving these gait deficits. PMID:26617566
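
    An RAS cue that adapts to the patient's movements, as contrasted above with fixed-tempo RAS, can be sketched as a feedback rule that nudges the cue tempo toward the measured step cadence. The gain and boost parameters are hypothetical, not from any cited protocol:

```python
def adapt_tempo(cue_bpm, step_times, gain=0.3, boost=1.05):
    """One update step of an adaptive RAS cue tempo (a sketch, not a
    clinical protocol).

    step_times: recent step timestamps in seconds. The cue moves partway
    toward the walker's cadence, raised slightly to encourage a steadier,
    faster gait.
    """
    intervals = [b - a for a, b in zip(step_times, step_times[1:])]
    cadence_bpm = 60.0 / (sum(intervals) / len(intervals))
    target = cadence_bpm * boost
    return cue_bpm + gain * (target - cue_bpm)
```

    For steps 0.6 s apart (100 steps/min) and a current cue of 100 bpm, the update moves the cue to 101.5 bpm, gently pulling the patient's cadence upward instead of imposing a fixed beat.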

  4. Neural representation in the auditory midbrain of the envelope of vocalizations based on a peripheral ear model

    Directory of Open Access Journals (Sweden)

    Thilo eRode

    2013-10-01

    Full Text Available The auditory midbrain implant (AMI) consists of a single shank array (20 sites) for stimulation along the tonotopic axis of the central nucleus of the inferior colliculus (ICC) and has been safely implanted in deaf patients who cannot benefit from a cochlear implant (CI). The AMI improves lip-reading abilities and environmental awareness in the implanted patients. However, the AMI cannot achieve the high levels of speech perception possible with the CI. It appears the AMI can transmit sufficient spectral cues but with limited temporal cues required for speech understanding. Currently, the AMI uses a CI-based strategy, which was originally designed to stimulate each frequency region along the cochlea with amplitude-modulated pulse trains matching the envelope of the bandpass-filtered sound components. However, it is unclear if this type of stimulation with only a single site within each frequency lamina of the ICC can elicit sufficient temporal cues for speech perception. Speech understanding in quiet is still possible with envelope cues as low as 50 Hz. Therefore, we investigated how ICC neurons follow the bandpass-filtered envelope structure of natural stimuli in ketamine-anesthetized guinea pigs. We identified a subset of ICC neurons that could closely follow the envelope structure (up to ~100 Hz) of a diverse set of species-specific calls, which was revealed by using a peripheral ear model to estimate the true bandpass-filtered envelopes observed by the brain. Although previous studies have suggested a complex neural transformation from the auditory nerve to the ICC, our data suggest that the brain maintains a robust temporal code in a subset of ICC neurons matching the envelope structure of natural stimuli. Clinically, these findings suggest that a CI-based strategy may still be effective for the AMI if the appropriate neurons are entrained to the envelope of the acoustic stimulus and can transmit sufficient temporal cues to higher
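
    The CI-based strategy described here modulates pulse trains by the bandpass-filtered envelope. A minimal stdlib sketch of envelope extraction by half-wave rectification and smoothing (real processors use proper bandpass filterbanks and lowpass filters; the test signal and 50 Hz cutoff are illustrative):

```python
import math

def envelope(signal, fs, cutoff_hz=50.0):
    """Estimate the envelope of one bandpassed channel: half-wave rectify,
    then smooth with a moving average roughly matching the cutoff period.
    """
    rectified = [max(s, 0.0) for s in signal]
    win = max(1, int(fs / cutoff_hz))  # window length in samples
    out, acc = [], 0.0
    for i, r in enumerate(rectified):
        acc += r
        if i >= win:
            acc -= rectified[i - win]  # slide the window forward
        out.append(acc / min(i + 1, win))
    return out

fs = 8000
# 300 Hz carrier amplitude-modulated at 20 Hz; the envelope should follow
# the slow 20 Hz modulation, not the carrier.
sig = [(1 + math.sin(2 * math.pi * 20 * t / fs))
       * math.sin(2 * math.pi * 300 * t / fs) for t in range(fs)]
env = envelope(sig, fs)
```

    In a CI-style strategy, this per-channel envelope would then set the amplitude of the electrical pulse train delivered to the corresponding stimulation site.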

  5. Glycinergic transmission modulates GABAergic inhibition in the avian auditory pathway

    Directory of Open Access Journals (Sweden)

    Matthew J Fischl

    2014-03-01

    Full Text Available For all neurons, a proper balance of synaptic excitation and inhibition is crucial to effect computational precision. Achievement of this balance is remarkable when one considers factors that modulate synaptic strength operate on multiple overlapping time scales and affect both pre- and postsynaptic elements. Recent studies have shown that the inhibitory transmitters glycine and GABA are co-released in auditory nuclei involved in the computation of interaural time disparities (ITDs), a cue used to process sound source location. The co-release expressed at these synapses is heavily activity dependent, and generally occurs when input rates are high. This circuitry, in both birds and mammals, relies on inhibitory input to maintain the temporal precision necessary for ITD encoding. Studies of co-release in other brain regions suggest that GABA and glycine receptors (GlyRs) interact via cross-suppressive modulation of receptor conductance. We performed in vitro whole-cell recordings in several nuclei of the chicken brainstem auditory circuit to assess whether this cross-suppressive phenomenon was evident in the avian brainstem. We evaluated the effect of pressure-puff applied glycine on synaptically evoked inhibitory currents in nucleus magnocellularis (NM) and the superior olivary nucleus (SON). Glycine pre-application reduced the amplitude of inhibitory postsynaptic currents evoked during a 100 Hz train stimulus in both nuclei. This apparent glycinergic modulation was blocked in the presence of strychnine. Further experiments showed that this modulation did not depend on postsynaptic biochemical interactions such as phosphatase activity, or direct interactions between GABA and glycine receptor proteins. Rather, voltage clamp experiments in which we manipulated Cl- flux during agonist application suggest that activation of one receptor will modulate the conductance of the other via local changes in Cl- ion concentration within microdomains of the

  6. THE EFFECTS OF SALICYLATE ON AUDITORY EVOKED POTENTIAL AMPLITUDE FROM THE AUDITORY CORTEX AND AUDITORY BRAINSTEM

    Institute of Scientific and Technical Information of China (English)

    Brian Sawka; SUN Wei

    2014-01-01

    Tinnitus has often been studied using salicylate in animal models as it is capable of inducing temporary hearing loss and tinnitus. Studies have recently observed enhancement of auditory evoked responses of the auditory cortex (AC) post salicylate treatment, which is also shown to be related to tinnitus-like behavior in rats. The aim of this study was to observe whether enhancements of the AC post salicylate treatment are also present at structures in the brainstem. Four male Sprague Dawley rats with AC-implanted electrodes were tested for both AC and auditory brainstem response (ABR) recordings pre and post 250 mg/kg intraperitoneal injections of salicylate. The responses were recorded as the peak-to-trough amplitudes of P1-N1 (AC), ABR wave V, and ABR wave II. AC responses resulted in statistically significant enhancement of amplitude at 2 hours post salicylate with 90 dB stimuli tone bursts of 4, 8, 12, and 20 kHz. Wave V of ABR responses at 90 dB resulted in a statistically significant reduction of amplitude 2 hours post salicylate and a mean decrease of amplitude of 31% for 16 kHz. Wave II amplitudes at 2 hours post treatment were significantly reduced for 4, 12, and 20 kHz stimuli at 90 dB SPL. Our results suggest that the enhancement changes of the AC related to salicylate-induced tinnitus are generated superior to the level of the inferior colliculus and may originate in the AC.

  7. Distracted by cues for suppressed memories.

    Science.gov (United States)

    Hertel, Paula T; Hayes, Jeffrey A

    2015-06-01

    We examined the potential cost of practicing suppression of negative thoughts on subsequent performance in an unrelated task. Cues for previously suppressed and unsuppressed (baseline) responses in a think/no-think procedure were displayed as irrelevant flankers for neutral words to be judged for emotional valence. These critical flankers were homographs with one negative meaning denoted by their paired response during learning. Responses to the targets were delayed when suppression cues (compared with baseline cues and new negative homographs) were used as flankers, but only following direct-suppression instructions and not when benign substitutes had been provided to aid suppression. On a final recall test, suppression-induced forgetting following direct suppression and the flanker task was positively correlated with the flanker effect. Experiment 2 replicated these findings. Finally, valence ratings of neutral targets were influenced by the valence of the flankers but not by the prior role of the negative flankers. PMID:25904596

  8. Highly effective photonic cue for repulsive axonal guidance.

    Directory of Open Access Journals (Sweden)

    Bryan J Black

    Full Text Available In vivo nerve repair requires not only the ability to regenerate damaged axons, but most importantly, the ability to guide developing or regenerating axons along paths that will result in functional connections. Furthermore, basic studies in neuroscience and neuro-electronic interface design require the ability to construct in vitro neural circuitry. Both these applications require the development of a noninvasive, highly effective tool for axonal growth-cone guidance. To date, a myriad of technologies have been introduced based on chemical, electrical, mechanical, and hybrid approaches (such as electro-chemical, optofluidic flow, and photo-chemical methods). These methods are either lacking in desired spatial and temporal selectivity or require the introduction of invasive external factors. Within the last fifteen years however, several attractive guidance cues have been developed using purely light based cues to achieve axonal guidance. Here, we report a novel, purely optical repulsive guidance technique that uses low power, near infrared light, and demonstrates the guidance of primary goldfish retinal ganglion cell axons through turns of up to 120 degrees and over distances of ∼90 µm.

  9. Topographical cues regulate the crosstalk between MSCs and macrophages

    Science.gov (United States)

    Vallés, Gema; Bensiamar, Fátima; Crespo, Lara; Arruebo, Manuel; Vilaboa, Nuria; Saldaña, Laura

    2015-01-01

    Implantation of scaffolds may elicit a host foreign body response triggered by monocyte/macrophage lineage cells. Growing evidence suggests that topographical cues of scaffolds play an important role in MSC functionality. In this work, we examined whether surface topographical features can regulate paracrine interactions that MSCs establish with macrophages. Three-dimensional (3D) topography sensing drives MSCs into a spatial arrangement that stimulates the production of the anti-inflammatory proteins PGE2 and TSG-6. Compared to two-dimensional (2D) settings, 3D arrangement of MSCs co-cultured with macrophages leads to an important decrease in the secretion of soluble factors related with inflammation and chemotaxis including IL-6 and MCP-1. Attenuation of MCP-1 secretion in 3D co-cultures correlates with a decrease in the accumulation of its mRNA levels in MSCs and macrophages. Using neutralizing antibodies, we identified that the interplay between PGE2, IL-6, TSG-6 and MCP-1 in the co-cultures is strongly influenced by the micro-architecture that supports MSCs. Local inflammatory milieu provided by 3D-arranged MSCs in co-cultures induces a decrease in monocyte migration as compared to monolayer cells. This effect is partially mediated by reduced levels of IL-6 and MCP-1, proteins that up-regulate each other's secretion. Our findings highlight the importance of topographical cues in the soluble factor-guided communication between MSCs and macrophages. PMID:25453943

  10. Coordinated sensor cueing for chemical plume detection

    Science.gov (United States)

    Abraham, Nathan J.; Jensenius, Andrea M.; Watkins, Adam S.; Hawthorne, R. Chad; Stepnitz, Brian J.

    2011-05-01

    This paper describes an organic data fusion and sensor cueing approach for Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. The Joint Warning and Reporting Network (JWARN) uses a hardware component referred to as the JWARN Component Interface Device (JCID). The Edgewood Chemical and Biological Center has developed a small footprint and open architecture solution for the JCID capability called JCID-on-a-Chip (JoaC). The JoaC program aims to reduce the cost and complexity of the JCID by shrinking the necessary functionality down to a small single board computer. This effort focused on development of a fusion and cueing algorithm organic to the JoaC hardware. By embedding this capability in the JoaC, sensors have the ability to receive and process cues from other sensors without the use of a complex and costly centralized infrastructure. Additionally, the JoaC software is hardware agnostic, as evidenced by its drop-in inclusion in two different system-on-a-chip platforms including Windows CE and LINUX environments. In this effort, a partnership between JPM-CA, JHU/APL, and the Edgewood Chemical and Biological Center (ECBC), the authors implemented and demonstrated a new algorithm for cooperative detection and localization of a chemical agent plume. This experiment used a pair of mobile Joint Services Lightweight Standoff Chemical Agent Detector (JSLSCAD) units which were controlled by fusion and cueing algorithms hosted on a JoaC. The algorithms embedded in the JoaC enabled the two sensor systems to perform cross cueing and cooperatively form a higher fidelity estimate of chemical releases by combining sensor readings. Additionally, each JSLSCAD had the ability to focus its search on smaller regions than those required by a single sensor system by using the cross cue information from the other sensor.
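
    Cross cueing two standoff detectors to localize a plume amounts, in the simplest case, to intersecting the two bearing lines reported by the sensors. A toy triangulation under that assumption (an illustrative sketch, not the JoaC fusion algorithm):

```python
import math

def cross_cue_fix(p1, brg1, p2, brg2):
    """Fix a plume position from two standoff detections by intersecting
    bearing lines.

    Positions are (east, north) coordinates; bearings are in degrees
    clockwise from north.
    """
    def unit(brg):
        rad = math.radians(brg)
        return math.sin(rad), math.cos(rad)  # (east, north) components

    d1, d2 = unit(brg1), unit(brg2)
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via 2-D cross products.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    t1 = (rx * d2[1] - ry * d2[0]) / denom
    return p1[0] + t1 * d1[0], p1[1] + t1 * d1[1]
```

    Two sensors at (0, 0) and (10, 0) reporting bearings of 45 and 315 degrees place the plume at (5, 5); combining the two lines of bearing this way yields a higher-fidelity estimate than either sensor alone, which is the intent of the cross-cue step described above.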

  11. Auditory agnosia due to long-term severe hydrocephalus caused by spina bifida - specific auditory pathway versus nonspecific auditory pathway.

    Science.gov (United States)

    Zhang, Qing; Kaga, Kimitaka; Hayashi, Akimasa

    2011-07-01

    A 27-year-old female showed auditory agnosia after long-term severe hydrocephalus due to congenital spina bifida. After years of hydrocephalus, she gradually suffered from hearing loss in her right ear at 19 years of age, followed by her left ear. During the time when she retained some ability to hear, she experienced severe difficulty in distinguishing verbal, environmental, and musical instrumental sounds. However, her auditory brainstem response and distortion product otoacoustic emissions were largely intact in the left ear. Her bilateral auditory cortices were preserved, as shown by neuroimaging, whereas her auditory radiations were severely damaged owing to progressive hydrocephalus. Although she had a complete bilateral hearing loss, she felt great pleasure when exposed to music. After years of self-training to read lips, she regained fluent ability to communicate. Clinical manifestations of this patient indicate that auditory agnosia can occur after long-term hydrocephalus due to spina bifida; the secondary auditory pathway may play a role in both auditory perception and hearing rehabilitation. PMID:21413843

  12. Active listening for spatial orientation in a complex auditory scene.

    Directory of Open Access Journals (Sweden)

    Cynthia F Moss

    2006-04-01

    Full Text Available To successfully negotiate a complex environment, an animal must control the timing of motor behaviors in coordination with dynamic sensory information. Here, we report on adaptive temporal control of vocal-motor behavior in an echolocating bat, Eptesicus fuscus, as it captured tethered insects close to background vegetation. Recordings of the bat's sonar vocalizations were synchronized with high-speed video images that were used to reconstruct the bat's three-dimensional flight path and the positions of target and vegetation. When the bat encountered the difficult task of taking insects as close as 10-20 cm from the vegetation, its behavior changed significantly from that under open room conditions. Its success rate decreased by about 50%, its time to initiate interception increased by a factor of ten, and its high repetition rate "terminal buzz" decreased in duration by a factor of three. Under all conditions, the bat produced prominent sonar "strobe groups," clusters of echolocation pulses with stable intervals. In the final stages of insect capture, the bat produced strobe groups at a higher incidence when the insect was positioned near clutter. Strobe groups occurred at all phases of the wingbeat (and inferred respiration cycle), challenging the hypothesis of strict synchronization between respiration and sound production in echolocating bats. The results of this study provide a clear demonstration of temporal vocal-motor control that directly impacts the signals used for perception.
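
    Strobe groups, defined above as clusters of pulses with stable intervals, can be detected with a simple interval-stability heuristic. The gap and tolerance thresholds below are assumptions for illustration, not the study's criteria:

```python
def strobe_groups(pulse_times, max_gap=0.02, tol=0.15):
    """Group sonar pulse times (seconds) into candidate strobe groups:
    runs of pulses with short, stable inter-pulse intervals.
    """
    groups, current = [], [pulse_times[0]]
    for prev, t in zip(pulse_times, pulse_times[1:]):
        ipi = t - prev
        # Reference interval: the group's first interval, once one exists.
        ref = current[1] - current[0] if len(current) >= 2 else ipi
        if ipi <= max_gap and abs(ipi - ref) <= tol * ref:
            current.append(t)
        else:
            if len(current) >= 2:
                groups.append(current)  # close out a completed group
            current = [t]
    if len(current) >= 2:
        groups.append(current)
    return groups
```

    A pulse train with two bursts of near-constant 10 ms intervals separated by a long silence splits into two groups, mirroring how stable-interval clusters stand out from the surrounding pulse stream.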

  13. Functional maps of human auditory cortex: effects of acoustic features and attention.

    Directory of Open Access Journals (Sweden)

    David L Woods

    Full Text Available BACKGROUND: While human auditory cortex is known to contain tonotopically organized auditory cortical fields (ACFs), little is known about how processing in these fields is modulated by other acoustic features or by attention. METHODOLOGY/PRINCIPAL FINDINGS: We used functional magnetic resonance imaging (fMRI) and population-based cortical surface analysis to characterize the tonotopic organization of human auditory cortex and analyze the influence of tone intensity, ear of delivery, scanner background noise, and intermodal selective attention on auditory cortex activations. Medial auditory cortex surrounding Heschl's gyrus showed large sensory (unattended) activations with two mirror-symmetric tonotopic fields similar to those observed in non-human primates. Sensory responses in medial regions had symmetrical distributions with respect to the left and right hemispheres, were enlarged for tones of increased intensity, and were enhanced when sparse image acquisition reduced scanner acoustic noise. Spatial distribution analysis suggested that changes in tone intensity shifted activation within isofrequency bands. Activations to monaural tones were enhanced over the hemisphere contralateral to stimulation, where they produced activations similar to those produced by binaural sounds. Lateral regions of auditory cortex showed small sensory responses that were larger in the right than left hemisphere, lacked tonotopic organization, and were uninfluenced by acoustic parameters. Sensory responses in both medial and lateral auditory cortex decreased in magnitude throughout stimulus blocks. Attention-related modulations (ARMs) were larger in lateral than medial regions of auditory cortex and appeared to arise primarily in belt and parabelt auditory fields. ARMs lacked tonotopic organization, were unaffected by acoustic parameters, and had distributions that were distinct from those of sensory responses. 
Unlike the gradual adaptation seen for sensory responses

  14. Spontaneous high-gamma band activity reflects functional organization of auditory cortex in the awake macaque.

    Science.gov (United States)

    Fukushima, Makoto; Saunders, Richard C; Leopold, David A; Mishkin, Mortimer; Averbeck, Bruno B

    2012-06-01

    In the absence of sensory stimuli, spontaneous activity in the brain has been shown to exhibit organization at multiple spatiotemporal scales. In the macaque auditory cortex, responses to acoustic stimuli are tonotopically organized within multiple, adjacent frequency maps aligned in a caudorostral direction on the supratemporal plane (STP) of the lateral sulcus. Here, we used chronic microelectrocorticography to investigate the correspondence between sensory maps and spontaneous neural fluctuations in the auditory cortex. We first mapped tonotopic organization across 96 electrodes spanning approximately two centimeters along the primary and higher auditory cortex. In separate sessions, we then observed that spontaneous activity at the same sites exhibited spatial covariation that reflected the tonotopic map of the STP. This observation demonstrates a close relationship between functional organization and spontaneous neural activity in the sensory cortex of the awake monkey. PMID:22681693
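
    The reported correspondence can be checked schematically by correlating pairwise spontaneous-activity correlations with tonotopic similarity (negative best-frequency distance) across electrode pairs. The toy data and helper below are illustrative, not the paper's analysis:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

def tonotopy_match(spont_corr, best_freq):
    """Correlate spontaneous-activity correlations between electrode pairs
    with their tonotopic similarity.

    spont_corr: symmetric matrix of spontaneous-activity correlations.
    best_freq: each electrode's best frequency from the sensory mapping.
    """
    n = len(best_freq)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    x = [spont_corr[i][j] for i, j in pairs]
    y = [-abs(best_freq[i] - best_freq[j]) for i, j in pairs]
    return pearson(x, y)

# Toy example: electrodes with similar best frequencies covary more.
corr = [[1.0, 0.9, 0.2],
        [0.9, 1.0, 0.3],
        [0.2, 0.3, 1.0]]
r = tonotopy_match(corr, [1.0, 2.0, 8.0])
```

    A strongly positive r means spontaneous covariation mirrors the tonotopic map, which is the relationship the study reports for the macaque STP.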

  15. Proximal vocal threat recruits the right voice-sensitive auditory cortex.

    Science.gov (United States)

    Ceravolo, Leonardo; Frühholz, Sascha; Grandjean, Didier

    2016-05-01

    The accurate estimation of the proximity of threat is important for biological survival and to assess relevant events of everyday life. We addressed the question of whether proximal as compared with distal vocal threat would lead to a perceptual advantage for the perceiver. Accordingly, we sought to highlight the neural mechanisms underlying the perception of proximal vs distal threatening vocal signals by the use of functional magnetic resonance imaging. Although we found that the inferior parietal and superior temporal cortex of human listeners generally decoded the spatial proximity of auditory vocalizations, activity in the right voice-sensitive auditory cortex was specifically enhanced for proximal aggressive relative to distal aggressive voices as compared with neutral voices. Our results shed new light on the processing of imminent danger signaled by proximal vocal threat and show the crucial involvement of the right mid voice-sensitive auditory cortex in such processing. PMID:26746180

  16. A Comparison of Three Auditory Discrimination-Perception Tests

    Science.gov (United States)

    Koenke, Karl

    1978-01-01

    Comparisons were made between scores of 52 third graders on three measures of auditory discrimination: Wepman's Auditory Discrimination Test, the Goldman-Fristoe Woodcock (GFW) Test of Auditory Discrimination, and the Kimmell-Wahl Screening Test of Auditory Perception (STAP). (CL)

  17. Large cross-sectional study of presbycusis reveals rapid progressive decline in auditory temporal acuity.

    Science.gov (United States)

    Ozmeral, Erol J; Eddins, Ann C; Frisina, D Robert; Eddins, David A

    2016-07-01

    The auditory system relies on extraordinarily precise timing cues for the accurate perception of speech, music, and object identification. Epidemiological research has documented the age-related progressive decline in hearing sensitivity that is known to be a major health concern for the elderly. Although smaller investigations indicate that auditory temporal processing also declines with age, such measures have not been included in larger studies. Temporal gap detection thresholds (TGDTs; an index of auditory temporal resolution) measured in 1071 listeners (aged 18-98 years) were shown to decline at a minimum rate of 1.05 ms (15%) per decade. Age was a significant predictor of TGDT when controlling for audibility (partial correlation) and when restricting analyses to persons with normal-hearing sensitivity (n = 434). The TGDTs were significantly better for males (3.5 ms; 51%) than females when averaged across the life span. These results highlight the need for indices of temporal processing in diagnostics, as treatment targets, and as factors in models of aging. PMID:27255816
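    The reported decline can be made concrete with a toy linear model. Only the rate (at least 1.05 ms per decade) comes from the abstract; the 7.0 ms anchor at age 20 below is a hypothetical assumption, chosen so that 1.05 ms roughly matches the quoted 15% per decade, and is not a value reported by the study.

    ```python
    # Illustrative linear model of temporal gap detection threshold (TGDT) vs. age.
    # Assumption: baseline of 7.0 ms at age 20 (not from the study); rate of
    # 1.05 ms per decade is the minimum decline reported in the abstract.

    def projected_tgdt(age, baseline_ms=7.0, baseline_age=20, decline_per_decade=1.05):
        """Projected TGDT in milliseconds at a given age under a linear decline."""
        decades = (age - baseline_age) / 10.0
        return baseline_ms + decline_per_decade * decades

    for age in (20, 40, 60, 80):
        print(age, round(projected_tgdt(age), 2))  # thresholds grow ~1.05 ms/decade
    ```

    Under these assumptions, a 7.0 ms threshold at age 20 would grow to roughly 13.3 ms by age 80, which is why the authors argue temporal acuity deserves a place in clinical diagnostics alongside audibility.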

  18. Auditory Efferent System Modulates Mosquito Hearing.

    Science.gov (United States)

    Andrés, Marta; Seifert, Marvin; Spalthoff, Christian; Warren, Ben; Weiss, Lukas; Giraldo, Diego; Winkler, Margret; Pauls, Stephanie; Göpfert, Martin C

    2016-08-01

    The performance of vertebrate ears is controlled by auditory efferents that originate in the brain and innervate the ear, synapsing onto hair cell somata and auditory afferent fibers [1-3]. Efferent activity can provide protection from noise and facilitate the detection and discrimination of sound by modulating mechanical amplification by hair cells and transmitter release as well as auditory afferent action potential firing [1-3]. Insect auditory organs are thought to lack efferent control [4-7], but when we inspected mosquito ears, we obtained evidence for its existence. Antibodies against synaptic proteins recognized rows of bouton-like puncta running along the dendrites and axons of mosquito auditory sensory neurons. Electron microscopy identified synaptic and non-synaptic sites of vesicle release, and some of the innervating fibers co-labeled with somata in the CNS. Octopamine, GABA, and serotonin were identified as efferent neurotransmitters or neuromodulators that affect auditory frequency tuning, mechanical amplification, and sound-evoked potentials. Mosquito brains thus modulate mosquito ears, extending the use of auditory efferent systems from vertebrates to invertebrates and adding new levels of complexity to mosquito sound detection and communication. PMID:27476597

  19. Cue reactivity in virtual reality: the role of context.

    Science.gov (United States)

    Paris, Megan M; Carter, Brian L; Traylor, Amy C; Bordnick, Patrick S; Day, Susan X; Armsworth, Mary W; Cinciripini, Paul M

    2011-07-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. As an alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method for eliciting and assessing cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, like cigarettes. This study examined the role of contextual cues in a VR environment in evoking craving. Smokers were exposed to a virtual convenience store devoid of any specific cigarette cues, followed by exposure to the same convenience store with specific cigarette cues added. Smokers reported increased craving following exposure to the virtual convenience store without specific cues, and significantly greater craving following the convenience store with cigarette cues added. However, the increased craving recorded after the second convenience store may have been due to the pre-exposure to the first. This study offers evidence that an environmental context where cigarette cues are normally present (but are absent) elicits significant craving even without specific cigarette cues. This finding suggests that VR may have stronger ecological validity than traditional cue-reactivity exposure methods by exposing smokers to the full range of cigarette-related environmental stimuli, in addition to specific cigarette cues, that smokers typically experience in their daily lives. PMID:21349649

  20. The modulation of spatial congruency by object-based attention: analysing the "locus" of the modulation.

    Science.gov (United States)

    Luo, Chunming; Lupiáñez, Juan; Funes, María Jesús; Fu, Xiaolan

    2011-12-01

    Earlier studies have demonstrated that spatial cueing differentially reduces stimulus-stimulus congruency (e.g., spatial Stroop) interference but not stimulus-response congruency (e.g., Simon; e.g., Lupiáñez & Funes, 2005). This spatial cueing modulation over spatial Stroop seems to be entirely attributable to object-based attention (e.g., Luo, Lupiáñez, Funes, & Fu, 2010). In the present study, two experiments were conducted to further explore whether the cueing modulation of spatial Stroop is object based and/or space based and to analyse the "locus" of this modulation. In Experiment 1, we found that the cueing modulation over spatial Stroop is entirely object based, independent of stimulus-response congruency. In Experiment 2, we observed that the modulation of object-based attention over the spatial Stroop only occurred at a short cue-target interval (i.e., stimulus onset asynchrony; SOA), whereas the stimulus-response congruency effect was not modulated either by object-based or by location-based attentional cueing. The overall pattern of results suggests that the spatial cueing modulation over spatial Stroop arises from object-based attention and occurs at the perceptual stage of processing. PMID:21923623

  1. Phase Sensitive Cueing for 3D Objects in Overhead Images

    Energy Technology Data Exchange (ETDEWEB)

    Paglieroni, D W; Eppler, W G; Poland, D N

    2005-02-18

    A 3D solid model-aided object cueing method that matches phase angles of directional derivative vectors at image pixels to phase angles of vectors normal to projected model edges is described. It is intended for finding specific types of objects at arbitrary position and orientation in overhead images, independent of spatial resolution, obliqueness, acquisition conditions, and type of imaging sensor. It is shown that the phase similarity measure can be efficiently evaluated over all combinations of model position and orientation using the FFT. The highest degree of similarity over all model orientations is captured in a match surface of similarity values vs. model position. Unambiguous peaks in this surface are sorted in descending order of similarity value, and the small image thumbnails that contain them are presented to human analysts for inspection in sorted order.
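    The FFT evaluation the abstract describes can be sketched in a few lines. This is a hedged reconstruction, not the authors' implementation: encoding each phase angle as a unit complex number turns the sum of cos(phase difference) over every model placement into a single FFT-based cross-correlation. The function name, the template padding, and the mean-cosine normalization are assumptions.

    ```python
    # Sketch of phase-similarity cueing via the FFT (illustrative, not the paper's code).
    import numpy as np

    def phase_similarity_map(image, model_phase, model_mask):
        """Mean cos(phase difference) between image gradient phases and model
        edge-normal phases, evaluated at every (row, col) model position at once.

        image: 2-D array; model_phase: edge-normal angles (radians) on the model
        template; model_mask: 1 where the template has an edge, else 0.
        """
        gy, gx = np.gradient(image.astype(float))
        img_phase = np.arctan2(gy, gx)            # phase of directional derivatives
        img_c = np.exp(1j * img_phase)            # unit complex encoding
        mdl_c = np.exp(1j * model_phase) * model_mask
        tpl = np.zeros_like(img_c)                # pad template to image size
        tpl[: mdl_c.shape[0], : mdl_c.shape[1]] = mdl_c
        # circular cross-correlation: corr[k] = sum_n img_c[n+k] * conj(tpl[n])
        corr = np.fft.ifft2(np.fft.fft2(img_c) * np.conj(np.fft.fft2(tpl)))
        return corr.real / max(model_mask.sum(), 1)
    ```

    Peaks in the returned surface mark candidate object positions; per the abstract, the maximum over model orientations is taken before the peaks are sorted and presented to analysts.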

  2. Functional Neurochemistry of the Auditory System

    Directory of Open Access Journals (Sweden)

    Nourollah Agha Ebrahimi

    1993-03-01

    Full Text Available Functional neurochemistry is one of the fields of study of the auditory system that has developed remarkably in recent years. Many of the findings in this field have led not only basic auditory researchers but also clinicians to new points of view in audiology. Here, we discuss the latest investigations in the functional neurochemistry of the auditory system, focusing this review mainly on research that will offer flashes of hope for future clinical studies.

  3. Auditory Neuropathy/Dyssynchrony in Biotinidase Deficiency

    Science.gov (United States)

    Yaghini, Omid

    2016-01-01

    Biotinidase deficiency is an autosomal recessively inherited disorder that presents with hearing loss and optic atrophy in addition to seizures, hypotonia, and ataxia. In the present study, we report a 2-year-old boy with biotinidase deficiency whose clinical picture included auditory neuropathy/auditory dyssynchrony (AN/AD). In this case, transient-evoked otoacoustic emissions showed bilaterally normal responses, indicating normal function of the outer hair cells. In contrast, the acoustic reflex test showed absent reflexes bilaterally, and visual reinforcement audiometry and auditory brainstem responses indicated severe to profound hearing loss in both ears. These results suggest AN/AD in patients with biotinidase deficiency. PMID:27144235

  4. Functional Neurochemistry of the Auditory System

    OpenAIRE

    Nourollah Agha Ebrahimi

    1993-01-01

    Functional neurochemistry is one of the fields of study of the auditory system that has developed remarkably in recent years. Many of the findings in this field have led not only basic auditory researchers but also clinicians to new points of view in audiology. Here, we discuss the latest investigations in the functional neurochemistry of the auditory system, focusing this review mainly on research that will offer flashes of hope f...

  5. Assessing the aging effect on auditory-verbal memory by Persian version of dichotic auditory verbal memory test

    Directory of Open Access Journals (Sweden)

    Zahra Shahidipour

    2014-01-01

    Conclusion: Based on the obtained results, a significant reduction in auditory memory was seen in the aged group, and the Persian version of the dichotic auditory-verbal memory test, like many other auditory-verbal memory tests, revealed the effect of aging on auditory-verbal memory performance.

  6. Effects of similarity on environmental context cueing.

    Science.gov (United States)

    Smith, Steven M; Handy, Justin D; Angello, Genna; Manzano, Isabel

    2014-01-01

    Three experiments examined the prediction that context cues which are similar to study contexts can facilitate episodic recall, even if those cues are never seen before the recall test. Environmental context cueing effects have typically produced such small effect sizes that influences of moderating factors, such as the similarity between encoding and retrieval contexts, would be difficult to observe experimentally. Videos of environmental contexts, however, can be used to produce powerful context-dependent memory effects, particularly when only one memory target is associated with each video context, intentional item-context encoding is encouraged, and free recall tests are used. Experiment 1 showed that a not previously viewed video of the study context provided an effective recall cue, although it was not as effective as the originally viewed video context. Experiments 2 and 3 showed that videos of environments that were conceptually similar to encoding contexts (e.g., both were videos of ball field games) also cued recall, but not as well if the encoding contexts were given specific labels (e.g., "home run") incompatible with test contexts (e.g., a soccer scene). A fourth experiment that used incidental item-context encoding showed that video context reinstatement has a robust effect on paired associate memory, indicating that the video context reinstatement effect does not depend on interactive item-context encoding or free recall testing. PMID:23721293

  7. Verbal Cueing as a Behavior Change Instrument.

    Science.gov (United States)

    Prieto, Alfonso G.; Rutherford, Robert B., Jr.

    A study involving four boys (9 to 14 years old) labeled as emotionally handicapped was conducted to examine the effect of a verbal cueing technique (involving an illogical statement which evokes psychological reactance) on behaviorally disordered children. Illogical statements made by the teacher produced positive change in target behaviors (such…

  8. Auditory functional magnetic resonance imaging in dogs – normalization and group analysis and the processing of pitch in the canine auditory pathways

    OpenAIRE

    Bach, Jan-Peter; Lüpke, Matthias; Dziallas, Peter; Wefstaedt, Patrick; Uppenkamp, Stefan; Seifert, Hermann; Nolte, Ingo

    2016-01-01

    Background Functional magnetic resonance imaging (fMRI) is an advanced and frequently used technique for studying brain functions in humans and increasingly so in animals. A key element of analyzing fMRI data is group analysis, for which valid spatial normalization is a prerequisite. In the current study we applied normalization and group analysis to a dataset from an auditory functional MRI experiment in anesthetized beagles. The stimulation paradigm used in the experiment was composed of si...

  9. Physiological Measures of Auditory Function

    Science.gov (United States)

    Kollmeier, Birger; Riedel, Helmut; Mauermann, Manfred; Uppenkamp, Stefan

    When acoustic signals enter the ears, they pass several processing stages of various complexities before they will be perceived. The auditory pathway can be separated into structures dealing with sound transmission in air (i.e. the outer ear, ear canal, and the vibration of tympanic membrane), structures dealing with the transformation of sound pressure waves into mechanical vibrations of the inner ear fluids (i.e. the tympanic membrane, ossicular chain, and the oval window), structures carrying mechanical vibrations in the fluid-filled inner ear (i.e. the cochlea with basilar membrane, tectorial membrane, and hair cells), structures that transform mechanical oscillations into a neural code, and finally several stages of neural processing in the brain along the pathway from the brainstem to the cortex.

  10. Seeing the Song: Left Auditory Structures May Track Auditory-Visual Dynamic Alignment

    OpenAIRE

    Mossbridge, Julia A.; Grabowecky, Marcia; Suzuki, Satoru

    2013-01-01

    Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it i...

  11. AUDITORY CORTICAL PLASTICITY: DOES IT PROVIDE EVIDENCE FOR COGNITIVE PROCESSING IN THE AUDITORY CORTEX?

    OpenAIRE

    Irvine, Dexter R. F.

    2007-01-01

    The past 20 years have seen substantial changes in our view of the nature of the processing carried out in auditory cortex. Some processing of a cognitive nature, previously attributed to higher order “association” areas, is now considered to take place in auditory cortex itself. One argument adduced in support of this view is the evidence indicating a remarkable degree of plasticity in the auditory cortex of adult animals. Such plasticity has been demonstrated in a wide range of paradigms, i...

  12. The (un)clear effects of invalid retro-cues.

    Directory of Open Access Journals (Sweden)

    Marcel eGressmann

    2016-03-01

    Full Text Available Studies with the retro-cue paradigm have shown that validly cueing objects in visual working memory long after encoding can still benefit performance on subsequent change detection tasks. With regard to the effects of invalid cues, the literature is less clear: some studies reported costs, others did not. We here revisit two recent studies that made interesting suggestions concerning invalid retro-cues: one study suggested that costs only occur for larger set sizes, and another suggested that the inclusion of invalid retro-cues diminishes the retro-cue benefit. New data from one experiment and a reanalysis of published data are provided to address these conclusions. The new data clearly show costs (and benefits) that were independent of set size, and the reanalysis suggests no influence of the inclusion of invalid retro-cues on the retro-cue benefit. Thus, previous interpretations should be treated with some caution at present.

  13. A pilot evaluation of two G-seat cueing schemes

    Science.gov (United States)

    Showalter, T. W.

    1978-01-01

    A comparison was made of two contrasting G-seat cueing schemes. The G-seat, an aircraft simulation subsystem, creates aircraft acceleration cues via seat contour changes. Of the two cueing schemes tested, one was designed to create skin pressure cues and the other body position cues. Each cueing scheme was tested and evaluated subjectively by five pilots regarding its ability to cue the appropriate accelerations in each of four simple maneuvers: a pullout, a pushover, an S-turn, and a thrusting maneuver. Pilot opinion diverged, revealing that the perception and acceptance of G-seat stimuli is a highly individualistic phenomenon. Creating a single acceptable G-seat cueing scheme was therefore deemed to be quite difficult.

  14. Extinction of Drug Cue Reactivity in Methamphetamine-Dependent Individuals

    OpenAIRE

    Price, Kimber L.; Saladin, Michael E.; Baker, Nathaniel L.; Tolliver, Bryan K.; DeSantis, Stacia M.; McRae-Clark, Aimee L.; Brady, Kathleen T.

    2010-01-01

    Conditioned responses to drug-related environmental cues (such as craving) play a critical role in relapse to drug use. Animal models demonstrate that repeated exposure to drug-associated cues in the absence of drug administration leads to the extinction of conditioned responses, but the few existing clinical trials focused on extinction of conditioned responses to drug-related cues in drug-dependent individuals show equivocal results. The current study examined drug-related cue reactivity an...

  15. Cue Reactivity in Virtual Reality: The Role of Context

    OpenAIRE

    Paris, Megan M.; Carter, Brian L.; Traylor, Amy C.; Bordnick, Patrick S.; Day, Susan X.; Armsworth, Mary W.; Cinciripini, Paul M.

    2011-01-01

    Cigarette smokers in laboratory experiments readily respond to smoking stimuli with increased craving. An alternative to traditional cue-reactivity methods (e.g., exposure to cigarette photos), virtual reality (VR) has been shown to be a viable cue presentation method to elicit and assess cigarette craving within complex virtual environments. However, it remains poorly understood whether contextual cues from the environment contribute to craving increases in addition to specific cues, like ci...

  16. On the Motivational Properties of Reward Cues: Individual Differences

    OpenAIRE

    ROBINSON, TERRY E.; Yager, Lindsay M.; Cogan, Elizabeth S.; Saunders, Benjamin T.

    2013-01-01

    Cues associated with rewards, such as food or drugs of abuse, can themselves acquire motivational properties. Acting as incentive stimuli, such cues can exert powerful control over motivated behavior, and in the case of cues associated with drugs, they can goad continued drug-seeking behavior and relapse. However, recent studies reviewed here suggest that there are large individual differences in the extent to which food and drug cues are attributed with incentive salience. Rats prone to appr...

  17. Reactivity to Cannabis Cues in Virtual Reality Environments†

    OpenAIRE

    Bordnick, Patrick S.; Copp, Hilary L.; Traylor, Amy; Graap, Ken M.; Carter, Brian L.; Walton, Alicia; Ferrer, Mirtha

    2009-01-01

    Virtual reality (VR) cue environments have been developed and successfully tested in nicotine, cocaine, and alcohol abusers. Aims in the current article include the development and testing of a novel VR cannabis cue reactivity assessment system. It was hypothesized that subjective craving levels and attention to cannabis cues would be higher in VR environments with cannabis cues compared to VR neutral environments. Twenty nontreatment-seeking current cannabis smokers participated in th...

  18. Memory for location and visual cues in white-eared hummingbirds Hylocharis leucotis

    OpenAIRE

    Guillermo PÉREZ, Carlos LARA, José VICCON-PALE, Martha SIGNORET-POILLON

    2011-01-01

    In nature hummingbirds face floral resources whose availability, quality and quantity can vary spatially and temporally. Thus, they must constantly make foraging decisions about which patches, plants and flowers to visit, partly as a function of the nectar reward. The uncertainty of these decisions would possibly be reduced if an individual could remember locations or use visual cues to avoid revisiting recently depleted flowers. In the present study, we carried out field experiments with whi...

  19. Emotional pictures and sounds: A review of multimodal interactions of emotion cues in multiple domains

    Directory of Open Access Journals (Sweden)

    Antje B M Gerdes

    2014-12-01

    Full Text Available In everyday life, multiple sensory channels jointly trigger emotional experiences and one channel may alter processing in another channel. For example, seeing an emotional facial expression and hearing the voice's emotional tone will jointly create the emotional experience. This example, where auditory and visual input is related to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but can extend to much broader contexts including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended to consider an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging, electrophysiological, and peripheral physiological findings. Furthermore, we integrate these findings and identify similarities or differences. We conclude with suggestions for future research.

  20. The Effects of Overt and Covert Cues on Written Syntax.

    Science.gov (United States)

    Combs, Warren E.; Smith, William L.

    1980-01-01

    Experiments conducted with freshman composition students suggested that (1) the repeated use of a control stimulus passage does not result in increased syntactic complexity; (2) both overt and covert cues elicit more complex writing than do no-cue situations; and (3) the effect of overt cues seems to be retained, at least across a short duration.…

  1. In search of an auditory engram

    Science.gov (United States)

    Fritz, Jonathan; Mishkin, Mortimer; Saunders, Richard C.

    2005-01-01

    Monkeys trained preoperatively on a task designed to assess auditory recognition memory were impaired after removal of either the rostral superior temporal gyrus or the medial temporal lobe but were unaffected by lesions of the rhinal cortex. Behavioral analysis indicated that this result occurred because the monkeys did not or could not use long-term auditory recognition, and so depended instead on short-term working memory, which is unaffected by rhinal lesions. The findings suggest that monkeys may be unable to place representations of auditory stimuli into a long-term store and thus question whether the monkey's cerebral memory mechanisms in audition are intrinsically different from those in other sensory modalities. Furthermore, it raises the possibility that language is unique to humans not only because it depends on speech but also because it requires long-term auditory memory. PMID:15967995

  2. Auditory stimulation and cardiac autonomic regulation

    Directory of Open Access Journals (Sweden)

    Vitor E. Valenti

    2012-08-01

    Full Text Available Previous studies have already demonstrated that auditory stimulation with music influences the cardiovascular system. In this study, we described the relationship between musical auditory stimulation and heart rate variability. Searches were performed with the Medline, SciELO, Lilacs and Cochrane databases using the following keywords: "auditory stimulation", "autonomic nervous system", "music" and "heart rate variability". The selected studies indicated that there is a strong correlation between noise intensity and vagal-sympathetic balance. Additionally, it was reported that music therapy improved heart rate variability in anthracycline-treated breast cancer patients. It was hypothesized that dopamine release in the striatal system induced by pleasurable songs is involved in cardiac autonomic regulation. Musical auditory stimulation influences heart rate variability through a neural mechanism that is not well understood. Further studies are necessary to develop new therapies to treat cardiovascular disorders.

  3. Auditory filters at low-frequencies

    DEFF Research Database (Denmark)

    Orellana, Carlos Andrés Jurado; Pedersen, Christian Sejer; Møller, Henrik

    2009-01-01

    Prediction and assessment of low-frequency noise problems requires information about the auditory filter characteristics at low frequencies. Unfortunately, data at low frequencies are scarce and practically no results have been published for frequencies below 100 Hz. Extrapolation of ERB results from previous studies suggests the filter bandwidth keeps decreasing below 100 Hz, although at a relatively lower rate than at higher frequencies. Main characteristics of the auditory filter were studied from below 100 Hz up to 1000 Hz. Center frequencies evaluated were 50, 63, 125, 250, 500, and 1000 … (…-ear transfer function), the asymmetry of the auditory filter changed from steeper high-frequency slopes at 1000 Hz to steeper low-frequency slopes below 100 Hz. Increasing steepness at low frequencies of the middle-ear high-pass filter is thought to cause this effect. The dynamic range of the auditory filter …
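    For reference, the ERB values being extrapolated typically come from the Glasberg and Moore (1990) fit, ERB(f) = 24.7 (4.37 f/1000 + 1) Hz, which was derived from data above roughly 100 Hz; applying it at 50-63 Hz is exactly the extrapolation the study examines. A quick illustration over the study's center frequencies (this formula is standard, but the values are not the study's measurements):

    ```python
    # Glasberg & Moore (1990) equivalent rectangular bandwidth of the auditory filter.
    def erb_hz(f_hz):
        """ERB in Hz at center frequency f_hz (fit valid above ~100 Hz)."""
        return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

    for f in (50, 63, 125, 250, 500, 1000):       # the study's center frequencies
        print(f, round(erb_hz(f), 1))
    ```

    The formula predicts that bandwidth keeps shrinking below 100 Hz (about 30 Hz at 50 Hz vs. about 133 Hz at 1000 Hz), consistent with the trend described in the abstract; whether the prediction holds that low is what the measurements address.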

  4. Environment for Auditory Research Facility (EAR)

    Data.gov (United States)

    Federal Laboratory Consortium — EAR is an auditory perception and communication research center enabling state-of-the-art simulation of various indoor and outdoor acoustic environments. The heart...

  5. Auditory motion capturing ambiguous visual motion

    Directory of Open Access Journals (Sweden)

    Arjen Alink

    2012-01-01

    Full Text Available In this study, it is demonstrated that moving sounds affect the direction in which one sees visual stimuli move. During the main experiment, sounds were presented consecutively at four speaker locations, inducing leftward or rightward auditory apparent motion. On the path of auditory apparent motion, visual apparent-motion stimuli were presented with a high degree of directional ambiguity. The main outcome of this experiment is that our participants perceived visual apparent-motion stimuli that were ambiguous (equally likely to be perceived as moving leftward or rightward) more often as moving in the same direction as, rather than in the opposite direction to, the auditory apparent motion. During the control experiment, we replicated this finding and found no effect of sound motion direction on eye movements. This indicates that auditory motion can capture our visual motion percept when the visual motion direction is insufficiently determinate, without affecting eye movements.

  6. Effect of omega-3 on auditory system

    Directory of Open Access Journals (Sweden)

    Vida Rahimi

    2014-01-01

    Full Text Available Background and Aim: Omega-3 fatty acids have structural and biological roles in the body's various systems, and numerous studies have investigated them. The auditory system is affected as well. The aim of this article was to review the research on the effect of omega-3 on the auditory system. Methods: We searched the Medline, Google Scholar, PubMed, Cochrane Library and SID search engines with the keywords "auditory" and "omega-3" and consulted textbooks on this subject published between 1970 and 2013. Conclusion: Both excess and deficient amounts of dietary omega-3 fatty acids can harm fetal and infant growth and the development of the brain and central nervous system, especially the auditory system. It is important to determine the adequate dosage of omega-3.

  7. Material differences of auditory source retrieval:Evidence from event-related potential studies

    Institute of Scientific and Technical Information of China (English)

    NIE AiQing; GUO ChunYan; SHEN MoWei

    2008-01-01

    Two event-related potential experiments were conducted to investigate the temporal and spatial distributions of the old/new effects for an item recognition task and an auditory source retrieval task, using pictures and Chinese characters as stimuli, respectively. Stimuli were presented at the center of the screen with their names read out simultaneously by either a female or a male voice during the study phase, and two tests were then performed separately. One test task was to differentiate the old items from the new ones; the other was to judge items that had been read out by a particular voice during the study phase as targets and the other items as non-targets. The results showed that the old/new effect of the auditory source retrieval task was more sustained over time than that of the item recognition task in both experiments, and the spatial distribution of the former effect was wider than that of the latter. Both experiments recorded a reliable old/new effect over the prefrontal cortex during the source retrieval task. However, there were some differences in the old/new effect for the auditory source retrieval task between pictures and Chinese characters, and LORETA source analysis indicated that the differences might be rooted in the temporal lobe. These findings demonstrate that the relationship between the old/new effects of the item recognition task and the auditory source retrieval task supports the dual-process model; the spatial and temporal distributions of the old/new effect elicited by the auditory source retrieval task are modulated by both the features of the experimental material and the perceptual attributes of the voice.

  8. Blocking spatial navigation across environments that have a different shape.

    Science.gov (United States)

    Buckley, Matthew G; Smith, Alastair D; Haselgrove, Mark

    2016-01-01

    According to the geometric module hypothesis, organisms encode a global representation of the space in which they navigate, and this representation is not prone to interference from other cues. A number of studies, however, have shown that both human and non-human animals can navigate on the basis of local geometric cues provided by the shape of an environment. According to the model of spatial learning proposed by Miller and Shettleworth (2007, 2008), geometric cues compete for associative strength in the same manner as non-geometric cues do. The experiments reported here were designed to test whether humans learn about local geometric cues in a manner consistent with the Miller-Shettleworth model. Experiment 1 replicated previous findings that humans transfer navigational behavior, based on local geometric cues, from a rectangle-shaped environment to a kite-shaped environment, and vice versa. In Experiments 2 and 3, it was observed that learning about non-geometric cues blocked, and was blocked by, learning about local geometric cues. The reciprocal blocking observed is consistent with associative theories of spatial learning; however, it is difficult to explain the observed effects with theories of global-shape encoding in their current form. PMID:26569017

  9. Corticofugal modulation of peripheral auditory responses

    OpenAIRE

    Terreros, Gonzalo; Delano, Paul H.

    2015-01-01

    The auditory efferent system originates in the auditory cortex and projects to the medial geniculate body (MGB), inferior colliculus (IC), cochlear nucleus (CN) and superior olivary complex (SOC) reaching the cochlea through olivocochlear (OC) fibers. This unique neuronal network is organized in several afferent-efferent feedback loops including: the (i) colliculo-thalamic-cortico-collicular; (ii) cortico-(collicular)-OC; and (iii) cortico-(collicular)-CN pathways. Recent experiments demonstr...

  10. Corticofugal modulation of peripheral auditory responses

    OpenAIRE

    Paul Hinckley Delano

    2015-01-01

    The auditory efferent system originates in the auditory cortex and projects to the medial geniculate body, inferior colliculus, cochlear nucleus and superior olivary complex reaching the cochlea through olivocochlear fibers. This unique neuronal network is organized in several afferent-efferent feedback loops including: the (i) colliculo-thalamic-cortico-collicular, (ii) cortico-(collicular)-olivocochlear and (iii) cortico-(collicular)-cochlear nucleus pathways. Recent experiments demonstrate...

  11. Auditory memory function in expert chess players

    OpenAIRE

    Fattahi, Fariba; Geshani, Ahmad; Jafari, Zahra; Jalaie, Shohreh; Salman Mahini, Mona

    2015-01-01

    Background: Chess is a game that involves many aspects of high level cognition such as memory, attention, focus and problem solving. Long term practice of chess can improve cognition performances and behavioral skills. Auditory memory, as a kind of memory, can be influenced by strengthening processes following long term chess playing like other behavioral skills because of common processing pathways in the brain. The purpose of this study was to evaluate the auditory memory function of expert...

  12. Music perception, pitch, and the auditory system

    OpenAIRE

    McDermott, Josh H.; Oxenham, Andrew J.

    2008-01-01

    The perception of music depends on many culture-specific factors, but is also constrained by properties of the auditory system. This has been best characterized for those aspects of music that involve pitch. Pitch sequences are heard in terms of relative, as well as absolute, pitch. Pitch combinations give rise to emergent properties not present in the component notes. In this review we discuss the basic auditory mechanisms contributing to these and other perceptual effects in music.

  13. Auditory brain-stem responses in syphilis.

    OpenAIRE

    Rosenhall, U; Roupe, G

    1981-01-01

    Analysis of auditory brain-stem electrical responses (BSER) provides an effective means of detecting lesions in the auditory pathways. In the present study the wave patterns were analysed in 11 patients with secondary or latent syphilis with no clinical symptoms referrable to the central nervous system and in two patients with congenital syphilis and general paralysis. Decreased amplitudes and prolonged latencies occurred frequently in patients with secondary and with advanced syphilis. This ...

  14. Auditory sequence analysis and phonological skill

    OpenAIRE

    Grube, Manon; Kumar, Sukhbinder; Cooper, Freya E.; Turton, Stuart; Griffiths, Timothy D

    2012-01-01

    This work tests the relationship between auditory and phonological skill in a non-selected cohort of 238 school students (age 11) with the specific hypothesis that sound-sequence analysis would be more relevant to phonological skill than the analysis of basic, single sounds. Auditory processing was assessed across the domains of pitch, time and timbre; a combination of six standard tests of literacy and language ability was used to assess phonological skill. A significant correlation between ...

  15. Ambiguous Tilt and Translation Motion Cues after Space Flight and Otolith Assessment during Post-Flight Re-Adaptation

    Science.gov (United States)

    Wood, Scott J.; Clarke, A. H.; Harm, D. L.; Rupert, A. H.; Clement, G. R.

    2009-01-01

    Adaptive changes during space flight in how the brain integrates vestibular cues with other sensory information can lead to impaired movement coordination, vertigo, spatial disorientation and perceptual illusions following G transitions. These studies are designed to examine both the physiological basis and the operational implications of disorientation and tilt-translation disturbances following short-duration space flights.

  16. Simple ears-flexible behavior: Information processing in the moth auditory pathway

    Institute of Scientific and Technical Information of China (English)

    Gerit PFUHL; Blanka KALINOVA; Irena VALTEROVA; Bente G.BERG

    2015-01-01

    Lepidoptera evolved tympanic ears in response to echolocating bats. Comparative studies have shown that moth ears evolved many times independently from chordotonal organs. With only 1 to 4 receptor cells, they are one of the simplest hearing organs. The small number of receptors does not imply simplicity, either in behavior or in the neural circuit. Behaviorally, the response to ultrasound is far from being a simple reflex. Moths' escape behavior is modulated by a variety of cues, especially pheromones, which can alter the auditory response. Neurally, the receptor cell(s) diverges onto many interneurons, enabling parallel processing and feature extraction. Ascending interneurons and sound-sensitive brain neurons innervate a neuropil in the ventrolateral protocerebrum. Further, recent electrophysiological data provide the first glimpses into how the acoustic response is modulated, as well as how ultrasound influences the other senses. So far, the auditory pathway has been studied in noctuids. The findings agree well with common computational principles found in other insects. However, moth ears also show unique mechanical and neural adaptations. Here, we first describe the variety of moths' auditory behavior, especially the co-option of ultrasonic signals for intraspecific communication. Second, we describe the current knowledge of the neural pathway gained from noctuid moths. Finally, we argue that Galleriinae, which show negative and positive phonotaxis, are an interesting model species for future electrophysiological studies of the auditory pathway and multimodal sensory integration, and so are ideally suited for the study of the evolution of behavioral mechanisms given a few receptors [Current Zoology 61 (2):292-302, 2015].

  17. What does spatial alternation tell us about retrosplenial cortex function?

    Directory of Open Access Journals (Sweden)

    Andrew John Dudley Nelson

    2015-05-01

    The retrosplenial cortex supports navigation, but there are good reasons to suppose that it has a very different role in spatial memory from that of the hippocampus and anterior thalamic nuclei. For example, retrosplenial lesions appear to have little or no effect on standard tests of spatial alternation. To examine these differences, the current study sought to determine whether the retrosplenial cortex is important for just one spatial cue type (e.g., allocentric, directional, or intra-maze cues) or whether it helps the animal switch between competing spatial strategies or competing cue types. Using T-maze alternation, retrosplenial-lesioned rats were challenged with situations in which the available spatial information was changed between the sample and test phases, thus taxing the interaction between different cue types. Clear lesion deficits emerged when intra- and extra-maze cues were placed in conflict (by rotating the maze between the sample and choice phases), or when the animals were tested in the dark in a double maze. Finally, temporary inactivation of the retrosplenial cortex by muscimol infusions resulted in a striking deficit on standard T-maze alternation, indicating that, over time, other sites may be able to compensate for the loss of the retrosplenial cortex. This pattern of results is consistent with the impoverished use of both allocentric and directional information, exacerbated by an impaired ability to switch between different cue types.

  18. Speech Evoked Auditory Brainstem Response in Stuttering

    Directory of Open Access Journals (Sweden)

    Ali Akbar Tahaei

    2014-01-01

    Auditory processing deficits have been hypothesized as an underlying mechanism for stuttering. Previous studies have demonstrated abnormal responses at higher levels of the central auditory system in subjects with persistent developmental stuttering (PDS) using speech stimuli. Recently, the potential usefulness of speech-evoked auditory brainstem responses in central auditory processing disorders has been emphasized. The current study used the speech-evoked ABR to investigate the hypothesis that subjects with PDS have specific auditory perceptual dysfunction. Objectives: To determine whether brainstem responses to speech stimuli differ between PDS subjects and normal fluent speakers. Methods: Twenty-five subjects with PDS participated in this study. The speech-ABRs were elicited by the 5-formant synthesized syllable /da/, with a duration of 40 ms. Results: There were significant group differences for the onset and offset transient peaks; subjects with PDS had longer latencies for the onset and offset peaks relative to the control group. Conclusions: Subjects with PDS showed deficient neural timing in the early stages of the auditory pathway, consistent with temporal processing deficits, and their abnormal timing may underlie their disfluency.

  19. The Role of Visual Spatial Attention in Audiovisual Speech Perception

    OpenAIRE

    Andersen, Tobias; Tiippana, K.; Laarni, J.; Kojo, I.; Sams, M.

    2008-01-01

    Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive but recent reports have challenged this view. Here we study the effect of visual spatial attention on the McGurk effect. By presenting a movie of two faces symmetrically displaced to each side of a cen...

  20. Introspective responses to cues and motivation to reduce cigarette smoking influence state and behavioral responses to cue exposure.

    Science.gov (United States)

    Veilleux, Jennifer C; Skinner, Kayla D

    2016-09-01

    In the current study, we aimed to extend smoking cue-reactivity research by evaluating delay discounting as an outcome of cigarette cue exposure. We also separated introspection in response to cues (e.g., self-reporting craving and affect) from cue exposure alone, to determine whether introspection changes behavioral responses to cigarette cues. Finally, we included measures of quit motivation and resistance to smoking to assess motivational influences on cue exposure. Smokers were invited to participate in an online cue-reactivity study. Participants were randomly assigned to view smoking images or neutral images, and were randomized to respond to cues with either craving and affect questions (i.e., introspection) or filler questions. Following cue exposure, participants completed a delay discounting task and then reported state affect, craving, and resistance to smoking, as well as an assessment of quit motivation. We found that after controlling for trait impulsivity, participants who introspected on craving and affect showed higher delay discounting, irrespective of cue type, but we found no effect of response condition on subsequent craving (i.e., craving reactivity). We also found that motivation to quit interacted with experimental conditions to predict state craving and state resistance to smoking. Although asking about craving during cue exposure did not increase later craving, it resulted in greater discounting of delayed rewards. Overall, our findings suggest the need to further assess the implications of introspection and motivation for behavioral outcomes of cue exposure. PMID:27115733
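
    The delay discounting measure used here is commonly quantified with the hyperbolic model V = A/(1 + kD), where a larger k means steeper devaluation of delayed rewards. A minimal sketch of that standard model (the amounts, delays, and k values below are illustrative, not taken from the study):

    ```python
    # Hyperbolic delay discounting: V = A / (1 + k * D). A larger k means
    # steeper discounting of delayed rewards -- the behavioral outcome
    # measured after cue exposure in this study.

    def discounted_value(amount, delay, k):
        """Subjective present value of `amount` received after `delay`."""
        return amount / (1.0 + k * delay)

    # Illustrative comparison: a steeper discounter (higher k) values the
    # same delayed reward less.
    shallow = discounted_value(100.0, 30.0, k=0.01)  # 100 / 1.3
    steep = discounted_value(100.0, 30.0, k=0.05)    # 100 / 2.5
    assert steep < shallow
    ```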

  1. The use of auditory and visual context in speech perception by listeners with normal hearing and listeners with cochlear implants

    Directory of Open Access Journals (Sweden)

    Matthew Winn

    2013-11-01

    There is a wide range of acoustic and visual variability across different talkers and different speaking contexts. Listeners with normal hearing accommodate that variability in ways that facilitate efficient perception, but it is not known whether listeners with cochlear implants can do the same. In this study, listeners with normal hearing (NH) and listeners with cochlear implants (CIs) were tested for accommodation to auditory and visual phonetic contexts created by gender-driven speech differences as well as vowel coarticulation and lip rounding in both consonants and vowels. Accommodation was measured as the shifting of perceptual boundaries between /s/ and /ʃ/ sounds in various contexts, as modeled by mixed-effects logistic regression. Owing to the spectral contrasts thought to underlie these context effects, CI listeners were predicted to perform poorly, but showed considerable success. Listeners with cochlear implants not only showed sensitivity to auditory cues to gender, they were also able to use visual cues to gender (i.e., faces) as a supplement or proxy for information in the acoustic domain, in a pattern that was not observed for listeners with normal hearing. Spectrally degraded stimuli heard by listeners with normal hearing generally did not elicit strong context effects, underscoring the limitations of noise vocoders and/or the importance of experience with electric hearing. Visual cues for consonant lip rounding and vowel lip rounding were perceived in a manner consistent with coarticulation and were generally used more heavily by listeners with CIs. Results suggest that listeners with cochlear implants are able to accommodate various sources of acoustic variability either by attending to appropriate acoustic cues or by inferring them via the visual signal.
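
    The boundary-shift logic behind the logistic modeling can be sketched simply: the category boundary is the stimulus value where the fitted logistic function crosses p = 0.5, i.e. -b0/b1, and a context effect appears as a shift of that crossover. The coefficients below are invented for illustration; the study's actual mixed-effects fits are not reproduced here:

    ```python
    import math

    def p_sh(x, b0, b1):
        """Logistic probability of a /sh/ response at stimulus step x."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

    def boundary(b0, b1):
        """Stimulus value where p = 0.5, i.e. the category boundary -b0/b1."""
        return -b0 / b1

    # Illustrative coefficients for two talker contexts: a context effect
    # shows up as a shift of the 50% crossover point along the continuum.
    b_context_a = boundary(b0=-4.0, b1=1.0)  # boundary at step 4.0
    b_context_b = boundary(b0=-6.0, b1=1.0)  # boundary at step 6.0
    assert b_context_b > b_context_a         # boundary shifted by context
    ```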

  2. Task-specific modulation of human auditory evoked responses in a delayed-match-to-sample task

    Directory of Open Access Journals (Sweden)

    Feng Rong

    2011-05-01

    In this study, we focus our investigation on task-specific cognitive modulation of early cortical auditory processing in the human cerebral cortex. During the experiments, we acquired whole-head magnetoencephalography (MEG) data while participants performed an auditory delayed-match-to-sample (DMS) task and associated control tasks. Using a spatial-filtering beamformer technique to simultaneously estimate multiple source activities inside the human brain, we observed a significant DMS-specific suppression of the auditory evoked response to the second stimulus in a sound pair, with the center of the effect located in the vicinity of the left auditory cortex. For the right auditory cortex, a task-invariant suppression effect was observed in both DMS and control tasks. Furthermore, coherence analysis revealed a beta-band (12-20 Hz) DMS-specific enhancement of the functional interaction between sources in the left auditory cortex and those in the left inferior frontal gyrus, which has been shown to be involved in short-term memory processing during the delay period of the DMS task. Our findings support the view that early evoked cortical responses to incoming acoustic stimuli can be modulated by task-specific cognitive functions by means of frontal-temporal functional interactions.
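
    Spatial-filtering beamformers of the kind described above are typically linearly constrained minimum variance (LCMV) filters, whose unit-gain weights are w = C⁻¹l / (lᵀC⁻¹l) for data covariance C and source leadfield l. A hedged NumPy sketch of that standard formula (toy dimensions and a made-up leadfield; the study's actual pipeline is not specified in the abstract):

    ```python
    import numpy as np

    def lcmv_weights(C, l, reg=1e-9):
        """Unit-gain LCMV spatial filter: w = C^-1 l / (l^T C^-1 l)."""
        Ci = np.linalg.inv(C + reg * np.eye(C.shape[0]))
        return (Ci @ l) / (l @ Ci @ l)

    # Toy example: 3 sensors, 500 samples, one target source.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((3, 500))   # sensors x samples
    C = X @ X.T / X.shape[1]            # data covariance
    l = np.array([1.0, 0.5, -0.3])      # leadfield of the target source
    w = lcmv_weights(C, l)
    assert np.isclose(w @ l, 1.0)       # unit gain on the target source
    source_ts = w @ X                   # estimated source time course
    ```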

  3. Adaptive auditory feedback control of the production of formant trajectories in the Mandarin triphthong /iau/ and its pattern of generalization.

    Science.gov (United States)

    Cai, Shanqing; Ghosh, Satrajit S; Guenther, Frank H; Perkell, Joseph S

    2010-10-01

    In order to test whether auditory feedback is involved in the planning of complex articulatory gestures in time-varying phonemes, the current study examined native Mandarin speakers' responses to perturbations of the first-formant frequency trajectory in their auditory feedback during production of the triphthong /iau/. On average, subjects adaptively adjusted their productions to partially compensate for the perturbations in auditory feedback. This result indicates that auditory feedback control of speech movements is not restricted to quasi-static gestures in monophthongs, as found in previous studies, but also extends to time-varying gestures. To probe the internal structure of the mechanisms of auditory-motor transformations, the pattern of generalization of the adaptation learned on the triphthong /iau/ to other vowels with different temporal and spatial characteristics (produced only under masking noise) was tested. A broad but weak pattern of generalization was observed; the strength of the generalization diminished with increasing dissimilarity from /iau/. The details and implications of the pattern of generalization are examined and discussed in light of previous sensorimotor adaptation studies of both speech and limb motor control and a neurocomputational model of speech motor control. PMID:20968374
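
    Trial-by-trial compensation of this kind is often summarized in the sensorimotor adaptation literature with a simple error-correction rule: each production is adjusted by a fraction of the perceived auditory error. The sketch below is a generic illustration under that assumption, not the neurocomputational model referenced in the abstract; the target, shift, and learning rate are made up:

    ```python
    # Generic error-correction model of auditory-feedback adaptation:
    # each trial, the produced formant is corrected by a fraction of the
    # perceived (perturbed) auditory error.

    def simulate_adaptation(target, perturbation, learning_rate, n_trials):
        """Produced formant values over trials under a constant feedback shift."""
        produced = target
        history = []
        for _ in range(n_trials):
            heard = produced + perturbation      # perturbed auditory feedback
            error = heard - target               # perceived auditory error
            produced -= learning_rate * error    # partial per-trial correction
            history.append(produced)
        return history

    # With a +50 Hz F1 shift, production drifts downward, opposing
    # (compensating for) the perturbation.
    traj = simulate_adaptation(target=700.0, perturbation=50.0,
                               learning_rate=0.1, n_trials=50)
    assert traj[-1] < 700.0
    ```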

  4. Brain dynamic mechanisms on the visual attention scale with Chinese characters cues

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The temporal dynamics evoked in the brain by the scale of visual attention cued by Chinese characters were studied by recording event-related potentials (ERPs). With the orientation of visual attention fixed, 14 healthy young participants performed a search task in which the search array was preceded by the Chinese character cues "大, 中, 小" (large, medium, small). 128-channel scalp ERPs were recorded to study the role that the scale of visual attention plays in visual spatial attention. The results showed no significant difference in the ERP components evoked by the three Chinese character cues except for the inferoposterior N2 latency. The P2 and N2 amplitudes and latencies evoked by the targets differed significantly across the large, medium, and small cues, whereas the P1 and N1 components showed no significant difference. The results suggest that the processing of the scale of visual attention mainly involves the P2 and N2 components, while the P1 and N1 components are mainly related to the processing of visual orientation information.

  5. Dynamic Bayesian Networks for Cue Integration

    OpenAIRE

    Paul Maier; Frederike Petzschner

    2012-01-01

    If we want to understand how humans use contextual cues to solve tasks such as estimating distances from optic flow during path integration, our models need to represent the available information and formally describe how these representations are processed. In particular the temporal dynamics need to be incorporated, since it has been shown that humans exploit short-term experience gained in previous trials (Petzschner und Glasauer, 2011). Existing studies often use a Bayesian approach to mo...
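
    The Bayesian cue-integration approach referenced above builds on the standard reliability-weighted combination of Gaussian cues, in which each cue is weighted by its inverse variance. A minimal sketch of that textbook rule, with made-up distance estimates (the dynamic, trial-to-trial component of the DBN is not modeled here):

    ```python
    # Reliability-weighted (maximum-likelihood) fusion of two Gaussian cues:
    # the fused mean weights each cue by its inverse variance, and the fused
    # variance is always smaller than either cue's variance.

    def fuse(mu_a, var_a, mu_b, var_b):
        """Fuse two Gaussian estimates (mean, variance) into one."""
        w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
        mu = w_a * mu_a + (1.0 - w_a) * mu_b
        var = 1.0 / (1.0 / var_a + 1.0 / var_b)
        return mu, var

    # Optic flow suggests 10 m (noisy); experience from previous trials
    # suggests 8 m (sharper). The fused estimate leans toward the sharper cue.
    mu, var = fuse(10.0, 4.0, 8.0, 1.0)
    assert var < 1.0          # more reliable than either cue alone
    assert 8.0 < mu < 10.0    # pulled toward the more reliable cue
    ```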

  6. Yaw Motion Cues in Helicopter Simulation

    Science.gov (United States)

    Schroeder, Jeffrey A.; Johnson, Walter W.

    1996-01-01

    A piloted simulation that examined the effects of yaw motion cues on pilot-vehicle performance, pilot workload, and pilot motion perception was conducted on the NASA Ames Vertical Motion Simulator. The vehicle model that was used represented an AH-64 helicopter. Three tasks were performed in which only combinations of vehicle yaw and vertical displacement were allowed. The commands issued to the motion platform were modified to present the following four motion configurations for a pilot located forward of the center of rotation: (1) only the linear translations, (2) only the angular rotation, (3) both the linear translations and the angular rotation, and (4) no motion. The objective data indicated that pilot-vehicle performance was reduced and the necessary control activity increased when linear motion was removed; however, the lack of angular rotation did not result in a measured degradation for almost all cases. Pilots also provided subjective assessments of the compensation required, the motion fidelity, and whether or not linear or rotational cockpit motion was present. Ratings of compensation and fidelity were affected only by linear acceleration, and the rotational motion had no significant impact. Also, when only linear motion was present, pilots typically reported the presence of rotation. Thus, linear acceleration cues, not yaw rotational cues, appear necessary to simulate hovering flight.

  7. Cues indicating location in pigeon navigation.

    Science.gov (United States)

    Beason, Robert C; Wiltschko, Wolfgang

    2015-10-01

    Domesticated Rock Pigeons (Columba livia f. domestica) have been selected for returning home after being displaced. They appear to use many of the physical cue sources available in the natural environment for Map-and-Compass navigation. Two compass mechanisms that have been well documented in pigeons are a time-compensated sun compass and a magnetic inclination compass. Location-finding, or map, mechanisms have been more elusive. Visual landmarks, magnetic fields, odors, gravity and now also infrasound have been proposed as sources of information on location. Even in highly familiar locations, pigeons appear to neither use nor need landmarks and can even return to the loft while wearing frosted lenses. Direct and indirect evidence indicates magnetic field information influences pigeon navigation in ways that are consistent with magnetic map components. The role of odors is unclear; it might be motivational in nature rather than navigational. The influence of gravity must be further analyzed. Experiments with infrasound have been interpreted in the sense that they provide information on the home direction, but this hypothesis is inconsistent with the Map-and-Compass Model. All these factors appear to be components of a multifactorial system, with the pigeons being opportunistic, preferring those cues that prove most suitable in their home region. This has made understanding the roles of individual cues challenging. PMID:26149606

  8. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension. PMID:25987192

  9. Improvement of auditory hallucinations and reduction of primary auditory area's activation following TMS

    International Nuclear Information System (INIS)

    Background: In the present case study, improvement of auditory hallucinations following transcranial magnetic stimulation (TMS) therapy was investigated with respect to activation changes of the auditory cortices. Methods: Using functional magnetic resonance imaging (fMRI), activation of the auditory cortices was assessed prior to and after a 4-week TMS series of the left superior temporal gyrus in a schizophrenic patient with medication-resistant auditory hallucinations. Results: Hallucinations decreased slightly after the third and profoundly after the fourth week of TMS. Activation in the primary auditory area decreased, whereas activation in the operculum and insula remained stable. Conclusions: Combination of TMS and repetitive fMRI is promising to elucidate the physiological changes induced by TMS.

  10. Effect of auditory feedback differs according to side of hemiparesis: a comparative pilot study

    Directory of Open Access Journals (Sweden)

    Bensmail Djamel

    2009-12-01

    Background: Following stroke, patients frequently demonstrate loss of motor control and function and altered kinematic parameters of reaching movements. Feedback is an essential component of rehabilitation, and auditory feedback of kinematic parameters may be a useful tool for rehabilitation of reaching movements at the impairment level. The aim of this study was to investigate the effect of 2 types of auditory feedback on the kinematics of reaching movements in hemiparetic stroke patients and to compare differences between patients with right (RHD) and left hemisphere damage (LHD). Methods: 10 healthy controls, 8 stroke patients with LHD and 8 with RHD were included. Patient groups had similar levels of upper limb function. Two types of auditory feedback (spatial and simple) were developed and provided online during reaching movements to 9 targets in the workspace. Kinematics of the upper limb were recorded with an electromagnetic system. Kinematics were compared between groups (Mann-Whitney test) and the effect of auditory feedback on kinematics was tested within each patient group (Friedman test). Results: In the patient groups, peak hand velocity was lower, the number of velocity peaks was higher and movements were more curved than in the healthy group. Despite having a similar clinical level, kinematics differed between the LHD and RHD groups. Peak velocity was similar, but LHD patients had fewer velocity peaks and less curved movements than RHD patients. The addition of auditory feedback improved the curvature index in patients with RHD and worsened peak velocity, the number of velocity peaks and the curvature index in LHD patients. No difference between types of feedback was found in either patient group. Conclusion: In stroke patients, side of lesion should be considered when examining arm reaching kinematics. Further studies are necessary to evaluate differences in responses to auditory feedback between patients with lesions in opposite

  11. External auditory canal carcinoma treatment

    International Nuclear Information System (INIS)

    External auditory canal (EAC) carcinomas are relatively rare conditions that lack an established treatment strategy. We analyzed treatment modalities and outcomes in 32 cases of EAC squamous cell carcinoma treated between 1980 and 2008. The subjects, 17 men and 15 women ranging from 33 to 92 years old (average: 66), were classified by Arriaga's tumor staging into 12 T1, 5 T2, 6 T3, and 9 T4. Survival was calculated by the Kaplan-Meier method. Disease-specific 5-year survival was 100% for T1 and T2, 44% for T3, and 33% for T4. In contrast to the 100% 5-year survival for T1+T2 cancer, the 5-year survival for T3+T4 cancer was 37%, with high recurrence due to positive surgical margins. During the first 22 of the 29 years surveyed, we mainly performed surgery, with irradiation or chemotherapy selected for early disease or, as postoperative therapy, for cases with positive surgical margins. During those 22 years, 5-year survival with T3+T4 cancer was 20%. After we started superselective intra-arterial (IA) rapid-infusion chemotherapy combined with radiotherapy in 2003, we achieved negative surgical margins for advanced disease, and 5-year survival for T3+T4 cancer rose to 80%. (author)
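
    The survival figures above come from the Kaplan-Meier method, which multiplies, at each distinct event time, the fraction of at-risk patients surviving that time. A small self-contained sketch of the estimator (the toy follow-up data are illustrative, not the study's):

    ```python
    def kaplan_meier(times, events):
        """Kaplan-Meier survival estimates.

        times: follow-up durations (e.g., years); events: 1 = death, 0 = censored.
        Returns (event_time, survival) pairs at each distinct event time.
        """
        order = sorted(range(len(times)), key=lambda i: times[i])
        at_risk = len(times)
        s = 1.0
        curve = []
        i = 0
        while i < len(order):
            t = times[order[i]]
            deaths = 0
            n = at_risk
            # Group all subjects sharing this follow-up time.
            while i < len(order) and times[order[i]] == t:
                deaths += events[order[i]]
                at_risk -= 1
                i += 1
            if deaths:  # survival only drops at event (death) times
                s *= (n - deaths) / n
                curve.append((t, s))
        return curve

    # Toy cohort: follow-up in years, 1 = died, 0 = censored.
    curve = kaplan_meier([1, 2, 2, 3, 5, 5], [1, 1, 0, 0, 1, 0])
    ```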

  12. Auditory-prefrontal axonal connectivity in the macaque cortex: quantitative assessment of processing streams.

    Science.gov (United States)

    Bezgin, Gleb; Rybacki, Konrad; van Opstal, A John; Bakker, Rembrandt; Shen, Kelly; Vakorin, Vasily A; McIntosh, Anthony R; Kötter, Rolf

    2014-08-01

    Primate sensory systems subserve complex neurocomputational functions. Consequently, these systems are organised anatomically in a distributed fashion, commonly linking areas to form specialised processing streams. Each stream is related to a specific function, as evidenced by studies of the visual cortex, which features rather prominent segregation into spatial and non-spatial domains. It has been hypothesised that other sensory systems, including the auditory system, are organised in a similar way at the cortical level. Recent studies offer rich qualitative evidence for the dual-stream hypothesis. Here we provide a new paradigm to quantitatively uncover these patterns in the auditory system, based on an analysis of multiple anatomical studies using multivariate techniques. As a test case, we also apply our assessment techniques to the more ubiquitously explored visual system. Importantly, the introduced framework opens the possibility for these techniques to be applied to other neural systems featuring a dichotomised organisation, such as language or music perception. PMID:24980416

  13. Expression and function of scleraxis in the developing auditory system.

    Directory of Open Access Journals (Sweden)

    Zoe F Mann

    A study of genes expressed in the developing inner ear identified the bHLH transcription factor Scleraxis (Scx) in the developing cochlea. Previous work has demonstrated an essential role for Scx in the differentiation and development of tendons, ligaments and cells of chondrogenic lineage. Expression in the cochlea has been shown previously; however, the functional role of Scx in the cochlea is unknown. Using a Scx-GFP reporter mouse line, we examined the spatial and temporal patterns of Scx expression in the developing cochlea between embryonic day 13.5 and postnatal day 25. Embryonically, Scx is expressed broadly throughout the cochlear duct and surrounding mesenchyme, and at postnatal ages it becomes restricted to the inner hair cells and the interdental cells of the spiral limbus. Deletion of Scx results in hearing impairment, indicated by elevated auditory brainstem response (ABR) thresholds and diminished distortion product otoacoustic emission (DPOAE) amplitudes across a range of frequencies. No changes in either gross cochlear morphology or expression of the Scx target genes Col2A, Bmp4 or Sox9 were observed in Scx(-/-) mutants, suggesting that the auditory defects observed in these animals may be a result of unidentified Scx-dependent processes within the cochlea.

  14. Efficacy of auditory training in elderly subjects

    Directory of Open Access Journals (Sweden)

    Aline Albuquerque Morais

    2015-05-01

    Auditory training (AT) has been used for auditory rehabilitation in elderly individuals and is an effective tool for optimizing speech processing in this population. However, it is necessary to distinguish training-related improvements from placebo and test-retest effects. Thus, we investigated the efficacy of short-term auditory training (acoustically controlled auditory training, ACAT) in elderly subjects through behavioral measures and the P300. Sixteen elderly individuals with APD received an initial evaluation (evaluation 1, E1) consisting of behavioral and electrophysiological tests (P300 evoked by tone bursts and speech sounds) to evaluate their auditory processing. The individuals were divided into two groups. The Active Control Group [ACG (n=8)] underwent placebo training; the Passive Control Group [PCG (n=8)] did not receive any intervention. After 12 weeks, the subjects were reevaluated (evaluation 2, E2). Then, all of the subjects underwent ACAT. Following another 12 weeks (8 training sessions), they underwent the final evaluation (evaluation 3, E3). There was no significant difference between E1 and E2 in the behavioral tests [F(9,6)=0.6, p=0.92, Wilks' λ=0.65] or the P300 [F(8,7)=2.11, p=0.17, Wilks' λ=0.29], ruling out placebo and test-retest effects. A significant improvement was observed between the pre- and post-ACAT conditions (E2 and E3) for all auditory skills according to the behavioral methods [F(4,27)=0.18, p=0.94, Wilks' λ=0.97]. However, the same result was not observed for the P300 in any condition; there was no significant difference between P300 stimuli. The ACAT improved the behavioral performance of the elderly for all auditory skills and was an effective method for hearing rehabilitation.

  15. Beethoven's Last Piano Sonata and Those Who Follow Crocodiles: Cross-Domain Mappings of Auditory Pitch in a Musical Context

    Science.gov (United States)

    Eitan, Zohar; Timmers, Renee

    2010-01-01

    Though auditory pitch is customarily mapped in Western cultures onto spatial verticality (high-low), both anthropological reports and cognitive studies suggest that pitch may be mapped onto a wide variety of other domains. We collected a total of 35 pitch mappings and investigated in four experiments how these mappings are used and…

  16. An Investigation of Spatial Hearing in Children with Normal Hearing and with Cochlear Implants and the Impact of Executive Function

    Science.gov (United States)

    Misurelli, Sara M.

    The ability to analyze an "auditory scene"---that is, to selectively attend to a target source while simultaneously segregating and ignoring distracting information---is one of the most important and complex skills utilized by normal hearing (NH) adults. The NH adult auditory system and brain work rather well to segregate auditory sources in adverse environments. However, for some children and individuals with hearing loss, selectively attending to one source in noisy environments can be extremely challenging. In a normal auditory system, information arriving at each ear is integrated, and thus these binaural cues aid in speech understanding in noise. A growing number of individuals who are deaf now receive cochlear implants (CIs), which supply hearing through electrical stimulation to the auditory nerve. In particular, bilateral cochlear implants (BiCIs) are now becoming more prevalent, especially in children. However, because CI sound processing lacks both fine structure cues and coordination between stimulation at the two ears, binaural cues may either be absent or inconsistent. For children with NH and with BiCIs, this difficulty in segregating sources is of particular concern because their learning and development commonly occurs within the context of complex auditory environments. This dissertation intends to explore and understand the ability of children with NH and with BiCIs to function in everyday noisy environments. The goals of this work are to (1) Investigate source segregation abilities in children with NH and with BiCIs; (2) Examine the effect of target-interferer similarity and the benefits of source segregation for children with NH and with BiCIs; (3) Investigate measures of executive function that may predict performance in complex and realistic auditory tasks of source segregation for listeners with NH; and (4) Examine source segregation abilities in NH listeners, from school-age to adults.

  17. Auditory and motor imagery modulate learning in music performance

    Science.gov (United States)

    Brown, Rachel M.; Palmer, Caroline

    2013-01-01

    Skilled performers such as athletes or musicians can improve their performance by imagining the actions or sensory outcomes associated with their skill. Performers vary widely in their auditory and motor imagery abilities, and these individual differences influence sensorimotor learning. It is unknown whether imagery abilities influence both memory encoding and retrieval. We examined how auditory and motor imagery abilities influence musicians' encoding (during Learning, as they practiced novel melodies), and retrieval (during Recall of those melodies). Pianists learned melodies by listening without performing (auditory learning) or performing without sound (motor learning); following Learning, pianists performed the melodies from memory with auditory feedback (Recall). During either Learning (Experiment 1) or Recall (Experiment 2), pianists experienced either auditory interference, motor interference, or no interference. Pitch accuracy (percentage of correct pitches produced) and temporal regularity (variability of quarter-note interonset intervals) were measured at Recall. Independent tests measured auditory and motor imagery skills. Pianists' pitch accuracy was higher following auditory learning than following motor learning and lower in motor interference conditions (Experiments 1 and 2). Both auditory and motor imagery skills improved pitch accuracy overall. Auditory imagery skills modulated pitch accuracy encoding (Experiment 1): Higher auditory imagery skill corresponded to higher pitch accuracy following auditory learning with auditory or motor interference, and following motor learning with motor or no interference. These findings suggest that auditory imagery abilities decrease vulnerability to interference and compensate for missing auditory feedback at encoding. Auditory imagery skills also influenced temporal regularity at retrieval (Experiment 2): Higher auditory imagery skill predicted greater temporal regularity during Recall in the presence of

  19. Contingent capture of involuntary visual attention interferes with detection of auditory stimuli

    Directory of Open Access Journals (Sweden)

    Marc R. Kamke

    2014-06-01

    Full Text Available The involuntary capture of attention by salient visual stimuli can be influenced by the behavioral goals of an observer. For example, when searching for a target item, irrelevant items that possess the target-defining characteristic capture attention more strongly than items not possessing that feature. Such contingent capture involves a shift of spatial attention toward the item with the target-defining characteristic. It is not clear, however, if the associated decrements in performance for detecting the target item are entirely due to involuntary orienting of spatial attention. To investigate whether contingent capture also involves a non-spatial interference, adult observers were presented with streams of visual and auditory stimuli and were tasked with simultaneously monitoring for targets in each modality. Visual and auditory targets could be preceded by a lateralized visual distractor that either did, or did not, possess the target-defining feature (a specific color). In agreement with the contingent capture hypothesis, target-colored distractors interfered with visual detection performance (response time and accuracy) more than distractors that did not possess the target color. Importantly, the same pattern of results was obtained for the auditory task: visual target-colored distractors interfered with sound detection. The decrement in auditory performance following a target-colored distractor suggests that contingent capture involves a source of processing interference in addition to that caused by a spatial shift of attention. Specifically, we argue that distractors possessing the target-defining characteristic enter a capacity-limited, serial stage of neural processing, which delays detection of subsequently presented stimuli regardless of the sensory modality.

  20. Extra-classical tuning predicts stimulus-dependent receptive fields in auditory neurons

    OpenAIRE

    Schneider, David M.; Woolley, Sarah M. N.

    2011-01-01

    The receptive fields of many sensory neurons are sensitive to statistical differences among classes of complex stimuli. For example, excitatory spectral bandwidths of midbrain auditory neurons and the spatial extent of cortical visual neurons differ during the processing of natural stimuli compared to the processing of artificial stimuli. Experimentally characterizing neuronal non-linearities that contribute to stimulus-dependent receptive fields is important for understanding how neurons res...

  1. Auditory Model Identification Using REVCOR Method

    Directory of Open Access Journals (Sweden)

    Lamia Bouafif

    2014-08-01

    Full Text Available Auditory models are very useful in many applications such as speech coding and compression, cochlear prostheses, and audio watermarking. In this paper we develop a new auditory model based on the REVCOR method. This technique is based on estimating the impulse response of a suitable filter characterizing the auditory neuron and the cochlea. The first step of our study focuses on the development of a mathematical model based on the gammachirp system. This model is then programmed, implemented and simulated under Matlab. The obtained results are compared with the experimental values (REVCOR experiments) for validation and better optimization of the model parameters. Two objective criteria are used to optimize the estimated audio model: the SNR (signal-to-noise ratio) and the MQE (mean quadratic error). The simulation results demonstrated that, for the auditory model, only a reduced number of channels are excited (from 3 to 6). This result is very interesting for auditory implants because only the significant channels will be stimulated. Besides, this simplifies the electronic implementation and the medical intervention.
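    A minimal sketch of the two ingredients named in the abstract: a gammachirp impulse response (a gamma-tone envelope with a logarithmic frequency glide) and the two fit criteria, SNR and mean quadratic error. All parameter values are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def erb(fc):
    # Equivalent rectangular bandwidth (Glasberg & Moore approximation)
    return 24.7 * (4.37 * fc / 1000.0 + 1.0)

def gammachirp(fc, fs=16000, n=4, b=1.019, c=-2.0, dur=0.025):
    # Gammachirp impulse response: gamma envelope times a tone whose
    # instantaneous frequency glides logarithmically (the c*ln(t) term)
    t = np.arange(1, int(dur * fs)) / fs        # skip t=0 to avoid log(0)
    env = t ** (n - 1) * np.exp(-2 * np.pi * b * erb(fc) * t)
    ir = env * np.cos(2 * np.pi * fc * t + c * np.log(t))
    return ir / np.max(np.abs(ir))

def fit_criteria(measured, model):
    # The two objective criteria: SNR (dB) and the mean quadratic
    # error between a measured impulse response and the model's
    err = measured - model
    snr = 10 * np.log10(np.sum(measured ** 2) / np.sum(err ** 2))
    mqe = np.mean(err ** 2)
    return snr, mqe

# Compare the model against a (simulated) noisy REVCOR measurement
ir = gammachirp(fc=2000.0)
measured = ir + 0.01 * np.random.default_rng(0).standard_normal(ir.size)
snr, mqe = fit_criteria(measured, ir)
```

    In a real fit, `fc`, `b` and `c` would be optimized until the SNR is maximized (and the MQE minimized) against the measured reverse-correlation response.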

  2. Effects of Caffeine on Auditory Brainstem Response

    Directory of Open Access Journals (Sweden)

    Saleheh Soleimanian

    2008-06-01

    Full Text Available Background and Aim: Blocking of adenosine receptors in the central nervous system by caffeine can increase the level of neurotransmitters such as glutamate. As adenosine receptors are present in almost all brain areas, including the central auditory pathway, it seems caffeine can change conduction along this pathway. The purpose of this study was to evaluate the effects of caffeine on the latency and amplitude of the auditory brainstem response (ABR). Materials and Methods: In this clinical trial, 43 normal male students aged 18-25 years participated. The subjects consumed 0, 2 and 3 mg/kg BW caffeine in three different sessions. Auditory brainstem responses were recorded before and 30 minutes after caffeine consumption. The results were analyzed with Friedman and Wilcoxon tests to assess the effects of caffeine on the auditory brainstem response. Results: Compared to the control condition, the latencies of waves III and V and the I-V interpeak interval decreased significantly after 2 and 3 mg/kg BW caffeine consumption. Wave I latency decreased significantly after 3 mg/kg BW caffeine consumption (p<0.01). Conclusion: The increase in glutamate level resulting from adenosine receptor blocking brings about changes in conduction in the central auditory pathway.
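    The paired nonparametric analysis described above (Friedman omnibus test across the three doses, Wilcoxon signed-rank post hoc) can be run with SciPy as sketched below. All latency values are made-up illustrative numbers, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Simulated wave-V latencies (ms) for 10 subjects under three doses;
# the same subjects appear in every condition (repeated measures)
rng = np.random.default_rng(2)
baseline = 5.6 + 0.1 * rng.standard_normal(10)
lat_0mg = baseline
lat_2mg = baseline - 0.12 + 0.02 * rng.standard_normal(10)
lat_3mg = baseline - 0.15 + 0.02 * rng.standard_normal(10)

# Friedman test: omnibus comparison across the repeated conditions
chi2, p_friedman = stats.friedmanchisquare(lat_0mg, lat_2mg, lat_3mg)

# Wilcoxon signed-rank test: paired post-hoc comparison of one pair
w, p_wilcoxon = stats.wilcoxon(lat_0mg, lat_2mg)
```

    The Friedman test is the rank-based analogue of a one-way repeated-measures ANOVA, which is why it suits within-subject dose comparisons like this one.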

  3. Developmental changes in using verbal self-cueing in task-switching situations: The impact of task practice and task-sequencing demands

    Directory of Open Access Journals (Sweden)

    Jutta Kray

    2013-12-01

    Full Text Available In this study we examined whether developmental changes in using verbal self-cueing for task-goal maintenance depend on the amount of task practice and on task-sequencing demands. To measure task-goal maintenance we applied a switching paradigm in which children either performed only task A or task B in single-task blocks or switched between them on every second trial in mixed-task blocks. Task-goal maintenance was determined by comparing performance between the two block types (mixing costs). The influence of verbal self-cueing was measured by instructing children either to name the next task aloud or not to verbalize during task preparation. Task-sequencing demands were varied between groups: one group received spatial task cues to support keeping track of the task sequence, while the other group did not. We also varied the amount of prior practice in task switching: one group of participants practiced task switching first, before performing the task naming in addition, and the other group did it vice versa. Results of our study investigating younger (8-10 years) and older children (11-13 years) revealed no age differences in the beneficial effects of verbal self-cueing. In line with previous findings, children showed reduced mixing costs under task-naming instructions and under conditions of low task-sequencing demands (with spatial task cues present). Our results also indicated that these benefits were only obtained for those groups of children that first received practice in task switching alone, with no additional verbalization instruction. These findings suggest that internal task-cueing strategies can be used efficiently by children, but only if they have received prior practice in the underlying task, so that the demands of keeping and coordinating various instructions are reduced. Moreover, children benefited from spatial task cues for better task-goal maintenance only if no verbal task-cueing strategy was introduced first.

  4. Efficient coding of spectrotemporal binaural sounds leads to emergence of the auditory space representation

    Directory of Open Access Journals (Sweden)

    Wiktor Mlynarski

    2014-03-01

    Full Text Available To date a number of studies have shown that receptive field shapes of early sensory neurons can be reproduced by optimizing the coding efficiency of natural stimulus ensembles. A still unresolved question is whether the efficient-coding hypothesis explains the formation of neurons that explicitly represent environmental features of different functional importance. This paper proposes that the spatial selectivity of higher auditory neurons emerges as a direct consequence of learning efficient codes for natural binaural sounds. First, it is demonstrated that a linear efficient-coding transform, Independent Component Analysis (ICA), trained on spectrograms of naturalistic simulated binaural sounds extracts the spatial information present in the signal. A simple hierarchical ICA extension allowing for decoding of sound position is proposed. Furthermore, it is shown that units revealing spatial selectivity can be learned from a binaural recording of a natural auditory scene. In both cases a relatively small subpopulation of learned spectrogram features suffices to perform accurate sound localization. A representation of the auditory space is therefore learned in a purely unsupervised way by maximizing coding efficiency and without any task-specific constraints. These results imply that efficient coding is a useful strategy for learning structures that allow for making behaviorally vital inferences about the environment.
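    A toy illustration (not the paper's actual pipeline) of the core idea: when ICA is applied to binaural spectral features whose interaural level difference (ILD) depends on source side, one learned component ends up tracking azimuth without any location labels. All names, the 8-band feature layout, and the ILD magnitude are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Toy binaural "spectrogram" features: left/right log-energy in 8 bands.
# A lateral source adds an interaural level difference: louder at the
# near ear, softer at the far ear.
n_trials = 2000
azimuth = rng.choice([-1.0, 1.0], size=n_trials)      # -1 = left, +1 = right
log_spectrum = rng.standard_normal((n_trials, 8))     # per-trial source spectrum
ild = 0.5 * azimuth[:, None]                          # assumed ILD magnitude
left_ear = log_spectrum - ild
right_ear = log_spectrum + ild
X = np.hstack([left_ear, right_ear])                  # 16-dim binaural feature

# Unsupervised efficient-coding transform: ICA on the binaural features
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
S = ica.fit_transform(X)

# One independent component correlates strongly with source azimuth,
# i.e. spatial information was extracted without supervision
corr = [abs(np.corrcoef(S[:, k], azimuth)[0, 1]) for k in range(4)]
best = float(max(corr))
```

    In the paper the inputs are spectrograms of naturalistic binaural recordings rather than this synthetic ILD toy, but the unsupervised emergence of a spatially selective unit is the same effect.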

  5. The effect of background music in auditory health persuasion

    NARCIS (Netherlands)

    Elbert, Sarah; Dijkstra, Arie

    2013-01-01

    In auditory health persuasion, threatening information regarding health is communicated by voice only. One relevant context of auditory persuasion is the addition of background music. There are different mechanisms through which background music might influence persuasion, for example through mood (

  6. Effect of task-related continuous auditory feedback during learning of tracking motion exercises

    Directory of Open Access Journals (Sweden)

    Rosati Giulio

    2012-10-01

    Full Text Available Background: This paper presents the results of a set of experiments in which we used continuous auditory feedback to augment motor training exercises. This feedback modality is mostly underexploited in current robotic rehabilitation systems, which usually implement only very basic auditory interfaces. Our hypothesis is that properly designed continuous auditory feedback could be used to represent temporal and spatial information that could, in turn, improve performance and motor learning. Methods: We implemented three different experiments on healthy subjects, who were asked to track a target on a screen by moving an input device (controller) with their hand. Different visual and auditory feedback modalities were envisaged. The first experiment investigated whether continuous task-related auditory feedback can help improve performance to a greater extent than error-related audio feedback, or visual feedback alone. In the second experiment we used sensory substitution to compare different types of auditory feedback with equivalent visual feedback, in order to find out whether mapping the same information onto a different sensory channel (the visual channel) yielded effects comparable with those gained in the first experiment. The final experiment applied a continuously changing visuomotor transformation between the controller and the screen and mapped kinematic information, computed in either coordinate system (controller or video), to the audio channel, in order to investigate which information was more relevant to the user. Results: Task-related audio feedback significantly improved performance with respect to visual feedback alone, whilst error-related feedback did not. Secondly, performance in audio tasks was significantly better with respect to the equivalent sensory-substituted visual tasks. Finally, with respect to visual feedback alone, video-task-related sound feedback decreased the tracking error during the learning of a novel

  7. Musical experience shapes top-down auditory mechanisms: evidence from masking and auditory attention performance.

    Science.gov (United States)

    Strait, Dana L; Kraus, Nina; Parbery-Clark, Alexandra; Ashley, Richard

    2010-03-01

    A growing body of research suggests that cognitive functions, such as attention and memory, drive perception by tuning sensory mechanisms to relevant acoustic features. Long-term musical experience also modulates lower-level auditory function, although the mechanisms by which this occurs remain uncertain. In order to tease apart the mechanisms that drive perceptual enhancements in musicians, we posed the question: do well-developed cognitive abilities fine-tune auditory perception in a top-down fashion? We administered a standardized battery of perceptual and cognitive tests to adult musicians and non-musicians, including tasks either more or less susceptible to cognitive control (e.g., backward versus simultaneous masking) and more or less dependent on auditory or visual processing (e.g., auditory versus visual attention). Outcomes indicate lower perceptual thresholds in musicians specifically for auditory tasks that relate with cognitive abilities, such as backward masking and auditory attention. These enhancements were observed in the absence of group differences for the simultaneous masking and visual attention tasks. Our results suggest that long-term musical practice strengthens cognitive functions and that these functions benefit auditory skills. Musical training bolsters higher-level mechanisms that, when impaired, relate to language and literacy deficits. Thus, musical training may serve to lessen the impact of these deficits by strengthening the corticofugal system for hearing. PMID:20018234

  8. Electrophysiological correlates of auditory change detection and change deafness in complex auditory scenes.

    Science.gov (United States)

    Puschmann, Sebastian; Sandmann, Pascale; Ahrens, Janina; Thorne, Jeremy; Weerda, Riklef; Klump, Georg; Debener, Stefan; Thiel, Christiane M

    2013-07-15

    Change deafness describes the failure to perceive even intense changes within complex auditory input, if the listener does not attend to the changing sound. Remarkably, previous psychophysical data provide evidence that this effect occurs independently of successful stimulus encoding, indicating that undetected changes are processed to some extent in auditory cortex. Here we investigated cortical representations of detected and undetected auditory changes using electroencephalographic (EEG) recordings and a change deafness paradigm. We applied a one-shot change detection task, in which participants listened successively to three complex auditory scenes, each of them consisting of six simultaneously presented auditory streams. Listeners had to decide whether all scenes were identical or whether the pitch of one stream was changed between the last two presentations. Our data show significantly increased middle-latency Nb responses for both detected and undetected changes as compared to no-change trials. In contrast, only successfully detected changes were associated with a later mismatch response in auditory cortex, followed by increased N2, P3a and P3b responses, originating from hierarchically higher non-sensory brain regions. These results strengthen the view that undetected changes are successfully encoded at sensory level in auditory cortex, but fail to trigger later change-related cortical responses that lead to conscious perception of change. PMID:23466938

  9. What determines auditory distraction? On the roles of local auditory changes and expectation violations.

    Directory of Open Access Journals (Sweden)

    Jan P Röer

    Full Text Available Both the acoustic variability of a distractor sequence and the degree to which it violates expectations are important determinants of auditory distraction. In four experiments we examined the relative contribution of local auditory changes on the one hand and expectation violations on the other hand in the disruption of serial recall by irrelevant sound. We present evidence for a greater disruption by auditory sequences ending in unexpected steady state distractor repetitions compared to auditory sequences with expected changing state endings even though the former contained fewer local changes. This effect was demonstrated with piano melodies (Experiment 1) and speech distractors (Experiment 2). Furthermore, it was replicated when the expectation violation occurred after the encoding of the target items (Experiment 3), indicating that the items' maintenance in short-term memory was disrupted by attentional capture and not their encoding. This seems to be primarily due to the violation of a model of the specific auditory distractor sequences because the effect vanishes and even reverses when the experiment provides no opportunity to build up a specific neural model about the distractor sequence (Experiment 4). Nevertheless, the violation of abstract long-term knowledge about auditory regularities seems to cause a small and transient capture effect: Disruption decreased markedly over the course of the experiments indicating that participants habituated to the unexpected distractor repetitions across trials. The overall pattern of results adds to the growing literature that the degree to which auditory distractors violate situation-specific expectations is a more important determinant of auditory distraction than the degree to which a distractor sequence contains local auditory changes.

  10. What Determines Auditory Distraction? On the Roles of Local Auditory Changes and Expectation Violations

    Science.gov (United States)

    Röer, Jan P.; Bell, Raoul; Buchner, Axel

    2014-01-01

    Both the acoustic variability of a distractor sequence and the degree to which it violates expectations are important determinants of auditory distraction. In four experiments we examined the relative contribution of local auditory changes on the one hand and expectation violations on the other hand in the disruption of serial recall by irrelevant sound. We present evidence for a greater disruption by auditory sequences ending in unexpected steady state distractor repetitions compared to auditory sequences with expected changing state endings even though the former contained fewer local changes. This effect was demonstrated with piano melodies (Experiment 1) and speech distractors (Experiment 2). Furthermore, it was replicated when the expectation violation occurred after the encoding of the target items (Experiment 3), indicating that the items' maintenance in short-term memory was disrupted by attentional capture and not their encoding. This seems to be primarily due to the violation of a model of the specific auditory distractor sequences because the effect vanishes and even reverses when the experiment provides no opportunity to build up a specific neural model about the distractor sequence (Experiment 4). Nevertheless, the violation of abstract long-term knowledge about auditory regularities seems to cause a small and transient capture effect: Disruption decreased markedly over the course of the experiments indicating that participants habituated to the unexpected distractor repetitions across trials. The overall pattern of results adds to the growing literature that the degree to which auditory distractors violate situation-specific expectations is a more important determinant of auditory distraction than the degree to which a distractor sequence contains local auditory changes. PMID:24400081

  11. Applied research in auditory data representation

    Science.gov (United States)

    Frysinger, Steve P.

    1990-08-01

    A class of data displays, characterized generally as Auditory Data Representation, is described and motivated. This type of data representation takes advantage of the tremendous pattern recognition capability of the human auditory channel. Audible displays offer an alternative means of conveying quantitative data to the analyst to facilitate information extraction, and are successfully used alone and in conjunction with visual displays. The Auditory Data Representation literature is reviewed, along with elements of the allied fields of investigation, Psychoacoustics and Musical Perception. A methodology for applied research in this field, based upon the well-developed discipline of psychophysics, is elaborated using a recent experiment as a case study. This method permits objective evaluation of a data representation technique by comparing it to alternative displays for the pattern recognition task at hand. The psychophysical threshold of signal-to-noise level, for constant pattern recognition performance, is the measure of display effectiveness.
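    The psychophysical threshold described here (the signal-to-noise level yielding constant pattern-recognition performance) is typically estimated with an adaptive staircase. The sketch below simulates one against a hypothetical logistic observer; the threshold, slope, step size and reversal count are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def listener(snr_db, threshold_db=-6.0, slope=1.0):
    # Simulated observer: detection probability follows an assumed
    # logistic psychometric function of SNR
    p = 1.0 / (1.0 + np.exp(-slope * (snr_db - threshold_db)))
    return rng.random() < p

# 2-down/1-up adaptive staircase: converges near the ~70.7%-correct SNR
snr, step = 10.0, 2.0
correct_in_row, direction, reversals = 0, 0, []
while len(reversals) < 12:
    if listener(snr):
        correct_in_row += 1
        if correct_in_row == 2:          # two hits in a row -> make it harder
            correct_in_row = 0
            if direction == +1:
                reversals.append(snr)    # track turned downward: a reversal
            direction = -1
            snr -= step
    else:                                # any miss -> make it easier
        correct_in_row = 0
        if direction == -1:
            reversals.append(snr)        # track turned upward: a reversal
        direction = +1
        snr += step

# Conventional estimate: average SNR over the late reversals
threshold_estimate = float(np.mean(reversals[-8:]))
```

    Comparing two displays then reduces to comparing their estimated SNR thresholds: the display supporting pattern recognition at the lower SNR is the more effective one.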

  12. Cooperative dynamics in auditory brain response

    CERN Document Server

    Kwapien, J; Liu, L C; Ioannides, A A

    1998-01-01

    Simultaneous estimates of the activity in the left and right auditory cortex of five normal human subjects were extracted from Multichannel Magnetoencephalography recordings. Left, right and binaural stimulation were used, in separate runs, for each subject. The resulting time-series of left and right auditory cortex activity were analysed using the concept of mutual information. The analysis constitutes an objective method to address the nature of inter-hemispheric correlations in response to auditory stimulation. The results provide clear evidence for the occurrence of such correlations, mediated by direct information transport, with clear laterality effects: as a rule, the contralateral hemisphere leads by 10-20 ms, as can be seen in the average signal. The strength of the inter-hemispheric coupling, which cannot be extracted from the average data, is found to be highly variable from subject to subject, but remarkably stable for each subject.
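    A minimal sketch of the kind of lagged mutual-information analysis described above, on synthetic data rather than MEG: one "hemisphere" signal is a delayed, noisy copy of the other, and scanning lags for the MI peak recovers the delay. The 15-sample delay, noise level, and bin count are all illustrative assumptions.

```python
import numpy as np

def mutual_info(x, y, bins=16):
    # Histogram (plug-in) estimate of mutual information I(X;Y) in bits
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of X
    py = pxy.sum(axis=0, keepdims=True)       # marginal of Y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Toy "left/right auditory cortex" activities: the right-hemisphere
# signal leads the left by 15 samples (assumed), plus observation noise
rng = np.random.default_rng(0)
right = rng.standard_normal(2000)
left = np.roll(right, 15) + 0.3 * rng.standard_normal(2000)

# Scan candidate lags; the MI peak recovers the inter-hemispheric delay
# (the first 30 samples are trimmed to discard roll-induced wraparound)
mi = [mutual_info(left[30:], np.roll(right, lag)[30:]) for lag in range(31)]
best_lag = int(np.argmax(mi))
```

    Unlike a cross-correlogram, the MI profile also captures nonlinear dependence, which is why it is suited to quantifying coupling strength that "cannot be extracted from the average data".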

  13. A Circuit for Motor Cortical Modulation of Auditory Cortical Activity

    OpenAIRE

    Nelson, Anders; Schneider, David M.; Takatoh, Jun; Sakurai, Katsuyasu; Wang, Fan; Mooney, Richard

    2013-01-01

    Normal hearing depends on the ability to distinguish self-generated sounds from other sounds, and this ability is thought to involve neural circuits that convey copies of motor command signals to various levels of the auditory system. Although such interactions at the cortical level are believed to facilitate auditory comprehension during movements and drive auditory hallucinations in pathological states, the synaptic organization and function of circuitry linking the motor and auditory corti...

  14. Auditory and motor imagery modulate learning in music performance

    Directory of Open Access Journals (Sweden)

    Rachel M. Brown

    2013-07-01

    Full Text Available Skilled performers such as athletes or musicians can improve their performance by imagining the actions or sensory outcomes associated with their skill. Performers vary widely in their auditory and motor imagery abilities, and these individual differences influence sensorimotor learning. It is unknown whether imagery abilities influence both memory encoding and retrieval. We examined how auditory and motor imagery abilities influence musicians' encoding (during Learning, as they practiced novel melodies) and retrieval (during Recall of those melodies). Pianists learned melodies by listening without performing (auditory learning) or performing without sound (motor learning); following Learning, pianists performed the melodies from memory with auditory feedback (Recall). During either Learning (Experiment 1) or Recall (Experiment 2), pianists experienced either auditory interference, motor interference, or no interference. Pitch accuracy (percentage of correct pitches produced) and temporal regularity (variability of quarter-note interonset intervals) were measured at Recall. Independent tests measured auditory and motor imagery skills. Pianists' pitch accuracy was higher following auditory learning than following motor learning and lower in motor interference conditions (Experiments 1 and 2). Both auditory and motor imagery skills improved pitch accuracy overall. Auditory imagery skills modulated pitch accuracy encoding (Experiment 1): Higher auditory imagery skill corresponded to higher pitch accuracy following auditory learning with auditory or motor interference, and following motor learning with motor or no interference. These findings suggest that auditory imagery abilities decrease vulnerability to interference and compensate for missing auditory feedback at encoding. Auditory imagery skills also influenced temporal regularity at retrieval (Experiment 2): Higher auditory imagery skill predicted greater temporal regularity during Recall in the

  15. Functional neuroanatomy of auditory scene analysis in Alzheimer's disease

    OpenAIRE

    Golden, Hannah L.; Jennifer L. Agustus; Johanna C. Goll; Downey, Laura E; Mummery, Catherine J.; Jonathan M Schott; Crutch, Sebastian J.; Jason D Warren

    2015-01-01

    Auditory scene analysis is a demanding computational process that is performed automatically and efficiently by the healthy brain but vulnerable to the neurodegenerative pathology of Alzheimer's disease. Here we assessed the functional neuroanatomy of auditory scene analysis in Alzheimer's disease using the well-known ‘cocktail party effect’ as a model paradigm whereby stored templates for auditory objects (e.g., hearing one's spoken name) are used to segregate auditory ‘foreground’ and ‘back...

  16. Auditory ERP response to successive stimuli in infancy

    OpenAIRE

    Chen, Ao; Peter, Varghese; Burnham, Denis

    2016-01-01

    Background. Auditory Event-Related Potentials (ERPs) are useful for understanding early auditory development among infants, as they allow the collection of a relatively large amount of data in a short time. So far, studies that have investigated development in auditory ERPs in infancy have mainly used single sounds as stimuli. Yet in real life, infants must decode successive rather than single acoustic events. In the present study, we tested 4-, 8-, and 12-month-old infants’ auditory ERPs to m...

  17. Auditory Neuropathy Spectrum Disorder Masquerading as Social Anxiety

    OpenAIRE

    Behere, Rishikesh V.; Rao, Mukund G.; Mishra, Shree; Varambally, Shivarama; Nagarajarao, Shivashankar; Bangalore N Gangadhar

    2015-01-01

    The authors report a case of a 47-year-old man who presented with treatment-resistant anxiety disorder. Behavioral observation raised clinical suspicion of auditory neuropathy spectrum disorder. The presence of auditory neuropathy spectrum disorder was confirmed on audiological investigations. The patient was experiencing extreme symptoms of anxiety, which initially masked the underlying diagnosis of auditory neuropathy spectrum disorder. Challenges in diagnosis and treatment of auditory neur...

  18. Auditory resting-state network connectivity in tinnitus: a functional MRI study.

    Directory of Open Access Journals (Sweden)

    Audrey Maudoux

    Full Text Available The underlying functional neuroanatomy of tinnitus remains poorly understood. Few studies have focused on functional cerebral connectivity changes in tinnitus patients. The aim of this study was to test if functional MRI "resting-state" connectivity patterns in the auditory network differ between tinnitus patients and normal controls. Thirteen chronic tinnitus subjects and fifteen age-matched healthy controls were studied on a 3 tesla MRI. Connectivity was investigated using independent component analysis and an automated component selection approach taking into account the spatial and temporal properties of each component. Connectivity in extra-auditory regions such as the brainstem, basal ganglia/NAc, cerebellum, parahippocampal, right prefrontal, parietal, and sensorimotor areas was found to be increased in tinnitus subjects. The right primary auditory cortex, left prefrontal, left fusiform gyrus, and bilateral occipital regions showed decreased connectivity in tinnitus. These results show that there is a modification of cortical and subcortical functional connectivity in tinnitus encompassing attentional, mnemonic, and emotional networks. Our data corroborate the hypothesized implication of non-auditory regions in tinnitus physiopathology and suggest that various regions of the brain seem to be involved in the persistent awareness of the phenomenon as well as in the development of the associated distress leading to disabling chronic tinnitus.

  19. Auditory Resting-State Network Connectivity in Tinnitus: A Functional MRI Study

    Science.gov (United States)

    Maudoux, Audrey; Lefebvre, Philippe; Cabay, Jean-Evrard; Demertzi, Athena; Vanhaudenhuyse, Audrey; Laureys, Steven; Soddu, Andrea

    2012-01-01

    The underlying functional neuroanatomy of tinnitus remains poorly understood. Few studies have focused on functional cerebral connectivity changes in tinnitus patients. The aim of this study was to test if functional MRI “resting-state” connectivity patterns in the auditory network differ between tinnitus patients and normal controls. Thirteen chronic tinnitus subjects and fifteen age-matched healthy controls were studied on a 3 tesla MRI. Connectivity was investigated using independent component analysis and an automated component selection approach taking into account the spatial and temporal properties of each component. Connectivity in extra-auditory regions such as the brainstem, basal ganglia/NAc, cerebellum, parahippocampal, right prefrontal, parietal, and sensorimotor areas was found to be increased in tinnitus subjects. The right primary auditory cortex, left prefrontal, left fusiform gyrus, and bilateral occipital regions showed decreased connectivity in tinnitus. These results show that there is a modification of cortical and subcortical functional connectivity in tinnitus encompassing attentional, mnemonic, and emotional networks. Our data corroborate the hypothesized implication of non-auditory regions in tinnitus physiopathology and suggest that various regions of the brain seem to be involved in the persistent awareness of the phenomenon as well as in the development of the associated distress leading to disabling chronic tinnitus. PMID:22574141
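    The two tinnitus records above rely on independent component analysis (ICA) to decompose resting-state signals into spatially/temporally independent components. As a hedged illustration of the decomposition idea only (not the authors' pipeline, which also used an automated component-selection step), the following sketch unmixes synthetic "time courses" with scikit-learn's FastICA; all signals, dimensions, and the noise level are invented:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two hypothetical source time courses (stand-ins for, e.g., an "auditory"
# and a "non-auditory" component) mixed into four observed signals.
sources = np.c_[np.sin(7 * t), np.sign(np.sin(3 * t))]
mixing = rng.normal(size=(2, 4))
observed = sources @ mixing + 0.05 * rng.normal(size=(2000, 4))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)  # estimated component time courses
print(recovered.shape)                   # prints (2000, 2)
```

    In a real fMRI pipeline the "observed" matrix would be voxel time series, and selecting which recovered component corresponds to the auditory network is a separate, nontrivial step.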

  20. Comparison of Auditory Event-Related Potential P300 in Sighted and Early Blind Individuals

    Directory of Open Access Journals (Sweden)

    Fatemeh Heidari

    2010-06-01

    Full Text Available Background and Aim: Following early visual deprivation, the neural network involved in processing auditory spatial information undergoes a profound reorganization. Event-related potentials provide accurate information about the time course of neural activation as well as perceptual and cognitive processes, making them well suited to investigating this reorganization. In this study, the latency and amplitude of the auditory P300 were compared between sighted and early blind individuals in the age range of 18-25 years. Methods: In this cross-sectional study, the auditory P300 potential was measured in a conventional oddball paradigm using two tone-burst stimuli (1000 and 2000 Hz) in 40 sighted subjects and 19 early blind subjects with a mean age of 20.94 years. Results: The mean latency of the P300 in early blind subjects was significantly shorter than in sighted subjects (p=0.00). There was no significant difference in amplitude between the two groups (p>0.05). Conclusion: The reduced latency of the P300 in early blind subjects compared to sighted subjects probably indicates that automatic processing and information categorization are faster in early blind subjects because of sensory compensation. It seems that neural plasticity increases the rate of auditory processing and attention in early blind subjects.
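    The latency comparison in the record above is a standard two-group contrast (40 sighted vs. 19 early blind subjects). A minimal sketch of such a test with SciPy follows; the latency values are fabricated for illustration only, since the record does not report the group means:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical P300 latencies in milliseconds (invented numbers, not the study's data).
sighted = rng.normal(330, 15, size=40)
early_blind = rng.normal(290, 15, size=19)

# Independent-samples t-test on the two groups' mean latencies.
t_stat, p_value = stats.ttest_ind(sighted, early_blind)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

    With samples this clearly separated the test reports a significant difference; the study's own analysis may have differed in detail (e.g. equal-variance assumptions).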

  1. Auditory Brainstem Response Improvements in Hyperbillirubinemic Infants

    Science.gov (United States)

    Abdollahi, Farzaneh Zamiri; Manchaiah, Vinaya; Lotfi, Yones

    2016-01-01

    Background and Objectives Hyperbilirubinemia in infants has been associated with neuronal damage, including damage to the auditory system. Some researchers have suggested that bilirubin-induced auditory neuronal damage may be temporary and reversible. This study aimed to investigate auditory neuropathy and the reversibility of auditory abnormalities in hyperbilirubinemic infants. Subjects and Methods The study participants included 41 full-term hyperbilirubinemic infants (mean age 39.24 days) with normal birth weight (3,200-3,700 grams) who were admitted to hospital for hyperbilirubinemia, and 39 normal infants (mean age 35.54 days) without hyperbilirubinemia or other hearing-loss risk factors, included to rule out maturational changes. All infants in the hyperbilirubinemic group had serum bilirubin levels above 20 milligrams per deciliter and had undergone one blood exchange transfusion. Hearing evaluation for each infant was conducted twice: the first after hyperbilirubinemia treatment and before leaving hospital, and the second three months after the first evaluation. Hearing evaluations included transient evoked otoacoustic emission (TEOAE) screening and auditory brainstem response (ABR) threshold tracing. Results The TEOAE and ABR results of the control group and the TEOAE results of the hyperbilirubinemic group did not change significantly from the first to the second evaluation. However, the ABR results of the hyperbilirubinemic group improved significantly from the first to the second assessment (p=0.025). Conclusions The results suggest that bilirubin-induced auditory neuronal damage can be reversible over time, so we suggest that infants with hyperbilirubinemia who fail the first hearing tests should be reevaluated after 3 months of treatment. PMID:27144228
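    The within-group comparison in the record above (the same infants tested twice, three months apart) is a paired design. A hedged sketch of such a paired test is shown below with synthetic ABR threshold values; the numbers and the choice of a Wilcoxon signed-rank test are illustrative assumptions, not the study's reported analysis:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Invented ABR thresholds (dB) for 41 infants at the two evaluations;
# the second evaluation is simulated to show a modest improvement.
first = rng.normal(45, 8, size=41).round()
second = first - rng.normal(6, 3, size=41).round()

# Paired, non-parametric comparison of the two evaluations.
w_stat, p_value = stats.wilcoxon(first, second)
print(f"W = {w_stat}, p = {p_value:.4f}")
```

    A paired t-test (`stats.ttest_rel`) would be the parametric alternative if the threshold differences were approximately normal.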

  2. Location serves as a conditional cue when harp seals (Pagophilus groenlandicus) solve object discrimination reversal problems.

    Science.gov (United States)

    Walsh, Stephanie J; Skinner, Darlene M; Martin, Gerard M

    2007-03-01

    We examined the capacity of harp seals (Pagophilus groenlandicus) to use spatial context (i.e., their tank) as a conditional cue to solve a two-choice visual discrimination reversal task. Seals were trained to touch one of two 3D objects. Two of four seals experienced a context shift that coincided with each of five reversals in the reward value of the two stimuli (i.e., a reversal of S+ and S-); these seals solved the six discriminations in significantly fewer trials than did seals that did not experience a context shift with the contingency reversal. Thus, harp seals use contextual cues when encoding information. The findings are discussed in terms of harp seals' adaptations to the pack-ice environment, the constraints of the learning tasks, and the nature of the subjects that were raised in captivity. PMID:17479741

  3. Skin-derived cues control arborization of sensory dendrites in Caenorhabditis elegans.

    Science.gov (United States)

    Salzberg, Yehuda; Díaz-Balzac, Carlos A; Ramirez-Suarez, Nelson J; Attreed, Matthew; Tecle, Eillen; Desbois, Muriel; Kaprielian, Zaven; Bülow, Hannes E

    2013-10-10

    Sensory dendrites depend on cues from their environment to pattern their growth and direct them toward their correct target tissues. Yet, little is known about dendrite-substrate interactions during dendrite morphogenesis. Here, we describe MNR-1/menorin, which is part of the conserved Fam151 family of proteins and is expressed in the skin to control the elaboration of "menorah"-like dendrites of mechanosensory neurons in Caenorhabditis elegans. We provide biochemical and genetic evidence that MNR-1 acts as a contact-dependent or short-range cue in concert with the neural cell adhesion molecule SAX-7/L1CAM in the skin and through the neuronal leucine-rich repeat transmembrane receptor DMA-1 on sensory dendrites. Our data describe an unknown pathway that provides spatial information from the skin substrate to pattern sensory dendrite development nonautonomously. PMID:24120132

  4. Brain networks underlying navigation in the Cincinnati water maze with external and internal cues.

    Science.gov (United States)

    Arias, Natalia; Méndez, Marta; Arias, Jorge L

    2014-07-25

    The present study investigated the behavioural performance of Wistar rats, and the contributions of different brain regions, in a spatial task in the Cincinnati water maze (CWM) under two conditions: one in which both distal and proximal visual cues were available (CWM-light group, n=7) and another in which visual cues were eliminated by testing in complete darkness (CWM-dark group, n=7). There were differences in behavioural performance. Measures of brain energy metabolism revealed significant differences in the infralimbic cortex, orbitofrontal cortex and anterodorsal striatum. Different brain networks were also found: the CWM-light group showed a relationship between the orbitofrontal cortex and medial septum, whereas the CWM-dark group revealed three different networks involving the prefrontal cortex, ventral striatum, hippocampus and amygdala nuclei. The study shows that brain activation differs in these two conditions. PMID:24915295

  5. A virtual auditory environment for investigating the auditory signal processing of realistic sounds

    DEFF Research Database (Denmark)

    Favrot, Sylvain Emmanuel

    A loudspeaker-based virtual auditory environment (VAE) has been developed to provide a realistic versatile research environment for investigating the auditory signal processing in real environments, i.e., considering multiple sound sources and room reverberation. The VAE allows a full control of...... the acoustic scenario in order to systematically study the auditory processing of reverberant sounds. It is based on the ODEON software, which is state-of-the-art software for room acoustic simulations developed at Acoustic Technology, DTU. First, a MATLAB interface to the ODEON software has been...

  6. Scene-Based Contextual Cueing in Pigeons

    OpenAIRE

    Wasserman, Edward A.; Teng, Yuejia; Brooks, Daniel I.

    2014-01-01

    Repeated pairings of a particular visual context with a specific location of a target stimulus facilitate target search in humans. We explored an animal model of such contextual cueing. Pigeons had to peck a target which could appear in one of four locations on color photographs of real-world scenes. On half of the trials, each of four scenes was consistently paired with one of four possible target locations; on the other half of the trials, each of four different scenes was randomly paired w...

  7. Attention to health cues on product packages

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Scholderer, Joachim

    2011-01-01

    The objectives of the study were (a) to examine which information and design elements on dairy product packages operate as cues in consumer evaluations of product healthfulness, and (b) to measure the degree to which consumers voluntarily attend to these elements during product choice. Visual...... attention was measured by means of eye-tracking. Task (free viewing, product healthfulness evaluation, and purchase likelihood evaluation) and product (five different yoghurt products) were varied in a mixed within-between subjects design. The free viewing condition served as a baseline against which...

  8. Blood cues induce antipredator behavior in Nile tilapia conspecifics.

    Directory of Open Access Journals (Sweden)

    Rodrigo Egydio Barreto

    Full Text Available In this study, we show that the fish Nile tilapia displays an antipredator response to chemical cues present in the blood of conspecifics. This is the first report of alarm response induced by blood-borne chemical cues in fish. There is a body of evidence showing that chemical cues from epidermal 'club' cells elicit an alarm reaction in fish. However, the chemical cues of these 'club' cells are restricted to certain species of fish. Thus, as a parsimonious explanation, we assume that an alarm response to blood cues is a generalized response among animals because it occurs in mammals, birds and protostomian animals. Moreover, our results suggest that researchers must use caution when studying chemically induced alarm reactions because it is difficult to separate club cell cues from traces of blood.

  9. Auditory Hypersensitivity in Children with Autism Spectrum Disorders

    Science.gov (United States)

    Lucker, Jay R.

    2013-01-01

    A review of records was completed to determine whether children with auditory hypersensitivities have difficulty tolerating loud sounds due to auditory-system factors or some other factors not directly involving the auditory system. Records of 150 children identified as not meeting autism spectrum disorders (ASD) criteria and another 50 meeting…

  10. AN EVALUATION OF AUDITORY LEARNING IN FILIAL IMPRINTING

    NARCIS (Netherlands)

    BOLHUIS, JJ; VANKAMPEN, HS

    1992-01-01

    The characteristics of auditory learning in filial imprinting in precocial birds are reviewed. Numerous studies have demonstrated that the addition of an auditory stimulus improves following of a visual stimulus. This paper evaluates whether there is genuine auditory imprinting, i.e. the formation o

  11. Auditory Stream Biasing in Children with Reading Impairments

    Science.gov (United States)

    Ouimet, Tialee; Balaban, Evan

    2010-01-01

    Reading impairments have previously been associated with auditory processing differences. We examined "auditory stream biasing", a global aspect of auditory temporal processing. Children with reading impairments, control children and adults heard a 10 s long stream-bias-inducing sound sequence (a repeating 1000 Hz tone) and a test sequence (eight…

  12. Auditory issues in handheld land mine detectors

    Science.gov (United States)

    Vause, Nancy L.; Letowski, Tomasz R.; Ferguson, Larry G.; Mermagen, Timothy J.

    1999-08-01

    Most handheld landmine detection systems use tones or other simple acoustic signals to provide detector information to the operator. Such signals are not necessarily the best carriers of information about the characteristics of hidden objects. To be effective, the auditory signals must present the information in a manner that the operator can comfortably and efficiently interpret under stress and high mental load. The signals must also preserve their audibility and specific properties in various adverse acoustic environments. This paper presents several issues in optimizing the audio display interface between the operator and the machine.

  13. Auditory Perception of Statistically Blurred Sound Textures

    DEFF Research Database (Denmark)

    McWalter, Richard Ian; MacDonald, Ewen; Dau, Torsten

    Sound textures have been identified as a category of sounds which are processed by the peripheral auditory system and captured with running time-averaged statistics. Although sound textures are temporally homogeneous, they offer a listener enough information to identify and differentiate...... sources. This experiment investigated the ability of the auditory system to identify statistically blurred sound textures and the perceptual relationship between sound textures. Identification performance for statistically blurred sound textures presented at a fixed blur increased over those presented as a...... gradual blur. The results suggest that the correct identification of sound textures is influenced by the preceding blurred stimulus. These findings draw parallels to the recognition of blurred images....
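    The "running time-averaged statistics" mentioned in the record above can be illustrated with a toy computation: summarising a sound by the mean and variance of its rectified amplitude in overlapping windows. The window length, hop size, and test signal below are arbitrary choices for illustration, not the study's parameters:

```python
import numpy as np

def running_stats(signal, win=1024, hop=512):
    """Mean and variance of |signal| in overlapping windows."""
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    env = np.abs(np.array(frames))
    return env.mean(axis=1), env.var(axis=1)

# One second of amplitude-modulated noise as a texture-like test signal.
sr = 16000
t = np.arange(sr) / sr
noise_like = np.random.default_rng(0).normal(size=sr) * (0.5 + 0.5 * np.sin(2 * np.pi * 2 * t))

means, variances = running_stats(noise_like)
print(means.shape, variances.shape)
```

    Real texture-synthesis models track a much richer statistic set (e.g. envelope correlations across frequency channels); the window statistics here are only the simplest instance of the idea.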

  14. Binaural processing by the gecko auditory periphery

    DEFF Research Database (Denmark)

    Christensen-Dalsgaard, Jakob; Tang, Ye Zhong; Carr, Catherine E

    2011-01-01

    Tokay gecko with neurophysiological recordings from the auditory nerve. Laser vibrometry shows that their ear is a two-input system with approximately unity interaural transmission gain at the peak frequency (around 1.6 kHz). Median interaural delays are 260 μs, almost three times larger than predicted...... from gecko head size, suggesting interaural transmission may be boosted by resonances in the large, open mouth cavity (Vossen et al., 2010). Auditory nerve recordings are sensitive to both interaural time differences (ITD) and interaural level differences (ILD), reflecting the acoustical interactions...
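    The gecko record above notes that the measured 260 μs interaural delay is almost three times larger than predicted from head size. That comparison can be sanity-checked with back-of-envelope arithmetic (maximum acoustic delay across a head of width d is roughly d / c); the head width used here is an assumed, illustrative value, not a figure from the study:

```python
# Rough acoustic prediction of the maximum interaural time difference (ITD).
head_width_m = 0.03       # assumed ~3 cm interaural distance (illustrative)
speed_of_sound = 343.0    # m/s in air
predicted_itd_us = head_width_m / speed_of_sound * 1e6
print(f"predicted ITD = {predicted_itd_us:.0f} us")  # ~87 us, vs 260 us measured
```

    With this assumed head width the acoustic prediction is indeed about a third of the measured delay, consistent with the abstract's suggestion that internal coupling (e.g. the mouth cavity) boosts the effective delay.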

  15. Cues of mosquito host finding and oviposition site selection

    OpenAIRE

    Afify, Ali

    2014-01-01

    The aim of this work was to study odor cues that affect mosquito host-seeking and oviposition behavior. Due to the vastness (and occasional contradictions) of the mosquito oviposition cues available in the literature, I started by reviewing these cues, their sources in nature, and their roles in mosquito oviposition. Then, I tested the oviposition response of Aedes aegypti towards the contradictory oviposition odor p-cresol and its isomer m-cresol in different situations. p-cresol showed an oviposition d...

  16. Increasing confidence in vergence as a cue to distance.

    OpenAIRE

    Tresilian, J R; Mon-Williams, M; Kelly, B M

    1999-01-01

    Multiple cues contribute to the visual perception of an object's distance from the observer. The manner in which the nervous system combines these various cues is of considerable interest. Although it is accepted that image cues play a significant role in distance perception, controversy exists regarding the use of kinaesthetic information about the eyes' state of convergence. We used a perturbation technique to explore the contribution of vergence to visually based distance estimates as a fu...

  17. Comparison of horizontal head movements evoked by auditory and visual targets.

    Science.gov (United States)

    Fuller, J H

    1996-01-01

    Head movement propensity (the pattern of head saccades dependent on the method of target presentation) varies among individuals. The present group of 9 young adults was previously ranked in a visual saccadic task according to this propensity. The present report examines how and why this propensity changes if the saccades are made to auditory targets. 1) Spatially identical, interleaved, auditorily and visually elicited horizontal saccadic gaze shifts (jumps) differed in amplitude and in starting and/or ending position. The jumps were executed in two head movement modes: first, the non-aligned mode was a standard reaction-time single gaze step between two points. Second, the head-aligned mode required alignment of the head with the fixation (starting) point; thereafter both modes were identical. All results in the auditory task are expressed relative to the visual results. 2) In the non-aligned mode, head movement amplitudes were increased on average by 15% (for example, an 80 degree jump elicited a 12 degree larger head movement), and velocity decreased by 12%, reflecting the increased demands of the auditory task. More importantly, the differences between subjects were narrowed; that is, head movement propensity was homogenized in the auditory task. In the visual task, head-movers willingly move their heads off and across the midline, whereas non-movers are unwilling to leave the midline from eccentric starting points or to eccentric ending points. This is called the midline attraction effect and was previously linked to spatial reference frames. The homogenization in the auditory task was characterized by head-movers increasing, and non-movers decreasing, their midline attraction, suggesting altered spatial reference frames. 3) For heuristic purposes, the ideal head-mover is defined by a gain of 1.0 in the visual task, and by external earth-fixed reference frames. Similarly, the ideal non-mover has a gain of 0.0 and has a bias toward the body (or some part of the body...

  18. Haven't a Cue? Mapping the CUE Space as an Aid to HRA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; Ronald L Boring; Jacques Hugo; William Phoenix

    2012-06-01

    Advances in automation present a new modeling environment for the human reliability analysis (HRA) practitioner. Many, if not most, current-day HRA methods have their origins in characterizing and quantifying human performance in analog environments where mode awareness and system status indications are potentially less comprehensive, but simpler to comprehend at a glance, than in advanced presentation systems. The introduction of highly complex automation has the potential to lead to decreased levels of situation awareness caused by the need for increased monitoring; confusion regarding the often non-obvious causes of automation failures; and emergent system dependencies that formerly may have been uncharacterized. Understanding the relation of the incoming cues available to operators during plant upset conditions, in conjunction with operating procedures, yields insight into the nature of the expected operator response in this control room environment. Static system methods such as fault trees do not contain the appropriate temporal information or necessarily specify the relationships among the cues leading to operator response. In this paper, we do not attempt to replace the standard performance shaping factors commonly used in HRA, nor offer a new HRA method; existing methods may suffice. Rather, we strive to enhance current understanding of the basis for operator response through a technique that can be used during the qualitative portion of the HRA analysis process. The CUE map is a means to visualize the relationships among salient cues in the control room that influence operator response and to show how the cognitive map of the operator changes as information is gained or lost; it is applicable to existing plants as well as advanced hybrid plants and small modular reactor designs. A brief application involving loss of condensate is presented, and the advantages and limitations of the modeling approach and the use of the CUE map are discussed.

  19. Haven't a Cue? Mapping the CUE Space as an Aid to HRA Modeling

    International Nuclear Information System (INIS)

    Advances in automation present a new modeling environment for the human reliability analysis (HRA) practitioner. Many, if not most, current-day HRA methods have their origins in characterizing and quantifying human performance in analog environments where mode awareness and system status indications are potentially less comprehensive, but simpler to comprehend at a glance, than in advanced presentation systems. The introduction of highly complex automation has the potential to lead to decreased levels of situation awareness caused by the need for increased monitoring; confusion regarding the often non-obvious causes of automation failures; and emergent system dependencies that formerly may have been uncharacterized. Understanding the relation of the incoming cues available to operators during plant upset conditions, in conjunction with operating procedures, yields insight into the nature of the expected operator response in this control room environment. Static system methods such as fault trees do not contain the appropriate temporal information or necessarily specify the relationships among the cues leading to operator response. In this paper, we do not attempt to replace the standard performance shaping factors commonly used in HRA, nor offer a new HRA method; existing methods may suffice. Rather, we strive to enhance current understanding of the basis for operator response through a technique that can be used during the qualitative portion of the HRA analysis process. The CUE map is a means to visualize the relationships among salient cues in the control room that influence operator response and to show how the cognitive map of the operator changes as information is gained or lost; it is applicable to existing plants as well as advanced hybrid plants and small modular reactor designs. A brief application involving loss of condensate is presented, and the advantages and limitations of the modeling approach and the use of the CUE map are discussed.

  20. Exploring Evolving Media Discourse Through Event Cueing.

    Science.gov (United States)

    Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross

    2016-01-01

    Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa. PMID:26529702
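    The record above describes a time-series intervention model that tests whether framing levels differ before and after a candidate event date. A minimal interrupted-time-series sketch of that idea is shown below, fitting a linear trend plus a level-shift dummy at the event date; the series, the event index, and the plain least-squares formulation are synthetic illustrative assumptions, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(2)
n, event = 120, 60
# Synthetic "framing level" series with a simulated post-event level shift.
framing = 0.02 * np.arange(n) + rng.normal(0, 0.3, n)
framing[event:] += 1.5

# Design matrix: intercept, linear trend, and a step dummy at the event date.
X = np.column_stack([np.ones(n), np.arange(n), np.arange(n) >= event])
beta, *_ = np.linalg.lstsq(X, framing, rcond=None)
print(f"estimated level shift: {beta[2]:.2f}")
```

    A significant step coefficient (here recovering roughly the injected 1.5 shift) is the kind of signal that would cue an analyst to examine related datasets around that date; a production analysis would also model autocorrelation in the residuals.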