WorldWideScience

Sample records for wireless gaze tracker

  1. EyeDroid: An Open Source Mobile Gaze Tracker on Android for Eyewear Computers

    DEFF Research Database (Denmark)

    Jalaliniya, Shahram; Mardanbeigi, Diako; Sintos, Ioannis

    2015-01-01

    In this paper we report on the development and evaluation of a video-based mobile gaze tracker for eyewear computers. Unlike most of the previous work, our system performs all its processing workload on an Android device and sends the coordinates of the gaze point to an eyewear device through a wireless...... connection. We propose a lightweight software architecture for Android to increase the efficiency of image processing needed for eye tracking. The evaluation of the system indicated an accuracy of 1.06° and a battery lifetime of approximately 4.5 hours....
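    EyeDroid's transport and message format are not spelled out in this record; purely as an illustration of the kind of link described (gaze coordinates pushed from the Android tracker to an eyewear client over a wireless connection), the sketch below receives hypothetical comma-separated gaze points over UDP. The port number and message layout are assumptions, not taken from the paper.

```python
# Hypothetical receiver for a wireless gaze stream such as the one described above.
# The UDP transport, port 5005 and the "x,y" message format are assumptions.
import socket

HOST, PORT = "0.0.0.0", 5005  # assumed port

def receive_gaze():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    try:
        while True:
            data, _addr = sock.recvfrom(64)       # one gaze sample per datagram
            x_str, y_str = data.decode("ascii").split(",")
            print(f"gaze at ({float(x_str):.1f}, {float(y_str):.1f})")
    finally:
        sock.close()

if __name__ == "__main__":
    receive_gaze()
```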

  2. Wearable Gaze Trackers: Mapping Visual Attention in 3D

    DEFF Research Database (Denmark)

    Jensen, Rasmus Ramsbøl; Stets, Jonathan Dyssel; Suurmets, Seidi

    2017-01-01

    gaze trackers allow respondents to move freely in any real-world 3D environment, removing the previous restrictions. In this paper we propose a novel approach for processing visual attention of respondents using mobile wearable gaze trackers in a 3D environment. The pipeline consists of 3 steps...

  3. Real-time sharing of gaze data between multiple eye trackers-evaluation, tools, and advice.

    Science.gov (United States)

    Nyström, Marcus; Niehorster, Diederick C; Cornelissen, Tim; Garde, Henrik

    2017-08-01

    Technological advancements in combination with significant reductions in price have made it practically feasible to run experiments with multiple eye trackers. This enables new types of experiments with simultaneous recordings of eye movement data from several participants, which is of interest for researchers in, e.g., social and educational psychology. The Lund University Humanities Laboratory recently acquired 25 remote eye trackers, which are connected over a local wireless network. As a first step toward running experiments with this setup, demanding situations with real-time sharing of gaze data were investigated in terms of network performance as well as clock and screen synchronization. Results show that data can be shared with a sufficiently low packet loss (0.1%) and latency (M = 3 ms, MAD = 2 ms) across 8 eye trackers at a rate of 60 Hz. For a similar performance using 24 computers, the send rate needs to be reduced to 20 Hz. To help researchers conduct similar measurements on their own multi-eye-tracker setup, open-source software written in Python and PsychoPy is provided. Part of the software contains a minimal working example to help researchers kick-start experiments with two or more eye trackers.
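    The published tools are not reproduced in this record; as a rough sketch of the kind of measurement reported (packet loss and latency while gaze samples are shared at a fixed rate over the local network), the snippet below streams timestamped UDP packets at 60 Hz and tallies what arrives. The addresses, packet layout and 10 s test length are assumptions, and one-way delay figures are only meaningful once sender and receiver clocks are synchronized, as the paper discusses.

```python
# Minimal packet-loss/latency probe for sharing gaze samples over a local network.
# All addresses and the packet format are assumptions, not taken from the paper.
import socket
import struct
import time

PEER = ("192.168.1.42", 9000)   # assumed address of a receiving eye-tracker PC
RATE_HZ = 60
N_PACKETS = 600                 # roughly 10 s of gaze samples

def send_samples():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(N_PACKETS):
        # packet: sequence number + local send time (one-way delay is only valid
        # when sender and receiver clocks are synchronized)
        sock.sendto(struct.pack("!Id", seq, time.time()), PEER)
        time.sleep(1.0 / RATE_HZ)

def receive_samples(port=9000, timeout=2.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    sock.settimeout(timeout)
    seen, delays = set(), []
    try:
        while True:
            data, _ = sock.recvfrom(64)
            seq, t_sent = struct.unpack("!Id", data)
            seen.add(seq)
            delays.append(time.time() - t_sent)
    except socket.timeout:
        pass
    loss = 1.0 - len(seen) / N_PACKETS
    if delays:
        median_ms = sorted(delays)[len(delays) // 2] * 1e3
        print(f"packet loss {loss:.1%}, median delay {median_ms:.1f} ms")
```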

  4. Use of a Remote Eye-Tracker for the Analysis of Gaze during Treadmill Walking and Visual Stimuli Exposition

    Directory of Open Access Journals (Sweden)

    V. Serchi

    2016-01-01

    Full Text Available The knowledge of the visual strategies adopted while walking in cognitively engaging environments is extremely valuable. Analyzing gaze when a treadmill and a virtual reality environment are used as motor rehabilitation tools is therefore critical. Being completely unobtrusive, remote eye-trackers are the most appropriate way to measure the point of gaze. Still, point-of-gaze measurements are affected by experimental conditions such as head range of motion and visual stimuli. This study assesses the usability limits and measurement reliability of a remote eye-tracker during treadmill walking while visual stimuli are projected. During treadmill walking, the head remained within the remote eye-tracker workspace. Generally, the quality of the point-of-gaze measurements declined as the distance from the remote eye-tracker increased, and data loss occurred for large gaze angles. The stimulus location (a dot-target) did not influence point-of-gaze accuracy, precision, and trackability during both standing and walking. Similar results were obtained when the dot-target was replaced by a static or moving 2D target and “region of interest” analysis was applied. These findings support the feasibility of using a remote eye-tracker for the analysis of gaze during treadmill walking in virtual reality environments.

  5. Evaluation of a low-cost open-source gaze tracker

    DEFF Research Database (Denmark)

    San Agustin, Javier; Jensen, Henrik Tomra Skovsgaard Hegner; Møllenbach, Emilie

    2010-01-01

    This paper presents a low-cost gaze tracking system that is based on a webcam mounted close to the user's eye. The performance of the gaze tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending...... on the typing system used. A pilot study to assess the usability of the system was also carried out in the home of a user with severe motor impairments. The user successfully typed on a wall-projected interface using his eye movements....

  6. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker.

    Science.gov (United States)

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2014-06-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders of magnitude reductions in power consumption and form factor. The key idea is that eye images are extremely redundant; therefore we can estimate gaze by using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex M3 microcontroller, and a Bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70 mW of power, while continuously estimating eye gaze at the rate of 30 Hz with errors of roughly 3 degrees.
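    iShadow's estimator is a multi-layer neural network trained with a sparsity-inducing regularizer that prunes pixels; as a much simpler stand-in that illustrates the same pixel-selection idea, the sketch below fits an L1-regularized (lasso) linear regression from a subsampled eye image to one gaze coordinate and reports how many pixels keep a non-zero weight. The data, image size and regularization strength are placeholders; this is not the paper's network.

```python
# Illustration of sparsity-driven pixel selection for gaze estimation. This swaps the
# paper's multi-layer network and group-sparsity regularizer for a plain lasso, purely
# to show how an L1 penalty drives most pixel weights to zero. All data are synthetic.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_frames, n_pixels = 500, 32 * 32            # placeholder sizes
X = rng.random((n_frames, n_pixels))         # stand-in for eye-image pixel values
true_w = np.zeros(n_pixels)
true_w[rng.choice(n_pixels, 40, replace=False)] = rng.normal(size=40)
y = X @ true_w + 0.01 * rng.normal(size=n_frames)   # stand-in for one gaze angle

model = Lasso(alpha=0.02, max_iter=10_000)   # alpha trades accuracy vs. pixel count
model.fit(X, y)

print(f"pixels with non-zero weight: {np.count_nonzero(model.coef_)} of {n_pixels}")
```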

  7. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker

    Science.gov (United States)

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2015-01-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders of magnitude reductions in power consumption and form factor. The key idea is that eye images are extremely redundant; therefore we can estimate gaze by using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex M3 microcontroller, and a Bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70 mW of power, while continuously estimating eye gaze at the rate of 30 Hz with errors of roughly 3 degrees. PMID:26539565

  8. A neural-based remote eye gaze tracker under natural head motion.

    Science.gov (United States)

    Torricelli, Diego; Conforto, Silvia; Schmid, Maurizio; D'Alessio, Tommaso

    2008-10-01

    A novel approach to view-based eye gaze tracking for human-computer interface (HCI) is presented. The proposed method combines different techniques to address the problems of head motion, illumination and usability in the framework of low-cost applications. Feature detection and tracking algorithms have been designed to obtain an automatic setup and strengthen the robustness to light conditions. An extensive analysis of neural solutions has been performed to deal with the non-linearity associated with gaze mapping under free-head conditions. No specific hardware, such as infrared illumination or high-resolution cameras, is needed; rather, a simple commercial webcam working in the visible light spectrum suffices. The system is able to classify the gaze direction of the user over a 15-zone graphical interface, with a success rate of 95% and a global accuracy of around 2 degrees, comparable with the vast majority of existing remote gaze trackers.

  9. Gaze-Based Controlling a Vehicle

    DEFF Research Database (Denmark)

    Mardanbeigi, Diako; Witzner Hansen, Dan

    ...... modality if gaze trackers are embedded into the head-mounted devices. The domain of gaze-based interactive applications increases dramatically as interaction is no longer constrained to 2D displays. This paper proposes a general framework for gaze-based controlling of a non-stationary robot (vehicle) as an example of a complex gaze-based task in the environment. This paper discusses the possibilities and limitations of how gaze interaction can be performed for controlling vehicles not only using a remote gaze tracker but also in general challenging situations where the user and robot are mobile and the movements may be governed by several degrees of freedom (e.g. flying). A case study is also introduced where the mobile gaze tracker is used for controlling a Roomba vacuum cleaner....

  10. Gaze Tracking Through Smartphones

    DEFF Research Database (Denmark)

    Skovsgaard, Henrik; Hansen, John Paulin; Møllenbach, Emilie

    Mobile gaze trackers embedded in smartphones or tablets provide a powerful personal link to game devices, head-mounted micro-displays, PCs, and TVs. This link may offer a main road to the mass market for gaze interaction, we suggest.

  11. A Gaze Interactive Textual Smartwatch Interface

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Biermann, Florian; Askø Madsen, Janus

    2015-01-01

    Mobile gaze interaction is challenged by inherent motor noise. We examined the gaze tracking accuracy and precision of twelve subjects wearing a gaze tracker on their wrist while standing and walking. Results suggest that it will be possible to detect whether people are glancing at the watch, but no...

  12. Evaluation of a remote webcam-based eye tracker

    DEFF Research Database (Denmark)

    Skovsgaard, Henrik; Agustin, Javier San; Johansen, Sune Alstrup

    2011-01-01

    In this paper we assess the performance of an open-source gaze tracker in a remote (i.e. table-mounted) setup, and compare it with two other commercial eye trackers. An experiment with 5 subjects showed the open-source eye tracker to have a significantly higher level of accuracy than one...

  13. Gaze-controlled Driving

    DEFF Research Database (Denmark)

    Tall, Martin; Alapetite, Alexandre; San Agustin, Javier

    2009-01-01

    We investigate if the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse-pointing, low-cost webcam eye tracker and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted...

  14. Remote gaze tracking system for 3D environments.

    Science.gov (United States)

    Congcong Liu; Herrup, Karl; Shi, Bertram E

    2017-07-01

    Eye tracking systems are typically divided into two categories: remote and mobile. Remote systems, where the eye tracker is located near the object being viewed by the subject, have the advantage of being less intrusive, but are typically used for tracking gaze points on fixed two-dimensional (2D) computer screens. Mobile systems such as eye tracking glasses, where the eye tracker is attached to the subject, are more intrusive, but are better suited for cases where subjects are viewing objects in the three-dimensional (3D) environment. In this paper, we describe how remote gaze tracking systems developed for 2D computer screens can be used to track gaze points in a 3D environment. The system is non-intrusive. It compensates for small head movements by the user, so that the head need not be stabilized by a chin rest or bite bar. The system maps the 3D gaze points of the user onto 2D images from a scene camera and is also located remotely from the subject. Measurement results from this system indicate that it is able to estimate gaze points in the scene camera to within one degree over a wide range of head positions.

  15. Mobile gaze input system for pervasive interaction

    DEFF Research Database (Denmark)

    2017-01-01

    feedback to the user in response to the received command input. The unit provides feedback to the user on how to position the mobile unit in front of his eyes. The gaze tracking unit interacts with one or more controlled devices via wireless or wired communications. Example devices include a lock......, a thermostat, a light or a TV. The connection between the gaze tracking unit may be temporary or longer-lasting. The gaze tracking unit may detect features of the eye that provide information about the identity of the user....

  16. Assessing the Usability of Gaze-Adapted Interface against Conventional Eye-based Input Emulation

    OpenAIRE

    Kumar, Chandan; Menges, Raphael; Staab, Steffen

    2017-01-01

    In recent years, eye tracking systems have greatly improved, beginning to play a promising role as an input medium. Eye trackers can be used for application control either by simply emulating the mouse and keyboard devices in the traditional graphical user interface, or by customized interfaces for eye gaze events. In this work, we evaluate these two approaches to assess their impact in usability. We present a gaze-adapted Twitter application interface with direct interaction of eye gaze inpu...

  17. Eye blinking in an avian species is associated with gaze shifts.

    Science.gov (United States)

    Yorzinski, Jessica L

    2016-08-30

    Even when animals are actively monitoring their environment, they lose access to visual information whenever they blink. They can strategically time their blinks to minimize information loss and improve visual functioning but we have little understanding of how this process operates in birds. This study therefore examined blinking in freely-moving peacocks (Pavo cristatus) to determine the relationship between their blinks, gaze shifts, and context. Peacocks wearing a telemetric eye-tracker were exposed to a taxidermy predator (Vulpes vulpes) and their blinks and gaze shifts were recorded. Peacocks blinked during the majority of their gaze shifts, especially when gaze shifts were large, thereby timing their blinks to coincide with periods when visual information is already suppressed. They inhibited their blinks the most when they exhibited high rates of gaze shifts and were thus highly alert. Alternative hypotheses explaining the link between blinks and gaze shifts are discussed.

  18. Depth Compensation Model for Gaze Estimation in Sport Analysis

    DEFF Research Database (Denmark)

    Batista Narcizo, Fabricio; Hansen, Dan Witzner

    2015-01-01

    is tested in a totally controlled environment with the aim of checking the influences of eye tracker parameters and ocular biometric parameters on its behavior. We also present a gaze estimation method based on epipolar geometry for binocular eye tracking setups. The depth compensation model has shown very...

  19. A new mapping function in table-mounted eye tracker

    Science.gov (United States)

    Tong, Qinqin; Hua, Xiao; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    The eye tracker is a new apparatus for human-computer interaction which has attracted much attention in recent years. Eye tracking technology obtains the current subject's visual attention (gaze) direction by using mechanical, electronic, optical, image-processing and other means of detection. The mapping function is one of the key technologies of the image processing and also determines the accuracy of the whole eye tracker system. In this paper, we present a new mapping model based on the relationship among the eyes, the camera and the screen at which the eye gazes. Firstly, according to the geometrical relationship among the eyes, the camera and the screen, the framework of the mapping function between the pupil center and the screen coordinates is constructed. Secondly, in order to simplify the vector inversion of the mapping function, the coordinates of the eyes, the camera and the screen were modeled with coaxial model systems. To verify the mapping function, a corresponding experiment was carried out and the result was compared with the traditional quadratic polynomial function. The results show that our approach can improve the accuracy of determining the gaze point. Compared with other methods, this mapping function is simple and valid.
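    The geometric mapping proposed in this record is not given in enough detail to reproduce; the sketch below shows only the conventional baseline it is compared against: a second-order polynomial mapping from pupil-center coordinates to screen coordinates, fitted by least squares on calibration points. The nine-point calibration grid and all numbers are synthetic placeholders.

```python
# Traditional quadratic-polynomial gaze mapping (the baseline mentioned above),
# fitted by least squares from calibration data. Calibration values are synthetic.
import numpy as np

def design_matrix(px, py):
    """Second-order polynomial features of the pupil-center coordinates."""
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_mapping(pupil_xy, screen_xy):
    """Least-squares fit of screen coordinates, one column of coefficients per axis."""
    A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs                      # shape (6, 2)

def apply_mapping(coeffs, pupil_xy):
    A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
    return A @ coeffs                  # predicted screen coordinates

# Synthetic 9-point calibration grid (placeholder numbers, not from the paper).
pupil = np.array([[x, y] for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)])
screen = 960.0 + 400.0 * pupil + 15.0 * pupil**2     # fake ground-truth relation
coeffs = fit_mapping(pupil, screen)
print(apply_mapping(coeffs, np.array([[0.3, -0.2]])))
```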

  20. Parallax error in the monocular head-mounted eye trackers

    DEFF Research Database (Denmark)

    Mardanbeigi, Diako; Witzner Hansen, Dan

    2012-01-01

    each parameter affects the error. The optimum distribution of the error (magnitude and direction) in the field of view varies for different applications. However, the results can be used for finding the optimum parameters that are needed for designing a head-mounted gaze tracker. It has been shown...

  1. Investigating the Link Between Radiologists' Gaze, Diagnostic Decision, and Image Content

    Energy Technology Data Exchange (ETDEWEB)

    Tourassi, Georgia [ORNL]; Voisin, Sophie [ORNL]; Paquit, Vincent C [ORNL]; Krupinski, Elizabeth [University of Arizona]

    2013-01-01

    Objective: To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods: Gaze data and diagnostic decisions were collected from six radiologists who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Texture analysis was performed in mammographic regions that attracted radiologists' attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results: By pooling the data from all radiologists, machine learning produced highly accurate predictive models linking image content, gaze, cognition, and error. Merging radiologists' gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the radiologists' diagnostic errors while confirming 96.2% of their correct diagnoses. The radiologists' individual errors could be adequately predicted by modeling the behavior of their peers. However, personalized tuning appears to be beneficial in many cases to capture individual behavior more accurately. Conclusions: Machine learning algorithms combining image features with radiologists' gaze data and diagnostic decisions can be effectively developed to recognize cognitive and perceptual errors associated with the diagnostic interpretation of mammograms.
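    The specific learning algorithms are not named in this record; as a generic illustration of the kind of model described (predicting diagnostic error from computer-extracted image features merged with gaze metrics), the sketch below cross-validates a logistic-regression classifier on a synthetic feature table. The feature names, sizes and labels are hypothetical.

```python
# Hypothetical group-based model linking image features and gaze metrics to diagnostic
# error, in the spirit of the study above. All data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_regions = 120
texture_features = rng.normal(size=(n_regions, 5))   # e.g. co-occurrence statistics
gaze_metrics = rng.normal(size=(n_regions, 3))       # e.g. dwell time, fixation count
X = np.hstack([texture_features, gaze_metrics])
y = rng.integers(0, 2, size=n_regions)               # 1 = diagnostic error on the region

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")
```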

  2. Simple gaze-contingent cues guide eye movements in a realistic driving simulator

    Science.gov (United States)

    Pomarjanschi, Laura; Dorr, Michael; Bex, Peter J.; Barth, Erhardt

    2013-03-01

    Looking at the right place at the right time is a critical component of driving skill. Therefore, gaze guidance has the potential to become a valuable driving assistance system. In previous work, we have already shown that complex gaze-contingent stimuli can guide attention and reduce the number of accidents in a simple driving simulator. We here set out to investigate whether cues that are simple enough to be implemented in a real car can also capture gaze during a more realistic driving task in a high-fidelity driving simulator. We used a state-of-the-art, wide-field-of-view driving simulator with an integrated eye tracker. Gaze-contingent warnings were implemented using two arrays of light-emitting diodes horizontally fitted below and above the simulated windshield. Thirteen volunteering subjects drove along predetermined routes in a simulated environment populated with autonomous traffic. Warnings were triggered during the approach to half of the intersections, cueing either towards the right or to the left. The remaining intersections were not cued, and served as controls. The analysis of the recorded gaze data revealed that the gaze-contingent cues did indeed have a gaze guiding effect, triggering a significant shift in gaze position towards the highlighted direction. This gaze shift was not accompanied by changes in driving behaviour, suggesting that the cues do not interfere with the driving task itself.

  3. Eye Tracker Accuracy: Quantitative Evaluation of the Invisible Eye Center Location

    OpenAIRE

    Wyder, Stephan; Cattin, Philippe C.

    2017-01-01

    Purpose. We present a new method to evaluate the accuracy of an eye tracker based eye localization system. Measuring the accuracy of an eye tracker's primary intention, the estimated point of gaze, is usually done with volunteers and a set of fixation points used as ground truth. However, verifying the accuracy of the location estimate of a volunteer's eye center in 3D space is not easily possible. This is because the eye center is an intangible point hidden by the iris. Methods. We evaluate ...

  4. Investigating the link between radiologists’ gaze, diagnostic decision, and image content

    Science.gov (United States)

    Tourassi, Georgia; Voisin, Sophie; Paquit, Vincent; Krupinski, Elizabeth

    2013-01-01

    Objective To investigate machine learning for linking image content, human perception, cognition, and error in the diagnostic interpretation of mammograms. Methods Gaze data and diagnostic decisions were collected from three breast imaging radiologists and three radiology residents who reviewed 20 screening mammograms while wearing a head-mounted eye-tracker. Image analysis was performed in mammographic regions that attracted radiologists’ attention and in all abnormal regions. Machine learning algorithms were investigated to develop predictive models that link: (i) image content with gaze, (ii) image content and gaze with cognition, and (iii) image content, gaze, and cognition with diagnostic error. Both group-based and individualized models were explored. Results By pooling the data from all readers, machine learning produced highly accurate predictive models linking image content, gaze, and cognition. Potential linking of those with diagnostic error was also supported to some extent. Merging readers’ gaze metrics and cognitive opinions with computer-extracted image features identified 59% of the readers’ diagnostic errors while confirming 97.3% of their correct diagnoses. The readers’ individual perceptual and cognitive behaviors could be adequately predicted by modeling the behavior of others. However, personalized tuning was in many cases beneficial for capturing more accurately individual behavior. Conclusions There is clearly an interaction between radiologists’ gaze, diagnostic decision, and image content which can be modeled with machine learning algorithms. PMID:23788627

  5. 60 GHz wireless data transfer for tracker readout systems—first studies and results

    International Nuclear Information System (INIS)

    Dittmeier, S.; Berger, N.; Schöning, A.; Soltveit, H.K.; Wiedner, D.

    2014-01-01

    To allow highly granular trackers to contribute to first level trigger decisions or event filtering, a fast readout system with very high bandwidth is required. Space, power and material constraints, however, pose severe limitations on the maximum available bandwidth of electrical or optical data transfers. A new approach for the implementation of a fast readout system is the application of a wireless data transfer at a carrier frequency of 60 GHz. The available bandwidth of several GHz allows for data rates of multiple Gbps per link. 60 GHz transceiver chips can be produced with a small form factor and a high integration level. A prototype transceiver currently under development at the University of Heidelberg is briefly described in this paper. To allow easy and fast future testing of the chip's functionality, a bit error rate test has been developed with a commercially available transceiver. Crosstalk might be a big issue for a wireless readout system with many links in a tracking detector. Direct crosstalk can be avoided by using directive antennas, linearly polarized waves and frequency channeling. Reflections from tracking modules can be reduced by applying an absorbing material like graphite foam. Properties of different materials typically used in tracking detectors and graphite foam in the 60 GHz frequency range are presented. For data transmission tests, links using commercially available 60 GHz transmitters and receivers are used. Studies regarding crosstalk and the applicability of graphite foam, Kapton horn antennas and polarized waves are shown
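    Details of the Heidelberg bit-error-rate setup are not given in this record; the sketch below shows only the generic core of such a test, comparing a transmitted pseudo-random bit sequence (PRBS-7) with the bits that come back and reporting the error rate. The simulated error channel is a stand-in for a real 60 GHz receiver capture.

```python
# Generic offline bit-error-rate calculation for a serial-link test. The PRBS-7
# generator is standard; the flipped-bit "channel" is a stand-in for real captures.
import numpy as np

def prbs7(n_bits, seed=0x7F):
    """PRBS-7 sequence from the x^7 + x^6 + 1 polynomial."""
    state, out = seed, np.empty(n_bits, dtype=np.uint8)
    for i in range(n_bits):
        bit = ((state >> 6) ^ (state >> 5)) & 1
        state = ((state << 1) | bit) & 0x7F
        out[i] = bit
    return out

tx = prbs7(1_000_000)                                     # transmitted pattern
rx = tx.copy()
flips = np.random.default_rng(2).random(tx.size) < 1e-4   # simulated channel errors
rx[flips] ^= 1

ber = np.count_nonzero(tx != rx) / tx.size
print(f"bit error rate: {ber:.2e}")
```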

  6. Attention to the Mouth and Gaze Following in Infancy Predict Language Development

    Science.gov (United States)

    Tenenbaum, Elena J.; Sobel, David M.; Sheinkpof, Stephen J.; Malle, Bertram F.; Morgan, James L.

    2015-01-01

    We investigated longitudinal relations among gaze following and face scanning in infancy and later language development. At 12 months, infants watched videos of a woman describing an object while their passive viewing was measured with an eye-tracker. We examined the relation between infants' face scanning behavior and their tendency to follow the…

  7. Look at my poster! Active gaze, preference and memory during a poster session.

    Science.gov (United States)

    Foulsham, Tom; Kingstone, Alan

    2011-01-01

    In science, as in advertising, people often present information on a poster, yet little is known about attention during a poster session. A mobile eye-tracker was used to record participants' gaze during a mock poster session featuring a range of academic psychology posters. Participants spent the most time looking at introductions and conclusions. Larger posters were looked at for longer, as were posters rated more interesting (but not necessarily more aesthetically pleasing). Interestingly, gaze did not correlate with memory for poster details or liking, suggesting that attracting someone towards your poster may not be enough.

  8. Investigating gaze of children with ASD in naturalistic settings.

    Directory of Open Access Journals (Sweden)

    Basilio Noris

    Full Text Available BACKGROUND: Visual behavior is known to be atypical in Autism Spectrum Disorders (ASD). Monitor-based eye-tracking studies have measured several of these atypicalities in individuals with Autism. While atypical behaviors are known to be accentuated during natural interactions, few studies have been made on gaze behavior in natural interactions. In this study we focused on (i) whether the findings made in laboratory settings are also visible in a naturalistic interaction; (ii) whether new atypical elements appear when studying visual behavior across the whole field of view. METHODOLOGY/PRINCIPAL FINDINGS: Ten children with ASD and ten typically developing children participated in a dyadic interaction with an experimenter administering items from the Early Social Communication Scale (ESCS). The children wore a novel head-mounted eye-tracker, measuring gaze direction and presence of faces across the child's field of view. The analysis of gaze episodes to faces revealed that children with ASD looked significantly less and for shorter lapses of time at the experimenter. The analysis of gaze patterns across the child's field of view revealed that children with ASD looked downwards and made more extensive use of their lateral field of view when exploring the environment. CONCLUSIONS/SIGNIFICANCE: The data gathered in naturalistic settings confirm findings previously obtained only in monitor-based studies. Moreover, the study allowed the observation of a generalized strategy of lateral gaze in children with ASD when they were looking at the objects in their environment.

  9. Relationship between abstract thinking and eye gaze pattern in patients with schizophrenia

    Science.gov (United States)

    2014-01-01

    Background Effective integration of visual information is necessary to utilize abstract thinking, but patients with schizophrenia have slow eye movement and usually explore limited visual information. This study examines the relationship between abstract thinking ability and the pattern of eye gaze in patients with schizophrenia using a novel theme identification task. Methods Twenty patients with schizophrenia and 22 healthy controls completed the theme identification task, in which subjects selected which word, out of a set of provided words, best described the theme of a picture. Eye gaze while performing the task was recorded by the eye tracker. Results Patients exhibited a significantly lower correct rate for theme identification and lesser fixation and saccade counts than controls. The correct rate was significantly correlated with the fixation count in patients, but not in controls. Conclusions Patients with schizophrenia showed impaired abstract thinking and decreased quality of gaze, which were positively associated with each other. Theme identification and eye gaze appear to be useful as tools for the objective measurement of abstract thinking in patients with schizophrenia. PMID:24739356

  10. Eye gaze tracking based on the shape of pupil image

    Science.gov (United States)

    Wang, Rui; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    The eye tracker is an important instrument for research in psychology, widely used in attention, visual perception, reading and other fields of research. Because of its potential in human-computer interaction, eye gaze tracking has been a topic of research in many fields over the last decades. Nowadays, with the development of technology, non-intrusive methods are more and more welcomed. In this paper, we present a method based on the shape of the pupil image to estimate the gaze point of human eyes without any intrusive devices such as a hat or a pair of glasses. After applying an ellipse fitting algorithm to the captured pupil image, we can determine the direction of fixation from the shape of the pupil. The innovative aspect of this method is to utilize the shape of the pupil so that complicated algorithms can be avoided. The proposed method is helpful for the study of eye gaze tracking: it needs only one camera, without infrared light, and determines the gaze direction from changes in the shape of the pupil; no additional conditions are required.
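    The exact pipeline is not reproduced in this record; a minimal version of the pupil-shape idea, assuming a dark pupil on a brighter background, can be sketched with OpenCV: threshold the eye image, take the largest contour as the pupil, fit an ellipse, and read off the axis ratio and orientation as a crude gaze-direction cue. The threshold value and the file name are placeholders.

```python
# Rough sketch of pupil ellipse fitting with OpenCV, assuming a dark-pupil eye image.
# Threshold and the interpretation of the ellipse parameters are illustrative only.
import cv2

def pupil_ellipse(eye_gray):
    """Return ((cx, cy), (axis1, axis2), angle) of the fitted pupil ellipse, or None."""
    _, mask = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)   # dark pupil
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    if len(pupil) < 5:                     # fitEllipse needs at least 5 points
        return None
    return cv2.fitEllipse(pupil)

def gaze_cue(ellipse):
    """Axis ratio is near 1 for frontal gaze and shrinks as gaze becomes oblique."""
    (_, _), (ax1, ax2), angle = ellipse
    return min(ax1, ax2) / max(ax1, ax2), angle

eye = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
if eye is not None:
    ellipse = pupil_ellipse(eye)
    if ellipse is not None:
        print(gaze_cue(ellipse))
```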

  11. Eye and head movements shape gaze shifts in Indian peafowl.

    Science.gov (United States)

    Yorzinski, Jessica L; Patricelli, Gail L; Platt, Michael L; Land, Michael F

    2015-12-01

    Animals selectively direct their visual attention toward relevant aspects of their environments. They can shift their attention using a combination of eye, head and body movements. While we have a growing understanding of eye and head movements in mammals, we know little about these processes in birds. We therefore measured the eye and head movements of freely behaving Indian peafowl (Pavo cristatus) using a telemetric eye-tracker. Both eye and head movements contributed to gaze changes in peafowl. When gaze shifts were smaller, eye movements played a larger role than when gaze shifts were larger. The duration and velocity of eye and head movements were positively related to the size of the eye and head movements, respectively. In addition, the coordination of eye and head movements in peafowl differed from that in mammals; peafowl exhibited a near-absence of the vestibulo-ocular reflex, which may partly result from the peafowl's ability to move their heads as quickly as their eyes. © 2015. Published by The Company of Biologists Ltd.

  12. Parent Perception of Two Eye-Gaze Control Technology Systems in Young Children with Cerebral Palsy: Pilot Study.

    Science.gov (United States)

    Karlsson, Petra; Wallen, Margaret

    2017-01-01

    Eye-gaze control technology enables people with significant physical disability to access computers for communication, play, learning and environmental control. This pilot study used a multiple case study design with repeated baseline assessment and parents' evaluations to compare two eye-gaze control technology systems to identify any differences in factors such as ease of use and impact of the systems for their young children. Five children, aged 3 to 5 years, with dyskinetic cerebral palsy, and their families participated. Overall, families were satisfied with both the Tobii PCEye Go and myGaze® eye tracker, found them easy to position and use, and children learned to operate them quickly. This technology provides young children with important opportunities for learning, play, leisure, and developing communication.

  13. Gaze shifts and fixations dominate gaze behavior of walking cats

    Science.gov (United States)

    Rivers, Trevor J.; Sirota, Mikhail G.; Guttentag, Andrew I.; Ogorodnikov, Dmitri A.; Shah, Neet A.; Beloozerova, Irina N.

    2014-01-01

    Vision is important for locomotion in complex environments. How it is used to guide stepping is not well understood. We used an eye search coil technique combined with an active marker-based head recording system to characterize the gaze patterns of cats walking over terrains of different complexity: (1) on a flat surface in the dark when no visual information was available, (2) on the flat surface in light when visual information was available but not required, (3) along the highly structured but regular and familiar surface of a horizontal ladder, a task for which visual guidance of stepping was required, and (4) along a pathway cluttered with many small stones, an irregularly structured surface that was new each day. Three cats walked in a 2.5 m corridor, and 958 passages were analyzed. Gaze activity during the time when the gaze was directed at the walking surface was subdivided into four behaviors based on speed of gaze movement along the surface: gaze shift (fast movement), gaze fixation (no movement), constant gaze (movement at the body’s speed), and slow gaze (the remainder). We found that gaze shifts and fixations dominated the cats’ gaze behavior during all locomotor tasks, jointly occupying 62–84% of the time when the gaze was directed at the surface. As visual complexity of the surface and demand on visual guidance of stepping increased, cats spent more time looking at the surface, looked closer to them, and switched between gaze behaviors more often. During both visually guided locomotor tasks, gaze behaviors predominantly followed a repeated cycle of forward gaze shift followed by fixation. We call this behavior “gaze stepping”. Each gaze shift took gaze to a site approximately 75–80 cm in front of the cat, which the cat reached in 0.7–1.2 s and 1.1–1.6 strides. Constant gaze occupied only 5–21% of the time cats spent looking at the walking surface. PMID:24973656

  14. Radial transfer of tracking data with wireless links

    CERN Document Server

    Pelikan, Daniel; Brenner, Richard; Dancila, Dragos; Gustafsson, Leif

    2014-01-01

    Wireless data transfer has revolutionized the consumer market for the last decade giving products equipped with transmitters and receivers for wireless data transfer. Wireless technology has features attractive for data transfer in future tracking detectors. The removal of wires and connectors for data links is certainly beneficial both for the material budget and the reliability of the system. One other advantage is the freedom of routing signals which today is particularly complicated when bringing the data the first 50 cm outside the tracker. With wireless links intelligence can be built into a tracker by introducing communication between tracking layers within a Region Of Interest which would allow the construction of track primitives in real time. The wireless signal is transmitted by a passive antenna structure which is a radiation hard and much less complex object than an optical link. Due to the requirement of high data rates in detectors a high bandwidth is required. The frequency band aro...

  15. Quantifying the cognitive cost of laparo-endoscopic single-site surgeries: Gaze-based indices.

    Science.gov (United States)

    Di Stasi, Leandro L; Díaz-Piedra, Carolina; Ruiz-Rabelo, Juan Francisco; Rieiro, Héctor; Sanchez Carrion, Jose M; Catena, Andrés

    2017-11-01

    Despite the growing interest concerning the laparo-endoscopic single-site surgery (LESS) procedure, LESS presents multiple difficulties and challenges that are likely to increase the surgeon's cognitive cost, in terms of both cognitive load and performance. Nevertheless, there is currently no objective index capable of assessing the surgeon's cognitive cost while performing LESS. We assessed whether gaze-based indices might offer unique and unbiased measures to quantify LESS complexity and its cognitive cost. We expect the assessment of the surgeon's cognitive cost to improve patient safety by measuring fitness-for-duty and reducing surgeon overload. Using a wearable eye tracker device, we measured gaze entropy and velocity of surgical trainees and attending surgeons during two surgical procedures (LESS vs. multiport laparoscopy surgery [MPS]). None of the participants had previous experience with LESS. They performed two exercises with different complexity levels (Low: Pattern Cut vs. High: Peg Transfer). We also collected performance and subjective data. LESS caused higher cognitive demand than MPS, as indicated by increased gaze entropy in both surgical trainees and attending surgeons (the exploration pattern became more random). Furthermore, gaze velocity was higher (the exploration pattern became more rapid) for the LESS procedure independently of the surgeon's expertise. Perceived task complexity and laparoscopic accuracy confirmed the gaze-based results. Gaze-based indices have great potential as objective and non-intrusive measures to assess surgeons' cognitive cost and fitness-for-duty. Furthermore, gaze-based indices might play a relevant role in defining future guidelines on surgeons' examinations to mark their achievements during the entire training (e.g. analyzing surgical learning curves). Copyright © 2017 Elsevier Ltd. All rights reserved.
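    The study's precise gaze-entropy formulation is not given in this record; a common variant is the Shannon entropy of the distribution of fixations over screen regions, which rises as scanning becomes more random. A minimal sketch under that assumption, with synthetic fixation data:

```python
# One common gaze-entropy measure: Shannon entropy of fixation counts binned over a
# grid of screen regions. Grid size and fixation data are placeholders; the study's
# exact definition may differ.
import numpy as np

def stationary_gaze_entropy(fix_x, fix_y, screen=(1920, 1080), grid=(8, 8)):
    """Entropy (bits) of the fixation distribution over grid cells."""
    counts, _, _ = np.histogram2d(
        fix_x, fix_y, bins=grid, range=[[0, screen[0]], [0, screen[1]]]
    )
    p = counts.ravel() / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(3)
focused = rng.normal([960, 540], 60, size=(200, 2))           # tight fixation cluster
random_scan = rng.uniform([0, 0], [1920, 1080], size=(200, 2))
print(stationary_gaze_entropy(*focused.T))       # low entropy (ordered exploration)
print(stationary_gaze_entropy(*random_scan.T))   # high entropy (random exploration)
```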

  16. A multimodal interface to resolve the Midas-Touch problem in gaze controlled wheelchair.

    Science.gov (United States)

    Meena, Yogesh Kumar; Cecotti, Hubert; Wong-Lin, KongFatt; Prasad, Girijesh

    2017-07-01

    Human-computer interaction (HCI) research has been playing an essential role in the field of rehabilitation. The usability of the gaze controlled powered wheelchair is limited due to the Midas-Touch problem. In this work, we propose a multimodal graphical user interface (GUI) to control a powered wheelchair that aims to help upper-limb mobility impaired people in daily living activities. The GUI was designed to include a portable and low-cost eye-tracker and a soft-switch, wherein the wheelchair can be controlled in three different ways: 1) with a touchpad, 2) with an eye-tracker only, and 3) with an eye-tracker and soft-switch. The interface includes nine different commands (eight directions and stop) and is integrated within a powered wheelchair system. We evaluated the performance of the multimodal interface in terms of lap-completion time, the number of commands, and the information transfer rate (ITR) with eight healthy participants. The analysis of the results showed that the eye-tracker with soft-switch provides superior performance, with an ITR of 37.77 bits/min, among the three different conditions (pusers.

  17. Wireless data transfer with mm-waves for future tracking detectors

    International Nuclear Information System (INIS)

    Pelikan, D.; Bingefors, N.; Brenner, R.; Gustafsson, L.; Dancila, D.

    2014-01-01

    Wireless data transfer has revolutionized the consumer market for the last decade generating many products equipped with transmitters and receivers for wireless data transfer. Wireless technology opens attractive possibilities for data transfer in future tracking detectors. The reduction of wires and connectors for data links is certainly beneficial both for the material budget and the reliability of the system. An advantage of wireless data transfer is the freedom of routing signals which today is particularly complicated when bringing the data the first 50 cm out of the tracker. With wireless links intelligence can be built into a tracker by introducing communication between tracking layers within a region of interest which would allow the construction of track primitives in real time. The wireless technology used in consumer products is however not suitable for tracker readouts. The low data transfer capacity of current 5 GHz transceivers and the relatively large feature sizes of the components is a disadvantage. Due to the requirement of high data rates in tracking detectors high bandwidth is required. The frequency band around 60 GHz turns out to be a very promising candidate for data transfer in a detector system. The high baseband frequency allows for data transfer in the order of several Gbit/s. Due to the small wavelength in the mm range only small structures are needed for the transmitting and receiving electronics. The 60 GHz frequency band is a strong candidate for future WLAN applications hence components are already starting to be available on the market. Patch antennas produced on flexible Printed Circuit Board substrate that can be used for wireless communication in future trackers are presented in this article. The antennas can be connected to transceivers for data transmission/reception or be connected by wave-guides to structures capable of bringing the 60 GHz signal behind boundaries. Results on simulation and fabrication of these antennas are

  18. Wireless data transfer with mm-waves for future tracking detectors

    Science.gov (United States)

    Pelikan, D.; Bingefors, N.; Brenner, R.; Dancila, D.; Gustafsson, L.

    2014-11-01

    Wireless data transfer has revolutionized the consumer market for the last decade generating many products equipped with transmitters and receivers for wireless data transfer. Wireless technology opens attractive possibilities for data transfer in future tracking detectors. The reduction of wires and connectors for data links is certainly beneficial both for the material budget and the reliability of the system. An advantage of wireless data transfer is the freedom of routing signals which today is particularly complicated when bringing the data the first 50 cm out of the tracker. With wireless links intelligence can be built into a tracker by introducing communication between tracking layers within a region of interest which would allow the construction of track primitives in real time. The wireless technology used in consumer products is however not suitable for tracker readouts. The low data transfer capacity of current 5 GHz transceivers and the relatively large feature sizes of the components is a disadvantage. Due to the requirement of high data rates in tracking detectors high bandwidth is required. The frequency band around 60 GHz turns out to be a very promising candidate for data transfer in a detector system. The high baseband frequency allows for data transfer in the order of several Gbit/s. Due to the small wavelength in the mm range only small structures are needed for the transmitting and receiving electronics. The 60 GHz frequency band is a strong candidate for future WLAN applications hence components are already starting to be available on the market. Patch antennas produced on flexible Printed Circuit Board substrate that can be used for wireless communication in future trackers are presented in this article. The antennas can be connected to transceivers for data transmission/reception or be connected by wave-guides to structures capable of bringing the 60 GHz signal behind boundaries. Results on simulation and fabrication of these antennas are

  19. TabletGaze: Unconstrained Appearance-based Gaze Estimation in Mobile Tablets

    OpenAIRE

    Huang, Qiong; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2015-01-01

    We study gaze estimation on tablets; our key design goal is uncalibrated gaze estimation using the front-facing camera during natural use of tablets, where the posture and method of holding the tablet is not constrained. We collected the first large unconstrained gaze dataset of tablet users, labeled the Rice TabletGaze dataset. The dataset consists of 51 subjects, each with 4 different postures and 35 gaze locations. Subjects vary in race, gender and in their need for prescription glasses, all o...

  20. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research.

    Science.gov (United States)

    Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim

    2017-01-01

    Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes), while on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising compromises of methodological solutions have been proposed-in particular, the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analyses of larger amounts of gaze-data and further, to increase the explanatory power of the derived results.

  1. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research

    Directory of Open Access Journals (Sweden)

    Ralf Kredel

    2017-10-01

    Full Text Available Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes), while on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising compromises of methodological solutions have been proposed—in particular, the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analyses of larger amounts of gaze-data and further, to increase the explanatory power of the derived results.

  2. E-gaze : create gaze communication for peoplewith visual disability

    NARCIS (Netherlands)

    Qiu, S.; Osawa, H.; Hu, J.; Rauterberg, G.W.M.

    2015-01-01

    Gaze signals are frequently used by the sighted in social interactions as visual cues. However, these signals and cues are hardly accessible for people with visual disability. A conceptual design of E-Gaze glasses is proposed, assistive to create gaze communication between blind and sighted people

  3. Cheating experience: Guiding novices to adopt the gaze strategies of experts expedites the learning of technical laparoscopic skills.

    Science.gov (United States)

    Vine, Samuel J; Masters, Rich S W; McGrath, John S; Bright, Elizabeth; Wilson, Mark R

    2012-07-01

    Previous research has demonstrated that trainees can be taught (via explicit verbal instruction) to adopt the gaze strategies of expert laparoscopic surgeons. The current study examined a software template designed to guide trainees to adopt expert gaze control strategies passively, without being provided with explicit instructions. We examined 27 novices (who had no laparoscopic training) performing 50 learning trials of a laparoscopic training task in either a discovery-learning (DL) group or a gaze-training (GT) group while wearing an eye tracker to assess gaze control. The GT group performed trials using a surgery-training template (STT); software that is designed to guide expert-like gaze strategies by highlighting the key locations on the monitor screen. The DL group had a normal, unrestricted view of the scene on the monitor screen. Both groups then took part in a nondelayed retention test (to assess learning) and a stress test (under social evaluative threat) with a normal view of the scene. The STT was successful in guiding the GT group to adopt an expert-like gaze strategy (displaying more target-locking fixations). Adopting expert gaze strategies led to an improvement in performance for the GT group, which outperformed the DL group in both retention and stress tests (faster completion time and fewer errors). The STT is a practical and cost-effective training interface that automatically promotes an optimal gaze strategy. Trainees who are trained to adopt the efficient target-locking gaze strategy of experts gain a performance advantage over trainees left to discover their own strategies for task completion. Copyright © 2012 Mosby, Inc. All rights reserved.

  4. Fusion of P300 and eye-tracker data for spelling using BCI2000

    Science.gov (United States)

    Kalika, Dmitry; Collins, Leslie; Caves, Kevin; Throckmorton, Chandra

    2017-10-01

    Objective. Various augmentative and alternative communication (AAC) devices have been developed in order to aid communication for individuals with communication disorders. Recently, there has been interest in combining EEG data and eye-gaze data with the goal of developing a hybrid (or ‘fused’) BCI (hBCI) AAC system. This work explores the effectiveness of a speller that fuses data from an eye-tracker and the P300 speller in order to create a hybrid P300 speller. Approach. This hybrid speller collects both eye-tracking and EEG data in parallel, and the user spells characters on the screen in the same way that they would if they were only using the P300 speller. Online and offline experiments were performed. The online experiments measured the performance of the speller for sixteen non-disabled participants, while the offline simulations were used to assess the robustness of the hybrid system. Main results. Online results showed that for fifteen non-disabled participants, using eye-gaze in a Bayesian framework with EEG data from the P300 speller improved accuracy (0.0163 +/- 2.72, 0.085 +/- 0.111, 0.080 +/- 0.106 for the estimated, medium and high variance configurations) and reduced the average number of flashes required to spell a character compared to the standard P300 speller that relies solely on EEG data (-53.27 +/- 25.87, -36.15 +/- 19.3, -18.85 +/- 12.43 for the estimated, medium and high variance configurations). Offline simulations indicate that the system provides more robust performance than a standalone eye gaze system. Significance. The results of this work on non-disabled participants show the potential efficacy of a hybrid P300 and eye-tracker speller. Further validation on the amyotrophic lateral sclerosis population is needed to assess the benefit of this hybrid system.
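    The paper's exact fusion rule is not reproduced in this record; one simple reading of "using eye-gaze in a Bayesian framework" is to treat the measured gaze point as a prior over the characters of the speller grid and multiply it with a likelihood derived from the P300 classifier scores. The sketch below does that for an assumed 6x6 grid; the grid geometry, prior width and scores are all invented for illustration.

```python
# Illustrative Bayesian fusion of a gaze-derived prior with P300 classifier evidence
# over a 6x6 speller grid. Grid geometry, the Gaussian prior width and the EEG scores
# are assumptions, not parameters from the paper.
import numpy as np

ROWS, COLS = 6, 6
CELL = 120                                      # assumed cell size in pixels
centers = np.array([[(c + 0.5) * CELL, (r + 0.5) * CELL]
                    for r in range(ROWS) for c in range(COLS)])

def gaze_prior(gaze_xy, sigma=80.0):
    """Gaussian prior over characters, centered at the measured gaze point."""
    d2 = ((centers - gaze_xy) ** 2).sum(axis=1)
    p = np.exp(-d2 / (2 * sigma**2))
    return p / p.sum()

def fuse(gaze_xy, eeg_scores):
    """Posterior over characters: gaze prior times an EEG-derived likelihood."""
    likelihood = np.exp(eeg_scores - eeg_scores.max())   # softmax-style likelihood
    posterior = gaze_prior(gaze_xy) * likelihood
    return posterior / posterior.sum()

rng = np.random.default_rng(4)
scores = rng.normal(size=ROWS * COLS)           # stand-in for P300 classifier scores
scores[14] += 2.0                               # pretend the EEG favours character 14
posterior = fuse(np.array([300.0, 180.0]), scores)
print("most probable character index:", int(posterior.argmax()))
```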

  5. Single gaze gestures

    DEFF Research Database (Denmark)

    Møllenbach, Emilie; Lilholm, Martin; Gail, Alastair

    2010-01-01

    This paper examines gaze gestures and their applicability as a generic selection method for gaze-only controlled interfaces. The method explored here is the Single Gaze Gesture (SGG), i.e. gestures consisting of a single point-to-point eye movement. Horizontal and vertical, long and short SGGs were...

  6. Gaze beats mouse

    DEFF Research Database (Denmark)

    Mateo, Julio C.; San Agustin, Javier; Hansen, John Paulin

    2008-01-01

    Facial EMG for selection is fast, easy and, combined with gaze pointing, it can provide completely hands-free interaction. In this pilot study, 5 participants performed a simple point-and-select task using mouse or gaze for pointing and a mouse button or a facial-EMG switch for selection. Gaze...

  7. Exploiting Three-Dimensional Gaze Tracking for Action Recognition During Bimanual Manipulation to Enhance Human–Robot Collaboration

    Directory of Open Access Journals (Sweden)

    Alireza Haji Fathaliyan

    2018-04-01

    Full Text Available Human–robot collaboration could be advanced by facilitating the intuitive, gaze-based control of robots, and enabling robots to recognize human actions, infer human intent, and plan actions that support human goals. Traditionally, gaze tracking approaches to action recognition have relied upon computer vision-based analyses of two-dimensional egocentric camera videos. The objective of this study was to identify useful features that can be extracted from three-dimensional (3D) gaze behavior and used as inputs to machine learning algorithms for human action recognition. We investigated human gaze behavior and gaze–object interactions in 3D during the performance of a bimanual, instrumental activity of daily living: the preparation of a powdered drink. A marker-based motion capture system and binocular eye tracker were used to reconstruct 3D gaze vectors and their intersection with 3D point clouds of objects being manipulated. Statistical analyses of gaze fixation duration and saccade size suggested that some actions (pouring and stirring) may require more visual attention than other actions (reach, pick up, set down, and move). 3D gaze saliency maps, generated with high spatial resolution for six subtasks, appeared to encode action-relevant information. The “gaze object sequence” was used to capture information about the identity of objects in concert with the temporal sequence in which the objects were visually regarded. Dynamic time warping barycentric averaging was used to create a population-based set of characteristic gaze object sequences that accounted for intra- and inter-subject variability. The gaze object sequence was used to demonstrate the feasibility of a simple action recognition algorithm that utilized a dynamic time warping Euclidean distance metric. Averaged over the six subtasks, the action recognition algorithm yielded an accuracy of 96.4%, precision of 89.5%, and recall of 89.2%. This level of performance suggests that
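    The full pipeline is not reproduced in this record, but the classification step it describes (matching a gaze-object sequence against per-subtask templates with a dynamic-time-warping Euclidean distance and picking the nearest one) can be sketched as below, with gaze-object sequences encoded as one-hot vectors. The object vocabulary and template sequences are invented for illustration.

```python
# Sketch of DTW-based matching of gaze-object sequences, in the spirit of the action
# recognition step described above. Object labels, encoding and templates are made up.
import numpy as np

OBJECTS = ["pitcher", "cup", "spoon", "jar"]          # hypothetical object vocabulary

def one_hot(seq):
    m = np.zeros((len(seq), len(OBJECTS)))
    m[np.arange(len(seq)), [OBJECTS.index(o) for o in seq]] = 1.0
    return m

def dtw_distance(a, b):
    """Classic dynamic time warping with Euclidean local cost between frames."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

templates = {                                         # invented characteristic sequences
    "pour": one_hot(["pitcher", "pitcher", "cup", "cup"]),
    "stir": one_hot(["spoon", "cup", "cup", "spoon", "cup"]),
}

def recognize(gaze_object_sequence):
    """Nearest-template classification of a gaze-object sequence."""
    query = one_hot(gaze_object_sequence)
    return min(templates, key=lambda k: dtw_distance(query, templates[k]))

print(recognize(["pitcher", "cup", "cup"]))           # expected: "pour"
```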

  8. Adaptive Gaze Strategies for Locomotion with Constricted Visual Field

    Directory of Open Access Journals (Sweden)

    Colas N. Authié

    2017-07-01

    Full Text Available In retinitis pigmentosa (RP), loss of peripheral visual field accounts for most difficulties encountered in visuo-motor coordination during locomotion. The purpose of this study was to accurately assess the impact of peripheral visual field loss on gaze strategies during locomotion, and identify compensatory mechanisms. Nine RP subjects presenting a central visual field limited to 10–25° in diameter, and nine healthy subjects were asked to walk in one of three directions—straight ahead to a visual target, leftward and rightward through a door frame, with or without obstacle on the way. Whole body kinematics were recorded by motion capture, and gaze direction in space was reconstructed using an eye-tracker. Changes in gaze strategies were identified in RP subjects, including extensive exploration prior to walking, frequent fixations of the ground (even knowing no obstacle was present), of door edges, essentially of the proximal one, of obstacle edge/corner, and alternating door edges fixations when approaching the door. This was associated with more frequent, sometimes larger rapid-eye-movements, larger movements, and forward tilting of the head. Despite the visual handicap, the trajectory geometry was identical between groups, with a small decrease in walking speed in RPs. These findings identify the adaptive changes in sensory-motor coordination, in order to ensure visual awareness of the surrounding, detect changes in spatial configuration, collect information for self-motion, update the postural reference frame, and update egocentric distances to environmental objects. They are of crucial importance for the design of optimized rehabilitation procedures.

  9. Adaptive Gaze Strategies for Locomotion with Constricted Visual Field

    Science.gov (United States)

    Authié, Colas N.; Berthoz, Alain; Sahel, José-Alain; Safran, Avinoam B.

    2017-01-01

    In retinitis pigmentosa (RP), loss of peripheral visual field accounts for most difficulties encountered in visuo-motor coordination during locomotion. The purpose of this study was to accurately assess the impact of peripheral visual field loss on gaze strategies during locomotion, and identify compensatory mechanisms. Nine RP subjects presenting a central visual field limited to 10–25° in diameter, and nine healthy subjects were asked to walk in one of three directions—straight ahead to a visual target, leftward and rightward through a door frame, with or without obstacle on the way. Whole body kinematics were recorded by motion capture, and gaze direction in space was reconstructed using an eye-tracker. Changes in gaze strategies were identified in RP subjects, including extensive exploration prior to walking, frequent fixations of the ground (even knowing no obstacle was present), of door edges, essentially of the proximal one, of obstacle edge/corner, and alternating door edges fixations when approaching the door. This was associated with more frequent, sometimes larger rapid-eye-movements, larger movements, and forward tilting of the head. Despite the visual handicap, the trajectory geometry was identical between groups, with a small decrease in walking speed in RPs. These findings identify the adaptive changes in sensory-motor coordination, in order to ensure visual awareness of the surrounding, detect changes in spatial configuration, collect information for self-motion, update the postural reference frame, and update egocentric distances to environmental objects. They are of crucial importance for the design of optimized rehabilitation procedures. PMID:28798674

  10. New perspectives in gaze sensitivity research.

    Science.gov (United States)

    Davidson, Gabrielle L; Clayton, Nicola S

    2016-03-01

    Attending to where others are looking is thought to be of great adaptive benefit for animals when avoiding predators and interacting with group members. Many animals have been reported to respond to the gaze of others, by co-orienting their gaze with group members (gaze following) and/or responding fearfully to the gaze of predators or competitors (i.e., gaze aversion). Much of the literature has focused on the cognitive underpinnings of gaze sensitivity, namely whether animals have an understanding of the attention and visual perspectives in others. Yet there remain several unanswered questions regarding how animals learn to follow or avoid gaze and how experience may influence their behavioral responses. Many studies on the ontogeny of gaze sensitivity have shed light on how and when gaze abilities emerge and change across development, indicating the necessity to explore gaze sensitivity when animals are exposed to additional information from their environment as adults. Gaze aversion may be dependent upon experience and proximity to different predator types, other cues of predation risk, and the salience of gaze cues. Gaze following in the context of information transfer within social groups may also be dependent upon experience with group-members; therefore we propose novel means to explore the degree to which animals respond to gaze in a flexible manner, namely by inhibiting or enhancing gaze following responses. We hope this review will stimulate gaze sensitivity research to expand beyond the narrow scope of investigating underlying cognitive mechanisms, and to explore how gaze cues may function to communicate information other than attention.

  11. A GazeWatch Prototype

    DEFF Research Database (Denmark)

    Paulin Hansen, John; Biermann, Florian; Møllenbach, Emile

    2015-01-01

    We demonstrate potentials of adding a gaze tracking unit to a smartwatch, allowing hands-free interaction with the watch itself and control of the environment. Users give commands via gaze gestures, i.e. looking away and back to the GazeWatch. Rapid presentation of single words on the watch display ... provides a rich and effective textual interface. Finally, we exemplify how the GazeWatch can be used as a ubiquitous pointer on large displays....

  12. Gaze-Aware Streaming Solutions for the Next Generation of Mobile VR Experiences.

    Science.gov (United States)

    Lungaro, Pietro; Sjoberg, Rickard; Valero, Alfredo Jose Fanghella; Mittal, Ashutosh; Tollmar, Konrad

    2018-04-01

    This paper presents a novel approach to content delivery for video streaming services. It exploits information from connected eye-trackers embedded in the next generation of VR Head Mounted Displays (HMDs). The proposed solution aims to deliver high visual quality, in real time, around the users' fixation points while lowering the quality everywhere else. The goal of the proposed approach is to substantially reduce the overall bandwidth requirements for supporting VR video experiences while delivering high levels of user perceived quality. The prerequisites to achieve these results are: (1) mechanisms that can cope with different degrees of latency in the system and (2) solutions that support fast adaptation of video quality in different parts of a frame, without requiring a large increase in bitrate. A novel codec configuration, capable of supporting near-instantaneous video quality adaptation in specific portions of a video frame, is presented. The proposed method exploits in-built properties of HEVC encoders and while it introduces a moderate amount of error, these errors are undetectable by users. Fast adaptation is key to enabling gaze-aware streaming and its reduction in bandwidth. A testbed implementing gaze-aware streaming, together with a prototype HMD with in-built eye tracker, is presented and was used for testing with real users. The studies quantified the bandwidth savings achievable by the proposed approach and characterized the relationships between Quality of Experience (QoE) and network latency. The results showed that up to 83% less bandwidth is required to deliver high QoE levels to the users, as compared to conventional solutions.
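    As a rough illustration of the foveated-delivery idea (not the paper's HEVC mechanism), the sketch below assigns a coarser quality level to tiles farther from the reported gaze point; the tile size, fovea radius, and 0–3 quality scale are assumptions.

```python
import numpy as np

def tile_quality_map(frame_w, frame_h, tile, gaze_xy, fovea_px=200):
    """Illustrative per-tile quality selection for gaze-aware streaming.

    Tiles whose centre lies within `fovea_px` of the gaze point get the best
    quality (level 0); quality then degrades with distance up to level 3.
    """
    cols = (frame_w + tile - 1) // tile
    rows = (frame_h + tile - 1) // tile
    quality = np.zeros((rows, cols), dtype=int)
    for r in range(rows):
        for c in range(cols):
            cx, cy = (c + 0.5) * tile, (r + 0.5) * tile
            dist = np.hypot(cx - gaze_xy[0], cy - gaze_xy[1])
            quality[r, c] = min(3, int(dist // fovea_px))
    return quality  # levels would be mapped onto encoder QP offsets or bitrates

print(tile_quality_map(3840, 2160, 480, (1900.0, 1000.0)))
```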

  13. Follow My Eyes: The Gaze of Politicians Reflexively Captures the Gaze of Ingroup Voters

    Science.gov (United States)

    Liuzza, Marco Tullio; Cazzato, Valentina; Vecchione, Michele; Crostella, Filippo; Caprara, Gian Vittorio; Aglioti, Salvatore Maria

    2011-01-01

    Studies in human and non-human primates indicate that basic socio-cognitive operations are inherently linked to the power of gaze in capturing reflexively the attention of an observer. Although monkey studies indicate that the automatic tendency to follow the gaze of a conspecific is modulated by the leader-follower social status, evidence for such effects in humans is meager. Here, we used a gaze following paradigm where the directional gaze of right- or left-wing Italian political characters could influence the oculomotor behavior of ingroup or outgroup voters. We show that the gaze of Berlusconi, the right-wing leader currently dominating the Italian political landscape, potentiates and inhibits gaze following behavior in ingroup and outgroup voters, respectively. Importantly, the higher the perceived similarity in personality traits between voters and Berlusconi, the stronger the gaze interference effect. Thus, higher-order social variables such as political leadership and affiliation prepotently affect reflexive shifts of attention. PMID:21957479

  14. Is gaze following purely reflexive or goal-directed instead? Revisiting the automaticity of orienting attention by gaze cues.

    Science.gov (United States)

    Ricciardelli, Paola; Carcagno, Samuele; Vallar, Giuseppe; Bricolo, Emanuela

    2013-01-01

    Distracting gaze has been shown to elicit automatic gaze following. However, it is still debated whether the effects of perceived gaze are a simple automatic spatial orienting response or are instead sensitive to the context (i.e. goals and task demands). In three experiments, we investigated the conditions under which gaze following occurs. Participants were instructed to saccade towards one of two lateral targets. A face distracter, always present in the background, could gaze towards: (a) a task-relevant target--("matching" goal-directed gaze shift)--congruent or incongruent with the instructed direction, (b) a task-irrelevant target, orthogonal to the one instructed ("non-matching" goal-directed gaze shift), or (c) an empty spatial location (no-goal-directed gaze shift). Eye movement recordings showed faster saccadic latencies in correct trials in congruent conditions especially when the distracting gaze shift occurred before the instruction to make a saccade. Interestingly, while participants made a higher proportion of gaze-following errors (i.e. errors in the direction of the distracting gaze) in the incongruent conditions when the distracter's gaze shift preceded the instruction onset indicating an automatic gaze following, they never followed the distracting gaze when it was directed towards an empty location or a stimulus that was never the target. Taken together, these findings suggest that gaze following is likely to be a product of both automatic and goal-driven orienting mechanisms.

  15. Multigigabit wireless transfer of trigger data through millimetre wave technology

    International Nuclear Information System (INIS)

    Brenner, R; Cheng, S

    2010-01-01

    The amount of data that can be transferred from highly granular tracking detectors with several million channels is today limited by the available bandwidth of the readout links, which in turn is limited by the power budget, mass and the available space for services. The low bandwidth prevents the tracker from being fully read out in real time, which is a requirement for becoming part of the first-level trigger. For the tracker to contribute to the fast trigger decision, either the data transfer bandwidth from the tracker has to be increased so that all data can be read out in real time, or the quantity of data has to be reduced by improving its quality, or a combination of the two. A higher data transfer rate can be achieved by increasing the number of data links, the data transfer speed, or a combination of both. The quantity of data read out from the detector can be reduced by introducing on-detector intelligence. Next-generation multigigabit wireless technology has several features that make the technology attractive for use in future trackers. The technology can provide both higher bandwidth for data readout and means to build on-detector intelligence to improve the quality of data. The emerging millimetre wave technology offers components that are small in size, low in power and mass, and thus well suited for integration in trackers. In this paper the feasibility of wireless transfer of trigger data using 60 GHz radio in the future upgraded tracker at the Super Large Hadron Collider (SLHC) is investigated.
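    A back-of-envelope sketch of why full real-time readout is bandwidth-limited and how many multigigabit links it would imply; every number below (channel count, occupancy, bits per hit, usable link payload) is a placeholder assumption, not a figure from the paper.

```python
# Back-of-envelope estimate (all numbers hypothetical, for illustration only).
channels      = 5e6      # strip channels in one tracker region
occupancy     = 0.02     # fraction of channels hit per bunch crossing
bits_per_hit  = 16       # address + amplitude information per hit
crossing_rate = 40e6     # Hz, LHC bunch-crossing rate

raw_rate_bps  = channels * occupancy * bits_per_hit * crossing_rate
link_rate_bps = 3.5e9    # assumed usable payload of one 60 GHz link

links_needed = raw_rate_bps / link_rate_bps
print(f"~{raw_rate_bps/1e12:.1f} Tb/s raw, about {links_needed:.0f} 60 GHz links")
```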

  16. Conjugate Gaze Palsies

    Science.gov (United States)

    Consumer health reference entry on conjugate gaze palsies, including horizontal gaze palsy and vertical gaze palsy.

  17. Design of a Binocular Pupil and Gaze Point Detection System Utilizing High Definition Images

    Directory of Open Access Journals (Sweden)

    Yilmaz Durna

    2017-05-01

    Full Text Available This study proposes a novel binocular pupil and gaze detection system utilizing a remote full high definition (full HD) camera and employing LabVIEW. LabVIEW is inherently parallel and has fewer time-consuming algorithms. Many eye tracker applications are monocular and use low resolution cameras due to real-time image processing difficulties. We utilized the computer’s direct access memory channel for rapid data transmission and processed full HD images with LabVIEW. Full HD images make it easier to determine the center coordinates/sizes of the pupil and corneal reflection. We modified the camera so that the camera sensor passed only infrared (IR) images. Glints were taken as reference points for region of interest (ROI) area selection of the eye region in the face image. A morphologic filter was applied for erosion of noise, and a weighted average technique was used for center detection. To test system accuracy with 11 participants, we produced a visual stimulus setup to analyze each eye’s movement. A nonlinear mapping function was utilized for gaze estimation. Pupil size, pupil position, glint position and gaze point coordinates were obtained with free natural head movements in our system. This system also works at 2046 × 1086 resolution at 40 frames per second. This is assumed to correspond to 280 frames per second for 640 × 480 pixel images. Experimental results show that the average gaze detection error for 11 participants was 0.76° for the left eye, 0.89° for the right eye and 0.83° for the mean of two eyes.
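    A condensed sketch of the dark-pupil step described above (threshold, morphological erosion, intensity-weighted centre), assuming a single-channel IR eye region has already been cropped around the glint; OpenCV is used for convenience and the threshold and kernel values are placeholders, not the paper's LabVIEW implementation.

```python
import cv2
import numpy as np

def pupil_center(ir_roi, dark_thresh=40, kernel_size=5):
    """Rough pupil-centre estimate in an 8-bit IR eye ROI (illustrative)."""
    # Dark pupil: keep pixels darker than the threshold.
    _, mask = cv2.threshold(ir_roi, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    # Morphological erosion removes small noise blobs and specular specks.
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    mask = cv2.erode(mask, kernel, iterations=1)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    # Weight by darkness so the deepest-dark core dominates the estimate.
    w = 255.0 - ir_roi[ys, xs].astype(float)
    return float(np.average(xs, weights=w)), float(np.average(ys, weights=w))
```

    A calibrated nonlinear (e.g. polynomial) mapping from pupil-glint vectors to screen coordinates would then produce the gaze point, as the abstract indicates.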

  18. Gazes

    DEFF Research Database (Denmark)

    Khawaja, Iram

    , and the different strategies of positioning they utilize are studied and identified. The first strategy is to confront stereotyping prejudices and gazes, thereby attempting to position oneself in a counteracting way. The second is to transform and try to normalise external characteristics, such as clothing ... and other symbols that indicate Muslimness. A third strategy is to play along and allow the prejudice in question to remain unchallenged. A fourth is to join and participate in religious communities and develop an alternate sense of belonging to a wider community of Muslims. The concept of panoptical gazes...

  19. Evidence for a link between changes to gaze behaviour and risk of falling in older adults during adaptive locomotion.

    Science.gov (United States)

    Chapman, G J; Hollands, M A

    2006-11-01

    There is increasing evidence that gaze stabilization with respect to footfall targets plays a crucial role in the control of visually guided stepping and that there are significant changes to gaze behaviour as we age. However, past research has not measured whether age-related changes in gaze behaviour are associated with changes to stepping performance. This paper aims to identify differences in gaze behaviour between young (n=8) adults, older adults determined to be at a low risk of falling (low-risk, n=4) and older adults prone to falling (high-risk, n=4) performing an adaptive locomotor task and attempts to relate observed differences in gaze behaviour to decline in stepping performance. Participants walked at a self-selected pace along a 9m pathway stepping into two footfall target locations en route. Gaze behaviour and lower limb kinematics were recorded using an ASL 500 gaze tracker interfaced with a Vicon motion analysis system. Results showed that older adults looked at targets significantly sooner, and fixated the targets for longer, than younger adults. There were also significant differences in these measures between high and low-risk older adults. On average, high-risk older adults looked away from targets significantly sooner and demonstrated less accurate and more variable foot placements than younger adults and low-risk older adults. These findings suggest that, as we age, we need more time to plan precise stepping movements and clearly demonstrate that there are differences between low-risk and high-risk older adults in both where and when they look at future stepping targets and the precision with which they subsequently step. We propose that high-risk older adults may prioritize the planning of future actions over the accurate execution of ongoing movements and that adoption of this strategy may contribute to an increased likelihood of falls. Copyright 2005 Elsevier B.V.

  20. Eye Movements in Gaze Interaction

    DEFF Research Database (Denmark)

    Møllenbach, Emilie; Hansen, John Paulin; Lillholm, Martin

    2013-01-01

    Gaze as a sole input modality must support complex navigation and selection tasks. Gaze interaction combines specific eye movements and graphic display objects (GDOs). This paper suggests a unifying taxonomy of gaze interaction principles. The taxonomy deals with three types of eye movements...

  1. Fearful gaze cueing: gaze direction and facial expression independently influence overt orienting responses in 12-month-olds.

    Directory of Open Access Journals (Sweden)

    Reiko Matsunaka

    Full Text Available Gaze direction cues and facial expressions have been shown to influence object processing in infants. For example, infants around 12 months of age utilize others' gaze directions and facial expressions to regulate their own behaviour toward an ambiguous target (i.e., social referencing). However, the mechanism by which social signals influence overt orienting in infants is unclear. The present study examined the effects of static gaze direction cues and facial expressions (neutral vs. fearful) on overt orienting using a gaze-cueing paradigm in 6- and 12-month-old infants. Two experiments were conducted: in Experiment 1, a face with a leftward or rightward gaze direction was used as a cue, and a face with a forward gaze direction was added in Experiment 2. In both experiments, an effect of facial expression was found in 12-month-olds; no effect was found in 6-month-olds. Twelve-month-old infants exhibited more rapid overt orienting in response to fearful expressions than neutral expressions, irrespective of gaze direction. These findings suggest that gaze direction information and facial expressions independently influence overt orienting in infants, and the effect of facial expression emerges earlier than that of static gaze direction. Implications for the development of gaze direction and facial expression processing systems are discussed.

  2. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation. Eye Gaze in Intelligent User Interfaces ...

  3. Gazing and Performing

    DEFF Research Database (Denmark)

    Larsen, Jonas; Urry, John

    2011-01-01

    The Tourist Gaze [Urry J, 1990 (Sage, London)] is one of the most discussed and cited tourism books (with about 4000 citations on Google scholar). Whilst wide ranging in scope, the book is known for the Foucault-inspired concept of the tourist gaze that brings out the fundamentally visual and image...

  4. Gaze as a biometric

    Science.gov (United States)

    Yoon, Hong-Jun; Carmichael, Tandy R.; Tourassi, Georgia

    2014-03-01

    Two people may analyze a visual scene in two completely different ways. Our study sought to determine whether human gaze may be used to establish the identity of an individual. To accomplish this objective we investigated the gaze pattern of twelve individuals viewing still images with different spatial relationships. Specifically, we created 5 visual "dot-pattern" tests to be shown on a standard computer monitor. These tests challenged the viewer's capacity to distinguish proximity, alignment, and perceptual organization. Each test included 50 images of varying difficulty (total of 250 images). Eye-tracking data were collected from each individual while taking the tests. The eye-tracking data were converted into gaze velocities and analyzed with Hidden Markov Models to develop personalized gaze profiles. Using leave-one-out cross-validation, we observed that these personalized profiles could differentiate among the 12 users with classification accuracy ranging between 53% and 76%, depending on the test. This was statistically significantly better than random guessing (i.e., 8.3% or 1 out of 12). Classification accuracy was higher for the tests where the users' average gaze velocity per case was lower. The study findings support the feasibility of using gaze as a biometric or personalized biomarker. These findings could have implications in Radiology training and the development of personalized e-learning environments.
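    A minimal sketch of the velocity-plus-HMM identification scheme, using the third-party hmmlearn package as a stand-in for the authors' Hidden Markov Model implementation; the state count, single-feature velocity encoding, and data layout are assumptions.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed available third-party package

def gaze_velocities(xy, dt):
    """Convert a (T, 2) gaze trace into speed samples (units per second)."""
    return (np.linalg.norm(np.diff(xy, axis=0), axis=1) / dt).reshape(-1, 1)

def enroll(user_traces, dt, n_states=3):
    """Fit one Gaussian HMM per user on that user's gaze-velocity data."""
    models = {}
    for user, traces in user_traces.items():
        X = np.vstack([gaze_velocities(t, dt) for t in traces])
        models[user] = GaussianHMM(n_components=n_states, n_iter=50).fit(X)
    return models

def identify(models, trace, dt):
    """Attribute a new trace to the enrolled user with the highest likelihood."""
    V = gaze_velocities(trace, dt)
    return max(models, key=lambda user: models[user].score(V))
```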

  5. Gaze as a biometric

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Hong-Jun [ORNL; Carmichael, Tandy [Tennessee Technological University; Tourassi, Georgia [ORNL

    2014-01-01

    Two people may analyze a visual scene in two completely different ways. Our study sought to determine whether human gaze may be used to establish the identity of an individual. To accomplish this objective we investigated the gaze pattern of twelve individuals viewing different still images with different spatial relationships. Specifically, we created 5 visual dot-pattern tests to be shown on a standard computer monitor. These tests challenged the viewer's capacity to distinguish proximity, alignment, and perceptual organization. Each test included 50 images of varying difficulty (total of 250 images). Eye-tracking data were collected from each individual while taking the tests. The eye-tracking data were converted into gaze velocities and analyzed with Hidden Markov Models to develop personalized gaze profiles. Using leave-one-out cross-validation, we observed that these personalized profiles could differentiate among the 12 users with classification accuracy ranging between 53% and 76%, depending on the test. This was statistically significantly better than random guessing (i.e., 8.3% or 1 out of 12). Classification accuracy was higher for the tests where the users' average gaze velocity per case was lower. The study findings support the feasibility of using gaze as a biometric or personalized biomarker. These findings could have implications in Radiology training and the development of personalized e-learning environments.

  6. The Epistemology of the Gaze

    DEFF Research Database (Denmark)

    Kramer, Mette

    2007-01-01

    In psycho-semiotic film theory the gaze is often considered to be a straitjacket for the female spectator. If we approach the gaze from an empirical, so-called ‘naturalised’ lens, it is possible to regard the gaze as a functional device through which the spectator can obtain knowledge essential for her self-preservation....

  7. AmbiGaze : direct control of ambient devices by gaze

    OpenAIRE

    Velloso, Eduardo; Wirth, Markus; Weichel, Christian; Abreu Esteves, Augusto Emanuel; Gellersen, Hans-Werner Georg

    2016-01-01

    Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes...

  8. A smart car for the surface shape measurement of large antenna based on laser tracker

    Science.gov (United States)

    Gu, Yonggang; Hu, Jing; Jin, Yi; Zhai, Chao

    2012-09-01

    The geometric accuracy of the surface shape of a large antenna is an important indicator of antenna quality. Currently, high-precision measurement of large antenna surface shapes can be performed in two ways: photogrammetry and laser tracking. Photogrammetry is a rapid method, but its accuracy is not good enough. A laser tracker can achieve high precision, but it is very inconvenient to move the reflector (target mirror) over the surface of the antenna by hand during the measurement. In this paper, a smart car is therefore designed to carry the reflector. The wirelessly controlled car is lightweight and has a strong climbing ability, and a holding bracket on the car grips the reflector and raises and lowers it. During laser tracker measurements, the laser beam between the tracker and the reflector must not be interrupted, so two high-precision three-dimensional miniature electronic compasses, which monitor in real time the relative angle between the holding bracket and the laser tracker's head, are mounted on both the car and the head of the laser tracker to achieve automatic alignment between the reflector and the laser beam. With the aid of the smart car, laser tracker measurement offers both high precision and rapidity.
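    The alignment logic can be pictured as a wrapped angular error between the two compass-derived headings that the bracket controller drives toward zero; the sketch below is only a guess at that geometry, not the authors' control law.

```python
def relative_angle(bracket_heading_deg, beam_heading_deg):
    """Signed angle (deg) from the reflector bracket's facing direction to the
    required beam direction, wrapped to [-180, 180); the car would rotate the
    bracket by this amount to keep the reflector pointed along the laser beam."""
    return (beam_heading_deg - bracket_heading_deg + 180.0) % 360.0 - 180.0

# e.g. bracket compass reads 95.0 deg, required beam direction is 87.5 deg
print(relative_angle(95.0, 87.5))  # -> -7.5 (rotate the bracket by -7.5 deg)
```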

  9. Demo of Gaze Controlled Flying

    DEFF Research Database (Denmark)

    Alapetite, Alexandre; Hansen, John Paulin; Scott MacKenzie, I.

    2012-01-01

    Development of a control paradigm for unmanned aerial vehicles (UAV) is a new challenge to HCI. The demo explores how to use gaze as input for locomotion in 3D. A low-cost drone will be controlled by tracking the user’s point of regard (gaze) on a live video stream from the UAV.

  10. Gazes and Performances

    DEFF Research Database (Denmark)

    Larsen, Jonas

    ethnographic studies I spell out the embodied, hybridised, mobile and performative nature of tourist gazing, especially with regard to tourist photography. The talk draws on my recent book Tourism, Performance and the Everyday: Consuming the Orient (Routledge, 2009, with M. Haldrup) and the substantially ... Abstract: Recent literature has critiqued this notion of the 'tourist gaze' for reducing tourism to visual experiences ('sightseeing') and neglecting other senses and bodily experiences of doing tourism. A so-called 'performance turn' within tourist studies highlights how tourists experience places ... to conceptualise the corporeality of tourist bodies and the embodied actions of and interactions between tourist workers, tourists and 'locals' on various stages. It has been suggested that it is necessary to choose between gazing and performing as the tourism paradigm (Perkin and Thorns 2001). Rather than...

  11. Eye gaze performance for children with severe physical impairments using gaze-based assistive technology-A longitudinal study.

    Science.gov (United States)

    Borgestig, Maria; Sandqvist, Jan; Parsons, Richard; Falkmer, Torbjörn; Hemmingsson, Helena

    2016-01-01

    Gaze-based assistive technology (gaze-based AT) has the potential to provide children affected by severe physical impairments with opportunities for communication and activities. This study aimed to examine changes in eye gaze performance over time (time on task and accuracy) in children with severe physical impairments, without speaking ability, using gaze-based AT. A longitudinal study with a before and after design was conducted on 10 children (aged 1-15 years) with severe physical impairments, who were beginners to gaze-based AT at baseline. Thereafter, all children used the gaze-based AT in daily activities over the course of the study. Compass computer software was used to measure time on task and accuracy with eye selection of targets on screen, and tests were performed with the children at baseline, after 5 months, 9-11 months, and after 15-20 months. Findings showed that the children improved in time on task after 5 months and became more accurate in selecting targets after 15-20 months. This study indicates that these children with severe physical impairments, who were unable to speak, could improve in eye gaze performance. However, the children needed time to practice on a long-term basis to acquire skills needed to develop fast and accurate eye gaze performance.

  12. A Gaze-Driven Evolutionary Algorithm to Study Aesthetic Evaluation of Visual Symmetry

    Directory of Open Access Journals (Sweden)

    Alexis D. J. Makin

    2016-03-01

    Full Text Available Empirical work has shown that people like visual symmetry. We used a gaze-driven evolutionary algorithm technique to answer three questions about symmetry preference. First, do people automatically evaluate symmetry without explicit instruction? Second, is perfect symmetry the best stimulus, or do people prefer a degree of imperfection? Third, does initial preference for symmetry diminish after familiarity sets in? Stimuli were generated as phenotypes from an algorithmic genotype, with genes for symmetry (coded as deviation from a symmetrical template; deviation–symmetry, DS gene) and orientation (0° to 90°; orientation, ORI gene). An eye tracker identified phenotypes that were good at attracting and retaining the gaze of the observer. Resulting fitness scores determined the genotypes that passed to the next generation. We recorded changes to the distribution of DS and ORI genes over 20 generations. When participants looked for symmetry, there was an increase in high-symmetry genes. When participants looked for the patterns they preferred, there was a smaller increase in symmetry, indicating that people tolerated some imperfection. Conversely, there was no increase in symmetry during free viewing, and no effect of familiarity or orientation. This work demonstrates the viability of the evolutionary algorithm approach as a quantitative measure of aesthetic preference.
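    The gaze-driven evolutionary loop can be sketched as fitness-proportional selection plus Gaussian mutation over the two genes; the dwell-time fitness stand-in, mutation widths, and population size below are placeholders, since the actual fitness came from live eye-tracker measurements of the rendered phenotypes.

```python
import random

def evolve(measure_dwell, pop_size=20, generations=20, sigma_ds=0.05, sigma_ori=5.0):
    """Minimal gaze-driven evolutionary loop (illustrative). Each genotype has a
    deviation-from-symmetry gene DS in [0, 1] and an orientation gene ORI in
    [0, 90] degrees; `measure_dwell(ds, ori)` stands in for the eye-tracker
    fitness measurement (dwell time on the rendered phenotype)."""
    pop = [(random.random(), random.uniform(0, 90)) for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [max(measure_dwell(ds, ori), 1e-6) for ds, ori in pop]
        # Fitness-proportional selection of parents, then Gaussian mutation.
        parents = random.choices(pop, weights=fitness, k=pop_size)
        pop = [(min(max(ds + random.gauss(0, sigma_ds), 0.0), 1.0),
                min(max(ori + random.gauss(0, sigma_ori), 0.0), 90.0))
               for ds, ori in parents]
    return pop

# Stand-in fitness: observers dwell longer on near-symmetric patterns.
final_population = evolve(lambda ds, ori: 1.0 - ds)
```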

  13. Gaze interaction from bed

    DEFF Research Database (Denmark)

    Hansen, John Paulin; San Agustin, Javier; Jensen, Henrik Tomra Skovsgaard Hegner

    2011-01-01

    This paper presents a low-cost gaze tracking solution for bedbound people composed of free-ware tracking software and commodity hardware. Gaze interaction is done on a large wall-projected image, visible to all people present in the room. The hardware equipment leaves physical space free to assis...

  14. TRACKER

    CERN Multimedia

    C. Barth

    2012-01-01

      Strip Tracker At the end of 2011, the Silicon Strip Tracker participated in the very successful heavy-ion collision data-taking. With zero downtime attributed to the Strip Tracker, CMS could achieve the excellent efficiency of 96%. Thus we were able to improve on the already good uptime during pp collisions, and completed an excellent year for the Strip Tracker. The shift of responsibility for raising the high voltages at the declaration of Stable Beams from the Tracker DOC to the central crew went smoothly. The new scheme is working reliably and we improved our automatic DQM and DCS SMS services. With this further improvement we plan to discontinue calling the TK DOC at each Stable Beam; until now the TK DOC has personally checked all systems. The biggest effort of this Year-End Technical Stop was a comprehensive evaluation of the C6F14 cooling system performance with respect to future cold operation. The analysis allows dedicated planning of the system refurbishments to be executed during 2012 and LS1....

  15. Eye Gaze in Creative Sign Language

    Science.gov (United States)

    Kaneko, Michiko; Mesch, Johanna

    2013-01-01

    This article discusses the role of eye gaze in creative sign language. Because eye gaze conveys various types of linguistic and poetic information, it is an intrinsic part of sign language linguistics in general and of creative signing in particular. We discuss various functions of eye gaze in poetic signing and propose a classification of gaze…

  16. Perceptual Training in Beach Volleyball Defence: Different Effects of Gaze-Path Cueing on Gaze and Decision-Making

    Directory of Open Access Journals (Sweden)

    André eKlostermann

    2015-12-01

    Full Text Available For perceptual-cognitive skill training, a variety of intervention methods has been proposed, including the so-called colour-cueing method, which aims at superior gaze-path learning by applying visual markers. However, recent findings challenge this method, especially with regard to its actual effects on gaze behaviour. Consequently, after a preparatory study on the identification of appropriate visual cues for life-size displays, a perceptual-training experiment on decision-making in beach volleyball was conducted, contrasting two cueing interventions (functional vs. dysfunctional gaze path) with a conservative control condition (anticipation-related instructions). Gaze analyses revealed learning effects for the dysfunctional group only. Regarding decision-making, all groups showed enhanced performance with largest improvements for the control group followed by the functional and the dysfunctional group. Hence, the results confirm cueing effects on gaze behaviour, but they also question its benefit for enhancing decision-making. However, before completely denying the method’s value, optimisations should be checked regarding, for instance, cueing-pattern characteristics and gaze-related feedback.

  17. Wrist-worn pervasive gaze interaction

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Lund, Haakon; Biermann, Florian

    2016-01-01

    This paper addresses gaze interaction for smart home control, conducted from a wrist-worn unit. First we asked ten people to enact the gaze movements they would propose for e.g. opening a door or adjusting the room temperature. On the basis of their suggestions we built and tested different versions ... selection. Their subjective evaluations were positive with regard to the speed of the interaction. We conclude that gaze gesture input seems feasible for fast and brief remote control of smart home technology provided that robustness of tracking is improved.

  18. Creating Gaze Annotations in Head Mounted Displays

    DEFF Research Database (Denmark)

    Mardanbeigi, Diako; Qvarfordt, Pernilla

    2015-01-01

    To facilitate distributed communication in mobile settings, we developed GazeNote for creating and sharing gaze annotations in head mounted displays (HMDs). With gaze annotations it is possible to point out objects of interest within an image and add a verbal description. To create an annotation...

  19. The “Social Gaze Space”: A Taxonomy for Gaze-Based Communication in Triadic Interactions

    Directory of Open Access Journals (Sweden)

    Mathis Jording

    2018-02-01

    Full Text Available Humans substantially rely on non-verbal cues in their communication and interaction with others. The eyes represent a “simultaneous input-output device”: While we observe others and obtain information about their mental states (including feelings, thoughts, and intentions-to-act), our gaze simultaneously provides information about our own attention and inner experiences. This substantiates its pivotal role for the coordination of communication. The communicative and coordinative capacities – and their phylogenetic and ontogenetic impacts – become fully apparent in triadic interactions constituted in its simplest form by two persons and an object. Technological advances have sparked renewed interest in social gaze and provide new methodological approaches. Here we introduce the ‘Social Gaze Space’ as a new conceptual framework for the systematic study of gaze behavior during social information processing. It covers all possible categorical states, namely ‘partner-oriented,’ ‘object-oriented,’ ‘introspective,’ ‘initiating joint attention,’ and ‘responding joint attention.’ Different combinations of these states explain several interpersonal phenomena. We argue that this taxonomy distinguishes the most relevant interactional states along their distinctive features, and will showcase the implications for prominent social gaze phenomena. The taxonomy allows us to identify research desiderata that have been neglected so far. We argue for a systematic investigation of these phenomena and discuss some related methodological issues.

  20. Evaluation of Binocular Eye Trackers and Algorithms for 3D Gaze Interaction in Virtual Reality Environments

    OpenAIRE

    Thies Pfeiffer; Ipke Wachsmuth; Marc E. Latoschik

    2009-01-01

    Tracking user's visual attention is a fundamental aspect in novel human-computer interaction paradigms found in Virtual Reality. For example, multimodal interfaces or dialogue-based communications with virtual and real agents greatly benefit from the analysis of the user's visual attention as a vital source for deictic references or turn-taking signals. Current approaches to determine visual attention rely primarily on monocular eye trackers. Hence they are restricted to the interpretation of...
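    One common way to turn binocular gaze data into a 3D point of regard (whether or not it matches the specific algorithms evaluated in this work) is to take the midpoint of the shortest segment between the two eyes' gaze rays; the sketch below assumes calibrated eye positions and unit gaze directions in a shared coordinate frame.

```python
import numpy as np

def gaze_point_3d(p_left, d_left, p_right, d_right):
    """Estimate a 3-D point of regard as the midpoint of the shortest segment
    between the left and right gaze rays (origins p_*, directions d_*)."""
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)
    w0 = p_left - p_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w0, d_right @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:          # near-parallel rays: vergence ill-defined
        return None
    s = (b * e - c * d) / denom    # parameter along the left ray
    t = (a * e - b * d) / denom    # parameter along the right ray
    return 0.5 * ((p_left + s * d_left) + (p_right + t * d_right))

# Example: eyes 6.5 cm apart, both converging on a point ~1 m ahead.
left = np.array([-0.0325, 0.0, 0.0])
right = np.array([0.0325, 0.0, 0.0])
target = np.array([0.1, 0.0, 1.0])
print(gaze_point_3d(left, target - left, right, target - right))
```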

  1. Owners' direct gazes increase dogs' attention-getting behaviors.

    Science.gov (United States)

    Ohkita, Midori; Nagasawa, Miho; Mogi, Kazutaka; Kikusui, Takefumi

    2016-04-01

    This study examined whether dogs gain information about human's attention via their gazes and whether they change their attention-getting behaviors (i.e., whining and whimpering, looking at their owners' faces, pawing, and approaching their owners) in response to their owners' direct gazes. The results showed that when the owners gazed at their dogs, the durations of whining and whimpering and looking at the owners' faces were longer than when the owners averted their gazes. In contrast, there were no differences in duration of pawing and likelihood of approaching the owners between the direct and averted gaze conditions. Therefore, owners' direct gazes increased the behaviors that acted as distant signals and did not necessarily involve touching the owners. We suggest that dogs are sensitive to human gazes, and this sensitivity may act as attachment signals to humans, and may contribute to close relationships between humans and dogs. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Look Together: Analyzing Gaze Coordination with Epistemic Network Analysis

    Directory of Open Access Journals (Sweden)

    Sean eAndrist

    2015-07-01

    Full Text Available When conversing and collaborating in everyday situations, people naturally and interactively align their behaviors with each other across various communication channels, including speech, gesture, posture, and gaze. Having access to a partner's referential gaze behavior has been shown to be particularly important in achieving collaborative outcomes, but the process in which people's gaze behaviors unfold over the course of an interaction and become tightly coordinated is not well understood. In this paper, we present work to develop a deeper and more nuanced understanding of coordinated referential gaze in collaborating dyads. We recruited 13 dyads to participate in a collaborative sandwich-making task and used dual mobile eye tracking to synchronously record each participant's gaze behavior. We used a relatively new analysis technique—epistemic network analysis—to jointly model the gaze behaviors of both conversational participants. In this analysis, network nodes represent gaze targets for each participant, and edge strengths convey the likelihood of simultaneous gaze to the connected target nodes during a given time-slice. We divided collaborative task sequences into discrete phases to examine how the networks of shared gaze evolved over longer time windows. We conducted three separate analyses of the data to reveal (1) properties and patterns of how gaze coordination unfolds throughout an interaction sequence, (2) optimal time lags of gaze alignment within a dyad at different phases of the interaction, and (3) differences in gaze coordination patterns for interaction sequences that lead to breakdowns and repairs. In addition to contributing to the growing body of knowledge on the coordination of gaze behaviors in joint activities, this work has implications for the design of future technologies that engage in situated interactions with human users.
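    A much-simplified stand-in for the network construction: treating each time slice as a pair of gaze targets and using co-occurrence frequencies as edge strengths (epistemic network analysis proper adds windowing and normalization steps not shown here); the target names and data are hypothetical.

```python
from collections import Counter
from itertools import product

def cogaze_edges(gaze_a, gaze_b, targets):
    """Edge strengths between participant A's and B's gaze targets: the fraction
    of time slices in which A looked at target i while B looked at target j."""
    assert len(gaze_a) == len(gaze_b), "sequences must be time-aligned"
    counts = Counter(zip(gaze_a, gaze_b))
    n = len(gaze_a)
    return {(i, j): counts[(i, j)] / n for i, j in product(targets, targets)}

targets = ["bread", "knife", "partner_face", "instructions"]
edges = cogaze_edges(
    ["bread", "bread", "knife", "partner_face"],
    ["bread", "knife", "knife", "partner_face"],
    targets)
print(edges[("bread", "bread")])  # fraction of slices with joint gaze on bread
```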

  3. INNER TRACKER

    CERN Multimedia

    Karl Gill

    A series of important milestones has been passed during the last 3 months. With the delivery of refurbished cooling systems, the pixel and strip systems have been brought back into operation after long shutdowns. Pixels has been operating since the reinsertion of FPIX in April, and has been running at 4°C since May 16, when the bulkhead thermal screen was commissioned. More recently, on June 10 the Strip Tracker was powered up in its entirety, with cooling fluid circulating at 4°C, allowing commissioning of the Strip Tracker to proceed at full speed. The full Tracker is well on course to be ready for CRAFT with Strip Tracker readout operating in ‘peak’ mode, and also remains on track to be ready for beam operations in the Autumn in ‘deconvolution’ readout mode. The main Tracker activity during the shutdown was the cooling plant refurbishment for the Strips and Pixels systems. The objectives were to reduce the serious leaks observed in 2008 and improve the longevity...

  4. Facilitated orienting underlies fearful face-enhanced gaze cueing of spatial location

    Directory of Open Access Journals (Sweden)

    Joshua M. Carlson

    2016-12-01

    Full Text Available Faces provide a platform for non-verbal communication through emotional expression and eye gaze. Fearful facial expressions are salient indicators of potential threat within the environment, which automatically capture observers’ attention. However, the degree to which fearful facial expressions facilitate attention to others’ gaze is unresolved. Given that fearful gaze indicates the location of potential threat, it was hypothesized that fearful gaze facilitates location processing. To test this hypothesis, a gaze cueing study with fearful and neutral faces assessing target localization was conducted. The task consisted of leftward, rightward, and forward/straight gaze trials. The inclusion of forward gaze trials allowed for the isolation of orienting and disengagement components of gaze-directed attention. The results suggest that both neutral and fearful gaze modulates attention through orienting and disengagement components. Fearful gaze, however, resulted in quicker orienting than neutral gaze. Thus, fearful faces enhance gaze cueing of spatial location through facilitated orienting.

  5. Judgments at Gaze Value: Gaze Cuing in Banner Advertisements, Its Effect on Attention Allocation and Product Judgments

    Directory of Open Access Journals (Sweden)

    Johanna Palcu

    2017-06-01

    Full Text Available Banner advertising is a popular means of promoting products and brands online. Although banner advertisements are often designed to be particularly attention grabbing, they frequently go unnoticed. Applying an eye-tracking procedure, the present research aimed to (a) determine whether presenting human faces (static or animated) in banner advertisements is an adequate tool for capturing consumers’ attention and thus overcoming the frequently observed phenomenon of banner blindness, (b) to examine whether the gaze of a featured face possesses the ability to direct consumers’ attention toward specific elements (i.e., the product) in an advertisement, and (c) to establish whether the gaze direction of an advertised face influences consumers’ subsequent evaluation of the advertised product. We recorded participants’ eye gaze while they viewed a fictional online shopping page displaying banner advertisements that featured either no human face or a human face that was either static or animated and involved different gaze directions (toward or away from the advertised product). Moreover, we asked participants to subsequently evaluate a set of products, one of which was the product previously featured in the banner advertisement. Results showed that, when advertisements included a human face, participants’ attention was more attracted by and they looked longer at animated compared with static banner advertisements. Moreover, when a face gazed toward the product region, participants’ likelihood of looking at the advertised product increased regardless of whether the face was animated or not. Most important, gaze direction influenced subsequent product evaluations; that is, consumers indicated a higher intention to buy a product when it was previously presented in a banner advertisement that featured a face that gazed toward the product. The results suggest that while animation in banner advertising constitutes a salient feature that captures consumers

  6. Judgments at Gaze Value: Gaze Cuing in Banner Advertisements, Its Effect on Attention Allocation and Product Judgments.

    Science.gov (United States)

    Palcu, Johanna; Sudkamp, Jennifer; Florack, Arnd

    2017-01-01

    Banner advertising is a popular means of promoting products and brands online. Although banner advertisements are often designed to be particularly attention grabbing, they frequently go unnoticed. Applying an eye-tracking procedure, the present research aimed to (a) determine whether presenting human faces (static or animated) in banner advertisements is an adequate tool for capturing consumers' attention and thus overcoming the frequently observed phenomenon of banner blindness, (b) to examine whether the gaze of a featured face possesses the ability to direct consumers' attention toward specific elements (i.e., the product) in an advertisement, and (c) to establish whether the gaze direction of an advertised face influences consumers subsequent evaluation of the advertised product. We recorded participants' eye gaze while they viewed a fictional online shopping page displaying banner advertisements that featured either no human face or a human face that was either static or animated and involved different gaze directions (toward or away from the advertised product). Moreover, we asked participants to subsequently evaluate a set of products, one of which was the product previously featured in the banner advertisement. Results showed that, when advertisements included a human face, participants' attention was more attracted by and they looked longer at animated compared with static banner advertisements. Moreover, when a face gazed toward the product region, participants' likelihood of looking at the advertised product increased regardless of whether the face was animated or not. Most important, gaze direction influenced subsequent product evaluations; that is, consumers indicated a higher intention to buy a product when it was previously presented in a banner advertisement that featured a face that gazed toward the product. The results suggest that while animation in banner advertising constitutes a salient feature that captures consumers' visual attention, gaze

  7. INNER TRACKER

    CERN Multimedia

    P. Sharp

    The CMS Inner Tracking Detector continues to make good progress. The objective for 2007 is to deliver to CMS a completed, installed, commissioned and calibrated Tracking System (Silicon Strip and Pixels), aligned to < 100 µm, in April 2008, ready for the first physics collisions at LHC. On 21 March 2007, the integration of the CMS Silicon Strip Tracker was completed with the successful integration of TEC- into the Tracker Support Tube (TST). Since then ~25% of the complete Tracker system has been commissioned at the TIF at both room temperature and operating temperature (-10 °C), and the Tracker Community has gained very valuable experience in operating, calibrating and aligning the Tracker at the TIF before it is prepared for transportation to P5 in July 2007. The CMS Pixel System continues to make good progress. Module and Plaquette production is very well advanced. The first 25% of the Forward Pixel detector (Fpix) was delivered to CERN in April and the second 25% will be shipped to CERN on 19 ...

  8. INNER TRACKER

    CERN Multimedia

    K. Gill.

    The clear highlight of recent months was switching on the Tracker to capture the first LHC collisions with 450 GeV beams. This was during the first trial run of the LHC on 23rd November. On that day, the Tracker Outer Barrel (TOB) was powered and the detector performance was excellent, in accord with our expectations. Since then, the full Tracker, strips and pixels, has been powered up during “quiet” beam periods when there was judged to be little risk of damage due to sudden beam losses. All Tracker systems performed very well, considering the beam and trigger conditions in place, and we now eagerly anticipate the first collisions with stable beams. Besides this very intense and exciting recent period there has been a lot of other activity in the last 6 months. The full Tracker participated in CRAFT09 and operations of all systems went very smoothly for both pixels and strips, validating all the meticulous work that had taken place during the long shutdown, the subsequent re-commissionin...

  9. Gaze perception in social anxiety and social anxiety disorder

    Directory of Open Access Journals (Sweden)

    Lars eSchulze

    2013-12-01

    Full Text Available Clinical observations suggest abnormal gaze perception to be an important indicator of social anxiety disorder (SAD). Experimental research has so far paid relatively little attention to the study of gaze perception in SAD. In this article we first discuss gaze perception in healthy human beings before reviewing self-referential and threat-related biases of gaze perception in clinical and non-clinical socially anxious samples. Relative to controls, socially anxious individuals exhibit an enhanced self-directed perception of gaze directions and demonstrate a pronounced fear of direct eye contact, though findings are less consistent regarding the avoidance of mutual gaze in SAD. Prospects for future research and clinical implications are discussed.

  10. Wolves (Canis lupus) and Dogs (Canis familiaris) Differ in Following Human Gaze Into Distant Space But Respond Similar to Their Packmates’ Gaze

    Science.gov (United States)

    Werhahn, Geraldine; Virányi, Zsófia; Barrera, Gabriela; Sommese, Andrea; Range, Friederike

    2017-01-01

    Gaze following into distant space is defined as visual co-orientation with another individual’s head direction allowing the gaze follower to gain information on its environment. Human and nonhuman animals share this basic gaze following behavior, suggested to rely on a simple reflexive mechanism and believed to be an important prerequisite for complex forms of social cognition. Pet dogs differ from other species in that they follow only communicative human gaze clearly addressed to them. However, in an earlier experiment we showed that wolves follow human gaze into distant space. Here we set out to investigate whether domestication has affected gaze following in dogs by comparing pack-living dogs and wolves raised and kept under the same conditions. In Study 1 we found that in contrast to the wolves, these dogs did not follow minimally communicative human gaze into distant space in the same test paradigm. In the observational Study 2 we found that pack-living dogs and wolves, similarly vigilant to environmental stimuli, follow the spontaneous gaze of their conspecifics similarly often. Our findings suggest that domestication did not affect the gaze following ability of dogs itself. The results raise hypotheses about which other dog skills might have been altered through domestication that may have influenced their performance in Study 1. Because following human gaze in dogs might be influenced by special evolutionary as well as developmental adaptations to interactions with humans, we suggest that comparing dogs to other animal species might be more informative when done in intraspecific social contexts. PMID:27244538

  11. Wolves (Canis lupus) and dogs (Canis familiaris) differ in following human gaze into distant space but respond similar to their packmates' gaze.

    Science.gov (United States)

    Werhahn, Geraldine; Virányi, Zsófia; Barrera, Gabriela; Sommese, Andrea; Range, Friederike

    2016-08-01

    Gaze following into distant space is defined as visual co-orientation with another individual's head direction allowing the gaze follower to gain information on its environment. Human and nonhuman animals share this basic gaze following behavior, suggested to rely on a simple reflexive mechanism and believed to be an important prerequisite for complex forms of social cognition. Pet dogs differ from other species in that they follow only communicative human gaze clearly addressed to them. However, in an earlier experiment we showed that wolves follow human gaze into distant space. Here we set out to investigate whether domestication has affected gaze following in dogs by comparing pack-living dogs and wolves raised and kept under the same conditions. In Study 1 we found that in contrast to the wolves, these dogs did not follow minimally communicative human gaze into distant space in the same test paradigm. In the observational Study 2 we found that pack-living dogs and wolves, similarly vigilant to environmental stimuli, follow the spontaneous gaze of their conspecifics similarly often. Our findings suggest that domestication did not affect the gaze following ability of dogs itself. The results raise hypotheses about which other dog skills might have been altered through domestication that may have influenced their performance in Study 1. Because following human gaze in dogs might be influenced by special evolutionary as well as developmental adaptations to interactions with humans, we suggest that comparing dogs to other animal species might be more informative when done in intraspecific social contexts. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Design gaze simulation for people with visual disability

    NARCIS (Netherlands)

    Qiu, S.

    2017-01-01

    In face-to-face communication, eye gaze is integral to a conversation to supplement verbal language. The sighted often use eye gaze to convey nonverbal information in social interactions, which a blind conversation partner cannot access or react to. My doctoral research is to design gaze simulation

  13. Mental state attribution and the gaze cueing effect.

    Science.gov (United States)

    Cole, Geoff G; Smith, Daniel T; Atkinson, Mark A

    2015-05-01

    Theory of mind is said to be possessed by an individual if he or she is able to impute mental states to others. Recently, some authors have demonstrated that such mental state attributions can mediate the "gaze cueing" effect, in which observation of another individual shifts an observer's attention. One question that follows from this work is whether such mental state attributions produce mandatory modulations of gaze cueing. Employing the basic gaze cueing paradigm, together with a technique commonly used to assess mental-state attribution in nonhuman animals, we manipulated whether the gazing agent could see the same thing as the participant (i.e., the target) or had this view obstructed by a physical barrier. We found robust gaze cueing effects, even when the observed agent in the display could not see the same thing as the participant. These results suggest that the attribution of "seeing" does not necessarily modulate the gaze cueing effect.

  14. Discussion and Future Directions for Eye Tracker Development

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Mulvey, Fiona; Mardanbegi, Diako

    2011-01-01

    Eye and gaze tracking have a long history but there is still plenty of room for further development. In this concluding chapter for Section 6, we consider future perspectives for the development of eye and gaze tracking.

  15. Training for eye contact modulates gaze following in dogs.

    Science.gov (United States)

    Wallis, Lisa J; Range, Friederike; Müller, Corsin A; Serisier, Samuel; Huber, Ludwig; Virányi, Zsófia

    2015-08-01

    Following human gaze in dogs and human infants can be considered a socially facilitated orientation response, which in object choice tasks is modulated by human-given ostensive cues. Despite their similarities to human infants, and extensive skills in reading human cues in foraging contexts, no evidence that dogs follow gaze into distant space has been found. We re-examined this question, and additionally whether dogs' propensity to follow gaze was affected by age and/or training to pay attention to humans. We tested a cross-sectional sample of 145 border collies aged 6 months to 14 years with different amounts of training over their lives. The dogs' gaze-following response in test and control conditions before and after training for initiating eye contact with the experimenter was compared with that of a second group of 13 border collies trained to touch a ball with their paw. Our results provide the first evidence that dogs can follow human gaze into distant space. Although we found no age effect on gaze following, the youngest and oldest age groups were more distractible, which resulted in a higher number of looks in the test and control conditions. Extensive lifelong formal training as well as short-term training for eye contact decreased dogs' tendency to follow gaze and increased their duration of gaze to the face. The reduction in gaze following after training for eye contact cannot be explained by fatigue or short-term habituation, as in the second group gaze following increased after a different training of the same length. Training for eye contact created a competing tendency to fixate the face, which prevented the dogs from following the directional cues. We conclude that following human gaze into distant space in dogs is modulated by training, which may explain why dogs perform poorly in comparison to other species in this task.

  16. Speaker gaze increases information coupling between infant and adult brains.

    Science.gov (United States)

    Leong, Victoria; Byrne, Elizabeth; Clackson, Kaili; Georgieva, Stanimira; Lam, Sarah; Wass, Sam

    2017-12-12

    When infants and adults communicate, they exchange social signals of availability and communicative intention such as eye gaze. Previous research indicates that when communication is successful, close temporal dependencies arise between adult speakers' and listeners' neural activity. However, it is not known whether similar neural contingencies exist within adult-infant dyads. Here, we used dual-electroencephalography to assess whether direct gaze increases neural coupling between adults and infants during screen-based and live interactions. In experiment 1 ( n = 17), infants viewed videos of an adult who was singing nursery rhymes with ( i ) direct gaze (looking forward), ( ii ) indirect gaze (head and eyes averted by 20°), or ( iii ) direct-oblique gaze (head averted but eyes orientated forward). In experiment 2 ( n = 19), infants viewed the same adult in a live context, singing with direct or indirect gaze. Gaze-related changes in adult-infant neural network connectivity were measured using partial directed coherence. Across both experiments, the adult had a significant (Granger) causal influence on infants' neural activity, which was stronger during direct and direct-oblique gaze relative to indirect gaze. During live interactions, infants also influenced the adult more during direct than indirect gaze. Further, infants vocalized more frequently during live direct gaze, and individual infants who vocalized longer also elicited stronger synchronization from the adult. These results demonstrate that direct gaze strengthens bidirectional adult-infant neural connectivity during communication. Thus, ostensive social signals could act to bring brains into mutual temporal alignment, creating a joint-networked state that is structured to facilitate information transfer during early communication and learning. Copyright © 2017 the Author(s). Published by PNAS.

  17. Embodied social robots trigger gaze following in real-time

    OpenAIRE

    Wiese, Eva; Weis, Patrick; Lofaro, Daniel

    2018-01-01

    In human-human interaction, we use information from gestures, facial expressions and gaze direction to make inferences about what interaction partners think, feel or intend to do next. Observing changes in gaze direction triggers shifts of attention to gazed-at locations and helps establish shared attention between gazer and observer - a prerequisite for more complex social skills like mentalizing, action understanding and joint action. The ability to follow others’ gaze develops early in lif...

  18. Adaptive gaze control for object detection

    NARCIS (Netherlands)

    De Croon, G.C.H.E.; Postma, E.O.; Van den Herik, H.J.

    2011-01-01

    We propose a novel gaze-control model for detecting objects in images. The model, named act-detect, uses the information from local image samples in order to shift its gaze towards object locations. The model constitutes two main contributions. The first contribution is that the model’s setup makes

  19. Attention to gaze and emotion in schizophrenia.

    Science.gov (United States)

    Schwartz, Barbara L; Vaidya, Chandan J; Howard, James H; Deutsch, Stephen I

    2010-11-01

    Individuals with schizophrenia have difficulty interpreting social and emotional cues such as facial expression, gaze direction, body position, and voice intonation. Nonverbal cues are powerful social signals but are often processed implicitly, outside the focus of attention. The aim of this research was to assess implicit processing of social cues in individuals with schizophrenia. Patients with schizophrenia or schizoaffective disorder and matched controls performed a primary task of word classification with social cues in the background. Participants were asked to classify target words (LEFT/RIGHT) by pressing a key that corresponded to the word, in the context of facial expressions with eye gaze averted to the left or right. Although facial expression and gaze direction were irrelevant to the task, these facial cues influenced word classification performance. Participants were slower to classify target words (e.g., LEFT) that were incongruent to gaze direction (e.g., eyes averted to the right) compared to target words (e.g., LEFT) that were congruent to gaze direction (e.g., eyes averted to the left), but this only occurred for expressions of fear. This pattern did not differ for patients and controls. The results showed that threat-related signals capture the attention of individuals with schizophrenia. These data suggest that implicit processing of eye gaze and fearful expressions is intact in schizophrenia. (c) 2010 APA, all rights reserved

  20. A comparison of facial color pattern and gazing behavior in canid species suggests gaze communication in gray wolves (Canis lupus).

    Directory of Open Access Journals (Sweden)

    Sayoko Ueda

    Full Text Available As facial color pattern around the eyes has been suggested to serve various adaptive functions related to the gaze signal, we compared the patterns among 25 canid species, focusing on the gaze signal, to estimate the function of facial color pattern in these species. The facial color patterns of the studied species could be categorized into the following three types based on contrast indices relating to the gaze signal: A-type (both pupil position in the eye outline and eye position in the face are clear), B-type (only the eye position is clear), and C-type (both the pupil and eye position are unclear). A-type faces with light-colored irises were observed in most studied species of the wolf-like clade and some of the red fox-like clade. A-type faces tended to be observed in species living in family groups all year-round, whereas B-type faces tended to be seen in solo/pair-living species. The duration of gazing behavior during which the facial gaze-signal is displayed to the other individual was longest in gray wolves with typical A-type faces, of intermediate length in fennec foxes with typical B-type faces, and shortest in bush dogs with typical C-type faces. These results suggest that the facial color pattern of canid species is related to their gaze communication and that canids with A-type faces, especially gray wolves, use the gaze signal in conspecific communication.

  1. Reading the mind from eye gaze.

    NARCIS (Netherlands)

    Christoffels, I.; Young, A.W.; Owen, A.M.; Scott, S.K.; Keane, J.; Lawrence, A.D.

    2002-01-01

    S. Baron-Cohen (1997) has suggested that the interpretation of gaze plays an important role in a normal functioning theory of mind (ToM) system. Consistent with this suggestion, functional imaging research has shown that both ToM tasks and eye gaze processing engage a similar region of the posterior

  2. Autonomous Star Tracker Algorithms

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren

    1998-01-01

    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performances.

  3. Latvijas Gaze buyback likely to flop

    Index Scriptorium Estoniae

    2003-01-01

    Itera, which owns a quarter of the Latvian gas company Latvijas Gaze, intends to complete the sale of nine percent of the Latvian company's shares to Gazprom in the near future. Gazprom currently controls 25 percent of Latvijas Gaze's shares, Ruhrgas holds 28.66 percent, and E.ON Energie AG holds 18.06 percent.

  4. Fitness Tracker for Weight Lifting Style Workouts

    Energy Technology Data Exchange (ETDEWEB)

    Wihl, B. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    This document proposes an early, high level design for a fitness tracking system which can automatically log weight lifting style workouts. The system will provide an easy to use interface both physically through the use of several wireless wristband style motion trackers worn on the limbs, and graphically through a smartphone application. Exercise classification will be accomplished by calibration of the user’s specific motions. The system will accurately track a user’s workout, miscounting no more than one repetition in every 20, have sufficient battery life to last several hours, work with existing smartphones and have a cost similar to those of current fitness tracking devices. This document presents the mission background, current state-of-the-art, stakeholders and their expectations, the proposed system’s context and concepts, implementation concepts, system requirements, first sublevel function decomposition, possible risks for the system, and a reflection on the design process.
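
    The proposal above fixes an accuracy target (miscounting no more than one repetition in every 20) but does not describe the counting algorithm itself. Purely as an illustrative sketch, and not part of the proposed design, the Python snippet below counts repetitions from a wrist-worn tracker's acceleration magnitude using baseline removal and peak detection; the function name, sampling rate and thresholds are assumptions chosen for the example.

        import numpy as np
        from scipy.signal import find_peaks

        def count_reps(accel_magnitude, fs_hz=50, min_rep_period_s=1.0, rep_prominence=0.5):
            """Count repetitions in a wrist-worn accelerometer trace (values in g)."""
            # Remove gravity and slow drift with a 1-second moving-average baseline.
            window = int(fs_hz)
            baseline = np.convolve(accel_magnitude, np.ones(window) / window, mode="same")
            motion = accel_magnitude - baseline
            # Each repetition shows up as one prominent peak; the minimum spacing keeps
            # noise within a single repetition from being counted twice.
            peaks, _ = find_peaks(motion,
                                  prominence=rep_prominence,
                                  distance=int(min_rep_period_s * fs_hz))
            return len(peaks)

    In a system of the kind described, a per-exercise calibration pass over the user's own motions would be the natural place to tune the prominence and spacing parameters.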

  5. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    Science.gov (United States)

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  6. Gliding and Saccadic Gaze Gesture Recognition in Real Time

    DEFF Research Database (Denmark)

    Rozado, David; San Agustin, Javier; Rodriguez, Francisco

    2012-01-01

    , and their corresponding real-time recognition algorithms, Hierarchical Temporal Memory networks and the Needleman-Wunsch algorithm for sequence alignment. Our results show how a specific combination of gaze gesture modality, namely saccadic gaze gestures, and recognition algorithm, Needleman-Wunsch, allows for reliable...... usage of intentional gaze gestures to interact with a computer with accuracy rates of up to 98% and acceptable completion speed. Furthermore, the gesture recognition engine does not interfere with otherwise standard human-machine gaze interaction generating therefore, very low false positive rates...
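
    For readers unfamiliar with the sequence-alignment step named in this record, the sketch below is a generic Needleman-Wunsch global alignment applied to strings of quantized saccade directions (for example 'R', 'D', 'L', 'U'); the gesture alphabet, scoring values and acceptance rule are illustrative assumptions, not the authors' implementation.

        def needleman_wunsch(observed, template, match=2, mismatch=-1, gap=-1):
            """Global alignment score between an observed saccade-direction string and a gesture template."""
            n, m = len(observed), len(template)
            # Dynamic-programming table of alignment scores, initialised with gap penalties.
            score = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                score[i][0] = i * gap
            for j in range(1, m + 1):
                score[0][j] = j * gap
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    diag = score[i - 1][j - 1] + (match if observed[i - 1] == template[j - 1] else mismatch)
                    score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
            return score[n][m]

        # A gesture would be accepted when the observed sequence scores above a chosen
        # threshold against one of the stored templates.
        print(needleman_wunsch("RDLU", "RDLLU"))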

  7. Face age modulates gaze following in young adults.

    Science.gov (United States)

    Ciardo, Francesca; Marino, Barbara F M; Actis-Grosso, Rossana; Rossetti, Angela; Ricciardelli, Paola

    2014-04-22

    Gaze-following behaviour is considered crucial for social interactions which are influenced by social similarity. We investigated whether the degree of similarity, as indicated by the perceived age of another person, can modulate gaze following. Participants of three different age-groups (18-25; 35-45; over 65) performed an eye movement (a saccade) towards an instructed target while ignoring the gaze-shift of distracters of different age-ranges (6-10; 18-25; 35-45; over 70). The results show that gaze following was modulated by the distracter face age only for young adults. Particularly, the over 70 year-old distracters exerted the least interference effect. The distracters of a similar age-range as the young adults (18-25; 35-45) had the most effect, indicating a blurred own-age bias (OAB) only for the young age group. These findings suggest that face age can modulate gaze following, but this modulation could be due to factors other than just OAB (e.g., familiarity).

  8. Gaze Bias in Preference Judgments by Younger and Older Adults

    Directory of Open Access Journals (Sweden)

    Toshiki Saito

    2017-08-01

    Full Text Available Individuals’ gaze behavior reflects the choice they will ultimately make. For example, people confronting a choice among multiple stimuli tend to look longer at stimuli that are subsequently chosen than at other stimuli. This tendency, called the gaze bias effect, is a key aspect of visual decision-making. Nevertheless, no study has examined the generality of the gaze bias effect in older adults. Here, we used a two-alternative forced-choice task (2AFC) to compare the gaze behavior reflective of different stages of decision processes demonstrated by younger and older adults. Participants who had viewed two faces were instructed to choose the one that they liked/disliked or the one that they judged to be more/less similar to their own face. Their eye movements were tracked while they chose. The results show that the gaze bias effect occurred during the remaining time in both age groups irrespective of the decision type. However, no gaze bias effect was observed for the preference judgment during the first dwell time. Our study demonstrated that the gaze bias during the remaining time occurred regardless of decision-making task and age. Further study using diverse participants, such as clinic patients or infants, may help to generalize the gaze bias effect and to elucidate the mechanisms underlying the gaze bias.

  9. Direct Speaker Gaze Promotes Trust in Truth-Ambiguous Statements.

    Science.gov (United States)

    Kreysa, Helene; Kessler, Luise; Schweinberger, Stefan R

    2016-01-01

    A speaker's gaze behaviour can provide perceivers with a multitude of cues which are relevant for communication, thus constituting an important non-verbal interaction channel. The present study investigated whether direct eye gaze of a speaker affects the likelihood of listeners believing truth-ambiguous statements. Participants were presented with videos in which a speaker produced such statements with either direct or averted gaze. The statements were selected through a rating study to ensure that participants were unlikely to know a-priori whether they were true or not (e.g., "sniffer dogs cannot smell the difference between identical twins"). Participants indicated in a forced-choice task whether or not they believed each statement. We found that participants were more likely to believe statements by a speaker looking at them directly, compared to a speaker with averted gaze. Moreover, when participants disagreed with a statement, they were slower to do so when the statement was uttered with direct (compared to averted) gaze, suggesting that the process of rejecting a statement as untrue may be inhibited when that statement is accompanied by direct gaze.

  10. Animating Flames: Recovering Fire-Gazing as a Moving-Image Technology

    Directory of Open Access Journals (Sweden)

    Anne Sullivan

    2017-12-01

    Full Text Available In nineteenth-century England, the industrialization of heat and light rendered fire-gazing increasingly obsolete. Fire-gazing is a form of flame-based reverie that typically involves a solitary viewer who perceives animated, moving images dissolving into and out of view in a wood or coal fire. When fire-gazing, the viewer may perceive arbitrary pictures, fantastic landscapes, or more familiar forms, such as the faces of friends and family. This article recovers fire-gazing as an early and more intimate animation technology by examining remediations of fire-gazing in print. After reviewing why an analysis of fire-gazing requires a joint literary and media history approach, I build from Michael Faraday’s mid-nineteenth-century theorization of flame as a moving image to argue that fire-gazing must be included in the history of animation technologies. I then demonstrate the uneasy connections that form between automatism, mechanical reproduction, and creativity in Leigh Hunt’s description of fire-gazing in his 1811 essay ‘A Day by the Fire’. The tension between conscious and unconscious modes of production culminates in a discussion of fireside scenes of (re)animation in Charles Dickens’s 'Our Mutual Friend' (1864–65), including those featuring one of his more famous fire-gazers, Lizzie Hexam. The article concludes with a brief discussion of the 1908 silent film 'Fireside Reminiscences' as an example of the continued remediations of fire-gazing beyond the nineteenth century.

  11. Just one look: Direct gaze briefly disrupts visual working memory.

    Science.gov (United States)

    Wang, J Jessica; Apperly, Ian A

    2017-04-01

    Direct gaze is a salient social cue that affords rapid detection. A body of research suggests that direct gaze enhances performance on memory tasks (e.g., Hood, Macrae, Cole-Davies, & Dias, Developmental Science, 1, 67-71, 2003). Nonetheless, other studies highlight the disruptive effect direct gaze has on concurrent cognitive processes (e.g., Conty, Gimmig, Belletier, George, & Huguet, Cognition, 115(1), 133-139, 2010). This discrepancy raises questions about the effects direct gaze may have on concurrent memory tasks. We addressed this topic by employing a change detection paradigm, where participants retained information about the color of small sets of agents. Experiment 1 revealed that, despite the irrelevance of the agents' eye gaze to the memory task at hand, participants were worse at detecting changes when the agents looked directly at them compared to when the agents looked away. Experiment 2 showed that the disruptive effect was relatively short-lived. Prolonged presentation of direct gaze led to recovery from the initial disruption, rather than a sustained disruption on change detection performance. The present study provides the first evidence that direct gaze impairs visual working memory with a rapidly-developing yet short-lived effect even when there is no need to attend to agents' gaze.

  12. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. The influence of crystalline lens accommodation on post-saccadic oscillations in pupil-based eye trackers.

    Science.gov (United States)

    Nyström, Marcus; Andersson, Richard; Magnusson, Måns; Pansell, Tony; Hooge, Ignace

    2015-02-01

    It is well known that the crystalline lens (henceforth lens) can oscillate (or 'wobble') relative to the eyeball at the end of saccades. Recent research has proposed that such wobbling of the lens is a source of post-saccadic oscillations (PSOs) seen in data recorded by eye trackers that estimate gaze direction from the location of the pupil. Since the size of the lens wobbles increases with accommodative effort, one would predict a similar increase of PSO-amplitude in data recorded with a pupil based eye tracker. In four experiments, we investigated the role of lens accommodation on PSOs in a video-based eye tracker. In Experiment 1, we replicated previous results showing that PSO-amplitudes increase at near viewing distances (large vergence angles), when the lens is highly accommodated. In Experiment 2a, we manipulated the accommodative state of the lens pharmacologically using eye drops at a fixed viewing distance and found, in contrast to Experiment 1, no significant difference in PSO-amplitude related to the accommodative state of the lens. Finally, in Experiment 2b, the effect of vergence angle was investigated by comparing PSO-amplitudes at near and far while maintaining a fixed lens accommodation. Despite the pharmacologically fixed degree of accommodation, PSO-amplitudes were systematically larger in the near condition. In summary, PSOs cannot exhaustively be explained by lens wobbles. Possible confounds related to pupil size and eye-camera angle are investigated in Experiments 3 and 4, and alternative mechanisms behind PSOs are probed in the discussion. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Two ATLAS trackers become one

    CERN Multimedia

    2006-01-01

    The ATLAS inner detector barrel comes one step closer to completion as the semiconductor tracker is merged with the transition radiation tracker. ATLAS collaborators prepare for the insertion of the semiconductor tracker (SCT, behind) into the transition radiation tracker (TRT, in front). Some had hoped it would fall on Valentine's Day. But despite the slight delay, Friday 17 February was lovingly embraced as 'Conception Day,' when dozens of physicists and engineers from the international collaboration gathered to witness the insertion of the ATLAS semiconductor tracker into the transition radiation tracker, a major milestone in the assembly of the experiment's inner detector. With just millimeters of room for error, the cylindrical trackers were slid into each other as inner detector integration coordinator Heinz Pernegger issued commands and scientists held out flashlights, lay on their backs and stood on ladders to take careful measurements. Each tracker is the result of about 10 years of international ...

  15. Visual Foraging With Fingers and Eye Gaze

    Directory of Open Access Journals (Sweden)

    Ómar I. Jóhannesson

    2016-03-01

    Full Text Available A popular model of the function of selective visual attention involves search where a single target is to be found among distractors. For many scenarios, a more realistic model involves search for multiple targets of various types, since natural tasks typically do not involve a single target. Here we present results from a novel multiple-target foraging paradigm. We compare finger foraging where observers cancel a set of predesignated targets by tapping them, to gaze foraging where observers cancel items by fixating them for 100 ms. During finger foraging, for most observers, there was a large difference between foraging based on a single feature, where observers switch easily between target types, and foraging based on a conjunction of features where observers tended to stick to one target type. The pattern was notably different during gaze foraging where these condition differences were smaller. Two conclusions follow: (a) The fact that a sizeable number of observers (in particular during gaze foraging) had little trouble switching between different target types raises challenges for many prominent theoretical accounts of visual attention and working memory. (b) While caveats must be noted for the comparison of gaze and finger foraging, the results suggest that selection mechanisms for gaze and pointing have different operational constraints.

  16. TRACKER

    CERN Multimedia

    L. Demaria

    2011-01-01

    Strip Tracker The Silicon Strip Tracker has maintained excellent operational performance during the 2011 data-taking period. The increase of instantaneous luminosity up to 10^33 cm^-2s^-1 did not introduce any new issues in the detector. The detector has collected high-quality physics data with an uptime greater than 98%. Sources of downtime have been identified and problems were properly addressed. Improved Front-End Driver (FED) firmware was deployed to increase the robustness of the readout against spurious extra frames coming from the detector. When a FED detects bad data, it goes into Out-Of-Sync (OOS) status, waits for a L1 resynchronisation command (resync) to clean up the culprit data and restarts. Resync commands are now sent automatically to the Strip Tracker when it signals OOS and, as a result, this source of downtime has been reduced significantly. The dead-time, caused by recoveries from OOS, accounts for less than 0.1%. Downtime was also found to be caused by a FED occasionally ge...

  17. TRACKER

    CERN Multimedia

    G. Dirkes

    2010-01-01

    The strip system has generally exhibited stable and high performance operation during the last six months of pp and heavy ion collisions. The up-time during pp collision from June onwards was 99.0% and during the first weeks of heavy-ion running we reached 99.7%. Most of the down-time during the proton runs came from Tracker DAQ problems. Spurious extra events from individual front-end channels caused ‘sync loss draining’ errors at the central DAQ system downstream of the Tracker FEDs. Once the problem was understood, new firmware that detects this error condition was installed on the FEDs. This has reduced the recovery procedure from this particular condition from a full reconfiguration requiring 170 s, to a simple re-synchronisation taking only ~1 s. We have also streamlined the instructions for the central DAQ shifters in order to minimise the time needed to decide the proper reaction to a given problem. The average down-time for problems triggered by the strip tracker DAQ is 395 s. Th...

  18. TRACKER

    CERN Multimedia

    D. Strom

    2011-01-01

    Strip Tracker Since the June CMS Week, the Silicon Strip Tracker has had another period of excellent detector operation with more than 97% system uptime. The focus on stable proton physics collection was fruitful, as CMS recorded greater than 5 fb–1 by the completion of the 2011 pp run. Following the November machine development and technical stop, the Strip Tracker now aims to provide the highest quality data during the heavy-ion run. The detector health, measured by the fraction of alive channels, is largely stable at around 97.8%. Recent failures include a TOB control ring, which now requires redundancy, and a TEC control ring with intermittent failures. These will be investigated during the Year-End Technical Stop. Critical services are very stable. The cooling system has a low total leak rate of less than 1 kg per day, and the power supply exchange rate is less than 1 unit per month. Two operational changes recently went into effect to optimise data-taking efficiency: (1) a tripped power su...

  19. TRACKER

    CERN Multimedia

    Frank Hartmann

    2012-01-01

      Strip Tracker In general, the Strip Tracker is operating smoothly with the current peak instantaneous luminosity beyond 6.5E33, high L1 rate and large pile-up. With several improvements in automatic DQM checks and an enhanced SMS and e-mail service system plus additional audio alarms, we have reduced the work-load of our TK DOC and stopped the calls made at the beginning of each fill. We successfully collected more than two million cosmic tracks in peak mode during inter-fill periods before June, fulfilling the request from the Tracker alignment group. Around 500k cosmic tracks were also collected at zero Tesla. All planned special measurements, namely DCU calibration and I-V scans, have been taken during the YETS and other technical stops. A peak-mode run, a delay run and two HV scans have also been taken during early collisions at the initial low-lumi runs as well as during the fill where CMS had a problem with the magnet. The largest source of downtime comes from TIB-2.8.1 a.k.a. FED 101, ...

  20. Predictive Gaze Cues and Personality Judgments: Should Eye Trust You?

    OpenAIRE

    Bayliss, Andrew P.; Tipper, Steven P.

    2006-01-01

    Although following another person's gaze is essential in fluent social interactions, the reflexive nature of this gaze-cuing effect means that gaze can be used to deceive. In a gaze-cuing procedure, participants were presented with several faces that looked to the left or right. Some faces always looked to the target (predictive-valid), some never looked to the target (predictive-invalid), and others looked toward and away from the target in equal proportions (nonpredictive). The standard gaz...

  1. Culture and Listeners' Gaze Responses to Stuttering

    Science.gov (United States)

    Zhang, Jianliang; Kalinowski, Joseph

    2012-01-01

    Background: It is frequently observed that listeners demonstrate gaze aversion to stuttering. This response may have profound social/communicative implications for both fluent and stuttering individuals. However, there is a lack of empirical examination of listeners' eye gaze responses to stuttering, and it is unclear whether cultural background…

  2. Proximity and Gaze Influences Facial Temperature: A Thermal Infrared Imaging Study.

    Directory of Open Access Journals (Sweden)

    Stephanos eIoannou

    2014-08-01

    Full Text Available Direct gaze and interpersonal proximity are known to lead to changes in psycho-physiology, behaviour and brain function. We know little, however, about subtler facial reactions such as rise and fall in temperature, which may be sensitive to contextual effects and functional in social interactions. Using thermal infrared imaging cameras, 18 female adult participants were filmed at two interpersonal distances (intimate and social) and two gaze conditions (averted and direct). The order of variation in distance was counterbalanced: half the participants experienced a female experimenter’s gaze at the social distance first before the intimate distance (a socially ‘normal’ order) and half experienced the intimate distance first and then the social distance (an odd social order). At both distances averted gaze always preceded direct gaze. We found strong correlations in thermal changes between six areas of the face (forehead, chin, cheeks, nose, maxillary and periorbital regions) for all experimental conditions and developed a composite measure of thermal shifts for all analyses. Interpersonal proximity led to a thermal rise, but only in the ‘normal’ social order. Direct gaze, compared to averted gaze, led to a thermal increase at both distances with a stronger effect at intimate distance, in both orders of distance variation. Participants reported direct gaze as more intrusive than averted gaze, especially at the intimate distance. These results demonstrate the powerful effects of another person’s gaze on psycho-physiological responses, even at a distance and independent of context.

  3. Direct Speaker Gaze Promotes Trust in Truth-Ambiguous Statements.

    Directory of Open Access Journals (Sweden)

    Helene Kreysa

    Full Text Available A speaker's gaze behaviour can provide perceivers with a multitude of cues which are relevant for communication, thus constituting an important non-verbal interaction channel. The present study investigated whether direct eye gaze of a speaker affects the likelihood of listeners believing truth-ambiguous statements. Participants were presented with videos in which a speaker produced such statements with either direct or averted gaze. The statements were selected through a rating study to ensure that participants were unlikely to know a-priori whether they were true or not (e.g., "sniffer dogs cannot smell the difference between identical twins"). Participants indicated in a forced-choice task whether or not they believed each statement. We found that participants were more likely to believe statements by a speaker looking at them directly, compared to a speaker with averted gaze. Moreover, when participants disagreed with a statement, they were slower to do so when the statement was uttered with direct (compared to averted) gaze, suggesting that the process of rejecting a statement as untrue may be inhibited when that statement is accompanied by direct gaze.

  4. Look together : Using gaze for assisting co-located collaborative search

    NARCIS (Netherlands)

    Zhang, Y.; Pfeuffer, Ken; Chong, Ming Ki; Alexander, Jason; Bulling, Andreas; Gellersen, Hans

    2017-01-01

    Gaze information provides indication of users focus which complements remote collaboration tasks, as distant users can see their partner’s focus. In this paper, we apply gaze for co-located collaboration, where users’ gaze locations are presented on the same display, to help collaboration between

  5. Orienting of attention via observed eye gaze is head-centred.

    Science.gov (United States)

    Bayliss, Andrew P; di Pellegrino, Giuseppe; Tipper, Steven P

    2004-11-01

    Observing averted eye gaze results in the automatic allocation of attention to the gazed-at location. The role of the orientation of the face that produces the gaze cue was investigated. The eyes in the face could look left or right in a head-centred frame, but the face itself could be oriented 90 degrees clockwise or anticlockwise such that the eyes were gazing up or down. Significant cueing effects to targets presented to the left or right of the screen were found in these head orientation conditions. This suggests that attention was directed to the side to which the eyes would have been looking towards, had the face been presented upright. This finding provides evidence that head orientation can affect gaze following, even when the head orientation alone is not a social cue. It also shows that the mechanism responsible for the allocation of attention following a gaze cue can be influenced by intrinsic object-based (i.e. head-centred) properties of the task-irrelevant cue.

  6. Face age modulates gaze following in young adults

    OpenAIRE

    Francesca Ciardo; Barbara F. M. Marino; Rossana Actis-Grosso; Angela Rossetti; Paola Ricciardelli

    2014-01-01

    Gaze-following behaviour is considered crucial for social interactions which are influenced by social similarity. We investigated whether the degree of similarity, as indicated by the perceived age of another person, can modulate gaze following. Participants of three different age-groups (18–25; 35–45; over 65) performed an eye movement (a saccade) towards an instructed target while ignoring the gaze-shift of distracters of different age-ranges (6–10; 18–25; 35–45; over 70). The results show ...

  7. The Effectiveness of Gaze-Contingent Control in Computer Games.

    Science.gov (United States)

    Orlov, Paul A; Apraksin, Nikolay

    2015-01-01

    Eye-tracking technology and gaze-contingent control in human-computer interaction have become an objective reality. This article reports on a series of eye-tracking experiments, in which we concentrated on one aspect of gaze-contingent interaction: its effectiveness compared with mouse-based control in a computer strategy game. We propose a measure for evaluating the effectiveness of interaction based on "the time of recognition" of the game unit. In this article, we use this measure to compare gaze- and mouse-contingent systems, and we present the analysis of the differences as a function of the number of game units. Our results indicate that performance of gaze-contingent interaction is typically higher than mouse manipulation in a visual searching task. When tested on 60 subjects, the results showed that the effectiveness of gaze-contingent systems was over 1.5 times higher. In addition, we found that eye behavior stays quite stable with or without mouse interaction. © The Author(s) 2015.

  8. Estimating the gaze of a virtuality human.

    Science.gov (United States)

    Roberts, David J; Rae, John; Duckworth, Tobias W; Moore, Carl M; Aspin, Rob

    2013-04-01

    The aim of our experiment is to determine if eye-gaze can be estimated from a virtuality human: to within the accuracies that underpin social interaction; and reliably across gaze poses and camera arrangements likely in every day settings. The scene is set by explaining why Immersive Virtuality Telepresence has the potential to meet the grand challenge of faithfully communicating both the appearance and the focus of attention of a remote human participant within a shared 3D computer-supported context. Within the experiment n=22 participants rotated static 3D virtuality humans, reconstructed from surround images, until they felt most looked at. The dependent variable was absolute angular error, which was compared to that underpinning social gaze behaviour in the natural world. Independent variables were 1) relative orientations of eye, head and body of captured subject; and 2) subset of cameras used to texture the form. Analysis looked for statistical and practical significance and qualitative corroborating evidence. The analysed results tell us much about the importance and detail of the relationship between gaze pose, method of video based reconstruction, and camera arrangement. They tell us that virtuality can reproduce gaze to an accuracy useful in social interaction, but with the adopted method of Video Based Reconstruction, this is highly dependent on combination of gaze pose and camera arrangement. This suggests changes in the VBR approach in order to allow more flexible camera arrangements. The work is of interest to those wanting to support expressive meetings that are both socially and spatially situated, and particular those using or building Immersive Virtuality Telepresence to accomplish this. It is also of relevance to the use of virtuality humans in applications ranging from the study of human interactions to gaming and the crossing of the stage line in films and TV.

  9. The Gaze as constituent and annihilator

    Directory of Open Access Journals (Sweden)

    Mats Carlsson

    2012-11-01

    Full Text Available This article aims to join the contemporary effort to promote a psychoanalytic renaissance within cinema studies, post Post-Theory. In trying to shake off the burden of the 1970s film theory's distortion of the Lacanian Gaze, rejuvenating it with the strength of the Real and fusing it with Freudian thoughts on the uncanny, hopefully this new dawn can be reached. I aspire to conceptualize the Gaze in a straightforward manner. This in order to obtain an instrument for the identification of certain strategies within the filmic realm aimed at depicting the subjective destabilizing of diegetic characters as well as thwarting techniques directed at the spectorial subject. In setting this capricious Gaze against the uncanny phenomena described by Freud, we find that these two ideas easily intertwine into a draft description of a powerful, potentially reconstitutive force worth being highlighted.

  10. In the presence of conflicting gaze cues, fearful expression and eye-size guide attention.

    Science.gov (United States)

    Carlson, Joshua M; Aday, Jacob

    2017-10-19

    Humans are social beings that often interact in multi-individual environments. As such, we are frequently confronted with nonverbal social signals, including eye-gaze direction, from multiple individuals. Yet, the factors that allow for the prioritisation of certain gaze cues over others are poorly understood. Using a modified conflicting gaze paradigm, we tested the hypothesis that fearful gaze would be favoured amongst competing gaze cues. We further hypothesised that this effect is related to the increased sclera exposure, which is characteristic of fearful expressions. Across three experiments, we found that fearful, but not happy, gaze guides observers' attention over competing non-emotional gaze. The guidance of attention by fearful gaze appears to be linked to increased sclera exposure. However, differences in sclera exposure do not prioritise competing gazes of other types. Thus, fearful gaze guides attention among competing cues and this effect is facilitated by increased sclera exposure - but increased sclera exposure per se does not guide attention. The prioritisation of fearful gaze over non-emotional gaze likely represents an adaptive means of selectively attending to survival-relevant spatial locations.

  11. EYE GAZE TRACKING

    DEFF Research Database (Denmark)

    2017-01-01

    This invention relates to a method of performing eye gaze tracking of at least one eye of a user, by determining the position of the center of the eye, said method comprising the steps of: detecting the position of at least three reflections on said eye, transforming said positions to a normalized coordinate system spanning a frame of reference, wherein said transformation is performed based on a bilinear transformation or a non-linear transformation e.g. a möbius transformation or a homographic transformation, detecting the position of said center of the eye relative to the position of said reflections and transforming this position to said normalized coordinate system, tracking the eye gaze by tracking the movement of said eye in said normalized coordinate system. Thereby calibration of a camera, such as knowledge of the exact position and zoom level of the camera, is avoided...
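
    As a rough illustration of the normalization idea summarized above, and not the patented method itself, the sketch below anchors a coordinate frame to three corneal reflections with a simple affine transform and expresses the pupil centre in that frame; the reference points and function name are assumptions made for the example.

        import numpy as np

        def normalize_pupil(glints_px, pupil_px,
                            reference=((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))):
            """Express the pupil centre in a coordinate frame anchored to three glints.

            glints_px : three (x, y) corneal-reflection positions in image pixels.
            pupil_px  : (x, y) pupil-centre position in image pixels.
            reference : where the three glints are placed in the normalized frame.
            """
            # Solve for the affine map that sends the detected glints to the reference points.
            src = np.hstack([np.asarray(glints_px, dtype=float), np.ones((3, 1))])  # 3x3
            dst = np.asarray(reference, dtype=float)                                # 3x2
            affine = np.linalg.solve(src, dst)
            # Apply the same map to the pupil centre.
            return np.hstack([np.asarray(pupil_px, dtype=float), [1.0]]) @ affine

        # Example: a pupil roughly between three detected reflections.
        print(normalize_pupil([(100, 120), (160, 118), (102, 170)], (128, 140)))

    Because the frame is defined by the reflections themselves, the normalized pupil position is largely insensitive to where the camera sits, which is what lets this family of methods avoid explicit camera calibration.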

  12. Eye gazing direction inspection based on image processing technique

    Science.gov (United States)

    Hao, Qun; Song, Yong

    2005-02-01

    According to research results in neural biology, the human eye obtains high resolution only at the center of the field of view. In our research on a Virtual Reality helmet, we aim to detect the gazing direction of the eyes in real time and feed it back to the control system to improve the resolution of the image at the center of the field of view. With current display instruments, this method can balance the field of view of the virtual scene against resolution and greatly improve the immersion of the virtual system. Therefore, detecting the gazing direction of the eyes rapidly and accurately is the basis for realizing the design scheme of this novel VR helmet. This paper first introduces the conventional method of gaze direction detection based on the Purkinje spot. To overcome the disadvantages of the Purkinje-spot method, the paper then proposes a method based on image processing to detect and determine the gazing direction. The locations of the pupils and the shapes of the eye sockets change with gazing direction; by analyzing these changes in the eye images captured by the cameras, the gazing direction can be determined. Experiments have been carried out to validate the efficiency of this method by analyzing the images. The algorithm detects gazing direction directly from normal eye images and eliminates the need for special hardware. Experimental results show that the method is easy to implement and has high precision.
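
    The abstract does not spell out the individual image-processing steps; purely as an illustrative sketch rather than the paper's algorithm, a dark-pupil centre can be estimated by thresholding a grayscale eye image and taking the centroid of the darkest pixels (the quantile threshold and function name below are assumptions).

        import numpy as np

        def pupil_centre(gray_eye_image, dark_fraction=0.05):
            """Estimate the pupil centre as the centroid of the darkest pixels.

            gray_eye_image : 2-D array of grayscale intensities for a cropped eye region.
            dark_fraction  : fraction of the intensity distribution treated as pupil.
            """
            threshold = np.quantile(gray_eye_image, dark_fraction)
            ys, xs = np.nonzero(gray_eye_image <= threshold)
            if xs.size == 0:
                return None  # defensive check for an empty crop
            # A fuller pipeline would also reject blinks and check the blob's shape,
            # and could combine the pupil position with the eye-socket outline.
            return float(xs.mean()), float(ys.mean())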

  13. Latvijas gaze

    International Nuclear Information System (INIS)

    1994-04-01

    A collection of photocopies of materials (such as overheads etc.) used at a seminar (organized by the Board of Directors of the company designated 'Latvijas Gaze' in connection with The National Oil and Gas Company of Denmark, DONG) comprising an analysis of training needs with regard to marketing of gas technology and consultancy to countries in Europe, especially with regard to Latvia. (AB)

  14. Robot Faces that Follow Gaze Facilitate Attentional Engagement and Increase Their Likeability

    Science.gov (United States)

    Willemse, Cesco; Marchesi, Serena; Wykowska, Agnieszka

    2018-01-01

    Gaze behavior of humanoid robots is an efficient mechanism for cueing our spatial orienting, but less is known about the cognitive–affective consequences of robots responding to human directional cues. Here, we examined how the extent to which a humanoid robot (iCub) avatar directed its gaze to the same objects as our participants affected engagement with the robot, subsequent gaze-cueing, and subjective ratings of the robot’s characteristic traits. In a gaze-contingent eyetracking task, participants were asked to indicate a preference for one of two objects with their gaze while an iCub avatar was presented between the object photographs. In one condition, the iCub then shifted its gaze toward the object chosen by a participant in 80% of the trials (joint condition) and in the other condition it looked at the opposite object 80% of the time (disjoint condition). Based on the literature in human–human social cognition, we took the speed with which the participants looked back at the robot as a measure of facilitated reorienting and robot-preference, and found these return saccade onset times to be quicker in the joint condition than in the disjoint condition. As indicated by results from a subsequent gaze-cueing tasks, the gaze-following behavior of the robot had little effect on how our participants responded to gaze cues. Nevertheless, subjective reports suggested that our participants preferred the iCub following participants’ gaze to the one with a disjoint attention behavior, rated it as more human-like and as more likeable. Taken together, our findings show a preference for robots who follow our gaze. Importantly, such subtle differences in gaze behavior are sufficient to influence our perception of humanoid agents, which clearly provides hints about the design of behavioral characteristics of humanoid robots in more naturalistic settings. PMID:29459842

  15. Robot Faces that Follow Gaze Facilitate Attentional Engagement and Increase Their Likeability

    Directory of Open Access Journals (Sweden)

    Cesco Willemse

    2018-02-01

    Full Text Available Gaze behavior of humanoid robots is an efficient mechanism for cueing our spatial orienting, but less is known about the cognitive–affective consequences of robots responding to human directional cues. Here, we examined how the extent to which a humanoid robot (iCub) avatar directed its gaze to the same objects as our participants affected engagement with the robot, subsequent gaze-cueing, and subjective ratings of the robot’s characteristic traits. In a gaze-contingent eyetracking task, participants were asked to indicate a preference for one of two objects with their gaze while an iCub avatar was presented between the object photographs. In one condition, the iCub then shifted its gaze toward the object chosen by a participant in 80% of the trials (joint condition) and in the other condition it looked at the opposite object 80% of the time (disjoint condition). Based on the literature in human–human social cognition, we took the speed with which the participants looked back at the robot as a measure of facilitated reorienting and robot-preference, and found these return saccade onset times to be quicker in the joint condition than in the disjoint condition. As indicated by results from a subsequent gaze-cueing tasks, the gaze-following behavior of the robot had little effect on how our participants responded to gaze cues. Nevertheless, subjective reports suggested that our participants preferred the iCub following participants’ gaze to the one with a disjoint attention behavior, rated it as more human-like and as more likeable. Taken together, our findings show a preference for robots who follow our gaze. Importantly, such subtle differences in gaze behavior are sufficient to influence our perception of humanoid agents, which clearly provides hints about the design of behavioral characteristics of humanoid robots in more naturalistic settings.

  16. Robot Faces that Follow Gaze Facilitate Attentional Engagement and Increase Their Likeability.

    Science.gov (United States)

    Willemse, Cesco; Marchesi, Serena; Wykowska, Agnieszka

    2018-01-01

    Gaze behavior of humanoid robots is an efficient mechanism for cueing our spatial orienting, but less is known about the cognitive-affective consequences of robots responding to human directional cues. Here, we examined how the extent to which a humanoid robot (iCub) avatar directed its gaze to the same objects as our participants affected engagement with the robot, subsequent gaze-cueing, and subjective ratings of the robot's characteristic traits. In a gaze-contingent eyetracking task, participants were asked to indicate a preference for one of two objects with their gaze while an iCub avatar was presented between the object photographs. In one condition, the iCub then shifted its gaze toward the object chosen by a participant in 80% of the trials (joint condition) and in the other condition it looked at the opposite object 80% of the time (disjoint condition). Based on the literature in human-human social cognition, we took the speed with which the participants looked back at the robot as a measure of facilitated reorienting and robot-preference, and found these return saccade onset times to be quicker in the joint condition than in the disjoint condition. As indicated by results from a subsequent gaze-cueing tasks, the gaze-following behavior of the robot had little effect on how our participants responded to gaze cues. Nevertheless, subjective reports suggested that our participants preferred the iCub following participants' gaze to the one with a disjoint attention behavior, rated it as more human-like and as more likeable. Taken together, our findings show a preference for robots who follow our gaze. Importantly, such subtle differences in gaze behavior are sufficient to influence our perception of humanoid agents, which clearly provides hints about the design of behavioral characteristics of humanoid robots in more naturalistic settings.

  17. "The Gaze Heuristic:" Biography of an Adaptively Rational Decision Process.

    Science.gov (United States)

    Hamlin, Robert P

    2017-04-01

    This article is a case study that describes the natural and human history of the gaze heuristic. The gaze heuristic is an interception heuristic that utilizes a single input (deviation from a constant angle of approach) repeatedly as a task is performed. Its architecture, advantages, and limitations are described in detail. A history of the gaze heuristic is then presented. In natural history, the gaze heuristic is the only known technique used by predators to intercept prey. In human history the gaze heuristic was discovered accidentally by Royal Air Force (RAF) fighter command just prior to World War II. As it was never discovered by the Luftwaffe, the technique conferred a decisive advantage upon the RAF throughout the war. After the end of the war in America, German technology was combined with the British heuristic to create the Sidewinder AIM9 missile, the most successful autonomous weapon ever built. There are no plans to withdraw it or replace its guiding gaze heuristic. The case study demonstrates that the gaze heuristic is a specific heuristic type that takes a single best input at the best time (take-the-best). Its use is an adaptively rational response to specific, rapidly evolving decision environments that has allowed those animals/humans/machines who use it to survive, prosper, and multiply relative to those who do not. Copyright © 2017 Cognitive Science Society, Inc.
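
    The heuristic described above is simple enough to state in a few lines; the following is a generic constant-bearing sketch in the spirit of that description, with the gain and interface chosen for illustration rather than taken from the article.

        import math

        def gaze_heuristic_turn(agent_pos, agent_heading, target_pos, previous_bearing, gain=1.5):
            """One step of a constant-bearing interception controller.

            The pursuer never predicts the target's trajectory; it only watches the
            bearing to the target (the 'gaze' angle) and turns so that this angle
            stays constant, which makes the two paths converge.

            Returns (turn_command, bearing); the caller feeds `bearing` back in as
            `previous_bearing` on the next step.
            """
            dx = target_pos[0] - agent_pos[0]
            dy = target_pos[1] - agent_pos[1]
            bearing = math.atan2(dy, dx) - agent_heading   # target direction relative to heading
            if previous_bearing is None:
                return 0.0, bearing                        # first observation: just record it
            drift = bearing - previous_bearing             # deviation from a constant angle of approach
            return gain * drift, bearing                   # turn so as to null the drift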

  18. Experimental test of spatial updating models for monkey eye-head gaze shifts.

    Directory of Open Access Journals (Sweden)

    Tom J Van Grootel

    Full Text Available How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness. We presented the second target flash either before (static) or during (dynamic) the first gaze displacement. In the dynamic case the brief visual flash induced a small retinal streak of up to about 20 deg at an unpredictable moment and retinal location during the eye-head gaze shift, which provides serious challenges for the gaze-control system. However, for both stimulus conditions, monkeys localized the flashed targets with accurate gaze shifts, which rules out several models of visuomotor control. First, these findings exclude the possibility that gaze-shift programming relies on retinal inputs only. Instead, they support the notion that accurate eye-head motor feedback updates the gaze-saccade coordinates. Second, in dynamic trials the visuomotor system cannot rely on the coordinates of the planned first eye-head saccade either, which rules out remapping on the basis of a predictive corollary gaze-displacement signal. Finally, because gaze-related head movements were also goal-directed, requiring continuous access to eye-in-head position, we propose that our results best support a dynamic feedback scheme for spatial updating in which visuomotor control incorporates accurate signals about instantaneous eye- and head positions rather than relative eye- and head displacements.

  19. Gaze Behavior of Children with ASD toward Pictures of Facial Expressions.

    Science.gov (United States)

    Matsuda, Soichiro; Minagawa, Yasuyo; Yamamoto, Junichi

    2015-01-01

    Atypical gaze behavior in response to a face has been well documented in individuals with autism spectrum disorders (ASDs). Children with ASD appear to differ from typically developing (TD) children in gaze behavior for spoken and dynamic face stimuli but not for nonspeaking, static face stimuli. Furthermore, children with ASD and TD children show a difference in their gaze behavior for certain expressions. However, few studies have examined the relationship between autism severity and gaze behavior toward certain facial expressions. The present study replicated and extended previous studies by examining gaze behavior towards pictures of facial expressions. We presented ASD and TD children with pictures of surprised, happy, neutral, angry, and sad facial expressions. Autism severity was assessed using the Childhood Autism Rating Scale (CARS). The results showed that there was no group difference in gaze behavior when looking at pictures of facial expressions. Conversely, the children with ASD who had more severe autistic symptomatology had a tendency to gaze at angry facial expressions for a shorter duration in comparison to other facial expressions. These findings suggest that autism severity should be considered when examining atypical responses to certain facial expressions.

  20. Anxiety symptoms and children's eye gaze during fear learning.

    Science.gov (United States)

    Michalska, Kalina J; Machlin, Laura; Moroney, Elizabeth; Lowet, Daniel S; Hettema, John M; Roberson-Nay, Roxann; Averbeck, Bruno B; Brotman, Melissa A; Nelson, Eric E; Leibenluft, Ellen; Pine, Daniel S

    2017-11-01

    The eye region of the face is particularly relevant for decoding threat-related signals, such as fear. However, it is unclear if gaze patterns to the eyes can be influenced by fear learning. Previous studies examining gaze patterns in adults find an association between anxiety and eye gaze avoidance, although no studies to date examine how associations between anxiety symptoms and eye-viewing patterns manifest in children. The current study examined the effects of learning and trait anxiety on eye gaze using a face-based fear conditioning task developed for use in children. Participants were 82 youth from a general population sample of twins (aged 9-13 years), exhibiting a range of anxiety symptoms. Participants underwent a fear conditioning paradigm where the conditioned stimuli (CS+) were two neutral faces, one of which was randomly selected to be paired with an aversive scream. Eye tracking, physiological, and subjective data were acquired. Children and parents reported their child's anxiety using the Screen for Child Anxiety Related Emotional Disorders. Conditioning influenced eye gaze patterns in that children looked longer and more frequently to the eye region of the CS+ than CS- face; this effect was present only during fear acquisition, not at baseline or extinction. Furthermore, consistent with past work in adults, anxiety symptoms were associated with eye gaze avoidance. Finally, gaze duration to the eye region mediated the effect of anxious traits on self-reported fear during acquisition. Anxiety symptoms in children relate to face-viewing strategies deployed in the context of a fear learning experiment. This relationship may inform attempts to understand the relationship between pediatric anxiety symptoms and learning. © 2017 Association for Child and Adolescent Mental Health.

  1. The impact of visual gaze direction on auditory object tracking.

    Science.gov (United States)

    Pomper, Ulrich; Chait, Maria

    2017-07-05

    Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g., when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention while participants detected targets presented from one of three loudspeakers. We observed increased response times when gaze was directed away from the locus of auditory attention. Further, we found an increase in occipital alpha-band power contralateral to the direction of gaze, indicative of a suppression of distracting input. Finally, this condition also led to stronger central theta-band power, which correlated with the observed effect in response times, indicative of differences in top-down processing. Our data suggest that a misalignment between gaze and auditory attention both reduces behavioural performance and modulates underlying neural processes. The involvement of central theta-band and occipital alpha-band effects is in line with compensatory neural mechanisms such as increased cognitive control and the suppression of task-irrelevant inputs.

  2. INNER TRACKER

    CERN Multimedia

    K. Gill

    During the winter shutdown several parts of the Tracker system are undergoing maintenance, revision or upgrade. The main items are the revision of the strips and pixels cooling plants, removal and maintenance of FPIX, sealing of Tracker patch-panels and the bulkhead, integration of strips and pixels DCS, and further development of the DAQ, Online and commissioning software and firmware. The revision of the cooling system involves the complete replacement of the tanks, distribution lines, valves and manifolds on the SS1 and SS2 strip tracker (182 circuits) and pixels (36 circuits) cooling plants. The objectives are to eliminate the large leaks experienced during 2008 operations and to assure the long-term reliability of the cooling systems. Additional instrumentation is being added to provide more detailed monitoring of the performance of the cooling system. This work is proceeding smoothly under close supervision. Procurements are almost completed and the quality of delivered parts and the subsequent assembl...

  3. Intermediate view synthesis for eye-gazing

    Science.gov (United States)

    Baek, Eu-Ttuem; Ho, Yo-Sung

    2015-01-01

    Nonverbal communication, also known as body language, is an important form of communication. Nonverbal behaviors such as posture, eye contact, and gestures send strong messages. In regard to nonverbal communication, eye contact is one of the most important forms that an individual can use. However, lack of eye contact occurs when we use a video conferencing system. The disparity between the locations of the eyes and the camera gets in the way of eye contact. The lack of eye gaze can give an unapproachable and unpleasant feeling. In this paper, we propose an eye-gaze correction method for video conferencing. We use two cameras installed at the top and the bottom of the television. The two captured images are rendered with 2D warping at a virtual position. We apply view morphing to the detected face, and synthesize the face and the warped image. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.

  4. Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions.

    Science.gov (United States)

    Ho, Simon; Foulsham, Tom; Kingstone, Alan

    2015-01-01

    Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest.

  5. Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions.

    Directory of Open Access Journals (Sweden)

    Simon Ho

    Full Text Available Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest.
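
    The two records above (the same study, listed by two providers) rest on cross-correlating the gaze and speech signals of the two interaction partners. A minimal sketch of such an analysis is given below, assuming both signals have already been coded as binary time series on a common sampling grid; the function name and lag convention are illustrative.

    import numpy as np

    def gaze_speech_xcorr(gaze_at_partner, speaking, max_lag):
        """Pearson correlation between a binary gaze signal (1 = looking at the
        partner) and a binary speech signal (1 = speaking) over a range of lags,
        in samples. Positive lags mean the speech signal leads the gaze signal."""
        g = np.asarray(gaze_at_partner, dtype=float)
        s = np.asarray(speaking, dtype=float)
        lags = np.arange(-max_lag, max_lag + 1)
        corrs = []
        for lag in lags:
            if lag >= 0:
                a, b = g[lag:], s[:len(s) - lag]   # gaze shifted later in time
            else:
                a, b = g[:len(g) + lag], s[-lag:]  # speech shifted later in time
            corrs.append(np.corrcoef(a, b)[0, 1])
        return lags, np.array(corrs)

    Peaks at positive or negative lags then indicate whether gaze changes tend to precede or follow turn boundaries in the speech signal.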

  6. Gaze Behavior, Believability, Likability and the iCat

    NARCIS (Netherlands)

    Poel, Mannes; Heylen, Dirk K.J.; Meulemans, M.; Nijholt, Antinus; Stock, O.; Nishida, T.

    2007-01-01

    The iCat is a user-interface robot with the ability to express a range of emotions through its facial features. This paper summarizes our research whether we can increase the believability and likability of the iCat for its human partners through the application of gaze behaviour. Gaze behaviour

  7. Gaze Behavior, Believability, Likability and the iCat

    NARCIS (Netherlands)

    Nijholt, Antinus; Poel, Mannes; Heylen, Dirk K.J.; Stock, O.; Nishida, T.; Meulemans, M.; van Bremen, A.

    2009-01-01

    The iCat is a user-interface robot with the ability to express a range of emotions through its facial features. This paper summarizes our research whether we can increase the believability and likability of the iCat for its human partners through the application of gaze behaviour. Gaze behaviour

  8. "Wolves (Canis lupus) and dogs (Canis familiaris) differ in following human gaze into distant space but respond similar to their packmates' gaze": Correction to Werhahn et al. (2016).

    Science.gov (United States)

    2017-02-01

    Reports an error in "Wolves ( Canis lupus ) and dogs ( Canis familiaris ) differ in following human gaze into distant space but respond similar to their packmates' gaze" by Geraldine Werhahn, Zsófia Virányi, Gabriela Barrera, Andrea Sommese and Friederike Range ( Journal of Comparative Psychology , 2016[Aug], Vol 130[3], 288-298). In the article, the affiliations for the second and fifth authors should be Wolf Science Center, Ernstbrunn, Austria, and Comparative Cognition, Messerli Research Institute, University of Veterinary Medicine Vienna/ Medical University of Vienna/University of Vienna. The online version of this article has been corrected. (The following abstract of the original article appeared in record 2016-26311-001.) Gaze following into distant space is defined as visual co-orientation with another individual's head direction allowing the gaze follower to gain information on its environment. Human and nonhuman animals share this basic gaze following behavior, suggested to rely on a simple reflexive mechanism and believed to be an important prerequisite for complex forms of social cognition. Pet dogs differ from other species in that they follow only communicative human gaze clearly addressed to them. However, in an earlier experiment we showed that wolves follow human gaze into distant space. Here we set out to investigate whether domestication has affected gaze following in dogs by comparing pack-living dogs and wolves raised and kept under the same conditions. In Study 1 we found that in contrast to the wolves, these dogs did not follow minimally communicative human gaze into distant space in the same test paradigm. In the observational Study 2 we found that pack-living dogs and wolves, similarly vigilant to environmental stimuli, follow the spontaneous gaze of their conspecifics similarly often. Our findings suggest that domestication did not affect the gaze following ability of dogs itself. The results raise hypotheses about which other dog skills

  9. Early Left Parietal Activity Elicited by Direct Gaze: A High-Density EEG Study

    Science.gov (United States)

    Burra, Nicolas; Kerzel, Dirk; George, Nathalie

    2016-01-01

    Gaze is one of the most important cues for human communication and social interaction. In particular, gaze contact is the most primary form of social contact and it is thought to capture attention. A very early-differentiated brain response to direct versus averted gaze has been hypothesized. Here, we used high-density electroencephalography to test this hypothesis. Topographical analysis allowed us to uncover a very early topographic modulation (40-80 ms) of event-related responses to faces with direct as compared to averted gaze. This modulation was obtained only in the condition where intact broadband faces, as opposed to high-pass or low-pass filtered faces, were presented. Source estimation indicated that this early modulation involved the posterior parietal region, encompassing the left precuneus and inferior parietal lobule. This supports the idea that it reflected an early orienting response to direct versus averted gaze. Accordingly, in a follow-up behavioural experiment, we found faster response times to the direct gaze than to the averted gaze broadband faces. In addition, classical evoked potential analysis showed that the N170 peak amplitude was larger for averted gaze than for direct gaze. Taken together, these results suggest that direct gaze may be detected at a very early processing stage, involving a parallel route to the ventral occipito-temporal route of face perceptual analysis. PMID:27880776

  10. Gaze behaviour during space perception and spatial decision making.

    Science.gov (United States)

    Wiener, Jan M; Hölscher, Christoph; Büchner, Simon; Konieczny, Lars

    2012-11-01

    A series of four experiments investigating gaze behavior and decision making in the context of wayfinding is reported. Participants were presented with screenshots of choice points taken in large virtual environments. Each screenshot depicted alternative path options. In Experiment 1, participants had to decide between them to find an object hidden in the environment. In Experiment 2, participants were first informed about which path option to take as if following a guided route. Subsequently, they were presented with the same images in random order and had to indicate which path option they chose during initial exposure. In Experiment 1, we demonstrate (1) that participants have a tendency to choose the path option that featured the longer line of sight, and (2) a robust gaze bias towards the eventually chosen path option. In Experiment 2, systematic differences in gaze behavior towards the alternative path options between encoding and decoding were observed. Based on data from Experiments 1 and 2 and two control experiments ensuring that fixation patterns were specific to the spatial tasks, we develop a tentative model of gaze behavior during wayfinding decision making, suggesting that particular attention was paid to image areas depicting changes in the local geometry of the environments such as corners, openings, and occlusions. Together, the results suggest that gaze during a wayfinding task is directed toward, and can be predicted by, a subset of environmental features and that gaze bias effects are a general phenomenon of visual decision making.

  11. The Politics of the Gaze Foucault, Lacan and Zizek

    Directory of Open Access Journals (Sweden)

    Henry Krips

    2010-03-01

    Full Text Available Joan Copjec accuses orthodox film theory of misrepresenting the Lacanian gaze by assimilating it to the Foucauldian panopticon (Copjec 1994: 18-19). Although Copjec is correct that orthodox film theory misrepresents the Lacanian gaze, she, in turn, misrepresents Foucault by choosing to focus exclusively upon those aspects of his work on the panopticon that have been taken up by orthodox film theory (Copjec 1994: 4). In so doing, I argue, Copjec misses key parallels between the Lacanian and Foucauldian concepts of the gaze. More than a narrow academic dispute about how to read Foucault and Lacan, this debate has wider political significance. In particular, using Slavoj Zizek's work, I show that a correct account of the panoptic gaze leads us to rethink the question of how to oppose modern techniques of surveillance.

  12. A testimony to Muzil: Hervé Guibert, Foucault, and the medical gaze.

    Science.gov (United States)

    Rendell, Joanne

    2004-01-01

    Testimony to Muzil: Hervé Guibert, Michel Foucault, and the "Medical Gaze" examines the fictional/autobiographical AIDS writings of the French writer Hervé Guibert. Locating Guibert's writings alongside the work of his friend Michel Foucault, the article explores how they echo Foucault's evolving notions of the "medical gaze." The article also explores how Guibert's narrators and Guibert himself (as writer) resist and challenge the medical gaze; a gaze which, particularly in the era of AIDS, has subjected, objectified, and even sometimes punished the body of the gay man. It is argued that these resistances to the gaze offer a literary extension to Foucault's later work on power and resistance strategies.

  13. Human-like object tracking and gaze estimation with PKD android.

    Science.gov (United States)

    Wijayasinghe, Indika B; Miller, Haylie L; Das, Sumit K; Bugnariu, Nicoleta L; Popa, Dan O

    2016-05-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans.

  14. Human-like object tracking and gaze estimation with PKD android

    Science.gov (United States)

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2016-05-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans.
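
    The two PKD records above describe converting eye-in-head tracking data and head orientation into a gaze direction in world coordinates. A minimal geometric sketch of that conversion is shown below; it assumes the head pose is available as a rotation matrix plus a position vector (as a motion-capture system would provide), and all names are illustrative rather than the authors' implementation.

    import numpy as np

    def gaze_ray_in_world(head_rotation, head_position, eye_in_head_dir):
        """Rotate a unit gaze direction expressed in head coordinates (from the
        eye tracker) into world coordinates using the head orientation (a 3x3
        rotation matrix), and return the gaze ray as (origin, unit direction)."""
        direction = np.asarray(head_rotation, dtype=float) @ np.asarray(eye_in_head_dir, dtype=float)
        direction = direction / np.linalg.norm(direction)
        origin = np.asarray(head_position, dtype=float)
        return origin, direction

    Intersecting this ray with the known plane of the display (or with tracked objects in the scene) then yields the point of regard used to compare robot and human gaze.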

  15. Utilizing Gaze Behavior for Inferring Task Transitions Using Abstract Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Daniel Fernando Tello Gamarra

    2016-12-01

    Full Text Available We demonstrate an improved method for utilizing observed gaze behavior and show that it is useful in inferring hand movement intent during goal-directed tasks. The task dynamics and the relationship between hand and gaze behavior are learned using an Abstract Hidden Markov Model (AHMM). We show that the predicted hand movement transitions occur consistently earlier in AHMM models with gaze than in those models that do not include gaze observations.
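
    The record above infers task transitions from gaze with an Abstract Hidden Markov Model. The full AHMM is hierarchical, but the core filtering step can be illustrated with a flat HMM: recursively update a belief over hidden task states from discretized gaze observations. The matrices and names below are illustrative placeholders, not the authors' model.

    import numpy as np

    def forward_filter(initial_probs, transition, emission, observations):
        """Forward (recursive Bayesian) filtering of hidden task states.
        initial_probs: (K,) prior over states;
        transition: (K, K) with transition[i, j] = P(state j | state i);
        emission: (K, M) with emission[k, m] = P(observation m | state k);
        observations: iterable of integer observation indices."""
        transition = np.asarray(transition, dtype=float)
        emission = np.asarray(emission, dtype=float)
        belief = np.asarray(initial_probs, dtype=float)
        history = []
        for obs in observations:
            belief = (transition.T @ belief) * emission[:, obs]  # predict, then reweight by likelihood
            belief = belief / belief.sum()                       # normalize to a probability vector
            history.append(belief.copy())
        return np.array(history)

    A task transition would then be flagged whenever the most probable state in the filtered belief changes; including gaze-derived observations should move that change earlier, as the record reports.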

  16. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain-machine interfaces

    Science.gov (United States)

    Abbott, W. W.; Faisal, A. A.

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as a control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a 0.5-1 degree of visual angle resolution. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits/s, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark: the control of the video arcade game 'Pong'.
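
    The record above estimates gaze in 3D from a binocular tracker. One common way to obtain the 3D gaze point is to triangulate it as the midpoint of the shortest segment between the left-eye and right-eye gaze rays; the sketch below follows that assumption and is not necessarily the authors' exact method.

    import numpy as np

    def binocular_gaze_point(o_left, d_left, o_right, d_right):
        """Midpoint of the shortest segment between two gaze rays, each given by
        an origin (eye position) and a unit direction vector."""
        o_l, d_l = np.asarray(o_left, float), np.asarray(d_left, float)
        o_r, d_r = np.asarray(o_right, float), np.asarray(d_right, float)
        w0 = o_l - o_r
        a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
        d, e = d_l @ w0, d_r @ w0
        denom = a * c - b * b
        if np.isclose(denom, 0.0):
            # Nearly parallel rays (no usable vergence): project the left origin onto the right ray.
            s, t = 0.0, e / c
        else:
            s = (b * e - c * d) / denom
            t = (a * e - b * d) / denom
        return 0.5 * ((o_l + s * d_l) + (o_r + t * d_r))

    The depth of the returned point reflects vergence, which is what allows a volumetric (3D) cursor rather than a purely on-screen one.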

  17. Towards Wearable Gaze Supported Augmented Cognition

    DEFF Research Database (Denmark)

    Toshiaki Kurauchi, Andrew; Hitoshi Morimoto, Carlos; Mardanbeigi, Diako

    Augmented cognition applications must deal with the problem of how to exhibit information in an orderly, understandable, and timely fashion. Though context has been suggested to control the kind, amount, and timing of the information delivered, we argue that gaze can be a fundamental tool...... by the wearable computing community to develop a gaze-supported augmented cognition application with three interaction modes. The application provides information about the person being looked at. The continuous mode updates information every time the user looks at a different face. The key-activated discrete mode...

  18. Trait Anxiety Impacts the Perceived Gaze Direction of Fearful But Not Angry Faces

    Directory of Open Access Journals (Sweden)

    Zhonghua Hu

    2017-07-01

    Full Text Available Facial expression and gaze direction play an important role in social communication. Previous research has demonstrated that the perception of anger is enhanced by direct gaze, whereas it is unclear whether the perception of fear is enhanced by averted gaze. In addition, previous research has shown that anxiety affects the processing of facial expression and gaze direction, but has not measured or controlled for depression. As a result, firm conclusions cannot be made regarding the impact of individual differences in anxiety and depression on perceptions of face expressions and gaze direction. The current study attempted to reexamine the effect of anxiety level on the processing of facial expressions and gaze direction by matching participants on depression scores. A reliable psychophysical index of the range of eye gaze angles judged as being directed at oneself [the cone of direct gaze (CoDG)] was used as the dependent variable in this study. Participants were stratified into high/low trait anxiety groups and asked to judge the gaze of angry, fearful, and neutral faces across a range of gaze directions. The results showed that: (1) the perception of gaze direction was influenced by facial expression and this was modulated by trait anxiety. For the high trait anxiety group, the CoDG for angry expressions was wider than for fearful and neutral expressions, and no significant difference emerged between fearful and neutral expressions; for the low trait anxiety group, the CoDG for both angry and fearful expressions was wider than for neutral, and no significant difference emerged between angry and fearful expressions. (2) Trait anxiety modulated the perception of gaze direction only in the fearful condition, such that the fearful CoDG for the high trait anxiety group was narrower than for the low trait anxiety group. This demonstrated that anxiety distinctly affected gaze perception in expressions that convey threat (angry, fearful), such that a high trait anxiety
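
    The CoDG used above is a width, in degrees of gaze deviation, of the range of gaze angles that observers judge as directed at themselves. A crude way to compute it from the proportion of "looking at me" responses at each tested gaze angle is sketched below; the evenly spaced angles and the 50% criterion are illustrative assumptions.

    import numpy as np

    def codg_width(gaze_angles_deg, prop_direct, criterion=0.5):
        """Angular extent of the cone of direct gaze: total width of the gaze
        angles (negative = leftward, positive = rightward) whose proportion of
        'directed at me' judgements meets the criterion. Assumes evenly spaced angles."""
        angles = np.sort(np.asarray(gaze_angles_deg, dtype=float))
        props = np.asarray(prop_direct, dtype=float)
        step = np.mean(np.diff(angles))          # angular spacing between tested directions
        return step * np.count_nonzero(props >= criterion)

    Psychophysically more rigorous estimates fit a psychometric function to each flank of the response curve and take the distance between the two 50% crossing points.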

  19. The impact of visual gaze direction on auditory object tracking

    OpenAIRE

    Pomper, U.; Chait, M.

    2017-01-01

    Subjective experience suggests that we are able to direct our auditory attention independent of our visual gaze, e.g when shadowing a nearby conversation at a cocktail party. But what are the consequences at the behavioural and neural level? While numerous studies have investigated both auditory attention and visual gaze independently, little is known about their interaction during selective listening. In the present EEG study, we manipulated visual gaze independently of auditory attention wh...

  20. Watch out! Magnetoencephalographic evidence for early modulation of attention orienting by fearful gaze cueing.

    Directory of Open Access Journals (Sweden)

    Fanny Lachat

    Full Text Available Others' gaze and emotional facial expression are important cues for the process of attention orienting. Here, we investigated with magnetoencephalography (MEG) whether the combination of averted gaze and fearful expression may elicit a selectively early effect of attention orienting on the brain responses to targets. We used the direction of gaze of centrally presented fearful and happy faces as the spatial attention orienting cue in a Posner-like paradigm where the subjects had to detect a target checkerboard presented at gazed-at (valid trials) or non-gazed-at (invalid trials) locations of the screen. We showed that the combination of averted gaze and fearful expression resulted in a very early attention orienting effect in the form of additional parietal activity between 55 and 70 ms for the valid versus invalid targets following fearful gaze cues. No such effect was obtained for the targets following happy gaze cues. This early cue-target validity effect selective of fearful gaze cues involved the left superior parietal region and the left lateral middle occipital region. These findings provide the first evidence for an effect of attention orienting induced by fearful gaze in the time range of C1. In doing so, they demonstrate the selective impact of combined gaze and fearful expression cues in the process of attention orienting.

  1. Experimental predictions drawn from a computational model of sign-trackers and goal-trackers.

    Science.gov (United States)

    Lesaint, Florian; Sigaud, Olivier; Clark, Jeremy J; Flagel, Shelly B; Khamassi, Mehdi

    2015-01-01

    Gaining a better understanding of the biological mechanisms underlying the individual variation observed in response to rewards and reward cues could help to identify and treat individuals more prone to disorders of impulsive control, such as addiction. Variation in response to reward cues is captured in rats undergoing autoshaping experiments where the appearance of a lever precedes food delivery. Although no response is required for food to be delivered, some rats (goal-trackers) learn to approach and avidly engage the magazine until food delivery, whereas other rats (sign-trackers) come to approach and engage avidly the lever. The impulsive and often maladaptive characteristics of the latter response are reminiscent of addictive behaviour in humans. In a previous article, we developed a computational model accounting for a set of experimental data regarding sign-trackers and goal-trackers. Here we show new simulations of the model to draw experimental predictions that could help further validate or refute the model. In particular, we apply the model to new experimental protocols such as injecting flupentixol locally into the core of the nucleus accumbens rather than systemically, and lesioning of the core of the nucleus accumbens before or after conditioning. In addition, we discuss the possibility of removing the food magazine during the inter-trial interval. The predictions from this revised model will help us better understand the role of different brain regions in the behaviours expressed by sign-trackers and goal-trackers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Intelligent robotic tracker

    Science.gov (United States)

    Otaguro, W. S.; Kesler, L. O.; Land, K. C.; Rhoades, D. E.

    1987-01-01

    An intelligent tracker capable of robotic applications requiring guidance and control of platforms, robotic arms, and end effectors has been developed. This packaged system capable of supervised autonomous robotic functions is partitioned into a multiple processor/parallel processing configuration. The system currently interfaces to cameras but has the capability to also use three-dimensional inputs from scanning laser rangers. The inputs are fed into an image processing and tracking section where the camera inputs are conditioned for the multiple tracker algorithms. An executive section monitors the image processing and tracker outputs and performs all the control and decision processes. The present architecture of the system is presented with discussion of its evolutionary growth for space applications. An autonomous rendezvous demonstration of this system was performed last year. More realistic demonstrations in planning are discussed.

  3. Face and gaze perception in borderline personality disorder: An electrical neuroimaging study.

    Science.gov (United States)

    Berchio, Cristina; Piguet, Camille; Gentsch, Kornelia; Küng, Anne-Lise; Rihs, Tonia A; Hasler, Roland; Aubry, Jean-Michel; Dayer, Alexandre; Michel, Christoph M; Perroud, Nader

    2017-11-30

    Humans are sensitive to gaze direction from early life, and gaze has social and affective values. Borderline personality disorder (BPD) is a clinical condition characterized by emotional dysregulation and enhanced sensitivity to affective and social cues. In this study we wanted to investigate the temporal-spatial dynamics of spontaneous gaze processing in BPD. We used a 2-back working-memory task, in which neutral faces with direct and averted gaze were presented. Gaze was used as an emotional modulator of event-related potentials to faces. High-density EEG data were acquired in 19 females with BPD and 19 healthy women, and analyzed with a spatio-temporal microstates analysis approach. Independently of gaze direction, BPD patients showed altered N170 and P200 topographies for neutral faces. Source localization revealed that the anterior cingulate and other prefrontal regions were abnormally activated during the N170 component related to face encoding, while middle temporal deactivations were observed during the P200 component. Post-task affective ratings showed that BPD patients had difficulty disambiguating neutral gaze. This study provides the first evidence for an early neural bias toward neutral faces in BPD independent of gaze direction and also suggests the importance of considering basic aspects of social cognition in identifying biological risk factors of BPD. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Conflict Tasks of Different Types Divergently Affect the Attentional Processing of Gaze and Arrow.

    Science.gov (United States)

    Fan, Lingxia; Yu, Huan; Zhang, Xuemin; Feng, Qing; Sun, Mengdan; Xu, Mengsi

    2018-01-01

    The present study explored the attentional processing mechanisms of gaze and arrow cues in two different types of conflict tasks. In Experiment 1, participants performed a flanker task in which gaze and arrow cues were presented as central targets or bilateral distractors. The congruency between the direction of the target and the distractors was manipulated. Results showed that arrow distractors greatly interfered with the attentional processing of gaze, while the processing of arrow direction was immune to conflict from gaze distractors. Using a spatial compatibility task, Experiment 2 explored the conflict effects exerted on gaze and arrow processing by their relative spatial locations. When the direction of the arrow was in conflict with its spatial layout on screen, response times were slowed; however, the encoding of gaze was unaffected by spatial location. In general, processing to an arrow cue is less influenced by bilateral gaze cues but is affected by irrelevant spatial information, while processing to a gaze cue is greatly disturbed by bilateral arrows but is unaffected by irrelevant spatial information. Different effects on gaze and arrow cues by different types of conflicts may reflect two relatively distinct specific modes of the attentional process.

  5. Interaction between gaze and visual and proprioceptive position judgements.

    Science.gov (United States)

    Fiehler, Katja; Rösler, Frank; Henriques, Denise Y P

    2010-06-01

    There is considerable evidence that targets for action are represented in a dynamic gaze-centered frame of reference, such that each gaze shift requires an internal updating of the target. Here, we investigated the effect of eye movements on the spatial representation of targets used for position judgements. Participants had their hand passively placed to a location, and then judged whether this location was left or right of a remembered visual or remembered proprioceptive target, while gaze direction was varied. Estimates of position of the remembered targets relative to the unseen position of the hand were assessed with an adaptive psychophysical procedure. These positional judgements significantly varied relative to gaze for both remembered visual and remembered proprioceptive targets. Our results suggest that relative target positions may also be represented in eye-centered coordinates. This implies similar spatial reference frames for action control and space perception when positions are coded relative to the hand.

  6. A silicon tracker for Christmas

    CERN Multimedia

    2008-01-01

    The CMS experiment installed the world’s largest silicon tracker just before Christmas. Marcello Mannelli: physicist and deputy CMS project leader, and Alan Honma, physicist, compare two generations of tracker: OPAL for the LEP (at the front) and CMS for the LHC (behind). There is quite a difference between 1m2 and 205m2.. CMS received an early Christmas present on 18 December when the silicon tracker was installed in the heart of the CMS magnet. The CMS tracker team couldn’t have hoped for a better present. Carefully wrapped in shiny plastic, the world’s largest silicon tracker arrived at Cessy ready for installation inside the CMS magnet on 18 December. This rounded off the year for CMS with a major event, the crowning touch to ten years of work on the project by over five hundred scientists and engineers. "Building a scientific instrument of this size and complexity is a huge technical a...

  7. Strange-face illusions during inter-subjective gazing.

    Science.gov (United States)

    Caputo, Giovanni B

    2013-03-01

    In normal observers, gazing at one's own face in the mirror for a few minutes, at a low illumination level, triggers the perception of strange faces, a new visual illusion that has been named 'strange-face in the mirror'. Individuals see huge distortions of their own faces, but they often see monstrous beings, archetypal faces, faces of relatives and deceased, and animals. In the experiment described here, strange-face illusions were perceived when two individuals, in a dimly lit room, gazed at each other in the face. Inter-subjective gazing compared to mirror-gazing produced a higher number of different strange faces. Inter-subjective strange-face illusions were always dissociative of the subject's self and supported a moderate feeling of their reality, indicating a temporary loss of self-agency. Unconscious synchronization of event-related responses to illusions was found between members in some pairs. Synchrony of illusions may indicate that unconscious response-coordination is caused by the illusion-conjunction of crossed dissociative strange-faces, which are perceived as projections into each other's visual face of reciprocal embodied representations within the pair. Inter-subjective strange-face illusions may be explained by the binding of the subject's embodied representations (somaesthetic, kinaesthetic and motor facial pattern) with the other's visual face. Unconscious facial mimicry may promote inter-subjective illusion-conjunction, and then unconscious joint-action and response-coordination. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. The CMS Silicon Tracker Alignment

    CERN Document Server

    Castello, R

    2008-01-01

    The alignment of the Strip and Pixel Tracker of the Compact Muon Solenoid experiment, with its large number of independent silicon sensors and its excellent spatial resolution, is a complex and challenging task. Besides high precision mounting, survey measurements and the Laser Alignment System, track-based alignment is needed to reach the envisaged precision. Three different algorithms for track-based alignment were successfully tested on a sample of cosmic-ray data collected at the Tracker Integration Facility, where 15% of the Tracker was tested. These results, together with those coming from the CMS global run, will provide the basis for the full-scale alignment of the Tracker, which will be carried out with the first p-p collisions.

  9. Gaze3DFix: Detecting 3D fixations with an ellipsoidal bounding volume.

    Science.gov (United States)

    Weber, Sascha; Schubert, Rebekka S; Vogt, Stefan; Velichkovsky, Boris M; Pannasch, Sebastian

    2017-10-26

    Nowadays, the use of eyetracking to determine 2-D gaze positions is common practice, and several approaches to the detection of 2-D fixations exist, but ready-to-use algorithms to determine eye movements in three dimensions are still missing. Here we present a dispersion-based algorithm with an ellipsoidal bounding volume that estimates 3D fixations. Therefore, 3D gaze points are obtained using a vector-based approach and are further processed with our algorithm. To evaluate the accuracy of our method, we performed experimental studies with real and virtual stimuli. We obtained good congruence between stimulus position and both the 3D gaze points and the 3D fixation locations within the tested range of 200-600 mm. The mean deviation of the 3D fixations from the stimulus positions was 17 mm for the real as well as for the virtual stimuli, with larger variances at increasing stimulus distances. The described algorithms are implemented in two dynamic linked libraries (Gaze3D.dll and Fixation3D.dll), and we provide a graphical user interface (Gaze3DFixGUI.exe) that is designed for importing 2-D binocular eyetracking data and calculating both 3D gaze points and 3D fixations using the libraries. The Gaze3DFix toolkit, including both libraries and the graphical user interface, is available as open-source software at https://github.com/applied-cognition-research/Gaze3DFix .
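
    The toolkit above detects 3D fixations by testing whether consecutive 3D gaze points stay within an ellipsoidal bounding volume. A compact sketch of that dispersion test and a greedy grouping loop is given below; the semi-axis defaults, the minimum-sample threshold, and the function names are illustrative and not taken from the published libraries.

    import numpy as np

    def within_ellipsoid(points, semi_axes):
        """True if all 3D gaze points lie inside an ellipsoid centred on their
        centroid with the given semi-axes (e.g. millimetres in x, y, and depth)."""
        pts = np.asarray(points, dtype=float)
        centred = pts - pts.mean(axis=0)
        normalized = (centred / np.asarray(semi_axes, dtype=float)) ** 2
        return bool(np.all(normalized.sum(axis=1) <= 1.0))

    def detect_3d_fixations(gaze_points, semi_axes=(30.0, 30.0, 100.0), min_samples=6):
        """Greedy dispersion-based grouping of consecutive 3D gaze points into
        fixations; returns (start_index, end_index, centroid) tuples."""
        pts = np.asarray(gaze_points, dtype=float)
        fixations, i, n = [], 0, len(pts)
        while i < n:
            j = i + min_samples
            if j <= n and within_ellipsoid(pts[i:j], semi_axes):
                while j < n and within_ellipsoid(pts[i:j + 1], semi_axes):
                    j += 1                      # grow the fixation while dispersion stays small
                fixations.append((i, j, pts[i:j].mean(axis=0)))
                i = j
            else:
                i += 1
        return fixations

    Using a depth semi-axis larger than the lateral ones reflects the fact that depth estimates from vergence are noisier than the horizontal and vertical gaze coordinates.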

  10. Investigating social gaze as an action-perception online performance

    Directory of Open Access Journals (Sweden)

    Ouriel eGrynszpan

    2012-04-01

    Full Text Available In interpersonal interactions, linguistic information is complemented by non-linguistic information originating largely from facial expressions. The study of online face-to-face social interaction thus entails investigating the multimodal simultaneous processing of oral and visual percepts. Moreover, gaze in and of itself functions as a powerful communicative channel. In this respect, gaze should not be examined as a purely perceptive process but also as an active social performance. We designed a task involving multimodal deciphering of social information based on virtual characters, embedded in naturalistic backgrounds, who directly address the participant with non-literal speech and meaningful facial expressions. Eighteen adult participants were to interpret an equivocal sentence which could be disambiguated by examining the emotional expressions of the character speaking to them face-to-face. To examine self-control and self-awareness of gaze in this context, visual feedback is provided to the participant by a real-time gaze-contingent viewing window centered on the focal point, while the rest of the display is blurred. Eye-tracking data showed that the viewing window induced changes in gaze behaviour, notably longer visual fixations. Notwithstanding, only half the participants ascribed the window displacements to their eye movements. These results highlight the dissociation between non-volitional gaze adaptation and the self-ascription of agency. Such dissociation provides support for a two-step account of the sense of agency composed of pre-noetic monitoring mechanisms and reflexive processes. We comment upon these results, which illustrate the relevance of our method for studying online social cognition, especially concerning autism spectrum disorders (ASD), where poor pragmatic understanding of oral speech is considered linked to visual peculiarities that impede face exploration.

  11. Investigating social gaze as an action-perception online performance.

    Science.gov (United States)

    Grynszpan, Ouriel; Simonin, Jérôme; Martin, Jean-Claude; Nadel, Jacqueline

    2012-01-01

    Gaze represents a major non-verbal communication channel in social interactions. In this respect, when facing another person, one's gaze should not be examined as a purely perceptive process but also as an action-perception online performance. However, little is known about processes involved in the real-time self-regulation of social gaze. The present study investigates the impact of a gaze-contingent viewing window on fixation patterns and the awareness of being the agent moving the window. In face-to-face scenarios played by a virtual human character, the task for the 18 adult participants was to interpret an equivocal sentence which could be disambiguated by examining the emotional expressions of the character speaking. The virtual character was embedded in naturalistic backgrounds to enhance realism. Eye-tracking data showed that the viewing window induced changes in gaze behavior, notably longer visual fixations. Notwithstanding, only half of the participants ascribed the window displacements to their eye movements. These participants also spent more time looking at the eyes and mouth regions of the virtual human character. The outcomes of the study highlight the dissociation between non-volitional gaze adaptation and the self-ascription of agency. Such dissociation provides support for a two-step account of the sense of agency composed of pre-noetic monitoring mechanisms and reflexive processes, linked by bottom-up and top-down processes. We comment upon these results, which illustrate the relevance of our method for studying online social cognition, in particular concerning autism spectrum disorders (ASD) where the poor pragmatic understanding of oral speech is considered linked to visual peculiarities that impede facial exploration.
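
    Both versions of the record above rely on a gaze-contingent viewing window: the display is blurred except for a sharp region centred on the current fixation. A minimal per-frame sketch using OpenCV is shown below; the window radius, the blur kernel, and the circular shape are illustrative choices rather than the authors' parameters.

    import numpy as np
    import cv2  # OpenCV, assumed available

    def gaze_contingent_frame(frame, gaze_xy, radius=80, blur_ksize=31):
        """Blur the whole frame, then restore a sharp circular window centred
        on the current gaze position (pixel coordinates). blur_ksize must be odd."""
        blurred = cv2.GaussianBlur(frame, (blur_ksize, blur_ksize), 0)
        mask = np.zeros(frame.shape[:2], dtype=np.uint8)
        cv2.circle(mask, (int(gaze_xy[0]), int(gaze_xy[1])), radius, 255, -1)
        out = blurred.copy()
        out[mask > 0] = frame[mask > 0]          # copy sharp pixels back inside the window
        return out

    In an experiment this function would be called on every display refresh with the latest gaze sample, so the window follows the eyes with the tracker's latency.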

  12. Gaze and power. A post-structuralist interpretation on Perseus’ myth

    Directory of Open Access Journals (Sweden)

    Olaya Fernández Guerrero

    2015-12-01

    Full Text Available Gaze hierarchizes, manages and labels reality. Then, according to Foucault, gaze can be understood as a practice of power. This paper is inspired by his theories, and it applies them to one of the most powerful symbolic spheres of Western culture: Greek Myths. Notions such as visibility, invisibility and panopticism bring new light into the story of Perseus and Medusa, and they enable a re-reading of this Myth focused on the different ways of power that emerge from the gaze.

  13. MediaTracker system

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, D. M. (Dana M.); Strittmatter, R. B. (Richard B.); Abeyta, J. D. (Joline D.); Brown, J. (John); Marks, T. (Thomas), Jr.; Martinez, B. J. (Benny J.); Jones, D. B. (Dana Benelli); Hsue, W.

    2004-01-01

    The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The Media Tracker system software assists classified media custodian in managing vault access logging and Media Tracking to prevent the inadvertent violation of rules or policies for the access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high consequence security assets and high value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the Media Tracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The Media Tracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component of the MediaTracker software suite provides an efficient and effective means to meet current Laboratory requirements and provides new-engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can

  14. MediaTracker system

    International Nuclear Information System (INIS)

    Sandoval, D.M.; Strittmatter, R.B.; Abeyta, J.D.; Brown, J.; Marks, T. Jr.; Martinez, B.J.; Jones, D.B.; Hsue, W.

    2004-01-01

    The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The Media Tracker system software assists classified media custodian in managing vault access logging and Media Tracking to prevent the inadvertent violation of rules or policies for the access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high consequence security assets and high value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the Media Tracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The Media Tracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component of the MediaTracker software suite provides an efficient and effective means to meet current Laboratory requirements and provides new-engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate

  15. Predicting gaze direction from head pose yaw and pitch

    NARCIS (Netherlands)

    Johnson, D.O.; Cuijpers, R.H.; Arabnia, H.R.; Deligiannidis, L.; Lu, J.; Tinetti, F.G.; You, J.

    2013-01-01

    Abstract - Socially assistive robots (SARs) must be able to interpret non-verbal communication from a human. A person's gaze direction informs the observer of where visual attention is directed. Therefore, it is useful if a robot can interpret gaze direction, so that it can assess whether a

  16. Cerebellar inactivation impairs memory of learned prism gaze-reach calibrations.

    Science.gov (United States)

    Norris, Scott A; Hathaway, Emily N; Taylor, Jordan A; Thach, W Thomas

    2011-05-01

    Three monkeys performed a visually guided reach-touch task with and without laterally displacing prisms. The prisms offset the normally aligned gaze/reach and subsequent touch. Naive monkeys showed adaptation, such that on repeated prism trials the gaze-reach angle widened and touches hit nearer the target. On the first subsequent no-prism trial the monkeys exhibited an aftereffect, such that the widened gaze-reach angle persisted and touches missed the target in the direction opposite that of initial prism-induced error. After 20-30 days of training, monkeys showed long-term learning and storage of the prism gaze-reach calibration: they switched between prism and no-prism and touched the target on the first trials without adaptation or aftereffect. Injections of lidocaine into posterolateral cerebellar cortex or muscimol or lidocaine into dentate nucleus temporarily inactivated these structures. Immediately after injections into cortex or dentate, reaches were displaced in the direction of prism-displaced gaze, but no-prism reaches were relatively unimpaired. There was little or no adaptation on the day of injection. On days after injection, there was no adaptation and both prism and no-prism reaches were horizontally, and often vertically, displaced. A single permanent lesion (kainic acid) in the lateral dentate nucleus of one monkey immediately impaired only the learned prism gaze-reach calibration and in subsequent days disrupted both learning and performance. This effect persisted for the 18 days of observation, with little or no adaptation.

  17. Right Hemispheric Dominance in Gaze-Triggered Reflexive Shift of Attention in Humans

    Science.gov (United States)

    Okada, Takashi; Sato, Wataru; Toichi, Motomi

    2006-01-01

    Recent findings suggest a right hemispheric dominance in gaze-triggered shifts of attention. The aim of this study was to clarify the dominant hemisphere in the gaze processing that mediates attentional shift. A target localization task, with preceding non-predictive gaze cues presented to each visual field, was undertaken by 44 healthy subjects,…

  18. ATLAS semiconductor tracker installed into its barrel

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    The ATLAS silicon tracker is installed in the silicon tracker barrel. Absolute precision was required in this operation to ensure that the tracker was inserted without damage through minimal clearance. The installation was performed in a clean room on the CERN site so that no impurities in the air would contaminate the tracker's systems.

  19. Low Cost Eye Tracking: The Current Panorama

    Directory of Open Access Journals (Sweden)

    Onur Ferhat

    2016-01-01

    Full Text Available Despite the availability of accurate, commercial gaze tracker devices working with infrared (IR) technology, visible light gaze tracking constitutes an interesting alternative by allowing scalability and removing hardware requirements. Over the last years, this field has seen examples of research showing performance comparable to the IR alternatives. In this work, we survey the previous work on remote, visible light gaze trackers and analyze the explored techniques from various perspectives such as calibration strategies, head pose invariance, and gaze estimation techniques. We also provide information on related aspects of research such as public datasets to test against, open source projects to build upon, and gaze tracking services to directly use in applications. With all this information, we aim to provide the contemporary and future researchers with a map detailing previously explored ideas and the required tools.
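
    Among the calibration strategies surveyed above, a common choice for both IR and visible-light trackers is a low-order polynomial regression from eye-image features to screen coordinates, fitted on a handful of calibration targets. A least-squares sketch of that mapping follows; the second-order terms and the function names are illustrative assumptions, not a specific system's implementation.

    import numpy as np

    def _design(eye_xy):
        """Second-order polynomial design matrix for 2D eye features."""
        x, y = eye_xy[:, 0], eye_xy[:, 1]
        return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

    def fit_calibration(eye_xy, screen_xy):
        """Fit the mapping from eye features (e.g. pupil centre or pupil-glint
        vector) to screen coordinates; needs at least 6 calibration points."""
        coeffs, *_ = np.linalg.lstsq(_design(np.asarray(eye_xy, dtype=float)),
                                     np.asarray(screen_xy, dtype=float), rcond=None)
        return coeffs  # shape (6, 2): one column of coefficients per screen axis

    def apply_calibration(coeffs, eye_xy):
        """Map new eye features to estimated on-screen gaze positions."""
        return _design(np.asarray(eye_xy, dtype=float)) @ coeffs

    A typical 9-point calibration grid gives enough samples to fit these 6 coefficients per axis with some redundancy against noisy fixations.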

  20. "Gaze Leading": Initiating Simulated Joint Attention Influences Eye Movements and Choice Behavior

    Science.gov (United States)

    Bayliss, Andrew P.; Murphy, Emily; Naughtin, Claire K.; Kritikos, Ada; Schilbach, Leonhard; Becker, Stefanie I.

    2013-01-01

    Recent research in adults has made great use of the gaze cuing paradigm to understand the behavior of the follower in joint attention episodes. We implemented a gaze leading task to investigate the initiator--the other person in these triadic interactions. In a series of gaze-contingent eye-tracking studies, we show that fixation dwell time upon…

  1. Eye-based head gestures

    DEFF Research Database (Denmark)

    Mardanbegi, Diako; Witzner Hansen, Dan; Pederson, Thomas

    2012-01-01

    A novel method for video-based head gesture recognition using eye information from an eye tracker has been proposed. The method uses a combination of gaze and eye movement to infer head gestures. Compared to other gesture-based methods, a major advantage of the method is that the user keeps the gaze... mobile phone screens. The user study shows that the method detects a set of defined gestures reliably.

  2. Gaze Interactive Building Instructions

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Ahmed, Zaheer; Mardanbeigi, Diako

    We combine eye tracking technology and mobile tablets to support hands-free interaction with digital building instructions. As a proof-of-concept we have developed a small interactive 3D environment where one can interact with digital blocks by gaze, keystroke and head gestures. Blocks may be moved...

  3. Assessing the precision of gaze following using a stereoscopic 3D virtual reality setting.

    Science.gov (United States)

    Atabaki, Artin; Marciniak, Karolina; Dicke, Peter W; Thier, Peter

    2015-07-01

    Despite the ecological importance of gaze following, little is known about the underlying neuronal processes that allow us to extract gaze direction from the geometric features of the eye and head of a conspecific. In order to understand the neuronal mechanisms underlying this ability, a careful description of the capacity and the limitations of gaze following at the behavioral level is needed. Previous studies of gaze following, which relied on naturalistic settings, have the disadvantage of allowing only very limited control of potentially relevant visual features guiding gaze following, such as the contrast of iris and sclera and the shape of the eyelids, and, in the case of photographs, they lack depth. Hence, in order to get full control of potentially relevant features, we decided to study gaze following of human observers guided by the gaze of a human avatar seen stereoscopically. To this end we established a stereoscopic 3D virtual reality setup in which we tested human subjects' abilities to detect which target a human avatar was looking at. Following the gaze of the avatar showed all the features of the gaze following of a natural person, namely a substantial degree of precision associated with a consistent pattern of systematic deviations from the target. Poor stereo vision affected performance surprisingly little (only in certain experimental conditions). Only gaze following guided by targets at larger downward eccentricities exhibited a differential effect of the presence or absence of accompanying movements of the avatar's eyelids and eyebrows.

  4. Direct gaze elicits atypical activation of the theory-of-mind network in autism spectrum conditions.

    Science.gov (United States)

    von dem Hagen, Elisabeth A H; Stoyanova, Raliza S; Rowe, James B; Baron-Cohen, Simon; Calder, Andrew J

    2014-06-01

    Eye contact plays a key role in social interaction and is frequently reported to be atypical in individuals with autism spectrum conditions (ASCs). Despite the importance of direct gaze, previous functional magnetic resonance imaging in ASC has generally focused on paradigms using averted gaze. The current study sought to determine the neural processing of faces displaying direct and averted gaze in 18 males with ASC and 23 matched controls. Controls showed an increased response to direct gaze in brain areas implicated in theory-of-mind and gaze perception, including medial prefrontal cortex, temporoparietal junction, posterior superior temporal sulcus region, and amygdala. In contrast, the same regions showed an increased response to averted gaze in individuals with an ASC. This difference was confirmed by a significant gaze direction × group interaction. Relative to controls, participants with ASC also showed reduced functional connectivity between these regions. We suggest that, in the typical brain, perceiving another person gazing directly at you triggers spontaneous attributions of mental states (e.g. he is "interested" in me), and that such mental state attributions to direct gaze may be reduced or absent in the autistic brain.

  5. TRACKER

    CERN Multimedia

    K. Gill and G. Bolla

    2010-01-01

    Silicon strips During the first collisions the strip-Tracker operated with excellent performance and stability. The results obtained were very impressive and this exciting experience marked a fine end to another intense year. Several issues were identified during 2009 operations that could benefit from improvement: to suppress the increased output data volume when in STANDBY state (LV ON, HV OFF), which is due to the larger noise amplitudes when the sensors are unbiased; to reduce the strips configuration time; to increase the stability of the power system, particularly during state transitions, and to decrease the powering up time. The strip-Tracker FEDs now react to changes in the HV conditions of the strips. Upon a transition to STAND-BY, central DAQ starts a PAUSE-RESUME cycle and a flag is issued to the FEDSupervisor. This results in forcing the common mode noise artificially to the maximum value, which effectively suppresses the analogue data output. This forced offset is removed as soon as the strips ...

  6. EDITORIAL: Special section on gaze-independent brain-computer interfaces Special section on gaze-independent brain-computer interfaces

    Science.gov (United States)

    Treder, Matthias S.

    2012-08-01

    Restoring the ability to communicate and interact with the environment in patients with severe motor disabilities is a vision that has been the main catalyst of early brain-computer interface (BCI) research. The past decade has brought a diversification of the field. BCIs have been examined as a tool for motor rehabilitation and their benefit in non-medical applications such as mental-state monitoring for improved human-computer interaction and gaming has been confirmed. At the same time, the weaknesses of some approaches have been pointed out. One of these weaknesses is gaze-dependence, that is, the requirement that the user of a BCI system voluntarily directs his or her eye gaze towards a visual target in order to efficiently operate a BCI. This not only contradicts the main doctrine of BCI research, namely that BCIs should be independent of muscle activity, but it can also limit its real-world applicability both in clinical and non-medical settings. It is only in a scenario devoid of any motor activity that a BCI solution is without alternative. Gaze-dependencies have surfaced at two different points in the BCI loop. Firstly, a BCI that relies on visual stimulation may require users to fixate on the target location. Secondly, feedback is often presented visually, which implies that the user may have to move his or her eyes in order to perceive the feedback. This special section was borne out of a BCI workshop on gaze-independent BCIs held at the 2011 Society for Applied Neurosciences (SAN) Conference and has then been extended with additional contributions from other research groups. It compiles experimental and methodological work that aims toward gaze-independent communication and mental-state monitoring. Riccio et al review the current state-of-the-art in research on gaze-independent BCIs [1]. Van der Waal et al present a tactile speller that builds on the stimulation of the fingers of the right and left hand [2]. Höhne et al analyze the ergonomic aspects

  7. Segmentation of object-based video of gaze communication

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Stegmann, Mikkel Bille; Forchhammer, Søren

    2005-01-01

    Aspects of video communication based on gaze interaction are considered. The overall idea is to use gaze interaction to control video, e.g. for video conferencing. Towards this goal, animation of a facial mask is demonstrated. The animation is based on images using Active Appearance Models (AAM...). Good-quality reproduction of (low-resolution) coded video of an animated facial mask at rates as low as 10-20 kbit/s using MPEG-4 object-based video is demonstrated.

  8. CULTURAL DISPLAY RULES DRIVE EYE GAZE DURING THINKING.

    Science.gov (United States)

    McCarthy, Anjanie; Lee, Kang; Itakura, Shoji; Muir, Darwin W

    2006-11-01

    The authors measured the eye gaze displays of Canadian, Trinidadian, and Japanese participants as they answered questions for which they either knew, or had to derive, the answers. When they knew the answers, Trinidadians maintained the most eye contact, whereas Japanese maintained the least. When thinking about the answers to questions, Canadians and Trinidadians looked up, whereas Japanese looked down. Thus, for humans, gaze displays while thinking are at least in part culturally determined.

  9. Model-driven gaze simulation for the blind person in face-to-face communication

    NARCIS (Netherlands)

    Qiu, S.; Anas, S.A.B.; Osawa, H.; Rauterberg, G.W.M.; Hu, J.

    2016-01-01

    In face-to-face communication, eye gaze is integral to a conversation, supplementing verbal language. Sighted people often use eye gaze to convey nonverbal information in social interactions, which a blind conversation partner cannot access or react to. In this paper, we present E-Gaze glasses

  10. P2-23: Deficits on Preference but Not Attention in Patients with Depression: Evidence from Gaze Cue

    Directory of Open Access Journals (Sweden)

    Jingling Li

    2012-10-01

    Gaze is an important social cue and can easily capture attention. Our preference judgments are biased by others' gaze; that is, we prefer objects gazed at by happy or neutral faces and dislike objects gazed at by disgust faces. Since patients with depression have a negative bias in emotional perception, we hypothesized that they may judge the preference of gazed-at objects differently than healthy controls. Twenty-one patients with major depressive disorder and 21 healthy age-matched controls completed an object categorization task and then rated their preference for those objects. In the categorization task, a schematic face either gazed toward or away from the to-be-categorized object. The results showed that both groups categorized gazed-at objects faster than non-gazed objects, suggesting that patients did not have deficits in their attention to gaze cues. Nevertheless, healthy controls preferred gazed-at objects more than non-gazed objects, while patients showed no significant preference. Our results indicate that patients with depression have deficits in social cognition rather than in the basic attentional mechanism.

  11. Mechanical stability of the CMS Tracker

    CERN Document Server

    CMS Collaboration

    2015-01-01

    reconstructs the absolute position of individual detector modules with a similar accuracy but after days of data taking. During the long-term operation at a fixed temperature of +4 °C in the years 2011-2013 the alignment of tracker components was stable within 10 microns. Temperature variations in the Tracker volume are found to cause the displacements of tracker structures of abou...

  12. The Role of Global and Local Visual Information during Gaze-Cued Orienting of Attention.

    Science.gov (United States)

    Munsters, Nicolette M; van den Boomen, Carlijn; Hooge, Ignace T C; Kemner, Chantal

    2016-01-01

    Gaze direction is an important social communication tool. Global and local visual information are known to play specific roles in processing socially relevant information from a face. The current study investigated whether global visual information has a primary role during gaze-cued orienting of attention and, as such, may influence quality of interaction. Adults performed a gaze-cueing task in which a centrally presented face cued (valid or invalid) the location of a peripheral target through a gaze shift. We measured brain activity (electroencephalography) towards the cue and target and behavioral responses (manual and saccadic reaction times) towards the target. The faces contained global (i.e. lower spatial frequencies), local (i.e. higher spatial frequencies), or a selection of both global and local (i.e. mid-band spatial frequencies) visual information. We found a gaze cue-validity effect (i.e. valid versus invalid), but no interaction effects with spatial frequency content. Furthermore, behavioral responses towards the target were slower in all cue conditions when lower spatial frequencies were not present in the gaze cue. These results suggest that whereas gaze-cued orienting of attention can be driven by both global and local visual information, global visual information determines the speed of behavioral responses towards other entities appearing in the surrounding of gaze cue stimuli.
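
    As a rough illustration of the stimulus manipulation described here (not the study's exact filtering parameters), low- and high-spatial-frequency versions of a face image can be produced with Gaussian filters; the sigmas below are arbitrary placeholders rather than the cycles-per-face values used in the experiment.

        # Illustrative filtering sketch: low-pass (global), high-pass (local) and
        # band-pass (mid-band) versions of a grayscale image. Sigmas are arbitrary.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        face = rng.random((256, 256))                    # stand-in for a face image

        low_sf = gaussian_filter(face, sigma=8)          # keeps coarse structure
        high_sf = face - gaussian_filter(face, sigma=2)  # keeps fine detail
        mid_sf = gaussian_filter(face, sigma=2) - gaussian_filter(face, sigma=8)

        print(low_sf.shape, high_sf.std(), mid_sf.std())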

  13. Fusing Eye-gaze and Speech Recognition for Tracking in an Automatic Reading Tutor

    DEFF Research Database (Denmark)

    Rasmussen, Morten Højfeldt; Tan, Zheng-Hua

    2013-01-01

    In this paper we present a novel approach for automatically tracking the reading progress using a combination of eye-gaze tracking and speech recognition. The two are fused by first generating word probabilities based on eye-gaze information and then using these probabilities to augment the langu...
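
    The abstract does not give the fusion details, but one simple way to combine gaze-derived word probabilities with a language model, sketched below under that assumption, is weighted log-linear interpolation over the candidate words; the words, probabilities and weight are hypothetical.

        # Toy fusion sketch (not the authors' model): combine gaze-based and
        # language-model word probabilities by log-linear interpolation.
        import math

        def fuse(p_gaze, p_lm, alpha=0.5):
            """alpha weights the gaze evidence; inputs are per-word probabilities."""
            scores = {w: alpha * math.log(p_gaze[w]) + (1 - alpha) * math.log(p_lm[w])
                      for w in p_gaze}
            z = sum(math.exp(s) for s in scores.values())
            return {w: math.exp(s) / z for w, s in scores.items()}

        # Hypothetical candidates for the currently fixated word.
        p_gaze = {"cat": 0.7, "can": 0.2, "cap": 0.1}
        p_lm = {"cat": 0.3, "can": 0.6, "cap": 0.1}
        print(fuse(p_gaze, p_lm, alpha=0.6))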

  14. A closer look at the size of the gaze-liking effect: a preregistered replication.

    Science.gov (United States)

    Tipples, Jason; Pecchinenda, Anna

    2018-04-30

    This study is a direct replication of the gaze-liking effect using the same design, stimuli and procedure. The gaze-liking effect describes the tendency for people to rate objects as more likeable when they have recently seen a person repeatedly gaze toward rather than away from the object. However, as subsequent studies show considerable variability in the size of this effect, we sampled a larger number of participants (N = 98) than the original study (N = 24) to gain a more precise estimate of the gaze-liking effect size. Our results indicate a much smaller standardised effect size (d_z = 0.02) than that of the original study (d_z = 0.94). Our smaller effect size was not due to general insensitivity to eye-gaze effects, because the same sample showed a clear gaze-cuing effect (d_z = 1.09): faster reaction times when eyes looked toward versus away from target objects. We discuss the implications of our findings for future studies wishing to examine the gaze-liking effect.
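
    For reference, the standardised effect size reported here, d_z, is the mean of the within-subject differences divided by their standard deviation. The sketch below computes it on made-up paired ratings, not on the study's data.

        # Worked example with simulated data: Cohen's d_z for a paired design.
        import numpy as np

        rng = np.random.default_rng(1)
        liking_toward = rng.normal(5.2, 1.0, size=98)  # hypothetical ratings
        liking_away = rng.normal(5.1, 1.0, size=98)

        diff = liking_toward - liking_away
        d_z = diff.mean() / diff.std(ddof=1)
        print(f"d_z = {d_z:.2f}")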

  15. Role of Gaze Cues in Interpersonal Motor Coordination: Towards Higher Affiliation in Human-Robot Interaction.

    Directory of Open Access Journals (Sweden)

    Mahdi Khoramshahi

    The ability to follow one another's gaze plays an important role in our social cognition, especially when we synchronously perform tasks together. We investigate how gaze cues can improve performance in a simple coordination task (i.e., the mirror game), whereby two players mirror each other's hand motions. In this game, each player is either a leader or a follower. To study the effect of gaze in a systematic manner, the leader's role is played by a robotic avatar. We contrast two conditions, in which the avatar either provides or withholds explicit gaze cues that indicate the next location of its hand. Specifically, we investigated (a) whether participants are able to exploit these gaze cues to improve their coordination, (b) how gaze cues affect action prediction and temporal coordination, and (c) whether introducing active gaze behavior for avatars makes them more realistic and human-like (from the user's point of view). Forty-three subjects participated in 8 trials of the mirror game. Each subject performed the game in the two conditions (with and without gaze cues). In this within-subject study, the order of the conditions was randomized across participants, and the avatar's realism was assessed by administering a post-hoc questionnaire. When gaze cues were provided, a quantitative assessment of synchrony between participants and the avatar revealed a significant improvement in subject reaction time (RT). This confirms our hypothesis that gaze cues improve the follower's ability to predict the avatar's action. An analysis of the frequency pattern across the two players' hand movements reveals that the gaze cues improve the overall temporal coordination between the two players. Finally, analysis of the subjective evaluations from the questionnaires reveals that, in the presence of gaze cues, participants found the avatar not only more human-like/realistic, but also easier to interact with. This work confirms that people can exploit gaze cues to
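
    The abstract does not spell out how synchrony was quantified; one common way to estimate how far a follower trails a leader, sketched here purely as an illustration, is to locate the peak of the cross-correlation between the two hand-position time series. The sampling rate and signals below are invented.

        # Illustrative lag estimate (not the authors' pipeline): peak of the
        # cross-correlation between leader and follower hand positions.
        import numpy as np

        def estimate_lag(leader, follower, fs):
            """Lag in seconds; positive means the follower trails the leader."""
            leader = leader - leader.mean()
            follower = follower - follower.mean()
            corr = np.correlate(follower, leader, mode="full")
            lags = np.arange(-len(leader) + 1, len(leader))
            return lags[np.argmax(corr)] / fs

        fs = 100.0                                        # assumed 100 Hz tracking
        t = np.arange(0, 10, 1 / fs)
        leader = np.sin(2 * np.pi * 0.5 * t)              # leader hand position
        follower = np.sin(2 * np.pi * 0.5 * (t - 0.25))   # trails by 250 ms
        print(estimate_lag(leader, follower, fs))         # ~0.25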

  16. Dysfunctional gaze processing in bipolar disorder

    Directory of Open Access Journals (Sweden)

    Cristina Berchio

    2017-01-01

    The present study provides neurophysiological evidence for abnormal gaze processing in BP and suggests dysfunctional processing of direct eye contact as a prominent characteristic of bipolar disorder.

  17. Small star trackers for modern space vehicles

    Science.gov (United States)

    Kouzmin, Vladimir; Jushkov, Vladimir; Zaikin, Vladimir

    2017-11-01

    Drawing on many years of experience in building spacecraft star trackers with diverse detectors (from the first star trackers of the 1960s to the tens of versions that followed), and on technological advances in optics and electronics, NPP "Geofizika-Cosmos" has provided celestial orientation for all space vehicles built in Russia and has now developed a series of new star trackers with CCD matrices and dedicated processors, able to meet the celestial-orientation needs of modern spacecraft for the next 10-15 years. This article presents the main characteristics and descriptions of several star tracker versions. The trackers differ in their technical characteristics and use either a mix of Russian and foreign components or exclusively Russian components for the main units.

  18. Optical model and calibration of a sun tracker

    International Nuclear Information System (INIS)

    Volkov, Sergei N.; Samokhvalov, Ignatii V.; Cheong, Hai Du; Kim, Dukhyeon

    2016-01-01

    Sun trackers are widely used to investigate scattering and absorption of solar radiation in the Earth's atmosphere. We present a method for optimizing the optical model of an altazimuth sun tracker whose output radiation direction is aligned with the axis of a stationary spectrometer. The method solves the problem of loss of pointing stability when the tracker points at the Sun near the zenith. An optimal method for tracker calibration at the measurement site is also proposed, together with a moving-calibration method for mobile applications subject to large temperature differences and alignment errors in the tracker's optical system. - Highlights: • We present an optimal optical sun tracker model for atmospheric spectroscopy. • The problem of loss of stability of tracker pointing at the Sun has been solved. • We propose an optimal method for tracker calibration at a measurement site. • Test results demonstrate the efficiency of the proposed optimization methods.
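
    The paper's optical model is not reproduced in the abstract, but the pointing computation an altazimuth tracker ultimately relies on is the standard solar altitude/azimuth geometry. The sketch below is a simplified version (approximate declination, no equation-of-time or refraction corrections); the site coordinates are arbitrary.

        # Simplified altitude/azimuth pointing sketch for an altazimuth sun tracker.
        import math

        def sun_alt_az(lat_deg, day_of_year, solar_hour):
            decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
            hour_angle = 15.0 * (solar_hour - 12.0)       # degrees, negative before noon
            lat, dec, ha = map(math.radians, (lat_deg, decl, hour_angle))

            sin_alt = (math.sin(lat) * math.sin(dec)
                       + math.cos(lat) * math.cos(dec) * math.cos(ha))
            alt = math.asin(sin_alt)

            # cos(alt) tends to zero near the zenith, which is exactly where naive
            # altazimuth pointing loses stability, as discussed in the abstract.
            cos_az = ((math.sin(dec) - math.sin(alt) * math.sin(lat))
                      / (math.cos(alt) * math.cos(lat)))
            az = math.acos(max(-1.0, min(1.0, cos_az)))   # measured from north
            if ha > 0:                                    # afternoon: sun west of south
                az = 2 * math.pi - az
            return math.degrees(alt), math.degrees(az)

        print(sun_alt_az(lat_deg=36.4, day_of_year=172, solar_hour=10.5))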

  19. Gaze strategies during visually-guided versus memory-guided grasping.

    Science.gov (United States)

    Prime, Steven L; Marotta, Jonathan J

    2013-03-01

    Vision plays a crucial role in guiding motor actions. But sometimes we cannot use vision and must rely on our memory to guide action-e.g. remembering where we placed our eyeglasses on the bedside table when reaching for them with the lights off. Recent studies show subjects look towards the index finger grasp position during visually-guided precision grasping. But, where do people look during memory-guided grasping? Here, we explored the gaze behaviour of subjects as they grasped a centrally placed symmetrical block under open- and closed-loop conditions. In Experiment 1, subjects performed grasps in either a visually-guided task or memory-guided task. The results show that during visually-guided grasping, gaze was first directed towards the index finger's grasp point on the block, suggesting gaze targets future grasp points during the planning of the grasp. Gaze during memory-guided grasping was aimed closer to the blocks' centre of mass from block presentation to the completion of the grasp. In Experiment 2, subjects performed an 'immediate grasping' task in which vision of the block was removed immediately at the onset of the reach. Similar to the visually-guided results from Experiment 1, gaze was primarily directed towards the index finger location. These results support the 2-stream theory of vision in that motor planning with visual feedback at the onset of the movement is driven primarily by real-time visuomotor computations of the dorsal stream, whereas grasping remembered objects without visual feedback is driven primarily by the perceptual memory representations mediated by the ventral stream.

  20. A CONCEPT OF SOLAR TRACKER SYSTEM DESIGN

    OpenAIRE

    Meita Rumbayan *, Muhamad Dwisnanto Putro

    2017-01-01

    Improving solar panel efficiency is an ongoing area of research. Maximizing output power by integrating a solar tracker system with the panel has become a point of interest. This paper presents a concept for designing a solar tracker system applied to a solar panel. The development of the tracker design consists of display prototype design, hardware design, and algorithm design. This concept is useful as the control system for a solar tracker to improve...
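
    The abstract does not give the tracking algorithm itself; a generic single-axis, light-sensor-driven control step of the kind such designs often use is sketched below, with hypothetical sensor readings, deadband and step size.

        # Generic single-axis tracking step (illustrative, not the authors' code).
        def track_step(ldr_east, ldr_west, deadband=0.05, step_deg=1.0):
            """Return the panel rotation in degrees for one control cycle."""
            error = ldr_east - ldr_west        # normalised light-sensor difference
            if abs(error) <= deadband:         # close enough to the sun: hold position
                return 0.0
            return step_deg if error > 0 else -step_deg

        # One simulated cycle: the east sensor sees more light, so rotate east.
        print(track_step(ldr_east=0.82, ldr_west=0.64))   # -> 1.0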

  1. Flexible coordination of stationary and mobile conversations with gaze: Resource allocation among multiple joint activities

    Directory of Open Access Journals (Sweden)

    Eric Mayor

    2016-10-01

    Gaze is instrumental in coordinating face-to-face social interactions, but little is known about gaze use when social interactions co-occur with other joint activities. We investigated the case of walking while talking. We assessed how gaze gets allocated among various targets in mobile conversations, whether allocation of gaze to other targets affects conversational coordination, and whether reduced availability of gaze for conversational coordination affects conversational performance and content. In an experimental study, pairs were videotaped in four conditions of mobility (standing still, talking while walking along a straight-line itinerary, talking while walking along a complex itinerary, or walking along a complex itinerary with no conversational task). Gaze to partners was substantially reduced in mobile conversations, but gaze was still used to coordinate conversation via displays of mutual orientation, and conversational performance and content were not different between stationary and mobile conditions. The results extend the phenomenon of multitasking to joint activities.

  2. Gaze direction effects on perceptions of upper limb kinesthetic coordinate system axes.

    Science.gov (United States)

    Darling, W G; Hondzinski, J M; Harper, J G

    2000-12-01

    The effects of varying gaze direction on perceptions of the upper limb kinesthetic coordinate system axes and of the median plane location were studied in nine subjects with no history of neuromuscular disorders. In two experiments, six subjects aligned the unseen forearm to the trunk-fixed anterior-posterior (a/p) axis and earth-fixed vertical while gazing at different visual targets using either head or eye motion to vary gaze direction in different conditions. Effects of support of the upper limb on perceptual errors were also tested in different conditions. Absolute constant errors and variable errors associated with forearm alignment to the trunk-fixed a/p axis and earth-fixed vertical were similar for different gaze directions whether the head or eyes were moved to control gaze direction. Such errors were decreased by support of the upper limb when aligning to the vertical but not when aligning to the a/p axis. Regression analysis showed that single trial errors in individual subjects were poorly correlated with gaze direction, but showed a dependence on shoulder angles for alignment to both axes. Thus, changes in position of the head and eyes do not influence perceptions of upper limb kinesthetic coordinate system axes. However, dependence of the errors on arm configuration suggests that such perceptions are generated from sensations of shoulder and elbow joint angle information. In a third experiment, perceptions of median plane location were tested by instructing four subjects to place the unseen right index fingertip directly in front of the sternum either by motion of the straight arm at the shoulder or by elbow flexion/extension with shoulder angle varied. Gaze angles were varied to the right and left by 0.5 radians to determine effects of gaze direction on such perceptions. These tasks were also carried out with subjects blind-folded and head orientation varied to test for effects of head orientation on perceptions of median plane location. Constant

  3. Track classification within wireless sensor network

    Science.gov (United States)

    Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic

    2017-05-01

    In this paper, we present our study on track classification taking into account environmental information and estimated target states. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle and SUAV, i.e. small unmanned aerial vehicle) and works in a centralized architecture. The main idea is to explore both the classification given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, assigns a class to each track according to track location, velocity and the associated uncertainty. To model the likelihood of each class, a fuzzy approach is used, considering constraints on the target's capability to move in the environment. The evidential reasoning approach based on Dempster-Shafer Theory (DST) is then used to perform a time integration of this classifier output. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of this system is evaluated in a real intelligence-operation exercise (a "hunter hunt" scenario).
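
    The combination step referred to here is Dempster's rule from Dempster-Shafer Theory. The sketch below implements the textbook rule for two mass functions over the three classes mentioned (pedestrian, ground vehicle, SUAV); the masses are invented and this is not the paper's fusion module.

        # Textbook Dempster's rule of combination for two belief sources.
        from itertools import product

        FRAME = frozenset({"pedestrian", "vehicle", "suav"})

        def combine(m1, m2):
            fused, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    fused[inter] = fused.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            return {s: w / (1.0 - conflict) for s, w in fused.items()}

        # Source 1: a sensor classifier; source 2: a fuzzy motion/terrain constraint.
        m_sensor = {frozenset({"vehicle"}): 0.6,
                    frozenset({"pedestrian", "vehicle"}): 0.3,
                    FRAME: 0.1}
        m_motion = {frozenset({"vehicle", "suav"}): 0.7, FRAME: 0.3}
        print(combine(m_sensor, m_motion))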

  4. The Eyes Are the Windows to the Mind: Direct Eye Gaze Triggers the Ascription of Others' Minds.

    Science.gov (United States)

    Khalid, Saara; Deska, Jason C; Hugenberg, Kurt

    2016-12-01

    Eye gaze is a potent source of social information with direct eye gaze signaling the desire to approach and averted eye gaze signaling avoidance. In the current work, we proposed that eye gaze signals whether or not to impute minds into others. Across four studies, we manipulated targets' eye gaze (i.e., direct vs. averted eye gaze) and measured explicit mind ascriptions (Study 1a, Study 1b, and Study 2) and beliefs about the likelihood of targets having mind (Study 3). In all four studies, we find novel evidence that the ascription of sophisticated humanlike minds to others is signaled by the display of direct eye gaze relative to averted eye gaze. Moreover, we provide evidence suggesting that this differential mentalization is due, at least in part, to beliefs that direct gaze targets are more likely to instigate social interaction. In short, eye contact triggers mind perception.

  5. Gaze training enhances laparoscopic technical skill acquisition and multi-tasking performance: a randomized, controlled study.

    Science.gov (United States)

    Wilson, Mark R; Vine, Samuel J; Bright, Elizabeth; Masters, Rich S W; Defriend, David; McGrath, John S

    2011-12-01

    The operating room environment is replete with stressors and distractions that increase the attention demands of what are already complex psychomotor procedures. Contemporary research in other fields (e.g., sport) has revealed that gaze training interventions may support the development of robust movement skills. This current study was designed to examine the utility of gaze training for technical laparoscopic skills and to test performance under multitasking conditions. Thirty medical trainees with no laparoscopic experience were divided randomly into one of three treatment groups: gaze trained (GAZE), movement trained (MOVE), and discovery learning/control (DISCOVERY). Participants were fitted with a Mobile Eye gaze registration system, which measures eye-line of gaze at 25 Hz. Training consisted of ten repetitions of the "eye-hand coordination" task from the LAP Mentor VR laparoscopic surgical simulator while receiving instruction and video feedback (specific to each treatment condition). After training, all participants completed a control test (designed to assess learning) and a multitasking transfer test, in which they completed the procedure while performing a concurrent tone counting task. Not only did the GAZE group learn more quickly than the MOVE and DISCOVERY groups (faster completion times in the control test), but the performance difference was even more pronounced when multitasking. Differences in gaze control (target locking fixations), rather than tool movement measures (tool path length), underpinned this performance advantage for GAZE training. These results suggest that although the GAZE intervention focused on training gaze behavior only, there were indirect benefits for movement behaviors and performance efficiency. Additionally, focusing on a single external target when learning, rather than on complex movement patterns, may have freed-up attentional resources that could be applied to concurrent cognitive tasks.

  6. TRACKER

    CERN Multimedia

    K. Gill

    2010-01-01

    The Tracker has continued to operate with excellent performance during this first period with 7 TeV collisions. Strips operations have been very smooth. The up-time during collisions was 98.5% up to the end of May, with a large fraction of the down-time coming during the planned fine-timing scan with early 7 TeV collisions. Pixels operations are also going very well, apart from problems related to background beam-gas collisions, where the particles produced generate very large clusters in the barrel modules. When CMS triggers on these events, the affected FEDs overflow and then time out. Effort was mobilised very quickly to understand and mitigate this problem, with modifications made to the pixel FED firmware in order to provide automatic recovery. With operations becoming more and more routine at P5, Pixels have begun the transition to centrally attended operation, which means that the P5 shifters will no longer be required to be on duty. The strip-Tracker is also planning to make this transition at the end of Ju...

  7. TRACKER

    CERN Document Server

    Bora Akgun

    2013-01-01

    Pixel Tracker Maintenance of the Pixel Tracker has been ongoing since it was extracted from inside CMS and safely stored at low temperatures in Pixel laboratory at Point 5 (see previous Bulletin).    All four half cylinders of the forward Pixel detector (FPIX) have been repaired and the failures have been understood. In October, a team of technicians from Fermilab replaced a total of three panels that were not repairable in place. The replacement of panels is a delicate operation that involves removing the half disks that hold the panels from the half cylinders, removing the damaged panels from the half disks, installing the new panels on the half disks, and finally putting the half disks back into the half cylinders and hooking up the cooling connections. The work was completed successfully. The same team also prepared the installation of the Phase 1 Pixel pilot blade system, installing a third half disk mechanics in the half cylinders; these half disks will host new Phase 1 P...

  8. Scintillating fibre (SciFi) tracker

    CERN Multimedia

    Caraban Gonzalez, Noemi

    2017-01-01

    128 modules – containing 11 000 km of scintillating fibres – will make up the new SciFi tracker, which will replace the outer and inner trackers of the LHCb detector as part of the experiment’s major upgrade during Long Shutdown 2 (LS2)

  9. The MICE scintillating-fibre tracker

    Energy Technology Data Exchange (ETDEWEB)

    Matsushita, T [Imperial College London (United Kingdom)], E-mail: T.Matsushita@imperial.ac.uk

    2008-06-15

    The international Muon Ionization Cooling Experiment (MICE) collaboration will carry out a systematic investigation of the ionization cooling of a muon beam. An ionization cooling channel is required to compress the phase-space volume occupied by the muon beam prior to acceleration in the baseline conceptual designs for both the Neutrino Factory and the Muon Collider. Muons entering and leaving the cooling channel will be measured in two solenoidal spectrometers, each of which is instrumented with a scintillating-fibre tracker. Each tracker is composed of five planar scintillating fibre stations, each station being composed of three planar layers of 350 micron scintillating fibres. The devices will be read out using the Visible Light Photon Counters (VLPCs) developed for use in the D0 experiment at the Tevatron. The design of the system will be presented along with the status of the tracker-construction project. The expected performance of prototypes of the full tracker will be summarised.

  10. Towards gaze-controlled platform games

    DEFF Research Database (Denmark)

    Muñoz, Jorge; Yannakakis, Georgios N.; Mulvey, Fiona

    2011-01-01

    This paper introduces the concept of using gaze as a sole modality for fully controlling player characters of fast-paced action computer games. A user experiment is devised to collect gaze and gameplay data from subjects playing a version of the popular Super Mario Bros platform game. The initial... analysis shows that there is a rather limited grid around Mario where the efficient player focuses her attention the most while playing the game. The useful grid, as we name it, projects the amount of meaningful visual information a designer should use towards creating successful player character... controllers with the use of artificial intelligence for a platform game like Super Mario. Information about the eyes' position on the screen and the state of the game are utilized as inputs to an artificial neural network, which is trained to approximate which keyboard action is to be performed at each game...
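
    The abstract describes a network that maps gaze position and game state to a keyboard action. The sketch below illustrates that idea with a small scikit-learn MLP trained on synthetic data; the features, labelling rule and architecture are placeholders, not those of the study.

        # Illustrative controller sketch: gaze position + game state -> key action.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(42)
        n = 2000
        gaze_xy = rng.uniform(0, 1, size=(n, 2))       # normalised gaze on screen
        game_state = rng.uniform(0, 1, size=(n, 3))    # e.g. Mario x, y, speed
        X = np.hstack([gaze_xy, game_state])

        # Toy labelling rule standing in for recorded keypresses: 0=left, 1=right, 2=jump.
        y = np.where(gaze_xy[:, 1] < 0.3, 2,
                     (gaze_xy[:, 0] > game_state[:, 0]).astype(int))

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
        clf.fit(X, y)
        print(clf.predict(X[:5]), round(clf.score(X, y), 3))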

  11. Gaze recognition in high-functioning autistic patients. Evidence from functional MRI

    International Nuclear Information System (INIS)

    Takebayashi, Hiroko; Ogai, Masahiro; Matsumoto, Hideo

    2006-01-01

    We examined whether patients with high-functioning autistic disorder (AD) would exhibit abnormal activation in brain regions implicated in the functioning of theory of mind (TOM) during gaze recognition. We investigated brain activity during gaze recognition in 5 patients with high-functioning AD and 9 normal subjects, using functional magnetic resonance imaging. On the gaze task, more activation was found in the left middle frontal gyrus, the right intraparietal sulcus, and the precentral and inferior parietal gyri bilaterally in controls than in AD patients, whereas the patient group showed more powerful signal changes in the left superior temporal gyrus, the right insula, and the right medial frontal gyrus. These results suggest that high-functioning AD patients have functional abnormalities not only in TOM-related brain regions, but also in widely distributed brain regions that are not normally activated upon the processing of information from another person's gaze. (author)

  12. Clinician's gaze behaviour in simulated paediatric emergencies.

    Science.gov (United States)

    McNaughten, Ben; Hart, Caroline; Gallagher, Stephen; Junk, Carol; Coulter, Patricia; Thompson, Andrew; Bourke, Thomas

    2018-03-07

    Differences in the gaze behaviour of experts and novices are described in aviation and surgery. This study sought to describe the gaze behaviour of clinicians from different training backgrounds during a simulated paediatric emergency. Clinicians from four clinical areas undertook a simulated emergency. Participants wore SMI (SensoMotoric Instruments) eye tracking glasses. We measured the fixation count and dwell time on predefined areas of interest and the time taken to deliver key clinical interventions. Paediatric intensive care unit (PICU) consultants performed best and focused longer on the chest and airway. Paediatric consultants and trainees spent longer looking at the defibrillator and algorithm (51 180 ms and 50 551 ms, respectively) than the PICU and paediatric emergency medicine consultants. This study is the first to describe differences in gaze behaviour between experts and novices during a resuscitation; these differences mirror those described in aviation and surgery. Further research is needed to evaluate the potential use of eye tracking as an educational tool.

  13. Social evolution. Oxytocin-gaze positive loop and the coevolution of human-dog bonds.

    Science.gov (United States)

    Nagasawa, Miho; Mitsui, Shouhei; En, Shiori; Ohtani, Nobuyo; Ohta, Mitsuaki; Sakuma, Yasuo; Onaka, Tatsushi; Mogi, Kazutaka; Kikusui, Takefumi

    2015-04-17

    Human-like modes of communication, including mutual gaze, in dogs may have been acquired during domestication with humans. We show that gazing behavior from dogs, but not wolves, increased urinary oxytocin concentrations in owners, which consequently facilitated owners' affiliation and increased oxytocin concentration in dogs. Further, nasally administered oxytocin increased gazing behavior in dogs, which in turn increased urinary oxytocin concentrations in owners. These findings support the existence of an interspecies oxytocin-mediated positive loop facilitated and modulated by gazing, which may have supported the coevolution of human-dog bonding by engaging common modes of communicating social attachment.

  14. Silicon Tracker Design for the ILC

    International Nuclear Information System (INIS)

    Nelson, T.; SLAC

    2005-01-01

    The task of tracking charged particles in energy frontier collider experiments has been largely taken over by solid-state detectors. While silicon microstrip trackers offer many advantages in this environment, large silicon trackers are generally much more massive than their gaseous counterparts. Because of the properties of the machine itself, much of the material that comprises a typical silicon microstrip tracker can be eliminated from a design for the ILC. This realization is the inspiration for a tracker design using lightweight, short, mass-producible modules to tile closed, nested cylinders with silicon microstrips. This design relies upon a few key technologies to provide excellent performance with low cost and complexity. The details of this concept are discussed, along with the performance and status of the design effort

  15. Between Gazes

    DEFF Research Database (Denmark)

    Elias, Camelia

    2009-01-01

    In the film documentary Zizek! (2006) Astra Taylor, the film’s director, introduces Slavoj Zizek and his central notions of Lacanian psychoanalysis as they tie in with Marxism, ideology, and culture. Apart from following Zizek from New York to his home in Ljubljana, the documentary presents...... delivers his thoughts on philosophy while in bed or in the bathroom. It is clear that one of the devices that the documentary uses in its portrayal of Zizek is the palimpsest, and what is being layered is the gaze. My essay introduces the idea of layering as a case of intermediality between different art...

  16. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor.

    Science.gov (United States)

    Naqvi, Rizwan Ali; Arsalan, Muhammad; Batchuluun, Ganbayar; Yoon, Hyo Sik; Park, Kang Ryoung

    2018-02-03

    A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error of accurately detecting the pupil center and corneal reflection center is increased in a car environment due to various environment light changes, reflections on glasses surface, and motion and optical blurring of captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor considering driver head and eye movement that does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than the previous gaze classification methods.
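
    For context, the PCCR approach contrasted here uses the vector from the corneal-reflection (glint) centre to the pupil centre as a gaze feature that is fairly tolerant of small head translations, mapped to a gaze region after calibration. The toy sketch below illustrates only that feature and a crude region decision; the coordinates and threshold are invented, and this is not the paper's deep-learning system.

        # Toy PCCR-feature sketch: pupil-centre-to-glint vector -> crude gaze region.
        import numpy as np

        def pccr_vector(pupil_center, glint_center):
            return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

        def gaze_region(vec, threshold=3.0):
            """Crude left/centre/right decision from the horizontal component."""
            if vec[0] < -threshold:
                return "left mirror"
            if vec[0] > threshold:
                return "right mirror"
            return "windshield"

        v = pccr_vector(pupil_center=(161.0, 118.5), glint_center=(155.2, 120.0))
        print(v, gaze_region(v))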

  17. Race perception and gaze direction differently impair visual working memory for faces: An event-related potential study.

    Science.gov (United States)

    Sessa, Paola; Dalmaso, Mario

    2016-01-01

    Humans are remarkably expert at processing and recognizing faces; however, there are factors that moderate this ability. In the present study, we used the event-related potential technique to investigate the influence of both race and gaze direction on visual working memory (VWM) face representations. In a change detection task, we orthogonally manipulated race (own-race vs. other-race faces) and eye-gaze direction (direct gaze vs. averted gaze). Participants were required to encode the identities of these faces. We quantified the amount of information encoded in VWM by monitoring the amplitude of the sustained posterior contralateral negativity (SPCN) time-locked to the faces. Notably, race and eye-gaze direction differently modulated SPCN amplitude, such that other-race faces elicited reduced SPCN amplitudes compared with own-race faces only when displaying a direct gaze. On the other hand, faces displaying averted gaze, independently of their race, elicited increased SPCN amplitudes compared with faces displaying direct gaze. We interpret these findings as indicating that race and eye-gaze direction affect different face processing stages.

  18. Aversive eye gaze during a speech in virtual environment in patients with social anxiety disorder.

    Science.gov (United States)

    Kim, Haena; Shin, Jung Eun; Hong, Yeon-Ju; Shin, Yu-Bin; Shin, Young Seok; Han, Kiwan; Kim, Jae-Jin; Choi, Soo-Hee

    2018-03-01

    One of the main characteristics of social anxiety disorder is excessive fear of social evaluation. In such situations, anxiety can influence gaze behaviour. Thus, the current study adopted virtual reality to examine the eye gaze patterns of social anxiety disorder patients while they presented different types of speeches. A total of 79 social anxiety disorder patients and 51 healthy controls presented prepared speeches on general topics and impromptu speeches on self-related topics to a virtual audience while their eye gaze was recorded. Their presentation performance was also evaluated. Overall, social anxiety disorder patients showed less eye gaze towards the audience than healthy controls. Type of speech did not influence social anxiety disorder patients' gaze allocation towards the audience. However, patients with social anxiety disorder showed significant correlations between the amount of eye gaze towards the audience while presenting self-related speeches and social anxiety cognitions. The current study confirms that the eye gaze behaviour of social anxiety disorder patients is aversive and that their anxiety symptoms are more dependent on the nature of the topic.

  19. CMS tracker slides into centre stage

    CERN Document Server

    2006-01-01

    As preparations for the magnet test and cosmic challenge get underway, a prototype tracker has been carefully inserted into the centre of CMS. The tracker, in its special platform, is slowly inserted into the centre of CMS. The CMS prototype tracker to be used for the magnet test and cosmic challenge coming up this summer has the same dimensions (2.5 m in diameter and 6 m in length) as the real one and tooling exactly like it. However, the support tube is only about 1% equipped, with 2 m² of silicon detectors installed out of the total 200 m². This is already more than any LEP experiment ever used and indicates the great care needed to be taken by engineers and technicians as these fragile detectors were installed and transported to Point 5. Sixteen thousand silicon detectors with a total of about 10 million strips will make up the full tracker. So far, 140 modules with about 100 000 strips have been implanted into the prototype tracker. These silicon strips will provide precision tracking for cosmic muon...

  20. Gaze Step Distributions Reflect Fixations and Saccades: A Comment on Stephen and Mirman (2010)

    Science.gov (United States)

    Bogartz, Richard S.; Staub, Adrian

    2012-01-01

    In three experimental tasks Stephen and Mirman (2010) measured gaze steps, the distance in pixels between gaze positions on successive samples from an eyetracker. They argued that the distribution of gaze steps is best fit by the lognormal distribution, and based on this analysis they concluded that interactive cognitive processes underlie eye…
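
    To make the quantity under discussion concrete, the sketch below computes gaze steps from simulated eye-tracker samples, fits a lognormal with SciPy, and applies a simple threshold split into fixation-like and saccade-like steps to illustrate the two-component reading argued for in the comment; the simulation and threshold are arbitrary, not the original data or analysis.

        # Gaze-step sketch on simulated data: lognormal fit plus a threshold split.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        jitter = rng.normal(0, 0.5, size=(5000, 2))              # fixational jitter
        saccades = rng.normal(0, 40, size=(5000, 2)) * (rng.random((5000, 1)) < 0.03)
        gaze = np.cumsum(jitter + saccades, axis=0)              # simulated samples

        steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)    # successive distances
        steps = steps[steps > 0]

        shape, loc, scale = stats.lognorm.fit(steps, floc=0)
        print(f"lognormal shape={shape:.2f}, scale={scale:.2f}")

        threshold = 10.0                                         # pixels, arbitrary
        print("fixation-like:", np.mean(steps < threshold),
              "saccade-like:", np.mean(steps >= threshold))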

  1. Gaze-Contingent Music Reward Therapy for Social Anxiety Disorder: A Randomized Controlled Trial.

    Science.gov (United States)

    Lazarov, Amit; Pine, Daniel S; Bar-Haim, Yair

    2017-07-01

    Patients with social anxiety disorder exhibit increased attentional dwelling on social threats, providing a viable target for therapeutics. This randomized controlled trial examined the efficacy of a novel gaze-contingent music reward therapy for social anxiety disorder designed to reduce attention dwelling on threats. Forty patients with social anxiety disorder were randomly assigned to eight sessions of either gaze-contingent music reward therapy, designed to divert patients' gaze toward neutral stimuli rather than threat stimuli, or to a control condition. Clinician and self-report measures of social anxiety were acquired pretreatment, posttreatment, and at 3-month follow-up. Dwell time on socially threatening faces was assessed during the training sessions and at pre- and posttreatment. Gaze-contingent music reward therapy yielded greater reductions of symptoms of social anxiety disorder than the control condition on both clinician-rated and self-reported measures. Therapeutic effects were maintained at follow-up. Gaze-contingent music reward therapy, but not the control condition, also reduced dwell time on threat, which partially mediated clinical effects. Finally, gaze-contingent music reward therapy, but not the control condition, also altered dwell time on socially threatening faces not used in training, reflecting near-transfer training generalization. This is the first randomized controlled trial to examine a gaze-contingent intervention in social anxiety disorder. The results demonstrate target engagement and clinical effects. This study sets the stage for larger randomized controlled trials and testing in other emotional disorders.

  2. Gaze-based assistive technology in daily activities in children with severe physical impairments-An intervention study.

    Science.gov (United States)

    Borgestig, Maria; Sandqvist, Jan; Ahlsten, Gunnar; Falkmer, Torbjörn; Hemmingsson, Helena

    2017-04-01

    To establish the impact of a gaze-based assistive technology (AT) intervention on activity repertoire, autonomous use, and goal attainment in children with severe physical impairments, and to examine parents' satisfaction with the gaze-based AT and with services related to the gaze-based AT intervention. Non-experimental multiple case study with before, after, and follow-up design. Ten children with severe physical impairments without speaking ability (aged 1-15 years) participated in gaze-based AT intervention for 9-10 months, during which period the gaze-based AT was implemented in daily activities. Repertoire of computer activities increased for seven children. All children had sustained usage of gaze-based AT in daily activities at follow-up, all had attained goals, and parents' satisfaction with the AT and with services was high. The gaze-based AT intervention was effective in guiding parents and teachers to continue supporting the children to perform activities with the AT after the intervention program.

  3. Gaze Shift as an Interactional Resource for Very Young Children

    Science.gov (United States)

    Kidwell, Mardi

    2009-01-01

    This article examines how very young children in a day care center make use of their peers' gaze shifts to differentially locate and prepare for the possibility of a caregiver intervention during situations of their biting, hitting, pushing, and the like. At issue is how the visible character of a gaze shift--that is, the manner in which it is…

  4. Star trackers for attitude determination

    DEFF Research Database (Denmark)

    Liebe, Carl Christian

    1995-01-01

    One problem is common to all spacecraft that use vector information: determining the attitude. This paper describes how attitude determination instruments have evolved from simple pointing devices into the latest technology, which determines the attitude by utilizing... a CCD camera and a powerful microcomputer. The instruments are called star trackers, and they are capable of determining the attitude with an accuracy better than 1 arcsecond. The concept of the star tracker is explained. The obtainable accuracy is calculated, the numbers of stars to be included... in the star catalogue are discussed and the acquisition of the initial attitude is explained. Finally, the commercial market for star trackers is discussed...
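
    The abstract stops short of the attitude computation itself; a classic minimal approach (not necessarily the one used by any particular tracker) is the TRIAD method, which builds a rotation matrix from two star directions measured in the camera frame and the same stars' catalogue directions. The directions below are invented, and operational trackers match many more stars than two.

        # Minimal TRIAD sketch: attitude matrix from two star direction pairs.
        import numpy as np

        def unit(v):
            v = np.asarray(v, float)
            return v / np.linalg.norm(v)

        def triad(b1, b2, r1, r2):
            """Rotation matrix A such that b is approximately A @ r (body from inertial)."""
            t1b, t1r = unit(b1), unit(r1)
            t2b = unit(np.cross(t1b, unit(b2)))
            t2r = unit(np.cross(t1r, unit(r2)))
            t3b, t3r = np.cross(t1b, t2b), np.cross(t1r, t2r)
            return np.column_stack([t1b, t2b, t3b]) @ np.column_stack([t1r, t2r, t3r]).T

        # Hypothetical catalogue directions and their noise-free body-frame views.
        r1, r2 = unit([1, 0, 0]), unit([0, 1, 0.2])
        true_A = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)  # 90 deg about z
        b1, b2 = true_A @ r1, true_A @ r2
        print(np.allclose(triad(b1, b2, r1, r2), true_A))             # True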

  5. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor

    Directory of Open Access Journals (Sweden)

    Rizwan Ali Naqvi

    2018-02-01

    A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error of accurately detecting the pupil center and corneal reflection center is increased in a car environment due to various environment light changes, reflections on glasses surface, and motion and optical blurring of the captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor considering driver head and eye movement that does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than the previous gaze classification methods.

  6. Investigating gaze-controlled input in a cognitive selection test

    OpenAIRE

    Gayraud, Katja; Hasse, Catrin; Eißfeldt, Hinnerk; Pannasch, Sebastian

    2017-01-01

    In the field of aviation, there is a growing interest in developing more natural forms of interaction between operators and systems to enhance safety and efficiency. These efforts also include eye gaze as an input channel for human-machine interaction. The present study investigates the application of gaze-controlled input in a cognitive selection test called Eye Movement Conflict Detection Test. The test enables eye movements to be studied as an indicator for psychological test performance a...

  7. Gaze-based interaction with public displays using off-the-shelf components

    DEFF Research Database (Denmark)

    San Agustin, Javier; Hansen, John Paulin; Tall, Martin Henrik

    Eye gaze can be used to interact with high-density information presented on large displays. We have built a system employing off-the-shelf hardware components and open-source gaze tracking software that enables users to interact with an interface displayed on a 55” screen using their eye movement...

  8. Gaze Strategies in Skateboard Trick Jumps: Spatiotemporal Constraints in Complex Locomotion

    Science.gov (United States)

    Klostermann, André; Küng, Philip

    2017-01-01

    Purpose: This study aimed to further the knowledge on gaze behavior in locomotion by studying gaze strategies in skateboard jumps of different difficulty that had to be performed either with or without an obstacle. Method: Nine experienced skateboarders performed "Ollie" and "Kickflip" jumps either over an obstacle or over a…

  9. Off-the-Shelf Gaze Interaction

    DEFF Research Database (Denmark)

    San Agustin, Javier

    People with severe motor-skill disabilities are often unable to use standard input devices such as a mouse or a keyboard to control a computer and are, therefore, in strong need of alternative input devices. Gaze tracking offers them the possibility to use the movements of their eyes to int...

  10. LHCb upstream tracker

    CERN Multimedia

    Artuso, Marina

    2016-01-01

    The detector for the LHCb upgrade is designed for 40 MHz readout, allowing the experiment to run at an instantaneous luminosity of 2×10³³ cm⁻²s⁻¹. The upgrade of the tracker subsystem in front of the dipole magnet, the Upstream Tracker, is crucial for charged-track reconstruction and for fast trigger decisions based on a tracking algorithm that also involves vertex detector information. The detector consists of 4 planes with a total area of about 8.5 m², made of single-sided silicon strip sensors read out by a novel custom-made ASIC (SALT). Details on the performance of prototype sensors, front-end electronics, near-detector electronics and mechanical components are presented.

  11. CMS Tracker Visualisation

    CERN Document Server

    Mennea, Maria Santa; Zito, Giuseppe

    2004-01-01

    To improve the performance of the existing tracker data visualisation tools in IGUANA, a 2D visualisation software package has been developed using the object-oriented paradigm and software engineering techniques. We have designed 2D graphics objects, and some of them have been implemented. Access to the new objects is provided through the ORCA plugin of IGUANA CMS. A new object-oriented tracker model has been designed for developing these 2D graphics objects. The model consists of new classes that represent all of its components (layers, modules, rings, petals, rods). The new classes are described here. The last part of this document contains a user manual for the software and will be updated with new releases.

  12. CMS silicon tracker developments

    International Nuclear Information System (INIS)

    Civinini, C.; Albergo, S.; Angarano, M.; Azzi, P.; Babucci, E.; Bacchetta, N.; Bader, A.; Bagliesi, G.; Basti, A.; Biggeri, U.; Bilei, G.M.; Bisello, D.; Boemi, D.; Bosi, F.; Borrello, L.; Bozzi, C.; Braibant, S.; Breuker, H.; Bruzzi, M.; Buffini, A.; Busoni, S.; Candelori, A.; Caner, A.; Castaldi, R.; Castro, A.; Catacchini, E.; Checcucci, B.; Ciampolini, P.; Creanza, D.; D'Alessandro, R.; Da Rold, M.; Demaria, N.; De Palma, M.; Dell'Orso, R.; Della Marina, R.D.R.; Dutta, S.; Eklund, C.; Feld, L.; Fiore, L.; Focardi, E.; French, M.; Freudenreich, K.; Frey, A.; Fuertjes, A.; Giassi, A.; Giorgi, M.; Giraldo, A.; Glessing, B.; Gu, W.H.; Hall, G.; Hammarstrom, R.; Hebbeker, T.; Honma, A.; Hrubec, J.; Huhtinen, M.; Kaminsky, A.; Karimaki, V.; Koenig, St.; Krammer, M.; Lariccia, P.; Lenzi, M.; Loreti, M.; Luebelsmeyer, K.; Lustermann, W.; Maettig, P.; Maggi, G.; Mannelli, M.; Mantovani, G.; Marchioro, A.; Mariotti, C.; Martignon, G.; Evoy, B. Mc; Meschini, M.; Messineo, A.; Migliore, E.; My, S.; Paccagnella, A.; Palla, F.; Pandoulas, D.; Papi, A.; Parrini, G.; Passeri, D.; Pieri, M.; Piperov, S.; Potenza, R.; Radicci, V.; Raffaelli, F.; Raymond, M.; Santocchia, A.; Schmitt, B.; Selvaggi, G.; Servoli, L.; Sguazzoni, G.; Siedling, R.; Silvestris, L.; Starodumov, A.; Stavitski, I.; Stefanini, G.; Surrow, B.; Tempesta, P.; Tonelli, G.; Tricomi, A.; Tuuva, T.; Vannini, C.; Verdini, P.G.; Viertel, G.; Xie, Z.; Yahong, Li; Watts, S.; Wittmer, B.

    2002-01-01

    The CMS Silicon tracker consists of 70 m$^2$ of microstrip sensors whose design will be finalized at the end of 1999 on the basis of systematic studies of device characteristics as a function of the most important parameters. A fundamental constraint comes from the fact that the detector has to be operated with full efficiency in a very hostile radiation environment. We present an overview of the current results and the prospects for converging on a final set of parameters for the silicon tracker sensors.

  13. Model of CMS Tracker

    CERN Multimedia

    Breuker

    1999-01-01

    A full scale CMS tracker mock-up exposed temporarily in the hall of building 40. The purpose of the mock-up is to study the routing of services, assembly and installation. The people in front are only a small fraction of the CMS tracker collaboration. Left to right : M. Atac, R. Castaldi, H. Breuker, D. Pandoulas,P. Petagna, A. Caner, A. Carraro, H. Postema, M. Oriunno, S. da Mota Silva, L. Van Lancker, W. Glessing, G. Benefice, A. Onnela, M. Gaspar, G. M. Bilei

  14. Facial Expressions Modulate the Ontogenetic Trajectory of Gaze-Following among Monkeys

    Science.gov (United States)

    Teufel, Christoph; Gutmann, Anke; Pirow, Ralph; Fischer, Julia

    2010-01-01

    Gaze-following, the tendency to direct one's attention to locations looked at by others, is a crucial aspect of social cognition in human and nonhuman primates. Whereas the development of gaze-following has been intensely studied in human infants, its early ontogeny in nonhuman primates has received little attention. Combining longitudinal and…

  15. ColorTracker

    NARCIS (Netherlands)

    Holzheu, Stefanie; Lee, S.; Herneoja, Aulikki; Österlund, Toni; Markkanen, Piia

    2016-01-01

    With the work-in-progress research project ColorTracker we explore color as a formal design tool. This project-based paper describes a novel software application that processes color composition of a place and transcribes the data into three-dimensional geometries for architectural design. The

  16. Photographic but not line-drawn faces show early perceptual neural sensitivity to eye gaze direction

    Directory of Open Access Journals (Sweden)

    Alejandra eRossi

    2015-04-01

    Full Text Available Our brains readily decode facial movements and changes in social attention, reflected in earlier and larger N170 event-related potentials (ERPs) to viewing gaze aversions vs. direct gaze in real faces (Puce et al., 2000). In contrast, gaze aversions in line-drawn faces do not produce these N170 differences (Rossi et al., 2014), suggesting that physical stimulus properties or experimental context may drive these effects. Here we investigated the role of stimulus-induced context on neurophysiological responses to dynamic gaze. Sixteen healthy adults viewed line-drawn and real faces, with dynamic eye aversion and direct gaze transitions, and control stimuli (scrambled arrays and checkerboards) while continuous electroencephalographic (EEG) activity was recorded. EEG data from two temporo-occipital clusters of 9 electrodes in each hemisphere, where N170 activity is known to be maximal, were selected for analysis. N170 peak amplitude and latency, and temporal dynamics from event-related spectral perturbations (ERSPs), were measured. Real faces generated larger N170s for averted vs. direct gaze motion; however, N170s to real faces and direct gaze were as large as those to the respective controls. N170 amplitude did not differ across line-drawn gaze changes. Overall, bilateral mean gamma power changes for faces relative to control stimuli occurred between 150-350 ms, potentially reflecting signal detection of facial motion. Our data indicate that experimental context does not drive N170 differences to viewed gaze changes. Low-level stimulus properties, such as the high sclera/iris contrast change in real eyes, likely drive the N170 changes to viewed aversive movements.

  17. Gender and facial dominance in gaze cuing: emotional context matters in the eyes that we follow.

    Directory of Open Access Journals (Sweden)

    Garian Ohlsen

    Full Text Available Gaze following is a socio-cognitive process that provides adaptive information about potential threats and opportunities in the individual's environment. The aim of the present study was to investigate the potential interaction between emotional context and facial dominance in gaze following. We used the gaze cue task to induce attention to or away from the location of a target stimulus. In the experiment, the gaze cue belonged either to a (dominant-looking) male face or to a (non-dominant-looking) female face. Critically, prior to the task, individuals were primed with pictures of threat or no threat to induce either a dangerous or a safe environment. Findings revealed that the primed emotional context critically influenced the gaze cuing effect. While a gaze cue from the dominant male face influenced performance in both the threat and no-threat conditions, the gaze cue from the non-dominant female face only influenced performance in the no-threat condition. This research suggests an implicit, context-dependent follower bias, which carries implications for research on visual attention, social cognition, and leadership.

  18. NIR tracking assists sports medicine in junior basketball training

    Science.gov (United States)

    Paeglis, Roberts; Bluss, Kristaps; Rudzitis, Andris; Spunde, Andris; Brice, Tamara; Nitiss, Edgars

    2011-07-01

    We recorded eye movements of eight elite junior basketball players. We hypothesized that a more stable gaze is correlated with a better shot rate. Upon preliminary testing we invited male juniors whose eyes could be reliably tracked in a game situation. To this end, we used a head-mounted, video-based eye tracker. The participants had no record of ocular or other health issues. No significant differences were found between shots made with and without the tracker cap; paired-samples t-tests yielded p = .130 for the far and p = .900 > .050 for the middle range shots. The players made 40 shots from common far and middle range locations, 5 and 4 meters respectively for players aged 14 years. As expected, a statistical correlation was found between gaze fixation duration (in milliseconds) and the far and middle range shot rates, r = .782, p = .03. Notably, juniors who fixated longer before a shot had a more stable fixation, i.e., a lower gaze dispersion (in tracker screen pixels), r = -.786, p = .02. This finding was augmented by the observation that the gaze dispersion while aiming at the basket was smaller (i.e., gaze more stable) in those who were more likely to score. We derived a regression equation linking fixation duration to shot success. We advocate infra-red eye tracking as a means to monitor player selection and training success.
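
    The abstract mentions a regression equation linking fixation duration to shot success but does not report its coefficients. A minimal sketch of fitting such a relationship is shown below, using synthetic (fixation duration, success rate) pairs; the numbers are invented for illustration.

```python
# Hedged sketch: fit a simple linear regression of shot success rate on
# pre-shot fixation duration, in the spirit of the regression equation the
# abstract mentions (the actual coefficients and data are not given there;
# the numbers below are made up for illustration only).
import numpy as np

fixation_ms = np.array([180, 220, 260, 300, 340, 380, 420, 460])     # hypothetical
success_rate = np.array([0.35, 0.38, 0.45, 0.50, 0.55, 0.58, 0.62, 0.64])

slope, intercept = np.polyfit(fixation_ms, success_rate, deg=1)
r = np.corrcoef(fixation_ms, success_rate)[0, 1]

print(f"success_rate ≈ {slope:.4f} * fixation_ms + {intercept:.3f}  (r = {r:.2f})")
```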

  19. Right hemispheric dominance and interhemispheric cooperation in gaze-triggered reflexive shift of attention.

    Science.gov (United States)

    Okada, Takashi; Sato, Wataru; Kubota, Yasutaka; Toichi, Motomi; Murai, Toshiya

    2012-03-01

    The neural substrate for the processing of gaze remains unknown. The aim of the present study was to clarify which hemisphere is dominant in, and whether the two hemispheres cooperate during, the gaze-triggered reflexive shift of attention. Twenty-eight normal subjects were tested. Non-predictive gaze cues were presented in either unilateral or bilateral visual fields. The subjects localized the target as quickly as possible. Reaction times (RTs) were shorter when gaze cues were congruent toward, rather than away from, targets, whichever visual field they were presented in. RTs were shorter for left than for right visual field presentations. RTs in mono-directional bilateral presentations were shorter than those in both left-only and right-only presentations. When bi-directional bilateral cues were presented, RTs were faster when valid cues were presented in the left than in the right visual field. The right hemisphere appears to be dominant, and there is interhemispheric cooperation, in the gaze-triggered reflexive shift of attention. © 2012 The Authors. Psychiatry and Clinical Neurosciences © 2012 Japanese Society of Psychiatry and Neurology.

  20. The effects of social pressure and emotional expression on the cone of gaze in patients with social anxiety disorder.

    Science.gov (United States)

    Harbort, Johannes; Spiegel, Julia; Witthöft, Michael; Hecht, Heiko

    2017-06-01

    Patients with social anxiety disorder suffer from pronounced fears in social situations. As gaze perception is crucial in these situations, we examined which factors influence the range of gaze directions where mutual gaze is experienced (the cone of gaze). The social stimulus was modified by changing the number of people (heads) present and the emotional expression of their faces. Participants completed a psychophysical task, in which they had to adjust the eyes of a virtual head to gaze at the edge of the range where mutual eye-contact was experienced. The number of heads affected the width of the gaze cone: the more heads, the wider the gaze cone. The emotional expression of the virtual head had no consistent effect on the width of the gaze cone, it did however affect the emotional state of the participants. Angry expressions produced the highest arousal values. Highest valence emerged from happy faces, lowest valence from angry faces. These results suggest that the widening of the gaze cone in social anxiety disorder is not primarily mediated by their altered emotional reactivity. Implications for gaze assessment and gaze training in therapeutic contexts are discussed. Due to interindividual variability, enlarged gaze cones are not necessarily indicative of social anxiety disorder, they merely constitute a correlate at the group level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Laser tracker error determination using a network measurement

    International Nuclear Information System (INIS)

    Hughes, Ben; Forbes, Alistair; Lewis, Andrew; Sun, Wenjuan; Veal, Dan; Nasr, Karim

    2011-01-01

    We report on a fast, easily implemented method to determine all the geometrical alignment errors of a laser tracker, to high precision. The technique requires no specialist equipment and can be performed in less than an hour. The technique is based on the determination of parameters of a geometric model of the laser tracker, using measurements of a set of fixed target locations, from multiple locations of the tracker. After fitting of the model parameters to the observed data, the model can be used to perform error correction of the raw laser tracker data or to derive correction parameters in the format of the tracker manufacturer's internal error map. In addition to determination of the model parameters, the method also determines the uncertainties and correlations associated with the parameters. We have tested the technique on a commercial laser tracker in the following way. We disabled the tracker's internal error compensation, and used a five-position, fifteen-target network to estimate all the geometric errors of the instrument. Using the error map generated from this network test, the tracker was able to pass a full performance validation test, conducted according to a recognized specification standard (ASME B89.4.19-2006). We conclude that the error correction determined from the network test is as effective as the manufacturer's own error correction methodologies
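
    As a rough illustration of the network-measurement idea, the sketch below fits a single instrument error parameter (a constant range offset) together with unknown station positions to synthetic distance observations of fixed targets taken from several tracker locations. It is a toy model, not the authors' full geometric error model of the laser tracker.

```python
# Minimal sketch (a toy version of the network-measurement idea, not the
# authors' full geometric model): estimate one instrument error parameter
# (a constant range offset) together with the unknown tracker station
# positions, from distance measurements to a set of fixed targets taken
# from several stations. All numbers are synthetic.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
targets = rng.uniform(-5, 5, size=(15, 3))          # 15 fixed targets (known in this toy)
stations_true = rng.uniform(-2, 2, size=(5, 3))     # 5 tracker locations (unknown)
range_offset_true = 0.003                            # 3 mm range error to recover

def measured_ranges(stations, offset):
    d = np.linalg.norm(stations[:, None, :] - targets[None, :, :], axis=2)
    return d + offset

obs = measured_ranges(stations_true, range_offset_true)
obs += rng.normal(0, 1e-5, obs.shape)                # measurement noise

def residuals(p):
    offset, stations = p[0], p[1:].reshape(5, 3)
    return (measured_ranges(stations, offset) - obs).ravel()

p0 = np.concatenate([[0.0], stations_true.ravel() + 0.05])   # rough starting guess
fit = least_squares(residuals, p0)
print("estimated range offset:", fit.x[0])
```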

  2. Eye-gaze patterns as students study worked-out examples in mechanics

    Directory of Open Access Journals (Sweden)

    Brian H. Ross

    2010-10-01

    Full Text Available This study explores what introductory physics students actually look at when studying worked-out examples. Our classroom experiences indicate that introductory physics students neither discuss nor refer to the conceptual information contained in the text of worked-out examples. This study is an effort to determine to what extent students incorporate the textual information into the way they study. Student eye-gaze patterns were recorded as they studied the examples to aid them in solving a target problem. Contrary to our expectations from classroom interactions, students spent 40±3% of their gaze time reading the textual information. Their gaze patterns were also characterized by numerous jumps between corresponding mathematical and textual information, implying that they were combining information from both sources. Despite this large fraction of time spent reading the text, student recall of the conceptual information contained therein remained very poor. We also found that having a particular problem in mind had no significant effects on the gaze-patterns or conceptual information retention.

  3. MEG evidence for dynamic amygdala modulations by gaze and facial emotions.

    Directory of Open Access Journals (Sweden)

    Thibaud Dumas

    Full Text Available Amygdala is a key brain region for face perception. While the role of the amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310-350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective for fearful faces in the right amygdala. Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception.

  4. Social eye gaze modulates processing of speech and co-speech gesture.

    Science.gov (United States)

    Holler, Judith; Schubotz, Louise; Kelly, Spencer; Hagoort, Peter; Schuetze, Manuela; Özyürek, Aslı

    2014-12-01

    In human face-to-face communication, language comprehension is a multi-modal, situated activity. However, little is known about how we combine information from different modalities during comprehension, and how perceived communicative intentions, often signaled through visual signals, influence this process. We explored this question by simulating a multi-party communication context in which a speaker alternated her gaze between two recipients. Participants viewed speech-only or speech+gesture object-related messages when being addressed (direct gaze) or unaddressed (gaze averted to other participant). They were then asked to choose which of two object images matched the speaker's preceding message. Unaddressed recipients responded significantly more slowly than addressees for speech-only utterances. However, perceiving the same speech accompanied by gestures sped unaddressed recipients up to a level identical to that of addressees. That is, when unaddressed recipients' speech processing suffers, gestures can enhance the comprehension of a speaker's message. We discuss our findings with respect to two hypotheses attempting to account for how social eye gaze may modulate multi-modal language comprehension. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. The CMS Tracker upgrade for HL-LHC

    CERN Document Server

    Ahuja, Sudha

    2017-01-01

    The LHC machine is planning an upgrade program which will smoothly bring the luminosity to about 5 $\times$ 10$^{34}$ cm$^{-2}$s$^{-1}$ in 2028, to possibly reach an integrated luminosity of 3000 fb$^{-1}$ by the end of 2037. This High Luminosity LHC scenario, HL-LHC, will require a preparation program of the LHC detectors known as the Phase-2 upgrade. The current CMS Outer Tracker, already running beyond design specifications, and the CMS Phase-1 Pixel Detector will not be able to survive HL-LHC radiation conditions, and CMS will need completely new devices in order to fully exploit the demanding operating conditions and the delivered luminosity. The new Outer Tracker should also have trigger capabilities. To achieve such goals, R$\&$D activities are ongoing to explore options both for the Outer Tracker and for the pixel Inner Tracker. Solutions are being developed that would allow the inclusion of tracking information at Level-1. The design choices for the Tracker upgrades are discussed along with some highlights...

  6. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    Science.gov (United States)

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  7. The LHCb Silicon Tracker, first operational results

    CERN Document Server

    Esperante, D; Adeva, B; Gallas, A; Pérez Trigo, E; Rodríguez Pérez, P; Pazos Álvarez, A; Saborido, J; Vàzquez, P; Bay, A; Bettler, M O; Blanc, F; Bressieux, J; Conti, G; Dupertuis, F; Fave, V; Frei, R; Gauvin, N; Haefeli, G; Keune, A; Luisier, J; Muresan, R; Nakada, T; Needham, M; Nicolas, L; Knecht, M; Potterat, C; Schneider, O; Tran, M; Aquines Gutierrez, O; Bauer, C; Britsch, M; Hofmann, W; Maciuc, F; Schmelling, M; Voss, H; Anderson, J; Buechler, A; Bursche, A; Chiapolini, N; de Cian, M; Elsaesser, C; Hangartner, V; Salzmann, C; Steiner, S; Steinkamp, O; Straumann, U; van Tilburg, J; Tobin, M; Vollhardt, A; Iakovenko, V; Okhrimenko, O; Pugatch, V

    2010-01-01

    The Large Hadron Collider beauty (LHCb) experiment at CERN (Conseil Européen pour la Recherche Nucléaire) is designed to perform precision measurements of b quark decays. The LHCb Silicon Tracker consists of two sub-detectors, the Tracker Turicensis and the Inner Tracker, which are built from silicon micro-strip technology. First performance results of both detectors using data from Large Hadron Collider synchronization tests are presented.

  8. Simulation studies for the ATLAS upgrade Strip tracker

    CERN Document Server

    Wang, Jike; The ATLAS collaboration

    2017-01-01

    ATLAS is making extensive efforts towards preparing a detector upgrade for the high-luminosity operations of the LHC (HL-LHC), which will commence operation in ~10 years. The current ATLAS Inner Detector will be replaced by an all-silicon tracker (comprising an inner Pixel tracker and an outer Strip tracker). The software currently used for the new silicon tracker is broadly inherited from that used for LHC Run 1 and 2, but many new developments have been made to better fulfil the future detector and operation requirements. One aspect in particular which will be highlighted is the simulation software for the Strip tracker. The available geometry description software (including the detailed description of all the sensitive elements, the services, etc.) did not allow for accurate modeling of the planned detector design. A range of sensors/layouts for the Strip tracker are being considered and must be studied in detailed simulations in order to assess the performance and ascertain that requirements are met. For...

  9. Coding gaze tracking data with chromatic gradients for VR Exposure Therapy

    DEFF Research Database (Denmark)

    Herbelin, Bruno; Grillon, Helena; De Heras Ciechomski, Pablo

    2007-01-01

    This article presents a simple and intuitive way to represent the eye-tracking data gathered during immersive virtual reality exposure therapy sessions. Eye-tracking technology is used to observe gaze movements during vir- tual reality sessions and the gaze-map chromatic gradient coding allows to...... is fully compatible with different VR exposure systems and provides clinically meaningful data....

  10. Analyzing Virtual Physics Simulations with Tracker

    Science.gov (United States)

    Claessens, Tom

    2017-12-01

    In the physics teaching community, Tracker is well known as a user-friendly open source video analysis software, authored by Douglas Brown. With this tool, the user can trace markers indicated on a video or on stroboscopic photos and perform kinematic analyses. Tracker also includes a data modeling tool that allows one to fit some theoretical equations of motion onto experimentally obtained data. In the field of particle mechanics, Tracker has been effectively used for learning and teaching about projectile motion, "toss up" and free-fall vertical motion, and to explain the principle of mechanical energy conservation. Also, Tracker has been successfully used in rigid body mechanics to interpret the results of experiments with rolling/slipping cylinders and moving rods. In this work, I propose an original method in which Tracker is used to analyze virtual computer simulations created with a physics-based motion solver, instead of analyzing video recording or stroboscopic photos. This could be an interesting approach to study kinematics and dynamics problems in physics education, in particular when there is no or limited access to physical labs. I demonstrate the working method with a typical (but quite challenging) problem in classical mechanics: a slipping/rolling cylinder on a rough surface.
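
    Conceptually, Tracker's data-modeling tool fits a theoretical equation of motion to the tracked marker positions. A minimal stand-alone sketch of the same idea is shown below for a projectile-motion fit; the data are synthetic and the workflow is only an approximation of what Tracker does internally.

```python
# Hedged sketch of what Tracker's "data model" fit does conceptually:
# fit y(t) = y0 + v0*t - 0.5*g*t**2 to marker positions exported from a
# video or simulation. The sample data below are synthetic, not from the paper.
import numpy as np

t = np.linspace(0.0, 1.0, 21)                          # time stamps (s)
y_true = 1.5 + 2.0 * t - 0.5 * 9.81 * t**2             # ideal projectile height (m)
y_meas = y_true + np.random.normal(0, 0.01, t.size)    # add tracking noise

# 2nd-order polynomial fit: coefficients come back as [-g/2, v0, y0]
c2, c1, c0 = np.polyfit(t, y_meas, deg=2)
print(f"y0 ≈ {c0:.2f} m, v0 ≈ {c1:.2f} m/s, g ≈ {-2*c2:.2f} m/s²")
```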

  11. Star Tracker Performance Estimate with IMU

    Science.gov (United States)

    Aretskin-Hariton, Eliot D.; Swank, Aaron J.

    2015-01-01

    A software tool for estimating cross-boresight error of a star tracker combined with an inertial measurement unit (IMU) was developed to support trade studies for the Integrated Radio and Optical Communication project (iROC) at the National Aeronautics and Space Administration Glenn Research Center. Typical laser communication systems, such as the Lunar Laser Communication Demonstration (LLCD) and the Laser Communication Relay Demonstration (LCRD), use a beacon to locate ground stations. iROC is investigating the use of beaconless precision laser pointing to enable laser communication at Mars orbits and beyond. Precision attitude knowledge is essential to the iROC mission to enable high-speed steering of the optical link. The preliminary concept to achieve this precision attitude knowledge is to use star trackers combined with an IMU. The Star Tracker Accuracy (STAcc) software was developed to rapidly assess the capabilities of star tracker and IMU configurations. STAcc determines the overall cross-boresight error of a star tracker with an IMU given the characteristic parameters: quantum efficiency, aperture, apparent star magnitude, exposure time, field of view, photon spread, detector pixels, spacecraft slew rate, maximum stars used for quaternion estimation, and IMU angular random walk. This paper discusses the supporting theory used to construct STAcc, verification of the program and sample results.
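
    The abstract lists the parameters STAcc takes but not its algorithm. Purely as a hedged, generic error-budget sketch (not the STAcc method), one can combine per-star centroid noise, averaged over the stars used for the quaternion estimate, with the IMU angular-random-walk growth between star updates:

```python
# Hedged, generic error-budget sketch (not the STAcc algorithm itself):
# combine per-star centroid noise, averaged over the stars used for the
# quaternion solution, with IMU angular-random-walk growth between star
# updates, via a root-sum-square. All parameter values are illustrative.
import math

def cross_boresight_error(single_star_noise_arcsec: float,
                          n_stars: int,
                          arw_deg_per_rt_hr: float,
                          gap_s: float) -> float:
    """Very rough 1-sigma cross-boresight attitude error in arcseconds."""
    star_term = single_star_noise_arcsec / math.sqrt(n_stars)
    # ARW in deg/sqrt(hr) -> arcsec accumulated over the update gap
    arw_arcsec = arw_deg_per_rt_hr * 3600.0 * math.sqrt(gap_s / 3600.0)
    return math.hypot(star_term, arw_arcsec)

print(cross_boresight_error(single_star_noise_arcsec=10.0, n_stars=25,
                            arw_deg_per_rt_hr=0.01, gap_s=1.0))
```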

  12. The LHCb Silicon Inner Tracker

    International Nuclear Information System (INIS)

    Sievers, P.

    2002-01-01

    A silicon strip detector has been adopted as the baseline technology for the LHCb Inner Tracker system. It consists of nine planar stations covering a cross-shaped area around the LHCb beam pipe. Depending on the final layout of the stations, the sensitive surface of the Inner Tracker will be of the order of 14 m$^2$. Ladders have to be 22 cm long, and the pitch of the sensors should be as large as possible in order to reduce the cost of the readout electronics. Major design criteria are the material budget, a short shaping time and a moderate spatial resolution of about 80 μm. After an introduction to the requirements of the LHCb Inner Tracker, we present a description and characterization of silicon prototype sensors. Laboratory and test beam results are then discussed.

  13. Examining the durability of incidentally learned trust from gaze cues.

    Science.gov (United States)

    Strachan, James W A; Tipper, Steven P

    2017-10-01

    In everyday interactions we find our attention follows the eye gaze of faces around us. As this cueing is so powerful and difficult to inhibit, gaze can therefore be used to facilitate or disrupt visual processing of the environment, and when we experience this we infer information about the trustworthiness of the cueing face. However, to date no studies have investigated how long these impressions last. To explore this we used a gaze-cueing paradigm where faces consistently demonstrated either valid or invalid cueing behaviours. Previous experiments show that valid faces are subsequently rated as more trustworthy than invalid faces. We replicate this effect (Experiment 1) and then include a brief interference task in Experiment 2 between gaze cueing and trustworthiness rating, which weakens but does not completely eliminate the effect. In Experiment 3, we explore whether greater familiarity with the faces improves the durability of trust learning and find that the effect is more resilient with familiar faces. Finally, in Experiment 4, we push this further and show that evidence of trust learning can be seen up to an hour after cueing has ended. Taken together, our results suggest that incidentally learned trust can be durable, especially for faces that deceive.

  14. Reading pleasure : Light in August and the theory of the gendered gaze

    NARCIS (Netherlands)

    Visser

    1997-01-01

    This article discusses various components of the theory of the gendered gaze, in order to construct a framework for an analysis of how William Faulkner understood and fictionally presented the mechanism of the gaze in Light in August. Linking theory and praxis, this article explores the relationship

  15. The CMS all silicon Tracker simulation

    CERN Document Server

    Biasini, Maurizio

    2009-01-01

    The Compact Muon Solenoid (CMS) tracker detector is the world's largest silicon detector, with about 201 m$^2$ of silicon strip detectors and 1 m$^2$ of silicon pixel detectors. It contains 66 million pixels and 10 million individual sensing strips. The quality of the physics analysis is highly correlated with the precision of the Tracker detector simulation, which is written on top of GEANT4 and the CMS object-oriented framework. The hit position resolution in the Tracker detector depends on the ability to correctly model the CMS tracker geometry, the signal digitization and Lorentz drift, the calibration and inefficiency. In order to ensure high performance in track and vertex reconstruction, an accurate knowledge of the material budget is therefore necessary, since the passive materials involved in the readout, cooling or power systems create unwanted effects during particle detection, such as multiple scattering, electron bremsstrahlung and photon conversion. In this paper, we present the CM...

  16. COGAIN2009 - "Gaze interaction for those who want it most"

    DEFF Research Database (Denmark)

    , with substantial amounts of applications to support communication, learning and entertainment already in use. However, there are still some uncertainties about this new technology amongst communication specialists and funding institutions. The 5th COGAIN conference will focus on spreading the experiences of people...... using gaze interaction in their daily life to potential users and specialists who have yet to benefit from it. The theme of the conference is "Gaze interaction for those who want it most". We present a total of 18 papers that have been reviewed and accepted by leading researchers and communication...... specialists. Several papers address gaze-based access to computer applications and several papers focus on environmental control. Previous COGAIN conferences have been a most effective launch pad for original new research ideas. Some of them have since found their way into journals and other conferences...

  17. Gazes and Bodies in the Pornographies of Desire

    Directory of Open Access Journals (Sweden)

    Mirko Lino

    2013-06-01

    Full Text Available The essay proposes a social reading of cinematic pornography using the category of the gaze and its gendered declinations. Starting from Laura Mulvey's famous essay, Visual Pleasure and Narrative Cinema (1975), the centre of the analysis is shifted from traditional narrative cinema to pornographic cinema, considering the latter a model that acts as a counterpoint to the voyeurism of the male gaze that Mulvey identified in certain films by Sternberg and Hitchcock. To demonstrate the counterpointing nature of pornographic cinema, the same staging of a desiring gaze is compared in two very different films, Hitchcock's Rear Window (1954) and the Mitchell brothers' Behind the Green Door (1972). The counterpointing nature of the porn movie can also be traced in the dialogue it establishes with mainstream cinema about the limits of what may be shown in sexual matters. Finally, pornography satisfies the representation of the desire of other gender identities by staging appropriate types of gaze (female gaze and queer gaze), which soon become instruments for a social affirmation of sexual diversity.

  18. Real-time gaze estimation via pupil center tracking

    Directory of Open Access Journals (Sweden)

    Cazzato Dario

    2018-02-01

    Full Text Available Automatic gaze estimation not based on commercial and expensive eye tracking hardware solutions can enable several applications in the fields of human-computer interaction (HCI) and human behavior analysis. It is therefore not surprising that several related techniques and methods have been investigated in recent years. However, very few camera-based systems proposed in the literature are both real-time and robust. In this work, we propose a real-time, user-calibration-free gaze estimation system that does not need person-dependent calibration, can deal with illumination changes and head pose variations, and can work with a wide range of distances from the camera. Our solution is based on a 3-D appearance-based method that processes the images from a built-in laptop camera. Real-time performance is obtained by combining head pose information with geometrical eye features to train a machine learning algorithm. Our method has been validated on a data set of images of users in natural environments and shows promising results. The possibility of a real-time implementation, combined with the good quality of gaze tracking, makes this system suitable for various HCI applications.
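
    As an illustration of the general feature-combination idea (not the authors' exact 3-D appearance-based pipeline), the sketch below learns a simple mapping from head-pose angles and pupil-centre offsets to on-screen gaze coordinates; features and data are synthetic.

```python
# Minimal sketch of the general idea (not the authors' exact pipeline):
# learn a mapping from head-pose angles plus pupil-centre offsets to
# on-screen gaze coordinates. Features and data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 500
# features per frame: [yaw, pitch, roll, pupil_dx, pupil_dy] (synthetic)
X = rng.normal(0, 1, size=(n, 5))
true_W = rng.normal(0, 1, size=(5, 2))               # hidden linear mapping
Y = X @ true_W + rng.normal(0, 0.05, size=(n, 2))    # screen (x, y) with noise

# augment with a bias column and solve the least-squares regression
Xb = np.hstack([X, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)

new_frame = np.array([[0.1, -0.2, 0.0, 0.03, -0.01, 1.0]])
print("predicted gaze point:", new_frame @ W)
```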

  19. Tracker Performance Metric

    National Research Council Canada - National Science Library

    Olson, Teresa; Lee, Harry; Sanders, Johnnie

    2002-01-01

    .... We have developed the Tracker Performance Metric (TPM) specifically for this purpose. It was designed to measure the output performance, on a frame-by-frame basis, using its output position and quality...

  20. Autonomous star tracker based on active pixel sensors (APS)

    Science.gov (United States)

    Schmidt, U.

    2017-11-01

    Star trackers are opto-electronic sensors used on board satellites for autonomous inertial attitude determination. In recent years, star trackers have become more and more important among the attitude and orbit control system (AOCS) sensors. High-performance star trackers are today based on charge-coupled device (CCD) optical camera heads. Jena-Optronik GmbH has been active in the field of opto-electronic sensors such as star trackers since the early 1980s. Today, with the product family ASTRO5, ASTRO10 and ASTRO15, all market segments, such as earth observation, scientific applications and geo-telecom, are supplied to European and overseas customers. A new generation of star trackers can be designed based on the technical features of the APS detector. The measurement performance of the current CCD-based star trackers can be maintained, while the star tracker functionality, reliability and robustness are increased and unit costs are reduced.

  1. A wireless sensor enabled by wireless power.

    Science.gov (United States)

    Lee, Da-Sheng; Liu, Yu-Hong; Lin, Chii-Ruey

    2012-11-22

    Through harvesting energy by wireless charging and delivering data by wireless communication, this study proposes the concept of a wireless sensor enabled by wireless power (WPWS) and reports the fabrication of a prototype for functional tests. One WPWS node consists of a wireless power module and a sensor module with different chip-type sensors. Its main feature is the dual-antenna structure. Following RFID system architecture, a power-harvesting antenna was designed to gather power from a standard reader working in the 915 MHz band. Referring to the Modbus protocol, the other antenna, for wireless communication, was integrated on the node to send sensor data in parallel. The dual-antenna structure combines the advantages of an RFID system and a wireless sensor. Using a standard UHF RFID reader, a WPWS can be enabled in a distributed area with a diameter of up to 4 m. Its working status is similar to that of a passive tag, except that a tag can only be queried statically, while the WPWS can send dynamic data from its sensors. The function is the same as that of a wireless sensor node. Different WPWSs equipped with temperature and humidity, optical, and airflow velocity sensors are tested in this study. All sensors can send back detection data within 8 s. The accuracy is within 8% deviation compared with laboratory equipment. A sensor network enabled by wireless power would be a totally wireless sensor network built from WPWS nodes. However, distributed WPWSs can only form a star topology, the simplest topology for constructing a sensor network. Because of shielding effects, it is difficult to apply other, more complex topologies. Despite this limitation, WPWS can still be used to extend sensor network applications in hazardous environments. Further research is needed to improve WPWS towards realizing a totally wireless sensor network.
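
    Since the node's data link is described as referring to the Modbus protocol, the sketch below shows how a Modbus-RTU-style frame with its CRC-16 checksum could be assembled for a pair of sensor readings. The slave address, function code and register layout are hypothetical; the actual WPWS framing may differ.

```python
# Hedged sketch of a Modbus-RTU-style frame, since the abstract says the node's
# data link "refers to the Modbus protocol". The slave address, function code
# and register values below are hypothetical; the actual WPWS framing may differ.
def crc16_modbus(data: bytes) -> bytes:
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc.to_bytes(2, "little")        # Modbus sends the CRC low byte first

def build_sensor_frame(slave_id: int, temperature_c10: int, humidity_pct10: int) -> bytes:
    # function 0x10 (write multiple registers): start addr 0, 2 registers, 4 data bytes
    payload = bytes([slave_id, 0x10, 0x00, 0x00, 0x00, 0x02, 0x04])
    payload += temperature_c10.to_bytes(2, "big") + humidity_pct10.to_bytes(2, "big")
    return payload + crc16_modbus(payload)

frame = build_sensor_frame(slave_id=1, temperature_c10=253, humidity_pct10=471)
print(frame.hex(" "))   # e.g. temperature 25.3 °C, humidity 47.1 %
```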

  2. Gaze-based assistive technology used in daily life by children with severe physical impairments - parents' experiences.

    Science.gov (United States)

    Borgestig, Maria; Rytterström, Patrik; Hemmingsson, Helena

    2017-07-01

    To describe and explore parents' experiences when their children with severe physical impairments receive gaze-based assistive technology (gaze-based AT) for use in daily life. Semi-structured interviews were conducted twice, with one year in between, with parents of eight children with cerebral palsy who used gaze-based AT in their daily activities. To understand the parents' experiences, hermeneutical interpretation was used during data analysis. The findings demonstrate that, for parents, children's gaze-based AT usage meant that the children demonstrated agency, were given opportunities to show personality and competencies, and gained possibilities to develop. Overall, children's gaze-based AT use gave parents hope for a better future for their children with severe physical impairments; a future in which the children can develop and gain influence in life. Gaze-based AT provides children with new opportunities to perform activities and take initiatives to communicate, giving parents hope about their children's future.

  3. Magnet Test Setup of the CMS Tracker ready for installation

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    The pieces of the Tracker that will be operated in the forthcoming Magnet Test and Cosmic Challenge (MTCC) have been transported inside the dummy tracker support tube to the CMS experimental hall (Point 5, Cessy). The operation took place during the night of 12th May, covering the ~15km distance in about three hours. The transport was monitored for shocks, temperature and humidity with the help of the CERN TS-IC section. The Tracker setup comprises segments of the Tracker Inner Barrel (TIB), the Tracker Outer Barrel (TOB) and Tracker EndCaps (TEC) detectors. It represents roughly 1% of the final CMS Tracker. Installation into the solenoid is foreseen to take place on Wednesday 17th May.

  4. Mutual Disambiguation of Eye Gaze and Speech for Sight Translation and Reading

    DEFF Research Database (Denmark)

    Kulkarni, Rucha; Jain, Kritika; Bansal, Himanshu

    2013-01-01

    and composition of the two modalities was used for integration. F-measure for Eye-Gaze and Word Accuracy for ASR were used as metrics to evaluate our results. In reading task, we demonstrated a significant improvement in both Eye-Gaze f-measure and speech Word Accuracy. In sight translation task, significant...

  5. MEG Evidence for Dynamic Amygdala Modulations by Gaze and Facial Emotions

    Science.gov (United States)

    Dumas, Thibaud; Dubal, Stéphanie; Attal, Yohan; Chupin, Marie; Jouvent, Roland; Morel, Shasha; George, Nathalie

    2013-01-01

    Background Amygdala is a key brain region for face perception. While the role of the amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Methodology/Principal Findings Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310–350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective for fearful faces in the right amygdala. Conclusion Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception. PMID:24040190

  6. The sensation of the look: The gazes in Laurence Anyways

    OpenAIRE

    Schultz, Corey Kai Nelson

    2018-01-01

    This article analyses the gazes, looks, stares and glares in Laurence Anyways (Xavier Dolan, 2012), and examines their affective, interpretive, and symbolic qualities, and their potential to create viewer empathy through affect. The cinematic gaze can produce sensations of shame and fear, by offering a sequence of varied “encounters” to which viewers can react, before we have been given a character onto which we can deflect them, thus bypassing the representational, narrative and even the sym...

  7. Rosetta Star Tracker and Navigation Camera

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Proposal in response to the Invitation to Tender (ITT) issued by Matra Marconi Space (MSS) for the procurement of the ROSETTA Star Tracker and Navigation Camera.

  8. LHCb Upstream Tracker

    CERN Multimedia

    Gandini, Paolo

    2014-01-01

    The LHCb upgrade requires replacing the silicon strip tracker between the vertex locator (VELO) and the magnet. A new design has been developed and tested based on the "stave" concept planned for the ATLAS upgrade.

  9. LHCb Upstream Tracker

    CERN Multimedia

    Gandini, P

    2014-01-01

    The LHCb upgrade requires replacing the silicon strip tracker between the vertex locator (VELO) and the magnet. A new design has been developed and tested based on the "stave" concept planned for the ATLAS upgrade

  10. Attention and Gaze Control in Picture Naming, Word Reading, and Word Categorizing

    Science.gov (United States)

    Roelofs, Ardi

    2007-01-01

    The trigger for shifting gaze between stimuli requiring vocal and manual responses was examined. Participants were presented with picture-word stimuli and left- or right-pointing arrows. They vocally named the picture (Experiment 1), read the word (Experiment 2), or categorized the word (Experiment 3) and shifted their gaze to the arrow to…

  11. Computing eye gaze metrics for the automatic assessment of radiographer performance during X-ray image interpretation.

    Science.gov (United States)

    McLaughlin, Laura; Bond, Raymond; Hughes, Ciara; McConnell, Jonathan; McFadden, Sonyia

    2017-09-01

    To investigate the image interpretation performance of diagnostic radiography students, diagnostic radiographers and reporting radiographers by computing eye gaze metrics using eye tracking technology. Three groups of participants were studied during their interpretation of 8 digital radiographic images, including the axial and appendicular skeleton and chest (the prevalence of normal images was 12.5%). A total of 464 image interpretations were collected. Participants consisted of 21 radiography students, 19 qualified radiographers and 18 qualified reporting radiographers who were further qualified to report on the musculoskeletal (MSK) system. Eye tracking data were collected using the Tobii X60 eye tracker and eye gaze metrics were subsequently computed. Voice recordings, confidence levels and diagnoses provided a clear demonstration of the image interpretation and the cognitive processes undertaken by each participant. A questionnaire afforded the participants an opportunity to offer information on their experience in image interpretation and their opinion of the eye tracking technology. Reporting radiographers demonstrated a 15% greater accuracy rate (p≤0.001), were more confident (p≤0.001) and took a mean of 2.4 s longer to clinically decide on all features compared to students. Reporting radiographers also had a 15% greater accuracy rate (p≤0.001), were more confident (p≤0.001) and took longer to clinically decide on an image diagnosis (p=0.02) than radiographers. Reporting radiographers had a greater mean fixation duration (p=0.01), mean fixation count (p=0.04) and mean visit count (p=0.04) within the areas of pathology compared to students. Eye tracking patterns, presented within heat maps, were a good reflection of group expertise and search strategies. Eye gaze metrics such as time to first fixation, fixation count, fixation duration and visit count within the areas of pathology were indicative of the radiographer's competency. The accuracy and confidence of
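
    The AOI-based gaze metrics named in the abstract (time to first fixation, fixation count, fixation duration, visit count) can be computed from a fixation list as sketched below. The fixation record layout and the AOI rectangle are assumptions, not the Tobii export format used in the study.

```python
# Hedged sketch of the kind of AOI-based gaze metrics named in the abstract
# (time to first fixation, fixation count, fixation duration, visit count).
# The fixation record format and the AOI rectangle below are assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Fixation:
    start_ms: float
    duration_ms: float
    x: float
    y: float

def aoi_metrics(fixations: List[Fixation], aoi) -> dict:
    """aoi = (x_min, y_min, x_max, y_max) in the same coordinates as the fixations."""
    x0, y0, x1, y1 = aoi
    inside = [x0 <= f.x <= x1 and y0 <= f.y <= y1 for f in fixations]

    first: Optional[float] = next(
        (f.start_ms for f, hit in zip(fixations, inside) if hit), None)
    count = sum(inside)
    total_duration = sum(f.duration_ms for f, hit in zip(fixations, inside) if hit)
    # a "visit" is an unbroken run of in-AOI fixations
    visits = sum(1 for i, hit in enumerate(inside) if hit and (i == 0 or not inside[i - 1]))

    return {"time_to_first_fixation_ms": first,
            "fixation_count": count,
            "fixation_duration_ms": total_duration,
            "visit_count": visits}

fx = [Fixation(0, 200, 10, 10), Fixation(220, 300, 55, 60),
      Fixation(540, 250, 58, 62), Fixation(820, 180, 10, 12)]
print(aoi_metrics(fx, aoi=(50, 50, 70, 70)))
```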

  12. Gaze characteristics of elite and near-elite athletes in ice hockey defensive tactics.

    Science.gov (United States)

    Martell, Stephen G; Vickers, Joan N

    2004-04-01

    Traditional visual search experiments, where the researcher pre-selects video-based scenes for the participant to respond to, show that elite players make more efficient decisions than non-elites, but disagree on how they temporally regulate their gaze. Using the vision-in-action [J.N. Vickers, J. Exp. Psychol.: Human Percept. Perform. 22 (1996) 342] approach, we tested whether the gaze behaviour that differentiates elite and non-elite athletes occurred either early in the task and was of more rapid duration [A.M. Williams et al., Res. Quart. Exer. Sport 65 (1994) 127; A.M. Williams and K. Davids, Res. Quart. Exer. Sport 69 (1998) 111], or late in the task and was of longer duration [W. Helsen, J.M. Pauwels, A cognitive approach to visual search in sport, in: D. Brogan, K. Carr (Eds.), Visual Search, vol. II, Taylor and Francis, London, 1992], or whether a more complex gaze control strategy was used that consisted of both early and rapid fixations followed by a late fixation of long duration prior to the final execution. We tested this using a live defensive zone task in ice hockey. Results indicated that athletes temporally regulated their gaze using two different gaze control strategies. First, fixation/tracking (F/T) gazes early in the trial were significantly shorter than the final F/T, confirming that the elite group fixated the tactical locations more rapidly than the non-elite group on successful plays. Secondly, the final F/T prior to critical movement initiation (i.e., F/T-1) was significantly longer for both groups, averaging 30% of the final part of the phase, and occurred as the athletes isolated a single object or location to end the play. The results imply that expertise in defensive tactics is defined by a cascade of F/Ts, which began with the athletes fixating or tracking specific locations for short durations at the beginning of the play, and concluded with a final gaze of long duration to a relatively stable target at the end. The results are

  13. INNER TRACKER

    CERN Multimedia

    Peter Sharp

In March the Silicon Strip Detector had been successfully connected to the PP1 patch panels on the CMS cryostat, and everything had been prepared to check out the Tracker and commission it with CMS, with the ambition of joining the CMS Global Cosmic Run in April. There followed serious problems with the cooling plant which, through tremendous effort, have been overcome and have recently allowed commissioning of the Tracker to proceed. In November 2007 there had been a failure of the heat exchanger in one of the seven cooling plants in the UXC cavern. After an analysis of the failure it was decided to replace this heat exchanger with a well-proven commercial heat exchanger and to re-commission the system. Re-commissioning the system proved to be more difficult than anticipated, as on May 8 there was a second failure of a heat exchanger, this time in the main chiller plant in the USC service cavern. The analysis of the failure showed it was very similar to the previous failure. It was decided to replace all the heat ...

  14. Developments for the TOF Straw Tracker

    Energy Technology Data Exchange (ETDEWEB)

    Ucar, A.

    2006-07-01

    COSY-TOF is a very large acceptance spectrometer for charged particles using precise information on the track geometry and time of flight of reaction products. It is an external detector system at the Cooler Synchrotron and storage ring COSY in Juelich. In order to improve the performance of COSY-TOF, a new tracking detector, the "Straw Tracker", is being constructed, which combines very low mass, operation in vacuum, very good resolution, high sampling density and very high acceptance. A comparison of pp→dπ⁺ data and a simulation using the straw tracker with geometry alone indicates big improvements with the new tracker. In order to investigate the straw tracker properties, a small tracking hodoscope, the "cosmic ray test facility", was constructed in advance. It is made of two crossed hodoscopes consisting of 128 straw tubes arranged in 4 double planes. For the first time, Juelich straws have been used for 3-dimensional reconstruction of cosmic ray tracks. In this illumination field, the space-dependent response of the scintillators and of a straw tube were studied. (orig.)

  15. Developments for the TOF Straw Tracker

    International Nuclear Information System (INIS)

    Ucar, A.

    2006-01-01

    COSY-TOF is a very large acceptance spectrometer for charged particles using precise information on the track geometry and time of flight of reaction products. It is an external detector system at the Cooler Synchrotron and storage ring COSY in Juelich. In order to improve the performance of COSY-TOF, a new tracking detector, the "Straw Tracker", is being constructed, which combines very low mass, operation in vacuum, very good resolution, high sampling density and very high acceptance. A comparison of pp→dπ⁺ data and a simulation using the straw tracker with geometry alone indicates big improvements with the new tracker. In order to investigate the straw tracker properties, a small tracking hodoscope, the "cosmic ray test facility", was constructed in advance. It is made of two crossed hodoscopes consisting of 128 straw tubes arranged in 4 double planes. For the first time, Juelich straws have been used for 3-dimensional reconstruction of cosmic ray tracks. In this illumination field, the space-dependent response of the scintillators and of a straw tube were studied. (orig.)

  16. Can gaze-contingent mirror-feedback from unfamiliar faces alter self-recognition?

    Science.gov (United States)

    Estudillo, Alejandro J; Bindemann, Markus

    2017-05-01

    This study focuses on learning of the self, by examining how human observers update internal representations of their own face. For this purpose, we present a novel gaze-contingent paradigm, in which an onscreen face mimics observers' own eye-gaze behaviour (in the congruent condition), moves its eyes in different directions to that of the observers (incongruent condition), or remains static and unresponsive (neutral condition). Across three experiments, the mimicry of the onscreen face did not affect observers' perceptual self-representations. However, this paradigm influenced observers' reports of their own face. This effect was such that observers felt the onscreen face to be their own and that, if the onscreen gaze had moved on its own accord, observers expected their own eyes to move too. The theoretical implications of these findings are discussed.

  17. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.

    Science.gov (United States)

    Evitts, Paul; Gallop, Robert

    2011-01-01

    There is a large body of research demonstrating the impact of visual information on speaker intelligibility in both normal and disordered speaker populations. However, there is minimal information on which specific visual features listeners find salient during conversational discourse. To investigate listeners' eye-gaze behaviour during face-to-face conversation with normal, laryngeal and proficient alaryngeal speakers. Sixty participants individually participated in a 10-min conversation with one of four speakers (typical laryngeal, tracheoesophageal, oesophageal, electrolaryngeal; 15 participants randomly assigned to one mode of speech). All speakers were > 85% intelligible and were judged to be 'proficient' by two certified speech-language pathologists. Participants were fitted with a head-mounted eye-gaze tracking device (Mobile Eye, ASL) that calculated the region of interest and mean duration of eye-gaze. Self-reported gaze behaviour was also obtained following the conversation using a 10 cm visual analogue scale. While listening, participants viewed the lower facial region of the oesophageal speaker more than the normal or tracheoesophageal speaker. Results of non-hierarchical cluster analyses showed that while listening, the pattern of eye-gaze was predominantly directed at the lower face of the oesophageal and electrolaryngeal speaker and more evenly dispersed among the background, lower face, and eyes of the normal and tracheoesophageal speakers. Finally, results show a low correlation between self-reported eye-gaze behaviour and objective regions of interest data. Overall, results suggest similar eye-gaze behaviour when healthy controls converse with normal and tracheoesophageal speakers and that participants had significantly different eye-gaze patterns when conversing with an oesophageal speaker. Results are discussed in terms of existing eye-gaze data and its potential implications on auditory-visual speech perception. © 2011 Royal College of Speech

  18. Age differences in conscious versus subconscious social perception: the influence of face age and valence on gaze following.

    Science.gov (United States)

    Bailey, Phoebe E; Slessor, Gillian; Rendell, Peter G; Bennetts, Rachel J; Campbell, Anna; Ruffman, Ted

    2014-09-01

    Gaze following is the primary means of establishing joint attention with others and is subject to age-related decline. In addition, young but not older adults experience an own-age bias in gaze following. The current research assessed the effects of subconscious processing on these age-related differences. Participants responded to targets that were either congruent or incongruent with the direction of gaze displayed in supraliminal and subliminal images of young and older faces. These faces displayed either neutral (Study 1) or happy and fearful (Study 2) expressions. In Studies 1 and 2, both age groups demonstrated gaze-directed attention by responding faster to targets that were congruent as opposed to incongruent with gaze-cues. In Study 1, subliminal stimuli did not attenuate the age-related decline in gaze-cuing, but did result in an own-age bias among older participants. In Study 2, gaze-cuing was reduced for older relative to young adults in response to supraliminal stimuli, and this could not be attributed to reduced visual acuity or age group differences in the perceived emotional intensity of the gaze-cue faces. Moreover, there were no age differences in gaze-cuing when responding to subliminal faces that were emotionally arousing. In addition, older adults demonstrated an own-age bias for both conscious and subconscious gaze-cuing when faces expressed happiness but not fear. We discuss growing evidence for age-related preservation of subconscious relative to conscious social perception, as well as an interaction between face age and valence in social perception. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  19. Gaze interaction with textual user interface

    DEFF Research Database (Denmark)

    Paulin Hansen, John; Lund, Haakon; Madsen, Janus Askø

    2015-01-01

    ” option for text navigation. People readily understood how to execute RSVP command prompts and a majority of them preferred gaze input to a pen pointer. We present the concept of a smartwatch that can track eye movements and mediate command options whenever in proximity of intelligent devices...

  20. Software and mathematical support of Kazakhstani star tracker

    Science.gov (United States)

    Akhmedov, D.; Yelubayev, S.; Ten, V.; Bopeyev, T.; Alipbayev, K.; Sukhenko, A.

    2016-10-01

    Specialists in Kazakhstan are currently developing a star tracker that is planned for use on Kazakhstani satellites of various purposes. At the first stage, an experimental model of the star tracker was developed with the following characteristics: field of view 20°, update frequency 2 Hz, exclusion angle 40°, and attitude determination accuracy of 15/50 arcsec along/around the optical axis. Software and mathematical support are the most technology-intensive parts of a star tracker. This article presents the results of developing the software and mathematical support for the experimental model of the Kazakhstani star tracker. In particular, it describes the main mathematical models and algorithms used as the basis for the program units for preliminary processing of starry-sky images, star identification, and star tracker attitude determination. Results are presented from testing the software and mathematical support with a program simulation complex using various defect configurations, including image sensor noise, point spread function modeling, and optical system distortion of up to 2%. Analysis of the test results shows that the attitude determination accuracy of the star tracker is within the permissible range
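
    The record above mentions program units for preliminary image processing, star identification, and attitude determination without giving the algorithms themselves. As a purely illustrative sketch (not the authors' implementation), the following Python snippet shows a common form of the preliminary step: reducing a thresholded star-field frame to sub-pixel, intensity-weighted centroids; the threshold value and image format are assumptions.

```python
import numpy as np

def star_centroids(frame, threshold=50):
    """Return intensity-weighted centroids of bright blobs in a star-field image.

    A minimal sketch of a 'preliminary image processing' step: pixels above
    `threshold` are grouped into 4-connected blobs and each blob is reduced to
    a sub-pixel centre of mass. Real star trackers add PSF fitting, noise
    filtering and hot-pixel rejection.
    """
    mask = frame > threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = frame.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                # flood-fill one blob of bright pixels
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                weights = frame[list(ys), list(xs)].astype(float)
                centroids.append((np.average(xs, weights=weights),
                                  np.average(ys, weights=weights)))
    return centroids
```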

  1. Application results for an augmented video tracker

    Science.gov (United States)

    Pierce, Bill

    1991-08-01

    The Relay Mirror Experiment (RME) is a research program to determine the pointing accuracy and stability levels achieved when a laser beam is reflected by the RME satellite from one ground station to another. This paper reports the results of using a video tracker augmented with a quad cell signal to improve the RME ground station tracking system performance. The video tracker controls a mirror to acquire the RME satellite, and provides a robust low bandwidth tracking loop to remove line of sight (LOS) jitter. The high-passed, high-gain quad cell signal is added to the low bandwidth, low-gain video tracker signal to increase the effective tracking loop bandwidth, and significantly improves LOS disturbance rejection. The quad cell augmented video tracking system is analyzed, and the math model for the tracker is developed. A MATLAB model is then developed from this, and performance as a function of bandwidth and disturbances is given. Improvements in performance due to the addition of the video tracker and the augmentation with the quad cell are provided. Actual satellite test results are then presented and compared with the simulated results.
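
    The fusion described above (a low-bandwidth, low-gain video-tracker signal summed with a high-passed, high-gain quad-cell signal) is, in effect, a complementary blending of two sensors. The sketch below illustrates that blending in discrete time; the sample rate, crossover frequency, and gain are illustrative assumptions, not values from the RME ground station, and both error signals are assumed to be sampled at the same rate.

```python
import numpy as np

def fuse_tracker_signals(video_err, quad_err, fs=1000.0, f_c=5.0, k_quad=10.0):
    """Complementary fusion of a slow video-tracker error with a fast quad-cell error.

    video_err : line-of-sight error from the video tracker (already low-bandwidth)
    quad_err  : quad-cell error, high-passed here and added with gain k_quad
    fs, f_c, k_quad : sample rate, crossover frequency and quad-cell gain (assumed values)
    """
    alpha = 1.0 / (1.0 + 2.0 * np.pi * f_c / fs)   # one-pole high-pass coefficient
    fused = np.zeros_like(np.asarray(quad_err, dtype=float))
    hp = 0.0
    prev = quad_err[0]
    for i, q in enumerate(quad_err):
        hp = alpha * (hp + q - prev)               # discrete high-pass of the quad signal
        prev = q
        fused[i] = video_err[i] + k_quad * hp      # wide-band LOS error for the mirror loop
    return fused
```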

  2. Gaze strategy in the free flying zebra finch (Taeniopygia guttata).

    Directory of Open Access Journals (Sweden)

    Dennis Eckmeier

    Full Text Available Fast moving animals depend on cues derived from the optic flow on their retina. Optic flow from translational locomotion includes information about the three-dimensional composition of the environment, while optic flow experienced during rotational self-motion does not. Thus, a saccadic gaze strategy that segregates rotations from translational movements during locomotion will facilitate extraction of spatial information from the visual input. We analysed whether birds use such a strategy by high-speed video recording of zebra finches from two directions during an obstacle avoidance task. Each frame of the recording was examined to derive position and orientation of the beak in three-dimensional space. The data show that in all flights the head orientation was shifted in a saccadic fashion and was kept straight between saccades. Therefore, birds use a gaze strategy that actively stabilizes their gaze during translation to simplify optic flow based navigation. This is the first evidence of birds actively optimizing optic flow during flight.

  3. The Laser Alignment System for the CMS silicon strip tracker

    CERN Document Server

    Olzem, Jan

    2009-01-01

    The Laser Alignment System (LAS) of the CMS silicon strip Tracker has been designed for surveying the geometry of the large-scale Tracker support structures. It uses 40 laser beams ($\\lambda$ = 1075 nm) that induce signals on a subset of the Tracker silicon sensors. The positions in space of the laser spots on the sensors are reconstructed with a resolution of 30 $\\mu$m. From this, the LAS is capable of permanent in-time monitoring of the different Tracker components relative to each other with better than 30 $\\mu$m precision. Additionally, it can provide an absolute measurement of the Tracker mechanical structure with an accuracy better than 70 $\\mu$m, thereby supplying additional input to the track based alignment at detector startup. 31 out of the 40 LAS beams have been successfully operated during the CMS cosmic muon data taking campaign in autumn 2008. The alignment of the Tracker Endcap Discs and of the discs with respect to the Tracker Inner Barrel and Tracker Outer Barrel subdetectors was measured w...

  4. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.

    Science.gov (United States)

    Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty

    2011-10-01

    The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
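
    Event-based lag sequential analysis, as used above, reduces to counting how often one coded behaviour follows another at a given lag and testing those counts against chance ordering. The following sketch computes a lag-1 transition table with the commonly used adjusted residual (z) for each transition; the behaviour codes and example stream are hypothetical and not taken from the study.

```python
from collections import Counter
from math import sqrt

def lag1_sequential(events):
    """Lag-1 sequential analysis of a coded event stream.

    Returns {(given, target): (observed, expected, z)} where z is the
    adjusted residual comparing observed transition counts with chance
    pairing of the codes.
    """
    pairs = list(zip(events[:-1], events[1:]))
    n = len(pairs)
    obs = Counter(pairs)
    given_tot = Counter(g for g, _ in pairs)
    target_tot = Counter(t for _, t in pairs)
    results = {}
    for (g, t), o in obs.items():
        e = given_tot[g] * target_tot[t] / n
        var = e * (1 - given_tot[g] / n) * (1 - target_tot[t] / n)
        z = (o - e) / sqrt(var) if var > 0 else float("nan")
        results[(g, t)] = (o, e, z)
    return results

# Example with hypothetical codes: does the patient's gaze at the clinician
# tend to follow the clinician's gaze at the patient?
stream = ["C_gaze_pt", "P_gaze_cl", "C_gaze_chart", "C_gaze_pt", "P_gaze_cl", "P_gaze_chart"]
print(lag1_sequential(stream))
```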

  5. Infants' Developing Understanding of Social Gaze

    Science.gov (United States)

    Beier, Jonathan S.; Spelke, Elizabeth S.

    2012-01-01

    Young infants are sensitive to self-directed social actions, but do they appreciate the intentional, target-directed nature of such behaviors? The authors addressed this question by investigating infants' understanding of social gaze in third-party interactions (N = 104). Ten-month-old infants discriminated between 2 people in mutual versus…

  6. Eye gaze during comprehension of American Sign Language by native and beginning signers.

    Science.gov (United States)

    Emmorey, Karen; Thompson, Robin; Colvin, Rachael

    2009-01-01

    An eye-tracking experiment investigated where deaf native signers (N = 9) and hearing beginning signers (N = 10) look while comprehending a short narrative and a spatial description in American Sign Language produced live by a fluent signer. Both groups fixated primarily on the signer's face (more than 80% of the time) but differed with respect to fixation location. Beginning signers fixated on or near the signer's mouth, perhaps to better perceive English mouthing, whereas native signers tended to fixate on or near the eyes. Beginning signers shifted gaze away from the signer's face more frequently than native signers, but the pattern of gaze shifts was similar for both groups. When a shift in gaze occurred, the sign narrator was almost always looking at his or her hands and was most often producing a classifier construction. We conclude that joint visual attention and attention to mouthing (for beginning signers), rather than linguistic complexity or processing load, affect gaze fixation patterns during sign language comprehension.

  7. The Design Parameters for the MICE Tracker Solenoid

    International Nuclear Information System (INIS)

    Green, Michael A.; Chen, C.Y.; Juang, Tiki; Lau, Wing W.; Taylor, Clyde; Virostek, Steve P.; Wahrer, Robert; Wang, S.T.; Witte, Holger; Yang, Stephanie Q.

    2006-01-01

    The first superconducting magnets to be installed in the muon ionization cooling experiment (MICE) will be the tracker solenoids. The tracker solenoid module is a five-coil superconducting solenoid with a 400 mm diameter warm bore that is used to provide a 4 T magnetic field for the experiment tracker module. Three of the coils are used to produce a uniform field (up to 4 T with better than 1 percent uniformity) in a region that is 300 mm in diameter and 1000 mm long. The other two coils are used to match the muon beam into the MICE cooling channel. Two 2.94-meter long superconducting tracker solenoid modules have been ordered for MICE. The tracker solenoid will be cooled using two coolers that each produce 1.5 W at 4.2 K. The magnet system is described. The decisions that drive the magnet design are discussed in this report

  8. Reflections of Head Mounted systems for Domotic Control

    DEFF Research Database (Denmark)

    Mardanbeigi, Diako; Witzner Hansen, Dan

    2010-01-01

    In this report we would like to investigate the generalization of the concept of gaze interaction and investigate the possibility of using a gaze tracker for interaction not only with a single computer screen but also with multiple computer screens and possibly other environment objects in an int...

  9. Spatial updating depends on gaze direction even after loss of vision.

    Science.gov (United States)

    Reuschel, Johanna; Rösler, Frank; Henriques, Denise Y P; Fiehler, Katja

    2012-02-15

    Direction of gaze (eye angle + head angle) has been shown to be important for representing space for action, implying a crucial role of vision for spatial updating. However, blind people have no access to vision yet are able to perform goal-directed actions successfully. Here, we investigated the role of visual experience for localizing and updating targets as a function of intervening gaze shifts in humans. People who differed in visual experience (late blind, congenitally blind, or sighted) were briefly presented with a proprioceptive reach target while facing it. Before they reached to the target's remembered location, they turned their head toward an eccentric direction that also induced corresponding eye movements in sighted and late blind individuals. We found that reaching errors varied systematically as a function of shift in gaze direction only in participants with early visual experience (sighted and late blind). In the late blind, this effect was solely present in people with moveable eyes but not in people with at least one glass eye. Our results suggest that the effect of gaze shifts on spatial updating develops on the basis of visual experience early in life and remains even after loss of vision as long as feedback from the eyes and head is available.

  10. Radiologically defining horizontal gaze using EOS imaging-a prospective study of healthy subjects and a retrospective audit.

    Science.gov (United States)

    Hey, Hwee Weng Dennis; Tan, Kimberly-Anne; Ho, Vivienne Chien-Lin; Azhar, Syifa Bte; Lim, Joel-Louis; Liu, Gabriel Ka-Po; Wong, Hee-Kit

    2018-06-01

    As sagittal alignment of the cervical spine is important for maintaining horizontal gaze, it is important to determine the former for surgical correction. However, horizontal gaze remains poorly-defined from a radiological point of view. The objective of this study was to establish radiological criteria to define horizontal gaze. This study was conducted at a tertiary health-care institution over a 1-month period. A prospective cohort of healthy patients was used to determine the best radiological criteria for defining horizontal gaze. A retrospective cohort of patients without rigid spinal deformities was used to audit the incidence of horizontal gaze. Two categories of radiological parameters for determining horizontal gaze were tested: (1) the vertical offset distances of key identifiable structures from the horizontal gaze axis and (2) imaginary lines convergent with the horizontal gaze axis. Sixty-seven healthy subjects underwent whole-body EOS radiographs taken in a directed standing posture. Horizontal gaze was radiologically defined using each parameter, as represented by their means, 95% confidence intervals (CIs), and associated 2 standard deviations (SDs). Subsequently, applying the radiological criteria, we conducted a retrospective audit of such radiographs (before the implementation of a strict radioimaging standardization). The mean age of our prospective cohort was 46.8 years, whereas that of our retrospective cohort was 37.2 years. Gender was evenly distributed across both cohorts. The four parameters with the lowest 95% CI and 2 SD were the distance offsets of the midpoint of the hard palate (A) and the base of the sella turcica (B), the horizontal convergents formed by the tangential line to the hard palate (C), and the line joining the center of the orbital orifice with the internal occipital protuberance (D). In the prospective cohort, good sensitivity (>98%) was attained when two or more parameters were used. Audit using Criterion B

  11. Impact of cognitive and linguistic ability on gaze behavior in children with hearing impairment

    Directory of Open Access Journals (Sweden)

    Olof eSandgren

    2013-11-01

    Full Text Available In order to explore verbal-nonverbal integration, we investigated the influence of cognitive and linguistic ability on gaze behavior during spoken language conversation between children with mild-to-moderate hearing impairment (HI) and normal-hearing (NH) peers. Ten HI-NH and ten NH-NH dyads performed a referential communication task requiring description of faces. During task performance, eye movements and speech were tracked. Cox proportional hazards regression was used to model associations between performance on cognitive and linguistic tasks and the probability of gaze to the conversational partner’s face. Analyses compare the listeners in each dyad (HI: n = 10, mean age = 12;6 years, SD = 2;0, mean better ear pure-tone average 33.0 dB HL, SD = 7.8; NH: n = 10, mean age = 13;7 years, SD = 1;11). Group differences in gaze behavior – with HI gazing more to the conversational partner than NH – remained significant despite adjustment for ability on receptive grammar, expressive vocabulary, and complex working memory. Adjustment for phonological short term memory, as measured by nonword repetition, removed group differences, revealing an interaction between group membership and nonword repetition ability. Stratified analysis showed a twofold increase of the probability of gaze-to-partner for HI with low phonological short term memory capacity, and a decreased probability for HI with high capacity, as compared to NH peers. The results revealed differences in gaze behavior attributable to performance on a phonological short term memory task. Participants with hearing impairment and low phonological short term memory capacity showed a doubled probability of gaze to the conversational partner, indicative of a visual bias. The results stress the need to look beyond the hearing impairment in diagnostics and intervention. Acknowledgment of the finding requires clinical assessment of children with hearing impairment to be supported by tasks tapping

  12. Tactile band : accessing gaze signals from the sighted in face-to-face communication

    NARCIS (Netherlands)

    Qiu, S.; Rauterberg, G.W.M.; Hu, J.

    2016-01-01

    Gaze signals, frequently used by the sighted in social interactions as visual cues, are hardly accessible for low-vision and blind people. A concept is proposed to help the blind people access and react to gaze signals in face-to-face communication. 20 blind and low-vision participants were

  13. Does Gaze Direction Modulate Facial Expression Processing in Children with Autism Spectrum Disorder?

    Science.gov (United States)

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent…

  14. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography.

    Science.gov (United States)

    Hládek, Ľuboš; Porr, Bernd; Brimijoin, W Owen

    2018-01-01

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight and portable horizontal eye gaze angle estimation suitable for a broad range of applications. For instance, for hearing aids to steer the directivity of microphones in the direction of the user's eye gaze.
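
    The algorithm summarised above replaces conventional high-pass filtering with saccade detection and integration: each detected saccade contributes a step change, and the running sum gives an absolute horizontal gaze angle. A minimal sketch of that idea follows; the velocity threshold, the EOG-to-degrees scale factor, and the sample rate are assumptions rather than the paper's calibrated values.

```python
import numpy as np

def gaze_from_eog(eog, fs=250.0, vel_thresh=5.0, deg_per_unit=0.1):
    """Estimate absolute horizontal gaze angle from single-channel EOG by saccade integration.

    eog          : 1-D EOG signal (arbitrary units)
    fs           : sample rate in Hz (assumed)
    vel_thresh   : velocity threshold marking a saccade (assumed)
    deg_per_unit : scale from EOG amplitude change to degrees (assumed; normally calibrated)
    """
    eog = np.asarray(eog, dtype=float)
    velocity = np.diff(eog) * fs                 # simple first-difference velocity estimate
    in_saccade = np.abs(velocity) > vel_thresh
    angle = np.zeros_like(eog)
    current = 0.0
    start = None
    for i, flag in enumerate(in_saccade):
        if flag and start is None:
            start = i                            # saccade onset
        elif not flag and start is not None:
            amplitude = eog[i] - eog[start]      # EOG step across the saccade
            current += amplitude * deg_per_unit  # integrate saccades into absolute angle
            start = None
        angle[i + 1] = current                   # hold angle constant between saccades
    return angle
```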

  15. Toward a reliable gaze-independent hybrid BCI combining visual and natural auditory stimuli.

    Science.gov (United States)

    Barbosa, Sara; Pires, Gabriel; Nunes, Urbano

    2016-03-01

    Brain computer interfaces (BCIs) are one of the last communication options for patients in the locked-in state (LIS). For complete LIS patients, interfaces must be gaze-independent due to their eye impairment. However, unimodal gaze-independent approaches typically present levels of performance substantially lower than gaze-dependent approaches. The combination of multimodal stimuli has been pointed as a viable way to increase users' performance. A hybrid visual and auditory (HVA) P300-based BCI combining simultaneously visual and auditory stimulation is proposed. Auditory stimuli are based on natural meaningful spoken words, increasing stimuli discrimination and decreasing user's mental effort in associating stimuli to the symbols. The visual part of the interface is covertly controlled ensuring gaze-independency. Four conditions were experimentally tested by 10 healthy participants: visual overt (VO), visual covert (VC), auditory (AU) and covert HVA. Average online accuracy for the hybrid approach was 85.3%, which is more than 32% over VC and AU approaches. Questionnaires' results indicate that the HVA approach was the less demanding gaze-independent interface. Interestingly, the P300 grand average for HVA approach coincides with an almost perfect sum of P300 evoked separately by VC and AU tasks. The proposed HVA-BCI is the first solution simultaneously embedding natural spoken words and visual words to provide a communication lexicon. Online accuracy and task demand of the approach compare favorably with state-of-the-art. The proposed approach shows that the simultaneous combination of visual covert control and auditory modalities can effectively improve the performance of gaze-independent BCIs. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Gaze Embeddings for Zero-Shot Image Classification

    NARCIS (Netherlands)

    Karessli, N.; Akata, Z.; Schiele, B.; Bulling, A.

    2017-01-01

    Zero-shot image classification using auxiliary information, such as attributes describing discriminative object properties, requires time-consuming annotation by domain experts. We instead propose a method that relies on human gaze as auxiliary information, exploiting that even non-expert users have

  17. Sun tracker for clear or cloudy weather

    Science.gov (United States)

    Scott, D. R.; White, P. R.

    1979-01-01

    Sun tracker orients solar collectors so that they absorb the maximum possible sunlight without being fooled by bright clouds, holes in cloud cover, or other atmospheric conditions. The tracker follows the sun within a 0.25° arc and is accurate to within ±5° when the sun is hidden.

  18. Biasing moral decisions by exploiting the dynamics of eye gaze.

    Science.gov (United States)

    Pärnamets, Philip; Johansson, Petter; Hall, Lars; Balkenius, Christian; Spivey, Michael J; Richardson, Daniel C

    2015-03-31

    Eye gaze is a window onto cognitive processing in tasks such as spatial memory, linguistic processing, and decision making. We present evidence that information derived from eye gaze can be used to change the course of individuals' decisions, even when they are reasoning about high-level, moral issues. Previous studies have shown that when an experimenter actively controls what an individual sees the experimenter can affect simple decisions with alternatives of almost equal valence. Here we show that if an experimenter passively knows when individuals move their eyes the experimenter can change complex moral decisions. This causal effect is achieved by simply adjusting the timing of the decisions. We monitored participants' eye movements during a two-alternative forced-choice task with moral questions. One option was randomly predetermined as a target. At the moment participants had fixated the target option for a set amount of time we terminated their deliberation and prompted them to choose between the two alternatives. Although participants were unaware of this gaze-contingent manipulation, their choices were systematically biased toward the target option. We conclude that even abstract moral cognition is partly constituted by interactions with the immediate environment and is likely supported by gaze-dependent decision processes. By tracking the interplay between individuals, their sensorimotor systems, and the environment, we can influence the outcome of a decision without directly manipulating the content of the information available to them.

  19. Differences in gaze anticipation for locomotion with and without vision

    Science.gov (United States)

    Authié, Colas N.; Hilt, Pauline M.; N'Guyen, Steve; Berthoz, Alain; Bennequin, Daniel

    2015-01-01

    Previous experimental studies have shown a spontaneous anticipation of locomotor trajectory by the head and gaze direction during human locomotion. This anticipatory behavior could serve several functions: an optimal selection of visual information, for instance through landmarks and optic flow, as well as trajectory planning and motor control. This would imply that anticipation remains in darkness but with different characteristics. We asked 10 participants to walk along two predefined complex trajectories (limaçon and figure eight) without any cue on the trajectory to follow. Two visual conditions were used: (i) in light and (ii) in complete darkness with eyes open. The whole body kinematics were recorded by motion capture, along with the participant's right eye movements. We showed that in darkness and in light, horizontal gaze anticipates the orientation of the head which itself anticipates the trajectory direction. However, the horizontal angular anticipation decreases by a half in darkness for both gaze and head. In both visual conditions we observed an eye nystagmus with similar properties (frequency and amplitude). The main difference comes from the fact that in light, there is a shift of the orientations of the eye nystagmus and the head in the direction of the trajectory. These results suggest that a fundamental function of gaze is to represent self motion, stabilize the perception of space during locomotion, and to simulate the future trajectory, regardless of the vision condition. PMID:26106313

  20. A Methodology to Analyze Photovoltaic Tracker Uptime

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Matthew T [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Dan [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-17

    A metric is developed to analyze the daily performance of single-axis photovoltaic (PV) trackers. The metric relies on comparing correlations between the daily time series of the PV power output and an array of simulated plane-of-array irradiances for the given day. Mathematical thresholds and a logic sequence are presented, so the daily tracking metric can be applied in an automated fashion on large-scale PV systems. The results of applying the metric are visually examined against the time series of the power output data for a large number of days and for various systems. The visual inspection results suggest that overall, the algorithm is accurate in identifying stuck or functioning trackers on clear-sky days. Visual inspection also shows that there are days that are not classified by the metric where the power output data may be sufficient to identify a stuck tracker. Based on the daily tracking metric, uptime results are calculated for 83 different inverters at 34 PV sites. The mean tracker uptime is calculated at 99% based on two different calculation methods. The daily tracking metric clearly has limitations, but as there are no existing metrics in the literature, it provides a valuable tool for flagging stuck trackers.
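
    The daily metric described above compares the measured power time series with simulated plane-of-array irradiance profiles and classifies the day according to which hypothesis (tracking versus stuck) correlates better. The sketch below shows only that comparison step; the irradiance simulation, the clear-sky logic, and the published thresholds are not reproduced, and the threshold values used here are assumptions.

```python
import numpy as np

def classify_tracker_day(power, poa_tracking, poa_stuck, r_min=0.9, margin=0.05):
    """Classify one day of single-axis tracker operation.

    power        : measured PV power time series for the day
    poa_tracking : simulated plane-of-array irradiance assuming the tracker follows the sun
    poa_stuck    : simulated irradiance assuming the tracker is stuck (e.g. horizontal)
    r_min, margin: illustrative thresholds, not the values from the published methodology

    Returns 'tracking', 'stuck', or 'unclassified' (e.g. cloudy days where
    neither hypothesis correlates convincingly with the measured power).
    """
    r_track = np.corrcoef(power, poa_tracking)[0, 1]
    r_stuck = np.corrcoef(power, poa_stuck)[0, 1]
    if max(r_track, r_stuck) < r_min:
        return "unclassified"
    if r_track > r_stuck + margin:
        return "tracking"
    if r_stuck > r_track + margin:
        return "stuck"
    return "unclassified"
```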

  1. Subjects and Objects of the Embodied Gaze: Abbas Kiarostami and the Real of the Individual Perspective

    Directory of Open Access Journals (Sweden)

    Gyenge Zsolt

    2016-12-01

    Full Text Available It is widely accepted that Abbas Kiarostami’s cinema revolves around the representation of the gaze. Many critics argue that he should be considered a late modernist who repeats the self-reflexive gestures of modernist European cinema decades after they were first introduced. The present paper will contradict this assertion by investigating the problematic of the Kiarostamian gaze and analyzing the perceptual side of the act of looking. I will argue that instead of focusing on the gaze of the spectator directed towards the filmic image, he exposes a gaze that is fully integrated into the reality to be captured on film. The second part of the paper will explain this by linking the concept of gaze to the Lacanian concept of the order of the Real. Finally, I will contextualize all this by discussing the Iranian director’s position between Eastern and Western traditions of representation.

  2. A Smart, Low-Cost Solar Tracker Based on the 8-bit ATMega8535 Microcontroller

    Directory of Open Access Journals (Sweden)

    I Wayan Sutaya

    2016-08-01

    Full Text Available This research produced a prototype of a smart solar tracker based on an 8-bit AVR microcontroller. The solar tracker incorporates an IIR (Infinite Impulse Response) digital filter in its firmware. Programming this filter requires 32-bit multiplication, whereas the processor on the microcontroller is only 8 bits wide, so the multiplication can only be implemented on the 8-bit microcontroller in assembly language, a hardware-level language. Using the 8-bit microcontroller as the main brain of the smart solar tracker keeps the device low cost. Testing showed a very significant difference in battery power consumption between the smart solar tracker and an ordinary solar tracker: a saving of 85%. This saving is of course not a constant figure but depends on how much noise the solar tracker is exposed to; for the same treatment, the larger the noise, the larger the saving in power consumption achieved by the smart solar tracker. Keywords: solar tracker, digital filter, 8-bit microcontroller, power consumption
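
    The constraint described above (a 32-bit product on an 8-bit core) is the standard fixed-point IIR problem: coefficients and state are held as 16-bit integers, so each multiply-accumulate needs a 32-bit intermediate that an AVR must assemble from 8×8-bit hardware multiplies. The sketch below mimics that arithmetic in Python for a first-order low-pass section so the scaling is explicit; the Q15 format and coefficient value are illustrative assumptions, not the filter used in the paper.

```python
def q15_lowpass(samples, alpha_q15=3277):
    """First-order IIR low-pass, y[n] = y[n-1] + alpha*(x[n] - y[n-1]), in Q15 fixed point.

    alpha_q15 : smoothing coefficient scaled by 2**15 (3277 ~= 0.1; assumed value).
    Each step multiplies two 16-bit signed quantities, producing a 32-bit
    intermediate -- the multiplication an 8-bit AVR core has to build from
    8x8-bit hardware multiplies in assembly.
    """
    y = 0                                   # filter state; fits in int16 for 10-bit ADC input
    out = []
    for x in samples:
        err = x - y                         # 16-bit difference
        acc = alpha_q15 * err               # 32-bit product (int16 * int16)
        y += acc >> 15                      # scale back to Q0 and accumulate
        out.append(y)
    return out

# Example: smooth a noisy light-sensor reading before deciding whether to move the tracker.
print(q15_lowpass([512, 530, 1023, 500, 515, 508]))
```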

  3. Gaze Estimation for Off-Angle Iris Recognition Based on the Biometric Eye Model

    Energy Technology Data Exchange (ETDEWEB)

    Karakaya, Mahmut [ORNL; Barstow, Del R [ORNL; Santos-Villalobos, Hector J [ORNL; Thompson, Joseph W [ORNL; Bolme, David S [ORNL; Boehnen, Chris Bensing [ORNL

    2013-01-01

    Iris recognition is among the highest accuracy biometrics. However, its accuracy relies on controlled high quality capture data and is negatively affected by several factors such as angle, occlusion, and dilation. Non-ideal iris recognition is a new research focus in biometrics. In this paper, we present a gaze estimation method designed for use in an off-angle iris recognition framework based on the ANONYMIZED biometric eye model. Gaze estimation is an important prerequisite step to correct an off-angle iris images. To achieve the accurate frontal reconstruction of an off-angle iris image, we first need to estimate the eye gaze direction from elliptical features of an iris image. Typically additional information such as well-controlled light sources, head mounted equipment, and multiple cameras are not available. Our approach utilizes only the iris and pupil boundary segmentation allowing it to be applicable to all iris capture hardware. We compare the boundaries with a look-up-table generated by using our biologically inspired biometric eye model and find the closest feature point in the look-up-table to estimate the gaze. Based on the results from real images, the proposed method shows effectiveness in gaze estimation accuracy for our biometric eye model with an average error of approximately 3.5 degrees over a 50 degree range.
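
    The gaze-estimation step above amounts to matching the measured iris/pupil boundary ellipse against a precomputed look-up table generated from an eye model and returning the gaze angle of the closest entry. A nearest-neighbour version of that search is sketched below; the feature choice (axis ratio and orientation) and the toy table contents are illustrative assumptions, not the authors' model.

```python
import math

def estimate_gaze(iris_ratio, iris_orientation, lookup_table):
    """Nearest-neighbour gaze estimate from iris-boundary ellipse features.

    iris_ratio       : minor/major axis ratio of the segmented iris ellipse
    iris_orientation : ellipse orientation in degrees
    lookup_table     : list of (ratio, orientation, gaze_deg) rows precomputed
                       from an eye model (contents are assumed here)
    """
    def distance(row):
        ratio, orientation, _ = row
        return math.hypot(ratio - iris_ratio, (orientation - iris_orientation) / 90.0)

    best = min(lookup_table, key=distance)
    return best[2]

# Toy table: frontal gaze gives a near-circular iris (ratio ~1), off-angle gaze flattens it.
table = [(1.00, 0.0, 0.0), (0.94, 0.0, 20.0), (0.82, 0.0, 35.0), (0.64, 0.0, 50.0)]
print(estimate_gaze(0.9, 0.0, table))   # -> 20.0
```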

  4. Power distribution studies for CMS forward tracker

    International Nuclear Information System (INIS)

    Todri, A.; Turqueti, M.; Rivera, R.; Kwan, S.

    2009-01-01

    The Electronic Systems Engineering Department of the Computing Division at the Fermi National Accelerator Laboratory is carrying out R and D investigations for the upgrade of the power distribution system of the Compact Muon Solenoid (CMS) Pixel Tracker at the Large Hadron Collider (LHC). Among the goals of this effort is that of analyzing the feasibility of alternative powering schemes for the forward tracker, including DC to DC voltage conversion techniques using commercially available and custom switching regulator circuits. Tests of these approaches are performed using the PSI46 pixel readout chip currently in use at the CMS Tracker. Performance measures of the detector electronics will include pixel noise and threshold dispersion results. Issues related to susceptibility to switching noise will be studied and presented. In this paper, we describe the current power distribution network of the CMS Tracker, study the implications of the proposed upgrade with DC-DC converters powering scheme and perform noise susceptibility analysis.

  5. Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.

    Science.gov (United States)

    Szturm, Tony; Reimer, Karen M; Hochman, Jordan

    2015-06-01

    Disease or damage of the vestibular sense organs cause a range of distressing symptoms and functional problems that could include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used in standing, and thus graded balance demands using a sponge pad can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).

  6. Exploring associations between gaze patterns and putative human mirror neuron system activity.

    Science.gov (United States)

    Donaldson, Peter H; Gurvich, Caroline; Fielding, Joanne; Enticott, Peter G

    2015-01-01

    The human mirror neuron system (MNS) is hypothesized to be crucial to social cognition. Given that key MNS-input regions such as the superior temporal sulcus are involved in biological motion processing, and mirror neuron activity in monkeys has been shown to vary with visual attention, aberrant MNS function may be partly attributable to atypical visual input. To examine the relationship between gaze pattern and interpersonal motor resonance (IMR; an index of putative MNS activity), healthy right-handed participants aged 18-40 (n = 26) viewed videos of transitive grasping actions or static hands, whilst the left primary motor cortex received transcranial magnetic stimulation. Motor-evoked potentials recorded in contralateral hand muscles were used to determine IMR. Participants also underwent eyetracking analysis to assess gaze patterns whilst viewing the same videos. No relationship was observed between predictive gaze and IMR. However, IMR was positively associated with fixation counts in areas of biological motion in the videos, and negatively associated with object areas. These findings are discussed with reference to visual influences on the MNS, and the possibility that MNS atypicalities might be influenced by visual processes such as aberrant gaze pattern.

  7. Exploring associations between gaze patterns and putative human mirror neuron system activity

    Directory of Open Access Journals (Sweden)

    Peter Hugh Donaldson

    2015-07-01

    Full Text Available The human mirror neuron system (MNS) is hypothesised to be crucial to social cognition. Given that key MNS-input regions such as the superior temporal sulcus are involved in biological motion processing, and mirror neuron activity in monkeys has been shown to vary with visual attention, aberrant MNS function may be partly attributable to atypical visual input. To examine the relationship between gaze pattern and interpersonal motor resonance (IMR; an index of putative MNS activity), healthy right-handed participants aged 18-40 (n = 26) viewed videos of transitive grasping actions or static hands, whilst the left primary motor cortex received transcranial magnetic stimulation (TMS). Motor-evoked potentials (MEPs) recorded in contralateral hand muscles were used to determine IMR. Participants also underwent eyetracking analysis to assess gaze patterns whilst viewing the same videos. No relationship was observed between predictive gaze (PG) and IMR. However, IMR was positively associated with fixation counts in areas of biological motion in the videos, and negatively associated with object areas. These findings are discussed with reference to visual influences on the MNS, and the possibility that MNS atypicalities might be influenced by visual processes such as aberrant gaze pattern.

  8. In the twinkling of an eye: synchronization of EEG and eye tracking based on blink signatures

    DEFF Research Database (Denmark)

    Bækgaard, Per; Petersen, Michael Kai; Larsen, Jakob Eg

    2014-01-01

    ACHIEVING ROBUST ADAPTIVE SYNCHRONIZATION OF MULTIMODAL BIOMETRIC INPUTS: The recent arrival of wireless EEG headsets that enable mobile real-time 3D brain imaging on smartphones, and low cost eye trackers that provide gaze control of tablets, will radically change how biometric sensors might be integrated into next generation user interfaces. In experimental lab settings EEG neuroimaging and eye tracking data are traditionally combined using external triggers to synchronize the signals. However, with biometric sensors increasingly being applied in everyday usage scenarios, there will be a need ... function based algorithm to correlate the signals. Comparing the accuracy of the method against a state of the art EYE-EEG plug-in for offline analysis of EEG and eye tracking data, we propose our approach could be applied for robust synchronization of biometric sensor data collected in a mobile context.
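
    Synchronizing the two streams from blink signatures comes down to detecting blink events in both the EEG and the eye-tracking data and finding the time shift at which the event trains line up best. A minimal cross-correlation version of that alignment step is sketched below; blink detection itself and the exact correlation function of the paper are not reproduced, and the sample rate is an assumption.

```python
import numpy as np

def estimate_offset(eeg_blink_times, eye_blink_times, fs=100.0, max_lag_s=5.0):
    """Estimate the clock offset between EEG and eye-tracker streams from blink times.

    Both inputs are blink onset times in seconds on their own clocks. Each list
    is converted to a binary event train at rate fs and the trains are
    cross-correlated; the lag with the highest correlation is returned as the
    offset (eye-tracker clock minus EEG clock), searched within +/- max_lag_s.
    """
    t_max = max(max(eeg_blink_times), max(eye_blink_times)) + max_lag_s
    n = int(t_max * fs) + 1
    eeg = np.zeros(n)
    eye = np.zeros(n)
    eeg[(np.asarray(eeg_blink_times) * fs).astype(int)] = 1.0
    eye[(np.asarray(eye_blink_times) * fs).astype(int)] = 1.0
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    scores = [np.dot(eeg, np.roll(eye, -lag)) for lag in lags]
    return lags[int(np.argmax(scores))] / fs

# Blinks seen 0.25 s later on the eye-tracker clock than on the EEG clock:
print(estimate_offset([1.0, 3.2, 7.5], [1.25, 3.45, 7.75]))   # -> 0.25
```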

  9. The Use of Gaze to Control Drones

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Alapetite, Alexandre; MacKenzie, I. Scott

    2014-01-01

    This paper presents an experimental investigation of gaze-based control modes for unmanned aerial vehicles (UAVs or “drones”). Ten participants performed a simple flying task. We gathered empirical measures, including task completion time, and examined the user experience for difficulty, reliabil...

  10. Modelling Structural Flexure Effects in CPV Sun Trackers

    OpenAIRE

    Luque-Heredia, Ignacio; Quéméré, G.; Magalhães, P.H.; Fraile de Lerma, Alberto; Hermanns, Lutz Karl Heinz; Alarcón Álvarez, Enrique; Luque López, Antonio

    2006-01-01

    Nowadays CPV trends, mostly based on lens-parqueted flat modules, enable the sun tracker to be designed separately. To enable this possibility, a set of specifications must be prescribed for the tracker design team, taking into account fundamental requisites such as the maximum permanent and variable service loads, the sun-tracking accuracy, and the tracker structural stiffness required to keep the CPV array acceptance-angle loss below a certain threshold. In its first part this paper...

  11. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography.

    Directory of Open Access Journals (Sweden)

    Ľuboš Hládek

    Full Text Available The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy. However, the performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight and portable horizontal eye gaze angle estimation suitable for a broad range of applications. For instance, for hearing aids to steer the directivity of microphones in the direction of the user's eye gaze.

  12. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention

    Directory of Open Access Journals (Sweden)

    Saki Takao

    2018-01-01

    Full Text Available The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed as the gaze-cueing effect. Although this effect arises when the duration of stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural studies on cognition suggest that Westerners tend to use a context-independent analytical strategy to process visual environments, whereas Asians use a context-dependent holistic approach. We hypothesized that Japanese participants would not demonstrate the gaze-cueing effect at longer SOAs because they are more sensitive to contextual information, such as the knowledge that the direction of a gaze is not predictive. Furthermore, we hypothesized that American participants would demonstrate the gaze-cueing effect at the long SOAs because they tend to follow gaze direction whether it is predictive or not. In Experiment 1, American participants demonstrated the gaze-cueing effect at the long SOA, indicating that their attention was driven by the central non-predictive gaze direction regardless of the SOAs. In Experiment 2, Japanese participants demonstrated no gaze-cueing effect at the long SOA, suggesting that the Japanese participants exercised voluntary control of their attention, which inhibited the gaze-cueing effect with the long SOA. Our findings suggest that the control of visual spatial attention elicited by social stimuli systematically differs between American and Japanese individuals.

  13. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention.

    Science.gov (United States)

    Takao, Saki; Yamani, Yusuke; Ariga, Atsunori

    2017-01-01

    The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed as the gaze-cueing effect . Although this effect arises when the duration of stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural studies on cognition suggest that Westerners tend to use a context-independent analytical strategy to process visual environments, whereas Asians use a context-dependent holistic approach. We hypothesized that Japanese participants would not demonstrate the gaze-cueing effect at longer SOAs because they are more sensitive to contextual information, such as the knowledge that the direction of a gaze is not predictive. Furthermore, we hypothesized that American participants would demonstrate the gaze-cueing effect at the long SOAs because they tend to follow gaze direction whether it is predictive or not. In Experiment 1, American participants demonstrated the gaze-cueing effect at the long SOA, indicating that their attention was driven by the central non-predictive gaze direction regardless of the SOAs. In Experiment 2, Japanese participants demonstrated no gaze-cueing effect at the long SOA, suggesting that the Japanese participants exercised voluntary control of their attention, which inhibited the gaze-cueing effect with the long SOA. Our findings suggest that the control of visual spatial attention elicited by social stimuli systematically differs between American and Japanese individuals.

  14. GigaTracker, a Thin and Fast Silicon Pixels Tracker

    CERN Document Server

    Velghe, Bob; Bonacini, Sandro; Ceccucci, Augusto; Kaplon, Jan; Kluge, Alexander; Mapelli, Alessandro; Morel, Michel; Noël, Jérôme; Noy, Matthew; Perktold, Lukas; Petagna, Paolo; Poltorak, Karolina; Riedler, Petra; Romagnoli, Giulia; Chiozzi, Stefano; Cotta Ramusino, Angelo; Fiorini, Massimiliano; Gianoli, Alberto; Petrucci, Ferruccio; Wahl, Heinrich; Arcidiacono, Roberta; Jarron, Pierre; Marchetto, Flavio; Gil, Eduardo Cortina; Nuessle, Georg; Szilasi, Nicolas

    2014-01-01

    GigaTracker, the NA62 upstream spectrometer, plays a key role in the kinematically constrained background suppression for the study of the K⁺ → π⁺νν̄ decay. It is made of three independent stations, each of which is a 6 × 3 cm² hybrid silicon pixel detector. To meet the NA62 physics goals, GigaTracker has to address challenging requirements. The hit time resolution must be better than 200 ps while keeping the total thickness of the sensor to less than 0.5 mm silicon equivalent. The 200 μm thick sensor is divided into 18000 pixels of 300 μm × 300 μm, bump-bonded to ten independent read-out chips. The chips use an end-of-column architecture and rely on time-over-threshold discriminators. A station can handle a crossing rate of 750 MHz. Microchannel cooling technology will be used to cool the assembly. It allows us to keep the sensor close to 0 °C with 130 μm of silicon in the beam area. The sensor and read-out chip performance were validated using a 45-pixel demonstrator with a laser test setu...

  15. Gaze-based assistive technology used in daily life by children with severe physical impairments - parents' experiences

    OpenAIRE

    Borgestig, Maria; Rytterstrom, Patrik; Hemmingsson, Helena

    2017-01-01

    Objective: To describe and explore parents' experiences when their children with severe physical impairments receive gaze-based assistive technology (gaze-based AT) for use in daily life. Methods: Semi-structured interviews were conducted twice, with one year in between, with parents of eight children with cerebral palsy who used gaze-based AT in their daily activities. To understand the parents' experiences, hermeneutical interpretations were used during data analysis...

  16. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

    Science.gov (United States)

    Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan

    2014-12-01

    The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
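
    As a usage illustration of the toolbox described above, the snippet below opens a display, calibrates whichever eye tracker is configured in PyGaze's settings, and polls gaze samples. Class and method names follow the PyGaze documentation (Display, Screen, EyeTracker, calibrate, start_recording, sample, stop_recording); they should be checked against the installed release, and the tracker brand/configuration is assumed to be set elsewhere (e.g. in a constants file).

```python
# Minimal PyGaze session; tracker type and screen settings are assumed to be
# configured via PyGaze's defaults/constants mechanism.
import time

from pygaze.display import Display
from pygaze.screen import Screen
from pygaze.eyetracker import EyeTracker

disp = Display()                      # open the experiment window
scr = Screen()                        # drawing canvas
tracker = EyeTracker(disp)            # connects to whichever tracker is configured

tracker.calibrate()                   # built-in calibration routine
tracker.start_recording()

scr.draw_text(text="Look at me")      # draw centred text
disp.fill(scr)
disp.show()

for _ in range(100):                  # poll gaze for roughly one second
    x, y = tracker.sample()           # most recent gaze coordinates in pixels
    time.sleep(0.01)

tracker.stop_recording()
tracker.close()
disp.close()
```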

  17. The CMS tracker control system

    Science.gov (United States)

    Dierlamm, A.; Dirkes, G. H.; Fahrer, M.; Frey, M.; Hartmann, F.; Masetti, L.; Militaru, O.; Shah, S. Y.; Stringer, R.; Tsirou, A.

    2008-07-01

    The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and monitor its environmental sensors. TCS must thus be able to handle about 10⁴ power supply parameters, about 10³ environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), about 10⁵ parameters read via DAQ from the DCUs in all front end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity and avoiding widespread TSS intervention.
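
    The hierarchical structure described above (commands propagating down to hardware, summary states propagating up to the root) can be pictured as a small tree of finite-state nodes. The sketch below is a toy illustration of that idea only; it has nothing to do with the actual PVSS/JCOP implementation, and the subdetector names are used purely as labels.

```python
class ControlNode:
    """Node in a hierarchical detector-control tree: commands propagate down,
    summary states propagate up (a toy version of the hierarchical FSM idea)."""

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "OFF"

    def command(self, cmd):
        # commands move down towards the individual hardware devices
        self.state = cmd
        for child in self.children:
            child.command(cmd)

    def summary(self):
        # states are reported up: a parent is uniform only if all children agree
        states = {c.summary() for c in self.children} | {self.state}
        return self.state if len(states) == 1 else "MIXED"

tracker = ControlNode("Tracker", [ControlNode("TIB"), ControlNode("TOB"), ControlNode("TEC")])
tracker.command("ON")
print(tracker.summary())   # -> ON
```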

  18. A Solar Tracker Device Based on the 8-bit ATMega8535 Microcontroller

    Directory of Open Access Journals (Sweden)

    I Wayan Sutaya

    2015-07-01

    Full Text Available This research produced a prototype solar tracker device. The device moves a solar cell module so that the surface of the solar cell receives the maximum possible sunlight. At present, many solar cells in Indonesia are installed statically, without a solar tracker, so that solar energy is not captured to its full extent; as a result, the solar cells installed in several regions of Indonesia do not deliver optimal benefit. The solar tracker produced in this research is intended as a solution to this problem. The 8-bit ATMega8535 microcontroller used as the main brain of the solar tracker makes the device low cost, and programming it in assembly language makes it resilient to system failure. The solar tracker already operates well and is suitable for use with small solar cell modules.

  19. The CMS silicon tracker

    International Nuclear Information System (INIS)

    Focardi, E.; Albergo, S.; Angarano, M.; Azzi, P.; Babucci, E.; Bacchetta, N.; Bader, A.; Bagliesi, G.; Basti, A.; Biggeri, U.; Bilei, G.M.; Bisello, D.; Boemi, D.; Bosi, F.; Borrello, L.; Bozzi, C.; Braibant, S.; Breuker, H.; Bruzzi, M.; Buffini, A.; Busoni, S.; Candelori, A.; Caner, A.; Castaldi, R.; Castro, A.; Catacchini, E.; Checcucci, B; Ciampolini, P.; Civinini, C.; Creanza, D.; D'Alessandro, R.; Da Rold, M.; Demaria, N.; De Palma, M.; Dell'Orso, R.; Della Marina, R.; Dutta, S.; Eklund, C.; Feld, L.; Fiore, L.; French, M.; Freudenreich, K.; Frey, A.; Fuertjes, A.; Giassi, A.; Giorgi, M.; Giraldo, A.; Glessing, B.; Gu, W.H.; Hall, G.; Hammarstrom, R.; Hebbeker, T.; Honma, A.; Hrubec, J.; Huhtinen, M.; Kaminsky, A.; Karimaki, V.; Koenig, St.; Krammer, M.; Lariccia, P.; Lenzi, M.; Loreti, M.; Leubelsmeyer, K.; Lustermann, W.; Maettig, P.; Maggi, G.; Mannelli, M.; Mantovani, G.; Marchioro, A.; Mariotti, C.; Martignon, G.; Evoy, B.Mc; Meschini, M.; Messineo, A.; Migliore, E.; My, S.; Paccagnella, A.; Palla, F.; Pandoulas, D.; Papi, A.; Parrini, G.; Passeri, D.; Pieri, M.; Piperov, S.; Potenza, R.; Radicci, V.; Raffaelli, F.; Raymond, M.; Rizzo, F.; Santocchia, A.; Schmitt, B.; Selvaggi, G.; Servoli, L.; Sguazzoni, G.; Siedling, R.; Silvestris, L.; Starodumov, A.; Stavitski, I.; Stefanini, G.; Surrow, B.; Tempesta, P.; Tonelli, G.; Tricomi, A.; Tuuva, T.; Vannini, C.; Verdini, P.G.; Viertel, G.; Xie, Z.; Yahong, Li; Watts, S.; Wittmer, B.

    2000-01-01

    This paper describes the Silicon microstrip Tracker of the CMS experiment at LHC. It consists of a barrel part with 5 layers and two endcaps with 10 disks each. About 10 000 single-sided equivalent modules have to be built, each one carrying two daisy-chained silicon detectors and their front-end electronics. Back-to-back modules are used to read-out the radial coordinate. The tracker will be operated in an environment kept at a temperature of T=-10 deg. C to minimize the Si sensors radiation damage. Heavily irradiated detectors will be safely operated due to the high-voltage capability of the sensors. Full-size mechanical prototypes have been built to check the system aspects before starting the construction

  20. RFP for the Auroral Multiscale Midex (AMM) Mission star tracker

    DEFF Research Database (Denmark)

    Riis, Troels; Betto, Maurizio; Jørgensen, John Leif

    1999-01-01

    This document is in response to the Johns Hopkins University - Applied Physics Laboratory RFP for the Auroral Multiscale Midex (AMM) Mission star tracker. It describes the functionality, the requirements, and the performance of the ASC Star Tracker.

  1. Eye tracking in the wild

    DEFF Research Database (Denmark)

    Pece, Arthur E C; Hansen, Dan Witzner

    2005-01-01

    An active contour tracker is presented which can be used for gaze-based interaction with off-the-shelf components. The underlying contour model is based on image statistics and avoids explicit feature detection. The tracker combines particle filtering with the EM algorithm. The method exhibits ro...

  2. Which cue to ‘want’? Opioid stimulation of central amygdala makes goal-trackers show stronger goal-tracking, just as sign-trackers show stronger sign-tracking

    Science.gov (United States)

    DiFeliceantonio, Alexandra G.; Berridge, Kent C.

    2012-01-01

    Pavlovian cues that have been paired with reward can gain incentive salience. Drug addicts find drug cues motivationally attractive and binge eaters are attracted by food cues. But the level of incentive salience elicited by a cue re-encounter still varies across time and brain states. In an animal model, cues become attractive and ‘wanted’ in an ‘autoshaping’ paradigm, where different targets of incentive salience emerge for different individuals. Some individuals (sign-trackers) find a predictive discrete cue attractive while others find a reward contiguous and goal cue more attractive (location where reward arrives: goal-trackers). Here we assessed whether central amygdala mu opioid receptor stimulation enhances the phasic incentive salience of the goal-cue for goal-trackers during moments of predictive cue presence (expressed in both approach and consummatory behaviors to goal cue), just as it enhances the attractiveness of the predictive cue target for sign-trackers. Using detailed video analysis we measured the approaches, nibbles, sniffs, and bites directed at their preferred target for both sign-trackers and goal-trackers. We report that DAMGO microinjections in central amygdala made goal-trackers, like sign-trackers, show phasic increases in appetitive nibbles and sniffs directed at the goal-cue expressed selectively whenever the predictive cue was present. This indicates enhancement of incentive salience attributed by both goal trackers and sign-trackers, but attributed in different directions: each to their own target cue. For both phenotypes, amygdala opioid stimulation makes the individual’s prepotent cue into a stronger motivational magnet at phasic moments triggered by a CS that predicts the reward UCS. PMID:22391118

  3. Does the 'P300' speller depend on eye gaze?

    Science.gov (United States)

    Brunner, P.; Joshi, S.; Briskin, S.; Wolpaw, J. R.; Bischof, H.; Schalk, G.

    2010-10-01

    Many people affected by debilitating neuromuscular disorders such as amyotrophic lateral sclerosis, brainstem stroke or spinal cord injury are impaired in their ability to, or are even unable to, communicate. A brain-computer interface (BCI) uses brain signals, rather than muscles, to re-establish communication with the outside world. One particular BCI approach is the so-called 'P300 matrix speller' that was first described by Farwell and Donchin (1988 Electroencephalogr. Clin. Neurophysiol. 70 510-23). It has been widely assumed that this method does not depend on the ability to focus on the desired character, because it was thought that it relies primarily on the P300-evoked potential and minimally, if at all, on other EEG features such as the visual-evoked potential (VEP). This issue is highly relevant for the clinical application of this BCI method, because eye movements may be impaired or lost in the relevant user population. This study investigated the extent to which the performance in a 'P300' speller BCI depends on eye gaze. We evaluated the performance of 17 healthy subjects using a 'P300' matrix speller under two conditions. Under one condition ('letter'), the subjects focused their eye gaze on the intended letter, while under the second condition ('center'), the subjects focused their eye gaze on a fixation cross that was located in the center of the matrix. The results show that the performance of the 'P300' matrix speller in normal subjects depends in considerable measure on gaze direction. They thereby disprove a widespread assumption in BCI research, and suggest that this BCI might function more effectively for people who retain some eye-movement control. The applicability of these findings to people with severe neuromuscular disabilities (particularly in eye-movements) remains to be determined.

  4. It Takes Time and Experience to Learn How to Interpret Gaze in Mentalistic Terms

    Science.gov (United States)

    Leavens, David A.

    2006-01-01

    What capabilities are required for an organism to evince an "explicit" understanding of gaze as a mentalistic phenomenon? One possibility is that mentalistic interpretations of gaze, like concepts of unseen, supernatural beings, are culturally-specific concepts, acquired through cultural learning. These abstract concepts may either require a…

  5. Gaze-related mimic word activates the frontal eye field and related network in the human brain: an fMRI study.

    Science.gov (United States)

    Osaka, Naoyuki; Osaka, Mariko

    2009-09-18

    This fMRI study provides new evidence that a mimic word highly suggestive of an eye gaze, presented auditorily, significantly activates the frontal eye field (FEF), inferior frontal gyrus (IFG), dorsolateral premotor area (PMdr) and superior parietal lobule (SPL), which are connected within the frontal-parietal network. Hearing nonsense words that did not imply gaze under the same task did not activate these areas. We conclude that the FEF is a critical area for generating and processing an active gaze evoked by an onomatopoeic word implying gaze, which is closely associated with social skill. We suggest that the implied active gaze may depend on prefrontal-parietal interactions that modulate cognitive gaze guided by spatial visual attention associated with the SPL.

  6. How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating

    OpenAIRE

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enh...

  7. The Iron Cage and the Gaze: Interpreting Medical Control in the English Health System

    Directory of Open Access Journals (Sweden)

    Mark Exworthy

    2015-05-01

    This paper seeks to determine the value of theoretical ideal-types of medical control. Whilst ideal types (such as the iron cage and the gaze) need revision in their application to medical settings, they remain useful in describing and explaining patterns of control and autonomy in the medical profession. The apparent transition from the cage to the gaze has often been over-stated, since both types are found in many contemporary health reforms. Indeed, forms of neo-bureaucracy have emerged alongside surveillance of the gaze. These types are contextualised and elaborated in terms of two empirical examples: the management of medical performance and financial incentives for senior hospital doctors in England. Findings point towards the reformulation of medical control, an on-going re-stratification of the medical profession, and the internalisation of managerial discourses. The cumulative effect involves the medical profession’s ability to re-cast and enhance its position (vis-à-vis managerial interests). Keywords: medical profession, medical control, iron cage, gaze

  8. Between Gazes: Feminist, Queer, and 'Other' Films

    DEFF Research Database (Denmark)

    Elias, Camelia

    In this book Camelia Elias introduces key terms in feminist, queer, and postcolonial/diaspora film. Taking her point of departure in the question, "what do you want from me?" she detours through Lacanian theory of the gaze and reframes questions of subjectivity and representation in an entertaining...

  9. The influence of social and symbolic cues on observers' gaze behaviour.

    Science.gov (United States)

    Hermens, Frouke; Walker, Robin

    2016-08-01

    Research has shown that social and symbolic cues presented in isolation and at fixation have strong effects on observers, but it is unclear how cues compare when they are presented away from fixation and embedded in natural scenes. We here compare the effects of two types of social cue (gaze and pointing gestures) and one type of symbolic cue (arrow signs) on eye movements of observers under two viewing conditions (free viewing vs. a memory task). The results suggest that social cues are looked at more quickly, for longer and more frequently than the symbolic arrow cues. An analysis of saccades initiated from the cue suggests that the pointing cue leads to stronger cueing than the gaze and the arrow cue. While the task had only a weak influence on gaze orienting to the cues, stronger cue following was found for free viewing compared to the memory task. © 2015 The British Psychological Society.

  10. Eye-gaze control of the computer interface: Discrimination of zoom intent

    International Nuclear Information System (INIS)

    Goldberg, J.H.

    1993-01-01

    An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident, preceding a decision to zoom-in, zoom-out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze was collected from 10 subjects in a controlled experiment, requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200--1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered
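
    As a minimal sketch of the kind of pipeline described above, in which gaze features are aggregated per trial and the three zoom conditions are separated with a discriminant analysis, the Python example below uses a linear discriminant classifier on hypothetical features; the feature names, data and accuracy are placeholders, not the study's variables or results.

    ```python
    # Minimal sketch (not the study's actual pipeline): classify zoom intent from
    # aggregated gaze features with linear discriminant analysis, a modern stand-in
    # for the multiple discriminant analysis (MDA) described above.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Labels: 0 = zoom-in, 1 = zoom-out, 2 = no-zoom (30 hypothetical trials each).
    y = np.repeat([0, 1, 2], 30)
    # Hypothetical per-trial features: mean fixation duration, fixation cluster
    # dispersion, saccade rate; offset per class so the toy data are separable.
    X = rng.normal(size=(90, 3)) + np.array([[0, 0, 0], [1.5, 0, 0], [0, 1.5, 0]])[y]

    lda = LinearDiscriminantAnalysis()
    print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())
    ```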

  11. Method of Menu Selection by Gaze Movement Using AC EOG Signals

    Science.gov (United States)

    Kanoh, Shin'ichiro; Futami, Ryoko; Yoshinobu, Tatsuo; Hoshimiya, Nozomu

    A method to detect the direction and the distance of voluntary eye gaze movement from EOG (electrooculogram) signals was proposed and tested. In this method, AC-amplified vertical and horizontal transient EOG signals were classified into 8 directions and 2 distances of voluntary eye gaze movements. The horizontal and vertical EOGs during an eye gaze movement at each sampling time were treated as a two-dimensional vector, and the center of gravity of the sample vectors whose norms were more than 80% of the maximum norm was used as the feature vector to be classified. Using the k-nearest neighbor algorithm for classification, the averaged correct detection rates for the individual subjects were 98.9%, 98.7%, and 94.4%, respectively. This method avoids strict EOG-based eye tracking, which requires DC amplification of very small signals. It would be useful for developing robust human interfacing systems based on menu selection for severely paralyzed patients.
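
    A minimal sketch of the described feature extraction (the center of gravity of the 2-D EOG vectors whose norms exceed 80% of the maximum norm) followed by k-nearest-neighbor classification is given below; the signals, training data and class coding are hypothetical placeholders rather than the authors' implementation.

    ```python
    # Sketch of the EOG feature extraction and k-NN classification described above;
    # signal pre-processing and training data are hypothetical placeholders.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def eog_feature(h_eog: np.ndarray, v_eog: np.ndarray) -> np.ndarray:
        """Center of gravity of the (h, v) samples whose norm exceeds
        80% of the maximum norm during one gaze movement."""
        vecs = np.column_stack([h_eog, v_eog])        # one 2-D vector per sample
        norms = np.linalg.norm(vecs, axis=1)
        strong = vecs[norms >= 0.8 * norms.max()]     # keep the largest deflections
        return strong.mean(axis=0)                    # 2-D feature vector

    # Hypothetical training set: features from previously recorded gaze movements,
    # labelled with one of 8 directions x 2 distances = 16 classes.
    rng = np.random.default_rng(1)
    train_X = rng.normal(size=(160, 2))
    train_y = rng.integers(0, 16, size=160)

    clf = KNeighborsClassifier(n_neighbors=3).fit(train_X, train_y)

    # Classify one new AC-amplified EOG trace (synthetic here).
    h = rng.normal(size=200)
    v = rng.normal(size=200)
    print("predicted class:", clf.predict(eog_feature(h, v)[None, :])[0])
    ```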

  12. A comprehensive gaze stabilization controller based on cerebellar internal models

    DEFF Research Database (Denmark)

    Vannucci, Lorenzo; Falotico, Egidio; Tolu, Silvia

    2017-01-01

    . The VOR works in conjunction with the opto-kinetic reflex (OKR), which is a visual feedback mechanism that allows to move the eye at the same speed as the observed scene. Together they keep the image stationary on the retina. In this work we implement on a humanoid robot a model of gaze stabilization...... based on the coordination of VCR and VOR and OKR. The model, inspired by neuroscientific cerebellar theories, is provided with learning and adaptation capabilities based on internal models. We present the results for the gaze stabilization model on three sets of experiments conducted on the SABIAN robot...

  13. Gaze stability, dynamic balance and participation deficits in people with multiple sclerosis at fall-risk.

    Science.gov (United States)

    Garg, Hina; Dibble, Leland E; Schubert, Michael C; Sibthorp, Jim; Foreman, K Bo; Gappmaier, Eduard

    2018-05-05

    Despite the common complaints of dizziness and demyelination of afferent or efferent pathways to and from the vestibular nuclei, which may adversely affect the angular Vestibulo-Ocular Reflex (aVOR) and vestibulo-spinal function in persons with Multiple Sclerosis (PwMS), few studies have examined gaze and dynamic balance function in PwMS. 1) Determine the differences in gaze stability, dynamic balance and participation measures between PwMS and controls, 2) Examine the relationships between gaze stability, dynamic balance and participation. Nineteen ambulatory PwMS at fall-risk and 14 age-matched controls were recruited. Outcomes included (a) gaze stability [angular Vestibulo-Ocular Reflex (aVOR) gain (ratio of eye to head velocity); number of Compensatory Saccades (CS) per head rotation; CS latency; gaze position error; Coefficient of Variation (CV) of aVOR gain], (b) dynamic balance [Functional Gait Assessment, FGA; four square step test], and (c) participation [dizziness handicap inventory; activities-specific balance confidence scale]. Separate independent t-tests and Pearson's correlations were calculated. PwMS were aged 53 ± 11.7 yrs and had 4.2 ± 3.3 falls/yr. PwMS demonstrated significant deficits in gaze stability, dynamic balance and participation measures compared to controls. CV of aVOR gain and CS latency were significantly correlated with FGA. Deficits and correlations across a spectrum of disability measures highlight the relevance of gaze and dynamic balance assessment in PwMS. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
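
    Since the abstract defines aVOR gain as the ratio of eye velocity to head velocity and uses its coefficient of variation (CV) as an outcome, a toy computation of both quantities on synthetic head-impulse data is sketched below; the windowing is simplified and no desaccading is performed, unlike clinical analyses.

    ```python
    # Illustrative computation of aVOR gain (eye velocity / head velocity) and its
    # coefficient of variation across head impulses; signals here are synthetic.
    import numpy as np

    def avor_gain(eye_vel: np.ndarray, head_vel: np.ndarray) -> float:
        """Gain of one head impulse: ratio of eye velocity to head velocity
        over the high-velocity part of the impulse (simplified window)."""
        window = np.abs(head_vel) > 0.5 * np.abs(head_vel).max()
        return np.abs(eye_vel[window]).mean() / np.abs(head_vel[window]).mean()

    rng = np.random.default_rng(2)
    gains = []
    for _ in range(20):                                # 20 synthetic head impulses
        t = np.linspace(0.0, 0.3, 150)
        head = 150 * np.sin(np.pi * t / 0.3)           # deg/s head velocity profile
        eye = -0.85 * head + rng.normal(0, 5, t.size)  # compensatory eye velocity
        gains.append(avor_gain(eye, head))

    gains = np.asarray(gains)
    cv = gains.std(ddof=1) / gains.mean()              # coefficient of variation
    print(f"mean aVOR gain: {gains.mean():.2f}, CV: {cv:.2f}")
    ```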

  14. PageRank tracker: from ranking to tracking.

    Science.gov (United States)

    Gong, Chen; Fu, Keren; Loza, Artur; Wu, Qiang; Liu, Jia; Yang, Jie

    2014-06-01

    Video object tracking is widely used in many real-world applications, and it has been extensively studied for over two decades. However, tracking robustness is still an issue in most existing methods, due to difficulties in adapting to environmental or target changes. In order to improve adaptability, this paper formulates the tracking process as a ranking problem, and the PageRank algorithm, the well-known webpage ranking algorithm used by Google, is applied. Labeled and unlabeled samples in the tracking application are analogous to query webpages and the webpages to be ranked, respectively. Therefore, determining the target is equivalent to finding the unlabeled sample that is most associated with the existing labeled set. We modify the conventional PageRank algorithm in three aspects for the tracking application: graph construction, PageRank vector acquisition and target filtering. Our simulations using various challenging public-domain video sequences reveal that the proposed PageRank tracker outperforms mean-shift tracker, co-tracker, semiboosting and beyond semiboosting trackers in terms of accuracy, robustness and stability.
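
    To illustrate the core analogy (labeled samples act as query pages, candidates as pages to be ranked), the sketch below runs a personalized PageRank over a Gaussian affinity graph and selects the top-ranked candidate; the graph construction and target filtering are simplified placeholders, not the authors' modifications.

    ```python
    # Conceptual sketch of ranking candidate samples with personalized PageRank,
    # in the spirit of the approach described above.
    import numpy as np

    def personalized_pagerank(W: np.ndarray, seed: np.ndarray,
                              alpha: float = 0.85, iters: int = 100) -> np.ndarray:
        """W: symmetric affinity matrix over all samples (labeled + candidates).
        seed: personalization vector concentrated on the labeled samples."""
        P = W / W.sum(axis=1, keepdims=True)      # row-stochastic transition matrix
        r = seed.copy()
        for _ in range(iters):
            r = alpha * P.T @ r + (1 - alpha) * seed
        return r

    rng = np.random.default_rng(3)
    feats = rng.normal(size=(50, 16))             # 1 labeled sample + 49 candidates
    d2 = ((feats[:, None] - feats[None]) ** 2).sum(-1)
    W = np.exp(-d2 / d2.mean())                   # Gaussian affinity graph

    seed = np.zeros(50)
    seed[0] = 1.0                                 # labeled sample is the "query page"
    scores = personalized_pagerank(W, seed)
    best = int(np.argmax(scores[1:])) + 1         # highest-ranked candidate = target
    print("selected candidate:", best)
    ```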

  15. The CMS Outer Tracker for HL-LHC

    CERN Document Server

    Dierlamm, Alexander Hermann

    2018-01-01

    The LHC is planning an upgrade program, which will bring the luminosity to about $5-7\\times10^{34}$~cm$^{-2}$s$^{-1}$ in 2026, with a goal of an integrated luminosity of 3000 fb$^{-1}$ by the end of 2037. This High Luminosity LHC scenario, HL-LHC, will require a preparation program of the LHC detectors known as Phase-2 Upgrade. The current CMS Tracker is already running beyond design specifications and will not be able to cope with the HL-LHC radiation conditions. CMS will need a completely new Tracker in order to fully exploit the highly demanding operating conditions and the delivered luminosity. The new Outer Tracker system is designed to provide robust tracking as well as Level-1 trigger capabilities using closely spaced modules composed of silicon macro-pixel and/or strip sensors. Research and development activities are ongoing to explore options and develop module components and designs for the HL-LHC environment. The design choices for the CMS Outer Tracker Upgrade are discussed along with some highlig...

  16. The Disturbance of Gaze in Progressive Supranuclear Palsy (PSP: Implications for Pathogenesis

    Directory of Open Access Journals (Sweden)

    Athena L Chen

    2010-12-01

    Progressive supranuclear palsy (PSP) is a disease of later life that is currently regarded as a form of neurodegenerative tauopathy. Disturbance of gaze is a cardinal clinical feature of PSP that often helps clinicians to establish the diagnosis. Since the neurobiology of gaze control is now well understood, it is possible to use eye movements as investigational tools to understand aspects of the pathogenesis of PSP. In this review, we summarize each disorder of gaze control that occurs in PSP, drawing on our studies of fifty patients, and on reports from other laboratories that have measured the disturbances of eye movements. When these gaze disorders are approached by considering each functional class of eye movements and its neurobiological basis, a distinct pattern of eye movement deficits emerges that provides insight into the pathogenesis of PSP. Although some aspects of all forms of eye movements are affected in PSP, the predominant defects concern vertical saccades (slow and hypometric, both up and down), impaired vergence, and the inability to modulate the linear vestibulo-ocular reflex appropriately for viewing distance. These vertical and vergence eye movements habitually work in concert to enable visuomotor skills that are important during locomotion with the hands free. Taken with the prominent early feature of falls, these findings suggest that PSP tauopathy impairs a recently-evolved neural system concerned with bipedal locomotion in an erect posture and frequent gaze shifts between the distant environment and proximate hands. This approach provides a conceptual framework that can be used to address the nosological challenge posed by overlapping clinical and neuropathological features of neurodegenerative tauopathies.

  17. Energy-efficient digital and wireless IC design for wireless smart sensing

    Science.gov (United States)

    Zhou, Jun; Huang, Xiongchuan; Wang, Chao; Tae-Hyoung Kim, Tony; Lian, Yong

    2017-10-01

    Wireless smart sensing is now widely used in various applications such as health monitoring and structural monitoring. In conventional wireless sensor nodes, significant power is consumed in wirelessly transmitting the raw data. Smart sensing adds local intelligence to the sensor node and reduces the amount of wireless data transmission via on-node digital signal processing. While the total power consumption is reduced compared to conventional wireless sensing, the power consumption of the digital processing becomes as dominant as wireless data transmission. This paper reviews the state-of-the-art energy-efficient digital and wireless IC design techniques for reducing the power consumption of the wireless smart sensor node to prolong battery life and enable self-powered applications.
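
    To make the trade-off concrete, the back-of-the-envelope sketch below compares streaming raw samples against computing and transmitting a compact feature on the node; all energy figures and workload sizes are illustrative orders of magnitude, not values from the paper or from any specific radio or processor.

    ```python
    # Back-of-the-envelope illustration of the transmit-versus-process trade-off.
    # All numbers are hypothetical orders of magnitude, not measured values.
    E_TX_PER_BIT = 100e-9     # J/bit, hypothetical low-power radio
    E_PER_OP     = 20e-12     # J/operation, hypothetical on-node DSP

    samples, bits_per_sample = 1000, 12          # one second of a 1 kHz sensor
    ops_per_sample, feature_bits = 50, 32        # hypothetical feature extraction

    raw = samples * bits_per_sample * E_TX_PER_BIT
    smart = samples * ops_per_sample * E_PER_OP + feature_bits * E_TX_PER_BIT

    print(f"raw streaming : {raw * 1e6:8.1f} uJ")
    print(f"smart sensing : {smart * 1e6:8.1f} uJ  (processing and radio terms now comparable)")
    ```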

  18. Data acquisition software for the CMS strip tracker

    International Nuclear Information System (INIS)

    Bainbridge, R; Cripps, N; Fulcher, J; Radicci, V; Wingham, M; Baulieu, G; Bel, S; Delaere, C; Drouhin, F; Gill, K; Mirabito, L; Cole, J; Jesus, A C A; Giassi, A; Giordano, D; Gross, L; Hahn, K; Mersi, S; Nikolic, M; Tkaczyk, S

    2008-01-01

    The CMS silicon strip tracker, providing a sensitive area of approximately 200 m² and comprising 10 million readout channels, has recently been completed at the tracker integration facility at CERN. The strip tracker community is currently working to develop and integrate the online and offline software frameworks, known as XDAQ and CMSSW respectively, for the purposes of data acquisition and detector commissioning and monitoring. Recent developments have seen the integration of many new services and tools within the online data acquisition system, such as event building, online distributed analysis, an online monitoring framework, and data storage management. We review the various software components that comprise the strip tracker data acquisition system and the software architectures used for stand-alone and global data-taking modes. Our experiences in commissioning and operating one of the largest ever silicon micro-strip tracking systems are also reviewed.

  19. Dynamic modeling of patient and physician eye gaze to understand the effects of electronic health records on doctor-patient communication and attention.

    Science.gov (United States)

    Montague, Enid; Asan, Onur

    2014-03-01

    The aim of this study was to examine eye gaze patterns between patients and physicians while electronic health records were used to support patient care. Eye gaze provides an indication of physician attention to the patient, patient/physician interaction, and physician behaviors such as searching for and documenting information. A field study was conducted in which 100 patient visits were observed and video recorded in a primary care clinic. Videos were then coded for gaze behaviors: patients' and physicians' gaze at each other and at artifacts such as electronic health records was coded using a pre-established objective coding scheme. Gaze data were then analyzed using lag sequential methods. Results showed that several eye gaze patterns are significantly dependent on each other. All doctor-initiated gaze patterns were followed by patient gaze patterns. Some patient-initiated gaze patterns were also significantly followed by doctor gaze patterns, unlike the findings of previous studies. Health information technology appears to contribute to some of the new significant patterns that have emerged. Differences were also found in gaze patterns related to technology that differ from patterns identified in studies with paper charts. Several sequences related to patient-doctor-technology were also significant. Electronic health records affect the patient-physician eye contact dynamic differently than paper charts. This study identified several patterns of patient-physician interaction with electronic health record systems. Consistent with previous studies, physician-initiated gaze is an important driver of the interactions between patient and physician and between patient and technology. Published by Elsevier Ireland Ltd.
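
    As a rough illustration of the lag sequential idea, the sketch below counts lag-1 transitions between coded gaze states; the state codes and the example sequence are invented for illustration, not taken from the study's data.

    ```python
    # Toy lag-1 sequential analysis on coded gaze states (made-up codes and data).
    # Requires Python 3.10+ for itertools.pairwise.
    from collections import Counter
    from itertools import pairwise

    # Coded gaze targets per time bin, e.g. D = doctor gazes at patient,
    # P = patient gazes at doctor, T = doctor gazes at technology (EHR).
    sequence = ["D", "P", "D", "T", "T", "P", "D", "P", "T", "D", "P", "P"]

    transitions = Counter(pairwise(sequence))       # lag-1 transition counts
    totals = Counter(sequence[:-1])                 # how often each state initiates

    for (a, b), n in sorted(transitions.items()):
        print(f"{a} -> {b}: {n} ({n / totals[a]:.0%} of transitions from {a})")
    ```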

  20. Hovering by Gazing: A Novel Strategy for Implementing Saccadic Flight-Based Navigation in GPS-Denied Environments

    Directory of Open Access Journals (Sweden)

    Augustin Manecy

    2014-04-01

    Hovering flies are able to stay still in place when hovering above flowers and to burst into movement towards a new object of interest (a target). This suggests that the sensorimotor control loops implemented onboard could be usefully mimicked for controlling Unmanned Aerial Vehicles (UAVs). In this study, the fundamental head-body movements occurring in free-flying insects were simulated in a sighted twin-engine robot with a mechanical decoupling inserted between its eye (or gaze) and its body. The robot based on this gaze control system achieved robust and accurate hovering performance, without an accelerometer, over a ground target despite a narrow eye field of view (±5°). The gaze stabilization strategy, validated under Processor-In-the-Loop (PIL) and inspired by three biological Oculomotor Reflexes (ORs), enables the aerial robot to lock its gaze onto a fixed target regardless of its roll angle. In addition, the gaze control mechanism allows the robot to perform short-range target-to-target navigation by triggering an automatic fast “target jump” behaviour based on a saccadic eye movement.

  1. Optical Airborne Tracker System

    Data.gov (United States)

    National Aeronautics and Space Administration — The Optical Airborne Tracker System (OATS) is an airborne dual-axis optical tracking system capable of pointing at any sky location or ground target.  The objectives...

  2. Gazing toward humans: a study on water rescue dogs using the impossible task paradigm.

    Science.gov (United States)

    D'Aniello, Biagio; Scandurra, Anna; Prato-Previde, Emanuela; Valsecchi, Paola

    2015-01-01

    Various studies have assessed the role of life experiences, including learning opportunities, living conditions and the quality of dog-human relationships, in the use of human cues and problem-solving ability. The current study investigates how and to what extent training affects the behaviour of dogs and their communication with humans by comparing dogs trained for a water rescue service and untrained pet dogs in the impossible task paradigm. Twenty-three certified water rescue dogs (the water rescue group) and 17 dogs with no training experience (the untrained group) were tested using a modified version of the impossible task described by Marshall-Pescini et al. in 2009. The results demonstrated that the water rescue dogs directed their first gaze significantly more often towards the owner and spent more time gazing toward the two people present, compared to the untrained pet dogs. There was no difference between the dogs of the two groups in the amount of time spent gazing at the owner or the stranger, nor in the interaction with the apparatus while attempting to obtain food. The specific training regime, aimed at promoting cooperation during the performance of water rescue, could account for the longer gazing behaviour shown toward people by the water rescue dogs and for the priority of gazing toward the owner. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Latvian government in double jeopardy with EU, Latvijas Gaze

    Index Scriptorium Estoniae

    2005-01-01

    Latvia wants to obtain a postponement from the European Commission of the liberalisation of its gas market until 2010, and promises in that case to conclude an agreement with Latvijas Gaze under which the latter would give up its exclusive right to supply gas in Latvia.

  4. Intranasal Oxytocin Treatment Increases Eye-Gaze Behavior toward the Owner in Ancient Japanese Dog Breeds

    Directory of Open Access Journals (Sweden)

    Miho Nagasawa

    2017-09-01

    Dogs acquired unique cognitive abilities during domestication, which is thought to have contributed to the formation of the human-dog bond. In European breeds, but not in wolves, a dog’s gazing behavior plays an important role in affiliative interactions with humans and stimulates oxytocin secretion in both humans and dogs, which suggests that this interspecies oxytocin- and gaze-mediated bonding was also acquired during domestication. In this study, we investigated whether Japanese breeds, which are classified as ancient breeds and are relatively close to wolves genetically, establish a bond with their owners through gazing behavior. The subject dogs were treated with either oxytocin or saline before the start of the behavioral testing. We also evaluated physiological changes in the owners during mutual gazing by analyzing their heart rate variability (HRV) and subsequent urinary oxytocin levels in both dogs and their owners. We found that oxytocin treatment enhanced the gazing behavior of Japanese dogs and increased their owners’ urinary oxytocin levels, as was seen with European breeds; however, the measured durations of skin contact and proximity to their owners were relatively low. In the owners’ HRV readings, inter-beat (R-R) intervals (RRI), the standard deviation of normal-to-normal inter-beat (R-R) intervals (SDNN), and the root mean square of successive heartbeat interval differences (RMSSD) were lower when the dogs were treated with oxytocin compared with saline. Furthermore, the owners of female dogs showed lower SDNN than the owners of male dogs. These results suggest that the owners of female Japanese dogs exhibit more tension during interactions, and apart from gazing behavior, the dogs may show sex differences in their interactions with humans as well. They also suggest that Japanese dogs use eye-gazing as an attachment behavior toward humans similar to European breeds; however, there is a disparity between the dog sexes when

  5. Activity trackers: a critical review.

    Science.gov (United States)

    Lee, Jeon; Finkelstein, Joseph

    2014-01-01

    Wearable consumer health devices can be broadly divided into activity trackers, sleep trackers, and stress management devices. These devices are widely advertised as providing positive effects on the user's daily behaviours and overall health. However, objective evidence supporting these claims appears to be missing. The goal of this study was to review available evidence pertaining to the performance of activity trackers. A comprehensive review of available information was conducted for seven representative devices and the validity of marketing claims was assessed. The device assessment was based on the availability of verified output metrics, theoretical frameworks, systematic evaluation, and FDA clearance. The review identified a critical absence of supporting evidence for the advertised functions and benefits of the majority of the devices. Six out of seven devices did not provide any information on sensor accuracy and output validity at all. Possible underestimation or overestimation of specific health indicators reported to consumers was not clearly disclosed to the public. Furthermore, significant limitations of these devices, which can be categorized into user restrictions, user responsibilities and company disclaimers, could not be easily found or comprehended by unsophisticated users and may represent a serious health hazard.

  6. The CMS tracker control system

    International Nuclear Information System (INIS)

    Dierlamm, A; Dirkes, G H; Fahrer, M; Frey, M; Hartmann, F; Masetti, L; Militaru, O; Shah, S Y; Stringer, R; Tsirou, A

    2008-01-01

    The Tracker Control System (TCS) is a distributed control software to operate about 2000 power supplies for the silicon modules of the CMS Tracker and to monitor its environmental sensors. TCS must thus be able to handle about 10^4 power supply parameters, about 10^3 environmental probes from the Programmable Logic Controllers of the Tracker Safety System (TSS), and about 10^5 parameters read via DAQ from the DCUs in all front-end hybrids and from CCUs in all control groups. TCS is built on top of an industrial SCADA program (PVSS) extended with a framework developed at CERN (JCOP) and used by all LHC experiments. The logical partitioning of the detector is reflected in the hierarchical structure of the TCS, where commands move down to the individual hardware devices, while states are reported up to the root, which is interfaced to the broader CMS control system. The system computes and continuously monitors the mean and maximum values of critical parameters and updates the percentage of currently operating hardware. Automatic procedures switch off selected parts of the detector using detailed granularity and avoiding widespread TSS intervention.

  7. [Left lateral gaze paresis due to subcortical hematoma in the right precentral gyrus].

    Science.gov (United States)

    Sato, K; Takamori, M

    1998-03-01

    We report a case of transient left lateral gaze paresis due to a hemorrhagic lesion restricted in the right precentral gyrus. A 74-year-old female experienced a sudden clumsiness of the left upper extremity. A neurological examination revealed a left central facial paresis, distal dominant muscle weakness in the left upper limb and left lateral gaze paresis. There were no other focal neurological signs. Laboratory data were all normal. Brain CTs and MRIs demonstrated a subcortical hematoma in the right precentral gyrus. The neurological symptoms and signs disappeared over seven days. A recent physiological study suggested that the human frontal eye field (FEF) is located in the posterior part of the middle frontal gyrus (Brodmann's area 8) and the precentral gyrus around the precentral sulcus. More recent studies stressed the role of the precentral sulcus and the precentral gyrus. Our case supports those physiological findings. The hematoma affected both the FEF and its underlying white matter in our case. We assume the lateral gaze paresis is attributable to the disruption of the fibers from the FEF. It is likely that fibers for motor control of the face, upper extremity, and lateral gaze lie adjacently in the subcortical area.

  8. Subjects and Objects of the Embodied Gaze: Abbas Kiarostami and the Real of the Individual Perspective

    OpenAIRE

    Gyenge Zsolt

    2016-01-01

    It is widely accepted that Abbas Kiarostami’s cinema revolves around the representation of the gaze. Many critics argue that he should be considered a late modernist who repeats the self-reflexive gestures of modernist European cinema decades after they were first introduced. The present paper will contradict this assertion by investigating the problematic of the Kiarostamian gaze and analyzing the perceptual side of the act of looking. I will argue that instead of focusing on the gaze of the...

  9. Reliability and validity of ten consumer activity trackers

    NARCIS (Netherlands)

    Kooiman, Thea; Dontje, Manon L.; Sprenger, Siska; Krijnen, Wim; van der Schans, Cees; de Groot, Martijn

    2015-01-01

    Background: Activity trackers can potentially stimulate users to increase their physical activity behavior. The aim of this study was to examine the reliability and validity of ten consumer activity trackers for measuring step count in both laboratory and free-living conditions. Method: Healthy

  10. Wireless device monitoring methods, wireless device monitoring systems, and articles of manufacture

    Science.gov (United States)

    McCown, Steven H [Rigby, ID; Derr, Kurt W [Idaho Falls, ID; Rohde, Kenneth W [Idaho Falls, ID

    2012-05-08

    Wireless device monitoring methods, wireless device monitoring systems, and articles of manufacture are described. According to one embodiment, a wireless device monitoring method includes accessing device configuration information of a wireless device present at a secure area, wherein the device configuration information comprises information regarding a configuration of the wireless device, accessing stored information corresponding to the wireless device, wherein the stored information comprises information regarding the configuration of the wireless device, comparing the device configuration information with the stored information, and indicating the wireless device as one of authorized and unauthorized for presence at the secure area using the comparing.

  11. Aerodynamical study of a photovoltaic solar tracker

    OpenAIRE

    Gutiérrez Castillo, José Leonardo

    2016-01-01

    Investigate the aerodynamic features of ground-mounted solar trackers under atmospheric boundary layer flows. Study and identify the aerodynamical interactions of solar trackers when they are displayed as an array. State of the art. Literature review about CFD applied to solar panels. Analytic approach of the problem. Application of CFD analysis. Validation of the results. Discussion of the results. Improvements proposal.

  12. How does image noise affect actual and predicted human gaze allocation in assessing image quality?

    Science.gov (United States)

    Röhrbein, Florian; Goddard, Peter; Schneider, Michael; James, Georgina; Guo, Kun

    2015-07-01

    A central research question in natural vision is how to allocate fixations to extract informative cues for scene perception. With high quality images, psychological and computational studies have made significant progress in understanding and predicting human gaze allocation in scene exploration. However, it is unclear whether these findings can be generalised to degraded naturalistic visual inputs. In this eye-tracking and computational study, we methodically distorted both man-made and natural scenes with a Gaussian low-pass filter, a circular averaging filter and additive Gaussian white noise, and monitored participants' gaze behaviour while they assessed perceived image quality. Compared with the original high quality images, distorted images attracted fewer fixations but longer fixation durations, shorter saccade distances and a stronger central fixation bias. This impact of image noise manipulation on gaze distribution was mainly determined by noise intensity rather than noise type, and was more pronounced for natural scenes than for man-made scenes. We further compared four high performing visual attention models in predicting human gaze allocation in degraded scenes, and found that model performance lacked human-like sensitivity to noise type and intensity, and was considerably worse than human performance measured as inter-observer variance. Furthermore, the central fixation bias is a major predictor of human gaze allocation, which becomes more prominent with increased noise intensity. Our results indicate a crucial role of external noise intensity in determining scene-viewing gaze behaviour, which should be considered in the development of realistic human-vision-inspired attention models. Copyright © 2015 Elsevier Ltd. All rights reserved.
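
    For illustration, the three distortion types named above can be reproduced with standard NumPy/SciPy operations, as sketched below; the kernel sizes and noise level are arbitrary choices, not the settings used in the study.

    ```python
    # Sketch of the three distortion types named in the abstract; parameter values
    # are illustrative, not the study's.
    import numpy as np
    from scipy import ndimage

    def gaussian_lowpass(img: np.ndarray, sigma: float = 3.0) -> np.ndarray:
        return ndimage.gaussian_filter(img, sigma=sigma)

    def circular_average(img: np.ndarray, radius: int = 5) -> np.ndarray:
        y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
        disk = (x**2 + y**2 <= radius**2).astype(float)
        return ndimage.convolve(img, disk / disk.sum(), mode="reflect")

    def additive_gaussian_noise(img: np.ndarray, std: float = 0.1,
                                rng=np.random.default_rng(0)) -> np.ndarray:
        return np.clip(img + rng.normal(0.0, std, img.shape), 0.0, 1.0)

    img = np.random.default_rng(1).random((64, 64))   # stand-in grayscale scene
    for name, distort in [("gaussian low-pass", gaussian_lowpass),
                          ("circular averaging", circular_average),
                          ("additive white noise", additive_gaussian_noise)]:
        print(name, distort(img).shape)
    ```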

  13. The Tourist Gaze and ‘Family Treasure Trails’ in Museums

    DEFF Research Database (Denmark)

    Larsen, Jonas; Svabo, Connie

    2014-01-01

    Museums are largely neglected in the tourist research literature. This is even more striking given that they are arguably designed for gazing. There is little doubt that “graying” of the Western population adds to the number and range of museums. And yet, even in adult museums, there will be children who are “dragged along.” Museums are increasingly aware of such conflicts and dilemmas. Many museums offer printed booklets with “treasure trails.” They afford a trail through the museum that forms a treasure hunt for specific objects and correct answers to questions related to the objects. This article draws attention to this overlooked, mundane technology and gives it its deserved share of the limelight. We are concerned with exploring ethnographically how trails are designed and especially used by young families in museums for gazing. The article gives insight into how children, broadly...

  14. Laser tracker TSPI uncertainty quantification via centrifuge trajectory

    Science.gov (United States)

    Romero, Edward; Paez, Thomas; Brown, Timothy; Miller, Timothy

    2009-08-01

    Sandia National Laboratories currently utilizes two laser tracking systems to provide time-space-position-information (TSPI) and high speed digital imaging of test units under flight. These laser trackers have been in operation for decades under the premise of theoretical accuracies based on system design and operator estimates. Advances in optical imaging and atmospheric tracking technology have enabled opportunities to provide more precise six degree of freedom measurements from these trackers. Applying these technologies to the laser trackers requires a quantified understanding of their current errors and uncertainty. It was well understood that an assortment of variables contributed to laser tracker uncertainty, but the magnitude of these contributions was not quantified and documented. A series of experiments was performed at Sandia National Laboratories' large centrifuge complex to quantify the TSPI uncertainties of Sandia National Laboratories laser tracker III. The centrifuge was used to provide repeatable and economical test-unit trajectories for TSPI comparison and uncertainty analysis. On a centrifuge, test units continuously undergo a known trajectory with a known angular velocity. Each revolution may represent an independent test, which may be repeated many times over to produce quantities of data practical for statistical analysis. Previously these tests were performed at Sandia's rocket sled track facility, but were found to be costly, with challenges in measuring ground-truth TSPI. The centrifuge, along with on-board measurement equipment, was used to provide the known ground-truth position of test units. This paper discusses the experimental design and techniques used to arrive at measures of laser tracker error and uncertainty.
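
    The appeal of the centrifuge is that the ground-truth trajectory is analytic: a circle of known radius traversed at a known angular rate. A toy comparison of simulated tracker TSPI against such a trajectory is sketched below; the radius, rate and noise level are made-up illustrative values.

    ```python
    # Toy comparison of tracker TSPI against an analytic centrifuge ground truth.
    # Radius, angular rate and noise level are hypothetical, not Sandia's values.
    import numpy as np

    r, omega = 8.0, 2.0 * np.pi * 0.5          # 8 m arm, 0.5 rev/s (hypothetical)
    t = np.linspace(0.0, 10.0, 2001)
    truth = np.column_stack([r * np.cos(omega * t), r * np.sin(omega * t)])

    rng = np.random.default_rng(4)
    measured = truth + rng.normal(0.0, 0.01, truth.shape)   # simulated tracker TSPI

    err = np.linalg.norm(measured - truth, axis=1)
    print(f"mean error {err.mean() * 1e3:.1f} mm, "
          f"95th percentile {np.percentile(err, 95) * 1e3:.1f} mm")
    ```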

  15. The research and development of the automatic solar power tracker

    OpenAIRE

    Li Yan Ping; Yuan Zhong Ying

    2016-01-01

    The article describes a kind of automatic solar power tracker. It relies on two key subsystems, a servo system and an adjusting mechanism, to keep the tracker operating normally. The article focuses on the characteristics and functions of these two systems and on the operating details of the automatic solar power tracker.

  16. Advances in RGB and RGBD Generic Object Trackers

    KAUST Repository

    Bibi, Adel Aamer

    2016-01-01

    new state-of-the-art trackers. One of which is 3D based tracker in a particle filter framework where both synchronization and registration of RGB and depth streams are adjusted automatically, and three works in correlation filters that achieve state-of-the-art

  17. A simulator-based approach to evaluating optical trackers

    NARCIS (Netherlands)

    Smit, F.A.; Liere, van R.

    2009-01-01

    We describe a software framework to evaluate the performance of model-based optical trackers in virtual environments. The framework can be used to evaluate and compare the performance of different trackers under various conditions, to study the effects of varying intrinsic and extrinsic camera

  18. Eye Contact and Fear of Being Laughed at in a Gaze Discrimination Task

    Directory of Open Access Journals (Sweden)

    Jorge Torres-Marín

    2017-11-01

    Current approaches conceptualize gelotophobia as a personality trait characterized by a disproportionate fear of being laughed at by others. Consistently with this perspective, gelotophobes are also described as neurotic and introverted and as having a paranoid tendency to anticipate derision and mockery situations. Although research on gelotophobia has significantly progressed over the past two decades, no evidence exists concerning the potential effects of gelotophobia on reactions to eye contact. Previous research has pointed to difficulties in discriminating gaze direction as the basis of possible misinterpretations of others’ intentions or mental states. The aim of the present research was to examine whether gelotophobia predisposition modulates the effects of eye contact (i.e., gaze discrimination) when processing faces portraying several emotional expressions. In two different experiments, participants performed an experimental gaze discrimination task in which they responded, as quickly and accurately as possible, to the eyes’ directions on faces displaying either a happy, angry, fearful, neutral, or sad emotional expression. In particular, we expected trait-gelotophobia to modulate the eye contact effect, showing specific group differences in the happiness condition. The results of Study 1 (N = 40) indicated that gelotophobes made more errors than non-gelotophobes did in the gaze discrimination task. In contrast to our initial hypothesis, the happiness expression did not have any special role in the observed differences between individuals with high vs. low trait-gelotophobia. In Study 2 (N = 40), we replicated the pattern of data concerning gaze discrimination ability, even after controlling for individuals’ scores on social anxiety. Furthermore, in our second experiment, we found that gelotophobes did not exhibit any problem with identifying others’ emotions, or a general incorrect attribution of affective features, such as valence

  19. Advances in RGB and RGBD Generic Object Trackers

    KAUST Repository

    Bibi, Adel

    2016-04-01

    Visual object tracking is a classical and very popular problem in computer vision with a plethora of applications such as vehicle navigation, human computer interfaces, human motion analysis, surveillance, auto-control systems and many more. Given the initial state of a target in the first frame, the goal of tracking is to predict the states of the target over time, where the states describe a bounding box covering the target. Despite the numerous object tracking methods proposed in recent years [1-4], most trackers suffer a degradation in performance mainly because of several challenges, including illumination changes, motion blur, complex motion, out-of-plane rotation, and partial or full occlusion; occlusion is usually the factor that most degrades the majority of trackers, if not all of them. This thesis is devoted to the advancement of generic object trackers, tackling different challenges through different proposed methods. The work presented proposes four new state-of-the-art trackers: a 3D tracker in a particle filter framework, in which both synchronization and registration of the RGB and depth streams are adjusted automatically, and three correlation-filter trackers that achieve state-of-the-art performance in terms of accuracy while maintaining reasonable speeds.
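
    As a rough illustration of the correlation-filter family referred to above (not one of the thesis' own trackers), a single-channel MOSSE-style filter can be trained and applied in the Fourier domain as sketched below; cosine windowing, online updates and scale handling are omitted.

    ```python
    # Minimal single-channel correlation filter (MOSSE-style) sketch: train on one
    # patch with a Gaussian target response, then locate the target in a shifted
    # frame from the response peak. Synthetic data only.
    import numpy as np

    def train_filter(patch: np.ndarray, target: np.ndarray, lam: float = 1e-2):
        F = np.fft.fft2(patch)
        G = np.fft.fft2(target)
        return (G * np.conj(F)) / (F * np.conj(F) + lam)   # conjugate filter H*

    def respond(H_conj: np.ndarray, patch: np.ndarray) -> np.ndarray:
        return np.real(np.fft.ifft2(np.fft.fft2(patch) * H_conj))

    size = 64
    yy, xx = np.mgrid[:size, :size]
    gauss = np.exp(-(((yy - size // 2) ** 2 + (xx - size // 2) ** 2) / (2 * 3.0 ** 2)))

    rng = np.random.default_rng(5)
    frame0 = rng.random((size, size))
    H_conj = train_filter(frame0, gauss)

    shifted = np.roll(frame0, (5, -3), axis=(0, 1))        # target moved by (5, -3)
    resp = respond(H_conj, shifted)
    dy, dx = np.unravel_index(np.argmax(resp), resp.shape)
    print("estimated displacement:", dy - size // 2, dx - size // 2)
    ```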

  20. Direct tracking error characterization on a single-axis solar tracker

    International Nuclear Information System (INIS)

    Sallaberry, Fabienne; Pujol-Nadal, Ramon; Larcher, Marco; Rittmann-Frank, Mercedes Hannelore

    2015-01-01

    Highlights: • The solar tracker of a small-size parabolic trough collector was tested. • A testing procedure for the tracking error characterization of a single-axis tracker was proposed. • A statistical analysis of the tracking error distribution was done with regard to different variables. • The optical losses due to the tracking error were calculated based on a ray-tracing simulation. - Abstract: Solar trackers are devices used to orient solar concentrating systems in order to increase the focusing of the solar radiation on a receiver. A solar concentrator with a medium or high concentration ratio needs to be oriented correctly by an accurate solar tracking mechanism to avoid losing the sun's rays from the receiver. Hence, to obtain appropriate operation, it is important to know the accuracy of a solar tracker with regard to the required precision of the concentrator in order to maximize the collector's optical efficiency. A procedure for characterizing the accuracy of a solar tracker is presented for a single-axis solar tracker. More precisely, this study focuses on the estimation of the positioning angle error of a parabolic trough collector using a direct procedure. A testing procedure, adapted from the international standard IEC 62817 for photovoltaic trackers, was defined. The results show that the angular tracking error was within ±0.4° for this tracker. The optical losses due to tracking were calculated using the longitudinal incidence angle modifier obtained by ray-tracing simulation. The acceptance angles for various transversal angles were analyzed, and the average optical loss due to tracking was 0.317% during the whole testing campaign. The procedure presented in this work showed that the tracker precision was adequate for the requirements of the analyzed optical system.

  1. Gaze-based hints during child-robot gameplay

    NARCIS (Netherlands)

    Mwangi, E.; Barakova, Emilia I.; Diaz, M.L.Z.; Mallofre, A.C.; Rauterberg, G.W.M.; Kheddar, A.; Yoshida, E.; Sam Ge, S.; Suzuki, K.; Cabibihan, J.J.; Eyssel, F.; He, H.

    2017-01-01

    This paper presents a study that examines whether gaze hints provided by a robot tutor influences the behavior of children in a card matching game. In this regard, we conducted a within-subjects experiment, in which children played a card game “Memory” in the presence of a robot tutor in two

  2. Gaze Cueing by Pareidolia Faces

    OpenAIRE

    Kohske Takahashi; Katsumi Watanabe

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cuei...

  3. Wireless mesh networks.

    Science.gov (United States)

    Wang, Xinheng

    2008-01-01

    Wireless telemedicine using GSM and GPRS technologies can only provide low bandwidth connections, which makes it difficult to transmit images and video. Satellite or 3G wireless transmission provides greater bandwidth, but the running costs are high. Wireless networks (WLANs) appear promising, since they can supply high bandwidth at low cost. However, the WLAN technology has limitations, such as coverage. A new wireless networking technology named the wireless mesh network (WMN) overcomes some of the limitations of the WLAN. A WMN combines the characteristics of both a WLAN and ad hoc networks, thus forming an intelligent, large scale and broadband wireless network. These features are attractive for telemedicine and telecare because of the ability to provide data, voice and video communications over a large area. One successful wireless telemedicine project which uses wireless mesh technology is the Emergency Room Link (ER-LINK) in Tucson, Arizona, USA. There are three key characteristics of a WMN: self-organization, including self-management and self-healing; dynamic changes in network topology; and scalability. What we may now see is a shift from mobile communication and satellite systems for wireless telemedicine to the use of wireless networks based on mesh technology, since the latter are very attractive in terms of cost, reliability and speed.

  4. The research and development of the automatic solar power tracker

    Directory of Open Access Journals (Sweden)

    Li Yan Ping

    2016-01-01

    The article describes a kind of automatic solar power tracker. It relies on two key subsystems, a servo system and an adjusting mechanism, to keep the tracker operating normally. The article focuses on the characteristics and functions of these two systems and on the operating details of the automatic solar power tracker.

  5. Holistic integration of gaze cues in visual face and body perception: Evidence from the composite design.

    Science.gov (United States)

    Vrancken, Leia; Germeys, Filip; Verfaillie, Karl

    2017-01-01

    A considerable amount of research on identity recognition and emotion identification with the composite design points to the holistic processing of these aspects in faces and bodies. In this paradigm, the interference from a nonattended face half on the perception of the attended half is taken as evidence for holistic processing (i.e., a composite effect). Far less research, however, has been dedicated to the concept of gaze. Nonetheless, gaze perception is a substantial component of face and body perception, and holds critical information for everyday communicative interactions. Furthermore, the ability of human observers to detect direct versus averted eye gaze is effortless, perhaps similar to identity perception and emotion recognition. However, the hypothesis of holistic perception of eye gaze has never been tested directly. Research on gaze perception with the composite design could facilitate further systematic comparison with other aspects of face and body perception that have been investigated using the composite design (i.e., identity and emotion). In the present research, a composite design was administered to assess holistic processing of gaze cues in faces (Experiment 1) and bodies (Experiment 2). Results confirmed that eye and head orientation (Experiment 1A) and head and body orientation (Experiment 2A) are integrated in a holistic manner. However, the composite effect was not completely disrupted by inversion (Experiments 1B and 2B), a finding that will be discussed together with implications for future research.

  6. The Rethorics of Gaze in Luhrmann's "Postmodern Great Gatsby"

    Directory of Open Access Journals (Sweden)

    Paola Fallerini

    2014-05-01

    Adopting the perspective suggested by the rhetoric of the gaze (Laura Mulvey), the paper highlights the metalinguistic and metatextual reflection through which this movie contributes to the critical interpretation of Fitzgerald’s novel.

  7. Wireless virtualization

    CERN Document Server

    Wen, Heming; Le-Ngoc, Tho

    2013-01-01

    This SpringerBriefs is an overview of the emerging field of wireless access and mobile network virtualization. It provides a clear and relevant picture of the current virtualization trends in wireless technologies by summarizing and comparing different architectures, techniques and technologies applicable to a future virtualized wireless network infrastructure. The readers are exposed to a short walkthrough of the future Internet initiative and network virtualization technologies in order to understand the potential role of wireless virtualization in the broader context of next-generation ubiq

  8. Attention and gaze shifting in dual-task and go/no-go performance with vocal responding

    NARCIS (Netherlands)

    Lamers, M.J.M.; Roelofs, A.P.A.

    2011-01-01

    Evidence from go/no-go performance on the Eriksen flanker task with manual responding suggests that individuals gaze at stimuli just as long as needed to identify them (e.g., Sanders, 1998). In contrast, evidence from dual-task performance with vocal responding suggests that gaze shifts occur after

  9. The LHCb Silicon Tracker Project

    International Nuclear Information System (INIS)

    Agari, M.; Bauer, C.; Baumeister, D.; Blouw, J.; Hofmann, W.; Knoepfle, K.T.; Loechner, S.; Schmelling, M.; Pugatch, V.; Bay, A.; Carron, B.; Frei, R.; Jiminez-Otero, S.; Tran, M.-T.; Voss, H.; Adeva, B.; Esperante, D.; Lois, C.; Vasquez, P.; Bernhard, R.P.; Bernet, R.; Ermoline, Y.; Gassner, J.; Koestner, S.; Lehner, F.; Needham, M.; Siegler, M.; Steinkamp, O.; Straumann, U.; Vollhardt, A.; Volyanskyy, D.

    2006-01-01

    Two silicon strip detectors, the Trigger Tracker (TT) and the Inner Tracker (IT), will be constructed for the LHCb experiment. Transverse momentum information extracted from the TT will be used in the Level 1 trigger. The IT is part of the main tracking system behind the magnet. Both silicon detectors will be read out using a custom-developed chip from the ASIC lab in Heidelberg. The signal-to-noise behavior and performance of various geometrical designs of the silicon sensors, in conjunction with the Beetle read-out chip, have been extensively studied in test beam experiments. Results from those experiments are presented, and have been used in the final choice of sensor geometry

  10. The CMS silicon strip tracker

    International Nuclear Information System (INIS)

    Focardi, E.; Albergo, S.; Angarano, M.; Azzi, P.; Babucci, E.; Bacchetta, N.; Bader, A.; Bagliesi, G.; Bartalini, P.; Basti, A.; Biggeri, U.; Bilei, G.M.; Bisello, D.; Boemi, D.; Bosi, F.; Borrello, L.; Bozzi, C.; Braibant, S.; Breuker, H.; Bruzzi, M.; Candelori, A.; Caner, A.; Castaldi, R.; Castro, A.; Catacchini, E.; Checcucci, B.; Ciampolini, P.; Civinini, C.; Creanza, D.; D'Alessandro, R.; Da Rold, M.; Demaria, N.; De Palma, M.; Dell'Orso, R.; Marina, R. Della; Dutta, S.; Eklund, C.; Elliott-Peisert, A.; Feld, L.; Fiore, L.; French, M.; Freudenreich, K.; Fuertjes, A.; Giassi, A.; Giraldo, A.; Glessing, B.; Gu, W.H.; Hall, G.; Hammerstrom, R.; Hebbeker, T.; Hrubec, J.; Huhtinen, M.; Kaminsky, A.; Karimaki, V.; Koenig, St.; Krammer, M.; Lariccia, P.; Lenzi, M.; Loreti, M.; Luebelsmeyer, K.; Lustermann, W.; Maettig, P.; Maggi, G.; Mannelli, M.; Mantovani, G.; Marchioro, A.; Mariotti, C.; Martignon, G.; Evoy, B. Mc; Meschini, M.; Messineo, A.; My, S.; Paccagnella, A.; Palla, F.; Pandoulas, D.; Parrini, G.; Passeri, D.; Pieri, M.; Piperov, S.; Potenza, R.; Raffaelli, F.; Raso, G.; Raymond, M.; Santocchia, A.; Schmitt, B.; Selvaggi, G.; Servoli, L.; Sguazzoni, G.; Siedling, R.; Silvestris, L.; Skog, K.; Starodumov, A.; Stavitski, I.; Stefanini, G.; Tempesta, P.; Tonelli, G.; Tricomi, A.; Tuuva, T.; Vannini, C.; Verdini, P.G.; Viertel, G.; Xie, Z.; Wang, Y.; Watts, S.; Wittmer, B.

    1999-01-01

    The Silicon Strip Tracker (SST) is the intermediate part of the CMS Central Tracker System. SST is based on microstrip silicon devices and, in combination with pixel detectors and the Microstrip Gas Chambers, aims at performing pattern recognition, track reconstruction and momentum measurements for all tracks with p_T ≥ 2 GeV/c originating from high luminosity interactions at √s = 14 TeV at the LHC. We aim at exploiting the advantages and the physics potential of the precise tracking performance provided by the microstrip silicon detectors on a large scale apparatus and in a much more difficult environment than ever. In this paper we describe the actual SST layout and the readout system. (author)

  11. Cultural differences in gaze and emotion recognition: Americans contrast more than Chinese.

    Science.gov (United States)

    Stanley, Jennifer Tehan; Zhang, Xin; Fung, Helene H; Isaacowitz, Derek M

    2013-02-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye-tracking data suggest that, for some emotions, Americans attended more to the target faces, and they made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Stay tuned: Inter-individual neural synchronization during mutual gaze and joint attention

    Directory of Open Access Journals (Sweden)

    Daisuke N Saito

    2010-11-01

    Eye contact provides a communicative link between humans, prompting joint attention. As spontaneous brain activity may have an important role in the coordination of neuronal processing within the brain, inter-subject synchronization of this activity may occur during eye contact. To test this, we conducted simultaneous functional MRI in pairs of adults. Eye contact was maintained at baseline while the subjects engaged in real-time gaze exchange in a joint attention task. Averted gaze activated the bilateral occipital pole extending to the right posterior superior temporal sulcus, the dorso-medial prefrontal cortex, and the bilateral inferior frontal gyrus. Following a partner’s gaze towards an object activated the left intraparietal sulcus. After all task-related effects were modeled out, an inter-individual correlation analysis of the residual time-courses was performed. Paired subjects showed more prominent correlations than non-paired subjects in the right inferior frontal gyrus, suggesting that this region is involved in sharing intention during eye contact, which provides the context for joint attention.

  13. Work on the ATLAS semiconductor tracker barrel

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    Precision work is performed on the semiconductor tracker barrel of the ATLAS experiment. All work on these delicate components must be performed in a clean room so that impurities in the air, such as dust, do not contaminate the detector. The semiconductor tracker will be mounted in the barrel close to the heart of the ATLAS experiment to detect the path of particles produced in proton-proton collisions.

  14. Looking at Eye Gaze Processing and Its Neural Correlates in Infancy--Implications for Social Development and Autism Spectrum Disorder

    Science.gov (United States)

    Hoehl, Stefanie; Reid, Vincent M.; Parise, Eugenio; Handl, Andrea; Palumbo, Letizia; Striano, Tricia

    2009-01-01

    The importance of eye gaze as a means of communication is indisputable. However, there is debate about whether there is a dedicated neural module, which functions as an eye gaze detector and when infants are able to use eye gaze cues in a referential way. The application of neuroscience methodologies to developmental psychology has provided new…

  15. Head mounted device for point-of-gaze estimation in three dimensions

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Lidegaard, Morten; Krüger, Norbert

    2014-01-01

    This paper presents a fully calibrated extended geometric approach for gaze estimation in three dimensions (3D). The methodology is based on a geometric approach utilising a fully calibrated binocular setup constructed as a head-mounted system. The approach is based on utilisation of two ordinary...... in the horizontal and vertical dimensions regarding fixations. However, even though the workspace is limited, the fact that the system is designed as a head-mounted device, the workspace volume is relatively positioned to the pose of the device. Hence gaze can be estimated in 3D with relatively free head...

  16. Wireless Biological Electronic Sensors.

    Science.gov (United States)

    Cui, Yue

    2017-10-09

    The development of wireless biological electronic sensors could open up significant advances for both fundamental studies and practical applications in a variety of areas, including medical diagnosis, environmental monitoring, and defense applications. One of the major challenges in the development of wireless bioelectronic sensors is the successful integration of biosensing units and wireless signal transducers. In recent years, there are a few types of wireless communication systems that have been integrated with biosensing systems to construct wireless bioelectronic sensors. To successfully construct wireless biological electronic sensors, there are several interesting questions: What types of biosensing transducers can be used in wireless bioelectronic sensors? What types of wireless systems can be integrated with biosensing transducers to construct wireless bioelectronic sensors? How are the electrical sensing signals generated and transmitted? This review will highlight the early attempts to address these questions in the development of wireless biological electronic sensors.

  17. Influence of Gaze Direction on Face Recognition: A Sensitive Effect

    Directory of Open Access Journals (Sweden)

    Noémy Daury

    2011-08-01

    Full Text Available This study was aimed at determining the conditions in which eye-contact may improve recognition memory for faces. Different stimuli and procedures were tested in four experiments. The effect of gaze direction on memory was found when a simple “yes-no” recognition task was used but not when the recognition task was more complex (e.g., including “Remember-Know” judgements, cf. Experiment 2, or confidence ratings, cf. Experiment 4). Moreover, even when a “yes-no” recognition paradigm was used, the effect occurred with one series of stimuli (cf. Experiment 1) but not with another one (cf. Experiment 3). The difficulty of producing the positive effect of gaze direction on memory is discussed.

  18. The Gaze-Cueing Effect in the United States and Japan: Influence of Cultural Differences in Cognitive Strategies on Control of Attention

    OpenAIRE

    Saki Takao; Yusuke Yamani; Atsunori Ariga

    2018-01-01

    The direction of gaze automatically and exogenously guides visual spatial attention, a phenomenon termed as the gaze-cueing effect. Although this effect arises when the duration of stimulus onset asynchrony (SOA) between a non-predictive gaze cue and the target is relatively long, no empirical research has examined the factors underlying this extended cueing effect. Two experiments compared the gaze-cueing effect at longer SOAs (700 ms) in Japanese and American participants. Cross-cultural st...

  19. Context-sensitivity in Conversation. Eye gaze and the German Repair Initiator ‘bitte?’ (´pardon?´)

    DEFF Research Database (Denmark)

    Egbert, Maria

    1996-01-01

    . In addition, repair is sensitive to certain characteristics of social situations. The selection of a particular repair initiator, German bitte? ‘pardon?’, indexes that there is no mutual gaze between interlocutors; i.e., there is no common course of action. The selection of bitte? not only initiates repair......; it also spurs establishment of mutual gaze, and thus displays that there is attention to a common focus. (Conversation analysis, context, cross-linguistic analysis, repair, gaze, telephone conversation, co-present interaction, grammar and interaction)...

  20. Gender and facial dominance in gaze cuing: Emotional context matters in the eyes that we follow

    NARCIS (Netherlands)

    Ohlsen, G.; van Zoest, W.; van Vugt, M.

    2013-01-01

    Gaze following is a socio-cognitive process that provides adaptive information about potential threats and opportunities in the individual's environment. The aim of the present study was to investigate the potential interaction between emotional context and facial dominance in gaze following. We

  1. Autonomous social gaze model for an interactive virtual character in real-life settings

    OpenAIRE

    Yumak, Zerrin; van den Brink, Bram; Egges, Arjan

    2018-01-01

    This paper presents a gaze behavior model for an interactive virtual character situated in the real world. We are interested in estimating which user has an intention to interact, in other words which user is engaged with the virtual character. The model takes into account behavioral cues such as proximity, velocity, posture and sound, estimates an engagement score and drives the gaze behavior of the virtual character. Initially, we assign equal weights to these fea...

  2. Improved remote gaze estimation using corneal reflection-adaptive geometric transforms

    Science.gov (United States)

    Ma, Chunfei; Baek, Seung-Jin; Choi, Kang-A.; Ko, Sung-Jea

    2014-05-01

    Recently, the remote gaze estimation (RGE) technique has been widely applied to consumer devices as a more natural interface. In general, the conventional RGE method estimates a user's point of gaze using a geometric transform, which represents the relationship between several infrared (IR) light sources and their corresponding corneal reflections (CRs) in the eye image. Among various methods, the homography normalization (HN) method achieves state-of-the-art performance. However, the geometric transform of the HN method requiring four CRs is infeasible for the case when fewer than four CRs are available. To solve this problem, this paper proposes a new RGE method based on three alternative geometric transforms, which are adaptive to the number of CRs. Unlike the HN method, the proposed method not only can operate with two or three CRs, but can also provide superior accuracy. To further enhance the performance, an effective error correction method is also proposed. By combining the introduced transforms with the error-correction method, the proposed method not only provides high accuracy and robustness for gaze estimation, but also allows for a more flexible system setup with a different number of IR light sources. Experimental results demonstrate the effectiveness of the proposed method.
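
    The homography-normalization idea referenced above can be illustrated with a short sketch: the four corneal reflections span a quadrilateral that is mapped to a unit square, the pupil centre is expressed in that normalized space, and a second, user-calibrated homography maps normalized pupil positions to screen coordinates. The OpenCV/NumPy code below is a generic illustration of this scheme, not the authors' implementation; the function names, the unit-square convention, and the assumption that the reflections are detected in a consistent order are all illustrative, and the paper's actual contribution (alternative transforms for two or three reflections) is not shown.

```python
# Generic homography-normalization sketch for remote gaze estimation.
# Assumes four corneal reflections (CRs) and the pupil centre are detected
# in the eye image, and that the CRs are ordered consistently with the
# corners of the normalized unit square.
import numpy as np
import cv2

def normalize_pupil(cr_points, pupil_xy):
    """Map the pupil centre into the unit square spanned by the 4 CRs."""
    dst = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(np.float32(cr_points), dst)
    p = np.array([[pupil_xy]], dtype=np.float32)        # shape (1, 1, 2)
    return cv2.perspectiveTransform(p, H)[0, 0]         # normalized (u, v)

def fit_screen_mapping(norm_points, screen_points):
    """Fit a second homography from normalized pupil space to screen pixels
    using user-calibration correspondences (at least four)."""
    H, _ = cv2.findHomography(np.float32(norm_points), np.float32(screen_points))
    return H

def point_of_gaze(H_screen, norm_uv):
    """Apply the calibration homography to one normalized pupil position."""
    p = np.array([[norm_uv]], dtype=np.float32)
    return cv2.perspectiveTransform(p, H_screen)[0, 0]  # screen (x, y)
```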

  3. Text Entry by Gazing and Smiling

    Directory of Open Access Journals (Sweden)

    Outi Tuisku

    2013-01-01

    Full Text Available Face Interface is a wearable prototype that combines the use of voluntary gaze direction and facial activations, for pointing and selecting objects on a computer screen, respectively. The aim was to investigate the functionality of the prototype for entering text. First, three on-screen keyboard layout designs were developed and tested (n=10) to find a layout that would be more suitable for text entry with the prototype than the traditional QWERTY layout. The task was to enter one word ten times with each of the layouts by pointing at letters with gaze and selecting them by smiling. Subjective ratings showed that a layout with large keys on the edge and small keys near the center of the keyboard was rated as the most enjoyable, clearest, and most functional. Second, using this layout, the aim of the second experiment (n=12) was to compare entering text with Face Interface to entering text with a mouse. The results showed that the text entry rate for Face Interface was 20 characters per minute (cpm) and 27 cpm for the mouse. For Face Interface, the keystrokes per character (KSPC) value was 1.1 and the minimum string distance (MSD) error rate was 0.12. These values compare especially well with other similar techniques.
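
    For reference, the two metrics quoted above are typically computed as follows: KSPC divides the number of selections (keystrokes) by the length of the produced text, and the MSD error rate divides the minimum string (Levenshtein) distance between the presented and transcribed phrases by the length of the longer one. The sketch below follows these common text-entry conventions; it is not code from the study, and the example values are illustrative only.

```python
# Illustrative computation of keystrokes per character (KSPC) and the
# minimum string distance (MSD) error rate, following common text-entry
# research conventions (the study may have used slightly different
# normalizations).
def msd(a: str, b: str) -> int:
    """Levenshtein (minimum string) distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def kspc(keystrokes: int, transcribed: str) -> float:
    return keystrokes / len(transcribed)

def msd_error_rate(presented: str, transcribed: str) -> float:
    return msd(presented, transcribed) / max(len(presented), len(transcribed))

# Examples (illustrative strings, not stimuli from the experiment):
print(kspc(11, "typewriter"))            # 11 selections for 10 characters -> 1.1
print(msd_error_rate("hello", "hallo"))  # one substitution in five -> 0.2
```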

  4. Sociability and gazing toward humans in dogs and wolves: Simple behaviors with broad implications.

    Science.gov (United States)

    Bentosela, Mariana; Wynne, C D L; D'Orazio, M; Elgier, A; Udell, M A R

    2016-01-01

    Sociability, defined as the tendency to approach and interact with unfamiliar people, has been found to modulate some communicative responses in domestic dogs, including gaze behavior toward the human face. The objective of this study was to compare sociability and gaze behavior in pet domestic dogs and in human-socialized captive wolves in order to identify the relative influence of domestication and learning in the development of the dog-human bond. In Experiment 1, we assessed the approach behavior and social tendencies of dogs and wolves to a familiar and an unfamiliar person. In Experiment 2, we compared the animal's duration of gaze toward a person's face in the presence of food, which the animals could see but not access. Dogs showed higher levels of interspecific sociability than wolves in all conditions, including those where attention was unavailable. In addition, dogs gazed longer at the person's face than wolves in the presence of out-of-reach food. The potential contributions of domestication, associative learning, and experiences during ontogeny to prosocial behavior toward humans are discussed. © 2016 Society for the Experimental Analysis of Behavior.

  5. Eye Gaze Correlates of Motor Impairment in VR Observation of Motor Actions.

    Science.gov (United States)

    Alves, J; Vourvopoulos, A; Bernardino, A; Bermúdez I Badia, S

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The objective was to identify eye gaze correlates of motor impairment in a virtual reality motor observation task in a study with healthy participants and stroke patients. Participants consisted of a group of healthy subjects (N = 20) and a group of stroke survivors (N = 10). Both groups were required to observe a simple reach-and-grab and place-and-release task in a virtual environment. Additionally, healthy subjects were required to observe the task in a normal condition and a constrained movement condition. Eye movements were recorded during the observation task for later analysis. For healthy participants, results showed differences in gaze metrics when comparing the normal and arm-constrained conditions. Differences in gaze metrics were also found when comparing the dominant and non-dominant arm for saccades and smooth pursuit events. For stroke patients, results showed longer smooth pursuit segments in action observation when observing the paretic arm, thus providing evidence that the affected circuitry may be activated for eye gaze control during observation of the simulated motor action. This study suggests that neural motor circuits are involved, at multiple levels, in observation of motor actions displayed in a virtual reality environment. Thus, eye tracking combined with action observation tasks in a virtual reality display can be used to monitor motor deficits derived from stroke, and consequently can also be used for rehabilitation of stroke patients.
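
    Gaze events such as the saccades and smooth-pursuit segments analyzed above are commonly derived from raw gaze samples with a simple velocity-threshold (I-VT style) classification before event-level metrics are computed. The sketch below is a generic illustration of that preprocessing step, not the classifier used in this study; the threshold, sampling rate, and synthetic trace are assumptions.

```python
# Generic velocity-threshold (I-VT style) labelling of gaze samples as
# saccadic vs. non-saccadic (fixation / smooth pursuit). Thresholds and the
# sample format are illustrative only.
import numpy as np

def classify_ivt(x_deg, y_deg, sample_rate_hz, saccade_thresh_deg_s=30.0):
    """Return a boolean array that is True where the angular gaze velocity
    exceeds the saccade threshold."""
    x = np.asarray(x_deg, dtype=float)
    y = np.asarray(y_deg, dtype=float)
    vx = np.gradient(x) * sample_rate_hz   # deg/s
    vy = np.gradient(y) * sample_rate_hz
    speed = np.hypot(vx, vy)
    return speed > saccade_thresh_deg_s

# Synthetic example: a fixation, a fast 10-degree gaze shift, another fixation.
gx = np.concatenate([np.zeros(60), np.linspace(0, 10, 5), np.full(60, 10.0)])
gy = np.zeros_like(gx)
print(classify_ivt(gx, gy, sample_rate_hz=120).sum(), "saccadic samples")
```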

  6. The effects of varying contextual demands on age-related positive gaze preferences.

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M

    2015-06-01

    Despite many studies on the age-related positivity effect and its role in visual attention, discrepancies remain regarding whether full attention is required for age-related differences to emerge. The present study took a new approach to this question by varying the contextual demands of emotion processing. This was done by adding perceptual distractions, such as visual and auditory noise, that could disrupt attentional control. Younger and older participants viewed pairs of happy-neutral and fearful-neutral faces while their eye movements were recorded. Facial stimuli were shown either without noise, embedded in a background of visual noise (low, medium, or high), or with simultaneous auditory babble. Older adults showed positive gaze preferences, looking toward happy faces and away from fearful faces; however, their gaze preferences tended to be influenced by the level of visual noise. Specifically, the tendency to look away from fearful faces was not present in conditions with low and medium levels of visual noise but was present when there were high levels of visual noise. It is important to note, however, that in the high-visual-noise condition, external cues were present to facilitate the processing of emotional information. In addition, older adults' positive gaze preferences disappeared or were reduced when they first viewed emotional faces within a distracting context. The current results indicate that positive gaze preferences may be less likely to occur in distracting contexts that disrupt control of visual attention. (c) 2015 APA, all rights reserved.

  7. Gaze Behavior in a Natural Environment with a Task-Relevant Distractor: How the Presence of a Goalkeeper Distracts the Penalty Taker

    Directory of Open Access Journals (Sweden)

    Johannes Kurz

    2018-01-01

    Full Text Available Gaze behavior in natural scenes has been shown to be influenced not only by top–down factors such as task demands and action goals but also by bottom–up factors such as stimulus salience and scene context. Whereas gaze behavior in the context of static pictures emphasizes spatial accuracy, gazing in natural scenes seems to rely more on where to direct the gaze, involving both anticipative components and an evaluation of ongoing actions. Not much is known about gaze behavior in far-aiming tasks in which multiple task-relevant targets and distractors compete for the allocation of visual attention via gaze. In the present study, we examined gaze behavior in the far-aiming task of taking a soccer penalty. This task contains a proximal target, the ball; a distal target, an empty location within the goal; and a salient distractor, the goalkeeper. Our aim was to investigate where participants direct their gaze in a natural environment with multiple potential fixation targets that differ in task relevance and salience. Results showed that the early phase of the run-up seems to be driven by both the salience of the stimulus setting and the need to perform a spatial calibration of the environment. The late run-up, in contrast, seems to be controlled by attentional demands of the task, with penalty takers having habitualized a visual routine that is not disrupted by external influences (e.g., the goalkeeper). In addition, when trying to shoot a ball as accurately as possible, penalty takers directed their gaze toward the ball in order to achieve optimal foot-ball contact. These results indicate that whether gaze is driven by salience of the stimulus setting or by attentional demands depends on the phase of the actual task.

  8. Limitations of gaze transfer: without visual context, eye movements do not help to coordinate joint action, whereas mouse movements do.

    Science.gov (United States)

    Müller, Romy; Helmert, Jens R; Pannasch, Sebastian

    2014-10-01

    Remote cooperation can be improved by transferring the gaze of one participant to the other. However, based on a partner's gaze, an interpretation of his communicative intention can be difficult. Thus, gaze transfer has been inferior to mouse transfer in remote spatial referencing tasks where locations had to be pointed out explicitly. Given that eye movements serve as an indicator of visual attention, it remains to be investigated whether gaze and mouse transfer differentially affect the coordination of joint action when the situation demands an understanding of the partner's search strategies. In the present study, a gaze or mouse cursor was transferred from a searcher to an assistant in a hierarchical decision task. The assistant could use this cursor to guide his movement of a window which continuously opened up the display parts the searcher needed to find the right solution. In this context, we investigated how the ease of using gaze transfer depended on whether a link could be established between the partner's eye movements and the objects he was looking at. Therefore, in addition to the searcher's cursor, the assistant either saw the positions of these objects or only a grey background. When the objects were visible, performance and the number of spoken words were similar for gaze and mouse transfer. However, without them, gaze transfer resulted in longer solution times and more verbal effort as participants relied more strongly on speech to coordinate the window movement. Moreover, an analysis of the spatio-temporal coupling of the transmitted cursor and the window indicated that when no visual object information was available, assistants confidently followed the searcher's mouse but not his gaze cursor. Once again, the results highlight the importance of carefully considering task characteristics when applying gaze transfer in remote cooperation. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. The CMS Tracker Data Quality Monitoring Expert GUI

    CERN Document Server

    Palmonari, Francesco

    2009-01-01

    The CMS Tracker data quality monitoring (DQM) is a demanding task due to the detector's high granularity. It consists of about 15,148 strip and 1,440 pixel detector modules. About 350,000 histograms are defined and filled, accessing information from different stages of data reconstruction to check the data quality. It is impossible for shift personnel and experts to manage such a large number of histograms. A tracker-specific Graphical User Interface (GUI) has been developed to simplify navigation and to spot detector problems efficiently. The GUI is web-based and implemented with Ajax technology. We will describe the framework and the specific features of the expert GUI developed for the CMS Tracker DQM system.

  10. Control Algorithms for Large-scale Single-axis Photovoltaic Trackers

    Directory of Open Access Journals (Sweden)

    Dorian Schneider

    2012-01-01

    Full Text Available The electrical yield of large-scale photovoltaic power plants can be greatly improved by employing solar trackers. While fixed-tilt superstructures are stationary and immobile, trackers move the PV-module plane in order to optimize its alignment to the sun. This paper introduces control algorithms for single-axis trackers (SAT), including a discussion of optimal alignment and backtracking. The results are used to simulate and compare the electrical yield of fixed-tilt and SAT systems. The proposed algorithms have been field tested, and are in operation in solar parks worldwide.
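
    The optimal-alignment and backtracking logic described above is also available in open-source form. The sketch below uses the pvlib library's single-axis tracking routine to compute rotation angles with backtracking enabled; it is not the authors' field-tested controller, and the site coordinates, axis orientation, rotation limit, and ground coverage ratio are illustrative assumptions.

```python
# Single-axis tracker rotation angles with backtracking, computed with the
# open-source pvlib library (illustrative parameters, not the paper's
# deployed configuration).
import pandas as pd
import pvlib

times = pd.date_range("2012-06-21", "2012-06-22", freq="5min", tz="Europe/Berlin")
site = pvlib.location.Location(latitude=50.0, longitude=8.0)   # assumed site
solpos = site.get_solarposition(times)

angles = pvlib.tracking.singleaxis(
    apparent_zenith=solpos["apparent_zenith"],
    apparent_azimuth=solpos["azimuth"],
    axis_tilt=0,        # horizontal tracker axis
    axis_azimuth=180,   # axis aligned north-south
    max_angle=55,       # mechanical rotation limit (degrees)
    backtrack=True,     # avoid row-to-row shading at low sun elevations
    gcr=0.35,           # assumed ground coverage ratio
)
print(angles["tracker_theta"].dropna().head())
```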

  11. The DELPHI Silicon Tracker in the global pattern recognition

    CERN Document Server

    Elsing, M

    2000-01-01

    ALEPH and DELPHI were the first experiments operating a silicon vertex detector at LEP. During the past 10 years of data taking the DELPHI Silicon Tracker was upgraded three times to follow the different tracking requirements for LEP 1 and LEP 2 as well as to improve the tracking performance. Several steps in the development of the pattern recognition software were done in order to understand and fully exploit the silicon tracker information. This article gives an overview of the final algorithms and concepts of the track reconstruction using the Silicon Tracker in DELPHI.

  12. The DELPHI Silicon Tracker in the global pattern recognition

    International Nuclear Information System (INIS)

    Elsing, M.

    2000-01-01

    ALEPH and DELPHI were the first experiments operating a silicon vertex detector at LEP. During the past 10 years of data taking the DELPHI Silicon Tracker was upgraded three times to follow the different tracking requirements for LEP 1 and LEP 2 as well as to improve the tracking performance. Several steps in the development of the pattern recognition software were done in order to understand and fully exploit the silicon tracker information. This article gives an overview of the final algorithms and concepts of the track reconstruction using the Silicon Tracker in DELPHI

  13. Novel Eye Movement Disorders in Whipple’s Disease—Staircase Horizontal Saccades, Gaze-Evoked Nystagmus, and Esotropia

    Directory of Open Access Journals (Sweden)

    Aasef G. Shaikh

    2017-07-01

    Full Text Available Whipple’s disease, a rare systemic infectious disorder, is complicated by the involvement of the central nervous system in about 5% of cases. Oscillations of the eyes and the jaw, called oculo-masticatory myorhythmia, are pathognomonic of the central nervous system involvement but are often absent. Typical manifestations of the central nervous system Whipple’s disease are cognitive impairment, parkinsonism mimicking progressive supranuclear palsy with vertical saccade slowing, and up-gaze range limitation. We describe a unique patient with the central nervous system Whipple’s disease who had typical features, including parkinsonism, cognitive impairment, and up-gaze limitation, but also had diplopia, esotropia with mild horizontal (abduction more than adduction) limitation, and vertigo. The patient also had gaze-evoked nystagmus and staircase horizontal saccades. The latter were thought to be due to mal-programmed small saccades followed by a series of corrective saccades. The saccades were disconjugate due to the concurrent strabismus. Also, we noted disconjugacy in the slow phase of gaze-evoked nystagmus. The disconjugacy of the slow phase of gaze-evoked nystagmus was larger during the monocular viewing condition. We propose that the interaction of the strabismic drifts of the covered eyes and the nystagmus drift, putatively at the final common pathway, might lead to such disconjugacy.

  14. Testing the dual-route model of perceived gaze direction: Linear combination of eye and head cues.

    Science.gov (United States)

    Otsuka, Yumiko; Mareschal, Isabelle; Clifford, Colin W G

    2016-06-01

    We have recently proposed a dual-route model of the effect of head orientation on perceived gaze direction (Otsuka, Mareschal, Calder, & Clifford, 2014; Otsuka, Mareschal, & Clifford, 2015), which computes perceived gaze direction as a linear combination of eye orientation and head orientation. By parametrically manipulating eye orientation and head orientation, we tested the adequacy of a linear model to account for the effect of horizontal head orientation on perceived direction of gaze. Here, participants adjusted an on-screen pointer toward the perceived gaze direction in two image conditions: Normal condition and Wollaston condition. Images in the Normal condition included a change in the visible part of the eye along with the change in head orientation, while images in the Wollaston condition were manipulated to have identical eye regions across head orientations. Multiple regression analysis with explanatory variables of eye orientation and head orientation revealed that linear models account for most of the variance both in the Normal condition and in the Wollaston condition. Further, we found no evidence that the model with a nonlinear term explains significantly more variance. Thus, the current study supports the dual-route model that computes the perceived gaze direction as a linear combination of eye orientation and head orientation.
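
    The linear combination at the core of the dual-route account can be made concrete with a small least-squares sketch in which perceived gaze direction is regressed on eye orientation and head orientation. The numbers below are fabricated purely for illustration and are not data from the study.

```python
# Ordinary least-squares fit of a linear (dual-route style) model:
#   perceived_gaze ~ a * eye_orientation + b * head_orientation + c
# All values are made-up demonstration data, not the study's measurements.
import numpy as np

eye_deg   = np.array([-10., -5., 0., 5., 10., -10., 0., 10.])   # eye-in-head (deg)
head_deg  = np.array([  0.,  0., 0., 0.,  0.,  15., 15., 15.])  # head orientation (deg)
perceived = np.array([-9., -4.5, 0.2, 4.8, 9.6, -6.5, 3.1, 12.9])

X = np.column_stack([eye_deg, head_deg, np.ones_like(eye_deg)])
(a, b, c), *_ = np.linalg.lstsq(X, perceived, rcond=None)
rmse = np.sqrt(np.mean((perceived - X @ np.array([a, b, c])) ** 2))
print(f"a={a:.2f}, b={b:.2f}, c={c:.2f}, RMSE={rmse:.2f} deg")
```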

  15. Technical Training Seminar: Laser Trackers: the Local Positioning Technology (LPT)

    CERN Document Server

    Davide Vitè

    2005-01-01

    Friday 20 May from 10:00 to 16:00, Training Centre (bldg. 593) Laser Trackers: the Local Positioning Technology (LPT) Simon Moser, Michael Lettau, Achim Lupus, Niklaus Suter, Leica GEOSYSTEMS AG, Switzerland Laser trackers are used at CERN for different applications within the LHC Project. Leica Geosystems AG have been developing during the last four years the revolutionary Local Positioning Technology (LPT). Laser trackers are increasingly used to ensure accuracy of large fabrications, and alignment in the final assembly process. Competing portable Coordinate Measuring Machines (CMM) with articulated arms require a frequent repositioning, known to lead to a loss of accuracy and efficiency. Leica Geosystems developed armless solutions, the T-Probe and T-Scan, for use with its laser trackers. The combination of the tracker technology with photogrammetry is the base of LPT, enabling real time measurements with free hand-held devices, such as the T-Probe and T-Scan. T-Probe and T-Scan overcome the proble...

  16. Spatiotemporal characteristics of gaze of children with autism spectrum disorders while looking at classroom scenes.

    Directory of Open Access Journals (Sweden)

    Takahiro Higuchi

    Full Text Available Children with autism spectrum disorders (ASD) who have neurodevelopmental impairments in social communication often refuse to go to school because of difficulties in learning in class. The exact cause of maladaptation to school in such children is unknown. We hypothesized that these children have difficulty in paying attention to objects at which teachers are pointing. We performed gaze behavior analysis of children with ASD to understand their difficulties in the classroom. The subjects were 26 children with ASD (19 boys and 7 girls; mean age, 8.6 years) and 27 age-matched children with typical development (TD) (14 boys and 13 girls; mean age, 8.2 years). We measured eye movements of the children while they performed free viewing of two movies depicting actual classes: a Japanese class in which a teacher pointed at cartoon characters and an arithmetic class in which the teacher pointed at geometric figures. In the analysis, we defined the regions of interest (ROIs) as the teacher's face and finger, the cartoon characters and geometric figures at which the teacher pointed, and the classroom wall that contained no objects. We then compared total gaze time for each ROI between the children with ASD and TD by two-way ANOVA. Children with ASD spent less gaze time on the cartoon characters pointed at by the teacher; they spent more gaze time on the wall in both classroom scenes. We could differentiate children with ASD from those with TD almost perfectly by the proportion of total gaze time that children with ASD spent looking at the wall. These results suggest that children with ASD do not follow the teacher's instructions in class and persist in gazing at inappropriate visual areas such as walls. Thus, they may have difficulties in understanding content in class, leading to maladaptation to school.

  17. Spatiotemporal characteristics of gaze of children with autism spectrum disorders while looking at classroom scenes.

    Science.gov (United States)

    Higuchi, Takahiro; Ishizaki, Yuko; Noritake, Atsushi; Yanagimoto, Yoshitoki; Kobayashi, Hodaka; Nakamura, Kae; Kaneko, Kazunari

    2017-01-01

    Children with autism spectrum disorders (ASD) who have neurodevelopmental impairments in social communication often refuse to go to school because of difficulties in learning in class. The exact cause of maladaptation to school in such children is unknown. We hypothesized that these children have difficulty in paying attention to objects at which teachers are pointing. We performed gaze behavior analysis of children with ASD to understand their difficulties in the classroom. The subjects were 26 children with ASD (19 boys and 7 girls; mean age, 8.6 years) and 27 age-matched children with typical development (TD) (14 boys and 13 girls; mean age, 8.2 years). We measured eye movements of the children while they performed free viewing of two movies depicting actual classes: a Japanese class in which a teacher pointed at cartoon characters and an arithmetic class in which the teacher pointed at geometric figures. In the analysis, we defined the regions of interest (ROIs) as the teacher's face and finger, the cartoon characters and geometric figures at which the teacher pointed, and the classroom wall that contained no objects. We then compared total gaze time for each ROI between the children with ASD and TD by two-way ANOVA. Children with ASD spent less gaze time on the cartoon characters pointed at by the teacher; they spent more gaze time on the wall in both classroom scenes. We could differentiate children with ASD from those with TD almost perfectly by the proportion of total gaze time that children with ASD spent looking at the wall. These results suggest that children with ASD do not follow the teacher's instructions in class and persist in gazing at inappropriate visual areas such as walls. Thus, they may have difficulties in understanding content in class, leading to maladaptation to school.
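
    Analyses like the one above reduce to summing fixation durations that fall inside labelled regions of interest. A minimal sketch is given below, assuming rectangular ROIs and fixations given as (x, y, duration) tuples; the region names, coordinates, and fixation values are illustrative, not the study's data.

```python
# Total gaze time per region of interest (ROI) from a list of fixations.
# ROI rectangles and fixation records below are illustrative assumptions.
from typing import Dict, List, Tuple

Fixation = Tuple[float, float, float]        # (x, y, duration in ms)
ROI = Tuple[float, float, float, float]      # (x_min, y_min, x_max, y_max)

def gaze_time_per_roi(fixations: List[Fixation],
                      rois: Dict[str, ROI]) -> Dict[str, float]:
    totals = {name: 0.0 for name in rois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break                        # assumes non-overlapping ROIs
    return totals

rois = {"teacher_face":   (400, 50, 520, 170),
        "pointed_object": (700, 300, 860, 420),
        "wall":           (0, 0, 300, 600)}
fixations = [(450, 100, 230.0), (120, 400, 510.0), (710, 350, 180.0)]
print(gaze_time_per_roi(fixations, rois))
```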

  18. A 6 D.O.F. opto-inertial tracker for virtual reality experiments in microgravity

    Science.gov (United States)

    Zaoui, Mohamed; Wormell, Dean; Altshuler, Yury; Foxlin, Eric; McIntyre, Joseph

    2001-08-01

    Gravity plays a role in many different levels of human motor behavior. It dictates the laws of motion of our body and limbs, as well as of the objects in the external world with which we wish to interact. The dynamic interaction of our body with the world is molded within gravity's constraints. The role played by gravity in the perception of visual stimuli and the elaboration of human movement is an active research theme in the field of Neurophysiology. Conditions of microgravity, coupled with techniques from the world of virtual reality, provide a unique opportunity to address these questions concerning the function of the human sensorimotor system [1]. The ability to measure movements of the head and to update in real time the visual scene presented to the subject based on these measurements is a key element in producing a realistic virtual environment. A variety of head-tracking hardware exists on the market today [2-4], but none seem particularly well suited to the constraints of working with a space station environment. Nor can any of the existing commercial systems meet the more stringent requirements for physiological experimentation (high accuracy, high resolution, low jitter, low lag) in a wireless configuration. To this end, we have developed and tested a hybrid opto-inertial 6 degree-of-freedom tracker based on existing inertial technology [5-8]. To confirm that the inertial components and algorithms will function properly, this system was tested in the microgravity conditions of parabolic flight. Here we present the design goals of this tracker, the system configuration and the results of 0g and 1g testing.
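
    Hybrid opto-inertial tracking generally combines high-rate inertial prediction with low-rate, drift-free optical corrections. The complementary-filter sketch below is a textbook-style illustration of that idea under strong simplifying assumptions (world-frame acceleration with gravity already removed, position-only optical fixes); it is not the proprietary fusion algorithm of the tracker described above.

```python
# Textbook complementary-filter sketch for hybrid opto-inertial tracking:
# dead-reckon from inertial data at high rate, then pull the estimate toward
# each optical fix to cancel drift. Not the device's actual algorithm.
import numpy as np

class OptoInertialFilter:
    def __init__(self, blend: float = 0.05):
        self.pos = np.zeros(3)   # position estimate (m)
        self.vel = np.zeros(3)   # velocity estimate (m/s)
        self.blend = blend       # weight given to each optical correction

    def predict(self, accel_world: np.ndarray, dt: float) -> None:
        """Integrate world-frame acceleration (gravity assumed removed)."""
        self.vel += accel_world * dt
        self.pos += self.vel * dt

    def correct(self, optical_pos: np.ndarray) -> None:
        """Blend in a drift-free optical position fix. A full implementation
        would also correct velocity and orientation, e.g. with a Kalman filter."""
        self.pos += self.blend * (optical_pos - self.pos)

# Usage: call predict() at the IMU rate (e.g. 1 kHz) and correct() whenever
# an optical measurement arrives (e.g. 60 Hz).
```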

  19. Documentation for delivery of Star Tracker to CHAMP

    DEFF Research Database (Denmark)

    Madsen, Peter Buch; Betto, Maurizio; Jørgensen, John Leif

    1999-01-01

    The documentation EIDP (End Item Data Package) describes all the tests which have been performed on the flight hardware of the Star Tracker for the German satellite CHAMP.

  20. The ATLAS Fast Tracker

    CERN Document Server

    Volpi, Guido; The ATLAS collaboration

    2015-01-01

    The use of tracking information at the trigger level in the LHC Run II period is crucial for the trigger and data acquisition (TDAQ) system. The tracking precision is in fact important to identify specific decay products of the Higgs boson or new phenomena, as well as to distinguish the contributions coming from the many concurrent collisions that occur at every bunch crossing. However, the track reconstruction is among the most demanding tasks performed by the TDAQ computing farm; in fact, full reconstruction at the full Level-1 trigger accept rate (100 kHz) is not possible. In order to overcome this limitation, the ATLAS experiment is planning the installation of a specific processor: the Fast Tracker (FTK), which is aimed at achieving this goal. The FTK is a pipeline of high-performance electronics, based on custom and commercial devices, which is expected to reconstruct, with high resolution, the trajectories of charged tracks with a transverse momentum above 1 GeV, using the ATLAS inner tracker information. Patte...

  1. I Reach Faster When I See You Look: Gaze Effects in Human–Human and Human–Robot Face-to-Face Cooperation

    Science.gov (United States)

    Boucher, Jean-David; Pattacini, Ugo; Lelong, Amelie; Bailly, Gerard; Elisei, Frederic; Fagel, Sascha; Dominey, Peter Ford; Ventre-Dominey, Jocelyne

    2012-01-01

    Human–human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human–human cooperation experiment demonstrating that an agent’s vision of her/his partner’s gaze can significantly improve that agent’s performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human–robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human–robot cooperation tasks. Gaze is manipulated in three conditions: Full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times. PMID:22563315

  2. I Reach Faster When I See You Look: Gaze Effects in Human-Human and Human-Robot Face-to-Face Cooperation.

    Science.gov (United States)

    Boucher, Jean-David; Pattacini, Ugo; Lelong, Amelie; Bailly, Gerrard; Elisei, Frederic; Fagel, Sascha; Dominey, Peter Ford; Ventre-Dominey, Jocelyne

    2012-01-01

    Human-human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human-human cooperation experiment demonstrating that an agent's vision of her/his partner's gaze can significantly improve that agent's performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human-robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human-robot cooperation tasks. Gaze is manipulated in three conditions: Full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times.

  3. Gaze-informed, task-situated representation of space in primate hippocampus during virtual navigation

    Science.gov (United States)

    Wirth, Sylvia; Baraduc, Pierre; Planté, Aurélie; Pinède, Serge; Duhamel, Jean-René

    2017-01-01

    To elucidate how gaze informs the construction of mental space during wayfinding in visual species like primates, we jointly examined navigation behavior, visual exploration, and hippocampal activity as macaque monkeys searched a virtual reality maze for a reward. Cells sensitive to place also responded to one or more variables like head direction, point of gaze, or task context. Many cells fired at the sight (and in anticipation) of a single landmark in a viewpoint- or task-dependent manner, simultaneously encoding the animal’s logical situation within a set of actions leading to the goal. Overall, hippocampal activity was best fit by a fine-grained state space comprising current position, view, and action contexts. Our findings indicate that counterparts of rodent place cells in primates embody multidimensional, task-situated knowledge pertaining to the target of gaze, therein supporting self-awareness in the construction of space. PMID:28241007

  4. Strange-face Illusions During Interpersonal-Gazing and Personality Differences of Spirituality.

    Science.gov (United States)

    Caputo, Giovanni B

    Strange-face illusions are produced when two individuals gaze at each other in the eyes in low illumination for more than a few minutes. Usually, the members of the dyad perceive numinous apparitions, like the other's face deformations and perception of a stranger or a monster in place of the other, and feel a short lasting dissociation. In the present experiment, the influence of the spirituality personality trait on strength and number of strange-face illusions was investigated. Thirty participants were preliminarily tested for superstition (Paranormal Belief Scale, PBS) and spirituality (Spiritual Transcendence Scale, STS); then, they were randomly assigned to 15 dyads. Dyads performed the intersubjective gazing task for 10 minutes and, finally, strange-face illusions (measured through the Strange-Face Questionnaire, SFQ) were evaluated. The first finding was that SFQ was independent of PBS; hence, strange-face illusions during intersubjective gazing are authentically perceptual, hallucination-like phenomena, and not due to superstition. The second finding was that SFQ depended on the spiritual-universality scale of STS (a belief in the unitive nature of life; e.g., "there is a higher plane of consciousness or spirituality that binds all people") and the two variables were negatively correlated. Thus, strange-face illusions, in particular monstrous apparitions, could potentially disrupt binding among human beings. Strange-face illusions can be considered as 'projections' of the subject's unconscious into the other's face. In conclusion, intersubjective gazing at low illumination can be a tool for conscious integration of unconscious 'shadows of the Self' in order to reach completeness of the Self. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Gaze movements and spatial working memory in collision avoidance: a traffic intersection task

    Directory of Open Access Journals (Sweden)

    Gregor eHardiess

    2013-06-01

    Full Text Available Street crossing under traffic is an everyday activity including collision detection as well as avoidance of objects in the path of motion. Such tasks demand extraction and representation of spatio-temporal information about relevant obstacles in an optimized format. Relevant task information is extracted visually by the use of gaze movements and represented in spatial working memory. In a virtual reality traffic intersection task, subjects are confronted with a two-lane intersection where cars are appearing with different frequencies, corresponding to high and low traffic densities. Under free observation and exploration of the scenery (using unrestricted eye and head movements), the overall task for the subjects was to predict the potential-of-collision (POC) of the cars or to adjust an adequate driving speed in order to cross the intersection without collision (i.e., to find the free space for crossing). In a series of experiments, gaze movement parameters, task performance, and the representation of car positions within working memory at distinct time points were assessed in normal subjects as well as in neurological patients suffering from homonymous hemianopia. In the following, we review the findings of these experiments together with other studies and provide a new perspective of the role of gaze behavior and spatial memory in collision detection and avoidance, focusing on the following questions: (i) Which sensory variables can be identified supporting adequate collision detection? (ii) How do gaze movements and working memory contribute to collision avoidance when multiple moving objects are present, and (iii) how do they correlate with task performance? (iv) How do patients with homonymous visual field defects use gaze movements and working memory to compensate for visual field loss? In conclusion, we extend the theory of collision detection and avoidance in the case of multiple moving objects and provide a new perspective on the combined
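
    One candidate collision-relevant variable in this line of work is the time-to-arrival of an approaching car at the crossing point, compared against the time the observer needs to clear the lane. A minimal sketch of that comparison is given below; the function names, safety margin, and numbers are illustrative and not taken from the experiments.

```python
# Toy gap-acceptance check: is the approaching car's time-to-arrival longer
# than the observer's own crossing time plus a safety margin? All values are
# illustrative, not parameters from the study.
def time_to_arrival(gap_m: float, car_speed_m_s: float) -> float:
    return gap_m / car_speed_m_s

def can_cross(gap_m: float, car_speed_m_s: float,
              crossing_distance_m: float, own_speed_m_s: float,
              safety_margin_s: float = 1.0) -> bool:
    crossing_time_s = crossing_distance_m / own_speed_m_s
    return time_to_arrival(gap_m, car_speed_m_s) > crossing_time_s + safety_margin_s

# 40 m gap, car at 14 m/s (~2.9 s away), 7 m to cross at 1.5 m/s (~4.7 s):
print(can_cross(40.0, 14.0, 7.0, 1.5))   # False -> wait for a larger gap
```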

  6. EMC Diagnosis and Corrective Actions for Silicon Strip Tracker Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Arteche, F.; /CERN /Imperial Coll., London; Rivetta, C.; /SLAC

    2006-06-06

    The tracker sub-system is one of the five sub-detectors of the Compact Muon Solenoid (CMS) experiment under construction at CERN for the Large Hadron Collider (LHC) accelerator. The tracker subdetector is designed to reconstruct tracks of charged sub-atomic particles generated after collisions. The tracker system processes analogue signals from 10 million channels distributed across 14000 silicon micro-strip detectors. It is designed to process signals of a few nA and digitize them at 40 MHz. The overall sub-detector is embedded in a high particle radiation environment and a magnetic field of 4 Tesla. The evaluation of the electromagnetic immunity of the system is very important to optimize the performance of the tracker sub-detector and the whole CMS experiment. This paper presents the EMC diagnosis of the CMS silicon tracker sub-detector. Immunity tests were performed using the final prototype of the Silicon Tracker End-Caps (TEC) system to estimate the sensitivity of the system to conducted noise, evaluate the weakest areas of the system and take corrective actions before the integration of the overall detector. This paper shows the results of one of those tests, that is the measurement and analysis of the immunity to CM external conducted noise perturbations.

  7. Effects of Facial Symmetry and Gaze Direction on Perception of Social Attributes: A Study in Experimental Art History

    Directory of Open Access Journals (Sweden)

    Per Olav Folgerø

    2016-09-01

    Full Text Available This article explores the possibility of testing hypotheses about art production in the past by collecting data in the present. We call this enterprise experimental art history. Why did medieval artists prefer to paint Christ with his face directed towards the beholder, while profane faces were noticeably more often painted in different degrees of profile? Is a preference for frontal faces motivated by deeper evolutionary and biological considerations? Head and gaze direction is a significant factor for detecting the intentions of others, and accurate detection of gaze direction depends on strong contrast between a dark iris and a bright sclera, a combination that is only found in humans among the primates. One uniquely human capacity is language acquisition, where the detection of shared or joint attention, for example through detection of gaze direction, contributes significantly to the ease of acquisition. The perceived face and gaze direction are also related to fundamental emotional reactions such as fear, aggression, empathy and sympathy. The fast-track modulator model presents a related fast and unconscious subcortical route that involves many central brain areas. Activity in this pathway mediates the affective valence of the stimulus. In particular, different sub-regions of the amygdala show specific activation in response to gaze direction, head orientation, and the valence of facial expression. We present three experiments on the effects of face orientation and gaze direction on the judgments of social attributes. We observed that frontal faces with direct gaze were more highly associated with positive adjectives. Does this help to associate positive values with the Holy Face in a Western context? The formal result indicates that the Holy Face is perceived more positively than profiles with both direct and averted gaze. Two control studies, using a Brazilian and a Dutch database of photographs, showed a similar but weaker effect with a

  8. Age differences in conscious versus subconscious social perception: The influence of face age and valence on gaze following.

    OpenAIRE

    Bailey, P.E.; Slessor, G.; Rendell, P.G.; Bennetts, Rachel; Campbell, A.; Ruffman, T.

    2014-01-01

    Gaze following is the primary means of establishing joint attention with others and is subject to age-related decline. In addition, young but not older adults experience an own-age bias in gaze following. The current research assessed the effects of subconscious processing on these age-related differences. Participants responded to targets that were either congruent or incongruent with the direction of gaze displayed in supraliminal and subliminal images of young and older faces. These faces ...

  9. Performance studies of the CMS Strip Tracker before installation

    CERN Document Server

    Adam, Wolfgang; Dragicevic, Marko; Friedl, Markus; Fruhwirth, R; Hansel, S; Hrubec, Josef; Krammer, Manfred; Oberegger, Margit; Pernicka, Manfred; Schmid, Siegfried; Stark, Roland; Steininger, Helmut; Uhl, Dieter; Waltenberger, Wolfgang; Widl, Edmund; Van Mechelen, Pierre; Cardaci, Marco; Beaumont, Willem; de Langhe, Eric; de Wolf, Eddi A; Delmeire, Evelyne; Hashemi, Majid; Bouhali, Othmane; Charaf, Otman; Clerbaux, Barbara; Dewulf, Jean-Paul; Elgammal, Sherif; Hammad, Gregory Habib; de Lentdecker, Gilles; Marage, Pierre Edouard; Vander Velde, Catherine; Vanlaer, Pascal; Wickens, John; Adler, Volker; Devroede, Olivier; De Weirdt, Stijn; D'Hondt, Jorgen; Goorens, Robert; Heyninck, Jan; Maes, Joris; Mozer, Matthias Ulrich; Tavernier, Stefaan; Van Lancker, Luc; Van Mulders, Petra; Villella, Ilaria; Wastiels, C; Bonnet, Jean-Luc; Bruno, Giacomo; De Callatay, Bernard; Florins, Benoit; Giammanco, Andrea; Gregoire, Ghislain; Keutgen, Thomas; Kcira, Dorian; Lemaitre, Vincent; Michotte, Daniel; Militaru, Otilia; Piotrzkowski, Krzysztof; Quertermont, L; Roberfroid, Vincent; Rouby, Xavier; Teyssier, Daniel; Daubie, Evelyne; Anttila, Erkki; Czellar, Sandor; Engstrom, Pauli; Harkonen, J; Karimaki, V; Kostesmaa, J; Kuronen, Auli; Lampen, Tapio; Linden, Tomas; Luukka, Panja-Riina; Maenpaa, T; Michal, Sebastien; Tuominen, Eija; Tuominiemi, Jorma; Ageron, Michel; Baulieu, Guillaume; Bonnevaux, Alain; Boudoul, Gaelle; Chabanat, Eric; Chabert, Eric Christian; Chierici, Roberto; Contardo, Didier; Della Negra, Rodolphe; Dupasquier, Thierry; Gelin, Georges; Giraud, Noël; Guillot, Gérard; Estre, Nicolas; Haroutunian, Roger; Lumb, Nicholas; Perries, Stephane; Schirra, Florent; Trocme, Benjamin; Vanzetto, Sylvain; Agram, Jean-Laurent; Blaes, Reiner; Drouhin, Frédéric; Ernenwein, Jean-Pierre; Fontaine, Jean-Charles; Berst, Jean-Daniel; Brom, Jean-Marie; Didierjean, Francois; Goerlach, Ulrich; Graehling, Philippe; Gross, Laurent; Hosselet, J; Juillot, Pierre; Lounis, Abdenour; Maazouzi, Chaker; Olivetto, Christian; Strub, Roger; Van Hove, Pierre; Anagnostou, Georgios; Brauer, Richard; Esser, Hans; Feld, Lutz; Karpinski, Waclaw; Klein, Katja; Kukulies, Christoph; Olzem, Jan; Ostapchuk, Andrey; Pandoulas, Demetrios; Pierschel, Gerhard; Raupach, Frank; Schael, Stefan; Schwering, Georg; Sprenger, Daniel; Thomas, Maarten; Weber, Markus; Wittmer, Bruno; Wlochal, Michael; Beissel, Franz; Bock, E; Flugge, G; Gillissen, C; Hermanns, Thomas; Heydhausen, Dirk; Jahn, Dieter; Kaussen, Gordon; Linn, Alexander; Perchalla, Lars; Poettgens, Michael; Pooth, Oliver; Stahl, Achim; Zoeller, Marc Henning; Buhmann, Peter; Butz, Erik; Flucke, Gero; Hamdorf, Richard Helmut; Hauk, Johannes; Klanner, Robert; Pein, Uwe; Schleper, Peter; Steinbruck, G; Blum, P; De Boer, Wim; Dierlamm, Alexander; Dirkes, Guido; Fahrer, Manuel; Frey, Martin; Furgeri, Alexander; Hartmann, Frank; Heier, Stefan; Hoffmann, Karl-Heinz; Kaminski, Jochen; Ledermann, Bernhard; Liamsuwan, Thiansin; Muller, S; Muller, Th; Schilling, Frank-Peter; Simonis, Hans-Jürgen; Steck, Pia; Zhukov, Valery; Cariola, P; De Robertis, Giuseppe; Ferorelli, Raffaele; Fiore, Luigi; Preda, M; Sala, Giuliano; Silvestris, Lucia; Tempesta, Paolo; Zito, Giuseppe; Creanza, Donato; De Filippis, Nicola; De Palma, Mauro; Giordano, Domenico; Maggi, Giorgio; Manna, Norman; My, Salvatore; Selvaggi, Giovanna; Albergo, Sebastiano; Chiorboli, Massimiliano; Costa, Salvatore; Galanti, Mario; Giudice, Nunzio; Guardone, Nunzio; Noto, Francesco; Potenza, Renato; Saizu, Mirela Angela; Sparti, V; Sutera, 
Concetta; Tricomi, Alessia; Tuve, Cristina; Brianzi, Mirko; Civinini, Carlo; Maletta, Fernando; Manolescu, Florentina; Meschini, Marco; Paoletti, Simone; Sguazzoni, Giacomo; Broccolo, B; Ciulli, Vitaliano; D'Alessandro, Raffaello; Focardi, Ettore; Frosali, Simone; Genta, Chiara; Landi, Gregorio; Lenzi, Piergiulio; Macchiolo, Anna; Magini, Nicolo; Parrini, Giuliano; Scarlini, Enrico; Cerati, Giuseppe Benedetto; Azzi, Patrizia; Bacchetta, Nicola; Candelori, Andrea; Dorigo, Tommaso; Kaminsky, A; Karaevski, S; Khomenkov, Volodymyr; Reznikov, Sergey; Tessaro, Mario; Bisello, Dario; De Mattia, Marco; Giubilato, Piero; Loreti, Maurizio; Mattiazzo, Serena; Nigro, Massimo; Paccagnella, Alessandro; Pantano, Devis; Pozzobon, Nicola; Tosi, Mia; Bilei, Gian Mario; Checcucci, Bruno; Fano, Livio; Servoli, Leonello; Ambroglini, Filippo; Babucci, Ezio; Benedetti, Daniele; Biasini, Maurizio; Caponeri, Benedetta; Covarelli, Roberto; Giorgi, Marco; Lariccia, Paolo; Mantovani, Giancarlo; Marcantonini, Marta; Postolache, Vasile; Santocchia, Attilio; Spiga, Daniele; Bagliesi, Giuseppe; Balestri, Gabriele; Berretta, Luca; Bianucci, S; Boccali, Tommaso; Bosi, Filippo; Bracci, Fabrizio; Castaldi, Rino; Ceccanti, Marco; Cecchi, Roberto; Cerri, Claudio; Cucoanes, Andi Sebastian; Dell'Orso, Roberto; Dobur, Didar; Dutta, Suchandra; Giassi, Alessandro; Giusti, Simone; Kartashov, Dmitry; Kraan, Aafke; Lomtadze, Teimuraz; Lungu, George-Adrian; Magazzu, Guido; Mammini, Paolo; Mariani, Filippo; Martinelli, Giovanni; Moggi, Andrea; Palla, Fabrizio; Palmonari, Francesco; Petragnani, Giulio; Profeti, Alessandro; Raffaelli, Fabrizio; Rizzi, Domenico; Sanguinetti, Giulio; Sarkar, Subir; Sentenac, Daniel; Serban, Alin Titus; Slav, Adrian; Soldani, A; Spagnolo, Paolo; Tenchini, Roberto; Tolaini, Sergio; Venturi, Andrea; Verdini, Piero Giorgio; Vos, Marcel; Zaccarelli, Luciano; Avanzini, Carlo; Basti, Andrea; Benucci, Leonardo; Bocci, Andrea; Cazzola, Ugo; Fiori, Francesco; Linari, Stefano; Massa, Maurizio; Messineo, Alberto; Segneri, Gabriele; Tonelli, Guido; Azzurri, Paolo; Bernardini, Jacopo; Borrello, Laura; Calzolari, Federico; Foa, Lorenzo; Gennai, Simone; Ligabue, Franco; Petrucciani, Giovanni; Rizzi, Andrea; Yang, Zong-Chang; Benotto, Franco; Demaria, Natale; Dumitrache, Floarea; Farano, R; Borgia, Maria Assunta; Castello, Roberto; Costa, Marco; Migliore, Ernesto; Romero, Alessandra; Abbaneo, Duccio; Abbas, M; Ahmed, Ijaz; Akhtar, I; Albert, Eric; Bloch, Christoph; Breuker, Horst; Butt, Shahid Aleem; Buchmuller, Oliver; Cattai, Ariella; Delaere, Christophe; Delattre, Michel; Edera, Laura Maria; Engstrom, Pauli; Eppard, Michael; Gateau, Maryline; Gill, Karl; Giolo-Nicollerat, Anne-Sylvie; Grabit, Robert; Honma, Alan; Huhtinen, Mika; Kloukinas, Kostas; Kortesmaa, Jarmo; Kottelat, Luc-Joseph; Kuronen, Auli; Leonardo, Nuno; Ljuslin, Christer; Mannelli, Marcello; Masetti, Lorenzo; Marchioro, Alessandro; Mersi, Stefano; Michal, Sebastien; Mirabito, Laurent; Muffat-Joly, Jeannine; Onnela, Antti; Paillard, Christian; Pal, Imre; Pernot, Jean-Francois; Petagna, Paolo; Petit, Patrick; Piccut, C; Pioppi, Michele; Postema, Hans; Ranieri, Riccardo; Ricci, Daniel; Rolandi, Gigi; Ronga, Frederic Jean; Sigaud, Christophe; Syed, A; Siegrist, Patrice; Tropea, Paola; Troska, Jan; Tsirou, Andromachi; Vander Donckt, Muriel; Vasey, François; Alagoz, Enver; Amsler, Claude; Chiochia, Vincenzo; Regenfus, Christian; Robmann, Peter; Rochet, Jacky; Rommerskirchen, Tanja; Schmidt, Alexander; Steiner, Stefan; Wilke, Lotte; Church, Ivan; Cole, Joanne; 
Coughlan, John A; Gay, Arnaud; Taghavi, S; Tomalin, Ian R; Bainbridge, Robert; Cripps, Nicholas; Fulcher, Jonathan; Hall, Geoffrey; Noy, Matthew; Pesaresi, Mark; Radicci, Valeria; Raymond, David Mark; Sharp, Peter; Stoye, Markus; Wingham, Matthew; Zorba, Osman; Goitom, Israel; Hobson, Peter R; Reid, Ivan; Teodorescu, Liliana; Hanson, Gail; Jeng, Geng-Yuan; Liu, Haidong; Pasztor, Gabriella; Satpathy, Asish; Stringer, Robert; Mangano, Boris; Affolder, K; Affolder, T; Allen, Andrea; Barge, Derek; Burke, Samuel; Callahan, D; Campagnari, Claudio; Crook, A; D'Alfonso, Mariarosaria; Dietch, J; Garberson, Jeffrey; Hale, David; Incandela, H; Incandela, Joe; Jaditz, Stephen; Kalavase, Puneeth; Kreyer, Steven Lawrence; Kyre, Susanne; Lamb, James; Mc Guinness, C; Mills, C; Nguyen, Harold; Nikolic, Milan; Lowette, Steven; Rebassoo, Finn; Ribnik, Jacob; Richman, Jeffrey; Rubinstein, Noah; Sanhueza, S; Shah, Yousaf Syed; Simms, L; Staszak, D; Stoner, J; Stuart, David; Swain, Sanjay Kumar; Vlimant, Jean-Roch; White, Dean; Ulmer, Keith; Wagner, Stephen Robert; Bagby, Linda; Bhat, Pushpalatha C; Burkett, Kevin; Cihangir, Selcuk; Gutsche, Oliver; Jensen, Hans; Johnson, Mark; Luzhetskiy, Nikolay; Mason, David; Miao, Ting; Moccia, Stefano; Noeding, Carsten; Ronzhin, Anatoly; Skup, Ewa; Spalding, William J; Spiegel, Leonard; Tkaczyk, Slawek; Yumiceva, Francisco; Zatserklyaniy, Andriy; Zerev, E; Anghel, Ioana Maria; Bazterra, Victor Eduardo; Gerber, Cecilia Elena; Khalatian, S; Shabalina, Elizaveta; Baringer, Philip; Bean, Alice; Chen, Jie; Hinchey, Carl Louis; Martin, Christophe; Moulik, Tania; Robinson, Richard; Gritsan, Andrei; Lae, Chung Khim; Tran, Nhan Viet; Everaerts, Pieter; Hahn, Kristan Allan; Harris, Philip; Nahn, Steve; Rudolph, Matthew; Sung, Kevin; Betchart, Burton; Demina, Regina; Gotra, Yury; Korjenevski, Sergey; Miner, Daniel Carl; Orbaker, Douglas; Christofek, Leonard; Hooper, Ryan; Landsberg, Greg; Nguyen, Duong; Narain, Meenakshi; Speer, Thomas; Tsang, Ka Vang

    2009-01-01

    In March 2007 the assembly of the Silicon Strip Tracker was completed at the Tracker Integration Facility at CERN. Nearly 15% of the detector was instrumented using cables, fiber optics, power supplies, and electronics intended for the operation at the LHC. A local chiller was used to circulate the coolant for low temperature operation. In order to understand the efficiency and alignment of the strip tracker modules, a cosmic ray trigger was implemented. From March through July 4.5 million triggers were recorded. This period, referred to as the Sector Test, provided practical experience with the operation of the Tracker, especially safety, data acquisition, power, and cooling systems. This paper describes the performance of the strip system during the Sector Test, which consisted of five distinct periods defined by the coolant temperature. Significant emphasis is placed on comparisons between the data and results from Monte Carlo studies.

  10. Learning to interact with a computer by gaze

    DEFF Research Database (Denmark)

    Aoki, Hirotaka; Hansen, John Paulin; Itoh, Kenji

    2008-01-01

    that inefficient eye movements were dramatically reduced after only 15 to 25 sentences of typing, equal to approximately 3-4 hours of practice. The performance data fit a general learning model based on the power law of practice. The learning model can be used to estimate further improvements in gaze typing...
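
    As a rough illustration of the learning model mentioned above, the sketch below (synthetic numbers and hypothetical parameter values, not the authors' data or code) fits a power law of practice, T(n) = a·n^(-b), to per-sentence typing times and extrapolates further improvement.

```python
# Minimal sketch: fit the power law of practice to hypothetical per-sentence
# gaze-typing times and extrapolate. All numbers are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, b):
    # Time to type the n-th sentence under the power law of practice.
    return a * n ** (-b)

rng = np.random.default_rng(0)
sentences = np.arange(1, 26)                                  # 25 practice sentences
times = 120 * sentences ** (-0.3) + rng.normal(0, 3, sentences.size)

(a, b), _ = curve_fit(power_law, sentences, times, p0=(100.0, 0.2))
print(f"fitted a = {a:.1f} s, learning exponent b = {b:.2f}")
print(f"predicted time for sentence 100: {power_law(100, a, b):.1f} s")
```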

  11. Gaze differences in processing pictures with emotional content.

    Science.gov (United States)

    Budimir, Sanja; Palmović, Marijan

    2011-01-01

    The International Affective Picture System (IAPS) is a set of standardized emotionally evocative color photographs developed by the NIMH Center for Emotion and Attention at the University of Florida. It contains more than 900 emotional pictures indexed by emotional valence, arousal and dominance. However, when IAPS pictures were used to study emotions with event-related potentials, the results showed a great deal of variation and inconsistency. In this research, arousal and dominance of the pictures were controlled while emotional valence was manipulated across three categories: pleasant, neutral and unpleasant pictures. Two experiments were conducted with an eye-tracker in order to determine where participants direct their gaze. Participants were 25 psychology students with normal vision. Every participant saw all pictures in color and the same pictures in a black-and-white version. This makes 200 analyzed units for color pictures and 200 for black-and-white pictures. Every picture was divided into figure and ground. Considering that perception can be influenced by color, edges, luminosity and contrast, and since all those factors are collapsed in the IAPS pictures, we compared the color pictures with the same pictures in black and white. In the first eye-tracking experiment we analyzed 12 emotional pictures and showed that participants had a higher number of fixations on the ground for neutral and unpleasant pictures and on the figure for pleasant pictures. The second experiment was conducted with 4 sets of complementary emotional pictures (pleasant/unpleasant) which differed only in the content of the figure area, and it showed that participants focused more on the figure area than on the ground area. Future ERP (event-related potential) research with IAPS pictures should take these findings into consideration and either choose pictures with a blank ground or adjust the pictures so that the ground is blank. For the following experiments the suggestion is to put emotional content in the figure

  12. Last ATLAS transition radiation tracker module installed

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    The ATLAS transition radiation tracker consists of 96 modules and will join the pixel detector and silicon tracker at the heart of the experiment to map the trajectories of particles and identify electrons produced when proton beams collide. In the last image the team responsible for assembly are shown from left to right: Kirill Egorov (Petersburg Nuclear Physics Institute), Pauline Gagnon (Indiana University), Ben Legeyt (University of Pennsylvania), Chuck Long (Hampton University), John Callahan (Indiana University) and Alex High (University of Pennsylvania).

  13. The influence of banner advertisements on attention and memory: human faces with averted gaze can enhance advertising effectiveness.

    Science.gov (United States)

    Sajjacholapunt, Pitch; Ball, Linden J

    2014-01-01

    Research suggests that banner advertisements used in online marketing are often overlooked, especially when positioned horizontally on webpages. Such inattention invariably gives rise to an inability to remember advertising brands and messages, undermining the effectiveness of this marketing method. Recent interest has focused on whether human faces within banner advertisements can increase attention to the information they contain, since the gaze cues conveyed by faces can influence where observers look. We report an experiment that investigated the efficacy of faces located in banner advertisements to enhance the attentional processing and memorability of banner contents. We tracked participants' eye movements when they examined webpages containing either bottom-right vertical banners or bottom-center horizontal banners. We also manipulated facial information such that banners either contained no face, a face with mutual gaze or a face with averted gaze. We additionally assessed people's memories for brands and advertising messages. Results indicated that relative to other conditions, the condition involving faces with averted gaze increased attention to the banner overall, as well as to the advertising text and product. Memorability of the brand and advertising message was also enhanced. Conversely, in the condition involving faces with mutual gaze, the focus of attention was localized more on the face region rather than on the text or product, weakening any memory benefits for the brand and advertising message. This detrimental impact of mutual gaze on attention to advertised products was especially marked for vertical banners. These results demonstrate that the inclusion of human faces with averted gaze in banner advertisements provides a promising means for marketers to increase the attention paid to such adverts, thereby enhancing memory for advertising information.

  14. The influence of banner advertisements on attention and memory: Human faces with averted gaze can enhance advertising effectiveness

    Directory of Open Access Journals (Sweden)

    Pitch eSajjacholapunt

    2014-03-01

    Full Text Available Research suggests that banner advertisements used in online marketing are often overlooked, especially when positioned horizontally on webpages. Such inattention invariably gives rise to an inability to remember advertising brands and messages, undermining the effectiveness of this marketing method. Recent interest has focused on whether human faces within banner advertisements can increase attention to the information they contain, since the gaze cues conveyed by faces can influence where observers look. We report an experiment that investigated the efficacy of faces located in banner advertisements to enhance the attentional processing and memorability of banner contents. We tracked participants’ eye movements when they examined webpages containing either bottom-right vertical banners or bottom-centre horizontal banners. We also manipulated facial information such that banners either contained no face, a face with mutual gaze or a face with averted gaze. We additionally assessed people’s memories for brands and advertising messages. Results indicated that relative to other conditions, the condition involving faces with averted gaze increased attention to the banner overall, as well as to the advertising text and product. Memorability of the brand and advertising message was also enhanced. Conversely, in the condition involving faces with mutual gaze, the focus of attention was localised more on the face region rather than on the text or product, weakening any memory benefits for the brand and advertising message. This detrimental impact of mutual gaze on attention to advertised products was especially marked for vertical banners. These results demonstrate that the inclusion of human faces with averted gaze in banner advertisements provides a promising means for marketers to increase the attention paid to such adverts, thereby enhancing memory for advertising information.

  15. The Wireless ATM Architecture

    Directory of Open Access Journals (Sweden)

    R. Palitefka

    1998-06-01

    Full Text Available An overview of the proposed wireless ATM structure is provided. Wireless communications have been developed to a level where the offered services can now be extended beyond voice and data. There are already wireless LANs, cordless systems offering data services, and mobile data. Wireless LAN systems are basically planned for local, on-premises and in-house networking, providing short-distance radio or infrared links between computer systems. The main challenge of wireless ATM is to harmonise the development of broadband wireless systems with B-ISDN/ATM services and ATM LANs, and to offer multimedia multiservice features for the support of time-sensitive voice communication, video, desktop multimedia applications, and LAN data traffic for the wireless user.

  16. A novel attention training paradigm based on operant conditioning of eye gaze: Preliminary findings.

    Science.gov (United States)

    Price, Rebecca B; Greven, Inez M; Siegle, Greg J; Koster, Ernst H W; De Raedt, Rudi

    2016-02-01

    Inability to engage with positive stimuli is a widespread problem associated with negative mood states across many conditions, from low self-esteem to anhedonic depression. Though attention retraining procedures have shown promise as interventions in some clinical populations, novel procedures may be necessary to reliably attenuate chronic negative mood in refractory clinical populations (e.g., clinical depression) through, for example, more active, adaptive learning processes. In addition, a focus on individual difference variables predicting intervention outcome may improve the ability to provide such targeted interventions efficiently. To provide preliminary proof-of-principle, we tested a novel paradigm using operant conditioning to train eye gaze patterns toward happy faces. Thirty-two healthy undergraduates were randomized to receive operant conditioning of eye gaze toward happy faces (train-happy) or neutral faces (train-neutral). At the group level, the train-happy condition attenuated sad mood increases following a stressful task, in comparison to train-neutral. In individual differences analysis, greater physiological reactivity (pupil dilation) in response to happy faces (during an emotional face-search task at baseline) predicted decreased mood reactivity after stress. These preliminary results suggest that operant conditioning of eye gaze toward happy faces buffers against stress-induced effects on mood, particularly in individuals who show sufficient baseline neural engagement with happy faces. Eye gaze patterns to emotional face arrays may have a causal relationship with mood reactivity. Personalized medicine research in depression may benefit from novel cognitive training paradigms that shape eye gaze patterns through feedback. Baseline neural function (pupil dilation) may be a key mechanism, aiding in iterative refinement of this approach. (c) 2016 APA, all rights reserved.

  17. Studying the influence of race on the gaze cueing effect using eye tracking method

    Directory of Open Access Journals (Sweden)

    Galina Ya. Menshikova

    2017-06-01

    Full Text Available The gaze direction of another person is an important social cue, allowing us to orient quickly in social interactions. The effect of short-term redirection of visual attention to the same object that other people are looking at is known as the gaze cueing effect. There is evidence that the strength of this effect depends on many social factors, such as trust in a partner, her/his gender, social attitudes, etc. In our study we investigated the influence of the race of face stimuli on the strength of the gaze cueing effect. Using the modified Posner Cueing Task, an attentional shift was assessed in a scene where avatar faces of different races were used as distractors. Participants were instructed to fixate the black dot in the centre of the screen until it changed colour, and then to make a rightward or leftward saccade as quickly as possible, depending on the colour of the fixation point. A male distractor face was shown in the centre of the screen simultaneously with the fixation point. The gaze direction of the distractor face changed from straight ahead to rightward or leftward at the moment the colour of the fixation point changed. It could be either congruent or incongruent with the saccade direction. We used face distractors of three race categories: Caucasian (own-race faces), Asian and African (other-race faces). Twenty-five Caucasian participants took part in our study. The results showed that the race of the face distractors influences the strength of the gaze cueing effect, which manifested in changes in the latency and velocity of the ongoing saccades.
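
    As a sketch of how such a cueing effect is commonly quantified (the trial records below are invented for illustration, not the study's data), the effect is the mean saccade-latency difference between incongruent and congruent trials, computed per distractor-race category.

```python
# Minimal sketch: compute the gaze cueing effect per distractor-race category
# as mean(incongruent latency) - mean(congruent latency). Data are made up.
import statistics

# Each trial: (race of distractor face, congruent?, saccade latency in ms).
trials = [
    ("Caucasian", True, 210), ("Caucasian", False, 232),
    ("Asian", True, 205),     ("Asian", False, 221),
    ("African", True, 208),   ("African", False, 218),
    ("Caucasian", True, 215), ("Caucasian", False, 238),
]

def cueing_effect(race):
    cong = [lat for r, c, lat in trials if r == race and c]
    incong = [lat for r, c, lat in trials if r == race and not c]
    return statistics.mean(incong) - statistics.mean(cong)

for race in ("Caucasian", "Asian", "African"):
    print(f"{race}: gaze cueing effect = {cueing_effect(race):.1f} ms")
```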

  18. There is more to gaze than meets the eye: How animals perceive the visual behaviour of others

    NARCIS (Netherlands)

    Goossens, B.M.A.

    2008-01-01

    Gaze following and the ability to understand that another individual sees something different from oneself are considered important components of human and animal social cognition. In animals, gaze following has been documented in various species, however, the underlying cognitive mechanisms and the

  19. Gaze cuing of attention in snake phobic women: the influence of facial expression

    Directory of Open Access Journals (Sweden)

    Carolina ePletti

    2015-04-01

    Full Text Available Only a few studies investigated whether animal phobics exhibit attentional biases in contexts where no phobic stimuli are present. Among these, recent studies provided evidence for a bias toward facial expressions of fear and disgust in animal phobics. Such findings may be due to the fact that these expressions could signal the presence of a phobic object in the surroundings. To test this hypothesis and further investigate attentional biases for emotional faces in animal phobics, we conducted an experiment using a gaze-cuing paradigm in which participants’ attention was driven by the task-irrelevant gaze of a centrally presented face. We employed dynamic negative facial expressions of disgust, fear and anger and found an enhanced gaze-cuing effect in snake phobics as compared to controls, irrespective of facial expression. These results provide evidence of a general hypervigilance in animal phobics in the absence of phobic stimuli, and indicate that research on specific phobias should not be limited to symptom provocation paradigms.

  20. Radiation hard silicon sensors for the CMS tracker upgrade

    CERN Document Server

    Pohlsen, Thomas

    2013-01-01

    At an instantaneous luminosity of $5 \\times 10^{34}$ cm$^{-2}$ s$^{-1}$, the high-luminosity phase of the Large Hadron Collider (HL-LHC) is expected to deliver a total of $3\\,000$ fb$^{-1}$ of collisions, hereby increasing the discovery potential of the LHC experiments significantly. However, the radiation dose of the tracking systems will be severe, requiring new radiation hard sensors for the CMS tracker. The CMS tracker collaboration has initiated a large material investigation and irradiation campaign to identify the silicon material and design that fulfils all requirements for detectors for the HL-LHC. Focussing on the upgrade of the outer tracker region, pad sensors as well as fully functional strip sensors have been implemented on silicon wafers with different material properties and thicknesses. The samples were irradiated with a mixture of neutrons and protons corresponding to fluences as expected for the positions of detector layers in the future tracker. Different proton energies were used for irr...

  1. Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP).

    Science.gov (United States)

    Acqualagna, Laura; Blankertz, Benjamin

    2013-05-01

    A Brain Computer Interface (BCI) speller is a communication device, which can be used by patients suffering from neurodegenerative diseases to select symbols in a computer application. For patients unable to overtly fixate the target symbol, it is crucial to develop a speller independent of gaze shifts. In the present online study, we investigated rapid serial visual presentation (RSVP) as a paradigm for mental typewriting. We investigated the RSVP speller in three conditions, regarding the Stimulus Onset Asynchrony (SOA) and the use of color features. A vocabulary of 30 symbols was presented one-by-one in a pseudo random sequence at the same location of the display. All twelve participants were able to successfully operate the RSVP speller. The results show a mean online spelling rate of 1.43 symb/min and a mean symbol selection accuracy of 94.8% in the best condition. We conclude that the RSVP is a promising paradigm for BCI spelling and its performance is competitive with the fastest gaze-independent spellers in the literature. The RSVP speller does not require gaze shifts towards different target locations and can be operated by non-spatial visual attention, therefore it can be considered as a valid paradigm in applications with patients with impaired oculo-motor control. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
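
    A minimal sketch of the presentation logic described above; the symbol set, SOA value and block structure are assumptions for illustration rather than the authors' exact configuration.

```python
# Minimal sketch: build a pseudo-random RSVP schedule for a 30-symbol
# vocabulary shown one-by-one at a single location with a fixed SOA.
import random
import string

VOCAB = list(string.ascii_uppercase) + ["_", ".", ",", "<"]   # 30 symbols (assumed)
SOA_MS = 116                                                  # assumed SOA in ms

def rsvp_sequence(n_repetitions=10, seed=0):
    rng = random.Random(seed)
    schedule, t = [], 0
    for _ in range(n_repetitions):
        block = VOCAB[:]
        rng.shuffle(block)              # each block presents every symbol once
        for sym in block:
            schedule.append((t, sym))   # (onset time in ms, symbol)
            t += SOA_MS
    return schedule

print(rsvp_sequence()[:5])
```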

  2. Gaze stabilization in chronic vestibular-loss and in cerebellar ataxia: interactions of feedforward and sensory feedback mechanisms.

    Science.gov (United States)

    Sağlam, M; Lehnen, N

    2014-01-01

    During gaze shifts, humans can use visual, vestibular, and proprioceptive feedback, as well as feedforward mechanisms, for stabilization against active and passive head movements. The contributions of feedforward and sensory feedback control, and the role of the cerebellum, are still under debate. To quantify these contributions, we increased the head moment of inertia in three groups (ten healthy, five chronic vestibular-loss and nine cerebellar-ataxia patients) while they performed large gaze shifts to flashed targets in darkness. This induces undesired head oscillations. Consequently, both active (desired) and passive (undesired) head movements had to be compensated for to stabilize gaze. All groups compensated for active and passive head movements, vestibular-loss patients less than the other groups (P feedforward mechanisms substantially contribute to gaze stabilization. Proprioception alone is not sufficient (gain 0.2). Stabilization against active and passive head movements was not impaired in our cerebellar ataxia patients.

  3. Software alignment of the LHCb Outer Tracker chambers

    Energy Technology Data Exchange (ETDEWEB)

    Deissenroth, Marc

    2010-04-21

    This work presents an alignment algorithm that was developed to precisely determine the positions of the LHCb Outer Tracker detector elements. The algorithm is based on the reconstruction of tracks and exploits that misalignments of the detector change the residual between a measured hit and the reconstructed track. It considers different levels of granularities of the Outer Tracker geometry and fully accounts for correlations of all elements which are imposed by particle trajectories. In extensive tests, simulated shifts and rotations for different levels of the detector granularity have been used as input to the track reconstruction and alignment procedure. With about 260 000 tracks the misalignments are recovered with a statistical precision of O(10 - 100 μm) for the translational degrees of freedom and of O(10^-2 - 10^-1 mrad) for rotations. A study has been performed to determine the impact of Outer Tracker misalignments on the performance of the track reconstruction algorithms. It shows that the achieved statistical precision does not decrease the track reconstruction performance in a significant way. During the commissioning of the LHCb detector, cosmic ray muon events have been collected. The events have been analysed and used for the first alignment of the 216 Outer Tracker modules. The module positions have been determined within ∝ 90 μm. The developed track based alignment algorithm has demonstrated its reliability and is one of the core algorithms which are used for the precise determination of the positions of the LHCb Outer Tracker elements. (orig.)

  4. Software alignment of the LHCb Outer Tracker chambers

    International Nuclear Information System (INIS)

    Deissenroth, Marc

    2010-01-01

    This work presents an alignment algorithm that was developed to precisely determine the positions of the LHCb Outer Tracker detector elements. The algorithm is based on the reconstruction of tracks and exploits that misalignments of the detector change the residual between a measured hit and the reconstructed track. It considers different levels of granularities of the Outer Tracker geometry and fully accounts for correlations of all elements which are imposed by particle trajectories. In extensive tests, simulated shifts and rotations for different levels of the detector granularity have been used as input to the track reconstruction and alignment procedure. With about 260 000 tracks the misalignments are recovered with a statistical precision of O(10 - 100 μm) for the translational degrees of freedom and of O(10^-2 - 10^-1 mrad) for rotations. A study has been performed to determine the impact of Outer Tracker misalignments on the performance of the track reconstruction algorithms. It shows that the achieved statistical precision does not decrease the track reconstruction performance in a significant way. During the commissioning of the LHCb detector, cosmic ray muon events have been collected. The events have been analysed and used for the first alignment of the 216 Outer Tracker modules. The module positions have been determined within ∝ 90 μm. The developed track based alignment algorithm has demonstrated its reliability and is one of the core algorithms which are used for the precise determination of the positions of the LHCb Outer Tracker elements. (orig.)
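
    A toy illustration of the residual-based principle behind such track-based alignment (purely schematic, not the LHCb algorithm): straight tracks are simulated through four measurement planes, one plane is misaligned, and its offset is recovered from the mean hit-minus-track residual.

```python
# Minimal sketch: recover a single plane's misalignment from track residuals
# in one dimension. Geometry, noise and offsets are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
z_planes = np.array([0.0, 1.0, 2.0, 3.0])    # measurement planes along z
true_offset = 0.05                            # misalignment of plane 2 (a.u.)

def simulate_track():
    # Straight track x(z) = x0 + slope*z; hits are smeared, plane 2 is shifted.
    x0, slope = rng.normal(0, 1), rng.normal(0, 0.5)
    hits = x0 + slope * z_planes + rng.normal(0, 0.01, z_planes.size)
    hits[2] += true_offset
    return hits

def mean_residual(n_tracks=5000):
    res = []
    for _ in range(n_tracks):
        hits = simulate_track()
        # Fit the track using the well-aligned planes only (0, 1, 3).
        coeffs = np.polyfit(z_planes[[0, 1, 3]], hits[[0, 1, 3]], 1)
        res.append(hits[2] - np.polyval(coeffs, z_planes[2]))
    return np.mean(res)

print(f"estimated offset of plane 2: {mean_residual():.3f}")   # close to 0.05
```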

  5. Wireless Communication Technologies

    Indian Academy of Sciences (India)

    Wireless Communication Technologies. Since 1999, the wireless LAN has experienced tremendous growth. Reasons: adoption of industry standards; interoperability testing; the progress of wireless equipment to higher data rates; rapid decrease in product ...

  6. LHCb: LHCb Upstream Tracker

    CERN Multimedia

    Manning Jr, P; Stone, S

    2014-01-01

    The LHCb upgrade requires replacing the silicon strip tracker between the vertex locator and the magnet. A new design has been developed and tested based on the "stave" concept planned for the ATLAS upgrade. We will describe the new detector being constructed and show its improved performance in charged particle tracking and triggering.

  7. Visual perception during mirror gazing at one's own face in schizophrenia.

    Science.gov (United States)

    Caputo, Giovanni B; Ferrucci, Roberta; Bortolomasi, Marco; Giacopuzzi, Mario; Priori, Alberto; Zago, Stefano

    2012-09-01

    In normal observers, gazing at one's own face in the mirror for some minutes, at a low illumination level, triggers the perception of strange faces, a new perceptual illusion that has been named 'strange-face in the mirror'. Subjects see distortions of their own faces, but often they see monsters, archetypical faces, faces of dead relatives, and of animals. We designed this study to primarily compare strange-face apparitions in response to mirror gazing in patients with schizophrenia and healthy controls. The study included 16 patients with schizophrenia and 21 healthy controls. In this paper we administered a 7-minute mirror gazing test (MGT). Before the mirror gazing session, all subjects underwent assessment with the Cardiff Anomalous Perception Scale (CAPS). When the 7-minute MGT ended, the experimenter assessed patients and controls with a specifically designed questionnaire and interviewed them, asking them to describe strange-face perceptions. Apparitions of strange-faces in the mirror were significantly more intense in schizophrenic patients than in controls. All the following variables were higher in patients than in healthy controls: frequency (p<.005) and cumulative duration of apparitions (p<.009), number and types of strange-faces (p<.002), self-evaluation scores on Likert-type scales of apparition strength (p<.03) and of reality of apparitions (p<.001). In schizophrenic patients, these Likert-type scales showed correlations (p<.05) with CAPS total scores. These results suggest that the increase of strange-face apparitions in schizophrenia can be produced by ego dysfunction, by body dysmorphic disorder and by misattribution of self-agency. MGT may help in completing the standard assessment of patients with schizophrenia, independently of hallucinatory psychopathology. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. The Oxytocin Receptor Gene (OXTR) and gazing behavior during social interaction: An observational study in young adults

    NARCIS (Netherlands)

    Verhagen, M.; Engels, R.C.M.E.; Roekel, G.H. van

    2014-01-01

    Background: In the present study, the relation between a polymorphic marker within the OXTR gene (rs53576) and gazing behavior during two separate social interaction tasks was examined. Gazing behavior was considered to be an integral part of belonging regulation processes. Methods: We conducted an

  9. Gaze stability of observers watching Op Art pictures.

    Science.gov (United States)

    Zanker, Johannes M; Doyle, Melanie; Robin, Walker

    2003-01-01

    It has been the matter of some debate why we can experience vivid dynamic illusions when looking at static pictures composed from simple black and white patterns. The impression of illusory motion is particularly strong when viewing some of the works of 'Op Artists', such as Bridget Riley's painting Fall. Explanations of the illusory motion have ranged from retinal to cortical mechanisms, and an important role has been attributed to eye movements. To assess the possible contribution of eye movements to the illusory-motion percept we studied the strength of the illusion under different viewing conditions, and analysed the gaze stability of observers viewing the Riley painting and control patterns that do not produce the illusion. Whereas the illusion was reduced, but not abolished, when watching the painting through a pinhole, which reduces the effects of accommodation, it was not perceived in flash afterimages, suggesting an important role for eye movements in generating the illusion for this image. Recordings of eye movements revealed an abundance of small involuntary saccades when looking at the Riley pattern, despite the fact that gaze was kept within the dedicated fixation region. The frequency and particular characteristics of these rapid eye movements can vary considerably between different observers, but, although there was a tendency for gaze stability to deteriorate while viewing a Riley painting, there was no significant difference in saccade frequency between the stimulus and control patterns. Theoretical considerations indicate that such small image displacements can generate patterns of motion signals in a motion-detector network, which may serve as a simple and sufficient, but not necessarily exclusive, explanation for the illusion. Why such image displacements lead to perceptual results with a group of Op Art and similar patterns, but remain invisible for other stimuli, is discussed.
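
    A minimal sketch of the kind of motion-detector network referred to above; this is a textbook Reichardt-style correlator chosen for illustration, not the authors' specific model, and the pattern and displacement are invented.

```python
# Minimal sketch: a Reichardt-style correlation detector that produces a net
# directional signal from a small shift between two frames of a 1-D pattern.
import numpy as np

def reichardt_response(frame_prev, frame_curr, spacing=1):
    # Correlate each point's past luminance with its neighbour's present
    # luminance; the antisymmetric combination signals motion direction.
    left = frame_prev[:-spacing] * frame_curr[spacing:]
    right = frame_curr[:-spacing] * frame_prev[spacing:]
    return np.mean(left - right)

x = np.linspace(0, 8 * np.pi, 512)
pattern = np.sin(x)
shifted = np.sin(x + 0.05)   # a tiny displacement, as a microsaccade would cause
print(f"net motion signal: {reichardt_response(pattern, shifted):+.4f}")
```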

  10. Alignment of the CMS Silicon Strip Tracker during stand-alone Commissioning

    CERN Document Server

    Adam, W.; Dragicevic, M.; Friedl, M.; Fruhwirth, R.; Hansel, S.; Hrubec, J.; Krammer, M.; Oberegger, M.; Pernicka, M.; Schmid, S.; Stark, R.; Steininger, H.; Uhl, D.; Waltenberger, W.; Widl, E.; Van Mechelen, P.; Cardaci, M.; Beaumont, W.; de Langhe, E.; de Wolf, E.A.; Delmeire, E.; Hashemi, M.; Bouhali, O.; Charaf, O.; Clerbaux, B.; Dewulf, J.-P.; Elgammal, S.; Hammad, G.; de Lentdecker, G.; Marage, P.; Vander Velde, C.; Vanlaer, P.; Wickens, J.; Adler, V.; Devroede, O.; De Weirdt, S.; D'Hondt, J.; Goorens, R.; Heyninck, J.; Maes, J.; Mozer, Matthias Ulrich; Tavernier, S.; Van Lancker, L.; Van Mulders, P.; Villella, I.; Wastiels, C.; Bonnet, J.-L.; Bruno, G.; De Callatay, B.; Florins, B.; Giammanco, A.; Gregoire, G.; Keutgen, Th.; Kcira, D.; Lemaitre, V.; Michotte, D.; Militaru, O.; Piotrzkowski, K.; Quertermont, L.; Roberfroid, V.; Rouby, X.; Teyssier, D.; daubie, E.; Anttila, E.; Czellar, S.; Engstrom, P.; Harkonen, J.; Karimaki, V.; Kostesmaa, J.; Kuronen, A.; Lampen, T.; Linden, T.; Luukka, P.-R.; Maenaa, T.; Michal, S.; Tuominen, E.; Tuominiemi, J.; Ageron, M.; Baulieu, G.; Bonnevaux, A.; Boudoul, G.; Chabanat, E.; Chabert, E.; Chierici, R.; Contardo, D.; Della Negra, R.; Dupasquier, T.; Gelin, G.; Giraud, N.; Guillot, G.; Estre, N.; Haroutunian, R.; Lumb, N.; Perries, S.; Schirra, F.; Trocme, B.; Vanzetto, S.; Agram, J.-L.; Blaes, R.; Drouhin, F.; Ernenwein, J.-P.; Fontaine, J.-C.; Berst, J.-D.; Brom, J.-M.; Didierjean, F.; Goerlach, U.; Graehling, P.; Gross, L.; Hosselet, J.; Juillot, P.; Lounis, A.; Maazouzi, C.; Olivetto, C.; Strub, R.; Van Hove, P.; Anagnostou, G.; Brauer, R.; Esser, H.; Feld, L.; Karpinski, W.; Klein, K.; Kukulies, C.; Olzem, J.; Ostapchuk, A.; Pandoulas, D.; Pierschel, G.; Raupach, F.; Schael, S.; Schwering, G.; Sprenger, D.; Thomas, M.; Weber, M.; Wittmer, B.; Wlochal, M.; Beissel, F.; Bock, E.; Flugge, G.; Gillissen, C.; Hermanns, T.; Heydhausen, D.; Jahn, D.; Kaussen, G.; Linn, A.; Perchalla, L.; Poettgens, M.; Pooth, O.; Stahl, A.; Zoeller, M.H.; Buhmann, P.; Butz, E.; Flucke, G.; Hamdorf, R.; Hauk, J.; Klanner, R.; Pein, U.; Schleper, P.; Steinbruck, G.; Blum, P.; De Boer, W.; Dierlamm, A.; Dirkes, G.; Fahrer, M.; Frey, M.; Furgeri, A.; Hartmann, F.; Heier, S.; Hoffmann, K.-H.; Kaminski, J.; Ledermann, B.; Liamsuwan, T.; Muller, S.; Muller, Th.; Schilling, F.-P.; Simonis, H.-J.; Steck, P.; Zhukov, V.; Cariola, P.; De Robertis, G.; Ferorelli, R.; Fiore, L.; Preda, M.; Sala, G.; Silvestris, L.; Tempesta, P.; Zito, G.; Creanza, D.; De Filippis, N.; De Palma, M.; Giordano, D.; Maggi, G.; Manna, N.; My, S.; Selvaggi, G.; Albergo, S.; Chiorboli, M.; Costa, S.; Galanti, M.; Giudice, N.; Guardone, N.; Noto, F.; Potenza, R.; Saizu, M.A.; Sparti, V.; Sutera, C.; Tricomi, A.; Tuve, C.; Brianzi, M.; Civinini, C.; Maletta, F.; Manolescu, F.; Meschini, M.; Paoletti, S.; Sguazzoni, G.; Broccolo, B.; Ciulli, V.; D'Alessandro, R.; Focardi, E.; Frosali, S.; Genta, C.; Landi, G.; Lenzi, P.; Macchiolo, A.; Magini, N.; Parrini, G.; Scarlini, E.; Cerati, G.; Azzi, P.; Bacchetta, N.; Candelori, A.; Dorigo, T.; Kaminsky, A.; Karaevski, S.; Khomenkov, V.; Reznikov, S.; Tessaro, M.; Bisello, D.; De Mattia, M.; Giubilato, P.; Loreti, M.; Mattiazzo, S.; Nigro, M.; Paccagnella, A.; Pantano, D.; Pozzobon, N.; Tosi, M.; Bilei, G.M.; Checcucci, B.; Fano, L.; Servoli, L.; Ambroglini, F.; Babucci, E.; Benedetti, D.; Biasini, M.; Caponeri, B.; Covarelli, R.; Giorgi, M.; Lariccia, P.; Mantovani, G.; Marcantonini, M.; Postolache, V.; Santocchia, A.; Spiga, D.; Bagliesi, G.; Balestri, G.; 
Berretta, L.; Bianucci, S.; Boccali, T.; Bosi, F.; Bracci, F.; Castaldi, R.; Ceccanti, M.; Cecchi, R.; Cerri, C.; Cucoanes, A.S.; Dell'Orso, R.; Dobur, D.; Dutta, S.; Giassi, A.; Giusti, S.; Kartashov, D.; Kraan, A.; Lomtadze, T.; Lungu, G.A.; Magazzu, G.; Mammini, P.; Mariani, F.; Martinelli, G.; Moggi, A.; Palla, F.; Palmonari, F.; Petragnani, G.; Profeti, A.; Raffaelli, F.; Rizzi, D.; Sanguinetti, G.; Sarkar, S.; Sentenac, D.; Serban, A.T.; Slav, A.; Soldani, A.; Spagnolo, P.; Tenchini, R.; Tolaini, S.; Venturi, A.; Verdini, P.G.; Vos, M.; Zaccarelli, L.; Avanzini, C.; Basti, A.; Benucci, L.; Bocci, A.; Cazzola, U.; Fiori, F.; Linari, S.; Massa, M.; Messineo, A.; Segneri, G.; Tonelli, G.; Azzurri, P.; Bernardini, J.; Borrello, L.; Calzolari, F.; Foa, L.; Gennai, S.; Ligabue, F.; Petrucciani, G.; Rizzi, A.; Yang, Z.; Benotto, F.; Demaria, N.; Dumitrache, F.; Farano, R.; Borgia, M.A.; Castello, R.; Costa, M.; Migliore, E.; Romero, A.; Abbaneo, D.; Abbas, M.; Ahmed, I.; Akhtar, I.; Albert, E.; Bloch, C.; Breuker, H.; Butt, S.; Buchmuller, O.; Cattai, A.; Delaere, C.; Delattre, M.; Edera, L.M.; Engstrom, P.; Eppard, M.; Gateau, M.; Gill, K.; Giolo-Nicollerat, A.-S.; Grabit, R.; Honma, A.; Huhtinen, M.; Kloukinas, K.; Kortesmaa, J.; Kottelat, L.J.; Kuronen, A.; Leonardo, N.; Ljuslin, C.; Mannelli, M.; Masetti, L.; Marchioro, A.; Mersi, S.; Michal, S.; Mirabito, L.; Muffat-Joly, J.; Onnela, A.; Paillard, C.; Pal, I.; Pernot, J.F.; Petagna, P.; Petit, P.; Piccut, C.; Pioppi, M.; Postema, H.; Ranieri, R.; Ricci, D.; Rolandi, G.; Ronga, F.; Sigaud, C.; Syed, A.; Siegrist, P.; Tropea, P.; Troska, J.; Tsirou, A.; Vander Donckt, M.; Vasey, F.; Alagoz, E.; Amsler, Claude; Chiochia, V.; Regenfus, Christian; Robmann, P.; Rochet, J.; Rommerskirchen, T.; Schmidt, A.; Steiner, S.; Wilke, L.; Church, I.; Cole, J.; Coughlan, J.; Gay, A.; Taghavi, S.; Tomalin, I.; Bainbridge, R.; Cripps, N.; Fulcher, J.; Hall, G.; Noy, M.; Pesaresi, M.; Radicci, V.; Raymond, D.M.; Sharp, P.; Stoye, M.; Wingham, M.; Zorba, O.; Goitom, I.; Hobson, P.R.; Reid, I.; Teodorescu, L.; Hanson, G.; Jeng, G.-Y.; Liu, H.; Pasztor, G.; Satpathy, A.; Stringer, R.; Mangano, B.; Affolder, K.; Affolder, T.; Allen, A.; Barge, D.; Burke, S.; Callahan, D.; Campagnari, C.; Crook, A.; D'Alfonso, M.; Dietch, J.; Garberson, Jeffrey Ford; Hale, D.; Incandela, H.; Incandela, J.; Jaditz, S.; Kalavase, P.; Kreyer, S.; Kyre, S.; Lamb, J.; Mc Guinnessr, C.; Mills, C.; Nguyen, H.; Nikolic, M.; Lowette, S.; Rebassoo, F.; Ribnik, J.; Richman, J.; Rubinstein, N.; Sanhueza, S.; Shah, Y.; Simms, L.; Staszak, D.; Stoner, J.; Stuart, D.; Swain, S.; Vlimant, J.-R.; White, D.; Ulmer, K.A.; Wagner, S.R.; Bagby, L.; Bhat, P.C.; Burkett, K.; Cihangir, S.; Gutsche, O.; Jensen, H.; Johnson, M.; Luzhetskiy, N.; Mason, D.; Miao, T.; Moccia, S.; Noeding, C.; Ronzhin, A.; Skup, E.; Spalding, W.J.; Spiegel, L.; Tkaczyk, S.; Yumiceva, F.; Zatserklyaniy, A.; Zerev, E.; Anghel, I.; Bazterra, V.E.; Gerber, C.E.; Khalatian, S.; Shabalina, E.; Baringer, Philip S.; Bean, A.; Chen, J.; Hinchey, C.; Martin, C.; Moulik, T.; Robinson, R.; Gritsan, A.V.; Lae, C.K.; Tran, N.V.; Everaerts, P.; Hahn, K.A.; Harris, P.; Nahn, S.; Rudolph, M.; Sung, K.; Betchart, B.; Demina, R.; Gotra, Y.; Korjenevski, S.; Miner, D.; Orbaker, D.; Christofek, L.; Hooper, R.; Landsberg, G.; Nguyen, D.; Narain, M.; Speer, T.; Tsang, K.V.

    2009-01-01

    The results of the CMS tracker alignment analysis are presented using the data from cosmic tracks, optical survey information, and the laser alignment system at the Tracker Integration Facility at CERN. During several months of operation in the spring and summer of 2007, about five million cosmic track events were collected with a partially active CMS Tracker. This allowed us to perform a first alignment of the active silicon modules with the cosmic tracks using three different statistical approaches; validate the survey and laser alignment system performance; and test the stability of Tracker structures under various stresses and temperatures ranging from +15°C to -15°C. Comparison with simulation shows that the achieved alignment precision in the barrel part of the tracker leads to residual distributions similar to those obtained with a random misalignment of 50 (80) microns in the outer (inner) part of the barrel.

  11. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

    Caterina eArtuso

    2012-09-01

    Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g. joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g. fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g. joy-direct gaze) were compared to low binding conditions (e.g. joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  12. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  13. Self-Monitoring of Gaze in High Functioning Autism

    Science.gov (United States)

    Grynszpan, Ouriel; Nadel, Jacqueline; Martin, Jean-Claude; Simonin, Jerome; Bailleul, Pauline; Wang, Yun; Gepner, Daniel; Le Barillier, Florence; Constant, Jacques

    2012-01-01

    Atypical visual behaviour has been recently proposed to account for much of social misunderstanding in autism. Using an eye-tracking system and a gaze-contingent lens display, the present study explores self-monitoring of eye motion in two conditions: free visual exploration and guided exploration via blurring the visual field except for the focal…

  14. 3D Part-Based Sparse Tracker with Automatic Synchronization and Registration

    KAUST Repository

    Bibi, Adel Aamer; Zhang, Tianzhu; Ghanem, Bernard

    2016-01-01

    In this paper, we present a part-based sparse tracker in a particle filter framework where both the motion and appearance model are formulated in 3D. The motion model is adaptive and directed according to a simple yet powerful occlusion handling paradigm, which is intrinsically fused in the motion model. Also, since 3D trackers are sensitive to synchronization and registration noise in the RGB and depth streams, we propose automated methods to solve these two issues. Extensive experiments are conducted on a popular RGBD tracking benchmark, which demonstrate that our tracker can achieve superior results, outperforming many other recent and state-of-the-art RGBD trackers.

  15. 3D Part-Based Sparse Tracker with Automatic Synchronization and Registration

    KAUST Repository

    Bibi, Adel Aamer

    2016-12-13

    In this paper, we present a part-based sparse tracker in a particle filter framework where both the motion and appearance model are formulated in 3D. The motion model is adaptive and directed according to a simple yet powerful occlusion handling paradigm, which is intrinsically fused in the motion model. Also, since 3D trackers are sensitive to synchronization and registration noise in the RGB and depth streams, we propose automated methods to solve these two issues. Extensive experiments are conducted on a popular RGBD tracking benchmark, which demonstrate that our tracker can achieve superior results, outperforming many other recent and state-of-the-art RGBD trackers.
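
    A generic sketch of the particle-filter backbone that such trackers build on, reduced to one dimension for clarity; the paper's actual motion and appearance models are part-based and formulated in 3D, and all numbers below are invented.

```python
# Minimal sketch: a 1-D particle filter -- propagate particles with a motion
# model, weight them with an observation model, estimate, then resample.
import numpy as np

rng = np.random.default_rng(1)
n_particles, motion_noise, obs_noise = 500, 0.5, 1.0
particles = rng.normal(0.0, 1.0, n_particles)

true_pos = 0.0
for t in range(20):
    true_pos += 0.3                                   # the target drifts
    z = true_pos + rng.normal(0.0, obs_noise)         # noisy measurement
    # Motion model: diffuse the particles (adaptive and directed in the paper).
    particles += rng.normal(0.0, motion_noise, n_particles)
    # Observation model: weight particles by the measurement likelihood.
    weights = np.exp(-0.5 * ((z - particles) / obs_noise) ** 2)
    weights /= weights.sum()
    estimate = np.sum(weights * particles)
    # Resample to avoid weight degeneracy.
    particles = rng.choice(particles, n_particles, p=weights)
    print(f"t={t:2d}  estimate={estimate:+.2f}  truth={true_pos:+.2f}")
```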

  16. Cortical Activation during Landmark-Centered vs. Gaze-Centered Memory of Saccade Targets in the Human: An FMRI Study

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2017-06-01

    Full Text Available A remembered saccade target could be encoded in egocentric coordinates such as gaze-centered, or relative to some external allocentric landmark that is independent of the target or gaze (landmark-centered). In comparison to egocentric mechanisms, very little is known about such a landmark-centered representation. Here, we used an event-related fMRI design to identify brain areas supporting these two types of spatial coding (i.e., landmark-centered vs. gaze-centered) for target memory during the Delay phase where only target location, not saccade direction, was specified. The paradigm included three tasks with identical display of visual stimuli but different auditory instructions: Landmark Saccade (remember target location relative to a visual landmark, independent of gaze), Control Saccade (remember original target location relative to gaze fixation, independent of the landmark), and a non-spatial control, Color Report (report target color). During the Delay phase, the Control and Landmark Saccade tasks activated overlapping areas in posterior parietal cortex (PPC) and frontal cortex as compared to the color control, but with higher activation in PPC for target coding in the Control Saccade task and higher activation in temporal and occipital cortex for target coding in Landmark Saccade task. Gaze-centered directional selectivity was observed in superior occipital gyrus and inferior occipital gyrus, whereas landmark-centered directional selectivity was observed in precuneus and midposterior intraparietal sulcus. During the Response phase after saccade direction was specified, the parietofrontal network in the left hemisphere showed higher activation for rightward than leftward saccades. Our results suggest that cortical activation for coding saccade target direction relative to a visual landmark differs from gaze-centered directional selectivity for target memory, from the mechanisms for other types of allocentric tasks, and from the directionally

  17. Teaching Astronomy Using Tracker

    Science.gov (United States)

    Belloni, Mario; Christian, Wolfgang; Brown, Douglas

    2013-01-01

    A recent paper in this journal presented a set of innovative uses of video analysis for introductory physics using Tracker. In addition, numerous other papers have described how video analysis can be a meaningful part of introductory courses. Yet despite this, there are few resources for using video analysis in introductory astronomy classes. In…

  18. Prototype ATLAS straw tracker

    CERN Multimedia

    Laurent Guiraud

    1998-01-01

    This is an early prototype of the straw tracking device for the ATLAS detector at CERN. This detector will be part of the LHC project, scheduled to start operation in 2008. The straw tracker will consist of thousands of gas-filled straws, each containing a wire, allowing the tracks of particles to be followed.

  19. Alignment of Ion Accelerator for Surface Analysis using Theodolite and Laser Tracker

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Tae Sung; Seo, Dong Hyuk; Kim, Dae Il; Kim, Han Sung; Kwon, Hyeok Jung; Cho, Yong Sub [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Two methods are used for ion accelerator alignment: a theodolite and a laser tracker. For the alignment and maintenance of the proton linear accelerator at KOMAC, the laser tracker is typically used. Since the devices for laser tracker alignment are not installed on all ion accelerator components, the two methods were used in parallel. In this paper, the alignment methods are introduced, and the results of each alignment method are presented and compared. The ion accelerator for surface analysis has been aligned using the theodolite and the laser tracker. Each alignment method has advantages as well as weaknesses, but alignment using the laser tracker is stronger than using the theodolite, because it is based on alignment and position data and is more detailed. Also, since the beam distribution is smaller than the accelerator components along the direction of beam travel, aligning the main components (e.g. magnet, chamber, Pelletron tank) with the laser tracker is sufficient to align the ion accelerator.

  20. Novel approach to improve the attitude update rate of a star tracker.

    Science.gov (United States)

    Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong

    2018-03-05

    The star tracker is widely used in attitude control systems of spacecraft for attitude measurement. The attitude update rate of a star tracker is important to guarantee the attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic Rolling Shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of the star tracker at different times. The attitude estimation method with a single star spot is developed to realize multiple attitude updates from a single star image, so as to reach a high update rate. The simulation and experiment are performed to verify the proposed approaches. The test results demonstrate that the proposed approach is effective and the attitude update rate of a star tracker is increased significantly.
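
    A minimal sketch of the rolling-shutter timing idea: the line time, frame start and star identifiers below are assumed for illustration, but they show how each star spot's centroid row maps to its own exposure time, so one image can yield several time-stamped single-star attitude measurements.

```python
# Minimal sketch: in rolling-shutter mode each image row is exposed at a
# slightly different time, so each detected star spot carries its own
# timestamp and can feed a separate attitude update.
frame_start = 0.0          # s, start-of-frame exposure (assumed)
line_time = 25e-6          # s, row-to-row readout offset (assumed)

star_spots = [             # (centroid row, star id) from one image (made up)
    (120, "star A"),
    (860, "star B"),
    (1540, "star C"),
]

for row, star in star_spots:
    t = frame_start + row * line_time
    # Each (star, t) pair feeds a single-star attitude estimate at time t.
    print(f"{star}: row {row:4d} exposed at t = {t * 1e3:.3f} ms")
```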

  1. SOLAR TRACKER CERDAS DAN MURAH BERBASIS MIKROKONTROLER 8 BIT ATMega8535

    OpenAIRE

    I Wayan Sutaya; Ketut Udy Ariawan

    2016-01-01

    A prototype smart solar tracker product based on an 8-bit AVR microcontroller. This solar tracker incorporates an IIR (Infinite Impulse Response) digital filter in its program. Programming this filter requires 32-bit multiplication, whereas the processor available on the microcontroller used is 8-bit. This multiplication can only be carried out on an 8-bit microcontroller by using assembly language, which is a hardware-level language. A smart solar tracker that uses an 8-bit microcontroller ...
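
    A minimal sketch of the kind of fixed-point IIR stage described above, written in Python rather than AVR assembly, with an assumed coefficient and made-up samples; the point is that the 16x16-bit coefficient multiply produces a 32-bit intermediate product, which is what an 8-bit core cannot form without hand-written assembly.

```python
# Minimal sketch: a first-order IIR low-pass filter in Q15 fixed point.
# The coefficient multiply yields a 32-bit intermediate result before the
# right shift scales it back to Q15.
ALPHA_Q15 = int(0.1 * 32768)      # smoothing coefficient in Q15 (assumed)

def iir_step(state_q15, sample_q15):
    # 16x16 -> 32-bit product, then scale back to Q15.
    diff = sample_q15 - state_q15
    return state_q15 + ((ALPHA_Q15 * diff) >> 15)

state = 0
for raw in [1000, 1200, 30000, 1100, 1050]:   # e.g. noisy light-sensor reads
    state = iir_step(state, raw)
    print(state)
```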

  2. The effect of arousal and eye gaze direction on trust evaluations of stranger's faces: A potential pathway to paranoid thinking.

    Science.gov (United States)

    Abbott, Jennie; Middlemiss, Megan; Bruce, Vicki; Smailes, David; Dudley, Robert

    2018-09-01

    When asked to evaluate faces of strangers, people with paranoia show a tendency to rate others as less trustworthy. The present study investigated the impact of arousal on this interpersonal bias, and whether this bias was specific to evaluations of trust or additionally affected other trait judgements. The study also examined the impact of eye gaze direction, as direct eye gaze has been shown to heighten arousal. In two experiments, non-clinical participants completed face rating tasks before and after either an arousal manipulation or control manipulation. Experiment one examined the effects of heightened arousal on judgements of trustworthiness. Experiment two examined the specificity of the bias, and the impact of gaze direction. Experiment one indicated that the arousal manipulation led to lower trustworthiness ratings. Experiment two showed that heightened arousal reduced trust evaluations of trustworthy faces, particularly trustworthy faces with averted gaze. The control group rated trustworthy faces with direct gaze as more trustworthy post-manipulation. There was some evidence that attractiveness ratings were affected similarly to the trust judgements, whereas judgements of intelligence were not affected by higher arousal. In both studies, participants reported low levels of arousal even after the manipulation and the use of a non-clinical sample limits the generalisability to clinical samples. There is a complex interplay between arousal, evaluations of trustworthiness and gaze direction. Heightened arousal influences judgements of trustworthiness, but within the context of face type and gaze direction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. CMS tracker observes muons

    CERN Multimedia

    2006-01-01

    A computer image of a cosmic ray traversing the many layers of the TEC+ silicon sensors. The first cosmic muon tracks have been observed in one of the CMS tracker endcaps. On 14 March, a sector on one of the two large tracker endcaps underwent a cosmic muon run. Since then, thousands of tracks have been recorded. These data will be used not only to study the tracking, but also to exercise various track alignment algorithms. The endcap tested, called the TEC+, is under construction at RWTH Aachen in Germany. The endcaps have a modular design, with silicon strip modules mounted onto wedge-shaped carbon fibre support plates, so-called petals. Up to 28 modules are arranged in radial rings on both sides of these plates. One eighth of an endcap is populated with 18 petals and called a sector. The next major step is a test of the first sector at CMS operating conditions, with the silicon modules at a temperature below -10°C. Afterwards, the remaining seven sectors have to be integrated. In autumn 2006, TEC+ wil...

  4. ATLAS Silicon Microstrip Tracker

    CERN Document Server

    Haefner, Petra; The ATLAS collaboration

    2010-01-01

    The SemiConductor Tracker (SCT), made up from silicon micro-strip detectors is the key precision tracking device in ATLAS, one of the experiments at CERN LHC. The completed SCT is in very good shape: 99.3% of the SCT strips are operational, noise occupancy and hit efficiency exceed the design specifications. In the talk the current status of the SCT will be reviewed. We will report on the operation of the detector and observed problems, with stress on the sensor and electronics performance. TWEPP Summary In December 2009 the ATLAS experiment at the CERN Large Hadron Collider (LHC) recorded the first proton- proton collisions at a centre-of-mass energy of 900 GeV and this was followed by the unprecedented energy of 7 TeV in March 2010. The SemiConductor Tracker (SCT) is the key precision tracking device in ATLAS, made up from silicon micro-strip detectors processed in the planar p-in-n technology. The signal from the strips is processed in the front-end ASICS ABCD3TA, working in the binary readout mode. Data i...

  5. Wireless security in mobile health.

    Science.gov (United States)

    Osunmuyiwa, Olufolabi; Ulusoy, Ali Hakan

    2012-12-01

    Mobile health (m-health) is an extremely broad term that embraces mobile communication in the health sector and data packaging. The four broad categories of wireless networks are wireless personal area network, wireless metropolitan area network, wireless wide area network, and wireless local area network. Wireless local area network is the most notable of the wireless networking tools obtainable in the health sector. Transfer of delicate and critical information on radio frequencies should be secure, and the right to use must be meticulous. This article covers the business opportunities in m-health, threats faced by wireless networks in hospitals, and methods of mitigating these threats.

  6. WOMAN AS OBJECT OF MALE GAZE IN SOME WORKS OF ...

    African Journals Online (AJOL)

    Nkiruka

    marketing and sale of the product, but also an object of male gaze. ... Edward Manet, whose painting Olympia, thought to be inspired by Titian's ... encountered in Western art history, whereas unidentifiable nude males were infrequently.

  7. CMS tracker towards the HL-LHC

    CERN Document Server

    Alunni Solestizi, Luisa

    2015-01-01

    In view of the incoming new LHC era (High Luminosity LHC), characterized by a jump forward in precision and in event rate, all the CMS sub-detectors are developing and studying innovative strategies for triggering, pattern recognition, event timing and so on. A crucial aspect will be the online event selection: a totally new paradigm is needed, given the huge number of events. In this picture the most granular and innermost sub-detector, the tracker, will play a decisive role. The phase-2 tracker will be involved in the L1 Trigger and, taking advantage of both Associative Memories and FPGAs, it can ensure a trigger decision in proper time and with satisfactory performance.

  8. Keeping Your Eye on the Rail: Gaze Behaviour of Horse Riders Approaching a Jump

    Science.gov (United States)

    Hall, Carol; Varley, Ian; Kay, Rachel; Crundall, David

    2014-01-01

    The gaze behaviour of riders during their approach to a jump was investigated using a mobile eye tracking device (ASL Mobile Eye). The timing, frequency and duration of fixations on the jump and the percentage of time when their point of gaze (POG) was located elsewhere were assessed. Fixations were identified when the POG remained on the jump for 100 ms or longer. The jumping skill of experienced but non-elite riders (n = 10) was assessed by means of a questionnaire. Their gaze behaviour was recorded as they completed a course of three identical jumps five times. The speed and timing of the approach was calculated. Gaze behaviour throughout the overall approach and during the last five strides before take-off was assessed following frame-by-frame analyses. Differences in relation to both round and jump number were found. Significantly longer was spent fixated on the jump during round 2, both during the overall approach and during the last five strides (pJump 1 was fixated on significantly earlier and more frequently than jump 2 or 3 (pjump 3 than with jump 1 (p = 0.01) but there was no difference in errors made between rounds. Although no significant correlations between gaze behaviour and skill scores were found, the riders who scored higher for jumping skill tended to fixate on the jump earlier (p = 0.07), when the horse was further from the jump (p = 0.09) and their first fixation on the jump was of a longer duration (p = 0.06). Trials with elite riders are now needed to further identify sport-specific visual skills and their relationship with performance. Visual training should be included in preparation for equestrian sports participation, the positive impact of which has been clearly demonstrated in other sports. PMID:24846055
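
    A minimal sketch of the dwell-time rule used to identify fixations; the sampling rate and data layout are assumptions for illustration, not the ASL Mobile Eye software.

```python
# Minimal sketch: flag a fixation whenever the point of gaze stays on the
# jump's area of interest (AOI) for at least 100 ms.
SAMPLE_RATE_HZ = 30                      # assumed eye-tracker frame rate
MIN_FIX_MS = 100

def fixations_on_aoi(on_aoi_flags):
    """on_aoi_flags: per-frame booleans, True if gaze is on the jump AOI."""
    fixations, run = [], 0
    for i, on in enumerate(on_aoi_flags + [False]):    # sentinel flushes last run
        if on:
            run += 1
        else:
            dur_ms = run * 1000 / SAMPLE_RATE_HZ
            if dur_ms >= MIN_FIX_MS:
                fixations.append((i - run, dur_ms))    # (start frame, duration)
            run = 0
    return fixations

gaze = [False, True, True, True, True, False, True, False, True, True, True, True]
print(fixations_on_aoi(gaze))   # two fixations of ~133 ms each
```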

  9. Four-cell solar tracker

    Science.gov (United States)

    Berdahl, C. M.

    1981-01-01

    Forty cm Sun tracker, consisting of optical telescope and four solar cells, stays pointed at Sun throughout day for maximum energy collection. Each solar cell generates voltage proportional to part of solar image it receives; voltages drive servomotors that keep image centered. Mirrored portion of cylinder extends acquisition angle of device by reflecting Sun image back onto solar cells.
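
    A minimal sketch of the centring logic described above, with gains and cell voltages invented for illustration rather than taken from the flight design: imbalances between opposite pairs of cells yield error signals that drive the two servomotors until the solar image is centred.

```python
# Minimal sketch: quadrant-cell pointing logic -- left/right and top/bottom
# voltage imbalances produce azimuth and elevation servo commands.
K_P = 0.8   # proportional servo gain (assumed)

def servo_commands(v_tl, v_tr, v_bl, v_br):
    """Cell voltages: top-left, top-right, bottom-left, bottom-right."""
    total = v_tl + v_tr + v_bl + v_br or 1e-9                 # avoid divide-by-zero
    azimuth_err = ((v_tr + v_br) - (v_tl + v_bl)) / total     # right minus left
    elevation_err = ((v_tl + v_tr) - (v_bl + v_br)) / total   # top minus bottom
    return K_P * azimuth_err, K_P * elevation_err

# Image displaced toward the upper right: drive both servos to re-centre it.
print(servo_commands(0.9, 1.4, 0.6, 1.1))
```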

  10. Altered attentional and perceptual processes as indexed by N170 during gaze perception in schizophrenia: Relationship with perceived threat and paranoid delusions.

    Science.gov (United States)

    Tso, Ivy F; Calwas, Anita M; Chun, Jinsoo; Mueller, Savanna A; Taylor, Stephan F; Deldin, Patricia J

    2015-08-01

    Using gaze information to orient attention and guide behavior is critical to social adaptation. Previous studies have suggested that abnormal gaze perception in schizophrenia (SCZ) may originate in abnormal early attentional and perceptual processes and may be related to paranoid symptoms. Using event-related brain potentials (ERPs), this study investigated altered early attentional and perceptual processes during gaze perception and their relationship to paranoid delusions in SCZ. Twenty-eight individuals with SCZ or schizoaffective disorder and 32 demographically matched healthy controls (HCs) completed a gaze-discrimination task with face stimuli varying in gaze direction (direct, averted), head orientation (forward, deviated), and emotion (neutral, fearful). ERPs were recorded during the task. Participants rated experienced threat from each face after the task. Participants with SCZ were as accurate as, though slower than, HCs on the task. Participants with SCZ displayed enlarged N170 responses over the left hemisphere to averted gaze presented in fearful relative to neutral faces, indicating a heightened encoding sensitivity to faces signaling external threat. This abnormality was correlated with increased perceived threat and paranoid delusions. Participants with SCZ also showed a reduction of N170 modulation by head orientation (normally increased amplitude to deviated faces relative to forward faces), suggesting less integration of contextual cues of head orientation in gaze perception. The psychophysiological deviations observed during gaze discrimination in SCZ underscore the role of early attentional and perceptual abnormalities in social information processing and paranoid symptoms of SCZ. (c) 2015 APA, all rights reserved).

  11. Development of a new Silicon Tracker at CMS for Super-LHC

    CERN Document Server

    Pesaresi, Mark

    2010-01-01

    Tracking is an essential requirement for any high energy particle physics experiment. The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) employs an all silicon tracker, the largest of its kind, for the precise measurement of track momentum and vertex position. With approximately 10 million detector channels in the strip tracker alone, the analogue non-sparsified readout system has been designed to handle the large data volumes generated at the 100 kHz Level 1 (L1) trigger rate. Fluctuations in the event rate are controlled using buffers whose occupancies are constantly monitored to prevent overflows, otherwise causing loss of synchronisation and data. The status of the tracker is reported by the APV emulator (APVe), which has now been successfully commissioned within the silicon strip tracker readout system. The APVe plays a crucial role in the synchronisation of the tracker by deterministic calculation of the front end buffer occupancy and by monitoring the status of the Front End Dr...

  12. In Defense of Sparse Tracking: Circulant Sparse Tracker

    KAUST Repository

    Zhang, Tianzhu; Bibi, Adel Aamer; Ghanem, Bernard

    2016-01-01

    Sparse representation has been introduced to visual tracking by finding the best target candidate with minimal reconstruction error within the particle filter framework. However, most sparse representation based trackers have high computational cost, less than promising tracking performance, and limited feature representation. To deal with the above issues, we propose a novel circulant sparse tracker (CST), which exploits circulant target templates. Because of the circulant structure property, CST has the following advantages: (1) It can refine and reduce particles using circular shifts of target templates. (2) The optimization can be efficiently solved entirely in the Fourier domain. (3) High dimensional features can be embedded into CST to significantly improve tracking performance without sacrificing much computation time. Both qualitative and quantitative evaluations on challenging benchmark sequences demonstrate that CST performs better than all other sparse trackers and favorably against state-of-the-art methods.
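
    The key property the abstract relies on, namely that all circular shifts of a template can be evaluated at once in the Fourier domain, can be demonstrated with a few lines of NumPy. This sketch only illustrates that circulant/FFT trick on a 1-D signal; it is not the CST algorithm itself, and the shift of 7 samples is an arbitrary test value.

        import numpy as np

        def circular_scores(template, candidate):
            """Scores of all circular shifts of `template` against `candidate`,
            computed at once via the FFT (circular cross-correlation)."""
            T = np.fft.fft(template)
            C = np.fft.fft(candidate)
            return np.real(np.fft.ifft(np.conj(T) * C))

        rng = np.random.default_rng(0)
        template = rng.standard_normal(64)
        candidate = np.roll(template, 7) + 0.05 * rng.standard_normal(64)  # shifted + noise

        scores = circular_scores(template, candidate)
        print(int(np.argmax(scores)))   # recovers the shift of 7 without an explicit loop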

  14. The Benslimane's Artistic Model for Females' Gaze Beauty: An Original Assessment Tool.

    Science.gov (United States)

    Benslimane, Fahd; van Harpen, Laura; Myers, Simon R; Ingallina, Fabio; Ghanem, Ali M

    2017-02-01

    The aim of this paper is to analyze the aesthetic characteristics of the human females' gaze using anthropometry and to present an artistic model to represent it: "The Frame Concept." In this model, the eye fissure represents a painting, and the most peripheral shadows around it represent the frame of this painting. The narrower the frame, the more aesthetically pleasing and youthful the gaze appears. This study included a literature review of the features that make the gaze appear attractive. Photographs of models with attractive gazes were examined, and old photographs of patients were compared to recent photographs. The frame ratio was defined by anthropometric measurements of modern portraits of twenty consecutive Miss World winners. The concept was then validated for age and attractiveness across centuries by analysis of modern female photographs and works of art acknowledged for portraying beautiful young and older women in classical paintings. The frame height inversely correlated with attractiveness in modern female portrait photographs. The eye fissure frame ratio of modern idealized female portraits was similar to that of beautiful female portraits idealized by classical artists. In contrast, the eye fissure frames of classical artists' mothers' portraits were significantly wider than those of beautiful younger women. The Frame Concept is a valid artistic tool that provides an understanding of both the aesthetic and aging characteristics of the female periorbital region, enabling the practitioner to plan appropriate aesthetic interventions. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the A3 online Instructions to Authors. www.springer.com/00266 .

  15. Star tracker operation in a high density proton field

    Science.gov (United States)

    Miklus, Kenneth J.; Kissh, Frank; Flynn, David J.

    1993-01-01

    Algorithms that reject transient signals due to proton effects on charge coupled device (CCD) sensors have been implemented in the HDOS ASTRA-1 Star Trackers to be flown on the TOPEX mission scheduled for launch in July 1992. A unique technique for simulating a proton-rich environment to test trackers is described, as well as the test results obtained. Solar flares or an orbit that passes through the South Atlantic Anomaly can subject the vehicle to very high proton flux levels. There are three ways in which spurious proton-generated signals can impact tracker performance: the many false signals can prevent or extend the time to acquire a star; a proton-generated signal can compromise the accuracy of the star's reported magnitude and position; and the tracked star can be lost, requiring reacquisition. Tests simulating a proton-rich environment were performed on two ASTRA-1 Star Trackers utilizing these new algorithms. There were no false acquisitions, no lost stars, and a significant reduction in reported position errors due to these improvements.

  16. Upgrading the ATLAS barrel tracker for the super-LHC

    International Nuclear Information System (INIS)

    Bates, Richard L.

    2009-01-01

    It has been proposed to increase the luminosity of the large hadron collider (LHC) at CERN by an order of magnitude, with the upgraded machine dubbed super-LHC. The ATLAS experiment will require a new tracker for this high-luminosity operation due to radiation damage and event density. In order to cope with the order of magnitude increase in pile-up backgrounds at the higher luminosity, an all-silicon tracker is being designed. The new strip detector will use significantly shorter strips than the current silicon tracker in order to minimize the occupancy. As the increased luminosity will mean a corresponding increase in radiation dose, a new generation of extremely radiation-hard silicon detectors is required. An R and D program is underway to develop silicon sensors with sufficient radiation hardness. New front-end electronics and readout systems are being designed to cope with the higher data rates. The challenges facing the sensors and the cooling and mechanical support will be discussed. A possible tracker layout will be described.

  17. DIAGNOSIS OF MYASTHENIA GRAVIS USING FUZZY GAZE TRACKING SOFTWARE

    Directory of Open Access Journals (Sweden)

    Javad Rasti

    2015-04-01

    Myasthenia Gravis (MG) is an autoimmune disorder, which may lead to paralysis and even death if not treated on time. One of its primary symptoms is severe muscular weakness, initially arising in the eye muscles. Testing the mobility of the eyeball can help in early detection of MG. In this study, software was designed to analyze the ability of the eye muscles to focus in various directions, thus estimating the MG risk. Progressive weakness in gazing at the directions prompted by the software can reveal abnormal fatigue of the eye muscles, which is an alert sign for MG. To assess the user’s ability to keep gazing at a specified direction, a fuzzy algorithm was applied to images of the user’s eyes to determine the position of the iris in relation to the sclera. The results of the tests performed on 18 healthy volunteers and 18 volunteers in early stages of MG confirmed the validity of the suggested software.
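
    The abstract does not spell out the fuzzy rules, but the basic idea of grading the iris position relative to the sclera can be sketched with simple triangular membership functions. The normalised offset scale and the left/centre/right labels below are assumptions for illustration only, not the paper's algorithm.

        def iris_offset_memberships(offset):
            """Fuzzy memberships for a normalised horizontal iris offset:
            -1.0 = iris fully toward the left corner of the sclera,
             0.0 = centred, +1.0 = fully toward the right corner."""
            o = max(-1.0, min(1.0, offset))
            return {
                "left": max(0.0, -o),               # grows as the iris drifts left
                "centre": max(0.0, 1.0 - abs(o)),   # peaks when the iris is centred
                "right": max(0.0, o),               # grows as the iris drifts right
            }

        # A weakening gaze that drifts back toward centre would show the "centre"
        # membership creeping up while the commanded direction's membership drops.
        print(iris_offset_memberships(0.7))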

  18. Documentation for delivery of Star Tracker to ADEOS II

    DEFF Research Database (Denmark)

    Madsen, Peter Buch; Betto, Maurizio; Denver, Troelz

    1999-01-01

    The documentation EIDP (End Item Data Package) describes all the tests which have been performed on the Flight Hardware of the Star Tracker for the Japanese satellite ADEOS II.

  19. Tracker: Image-Processing and Object-Tracking System Developed

    Science.gov (United States)

    Klimek, Robert B.; Wright, Theodore W.

    1999-01-01

    Tracker is an object-tracking and image-processing program designed and developed at the NASA Lewis Research Center to help with the analysis of images generated by microgravity combustion and fluid physics experiments. Experiments are often recorded on film or videotape for analysis later. Tracker automates the process of examining each frame of the recorded experiment, performing image-processing operations to bring out the desired detail, and recording the positions of the objects of interest. It can load sequences of images from disk files or acquire images (via a frame grabber) from film transports, videotape, laser disks, or a live camera. Tracker controls the image source to automatically advance to the next frame. It can employ a large array of image-processing operations to enhance the detail of the acquired images and can analyze an arbitrarily large number of objects simultaneously. Several different tracking algorithms are available, including conventional threshold and correlation-based techniques, and more esoteric procedures such as "snake" tracking and automated recognition of character data in the image. The Tracker software was written to be operated by researchers, thus every attempt was made to make the software as user friendly and self-explanatory as possible. Tracker is used by most of the microgravity combustion and fluid physics experiments performed by Lewis, and by visiting researchers. This includes experiments performed on the space shuttles, Mir, sounding rockets, zero-g research airplanes, drop towers, and ground-based laboratories. This software automates the analysis of the flame or liquid s physical parameters such as position, velocity, acceleration, size, shape, intensity characteristics, color, and centroid, as well as a number of other measurements. It can perform these operations on multiple objects simultaneously. Another key feature of Tracker is that it performs optical character recognition (OCR). This feature is useful in

  20. LHCb Upgrade: Scintillating Fibre Tracker

    International Nuclear Information System (INIS)

    Tobin, Mark

    2016-01-01

    The LHCb detector will be upgraded during the Long Shutdown 2 (LS2) of the LHC in order to cope with higher instantaneous luminosities and to read out the data at 40 MHz using a trigger-less read-out system. All front-end electronics will be replaced and several sub-detectors must be redesigned to cope with higher occupancy. The current tracking detectors downstream of the LHCb dipole magnet will be replaced by the Scintillating Fibre (SciFi) Tracker. The SciFi Tracker will use scintillating fibres read out by Silicon Photomultipliers (SiPMs). State-of-the-art multi-channel SiPM arrays are being developed to read out the fibres and a custom ASIC will be used to digitise the signals from the SiPMs. The evolution of the design since the Technical Design Report in 2014 and the latest R & D results are presented.

  1. Using Variable Dwell Time to Accelerate Gaze-based Web Browsing with Two-step Selection

    OpenAIRE

    Chen, Zhaokang; Shi, Bertram E.

    2017-01-01

    In order to avoid the "Midas Touch" problem, gaze-based interfaces for selection often introduce a dwell time: a fixed amount of time the user must fixate upon an object before it is selected. Past interfaces have used a uniform dwell time across all objects. Here, we propose an algorithm for adjusting the dwell times of different objects based on the inferred probability that the user intends to select them. In particular, we introduce a probabilistic model of natural gaze behavior while sur...
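
    The record is truncated, but the core idea, shortening the dwell time for objects the user probably intends to select, can be sketched with a simple linear mapping from an inferred selection probability to a dwell threshold. The bounds t_min and t_max and the linear form are assumptions; the paper's probabilistic model of natural gaze behaviour is more elaborate.

        def dwell_threshold(p_select, t_max=1.0, t_min=0.2):
            """Dwell time (seconds) required before an object is selected.

            p_select: inferred probability that the user intends to select the
            object (e.g. from a model of natural gaze behaviour). Likely targets
            get a short dwell; unlikely ones keep a long, Midas-touch-safe dwell.
            """
            p = min(max(p_select, 0.0), 1.0)
            return t_max - p * (t_max - t_min)

        for p in (0.0, 0.5, 0.9):
            print(p, round(dwell_threshold(p), 2))   # 1.0 s, 0.6 s, 0.28 s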

  2. CCNA Wireless Study Guide

    CERN Document Server

    Lammle, Todd

    2010-01-01

    A complete guide to the CCNA Wireless exam by leading networking authority Todd Lammle. The CCNA Wireless certification is the most respected entry-level certification in this rapidly growing field. Todd Lammle is the undisputed authority on networking, and this book focuses exclusively on the skills covered in this Cisco certification exam. The CCNA Wireless Study Guide joins the popular Sybex study guide family and helps network administrators advance their careers with a highly desirable certification.

  3. Social communication with virtual agents: The effects of body and gaze direction on attention and emotional responding in human observers.

    Science.gov (United States)

    Marschner, Linda; Pannasch, Sebastian; Schulz, Johannes; Graupner, Sven-Thomas

    2015-08-01

    In social communication, the gaze direction of other persons provides important information to perceive and interpret their emotional response. Previous research investigated the influence of gaze by manipulating mutual eye contact. Therefore, gaze and body direction have been changed as a whole, resulting in only congruent gaze and body directions (averted or directed) of another person. Here, we aimed to disentangle these effects by using short animated sequences of virtual agents posing with either direct or averted body or gaze. Attention allocation by means of eye movements, facial muscle response, and emotional experience to agents of different gender and facial expressions were investigated. Eye movement data revealed longer fixation durations, i.e., a stronger allocation of attention, when gaze and body direction were not congruent with each other or when both were directed towards the observer. This suggests that direct interaction as well as incongruous signals increase the demands of attentional resources in the observer. For the facial muscle response, only the reaction of muscle zygomaticus major revealed an effect of body direction, expressed by stronger activity in response to happy expressions for direct compared to averted gaze when the virtual character's body was directed towards the observer. Finally, body direction also influenced the emotional experience ratings towards happy expressions. While earlier findings suggested that mutual eye contact is the main source for increased emotional responding and attentional allocation, the present results indicate that direction of the virtual agent's body and head also plays a minor but significant role. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    Directory of Open Access Journals (Sweden)

    Zheng You

    2013-04-01

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker that avoids complicated theoretical derivation. The approach determines the error propagation relationships of the star tracker and allows an error model to be built intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are shown to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.

  5. Optical system error analysis and calibration method of high-accuracy star trackers.

    Science.gov (United States)

    Sun, Ting; Xing, Fei; You, Zheng

    2013-04-08

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker that avoids complicated theoretical derivation. The approach determines the error propagation relationships of the star tracker and allows an error model to be built intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are shown to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.
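
    As a minimal worked example of the kind of error propagation the abstract refers to: for a pinhole model the star direction angle is theta = arctan(r/f), so a centroid error dr maps to an angular error of roughly f/(f^2 + r^2) * dr. The focal length and centroid error below are illustrative numbers, not parameters of the calibrated tracker.

        import math

        def centroid_to_angular_error(delta_r_um, r_mm, f_mm):
            """First-order propagation of a centroiding error to the star-direction
            angle for a pinhole model: theta = atan(r/f)  =>  dtheta = f/(f^2+r^2)*dr."""
            f = f_mm * 1e-3
            r = r_mm * 1e-3
            dr = delta_r_um * 1e-6
            return f / (f * f + r * r) * dr          # radians

        # A 0.5 um centroid error on the optical axis of an assumed 50 mm lens
        # corresponds to roughly 2 arcseconds of attitude error.
        err_rad = centroid_to_angular_error(0.5, 0.0, 50.0)
        print(math.degrees(err_rad) * 3600.0)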

  6. 3D gaze tracking system for NVidia 3D Vision®.

    Science.gov (United States)

    Wibirama, Sunu; Hamamoto, Kazuhiko

    2013-01-01

    Inappropriate parallax settings in stereoscopic content generally cause visual fatigue and visual discomfort. To optimize three-dimensional (3D) effects in stereoscopic content while taking health issues into account, understanding how a user gazes in a 3D direction in virtual space is currently an important research topic. In this paper, we report the development of a novel 3D gaze tracking system for Nvidia 3D Vision® to be used with desktop stereoscopic displays. We suggest an optimized geometric method to accurately measure the position of a virtual 3D object. Our experimental results show that the proposed system achieved better accuracy than the conventional geometric method, with average errors of 0.83 cm, 0.87 cm, and 1.06 cm in the X, Y, and Z dimensions, respectively.
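
    The abstract does not detail the optimized geometric method, but a common baseline for 3D gaze estimation is to intersect the two eye rays and take the midpoint of their shortest connecting segment. The sketch below implements only that baseline; eye positions and gaze directions are assumed to be given in the same display-referenced coordinate frame.

        import numpy as np

        def gaze_point_3d(o_left, d_left, o_right, d_right):
            """Midpoint of the shortest segment between the two gaze rays.
            o_*: 3-D eye positions, d_*: gaze direction vectors (need not be unit)."""
            p, q = np.asarray(o_left, float), np.asarray(o_right, float)
            u, v = np.asarray(d_left, float), np.asarray(d_right, float)
            w0 = p - q
            a, b, c = u @ u, u @ v, v @ v
            d, e = u @ w0, v @ w0
            denom = a * c - b * b
            if abs(denom) < 1e-12:                   # near-parallel rays
                s, t = 0.0, e / c
            else:
                s = (b * e - c * d) / denom
                t = (a * e - b * d) / denom
            return 0.5 * ((p + s * u) + (q + t * v))

        # Two eyes 6 cm apart fixating a point 60 cm in front of the midpoint:
        left, right = np.array([-3.0, 0.0, 0.0]), np.array([3.0, 0.0, 0.0])
        target = np.array([0.0, 0.0, 60.0])
        print(gaze_point_3d(left, target - left, right, target - right))  # ~ [0, 0, 60]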

  7. Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments

    Directory of Open Access Journals (Sweden)

    Gowdham Prabhakar

    2018-01-01

    This paper presents an eye gaze controlled projected display that can be used in aviation and automotive environments as a head-up display. We present details of the hardware and software used in developing the display and an algorithm to improve the performance of pointing and selection tasks in an eye gaze controlled graphical user interface. The algorithm does not require changing the layout of an interface; it rather puts a set of hotspots on clickable targets using a Simulated Annealing algorithm. Four user studies involving driving and flight simulators found that the proposed projected display can improve driving and flying performance and significantly reduce pointing and selection times for secondary mission control tasks compared to existing interaction systems.
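
    The hotspot idea lends itself to a compact sketch: place one hotspot inside each clickable target and let simulated annealing push the hotspots apart, so that gaze pointing errors are less likely to trigger the wrong target. The rectangle format, the cost function (negative minimum pairwise distance) and all tuning constants are assumptions for illustration; the paper's actual objective is not specified in the abstract.

        import math
        import random

        def anneal_hotspots(targets, iters=5000, t0=50.0, seed=1):
            """Place one hotspot inside each rectangular target (x, y, w, h) so the
            minimum distance between any two hotspots is maximised. Needs >= 2 targets."""
            rng = random.Random(seed)
            pts = [(x + w / 2.0, y + h / 2.0) for x, y, w, h in targets]

            def cost(p):  # lower is better: negative of the smallest pairwise distance
                return -min(math.dist(a, b) for i, a in enumerate(p) for b in p[i + 1:])

            cur = best = cost(pts)
            best_pts = list(pts)
            for k in range(iters):
                temp = t0 * (1.0 - k / iters) + 1e-6
                i = rng.randrange(len(targets))
                x, y, w, h = targets[i]
                cand = list(pts)
                cand[i] = (x + rng.random() * w, y + rng.random() * h)
                c = cost(cand)
                # Metropolis acceptance: always keep improvements, sometimes keep worse moves.
                if c < cur or rng.random() < math.exp((cur - c) / temp):
                    pts, cur = cand, c
                    if cur < best:
                        best, best_pts = cur, list(pts)
            return best_pts

        buttons = [(0, 0, 100, 40), (110, 0, 100, 40), (0, 50, 100, 40)]
        print(anneal_hotspots(buttons))   # hotspots drift toward mutually distant corners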

  8. Application Of Expert System Techniques To A Visual Tracker

    Science.gov (United States)

    Myler, Harley R.; Thompson, Wiley E.; Flachs, Gerald M.

    1985-04-01

    A structure for a visual tracking system is presented which relies on information developed from previous tracking scenarios stored in a knowledge base to enhance tracking performance. The system is comprised of a centroid tracker front end which supplies segmented image features to a data reduction algorithm which holds the reduced data in a temporary database relation. This relation is then classified via two separate modes, learn and track. Under learn mode, an external teacher-director operator provides identification and weighting cues for membership in a long-term storage relation within a knowledge base. Track mode operates autonomously from the learn mode, where the system determines feature validity by applying fuzzy set membership criteria to previously stored track information in the database. Results determined from the classification generate tracker directives which either enhance or permit current tracking to continue or cause the tracker to search for alternate targets based upon analysis of a global target tracking list. The classification algorithm is based on correlative analysis of the tracker's segmented output presentation after low-pass filtering derives lower-order harmonics of the feature. The fuzzy set membership criteria are based on size, rotation, frame location, and past history of the feature. The first three factors are linear operations on the spectra, while the last is generated as a context relation in the knowledge base. The context relation interlinks data between features to facilitate tracker operation during feature occlusion or presence of countermeasures.

  9. Prior Knowledge Facilitates Mutual Gaze Convergence and Head Nodding Synchrony in Face-to-face Communication.

    Science.gov (United States)

    Thepsoonthorn, C; Yokozuka, T; Miura, S; Ogawa, K; Miyake, Y

    2016-12-02

    As prior knowledge is claimed to be an essential key to achieve effective education, we are interested in exploring whether prior knowledge enhances communication effectiveness. To demonstrate the effects of prior knowledge, mutual gaze convergence and head nodding synchrony are observed as indicators of communication effectiveness. We conducted an experiment on lecture task between lecturer and student under 2 conditions: prior knowledge and non-prior knowledge. The students in prior knowledge condition were provided the basic information about the lecture content and were assessed their understanding by the experimenter before starting the lecture while the students in non-prior knowledge had none. The result shows that the interaction in prior knowledge condition establishes significantly higher mutual gaze convergence (t(15.03) = 6.72, p < 0.0001; α = 0.05, n = 20) and head nodding synchrony (t(16.67) = 1.83, p = 0.04; α = 0.05, n = 19) compared to non-prior knowledge condition. This study reveals that prior knowledge facilitates mutual gaze convergence and head nodding synchrony. Furthermore, the interaction with and without prior knowledge can be evaluated by measuring or observing mutual gaze convergence and head nodding synchrony.

  10. The wireless internet explained

    CERN Document Server

    Rhoton, John

    2001-01-01

    The Wireless Internet Explained covers the full spectrum of wireless technologies from a wide range of vendors, including initiatives by Microsoft and Compaq. The Wireless Internet Explained takes a practical look at wireless technology. Rhoton explains the concepts behind the physics, and provides an overview that clarifies the convoluted set of standards heaped together under the umbrella of wireless. It then expands on these technical foundations to give a panorama of the increasingly crowded landscape of wireless product offerings. When it comes to actual implementation the book gives abundant down-to-earth advice on topics ranging from the selection and deployment of mobile devices to the extremely sensitive subject of security. Written by an expert on Internet messaging, the author of Digital Press's successful Programmer's Guide to Internet Mail and X.400 and SMTP: Battle of the E-mail Protocols, The Wireless Internet Explained describes and evaluates the current state of the fast-growing and crucial...

  11. Users’ experiences of wearable activity trackers: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Carol Maher

    2017-11-01

    Background: Wearable activity trackers offer considerable promise for helping users to adopt healthier lifestyles. This study aimed to explore users’ experience of activity trackers, including usage patterns, sharing of data to social media, perceived behaviour change (physical activity, diet and sleep), and technical issues/barriers to use. Methods: A cross-sectional online survey was developed and administered to Australian adults who were current or former activity tracker users. Results were analysed descriptively, with differences between current and former users and wearable brands explored using independent samples t-tests, Mann-Whitney, and chi square tests. Results: Participants included 200 current and 37 former activity tracker users (total N = 237) with a mean age of 33.1 years (SD 12.4, range 18–74 years). Fitbit (67.5%) and Garmin devices (16.5%) were most commonly reported. Participants typically used their trackers for sustained periods (5–7 months) and most intended to continue usage. Participants reported they had improved their physical activity (51–81%) more commonly than they had their diet (14–40%) or sleep (11–24%), and slightly more participants reported to value the real time feedback (89%) compared to the long-term monitoring (78%). Most users (70%) reported they had experienced functionality issues with their devices, most commonly related to battery life and technical difficulties. Conclusions: Results suggest users find activity trackers appealing and useful tools for increasing perceived physical activity levels over a sustained period.

  12. Seeing to hear? Patterns of gaze to speaking faces in children with autism spectrum disorders.

    Directory of Open Access Journals (Sweden)

    Julia Irwin

    2014-05-01

    Using eye-tracking methodology, gaze to a speaking face was compared in a group of children with autism spectrum disorders (ASD) and those with typical development (TD). Patterns of gaze were observed under three conditions: audiovisual (AV) speech in auditory noise, visual-only speech, and an AV non-face, non-speech control. Children with ASD looked less to the face of the speaker and fixated less on the speaker’s mouth than TD controls. No differences in gaze were reported for the non-face, non-speech control task. Since the mouth holds much of the articulatory information available on the face, these findings suggest that children with ASD may have reduced access to critical linguistic information. This reduced access to visible articulatory information could be a contributor to the communication and language problems exhibited by children with ASD.

  13. Performance of the LHCb Outer Tracker

    CERN Document Server

    Arink, R; Bachmann, S.; Bagaturia, Y.; Band, H.; Bauer, Th.; Berkien, A.; Farber, Ch.; Bien, A.; Blouw, J.; Ceelie, L.; Coco, V.; Deckenhoff, M.; Deng, Z.; Dettori, F.; van Eijk, D.; Ekelhof, R.; Gersabeck, E.; Grillo, L.; Hulsbergen, W.D.; Karbach, T.M.; Koopman, R.; Kozlinskiy, A.; Langenbruch, Ch.; Lavrentyev, V.; Linn, Ch.; Merk, M.; Merkel, J.; Meissner, M.; Michalowski, J.; Morawski, P.; Nawrot, A.; Nedos, M.; Pellegrino, A.; Polok, G.; van Petten, O.; Rovekamp, J.; Schimmel, F.; Schuylenburg, H.; Schwemmer, R.; Seyfert, P.; Serra, N.; Sluijk, T.; Spaan, B.; Spelt, J.; Storaci, B.; Szczekowski, M.; Swientek, S.; Tolk, S.; Tuning, N.; Uwer, U.; Wiedner, D.; Witek, M.; Zeng, M.; Zwart, A.

    2014-01-01

    The LHCb Outer Tracker is a gaseous detector covering an area of 5 × 6 m² with 12 double layers of straw tubes. The detector and its services are described together with the commissioning and calibration procedures. Based on data of the first LHC running period from 2010 to 2012, the performance of the readout electronics and the single hit resolution and efficiency are presented. The efficiency to detect a hit in the central half of the straw is estimated to be 99.2%, and the position resolution is determined to be approximately 200 μm. The Outer Tracker received a dose in the hottest region corresponding to 0.12 C/cm, and no signs of gain deterioration or other ageing effects are observed.

  14. The Alpha Magnetic Spectrometer Silicon Tracker

    CERN Document Server

    Burger, W J

    1999-01-01

    The Alpha Magnetic Spectrometer (AMS) is designed as an independent module for installation on the International Space Station Alpha (ISSA) in the year 2002 for an operational period of three years. The principal scientific objectives are the searches for antimatter and dark matter in cosmic rays. The AMS uses 5.5 m² of silicon microstrip sensors to reconstruct charged particle trajectories in the field of a permanent magnet. The detector design and construction covered a 3 yr period which terminated with a test flight on the NASA space shuttle Discovery during June 2-12, 1998. In this contribution, we describe the shuttle version of the AMS silicon tracker, including preliminary results of the tracker performance during the flight. (author)

  15. The CDF online silicon vertex tracker

    International Nuclear Information System (INIS)

    Ashmanskas, W.

    2001-01-01

    The CDF Online Silicon Vertex Tracker reconstructs 2-D tracks by linking hit positions measured by the Silicon Vertex Detector to the Central Outer Chamber tracks found by the eXtremely Fast Tracker. The system has been completely built and assembled and it is now being commissioned using the first CDF run II data. The precision measurement of the track impact parameter will allow triggering on B hadron decay vertices and thus investigating important areas in the B sector, like CP violation and Bs mixing. In this paper we briefly review the architecture and the tracking algorithms implemented in the SVT and we report on the performance of the system achieved in the early phase of CDF run II.

  16. The CDF online Silicon Vertex Tracker

    International Nuclear Information System (INIS)

    Ashmanskas, W.; Bardi, A.; Bari, M.; Belforte, S.; Berryhill, J.; Bogdan, M.; Carosi, R.; Cerri, A.; Chlachidze, G.; Culbertson, R.; Dell'Orso, M.; Donati, S.; Fiori, I.; Frisch, H.J.; Galeotti, S.; Giannetti, P.; Glagolev, V.; Moneta, L.; Morsani, F.; Nakaya, T.; Passuello, D.; Punzi, G.; Rescigno, M.; Ristori, L.; Sanders, H.; Sarkar, S.; Semenov, A.; Shochet, M.; Speer, T.; Spinella, F.; Wu, X.; Yang, U.; Zanello, L.; Zanetti, A.M.

    2002-01-01

    The CDF Online Silicon Vertex Tracker (SVT) reconstructs 2D tracks by linking hit positions measured by the Silicon Vertex Detector to the Central Outer Chamber tracks found by the eXtremely Fast Tracker (XFT). The system has been completely built and assembled and it is now being commissioned using the first CDF run II data. The precision measurement of the track impact parameter will allow triggering on B hadron decay vertices and thus investigating important areas in the B sector, like CP violation and Bs mixing. In this paper we briefly review the architecture and the tracking algorithms implemented in the SVT and we report on the performance of the system achieved in the early phase of CDF run II.

  17. A framework for performance evaluation of model-based optical trackers

    NARCIS (Netherlands)

    Smit, F.A.; Liere, van R.

    2008-01-01

    We describe a software framework to evaluate the performance of model-based optical trackers in virtual environments. The framework can be used to evaluate and compare the performance of different trackers under various conditions, to study the effects of varying intrinsic and extrinsic camera parameters.

  18. Postural sway and gaze can track the complex motion of a visual target.

    Directory of Open Access Journals (Sweden)

    Vassilia Hatzitaki

    Variability is an inherent and important feature of human movement. This variability has a form exhibiting a chaotic structure. Visual feedback training using regular predictive visual target motions does not take into account this essential characteristic of human movement, and may result in task-specific learning and loss of visuo-motor adaptability. In this study, we asked how well healthy young adults can track visual target cues of varying degrees of complexity during whole-body swaying in the Anterior-Posterior (AP) and Medio-Lateral (ML) directions. Participants were asked to track three visual target motions: a complex (Lorenz attractor), a noise (brown) and a periodic (sine) moving target while receiving online visual feedback about their performance. Postural sway, gaze and target motion were synchronously recorded and the degree of force-target and gaze-target coupling was quantified using spectral coherence and Cross-Approximate entropy. The analysis revealed that both force-target and gaze-target coupling were sensitive to the complexity of the visual stimulus motions. Postural sway showed a higher degree of coherence with the Lorenz attractor than with the brown noise or sinusoidal stimulus motion. Similarly, gaze was more synchronous with the Lorenz attractor than with the brown noise and sinusoidal stimulus motion. These results were similar regardless of whether tracking was performed in the AP or ML direction. Based on the theoretical model of optimal movement variability, tracking of a complex signal may provide a better stimulus to improve visuo-motor adaptation and learning in postural control.
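
    A minimal sketch of the analysis pipeline the abstract describes: generate a Lorenz-attractor target, add noise to stand in for an imperfect tracking response, and quantify the coupling with spectral coherence. The sampling rate, noise level and SciPy-based implementation are assumptions; the study's actual signals were measured sway, gaze and target motion.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.signal import coherence

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        fs = 50.0                                    # assumed sampling rate, Hz
        t_eval = np.arange(0.0, 60.0, 1.0 / fs)
        sol = solve_ivp(lorenz, (0.0, 60.0), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)

        target = sol.y[0]                            # use x(t) as the target trajectory
        rng = np.random.default_rng(0)
        response = target + rng.normal(0.0, 2.0, target.size)   # noisy "tracking" signal

        f, cxy = coherence(target, response, fs=fs, nperseg=512)
        print(float(f[np.argmax(cxy)]), float(cxy.max()))  # band of strongest target coupling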

  19. Interacting with target tracking algorithms in a gaze-enhanced motion video analysis system

    Science.gov (United States)

    Hild, Jutta; Krüger, Wolfgang; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2016-05-01

    Motion video analysis is a challenging task, particularly if real-time analysis is required. How to provide suitable assistance to the human operator is therefore an important issue. Given that the use of customized video analysis systems is more and more established, one supporting measure is to provide system functions which perform subtasks of the analysis. Recent progress in the development of automated image exploitation algorithms allows, e.g., real-time moving target tracking. Another supporting measure is to provide a user interface which strives to reduce the perceptual, cognitive and motor load of the human operator, for example by incorporating the operator's visual focus of attention. A gaze-enhanced user interface is able to help here. This work extends prior work on automated target recognition, segmentation, and tracking algorithms as well as on the benefits of a gaze-enhanced user interface for interaction with moving targets. We also propose a prototypical system design aiming to combine the qualities of the human observer's perception with those of the automated algorithms in order to improve the overall performance of a real-time video analysis system. In this contribution, we address two novel issues in analyzing gaze-based interaction with target tracking algorithms. The first issue extends the gaze-based triggering of a target tracking process, e.g., investigating how best to relaunch tracking in the case of track loss. The second issue addresses the initialization of tracking algorithms without motion segmentation, where the operator has to provide the system with the object's image region in order to start the tracking algorithm.

  20. The LHCb Silicon Tracker - Control system specific tools and challenges

    CERN Document Server

    Adeva, G; Esperante Pereira, D; Gallas, A; Pazos Alvarez, A; Perez Trigo, E; Rodriguez Perez, P; Saborido, J; Amhis, Y; Bay, A; Blanc, F; Bressieux, J; Conti, G; Dupertuis, F; Fave, V; Frei, R; Gauvin, N; Haefeli, G; Keune, A; Luisier, J; Marki, R; Muresan, R; Nakada, T; Needham, M; Knecht, M; Schneider, O; Tran, M; Anderson, J; Buechler, A; Bursche, A; Chiapolini, N; De Cian, M; Elsasser, C; Salzmann, C; Saornil Gamarra, S; Steiner, S; Steinkamp, O; Straumann, U; van Tilburg, J; Tobin, M; Vollhardt, A; Aquines Gutierrez, O; Bauer, C; Britsch, M; Maciuc, F; Schmelling, M; Voss, H; Iakovenko, V; Okhrimenko, O; Pugatch, V

    2014-01-01

    The Experiment Control System (ECS) of the LHCb Silicon Tracker sub-detectors is built on the integrated LHCb ECS framework. Although all LHCb sub-detectors use the same framework and follow the same guidelines, the Silicon Tracker control system uses some interesting additional features in terms of operation and monitoring. The main details are described in this document. Since its design, the Silicon Tracker control system has been continuously evolving in a quite disorganized way. Some major maintenance activities are required to be able to keep improving. A description of those activities can also be found here.

  1. Wireless Access

    Indian Academy of Sciences (India)

    Wireless access connects to the base station. It offers easy and convenient access, but is costlier than wired technology and faces reliability challenges. We see it as a complementary technology to DSL.

  2. Forecasting method in multilateration accuracy based on laser tracker measurement

    International Nuclear Information System (INIS)

    Aguado, Sergio; Santolaria, Jorge; Samper, David; José Aguilar, Juan

    2017-01-01

    Multilateration based on a laser tracker (LT) requires the measurement of a set of points from three or more positions. Although the LTs’ angular information is not used, multilateration still produces a volume of measurement uncertainty. This paper presents two new coefficients that determine, before performing the necessary measurements, whether measuring a set of points will improve or worsen the accuracy of the multilateration results, avoiding unnecessary measurements and reducing the time and economic cost required. The first, the specific measurement coefficient (MC_LT), is unique to each laser tracker and captures the relationship between the radial and angular laser tracker measurement noise. The second coefficient, β, is related to the specific conditions of measurement: the spatial angle α between the laser tracker positions and its effect on error reduction. Both parameters, MC_LT and β, are linked to the error reduction limits. Besides these, a new methodology is presented to determine the multilateration error reduction limit, both for an ideal laser tracker distribution and for a random one. It provides general rules and advice from synthetic tests, validated through a real test carried out on a coordinate measuring machine. (paper)
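
    For readers unfamiliar with the underlying technique, the sketch below shows plain multilateration: recover a point from the radial distances measured at three or more known laser-tracker stations by nonlinear least squares. The station layout and the SciPy solver choice are illustrative assumptions and are unrelated to the paper's coefficients; they only show the measurement model those coefficients are meant to assess.

        import numpy as np
        from scipy.optimize import least_squares

        def multilaterate(stations, ranges):
            """Least-squares point position from >= 3 known station positions and
            the radial distances (interferometer ranges) measured to the point."""
            stations = np.asarray(stations, dtype=float)
            ranges = np.asarray(ranges, dtype=float)

            def residuals(p):
                return np.linalg.norm(stations - p, axis=1) - ranges

            return least_squares(residuals, x0=stations.mean(axis=0)).x

        stations = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0), (0.5, 0.5, 1.5)]
        true_point = np.array([0.7, 1.1, 0.4])
        ranges = [float(np.linalg.norm(np.asarray(s) - true_point)) for s in stations]
        print(multilaterate(stations, ranges))       # recovers ~ [0.7, 1.1, 0.4]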

  3. mm-Wave Hybrid Photonic Wireless Links for Ultra-High Speed Wireless Transmissions

    DEFF Research Database (Denmark)

    Rommel, Simon; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    Hybrid photonic-wireless transmission schemes in the mm-wave frequency range are promising candidates to enable the multi-gigabit per second data communications required from wireless and mobile networks of the 5th and future generations. Large FCC spectrum allocations for wireless transmission...

  4. Quality control of geological voxel models using experts' gaze

    NARCIS (Netherlands)

    Maanen, P.P. van; Busschers, F.S.; Brouwer, A.M.; Meulen, M.J. van der; Erp, J.B.F. van

    2015-01-01

    Due to an expected increase in geological voxel model data-flow and user demands, the development of improved quality control for such models is crucial. This study explores the potential of a new type of quality control that improves the detection of errors by just using the gaze behavior of 12 experts.

  6. Determinants for sustained use of an activity tracker : observational study

    NARCIS (Netherlands)

    Hermsen, Sander; Moons, Jonas; Kerkhof, Peter; Wiekens, Carina; De Groot, Martijn

    2017-01-01

    BACKGROUND: A lack of physical activity is considered to cause 6% of deaths globally. Feedback from wearables such as activity trackers has the potential to encourage daily physical activity. To date, little research is available on the natural development of adherence to activity trackers or on

  7. Control system design of the CERN/CMS tracker thermal screen

    CERN Document Server

    Carrone, E

    2003-01-01

    The Tracker is one of the CMS (Compact Muon Solenoid experiment) subdetectors to be installed at the LHC (Large Hadron Collider) accelerator, scheduled to start data taking in 2007 at CERN (European Organization for Nuclear Research). The tracker will be operated at a temperature of -10 degree C in order to reduce the radiation damage on the silicon detectors; hence, an insulated environment has to be provided by means of a screen that introduces a thermal separation between the Tracker and the neighboring detection systems. The control system design includes a formal description of the process by means of a thermodynamic model; then, the electrical equivalence is derived. The transfer function is inferred by the ratio of the voltage on the outer skin and the voltage input, i.e. the ratio of the temperature outside the tracker and the heat generated (which is the controlled variable). A PID (Proportional Integral Derivative) controller has been designed using MatLab. The results achieved so far prove that thi...
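
    To make the control idea concrete, here is a toy discrete PID loop acting on a first-order (RC-equivalent) thermal model, the same structure as the electrical equivalence described above. The thermal resistance, capacitance, setpoint and gains are invented illustrative numbers, not the CMS thermal screen parameters.

        def simulate_pid(kp=150.0, ki=0.2, kd=0.0, setpoint=-10.0, dt=1.0, steps=3600):
            """Discrete PID on a toy first-order thermal plant:
            C * dT/dt = (T_env - T) / R + Q, with Q the commanded heating/cooling power (W)."""
            c_th, r_th, t_env = 5.0e4, 0.02, 18.0     # J/K, K/W, degC -- illustrative values only
            temp, integral, prev_err = t_env, 0.0, 0.0
            temps = []
            for _ in range(steps):
                err = setpoint - temp
                integral += err * dt
                derivative = (err - prev_err) / dt
                power = kp * err + ki * integral + kd * derivative   # negative = cooling
                prev_err = err
                temp += dt * ((t_env - temp) / r_th + power) / c_th  # explicit Euler step
                temps.append(temp)
            return temps

        print(round(simulate_pid()[-1], 2))   # settles close to the -10 degC setpoint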

  8. The CMS Tracker Upgrade for HL-LHC: Sensor R&D

    CERN Document Server

    Naseri, Mohsen

    2014-01-01

    At an instantaneous luminosity of 5 × 10^34 cm^-2 s^-1, the high-luminosity phase of the Large Hadron Collider (HL-LHC) is expected to deliver a total of 3000 fb^-1 of collisions, thereby increasing the discovery potential of the LHC experiments significantly. However, the radiation environment of the tracking system will be severe, requiring new radiation-hard sensors for the CMS tracker. Focusing on the upgrade of the outer tracker region, the CMS tracker collaboration has almost completed a large material investigation and irradiation campaign to identify the silicon material and design that fulfils all requirements of a new tracking detector at the HL-LHC. Pad diodes as well as fully functional strip sensors have been implemented on silicon wafers with different material properties and thicknesses. The samples were irradiated with a mixture of neutrons and protons corresponding to fluences as expected for various positions in the future track...

  9. Development and Testing of the AMEGO Silicon Tracker System

    Science.gov (United States)

    Griffin, Sean; Amego Team

    2018-01-01

    The All-sky Medium Energy Gamma-ray Observatory (AMEGO) is a probe-class mission in consideration for the 2020 decadal review designed to operate at energies from ˜ 200 keV to > 10 GeV. Operating a detector in this energy regime is challenging due to the crossover in the interaction cross-section for Compton scattering and pair production. AMEGO is made of four major subsystems: a plastic anticoincidence detector for rejecting cosmic-ray events, a silicon tracker for measuring the energies of Compton scattered electrons and pair-production products, a CZT calorimeter for measuring the energy and location of Compton scattered photons, and a CsI calorimeter for measuring the energy of the pair-production products at high energies. The tracker comprises layers of dual-sided silicon strip detectors which provide energy and localization information for Compton scattering and pair-production events. A prototype tracker system is under development at GSFC; in this contribution we provide details on the verification, packaging, and testing of the prototype tracker, as well as present plans for the development of the front-end electronics, beam tests, and a balloon flight.

  10. The AMS silicon tracker readout, performance results with minimum ionizing particles

    CERN Document Server

    Alpat, B; Battiston, R; Bourquin, Maurice; Burger, W J; Extermann, Pierre; Chang, Y H; Hou, S R; Pauluzzi, M; Produit, N; Qiu, S; Rapin, D; Ribordy, R; Toker, O; Wu, S X

    2000-01-01

    First results for the AMS silicon tracker readout performance are presented. Small 20.0 × 20.0 × 0.300 mm³ silicon microstrip detectors were installed in a 50 GeV electron beam at CERN. The detector readout consisted of prototypes of the tracker data reduction card equipped with a 12-bit ADC and the tracker front-end hybrid with VA_hdr readout chips. The system performance is assessed in terms of signal-to-noise, position resolution, and efficiency. (13 refs).

  11. When What You See Is What You Get: The Consequences of the Objectifying Gaze for Women and Men

    Science.gov (United States)

    Gervais, Sarah J.; Vescio, Theresa K.; Allen, Jill

    2011-01-01

    This research examined the effects of the objectifying gaze on math performance, interaction motivation, body surveillance, body shame, and body dissatisfaction. In an experiment, undergraduate participants (67 women and 83 men) received an objectifying gaze during an interaction with a trained confederate of the other sex. As hypothesized, the…

  12. Using an eye tracker for accurate eye movement artifact correction

    NARCIS (Netherlands)

    Kierkels, J.J.M.; Riani, J.; Bergmans, J.W.M.; Boxtel, van G.J.M.

    2007-01-01

    We present a new method to correct eye movement artifacts in electroencephalogram (EEG) data. By using an eye tracker, whose data cannot be corrupted by any electrophysiological signals, an accurate method for correction is developed. The eye-tracker data is used in a Kalman filter to estimate which
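
    The abstract is truncated, but the Kalman-filter building block it mentions is easy to sketch. Below is a generic scalar random-walk Kalman filter smoothing a noisy gaze trace; the process and measurement variances are invented, and the paper's actual EEG artifact-correction step is not reproduced here.

        import numpy as np

        def kalman_smooth(measurements, q=1e-4, r=0.05 ** 2, x0=0.0, p0=1.0):
            """Scalar Kalman filter with a random-walk state model.
            q: process variance, r: measurement variance (illustrative values)."""
            x, p = x0, p0
            estimates = []
            for z in measurements:
                p = p + q                  # predict: the state may have drifted
                k = p / (p + r)            # Kalman gain
                x = x + k * (z - x)        # update with the new eye-tracker sample
                p = (1.0 - k) * p
                estimates.append(x)
            return np.array(estimates)

        rng = np.random.default_rng(42)
        true_gaze = np.concatenate([np.zeros(100), np.full(100, 2.0)])  # a step (saccade)
        noisy = true_gaze + rng.normal(0.0, 0.05, true_gaze.size)
        print(round(float(kalman_smooth(noisy)[-1]), 3))   # close to 2.0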

  13. Design approach for solar cell and battery of a persistent solar powered GPS tracker

    Science.gov (United States)

    Sahraei, Nasim; Watson, Sterling M.; Pennes, Anthony; Marius Peters, Ian; Buonassisi, Tonio

    2017-08-01

    Sensors with wireless communication can be powered by photovoltaic (PV) devices. However, using solar power requires thoughtful design of the power system, as well as a careful management of the power consumption, especially for devices with cellular communication (because of their higher power consumption). A design approach can minimize system size, weight, and/or cost, while maximizing device performance (data transmission rate and persistence). In this contribution, we describe our design approach for a small form-factor, solar-powered GPS tracker with cellular communication. We evaluate the power consumption of the device in different stages of operation. Combining measured power consumption and the calculated energy-yield of a solar cell, we estimate the battery capacity and solar cell area required for 5 years of continuous operation. We evaluate trade-offs between PV and battery size by simulating the battery state of charge. The data show a trade-off between battery capacity and solar-cell area for given target data transmission rate and persistence. We use this analysis to determine the combination of solar panel area and battery capacity for a given application and the data transmission rate that results in minimum cost or total weight of the system.
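
    The sizing trade-off described above can be explored with a very small state-of-charge simulation: step through an assumed hourly PV/load profile and find the smallest battery that never runs empty. The diurnal profile, the neglect of conversion losses and the numeric ranges are all assumptions for illustration, not the authors' model.

        import numpy as np

        def min_battery_capacity_wh(daily_pv_wh, load_w, days=30):
            """Smallest battery capacity (Wh) that keeps the state of charge above
            zero for a toy profile: all PV energy arrives in six midday hours."""
            pv = np.zeros(24)
            pv[9:15] = daily_pv_wh / 6.0                 # Wh generated per midday hour
            net = np.tile(pv, days) - load_w             # net Wh gained each hour
            for capacity in np.arange(0.5, 200.0, 0.5):
                soc, ok = capacity, True                 # start fully charged
                for delta in net:
                    soc = min(capacity, soc + delta)     # charging is clipped at full
                    if soc < 0.0:
                        ok = False
                        break
                if ok:
                    return float(capacity)
            return None

        # A tracker drawing 50 mW on average (~1.2 Wh/day) against 2 Wh/day of PV:
        print(min_battery_capacity_wh(daily_pv_wh=2.0, load_w=0.05))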

  14. Monitoring radiation damage in the LHCb Silicon Tracker

    CERN Multimedia

    Graverini, Elena

    2018-01-01

    The purpose of LHCb is to search for indirect evidence of new physics in decays of heavy hadrons. The LHCb detector is a single-arm forward spectrometer with precise silicon-strip detectors in the regions with highest particle occupancies. The non-uniform exposure of the LHCb sensors makes it an ideal laboratory to study radiation damage effects in silicon detectors. The LHCb Silicon Tracker is composed of an upstream tracker, the TT, and of the inner part of the downstream tracker (IT). Dedicated scans are regularly taken, which allow a precise measurement of the charge collection efficiency (CCE) and the calibration of the operational voltages. The measured evolution of the effective depletion voltage V_depl is shown, and compared with the Hamburg model prediction. The magnitudes of the sensor leakage current are also analysed and compared to their expected evolution according to phenomenological models. Our results prove that both the TT and the IT will withstand normal operation until the end of the L...

  15. Communications device identification methods, communications methods, wireless communications readers, wireless communications systems, and articles of manufacture

    Science.gov (United States)

    Steele, Kerry D [Kennewick, WA]; Anderson, Gordon A [Benton City, WA]; Gilbert, Ronald W [Morgan Hill, CA]

    2011-02-01

    Communications device identification methods, communications methods, wireless communications readers, wireless communications systems, and articles of manufacture are described. In one aspect, a communications device identification method includes providing identification information regarding a group of wireless identification devices within a wireless communications range of a reader, using the provided identification information, selecting one of a plurality of different search procedures for identifying unidentified ones of the wireless identification devices within the wireless communications range, and identifying at least some of the unidentified ones of the wireless identification devices using the selected one of the search procedures.

  16. Wireless adiabatic power transfer

    International Nuclear Information System (INIS)

    Rangelov, A.A.; Suchowski, H.; Silberberg, Y.; Vitanov, N.V.

    2011-01-01

    Research highlights: efficient and robust mid-range wireless energy transfer between two coils; the adiabatic energy transfer is analogous to adiabatic passage in quantum optics; the energy transfer is insensitive to resonant constraints and to noise in the neighborhood of the coils. Abstract: We propose a technique for efficient mid-range wireless power transfer between two coils, by adapting the process of adiabatic passage for a coherently driven two-state quantum system to the realm of wireless energy transfer. The proposed technique is shown to be robust to noise, resonant constraints, and other interferences that exist in the neighborhood of the coils.
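
    The adiabatic-passage analogy can be illustrated with two coupled-mode equations in which the detuning of the driven coil is swept slowly through resonance; almost all of the energy then ends up in the second coil, largely independent of the exact sweep. All quantities below are dimensionless toy values, not a circuit model of the actual system described in the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        KAPPA = 1.0                    # coil-coil coupling rate (arbitrary units)
        DELTA0, T_FINAL = 20.0, 200.0  # detuning sweep range and sweep duration

        def coupled_modes(t, y):
            """Two coupled modes a1, a2 (split into real/imag parts); the detuning of
            mode 1 is swept linearly through resonance with mode 2."""
            a1 = y[0] + 1j * y[1]
            a2 = y[2] + 1j * y[3]
            delta = -DELTA0 + 2.0 * DELTA0 * t / T_FINAL
            da1 = -1j * (delta * a1 + KAPPA * a2)
            da2 = -1j * (KAPPA * a1)
            return [da1.real, da1.imag, da2.real, da2.imag]

        sol = solve_ivp(coupled_modes, (0.0, T_FINAL), [1.0, 0.0, 0.0, 0.0],
                        rtol=1e-8, atol=1e-10)
        e1 = sol.y[0, -1] ** 2 + sol.y[1, -1] ** 2
        e2 = sol.y[2, -1] ** 2 + sol.y[3, -1] ** 2
        print(round(e1, 3), round(e2, 3))   # nearly all energy transferred to mode 2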

  17. Data Quality Monitoring of the CMS Tracker

    International Nuclear Information System (INIS)

    Dutta, Suchandra

    2011-01-01

    The Data Quality Monitoring system for the Tracker has been developed within the CMS Software framework. It has been designed to be used during online data taking as well as during offline reconstruction. The main goal of the online system is to monitor detector performance and identify problems very efficiently during data collection so that proper actions can be taken to fix them. On the other hand, any issue with data reconstruction or calibration can be detected during offline processing using the same tool. The monitoring is performed using histograms which are filled with information from raw and reconstructed data computed at the level of individual detectors. Furthermore, statistical tests are performed on these histograms to check the quality, and flags are generated automatically. Results are visualized with web-based graphical user interfaces. Final data certification is done by combining these automatic flags and manual inspection. The Tracker DQM system has been successfully used during cosmic data taking and it has been optimised to fulfil the conditions of collision data taking. In this paper we describe the functionality of the CMS Tracker DQM system and the experience acquired during proton-proton collisions.
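
    As a generic illustration of the statistical tests mentioned above (not the actual CMS DQM code), the sketch below compares a monitored sample against a reference with a two-sample Kolmogorov-Smirnov test and turns the p-value into a quality flag. The threshold and the simulated distributions are arbitrary.

        import numpy as np
        from scipy.stats import ks_2samp

        def quality_flag(reference, monitored, alpha=0.01):
            """Flag a monitored distribution as GOOD/BAD by comparing it to a
            reference sample with a two-sample Kolmogorov-Smirnov test."""
            stat, p_value = ks_2samp(reference, monitored)
            return ("GOOD" if p_value > alpha else "BAD"), stat, p_value

        rng = np.random.default_rng(7)
        reference = rng.normal(0.0, 1.0, 5000)        # e.g. a cluster-charge shape from a good run
        good_run = rng.normal(0.0, 1.0, 2000)
        drifted_run = rng.normal(0.3, 1.2, 2000)      # shifted and widened: a misbehaving module

        print(quality_flag(reference, good_run)[0], quality_flag(reference, drifted_run)[0])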

  18. Adaptive gaze stabilization through cerebellar internal models in a humanoid robot

    DEFF Research Database (Denmark)

    Vannucci, Lorenzo; Tolu, Silvia; Falotico, Egidio

    2016-01-01

    Two main classes of reflexes relying on the vestibular system are involved in the stabilization of the human gaze: the vestibulocollic reflex (VCR), which stabilizes the head in space, and the vestibulo-ocular reflex (VOR), which stabilizes the visual axis to minimize retinal image motion. The VOR works in conjunction with the opto-kinetic reflex (OKR), which is a visual feedback mechanism for moving the eye at the same speed as the observed scene. Together they keep the image stationary on the retina. In this work we present the first complete model of gaze stabilization based on the coordination of the VCR, VOR and OKR. The model, inspired by neuroscientific cerebellar theories, is provided with learning and adaptation capabilities based on internal models. Tests on a simulated humanoid platform confirm the effectiveness of our approach.
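
    A toy version of the VOR/OKR coordination can be simulated in a few lines: a feed-forward VOR cancels most of the head velocity and an integrating OKR loop drives the remaining retinal slip toward zero. The gains, sampling step and head motion are invented; the paper's cerebellar internal models are not represented here.

        import numpy as np

        def stabilize_gaze(head_velocity, dt=0.01, vor_gain=0.95, okr_gain=20.0):
            """Return retinal slip over time for an imperfect VOR plus an OKR that
            integrates retinal slip (visual feedback) to cancel the residual motion."""
            e_okr = 0.0
            slips = []
            for h in head_velocity:
                eye_vel = -vor_gain * h + e_okr     # total eye velocity command
                slip = h + eye_vel                  # gaze velocity = retinal slip
                e_okr += -okr_gain * slip * dt      # OKR slowly integrates the slip away
                slips.append(slip)
            return np.array(slips)

        t = np.arange(0.0, 5.0, 0.01)
        head = 30.0 * np.sin(2.0 * np.pi * 0.5 * t)          # 0.5 Hz head oscillation, deg/s
        slip = stabilize_gaze(head)
        # Residual slip after the initial transient is far smaller than the head velocity.
        print(round(float(np.abs(head).max()), 1), round(float(np.abs(slip[200:]).max()), 2))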

  19. Wireless communication technology NFC

    OpenAIRE

    MÁROVÁ, Kateřina

    2014-01-01

    The aim of this bachelor thesis is to address the new wireless communication technology NFC (Near Field Communication), including a comparison of the advantages and disadvantages of NFC with other wireless technologies (Bluetooth, Wi-Fi, etc.). NFC is a technology for wireless communication between different electronic devices, one of which is typically a mobile phone. Near Field Communication allows wireless communication at very short distances by bringing two devices close together or touching them, and can...

  20. 75 FR 8400 - In the Matter of Certain Wireless Communications System Server Software, Wireless Handheld...

    Science.gov (United States)

    2010-02-24

    ... Communications System Server Software, Wireless Handheld Devices and Battery Packs; Notice of Investigation... within the United States after importation of certain wireless communications system server software... certain wireless communications system server software, wireless handheld devices or battery packs that...