WorldWideScience

Sample records for human computer qhc

  1. Qubits and quantum Hamiltonian computing performances for operating a digital Boolean 1/2-adder

    Science.gov (United States)

    Dridi, Ghassen; Faizy Namarvar, Omid; Joachim, Christian

    2018-04-01

    Quantum Boolean (1 + 1) digits 1/2-adders are designed with 3 qubits for the quantum computing (Qubits) and 4 quantum states for the quantum Hamiltonian computing (QHC) approaches. Detailed analytical solutions are provided to analyse the time operation of those different 1/2-adder gates. QHC is more robust to noise than Qubits and requires about the same amount of energy for running its 1/2-adder logical operations. QHC is faster in time than Qubits but its logical output measurement takes longer.
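
Concretely, the Boolean behaviour that both the 3-qubit and the 4-state QHC 1/2-adder designs must reproduce is the ordinary half-adder truth table (sum = XOR, carry = AND). A minimal sketch in Python, illustrating only the target logic, not either quantum implementation:

```python
# Classical half-adder: the logical reference that both the Qubits
# and the QHC designs are built to reproduce.
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for one-bit inputs a and b."""
    return a ^ b, a & b  # sum = XOR, carry = AND

# Enumerate the full (1 + 1)-digit truth table.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"a={a} b={b} -> sum={s} carry={c}")
```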

  2. The mathematics of a quantum Hamiltonian computing half adder Boolean logic gate

    International Nuclear Information System (INIS)

    Dridi, G; Julien, R; Hliwa, M; Joachim, C

    2015-01-01

    The mathematics behind the quantum Hamiltonian computing (QHC) approach of designing Boolean logic gates with a quantum system is given. Using the quantum eigenvalue repulsion effect, the QHC AND, NAND, OR, NOR, XOR, and NXOR Hamiltonian Boolean matrices are constructed. This is applied to the construction of a QHC half adder Hamiltonian matrix requiring only six quantum states to fulfil a half-adder Boolean logical truth table. The QHC design rules open a nano-architectronic way of constructing Boolean logic gates inside a single molecule or atom by atom at the surface of a passivated semiconductor. (paper)

  3. The mathematics of a quantum Hamiltonian computing half adder Boolean logic gate.

    Science.gov (United States)

    Dridi, G; Julien, R; Hliwa, M; Joachim, C

    2015-08-28

    The mathematics behind the quantum Hamiltonian computing (QHC) approach of designing Boolean logic gates with a quantum system is given. Using the quantum eigenvalue repulsion effect, the QHC AND, NAND, OR, NOR, XOR, and NXOR Hamiltonian Boolean matrices are constructed. This is applied to the construction of a QHC half adder Hamiltonian matrix requiring only six quantum states to fulfil a half-adder Boolean logical truth table. The QHC design rules open a nano-architectronic way of constructing Boolean logic gates inside a single molecule or atom by atom at the surface of a passivated semiconductor.

  4. Realization of a quantum Hamiltonian Boolean logic gate on the Si(001):H surface.

    Science.gov (United States)

    Kolmer, Marek; Zuzak, Rafal; Dridi, Ghassen; Godlewski, Szymon; Joachim, Christian; Szymonski, Marek

    2015-08-07

    The design and construction of the first prototypical QHC (Quantum Hamiltonian Computing) atomic scale Boolean logic gate is reported using scanning tunnelling microscope (STM) tip-induced atom manipulation on an Si(001):H surface. The NOR/OR gate truth table was confirmed by dI/dU STS (Scanning Tunnelling Spectroscopy) tracking how the surface states of the QHC quantum circuit on the Si(001):H surface are shifted according to the input logical status.

  5. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  6. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  7. Human Computer Music Performance

    OpenAIRE

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  8. Cooperation in human-computer communication

    OpenAIRE

    Kronenberg, Susanne

    2000-01-01

    The goal of this thesis is to simulate cooperation in human-computer communication, modelling the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in which coherence is maintained between the contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

  9. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  10. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  11. From Human-Computer Interaction to Human-Robot Social Interaction

    OpenAIRE

    Toumi, Tarek; Zidani, Abdelmadjid

    2014-01-01

    Human-Robot Social Interaction has become one of the active research fields in which researchers from different areas propose solutions and directives leading robots to improve their interactions with humans. In this paper we propose to introduce works in both human-robot interaction and human-computer interaction and to make a bridge between them, i.e. to integrate the emotion and capability concepts of the robot into a human-computer model so that it becomes adequate for human-robot interaction, and discuss chall...

  12. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

    Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  13. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  14. Occupational stress in human computer interaction.

    Science.gov (United States)

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  15. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  16. Human-computer interaction : Guidelines for web animation

    OpenAIRE

    Galyani Moghaddam, Golnessa; Moballeghi, Mostafa

    2006-01-01

    Human-computer interaction in the large is an interdisciplinary area which attracts researchers, educators, and practitioners from many different fields. Human-computer interaction studies a human and a machine in communication; it draws from supporting knowledge on both the machine and the human side. This paper is related to the human side of human-computer interaction and focuses on animations. The growing use of animation in Web pages testifies to the increasing ease with which such multim...

  17. Human Computing and Machine Understanding of Human Behavior: A Survey

    NARCIS (Netherlands)

    Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas; Quek, F.; Yang, Yie

    2006-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

  18. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  19. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  20. Guest Editorial Special Issue on Human Computing

    NARCIS (Netherlands)

    Pantic, Maja; Santos, E.; Pentland, A.; Nijholt, Antinus

    2009-01-01

    The seven articles in this special issue focus on human computing. Most focus on two challenging issues in human computing, namely, machine analysis of human behavior in group interactions and context-sensitive modeling.

  1. Stereo Vision for Unrestricted Human-Computer Interaction

    OpenAIRE

    Eldridge, Ross; Rudolph, Heiko

    2008-01-01

    Human computer interfaces have come a long way in recent years, but the goal of a computer interpreting unrestricted human movement remains elusive. The use of stereo vision in this field has enabled the development of systems that begin to approach this goal. As computer technology advances we come ever closer to a system that can react to the ambiguities of human movement in real-time. In the foreseeable future stereo computer vision is not likely to replace the keyboard or mouse. There is at...

  2. Making IBM's Computer, Watson, Human

    Science.gov (United States)

    Rachlin, Howard

    2012-01-01

    This essay uses the recent victory of an IBM computer (Watson) in the TV game, "Jeopardy," to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered…

  3. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The themes of HumanCom and EMC are focused on the various aspects of human-centric computing for advances in computer science and its applications, and on embedded and multimedia computing, and they provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. Therefore this book will include the various theories and practical applications in human-centric computing and embedded and multimedia computing.

  4. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic:

  5. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  6. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  7. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    ""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  8. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  9. Measuring Multimodal Synchrony for Human-Computer Interaction

    NARCIS (Netherlands)

    Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

    2010-01-01

    Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their

  10. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  11. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points

  12. Handbook of human computation

    CERN Document Server

    Michelucci, Pietro

    2013-01-01

    This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Reade

  13. Human computer confluence applied in healthcare and rehabilitation.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen

    2012-01-01

    Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as diverse as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual & augmented reality, and it offers considerable potential for applications in medicine and rehabilitation.

  14. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  15. The Next Wave: Humans, Computers, and Redefining Reality

    Science.gov (United States)

    Little, William

    2018-01-01

    The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current new and future work in Computer Vision, Speech Recognition, and Artificial Intelligence are also outlined.

  16. Human computing and machine understanding of human behavior: A survey

    NARCIS (Netherlands)

    Pentland, Alex; Huang, Thomas S.; Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing should be about anticipatory user interfaces

  17. Plasmonic photocatalytic reactions enhanced by hot electrons in a one-dimensional quantum well

    Directory of Open Access Journals (Sweden)

    H. J. Huang

    2015-11-01

    The plasmonic endothermic oxidation of ammonium ions in a spinning disk reactor resulted in light energy transformation through quantum hot charge carriers (QHC), or quantum hot electrons, during a chemical reaction. It is demonstrated with a simple model that light of various intensities enhances the chemical oxidation of ammonium ions in water. It was further observed that light illumination, which induces the formation of plasmons on a platinum (Pt) thin film, provided higher processing efficiency compared with the reaction on a bare glass disk. These induced plasmons generate quantum hot electrons with increasing momentum and energy in the one-dimensional quantum well of a Pt thin film. The energy carried by the quantum hot electrons provided the energy needed to catalyze the chemical reaction. The results indicate that one-dimensional confinement in spherical coordinates (i.e., nanoparticles) is not necessary to provide an extra excited state for QHC generation; an 8 nm Pt thin film for one-dimensional confinement in Cartesian coordinates can also provide the extra excited state for the generation of QHC.

  18. Approaching Engagement towards Human-Engaged Computing

    DEFF Research Database (Denmark)

    Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

    2018-01-01

    Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the increasingly intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to...

  19. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  20. Image Visual Realism: From Human Perception to Machine Computation.

    Science.gov (United States)

    Fan, Shaojing; Ng, Tian-Tsong; Koenig, Bryan L; Herberg, Jonathan S; Jiang, Ming; Shen, Zhiqi; Zhao, Qi

    2017-08-30

    Visual realism is defined as the extent to which an image appears to people as a photo rather than computer generated. Assessing visual realism is important in applications like computer graphics rendering and photo retouching. However, current realism evaluation approaches use either labor-intensive human judgments or automated algorithms largely dependent on comparing renderings to reference images. We develop a reference-free computational framework for visual realism prediction to overcome these constraints. First, we construct a benchmark dataset of 2520 images with comprehensive human annotated attributes. From statistical modeling on this data, we identify image attributes most relevant for visual realism. We propose both empirically-based (guided by our statistical modeling of human data) and CNN-learned features to predict visual realism of images. Our framework has the following advantages: (1) it creates an interpretable and concise empirical model that characterizes human perception of visual realism; (2) it links computational features to latent factors of human image perception.

  1. Parallel structures in human and computer memory

    Science.gov (United States)

    Kanerva, Pentti

    1986-08-01

    If we think of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library: We recognize a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. This paper is about how to construct a computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. The paper concludes that the frame problem of artificial intelligence could be solved by the use of such a memory if we were able to encode information about the world properly.
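
The approximate-cue retrieval Kanerva describes can be illustrated, purely schematically, as best-match lookup over stored binary patterns: return whichever stored pattern is nearest the (possibly noisy) cue in Hamming distance. This toy sketch is not his actual sparse distributed memory construction, only the retrieval idea:

```python
# Toy content-addressable memory: recall the stored pattern closest
# to a noisy cue in Hamming distance. A sketch of "approximate
# retrieval cues suffice", not Kanerva's actual design.
def hamming(x: str, y: str) -> int:
    """Count positions where two equal-length bit strings differ."""
    return sum(a != b for a, b in zip(x, y))

def recall(cue: str, memory: list[str]) -> str:
    """Return the stored pattern with minimal Hamming distance to cue."""
    return min(memory, key=lambda item: hamming(cue, item))

memory = ["101010", "111000", "000111"]
print(recall("101110", memory))  # -> 101010 (one bit away from the cue)
```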

  2. Multimodal Information Presentation for High-Load Human Computer Interaction

    NARCIS (Netherlands)

    Cao, Y.

    2011-01-01

    This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

  3. My4Sight: A Human Computation Platform for Improving Flu Predictions

    OpenAIRE

    Akupatni, Vivek Bharath

    2015-01-01

    While many human computation (human-in-the-loop) systems exist in the field of Artificial Intelligence (AI) to solve problems that can't be solved by computers alone, comparatively few platforms exist for collecting human knowledge and for evaluating various techniques for harnessing human insights to improve forecasting models for infectious diseases, such as Influenza and Ebola. In this thesis, we present the design and implementation of My4Sight, a human computation system develope...

  4. Pilots of the future - Human or computer?

    Science.gov (United States)

    Chambers, A. B.; Nagel, D. C.

    1985-01-01

    In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

  5. Artificial Intelligence for Human Computing

    NARCIS (Netherlands)

    Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

  6. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  7. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  8. The Past, Present and Future of Human Computer Interaction

    KAUST Repository

    Churchill, Elizabeth

    2018-01-16

    Human Computer Interaction (HCI) focuses on how people interact with, and are transformed by, computation. Our current technology landscape is changing rapidly. Interactive applications, devices and services are increasingly embedded in our environments, from our homes to the urban and rural spaces we traverse every day. We are increasingly able to, and often required to, manage and configure multiple, interconnected devices and program their interactions. Artificial intelligence (AI) techniques are being used to create dynamic services that learn about us and others, that draw conclusions about our intents and affiliations, and that mould our digital interactions based on predictions about our actions and needs, nudging us toward certain behaviors. Computation is also increasingly embedded into our bodies, making it essential to understand human interactions in everyday digital and physical contexts. During this lecture, Elizabeth Churchill - Director of User Experience at Google - will talk about how this emerging landscape invites us to revisit old methods and tactics for understanding how people interact with computers and computation, and how it challenges us to think about new methods and frameworks for understanding the future of human-centered computation.

  9. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  10. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    Many efforts have been directed toward enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing before outlining a new approach that adds analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  11. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBP), soft controls (SCs), and integrated information systems are being adopted in main control rooms (MCR) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified important characteristics of operator behavior and design-related influencing factors (DIFs) from the perspective of human reliability. First, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially CBPs and SCs. For computer-based procedures, as opposed to paper-based procedures, structural and managerial elements should be considered important performance shaping factors in addition to the procedural content. For soft controls, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Second, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  12. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.

  13. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting, state-of-the-art collection of papers on recent progress in Human-Computer System Interaction (H-CSI). It provides a thorough description of the current status of the H-CSI field and a solid base for further development and research in the area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and systems for helping disabled people; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The quality of the individual papers and their organization into these parts make this volume fascinating reading, giving the reader a much deeper insight than he or she might glean from research papers or talks at conferences. It touches on all deep issues that ...

  14. A Human/Computer Learning Network to Improve Biodiversity Conservation and Research

    OpenAIRE

    Kelling, Steve; Gerbracht, Jeff; Fink, Daniel; Lagoze, Carl; Wong, Weng-Keen; Yu, Jun; Damoulas, Theodoros; Gomes, Carla

    2012-01-01

    In this paper we describe eBird, a citizen-science project that takes advantage of the human observational capacity to identify birds to species, which is then used to accurately represent patterns of bird occurrences across broad spatial and temporal extents. eBird employs artificial intelligence techniques such as machine learning to improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a Human-Computer Learning Network,...

  15. Computational Complexity and Human Decision-Making.

    Science.gov (United States)

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and of metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
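The intractability argument can be made concrete with a classic decision task. The sketch below is our illustration, not an example from the paper: it brute-forces a 0/1 knapsack choice, the exhaustive "rational" evaluation that examines all 2**n option sets and therefore scales exponentially with the number of items.

```python
# Illustrative sketch (not from the paper): a 'fully rational' decision
# over n items requires examining 2**n candidate option sets.
from itertools import combinations

def best_choice(values, weights, capacity):
    """Brute-force optimal decision: examine every subset of items."""
    n = len(values)
    best_value, best_set = 0, ()
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            w = sum(weights[i] for i in subset)
            v = sum(values[i] for i in subset)
            if w <= capacity and v > best_value:
                best_value, best_set = v, subset
    return best_value, best_set

values = [60, 100, 120]
weights = [10, 20, 30]
print(best_choice(values, weights, 50))   # → (220, (1, 2)); 2**3 = 8 subsets
```

Three items mean only 8 subsets, but 40 items already mean about 10**12, which is the sense in which exhaustive rational choice is computationally implausible for biological decision-makers.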

  16. Human ear recognition by computer

    CERN Document Server

    Bhanu, Bir; Chen, Hui

    2010-01-01

    Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden with hair, scarf and jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with the age and facial expressions. ""Human Ear Recognition by Computer"" is the first book o

  17. L'ordinateur a visage humain (The Computer in Human Guise).

    Science.gov (United States)

    Otman, Gabriel

    1986-01-01

    Discusses the tendency of humans to describe parts and functions of a computer with terminology that refers to human characteristics; for example, parts of the body (electronic brain), intellectual activities (optical memory), and physical activities (command). Computers are also described through metaphors, connotations, allusions, and analogies…

  18. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

  19. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  2. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    Science.gov (United States)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has increasingly been utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user’s interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on its surface. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, the sensory receptors within the skin for sensing tactile stimuli and electric currents are described, and several factors that influence how the electric signal is transmitted to the brain via human skin are explained.

  3. Human Memory Organization for Computer Programs.

    Science.gov (United States)

    Norcio, A. F.; Kerst, Stephen M.

    1983-01-01

    Results of a study investigating human memory organization in the processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance the organizational process of recall of statements in five Fortran…

  4. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  5. Treatment of human-computer interface in a decision support system

    International Nuclear Information System (INIS)

    Heger, A.S.; Duran, F.A.; Cox, R.G.

    1992-01-01

    One of the most challenging applications facing the computer community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. Applying this discipline to environmental restoration and waste management is further complicated by the nature of environmental data: the information required to manage the environmental impacts of human activity is fundamentally complex. This paper discusses the efforts at Sandia National Laboratories to develop an adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System is also discussed.

  6. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  7. Virtual reality/ augmented reality technology : the next chapter of human-computer interaction

    OpenAIRE

    Huang, Xing

    2015-01-01

    No matter how many different sizes and shapes computers take, their basic components remain the same. If we look at the history of computing from the user's perspective, we find, surprisingly, that it is the input and output devices that have led the industry's development; in a word, human-computer interaction has shaped the history of computing. Human-computer interaction has gone through three stages: the first stage relied on the inpu...

  8. Human-Computer Interaction The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are contributing to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections - Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents - the book covers a wealth of novel, original and fully updated material, offering: coherent, in-depth, and timely material on the agency perspective in HCI; an authoritative treatment of the subject matter presented by carefully selected authors; balanced and broad coverage of the subject area, including human, organizational, social, and technological concerns; and hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of resea...

  9. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    OpenAIRE

    Ancău Dorina; Roman Nicolae-Marius; Ancău Mircea

    2017-01-01

    Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  10. Object categorization: computer and human vision perspectives

    National Research Council Canada - National Science Library

    Dickinson, Sven J

    2009-01-01

    .... The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience...

  11. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  12. Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction

    Directory of Open Access Journals (Sweden)

    Shishkin S. L.

    2017-09-01

    Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain single output channels with their typical constraints imposed by the motor system’s information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of “eye mouse” superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer (EBCI) interface will have a chance to enable natural, fluent, and the

  13. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ancău Dorina

    2017-01-01

    Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  14. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that
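The simulation work the chapter reviews ultimately rests on numerically integrating equations of motion. As a minimal, hypothetical illustration (not a model from the chapter), the sketch below integrates a single rigid segment swinging under gravity, theta'' = -(g / L) * sin(theta), using semi-implicit Euler; the function `simulate` and its parameters are our own choices.

```python
# Illustrative sketch (not from the chapter): the core of a motion
# simulation is numerical integration of the equations of motion.
# A single segment of length L swings under gravity:
#   theta'' = -(g / L) * sin(theta)
import math

def simulate(theta0, L=0.4, g=9.81, dt=0.001, steps=1000):
    """Semi-implicit Euler integration of a pendulum-like limb segment."""
    theta, omega = theta0, 0.0           # angle (rad) and angular velocity
    for _ in range(steps):
        omega += -(g / L) * math.sin(theta) * dt   # update velocity first
        theta += omega * dt                        # then position
    return theta

# Starting 30 degrees from vertical, track the segment over one second.
final = simulate(math.radians(30.0))
print(round(final, 3))
```

Real sports-biomechanics models chain many such segments and add muscle torques and constraints, but the integrate-step-by-step structure is the same; the semi-implicit update keeps the oscillation's energy bounded, which a naive explicit Euler would not.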

  15. Human law and computer law: comparative perspectives

    CERN Document Server

    Hildebrandt, Mireille

    2014-01-01

    This book probes the epistemological and hermeneutic implications of data science and artificial intelligence for democracy and the Rule of Law, and the challenges posed by computing technologies to traditional legal thinking and the regulation of human affairs.

  16. Proxemics in Human-Computer Interaction

    OpenAIRE

    Greenberg, Saul; Hornbæk, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

    2014-01-01

    In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

  17. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  18. Brain-Computer Interfaces: Revolutionizing Human-Computer Interaction

    CERN Document Server

    Graimann, Bernhard; Allison, Brendan

    2010-01-01

    A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

  19. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  20. Feedback Loops in Communication and Human Computing

    NARCIS (Netherlands)

    op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas S.

    Building systems that are able to analyse communicative behaviours or take part in conversations requires a sound methodology in which the complex organisation of conversations is understood and tested on real-life samples. The data-driven approaches to human computing not only have a value for the

  1. Human-computer interaction handbook: fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

  2. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    Science.gov (United States)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area, allowing natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world experiencing the mixed computer-graphics fantasy-reality provided by the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  3. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
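
    The design described above, estimating the effect of home computers at the voucher-eligibility cutoff, can be sketched as a sharp regression discontinuity: fit a separate local linear regression on each side of the cutoff and take the jump in the fitted outcome at the cutoff as the treatment effect. This is a toy illustration of the design, not the paper's estimator, and the data below are synthetic.

```python
import numpy as np

def rd_estimate(running, outcome, cutoff, bandwidth):
    # Fit a separate line on each side of the cutoff (within the bandwidth)
    # and report the jump between the two fitted values at the cutoff.
    left = (running >= cutoff - bandwidth) & (running < cutoff)
    right = (running >= cutoff) & (running <= cutoff + bandwidth)
    fit_left = np.polyfit(running[left] - cutoff, outcome[left], 1)
    fit_right = np.polyfit(running[right] - cutoff, outcome[right], 1)
    return np.polyval(fit_right, 0.0) - np.polyval(fit_left, 0.0)

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 2000)   # running variable (e.g. rank around eligibility)
y = 0.5 * x + 2.0 * (x >= 0)       # synthetic outcome with a true jump of 2.0
effect = rd_estimate(x, y, cutoff=0.0, bandwidth=0.5)
```

    With noiseless synthetic data the recovered jump equals the true effect; with real, noisy data the choice of bandwidth becomes the central tuning decision.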

  4. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  5. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic, from August 29 to August 31, 2011. This conference was the third in the series, following IHCI 2009 and IHCI 2010, held in January at IIIT Allahabad, India. Human-computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion of the presented topics. It can be useful for researchers working on various aspects of human-computer interaction. Topics covered in this book include user interface and interaction, the theoretical background and applications of HCI, and data mining and knowledge discovery in support of HCI applications.

  6. Mobile human-computer interaction perspective on mobile learning

    CSIR Research Space (South Africa)

    Botha, Adèle

    2010-10-01

    Applying a Mobile Human Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...

  7. Object recognition in images by human vision and computer vision

    NARCIS (Netherlands)

    Chen, Q.; Dijkstra, J.; Vries, de B.

    2010-01-01

    Object recognition plays a major role in human behaviour research in the built environment. Computer based object recognition techniques using images as input are challenging, but not an adequate representation of human vision. This paper reports on the differences in object shape recognition

  8. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights:
    • A complete optimization process is established for digital human-computer interfaces of NPPs.
    • A quick convergence search method is proposed.
    • An affinity error probability mapping function is proposed to test human reliability.

    Abstract: This is the second in a series of papers describing the optimal design method for a digital human-computer interface of a nuclear power plant (NPP) from three different points based on human reliability. The purpose of this series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators, so human errors are easily triggered. To solve the problem, the authors propose an optimization process, a quick convergence search method, and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained experimentally. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick convergence search method for fuzzy segments divided by component quantity performs better than a general-purpose algorithm.

  9. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research connecting medical/psychological evidence on human abilities with the need in informatics to update current models in computer science to support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we develop the HIM design in detail. In our design, we first introduce the Content/Form computing system, a new principle extending present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) as its basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence, and Sheldrake's theory of "Nature as Alive" [22].

  10. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response pattern of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
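
    The stimulus-generation pipeline described above (transform, band-pass the coefficients, inverse transform) can be sketched with the ordinary 2-D Fourier transform standing in for the Fourier-Bessel transform, with an annular pass band playing the role of the radial-frequency filter. This is a simplified analogue for illustration, not the paper's FB implementation, and the random image below is a stand-in for a face photograph.

```python
import numpy as np

def radial_bandpass(img, f_lo, f_hi):
    # Band-pass in the ordinary 2-D Fourier domain: keep coefficients whose
    # radial frequency (in cycles per image) lies in [f_lo, f_hi).
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)   # radial distance from the DC term
    mask = (r >= f_lo) & (r < f_hi)        # annular pass band
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

rng = np.random.default_rng(0)
img = rng.standard_normal((64, 64))          # stand-in for a face image
filtered = radial_bandpass(img, 11.3, 16.0)  # the band where sensitivity peaked
```

    Because the DC term falls outside the pass band, the filtered image has zero mean; varying the band edges reproduces the kind of frequency sweep used in the psychophysical experiment.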

  11. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  12. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  13. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    Science.gov (United States)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  14. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

    Directory of Open Access Journals (Sweden)

    Antonios Liapis

    2016-07-01

    This article discusses the impact of artificially intelligent computers on the process of design, play and educational activities. A computational process which has the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus which triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrain the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimum. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.

  15. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in several cities of the European Mediterranean. The work analyses the main problems deriving from training treated as a business, which can derail the continuous growth of HCI, new technologies, and the tourism industry. Through a semiotic and epistemological study we detect current mistakes in the interrelations of the formal and factual sciences, as well as the human factors influencing professionals who develop interactive systems to safeguard and promote cultural heritage.

  16. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  17. Computer-based personality judgments are more accurate than those made by humans.

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.
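
    The accuracy criterion used in this study, the Pearson correlation between a judge's ratings and the targets' self-reported scores, is straightforward to compute. The numbers below are invented toy values for one trait across five targets, not the study's data; they merely illustrate how the computer-vs-human comparison is scored.

```python
import numpy as np

def judgment_accuracy(self_report, judge_scores):
    # Accuracy criterion: Pearson r between judge ratings and self-reports.
    return np.corrcoef(self_report, judge_scores)[0, 1]

self_report = np.array([3.2, 4.1, 2.5, 3.8, 4.6])  # targets' own trait scores
computer = np.array([3.0, 4.3, 2.7, 3.6, 4.4])     # model predictions (toy)
friend = np.array([4.0, 3.5, 3.0, 3.2, 4.1])       # human judge ratings (toy)

r_computer = judgment_accuracy(self_report, computer)
r_friend = judgment_accuracy(self_report, friend)
```

    In this toy example the model's ratings track the self-reports more closely than the friend's do, mirroring the direction of the study's r = 0.56 vs. r = 0.49 result.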

  18. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  19. Cognition beyond the brain: computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  20. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  1. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understand the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  2. Constructing a Computer Model of the Human Eye Based on Tissue Slice Images

    OpenAIRE

    Dai, Peishan; Wang, Boliang; Bao, Chunbo; Ju, Ying

    2010-01-01

    Computer simulation of biomechanical and biological heat transfer in ophthalmology greatly relies on having a reliable computer model of the human eye. This paper proposes a novel method for constructing a geometric model of the human eye based on tissue slice images. Slice images were obtained from an in vitro Chinese human eye using embryo specimen processing methods. A level set algorithm was used to extract contour points of eye tissues while a principal component analysi...

  3. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.
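
    A minimal sketch of the 'recognition-primed' loop described above: match the currently perceived cues against situation patterns stored from experience and return the associated action, falling back to slower deliberation when nothing is recognized. The cue names and the flat pattern list are illustrative assumptions, not MoCog1's actual representation.

```python
def recognition_primed_decision(cues, experience):
    # Return the action of the first stored situation whose cue pattern
    # is fully contained in the currently perceived cues.
    for pattern, action in experience:
        if pattern <= cues:        # subset test: all pattern cues present
            return action
    return "deliberate"            # no recognized situation: reason it out

# Illustrative experience store: (cue pattern, primed action) pairs.
experience = [({"smoke", "heat"}, "evacuate"), ({"alarm"}, "investigate")]
decision = recognition_primed_decision({"alarm", "noise"}, experience)
```

    Ordering the experience store from more specific to more general patterns makes the first match behave like the 'largely automatic response' the record describes.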

  4. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    Science.gov (United States)

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  5. Computers, the Human Mind, and My In-Laws' House.

    Science.gov (United States)

    Esque, Timm J.

    1996-01-01

    Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

  6. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  7. Homo ludens in the loop: playful human computation systems

    CERN Document Server

    Krause, Markus

    2014-01-01

    The human mind is incredible. It solves, with ease, problems that will elude machines even for decades to come. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how humans and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

  8. Computer-based personality judgments are more accurate than those made by humans

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  9. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  10. Computational 3-D Model of the Human Respiratory System

    Science.gov (United States)

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  11. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel, Ronan; Voyer, Damien; Nicolas, Laurent; Scorretti, Riccardo; Burais, Noël

    2011-01-01

    Computation of electromagnetic fields in high-resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.
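
    The combination named in this record, a Schwarz preconditioner inside a Krylov-subspace solver, can be illustrated on a small symmetric positive-definite system. The sketch below uses the simplest one-level, non-overlapping variant (equivalent to block Jacobi) inside preconditioned conjugate gradients; a real phantom computation would use overlapping subdomains, a discretized field problem, and a far larger system, so this is only a structural illustration.

```python
import numpy as np

def schwarz_apply(r, blocks, inv_blocks):
    # One-level additive Schwarz (non-overlapping = block Jacobi):
    # solve the local problem on each subdomain and sum the corrections.
    z = np.zeros_like(r)
    for idx, Ainv in zip(blocks, inv_blocks):
        z[idx] = Ainv @ r[idx]
    return z

def pcg(A, b, blocks, tol=1e-10, maxit=200):
    # Preconditioned conjugate gradients with the Schwarz preconditioner.
    inv_blocks = [np.linalg.inv(A[np.ix_(idx, idx)]) for idx in blocks]
    x = np.zeros_like(b)
    r = b - A @ x
    z = schwarz_apply(r, blocks, inv_blocks)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = schwarz_apply(r, blocks, inv_blocks)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system (1-D Laplacian) split into two subdomains.
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
blocks = [np.arange(0, 4), np.arange(4, 8)]
x = pcg(A, b, blocks)
```

    The payoff of the approach is that each local solve is small and independent, which is what makes it attractive for the very large systems arising from high-resolution phantoms.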

  12. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    Science.gov (United States)

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations by end users and network administrators in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio-recorded, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, and errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  13. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

This paper describes a software framework we designed and implemented for development and research in the area of multimodal human-computer interfaces. The proposed framework is based on a publish/subscribe architecture, which allows developers and researchers to conveniently configure, test and

  14. Distinguishing humans from computers in the game of go: A complex network approach

    Science.gov (United States)

    Coquidé, C.; Georgeot, B.; Giraud, O.

    2017-08-01

    We compare complex networks built from the game of go and obtained from databases of human-played games with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.
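The idea of separating human play from deterministic computer play by network statistics can be illustrated with a toy estimator. This sketch is hypothetical (the paper's actual pipeline builds networks over local board patterns): it builds a weighted network of consecutive moves from each corpus and compares the Shannon entropy of the edge weights, which collapses when a deterministic engine replays the same lines.

```python
# Toy sketch: move-transition networks from two corpora of games, compared by
# the Shannon entropy of their edge-weight distributions.
import math
import random
from collections import Counter

def move_network(games):
    """Count directed edges between consecutive moves across all games."""
    edges = Counter()
    for game in games:
        for a, b in zip(game, game[1:]):
            edges[(a, b)] += 1
    return edges

def edge_entropy(edges):
    """Shannon entropy (bits) of the normalized edge weights."""
    total = sum(edges.values())
    return -sum((w / total) * math.log2(w / total) for w in edges.values())

rng = random.Random(0)
moves = list(range(50))  # toy move vocabulary standing in for board points
humans = [[rng.choice(moves) for _ in range(30)] for _ in range(100)]
bot_line = [rng.choice(moves) for _ in range(30)]
bots = [bot_line[:] for _ in range(100)]  # deterministic: same line every game

# The human-based network is far more spread out than the deterministic one.
assert edge_entropy(move_network(humans)) > edge_entropy(move_network(bots))
```

A stochastic engine would land between the two extremes, which is the intuition behind the Turing-like test the abstract mentions.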

  15. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  16. Cognitive engineering in the design of human-computer interaction and expert systems

    International Nuclear Information System (INIS)

    Salvendy, G.

    1987-01-01

The 68 papers contributing to this book cover the following areas: Theories of Interface Design; Methodologies of Interface Design; Applications of Interface Design; Software Design; Human Factors in Speech Technology and Telecommunications; Design of Graphic Dialogues; Knowledge Acquisition for Knowledge-Based Systems; Design, Evaluation and Use of Expert Systems. This demonstrates the dual role of cognitive engineering. On the one hand, cognitive engineering is utilized to design computing systems that are compatible with human cognition and can be effectively and easily utilized by all individuals. On the other hand, cognitive engineering is utilized to transfer human cognition into the computer for the purpose of building expert systems. Two papers are of interest to INIS

  17. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

  18. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  19. Dual-Modality Imaging of the Human Finger Joint Systems by Using Combined Multispectral Photoacoustic Computed Tomography and Ultrasound Computed Tomography

    Directory of Open Access Journals (Sweden)

    Yubin Liu

    2016-01-01

Full Text Available We developed a homemade dual-modality imaging system that combines multispectral photoacoustic computed tomography and ultrasound computed tomography for reconstructing the structural and functional information of human finger joint systems. The fused multispectral photoacoustic-ultrasound computed tomography (MPAUCT) system was examined in phantom and in vivo experimental tests. The imaging results indicate that hard tissues such as bone and soft tissues including blood vessels, tendon, skin, and subcutaneous tissue in the finger joint system can be effectively recovered by using our multimodality MPAUCT system. The developed MPAUCT system is able to provide us with more comprehensive information about the human finger joints, which shows its potential for characterization and diagnosis of bone or joint diseases.

  20. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are discussed. Following this, the…

  1. Computed tomography of human joints and radioactive waste drums

    International Nuclear Information System (INIS)

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-01-01

X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life cycle of a product. Two diverse examples of CT are discussed: (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A&PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity

  2. Why computer games can be essential for human flourishing

    NARCIS (Netherlands)

    Fröding, B.; Peterson, M.B.

    2013-01-01

    Traditionally, playing computer games and engaging in other online activities has been seen as a threat to well-being, health and long-term happiness. It is feared that spending many hours per day in front of the screen leads the individual to forsake other, more worthwhile activities, such as human

  3. Developing Human-Computer Interface Models and Representation Techniques(Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  4. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    Science.gov (United States)

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure rarely have been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.
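The projection step this abstract describes (scored chemical-disease links folded into disease-disease edges) can be sketched in a few lines. The entries and the min-based scoring rule below are illustrative assumptions, not the paper's actual TDDB data or weighting scheme.

```python
# Minimal sketch of an environmental disease network (EDN): project scored
# chemical-disease links onto disease-disease edges, weighting each pair of
# diseases by the evidence of the chemicals they share.
from collections import defaultdict
from itertools import combinations

# (chemical, disease, evidence score) -- hypothetical toy entries
links = [
    ("bisphenol A", "behavioral disorder", 0.9),
    ("bisphenol A", "type 2 diabetes", 0.6),
    ("chlordane", "olfactory alteration", 0.4),
    ("chlordane", "type 2 diabetes", 0.5),
]

by_chemical = defaultdict(dict)
for chem, disease, score in links:
    by_chemical[chem][disease] = score

edn = defaultdict(float)
for chem, diseases in by_chemical.items():
    for d1, d2 in combinations(sorted(diseases), 2):
        # min() caps each pair's contribution at its weaker evidence link
        edn[(d1, d2)] += min(diseases[d1], diseases[d2])

# Diseases sharing a chemical are now linked, scored by shared evidence.
assert edn[("behavioral disorder", "type 2 diabetes")] == 0.6
```

With real data, high-scoring disease-disease edges would then be screened for previously uncharacterized comorbidity links, as in the T2D examples the abstract mentions.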

  5. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is that of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since the function of enhancers is clarified, but their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate more on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  6. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  7. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

Full Text Available From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning …) and the diversification of interaction devices can be seen as a factor of flexibility albeit introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour by providing end-users with an interaction-set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  8. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  9. Plants and Human Affairs: Educational Enhancement Via a Computer.

    Science.gov (United States)

    Crovello, Theodore J.; Smith, W. Nelson

    To enhance both teaching and learning in an advanced undergraduate elective course on the interrelationships of plants and human affairs, the computer was used for information retrieval, multiple choice course review, and the running of three simulation models--plant related systems (e.g., the rise in world coffee prices after the 1975 freeze in…

  10. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  11. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

    OpenAIRE

    Lopes, Arminda Guerra

    2016-01-01

ABSTRACT Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, ...

  12. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  13. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

Full Text Available In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted where participants conducted a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.
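The core analysis step the abstract describes, correlating a physiological measure with self-reported suspicion or trust, can be illustrated with a plain Pearson correlation. The numbers below are toy values, not the study's data.

```python
# Toy sketch: correlate a per-session physiological feature (e.g. a mean
# fNIRS oxygenation change) with a self-reported suspicion score.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

fnirs_feature = [0.2, 0.5, 0.9, 1.4, 1.8]   # hypothetical session features
suspicion_score = [1, 2, 3, 4, 5]           # hypothetical survey responses
r = pearson(fnirs_feature, suspicion_score)
assert r > 0.99  # the toy feature tracks the toy score almost linearly
```

In practice one would also report a significance level and correct for multiple sensors and sessions; this only shows the shape of the computation.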

  14. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  15. Eye Tracking Based Control System for Natural Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Xuebai Zhang

    2017-01-01

Full Text Available Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of eye tracking technique in user-computer dialogue, a novel eye control system with integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by only using user’s eye. The usage flow of the proposed system is designed to perfectly follow human natural habits. Additionally, a magnifier module is proposed to allow the accurate operation. In the experiment, two interactive tasks with different difficulty (searching article and browsing multimedia web) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  16. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    Science.gov (United States)

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of eye tracking technique in user-computer dialogue, a novel eye control system with integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by only using user's eye. The usage flow of the proposed system is designed to perfectly follow human natural habits. Additionally, a magnifier module is proposed to allow the accurate operation. In the experiment, two interactive tasks with different difficulty (searching article and browsing multimedia web) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.
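A common building block of eye-only mouse control such as the system described in these two records is dwell-based selection: a fixation held within a small radius for a threshold duration triggers a click. The detector below is an assumption about the mechanism (the paper does not publish code), not the authors' implementation.

```python
# Illustrative dwell-click detector for gaze input: emit a click when the
# gaze stays within `radius` pixels of an anchor point for `dwell_ms`.
import math

def dwell_clicks(samples, radius=30.0, dwell_ms=800.0):
    """samples: iterable of (time_ms, x, y). Returns list of click points."""
    clicks, anchor = [], None
    for t, x, y in samples:
        if anchor is None or math.hypot(x - anchor[1], y - anchor[2]) > radius:
            anchor = (t, x, y)          # saccade: restart the dwell timer
        elif t - anchor[0] >= dwell_ms:
            clicks.append((anchor[1], anchor[2]))
            anchor = None               # fire once, then re-arm
    return clicks

# A steady ~900 ms fixation near (100, 100), then a saccade away.
gaze = [(i * 100.0, 100.0 + (i % 3), 100.0) for i in range(10)] \
       + [(1000.0, 400.0, 400.0)]
assert dwell_clicks(gaze) == [(100.0, 100.0)]
```

A magnifier module like the one the abstract mentions would shrink the effective `radius` needed by enlarging the target region before the dwell is evaluated.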

  17. Evidence Report: Risk of Inadequate Human-Computer Interaction

    Science.gov (United States)

    Holden, Kritina; Ezer, Neta; Vos, Gordon

    2013-01-01

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments, and from other safety-critical domains like ground-based power plants, and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes, (2) Information resources and support, (3) Allocation of attention, (4) Cognitive overload, (5) Environmentally induced perceptual changes, (6) Misperception and misinterpretation of displayed information, (7) Spatial disorientation, and (8) Displays and controls.

  18. Experimental evaluation of multimodal human computer interface for tactical audio applications

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.; Jovanov, E.; Oy, S.

    2002-01-01

    Mission critical and information overwhelming applications require careful design of the human computer interface. Typical applications include night vision or low visibility mission navigation, guidance through a hostile territory, and flight navigation and orientation. Additional channels of

  19. Electromagnetic Modeling of Human Body Using High Performance Computing

    Science.gov (United States)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

Realistic simulation of electromagnetic wave propagation in the actual human body can expedite investigation of implanted devices harvesting wireless power coupled from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom that is characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  20. Aspects of computer control from the human engineering standpoint

    International Nuclear Information System (INIS)

    Huang, T.V.

    1979-03-01

A computer control system includes data acquisition, information display, and output control signals. In order to design such a system effectively we must first determine the required operational mode: automatic control (closed loop), computer assisted (open loop), or hybrid control. The choice of operating mode will depend on the nature of the plant, the complexity of the operation, the funds available, and the technical expertise of the operating staff, among many other factors. Once the mode has been selected, consideration must be given to the method (man/machine interface) by which the operator interacts with the system. The human engineering factors are of prime importance to achieving high operating efficiency, and very careful attention must be given to this aspect of the work if full operator acceptance is to be achieved. This paper will discuss these topics and will draw on experience gained in setting up the computer control system in the Main Control Center for Stanford University's Accelerator Center (a high energy physics research facility)

  1. The Study on Human-Computer Interaction Design Based on the Users’ Subconscious Behavior

    Science.gov (United States)

    Li, Lingyuan

    2017-09-01

    Human-computer interaction is human-centered. An excellent interaction design should focus on the study of user experience, which largely derives from consistency between the design and human behavioral habits. However, users' behavioral habits often stem from the subconscious. It is therefore smart to exploit users' subconscious behavior to achieve the design's intention and maximize the value of a product's functions, which is gradually becoming a new trend in this field.

  2. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering-I; verification and validation of software; methods for modeling human/computer performance; human/computer interaction problems in producing procedures-1-2; progress and problems with automation-1-2; experience with electronic presentation of procedures-2; intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids

  3. A PREDICTIVE STUDY: CARBON MONOXIDE EMISSION MODELING AT A SIGNALIZED INTERSECTION

    Directory of Open Access Journals (Sweden)

    FREDDY WEE LIANG KHO

    2014-02-01

    The CAL3QHC dispersion model was used to predict present and future carbon monoxide (CO) levels at a busy signalized intersection. This study attempted to identify CO "hot-spots" in areas near the intersection during typical A.M. and P.M. peak hours. A CO concentration "hot-spot" was identified at 101 Commercial Park, where simulated maximum 1-hour Time-Weighted Average (1-h TWA) ground-level CO concentrations of 18.3 ppm and 18.6 ppm were observed during the A.M. and P.M. peaks, respectively, in 2006. This study shows that there would be no significant increase in CO level by 2014, although a substantial increase in the number of vehicles is assumed to affect CO levels. It was also found that CO levels would remain well below the Malaysian Ambient Air Quality Guideline of 30 ppm (1-h TWA). Comparisons between measured and simulated CO levels using quantitative data analysis techniques and statistical methods indicated that the CAL3QHC dispersion model correlated well with measured data.

  4. High-temperature stability of electron transport in semiconductors with strong spin-orbital interaction

    Science.gov (United States)

    Tomaka, G.; Grendysa, J.; ŚliŻ, P.; Becker, C. R.; Polit, J.; Wojnarowska, R.; Stadler, A.; Sheregii, E. M.

    2016-05-01

    Experimental results of magnetotransport measurements (longitudinal magnetoresistance Rxx and Hall resistance Rxy) are presented over a wide temperature interval for several samples of Hg1-xCdxTe (x ≈ 0.13-0.15) grown by MBE: strained and unstrained thin layers (about 100 nm thick) and thick layers of about 1 μm. A remarkable temperature stability of the SdH-oscillation period and amplitude is observed over the entire measured temperature interval, up to 50 K. Moreover, quantum Hall effect (QHE) behavior of the Hall resistance is registered in the same temperature interval. These peculiarities of Rxx and Rxy for the strained thin layers are interpreted in terms of quantum Hall conductivity (QHC) on topologically protected surface states (TPSS) [C. Brüne et al., Phys. Rev. Lett. 106, 126803 (2011), 10.1103/PhysRevLett.106.126803]. In the case of the unstrained layers, it is assumed that QHC on the TPSS (or on resonant interface states) also contributes to the conductance of the bulk samples.

  5. Engageability: a new sub-principle of the learnability principle in human-computer interaction

    Directory of Open Access Journals (Sweden)

    B Chimbo

    2011-12-01

    The learnability principle relates to improving the usability of software, as well as users' performance and productivity. A gap has been identified: the current definition of the principle does not distinguish between users of different ages. To determine the extent of this gap, this article compares the ways in which two user groups, adults and children, learn to use an unfamiliar software application. In doing so, we bring together the research areas of human-computer interaction (HCI), adult and child learning, learning theories and strategies, usability evaluation, and interaction design. A literature survey on learnability and learning processes considered the meaning of learnability of software applications across generations. In an empirical investigation, users aged 9 to 12 and 35 to 50 were observed in a usability laboratory while learning to use educational software applications. Insights that emerged from the data analysis showed the different tactics and approaches that children and adults use when learning unfamiliar software. Eye tracking data was also recorded. Findings indicated that a subtle re-interpretation of the learnability principle and its associated sub-principles was required. An additional sub-principle, namely engageability, was proposed to incorporate aspects of learnability that are not covered by the existing sub-principles. Our re-interpretation of the learnability principle and the resulting design recommendations should help designers fulfill the varying needs of different-aged users and improve the learnability of their designs. Keywords: Child-computer interaction, Design principles, Eye tracking, Generational differences, Human-computer interaction, Learning theories, Learnability, Engageability, Software applications, Usability. Disciplines: Human-Computer Interaction (HCI) Studies, Computer science, Observational studies

  6. Intermittent control: a computational theory of human control.

    Science.gov (United States)

    Gawthrop, Peter; Loram, Ian; Lakie, Martin; Gollee, Henrik

    2011-02-01

    The paradigm of continuous control using internal models has advanced understanding of human motor control. However, this paradigm ignores some aspects of human control, including intermittent feedback, serial ballistic control, triggered responses and refractory periods. It is shown that event-driven intermittent control provides a framework to explain the behaviour of the human operator under a wider range of conditions than continuous control. Continuous control is included as a special case, but sampling, system matched hold, an intermittent predictor and an event trigger allow serial open-loop trajectories using intermittent feedback. The implementation here may be described as "continuous observation, intermittent action". Beyond explaining unimodal regulation distributions in common with continuous control, these features naturally explain refractoriness and bimodal stabilisation distributions observed in double stimulus tracking experiments and quiet standing, respectively. Moreover, given that human control systems contain significant time delays, a biological-cybernetic rationale favours intermittent over continuous control: intermittent predictive control is computationally less demanding than continuous predictive control. A standard continuous-time predictive control model of the human operator is used as the underlying design method for an event-driven intermittent controller. It is shown that when event thresholds are small and sampling is regular, the intermittent controller can masquerade as the underlying continuous-time controller and thus, under these conditions, the continuous-time and intermittent controller cannot be distinguished. This explains why the intermittent control hypothesis is consistent with the continuous control hypothesis for certain experimental conditions.
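    The "continuous observation, intermittent action" architecture described above can be sketched in a few lines. The toy below is our own illustration, not the authors' model: a first-order plant under a controller that observes every step but recomputes its action only when the tracking error crosses an event threshold, with a plain zero-order hold standing in for the system-matched hold. The plant parameters, gain, and threshold values are invented for the example.

```python
# Illustrative sketch (assumed parameters, not the paper's model):
# event-driven intermittent control of the plant dx/dt = a*x + b*u.

def simulate(threshold=0.05, a=-1.0, b=1.0, dt=0.01, steps=2000,
             setpoint=1.0, gain=2.0):
    x, u, events = 0.0, 0.0, 0
    for _ in range(steps):
        error = setpoint - x
        if abs(error) > threshold:   # event trigger fires
            u = gain * error         # recompute the control action
            events += 1
        # otherwise the previous action is held (zero-order hold)
        x += (a * x + b * u) * dt    # Euler step of the plant
    return x, events

x_small, e_small = simulate()                 # small threshold: near-continuous
x_large, e_large = simulate(threshold=0.5)    # large threshold: few events
```

With a small threshold the controller fires on almost every sample and masquerades as the underlying continuous controller, mirroring the indistinguishability claim above; raising the threshold produces long open-loop stretches with far fewer control recomputations.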

  7. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Three-dimensional representations stimulate the cognitive processes that are the core and foundation of human-computer interaction (HCI). These cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his or her environment. Although BCI research started in the 1960s, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate the user's mental state to increase the differentiation between control and noncontrol modalities.

  8. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  9. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.

  10. Impact of familiarity on information complexity in human-computer interfaces

    Directory of Open Access Journals (Sweden)

    Bakaev Maxim

    2016-01-01

    A quantitative measure of information complexity remains very much desirable in the HCI field, since it may aid in the optimization of user interfaces, especially in human-computer systems for controlling complex objects. Our paper is dedicated to exploring the subjective (subject-dependent) aspect of complexity, conceptualized as information familiarity. Although research on familiarity in human cognition and behaviour has been done in several fields, the accepted models in HCI, such as the Human Processor or the Hick-Hyman law, do not generally consider this issue. In our experimental study the subjects performed search and selection of digits and letters, whose familiarity was conceptualized as frequency of occurrence in numbers and texts. The analysis showed a significant effect of information familiarity on selection time and throughput in regression models, although the R2 values were somewhat low. Still, we hope that our results might aid in the quantification of information complexity and its further application to optimizing interaction in human-machine systems.
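    The Hick-Hyman law mentioned in the record predicts that mean selection time grows linearly with the information content (entropy) of the choice set, T = a + b*H, which is one way frequency of occurrence, the paper's proxy for familiarity, can enter a selection-time model. A minimal sketch, with illustrative (not fitted) coefficients a and b:

```python
import math

def entropy(probs):
    """Shannon entropy H = sum p*log2(1/p), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def selection_time(probs, a=0.2, b=0.15):
    """Hick-Hyman prediction T = a + b*H; a [s] and b [s/bit] are made up."""
    return a + b * entropy(probs)

uniform = [0.25] * 4                  # unfamiliar: all four items equally likely
skewed = [0.70, 0.15, 0.10, 0.05]     # familiar items dominate the distribution

t_uniform = selection_time(uniform)   # H = 2.0 bits
t_skewed = selection_time(skewed)     # H < 2.0 bits, so a faster prediction
```

A skewed frequency distribution has lower entropy than a uniform one over the same items, so the law predicts shorter selection times when familiar items dominate, consistent with the direction of the familiarity effect reported above.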

  11. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we developed a novel sequential decision-making task, framed as virtual foraging, in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. fMRI signals in medial prefrontal cortex (MPFC) relate to the heuristic and optimal policies and their associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between the heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.

  12. Investigation and evaluation into the usability of human-computer interfaces using a typical CAD system

    Energy Technology Data Exchange (ETDEWEB)

    Rickett, J D

    1987-01-01

    This research program covers three topics relating to the human-computer interface: voice recognition, tools and techniques for evaluation, and user and interface modeling. An investigation into the implementation of voice-recognition technologies examines how voice recognizers may be evaluated in commercial software. A prototype system was developed in collaboration with FEMVIEW Ltd. (marketer of a CAD package). A theoretical approach to evaluation leads to the hypothesis that human-computer interaction is affected by personality, which influences the type of dialogue, the preferred method of providing help, etc. A user model based on personality traits, or habitual behaviour patterns (HBP), is presented. Finally, a practical framework is provided for the evaluation of human-computer interfaces. It suggests that evaluation is an integral part of design and that the iterative use of evaluation techniques throughout the conceptualization, design, implementation, and post-implementation stages will ensure systems that satisfy the needs of users and fulfill the goal of usability.

  13. Eyewear Computing – Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

    OpenAIRE

    Bulling, Andreas; Cakmakci, Ozan; Kunze, Kai; Rehg, James M.

    2016-01-01

    The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays. The seminar welcomed 30 academic and industry researchers from Europe, the US, and Asia with a diverse background, including wearable and ubiquitous computing, computer vision, developmental psychology, optics, and human-computer interaction. In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to...

  14. Human computer interaction using hand gestures

    CERN Document Server

    Premaratne, Prashan

    2014-01-01

    Human-computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards, mice, and remote controls alienate old and new generations alike from control interfaces. Hand gesture recognition systems bring hope of connecting people with machines in a natural way, enabling consumers to use their hands naturally to communicate with any electronic equipment in their 'lounge'. This monograph covers state-of-the-art hand gesture recognition approaches and how they have evolved since their inception. The author also details his research in this area over the past 8 years and how the future of HCI might unfold. This monograph will serve as a valuable guide for researchers venturing into the world of HCI.

  15. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de]

  16. A hybrid approach to the computational aeroacoustics of human voice production

    Czech Academy of Sciences Publication Activity Database

    Šidlof, Petr; Zörner, S.; Huppe, A.

    2015-01-01

    Roč. 14, č. 3 (2015), s. 473-488 ISSN 1617-7959 R&D Projects: GA ČR(CZ) GAP101/11/0207 Institutional support: RVO:61388998 Keywords : computational aeroacoustics * parallel CFD * human voice * vocal folds * ventricular folds Subject RIV: BI - Acoustics Impact factor: 3.032, year: 2015

  17. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

    Science.gov (United States)

    Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

    2000-10-01

    In the 1980s, the visual display terminal (VDT) was introduced into workplaces in many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat panel display, or notebook personal computer (PC), has become the most remarkable feature of modern workplaces with VDTs, and even of homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems arising from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs in terms of machine layout, workstation design, and lighting conditions, among others, should differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee came up with a set of guidelines for notebook PC use following exploratory discussions of its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction, under the auspices of the International Ergonomics Association, worked towards international issuance of the guidelines. This paper unveils the result of this collaborative effort.

  18. The Socioemotional Effects of a Computer-Simulated Animal on Children's Empathy and Humane Attitudes

    Science.gov (United States)

    Tsai, Yueh-Feng Lily; Kaufman, David M.

    2009-01-01

    This study investigated the potential of using a computer-simulated animal in a handheld virtual pet videogame to improve children's empathy and humane attitudes. Also investigated was whether sex differences existed in children's development of empathy and humane attitudes resulting from play, as well as their feelings for a virtual pet. The…

  19. Human Factors Principles in Design of Computer-Mediated Visualization for Robot Missions

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; David J Bruemmer

    2008-12-01

    With increased use of robots as a resource in missions supporting countermine, improvised explosive devices (IEDs), and chemical, biological, radiological nuclear and conventional explosives (CBRNE), fully understanding the best means by which to complement the human operator’s underlying perceptual and cognitive processes could not be more important. Consistent with control and display integration practices in many other high technology computer-supported applications, current robotic design practices rely highly upon static guidelines and design heuristics that reflect the expertise and experience of the individual designer. In order to use what we know about human factors (HF) to drive human robot interaction (HRI) design, this paper reviews underlying human perception and cognition principles and shows how they were applied to a threat detection domain.

  20. Histomorphometric quantification of human pathological bones from synchrotron radiation 3D computed microtomography

    International Nuclear Information System (INIS)

    Nogueira, Liebert P.; Braz, Delson

    2011-01-01

    Conventional bone histomorphometry is an important method for quantitative evaluation of bone microstructure. X-ray computed microtomography is a noninvasive technique that can be used to evaluate histomorphometric indices of trabecular bone (BV/TV, BS/BV, Tb.N, Tb.Th, Tb.Sp). In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional method, in which quantification is performed on 2D slices and extrapolated to the 3D case. In this work, histomorphometric quantification using synchrotron 3D X-ray computed microtomography was performed on pathological samples of human bone. Samples of human bone were cut into small blocks (8 mm x 8 mm x 10 mm) with a precision saw and then imaged. The computed microtomographies were obtained at the SYRMEP (SYnchrotron Radiation for MEdical Physics) beamline at the ELETTRA synchrotron radiation facility (Italy). The obtained 3D images yielded excellent resolution and detail of intra-trabecular bone structures, including the marrow present inside trabeculae. The histomorphometric results were also compared with the literature. (author)
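    The indices listed in the record can be computed directly from a segmented (binary) 3D volume. The sketch below is our own illustration, not the study's code: it counts bone voxels and exposed voxel faces in a nested-list volume and derives BV/TV, BS/BV, and a plate-model Tb.Th = 2*BV/BS; the toy geometry and unit voxel size are assumptions.

```python
# Minimal histomorphometric indices from a binary voxel volume (1 = bone).
def indices(vol, voxel=1.0):
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])

    def at(z, y, x):
        # outside the scanned volume counts as marrow
        if 0 <= z < nz and 0 <= y < ny and 0 <= x < nx:
            return vol[z][y][x]
        return 0

    bv = bs = 0
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if vol[z][y][x]:
                    bv += 1
                    for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                        if not at(z + dz, y + dy, x + dx):
                            bs += 1  # bone face exposed to marrow
    BV = bv * voxel ** 3             # bone volume
    TV = nz * ny * nx * voxel ** 3   # total volume
    BS = bs * voxel ** 2             # bone surface
    return {"BV/TV": BV / TV, "BS/BV": BS / BV, "Tb.Th": 2 * BV / BS}

# A single 2x2x2 bone cube inside a 4x4x4 volume:
vol = [[[1 if (z < 2 and y < 2 and x < 2) else 0 for x in range(4)]
        for y in range(4)] for z in range(4)]
m = indices(vol)  # BV/TV = 8/64, BS = 24 faces
```

Real microCT pipelines refine the thickness estimate (e.g. sphere-fitting rather than the plate model) and mesh the surface rather than counting voxel faces, but the volume-fraction logic is the same.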

  1. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Among the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army employs a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an HCI style guide unique to Army weapon systems. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. Its purpose is to provide HCI design guidance for RT/NRT Army systems across the weapon system domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing domain-specific style guides, which will be used to guide the development of future systems within that domain.

  2. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation.

    Directory of Open Access Journals (Sweden)

    Jiang Shu

    MicroRNAs were long considered to be synthesized endogenously, until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin, although the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported to human blood and tissues has created a high volume of interest in the fundamental questions of which exogenous microRNAs can be transferred into human circulation, how, and whether they can exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and to infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, 117 of which have sequences identical to their human homologs and 73 of which are known to be associated with exosomes. Through a milk-feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing complex mechanistic details.

  3. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation

    Science.gov (United States)

    Shu, Jiang; Chiang, Kevin; Zempleni, Janos; Cui, Juan

    2015-01-01

    MicroRNAs were long considered to be synthesized endogenously, until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin, although the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported to human blood and tissues has created a high volume of interest in the fundamental questions of which exogenous microRNAs can be transferred into human circulation, how, and whether they can exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and to infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, 117 of which have sequences identical to their human homologs and 73 of which are known to be associated with exosomes. Through a milk-feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing complex mechanistic details. PMID:26528912

  4. Human-computer interfaces applied to numerical solution of the Plateau problem

    Science.gov (United States)

    Elias Fabris, Antonio; Soares Bandeira, Ivana; Ramos Batista, Valério

    2015-09-01

    In this work we present Matlab code to solve the Plateau problem numerically, and the code will include a human-computer interface. The Plateau problem has applications in areas of knowledge such as computer graphics. The solution method is the same as that of the Surface Evolver, but the difference is a complete graphical interface with the user. This will enable us to implement other kinds of interface, such as ocular mouse, voice, and touch. To date, Evolver does not include any graphical interface, which restricts its use by the scientific community; in particular, its use is practically impossible for most physically challenged people.

  5. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

  6. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  7. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in the methods.
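    The "Monte Carlo integration of a point kernel" mentioned above admits a compact illustration. The sketch below is our own toy, not the code system described in the record: it averages the uncollided point kernel exp(-mu*r)/(4*pi*r^2) from a point source over points sampled uniformly in a spherical target organ. The attenuation coefficient, geometry, and sample count are invented, and buildup and scatter are ignored.

```python
import math
import random

def mean_kernel(source, center, radius, mu=0.1, n=20000, seed=1):
    """Monte Carlo average of the uncollided point kernel over a sphere."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # rejection-sample a point uniformly inside the spherical organ
        while True:
            p = [center[i] + radius * (2 * rng.random() - 1) for i in range(3)]
            if sum((p[i] - center[i]) ** 2 for i in range(3)) <= radius ** 2:
                break
        r = math.sqrt(sum((p[i] - source[i]) ** 2 for i in range(3)))
        total += math.exp(-mu * r) / (4 * math.pi * r ** 2)  # attenuation / inverse square
    return total / n

# Organs (radius 2) at two distances from a point source at the origin:
k_near = mean_kernel(source=(0, 0, 0), center=(10, 0, 0), radius=2)
k_far = mean_kernel(source=(0, 0, 0), center=(20, 0, 0), radius=2)
```

A production code would multiply such a kernel by source strength and a mass-energy absorption coefficient, and would add buildup factors for scattered photons; the sampling-and-averaging structure is the part this sketch shows.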

  8. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on theoretical and practical areas of interaction and communication design for research experts and industry practitioners, and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  9. A Chinese Visible Human-based computational female pelvic phantom for radiation dosimetry simulation

    International Nuclear Information System (INIS)

    Nan, H.; Jinlu, S.; Shaoxiang, Z.; Qing, H.; Li-wen, T.; Chengjun, G.; Tang, X.; Jiang, S. B.; Xiano-lin, Z.

    2010-01-01

    An accurate voxel phantom is needed for dosimetric simulation in radiation therapy for malignant tumors in the female pelvic region. However, most existing voxel phantoms are constructed on the basis of Caucasian or other non-Chinese populations. Materials and Methods: A computational framework for constructing a female pelvic voxel phantom for radiation dosimetry was developed based on the Chinese Visible Human datasets. First, several organs within the pelvic region were segmented from the Chinese Visible Human datasets. Then, polygonization and voxelization were performed on the segmented organs, and a 3D computational phantom was built in the form of a set of voxel arrays. Results: The generated phantom can be converted and loaded into a treatment planning system for radiation dosimetry calculation. From the observed dosimetric results for those organs and structures, we can evaluate their absorbed dose and carry out simulation studies. Conclusion: A female pelvic voxel phantom was developed from the Chinese Visible Human datasets. It can be utilized for dosimetry evaluation and planning simulation, which should help improve clinical performance and reduce radiation toxicity to organs at risk.

  10. An Efficient and Secure m-IPS Scheme of Mobile Devices for Human-Centric Computing

    Directory of Open Access Journals (Sweden)

    Young-Sik Jeong

    2014-01-01

    Full Text Available Recent rapid developments in wireless and mobile IT technologies have led to their application in many real-life areas, such as disasters, home networks, mobile social networks, medical services, industry, schools, and the military. Business/work environments have become wired/wireless, integrated with wireless networks. Although the increase in the use of mobile devices that can access wireless networks increases work efficiency and provides greater convenience, wireless access to networks represents a security threat. Currently, wireless intrusion prevention systems (IPSs) are used to prevent wireless security threats. However, these are not an ideal security measure for businesses that utilize mobile devices because they do not take temporal-spatial and role information factors into account. Therefore, in this paper, an efficient and secure mobile-IPS (m-IPS) is proposed for businesses utilizing mobile devices in mobile environments for human-centric computing. The m-IPS system incorporates temporal-spatial awareness in human-centric computing with various mobile devices and checks users' temporal-spatial information, profiles, and role information to provide precise access control. The application of the m-IPS can also be extended to the Internet of Things (IoT), one of the key advanced technologies for fully supporting human-centric computing environments in real ubiquitous settings with mobile devices.

  11. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases, and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to accurately predict the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
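
    The conservation idea behind SIFT can be illustrated with a toy scorer: a substitution at a strongly conserved alignment column is predicted to be deleterious. The miniature alignment and frequency threshold below are made up for illustration; SIFT itself computes normalised position-specific probabilities, not raw column frequencies.

    ```python
    # Toy conservation-based tolerance test: a substitution is "tolerated"
    # if it already occurs at that column of an alignment of homologues.
    # Alignment and threshold are hypothetical, a simplification of SIFT.
    def tolerated(alignment, position, substitution, min_freq=0.05):
        column = [seq[position] for seq in alignment]
        freq = column.count(substitution) / len(column)
        return freq >= min_freq

    homologues = ["MKVLAG", "MKVLSG", "MKILAG", "MKVLTG"]
    variable_ok = tolerated(homologues, 4, "S")    # column {A, S, A, T}: varies
    conserved_bad = tolerated(homologues, 1, "R")  # column is invariant K
    ```

    The sketch also shows the stated limitation: with too few homologues in `alignment`, column frequencies carry little information about conservation.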

  12. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called the Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It helps analysts find multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links among causal factors. In addition, it is based on a complete accident model; hence, it helps analysts conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual and causal factors are required to ensure its practical use.

  13. Distribution of absorbed dose in human eye simulated by SRNA-2KG computer code

    International Nuclear Information System (INIS)

    Ilic, R.; Pesic, M.; Pavlovic, R.; Mostacci, D.

    2003-01-01

    The rapidly increasing performance of personal computers and the development of codes for proton transport based on Monte Carlo methods will very soon allow the introduction of computer-planned proton therapy as a normal activity in regular hospital procedures. A description of the SRNA code used for such applications and the results of calculated distributions of proton absorbed dose in the human eye are given in this paper. (author)

  14. Assessing Human Judgment of Computationally Generated Swarming Behavior

    Directory of Open Access Journals (Sweden)

    John Harvey

    2018-02-01

    Full Text Available Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized it as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we evaluate subjects' assessment of the behavior of a simplified version of Reynolds' model. Factors that affect the identification of swarming are discussed, and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.
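
    Reynolds' model, in its simplified form, steers each boid by three local rules: cohesion toward the flock centroid, alignment with the average heading, and separation from close neighbours. A sketch of one update step; the weights and radius are purely illustrative, not the parameters used in the study.

    ```python
    # One update step of a simplified 2-D boids model (cohesion, alignment,
    # separation). Weights, radius and time step are illustrative.
    import math

    def step(positions, velocities, dt=1.0, w_cohesion=0.05,
             w_alignment=0.1, w_separation=0.2, sep_radius=2.0):
        n = len(positions)
        cx = sum(p[0] for p in positions) / n  # flock centroid
        cy = sum(p[1] for p in positions) / n
        avx = sum(v[0] for v in velocities) / n  # average heading
        avy = sum(v[1] for v in velocities) / n
        new_pos, new_vel = [], []
        for (x, y), (vx, vy) in zip(positions, velocities):
            ax = w_cohesion * (cx - x) + w_alignment * (avx - vx)
            ay = w_cohesion * (cy - y) + w_alignment * (avy - vy)
            for (ox, oy) in positions:  # separation: push away from neighbours
                d = math.hypot(x - ox, y - oy)
                if 0 < d < sep_radius:
                    ax += w_separation * (x - ox) / d
                    ay += w_separation * (y - oy) / d
            vx, vy = vx + ax * dt, vy + ay * dt
            new_vel.append((vx, vy))
            new_pos.append((x + vx * dt, y + vy * dt))
        return new_pos, new_vel

    pos = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
    vel = [(0.0, 0.0)] * 4
    pos2, vel2 = step(pos, vel)  # cohesion draws the corners inward
    ```

    Tuning the three weights changes how "flock-like" the motion appears, which is exactly the knob such a study turns when probing human judgment of swarming.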

  15. Multi-step EMG Classification Algorithm for Human-Computer Interaction

    Science.gov (United States)

    Ren, Peng; Barreto, Armando; Adjouadi, Malek

    A three-electrode human-computer interaction system, based on digital processing of the electromyogram (EMG) signal, is presented. This system can effectively help disabled individuals paralyzed from the neck down to interact with computers or communicate with people through computers using point-and-click graphic interfaces. The three electrodes are placed on the right frontalis, the left temporalis and the right temporalis muscles of the head, respectively. The signal processing algorithm translates the EMG signals during five kinds of facial movements (left jaw clenching, right jaw clenching, eyebrows up, eyebrows down, simultaneous left & right jaw clenching) into five corresponding types of cursor movements (left, right, up, down and left-click), to provide basic mouse control. The classification strategy is based on three principles: the EMG energy of one channel is typically larger than that of the others during one specific muscle contraction; the spectral characteristics of the EMG signals produced by the frontalis and temporalis muscles during different movements are different; and the EMG signals from adjacent channels typically have correlated energy profiles. The algorithm is evaluated on 20 pre-recorded EMG signal sets, using Matlab simulations. The results show that this method provides improvements and is more robust than previous approaches.
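
    The first and third principles (dominant channel energy; correlated energy on adjacent channels) can be sketched as a rule-based decision. The thresholds and channel-to-command mapping below are illustrative assumptions, and the spectral test of the second principle is omitted.

    ```python
    # Rule-based sketch: the dominant-energy channel names the movement;
    # comparable energy on both temporalis channels signals a simultaneous
    # clench (left-click). Thresholds and mapping are hypothetical.
    def channel_energy(samples):
        return sum(s * s for s in samples)

    def classify(frontalis, left_temporalis, right_temporalis, threshold=1.0):
        energies = {
            "up_down": channel_energy(frontalis),       # eyebrow movements
            "left": channel_energy(left_temporalis),    # left jaw clench
            "right": channel_energy(right_temporalis),  # right jaw clench
        }
        best = max(energies, key=energies.get)
        if energies[best] < threshold:
            return "rest"  # no muscle contraction strong enough
        lo, hi = sorted([energies["left"], energies["right"]])
        if best in ("left", "right") and lo > 0.8 * hi:
            return "click"  # correlated temporalis energy: both jaws clenched
        return best
    ```

    A real implementation would compute energies over sliding windows and add the spectral discrimination between frontalis and temporalis activity.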

  16. Appearance-based human gesture recognition using multimodal features for human computer interaction

    Science.gov (United States)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a vital role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous work has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework, which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings, drawn from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused in a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.

  17. Human brain as the model of a new computer system. II

    Energy Technology Data Exchange (ETDEWEB)

    Holtz, K; Langheld, E

    1981-12-09

    For Pt. I see ibid., Vol. 29, No. 22, p. 13 (1981). The authors describe the self-generating system of connections of a self-teaching, program-free associative computer. The self-generating systems of connections are regarded as simulation models of the human brain and compared with the brain structure. The system hardware comprises a microprocessor, PROM, memory, VDU and keyboard unit.

  18. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber-coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken: the HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry, such as chemical or fire protective clothing. In summary, the approach provides a moderate-fidelity, usable tool which will run on current notebook computers.

  19. The data base management system alternative for computing in the human services.

    Science.gov (United States)

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

  20. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI) data. The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used with a wide range of methods of analysis, such as the finite element method (FEM), the Boundary Element Method (BEM), Monte-Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  1. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

    Science.gov (United States)

    2016-09-06

    Fragmented first-page extract; recoverable details: authors Alun Preece (PreeceAD@cardiff.ac.uk) and William ...; affiliations include Emerging Technology Services, IBM United Kingdom Ltd, Hursley Park, Winchester, UK, and the US Army Research Laboratory. The text fragment mentions a conversational agent with information exchange disabled until the end of the experiment run, and an indicator in the top-right of the agent interface.

  2. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realization are investigated with the aim of moving toward a general framework and thus a more or less solid foundation for constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to the possibilities of human error occurrence. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error tolerance, error recovery and error correction), as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well-defined architecture in the HCI components' internal structure. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental device for achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

  3. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of the plant operation to operators through cartoon-like, colored graphs in the form of faces whose different facial expressions show the plant safety status. (orig.)

  4. U.S. Army weapon systems human-computer interface style guide. Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

    A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2, with the intent of updating some of the research and incorporating some enhancements. This document provides that revision. Its purpose is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing its own domain-specific style guide, which will be used to guide the development of future systems within that subdomain.

  5. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming the blood flow to be laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume code, coupled with SolidWorks, a modeling package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branch and angled geometries, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
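
    As a sanity check on the magnitudes such a simulation should produce, wall shear stress in a rigid straight tube under steady Poiseuille flow has the closed form tau = 4*mu*Q/(pi*R^3). The viscosity, flow rate and radius below are typical aortic values, purely illustrative of the far richer compliant-wall, pulsatile model described above.

    ```python
    # Closed-form wall shear stress for steady Poiseuille flow in a rigid
    # straight tube: an order-of-magnitude check for CFD results.
    # Parameter values are typical, not the paper's data.
    import math

    def poiseuille_wall_shear_stress(mu, q, r):
        """mu: viscosity (Pa*s), q: volumetric flow (m^3/s), r: radius (m)."""
        return 4.0 * mu * q / (math.pi * r ** 3)

    # Blood viscosity ~3.5e-3 Pa*s, aortic flow ~5 L/min, radius ~1.25 cm.
    tau = poiseuille_wall_shear_stress(3.5e-3, 5.0 / 60.0 * 1e-3, 0.0125)
    ```

    The resulting stress is a fraction of a pascal, the right order of magnitude for time-averaged aortic wall shear stress; pulsatility, branching and wall compliance shift the local values, which is what the CFD analysis resolves.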

  6. Computational study of depth completion consistent with human bi-stable perception for ambiguous figures.

    Science.gov (United States)

    Mitsukura, Eiichi; Satoh, Shunji

    2018-03-01

    We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because they provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, mathematical solutions related to binocular disparities are not unique. We examine a mathematical description of depth completion that is consistent with human perception of depth in ambiguous regions. Using computer simulation, we demonstrate that the resultant depth maps qualitatively reproduce two kinds of human depth perception. The resultant depth maps produced using our model depend on the initial depth in the ambiguous region. Considering this dependence from a psychological viewpoint, we conjecture that humans perceive completed surfaces that are affected by prior stimuli corresponding to the initial condition of depth. We conducted psychological experiments to verify this model prediction. An ambiguous stimulus was presented after a prior stimulus removed the ambiguity, with an inter-stimulus interval (ISI) inserted between the prior stimulus and the post-stimulus. Results show that the correlation of perception between the prior stimulus and post-stimulus depends on the ISI duration: correlation is positive, negative, and nearly zero in the respective cases of short (0-200 ms), medium (200-400 ms), and long (>400 ms) ISIs. Furthermore, based on our model, we propose a computational account that can explain this dependence. Copyright © 2017 Elsevier Ltd. All rights reserved.
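
    The initial-condition dependence described above can be illustrated with a 1-D toy: relax the depth in an ambiguous interior toward its neighbours' average while the two boundary disparities stay fixed. With finitely many relaxation steps, the completed surface still carries the imprint of its initialisation, the analogue of the prior stimulus. The dynamics and parameters are illustrative, not the authors' actual model.

    ```python
    # 1-D depth completion toy: interior depth relaxes by local averaging;
    # boundary disparities are fixed. Illustrative, not the paper's model.
    def complete_depth(boundary_left, boundary_right, initial_interior, steps=5):
        depth = [boundary_left] + list(initial_interior) + [boundary_right]
        for _ in range(steps):
            new = depth[:]
            for i in range(1, len(depth) - 1):
                new[i] = 0.5 * (depth[i - 1] + depth[i + 1])  # local averaging
            depth = new  # boundaries are never updated, so they stay fixed
        return depth

    # Same boundaries, two different "prior" initialisations of the interior:
    near = complete_depth(1.0, 1.0, [0.9] * 9)  # prior suggested a near surface
    far = complete_depth(1.0, 1.0, [0.1] * 9)   # prior suggested a far surface
    ```

    Run to convergence the two solutions would coincide; stopped early, the interior depths differ, mirroring the model's claim that the completed surface depends on the initial depth set by the prior stimulus.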

  7. Computer-assisted image analysis assay of human neutrophil chemotaxis in vitro

    DEFF Research Database (Denmark)

    Jensen, P; Kharazmi, A

    1991-01-01

    We have developed a computer-based image analysis system to measure in-filter migration of human neutrophils in the Boyden chamber. This method is compared with the conventional manual counting techniques. Neutrophils from healthy individuals and from patients with reduced chemotactic activity were....... Another advantage of the assay is that it can be used to show the migration pattern of different populations of neutrophils from both healthy individuals and patients....

  8. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure in both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject's true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between
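
    The role of pH, pKa and protein binding in plasma:saliva partitioning is captured by the classical pH-partition relationship: only the unbound, un-ionised chemical diffuses passively, so the saliva/plasma ratio follows from Henderson-Hasselbalch ionisation plus the unbound fractions. The sketch below is the textbook relation that Schmitt-style partitioning refines with tissue composition; the parameter values are illustrative, not the authors' model.

    ```python
    # Classical pH-partition estimate of the saliva/plasma concentration
    # ratio for a weak acid or base. Illustrative parameter values.
    def saliva_plasma_ratio(pka, acid=True, ph_plasma=7.4, ph_saliva=6.5,
                            fu_plasma=0.1, fu_saliva=1.0):
        if acid:  # weak acid: ionised above its pKa
            ion_s = 1.0 + 10.0 ** (ph_saliva - pka)
            ion_p = 1.0 + 10.0 ** (ph_plasma - pka)
        else:     # weak base: ionised below its pKa
            ion_s = 1.0 + 10.0 ** (pka - ph_saliva)
            ion_p = 1.0 + 10.0 ** (pka - ph_plasma)
        # Unbound fractions scale the ratio (protein binding traps chemical
        # in plasma), matching the sensitivity of the model to binding.
        return (ion_s / ion_p) * (fu_plasma / fu_saliva)

    acid_ratio = saliva_plasma_ratio(pka=3.5, acid=True)
    base_ratio = saliva_plasma_ratio(pka=9.0, acid=False, fu_plasma=1.0)
    ```

    A weak acid concentrates in the more alkaline plasma (ratio below one), while a weak base concentrates in the more acidic saliva, which is why pKa and protein binding dominate the sensitivity analysis.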

  9. Computational Thermodynamics Analysis of Vaporizing Fuel Droplets in the Human Upper Airways

    Science.gov (United States)

    Zhang, Zhe; Kleinstreuer, Clement

    The detailed knowledge of air flow structures as well as particle transport and deposition in the human lung for typical inhalation flow rates is an important precursor for dosimetry-and-health-effect studies of toxic particles as well as for targeted drug delivery of therapeutic aerosols. Focusing on highly toxic JP-8 fuel aerosols, 3-D airflow and fluid-particle thermodynamics in a human upper airway model starting from mouth to Generation G3 (G0 is the trachea) are simulated using a user-enhanced and experimentally validated finite-volume code. The temperature distributions and their effects on airflow structures, fuel vapor deposition and droplet motion/evaporation are discussed. The computational results show that the thermal effect on vapor deposition is minor, but it may greatly affect droplet deposition in human airways.

  10. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects, each with more than 2000 Processing Units (PUs), were found and are presented. The research focus for most of them is molecular biology and, specifically, understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not in all cases explicitly stated, common target diseases include HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer and influenza; others include malaria, anthrax and Alzheimer's disease. The need for national initiatives and European collaboration on larger-scale projects is stressed, to raise citizens' awareness and encourage participation in order to create a culture of internet volunteering altruism.

  11. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    Science.gov (United States)

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole-atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. First, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with heterogeneous atrial action potential (AP) models and used to simulate AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Notably, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves, and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into a torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi
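
    The excitable-medium simulation at the heart of such a platform can be illustrated on a much smaller scale with a 1-D FitzHugh-Nagumo cable: a stimulus at one end launches an action-potential-like wave that conducts along the fibre by diffusion. The parameters are generic FHN values and the discretisation is deliberately coarse, purely illustrative of the far more detailed human atrial cell and tissue models used in the study.

    ```python
    # 1-D FitzHugh-Nagumo cable: a suprathreshold stimulus at the left end
    # launches a wave that conducts to the right end. Generic FHN values.
    def simulate(n=80, steps=6000, dt=0.05, dx=1.0, diff=1.0,
                 a=0.7, b=0.8, eps=0.08):
        v = [0.0] * n  # fast (membrane-potential-like) variable
        w = [0.0] * n  # slow recovery variable
        for i in range(5):
            v[i] = 1.5  # suprathreshold stimulus at the left end
        peak_right = v[-1]
        for _ in range(steps):
            lap = [0.0] * n  # discrete Laplacian with no-flux boundaries
            for i in range(n):
                left = v[i - 1] if i > 0 else v[0]
                right = v[i + 1] if i < n - 1 else v[-1]
                lap[i] = (left - 2.0 * v[i] + right) / dx ** 2
            for i in range(n):
                v[i] += dt * (v[i] - v[i] ** 3 / 3.0 - w[i] + diff * lap[i])
                w[i] += dt * eps * (v[i] + a - b * w[i])
            peak_right = max(peak_right, v[-1])  # did the wave arrive?
        return peak_right

    peak = simulate()  # large peak at the far end: the wave conducted
    ```

    Scaling this idea to anisotropic 3-D anatomy with region-specific cell models is what turns such a cable into the virtual-atria platform, where heterogeneity and anisotropy break the wave-fronts into re-entrant wavelets.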

  12. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject’s true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have significant impact on determining partitioning and species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell-based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human
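The plasma-to-saliva transfer of an unbound weak acid or base described in this abstract is often approximated by a pH-partition (Henderson-Hasselbalch) calculation. The sketch below shows that simplified form for a weak acid; it is not the full Schmitt algorithm (tissue composition is omitted) and the unbound fractions `fp` and `fs` are illustrative parameters:

```python
# pH-partition estimate of the saliva:plasma concentration ratio for a
# weak acid. This is a simplification of the Schmitt algorithm named in
# the abstract: tissue composition is ignored, and protein binding enters
# only via the (assumed) unbound fractions fp (plasma) and fs (saliva).

def saliva_plasma_ratio_weak_acid(pka, ph_saliva=6.5, ph_plasma=7.4,
                                  fp=1.0, fs=1.0):
    return ((1 + 10 ** (ph_saliva - pka)) /
            (1 + 10 ** (ph_plasma - pka))) * (fp / fs)

# A weak acid (pKa 3) is mostly ionized at plasma pH, so little of it
# partitions into the slightly more acidic saliva: the ratio is well below 1.
ratio = saliva_plasma_ratio_weak_acid(pka=3.0)
```

This illustrates why the abstract flags pKa and protein binding as the sensitive inputs: both appear directly in the ratio.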

  13. Proceedings of the 2011 2nd International Congress on Computer Applications and Computational Science

    CERN Document Server

    Nguyen, Quang

    2012-01-01

    The latest inventions in computer technology influence most human daily activities. In the near future, there is a tendency that all aspects of human life will depend on computer applications. In manufacturing, robotics and automation have become vital for high-quality products. In education, the model of teaching and learning is focusing more on electronic media than traditional ones. Issues related to energy savings and the environment are becoming critical.   Computational Science should enhance the quality of human life, not only solve their problems. Computational Science should help humans to make wise decisions by presenting choices and their possible consequences. Computational Science should help us make sense of observations, understand natural language, plan and reason with extensive background knowledge. Intelligence with wisdom is perhaps an ultimate goal for human-oriented science.   This book is a compilation of some recent research findings in computer application and computational sci...

  14. Human-Computer Interaction Handbook Fundamentals, Evolving Technologies, and Emerging Applications

    CERN Document Server

    Jacko, Julie A

    2012-01-01

    The third edition of a groundbreaking reference, The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications raises the bar for handbooks in this field. It is the largest, most complete compilation of HCI theories, principles, advances, case studies, and more that exists within a single volume. The book captures the current and emerging sub-disciplines within HCI related to research, development, and practice that continue to advance at an astonishing rate. It features cutting-edge advances to the scientific knowledge base as well as visionary perspe

  15. Shape perception in human and computer vision an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

  16. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

    2003-01-01

    A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around...... of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments....

  17. Three-dimensional evaluation of human jaw bone microarchitecture: correlation between the microarchitectural parameters of cone beam computed tomography and micro-computer tomography.

    Science.gov (United States)

    Kim, Jo-Eun; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Huh, Kyung-Hoe

    2015-12-01

    To evaluate the potential feasibility of cone beam computed tomography (CBCT) in the assessment of trabecular bone microarchitecture. Sixty-eight specimens from four pairs of human jaws were scanned using both micro-computed tomography (micro-CT) of 19.37-μm voxel size and CBCT of 100-μm voxel size. The correlation of 3-dimensional parameters between CBCT and micro-CT was evaluated. All parameters, except bone-specific surface and trabecular thickness, showed linear correlations between the 2 imaging modalities (P < .05). Among the parameters, bone volume, percent bone volume, trabecular separation, and degree of anisotropy (DA) of CBCT images showed strong correlations with those of micro-CT images. DA showed the strongest correlation (r = 0.693). Most microarchitectural parameters from CBCT were correlated with those from micro-CT. Some microarchitectural parameters, especially DA, could be used as strong predictors of bone quality in the human jaw. Copyright © 2015 Elsevier Inc. All rights reserved.
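The correlations reported here (e.g. r = 0.693 for DA) are Pearson coefficients between the same parameter measured on the two modalities. As a reminder of the computation, a minimal stdlib sketch (the sample values below are invented, not the study's data):

```python
# Pearson correlation coefficient between two paired measurement series,
# e.g. a microarchitectural parameter measured on CBCT vs. micro-CT.
# The demo values are fabricated for illustration.
import statistics as st

def pearson_r(xs, ys):
    mx, my = st.fmean(xs), st.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * st.pstdev(xs) * st.pstdev(ys))

# Perfectly linear paired data yields r = 1.
r_demo = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```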

  18. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  19. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

    Science.gov (United States)

    The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

  20. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    Science.gov (United States)

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  1. Computer-assisted design and synthesis of a highly selective smart adsorbent for extraction of clonazepam from human serum.

    Science.gov (United States)

    Aqababa, Heydar; Tabandeh, Mehrdad; Tabatabaei, Meisam; Hasheminejad, Meisam; Emadi, Masoomeh

    2013-01-01

    A computational approach was applied to screen functional monomers and polymerization solvents for rational design of molecular imprinted polymers (MIPs) as smart adsorbents for solid-phase extraction of clonazepam (CLO) form human serum. The comparison of the computed binding energies of the complexes formed between the template and functional monomers was conducted. The primary computational results were corrected by taking into calculation both the basis set superposition error (BSSE) and the effect of the polymerization solvent using the counterpoise (CP) correction and the polarizable continuum model, respectively. Based on the theoretical calculations, trifluoromethyl acrylic acid (TFMAA) and acrylonitrile (ACN) were found as the best and the worst functional monomers, correspondingly. To test the accuracy of the computational results, three MIPs were synthesized by different functional monomers and their Langmuir-Freundlich (LF) isotherms were studied. The experimental results obtained confirmed the computational results and indicated that the MIP synthesized using TFMAA had the highest affinity for CLO in human serum despite the presence of a vast spectrum of ions. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Computer graphics of SEM images facilitate recognition of chromosome position in isolated human metaphase plates.

    Science.gov (United States)

    Hodge, L D; Barrett, J M; Welter, D A

    1995-04-01

    There is general agreement that at the time of mitosis chromosomes occupy precise positions and that these positions likely affect subsequent nuclear function in interphase. However, before such ideas can be investigated in human cells, it is necessary to determine first the precise position of each chromosome with regard to its neighbors. It has occurred to us that stereo images, produced by scanning electron microscopy, of isolated metaphase plates could form the basis whereby these positions could be ascertained. In this paper we describe a computer graphic technique that permits us to keep track of individual chromosomes in a metaphase plate and to compare chromosome positions in different metaphase plates. Moreover, the computer graphics provide permanent, easily manipulated, rapid recall of stored chromosome profiles. These advantages are demonstrated by a comparison of the relative position of group A-specific and groups D- and G-specific chromosomes to the full complement of chromosomes in metaphase plates isolated from a nearly triploid human-derived cell (HeLa S3) to a hypo-diploid human fetal lung cell.

  3. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

    Science.gov (United States)

    Yip, George W; Rajendran, Kanagasuntheram

    2008-06-01

    Computer-aided instruction materials are becoming increasingly popular in medical education and particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing a three-dimensional visualization of human structure that is essential to applications in clinical practice and the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

  4. Population of 224 realistic human subject-based computational breast phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, David W. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Wells, Jered R., E-mail: jered.wells@duke.edu [Clinical Imaging Physics Group and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Sturgeon, Gregory M. [Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 (United States); Samei, Ehsan [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Dobbins, James T. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Physics and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Segars, W. Paul [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology and Carl E. Ravin Advanced Imaging Laboratories, Duke University Medical Center, Durham, North Carolina 27705 and Departments of Electrical and Computer Engineering and Biomedical Engineering and Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2016-01-15

    Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230 + dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
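The glandular segmentation step in this pipeline rests on fuzzy C-means clustering. Below is a minimal 1-D version of that algorithm for intensity values; the cluster count, fuzzifier `m`, and the synthetic data are illustrative, and this omits the 3-D bias-correction of the paper's variant:

```python
# Minimal 1-D fuzzy C-means clustering of intensity values, the algorithm
# family used for the breast-tissue segmentation above. Parameters and
# synthetic data are illustrative assumptions, not the paper's.
import numpy as np

def fuzzy_c_means(x, k=3, m=2.0, n_iter=50):
    """Return sorted cluster centers for 1-D data x."""
    x = np.asarray(x, dtype=float)
    centers = np.percentile(x, np.linspace(10, 90, k))  # deterministic init
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12          # (k, n)
        # membership u[i, j] = 1 / sum_l (d[i, j] / d[l, j])**(2/(m-1))
        u = 1.0 / ((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0))).sum(axis=1)
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)                        # weighted means
    return np.sort(centers)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(mu, 0.2, 200) for mu in (0.0, 5.0, 10.0)])
centers = fuzzy_c_means(x)  # recovers centers near 0, 5, and 10
```

Unlike hard k-means, each sample keeps a graded membership in every class, which is what makes the "fractional glandular densities" of the abstract natural to express.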

  5. The role of beliefs in lexical alignment: evidence from dialogs with humans and computers.

    Science.gov (United States)

    Branigan, Holly P; Pickering, Martin J; Pearson, Jamie; McLean, Janet F; Brown, Ash

    2011-10-01

    Five experiments examined the extent to which speakers' alignment (i.e., convergence) on words in dialog is mediated by beliefs about their interlocutor. To do this, we told participants that they were interacting with another person or a computer in a task in which they alternated between selecting pictures that matched their 'partner's' descriptions and naming pictures themselves (though in reality all responses were scripted). In both text- and speech-based dialog, participants tended to repeat their partner's choice of referring expression. However, they showed a stronger tendency to align with 'computer' than with 'human' partners, and with computers that were presented as less capable than with computers that were presented as more capable. The tendency to align therefore appears to be mediated by beliefs, with the relevant beliefs relating to an interlocutor's perceived communicative capacity. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    ...computer has not developed. Instead, what has developed is a "modern disease of adaptation" called "technostress," a phrase coined by Craig Brod. ... "technostress." Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress [Ref. 3:p. 38]: 1. They

  7. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    Science.gov (United States)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  8. Human-Computer Interaction and Sociological Insight: A Theoretical Examination and Experiment in Building Affinity in Small Groups

    Science.gov (United States)

    Oren, Michael Anthony

    2011-01-01

    The juxtaposition of classic sociological theory and the relatively young discipline of human-computer interaction (HCI) serves as a powerful mechanism for both exploring the theoretical impacts of technology on human interactions as well as the application of technological systems to moderate interactions. It is the intent of this dissertation…

  9. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  10. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    Directory of Open Access Journals (Sweden)

    Nasoz Fatma

    2004-01-01

    Full Text Available We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users' emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
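The classification task described above maps a small physiological feature vector to an emotion label. A toy nearest-neighbour classifier makes the idea concrete; the feature values, units, and labels below are fabricated for illustration and stand in for the study's actual supervised learners:

```python
# Toy 1-nearest-neighbour classifier over (GSR, heart rate, temperature)
# feature vectors. All training values and labels are invented; this only
# illustrates the supervised-learning step, not the paper's algorithms.
import math

def nearest_neighbor(train, query):
    """train: list of ((gsr, hr, temp), label); return label of closest point."""
    return min(train, key=lambda p: math.dist(p[0], query))[1]

train = [
    ((2.1, 95, 36.9), "fear"),
    ((0.4, 70, 36.5), "sadness"),
    ((1.5, 88, 37.1), "anger"),
]
label = nearest_neighbor(train, (2.0, 93, 36.8))  # closest to the "fear" sample
```

In practice the raw heart-rate scale dominates this Euclidean distance, which is why such features are normalized before training.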

  11. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making; TOPICAL

    International Nuclear Information System (INIS)

    SENGLAUB, MICHAEL E.; HARRIS, DAVID L.; RAYBOURN, ELAINE M.

    2001-01-01

    In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and capturing behavioral characteristics of decision makers due to cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce, and define sets of fundamental reasoning behavior that would be captured in the mathematical constructs of these newer technologies and would be able to interact in an agent type framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition

  12. Do Computers Write on Electric Screens?

    Directory of Open Access Journals (Sweden)

    Samuel Goyet

    2016-09-01

    Full Text Available How do we, humans, communicate with computers, or computational machines? What activities do humans and machines share, and what are the meeting points between the two? Ultimately, how can we conceptualize these meeting points in a way that leaves space for the proper mode of existence of both humans and machines, without subduing one to the other? Computers are machines that operate on a scale different from humans: the computation performed by machines is too fast and intangible for humans to follow. This is why computer activities have to be textualized, put into a form that humans can understand, for instance a graphical interface or a command line. More generally, this article tackles the problem of the interface between humans and machines and the way the relation between the two has been conceptualized. It is inspired both by the philosophy of modes of existence, since computers are machines with their own mode of existence, and by semiotics, since computer activities have to be converted into signs that humans can read.

  13. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

    Science.gov (United States)

    Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

    2004-01-01

    Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

  14. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G [Albuquerque, NM

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  15. Human-computer interface glove using flexible piezoelectric sensors

    Science.gov (United States)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
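Because a piezoelectric element responds to the rate of bending rather than to the bend itself, a joint angle is typically recovered by integrating the sensor voltage over time. A toy sketch of that step, where the gain `k` is a hypothetical calibration constant (not a value from the note):

```python
# Recover a joint-angle trajectory from a piezoelectric sensor whose output
# is (approximately) proportional to the bending rate: integrate the
# voltage samples with the trapezoid rule. Gain k is an assumed
# calibration constant, not taken from the paper.
def angle_from_piezo(voltages, dt, k=1.0, angle0=0.0):
    angle = angle0
    angles = []
    for i in range(1, len(voltages)):
        angle += k * 0.5 * (voltages[i - 1] + voltages[i]) * dt  # trapezoid step
        angles.append(angle)
    return angles

# Constant 1 V for 1 s (11 samples at dt = 0.1) integrates to an angle of 1.
angles = angle_from_piezo([1.0] * 11, dt=0.1)
```

Real implementations also high-pass the signal and periodically re-zero the integral, since any DC offset in the sensor output accumulates as drift.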

  16. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  17. Human-Computer Systems Interaction Backgrounds and Applications 2 Part 2

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa

    2012-01-01

    This volume of the book contains a collection of chapters selected from the papers which originally (in shortened form) have been presented at the 3rd International Conference on Human-Systems Interaction held in Rzeszow, Poland, in 2010. The chapters are divided into five sections concerning: IV. Environment monitoring and robotic systems, V. Diagnostic systems, VI. Educational Systems, and VII. General Problems. The novel concepts and realizations of humanoid robots, talking robots and orthopedic surgical robots, as well as those of direct brain-computer interface  are examples of particularly interesting topics presented in Sec. VI. In Sec. V the problems of  skin cancer recognition, colonoscopy diagnosis, and brain strokes diagnosis as well as more general problems of ontology design for  medical diagnostic knowledge are presented. Example of an industrial diagnostic system and a concept of new algorithm for edges detection in computer-analyzed images  are also presented in this Section. Among the edu...

  18. Direct Monte Carlo dose calculation using polygon-surface computational human model

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Kim, Chan Hyeong; Yeom, Yeon Su; Cho, Sungkoo; Chung, Min Suk; Cho, Kun-Woo

    2011-01-01

    In the present study, a voxel-type computational human model was converted to a polygon-surface model, after which it was imported directly to the Geant4 code without using a voxelization process, that is, without converting back to a voxel model. The original voxel model was also imported to the Geant4 code, in order to compare the calculated dose values and the computational speed. The average polygon size of the polygon-surface model was ∼0.5 cm², whereas the voxel resolution of the voxel model was 1.981 × 1.981 × 2.0854 mm³. The results showed a good agreement between the calculated dose values of the two models. The polygon-surface model was, however, slower than the voxel model by a factor of 6–9 for the photon energies and irradiation geometries considered in the present study, which nonetheless is considered acceptable, considering that direct use of the polygon-surface model does not require a separate voxelization process. (author)
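At the core of a dose calculation like the one above is Monte Carlo sampling of photon interaction distances from an exponential distribution. The toy below transports photons through a uniform slab; the geometry and attenuation coefficient are illustrative and this is in no way the Geant4 setup of the study:

```python
# Toy Monte Carlo photon transport: sample exponential free paths and count
# photons crossing a uniform slab. Geometry and mu are illustrative
# assumptions; a real Geant4 run tracks scattering, energy, and 3-D geometry.
import math
import random

def transmitted_fraction(mu, thickness, n=100_000, seed=42):
    """mu: linear attenuation coefficient (1/cm); thickness in cm."""
    rng = random.Random(seed)
    survive = sum(1 for _ in range(n)
                  if -math.log(1.0 - rng.random()) / mu > thickness)
    return survive / n

# mu * thickness = 1 mean free path, so analytically exp(-1) ~ 0.368 survive.
frac = transmitted_fraction(mu=0.2, thickness=5.0)
```

The estimate converges to the Beer-Lambert value exp(-mu * t) as n grows, which is a standard sanity check for such simulations.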

  19. Observation of human tissue with phase-contrast x-ray computed tomography

    Science.gov (United States)

    Momose, Atsushi; Takeda, Tohoru; Itai, Yuji; Tu, Jinhong; Hirano, Keiichi

    1999-05-01

    Human tissues obtained from cancerous kidneys fixed in formalin were observed with phase-contrast X-ray computed tomography (CT) using 17.7-keV synchrotron X-rays. By measuring the distributions of the X-ray phase shift caused by samples using an X-ray interferometer, sectional images that map the distribution of the refractive index were reconstructed. Because of the high sensitivity of phase-contrast X-ray CT, a cancerous lesion was differentiated from normal tissue and a variety of other structures were revealed without the need for staining.

  20. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We...

  1. Effects of muscle fatigue on the usability of a myoelectric human-computer interface.

    Science.gov (United States)

    Barszap, Alexander G; Skavhaug, Ida-Maria; Joshi, Sanjay S

    2016-10-01

    Electromyography-based human-computer interface development is an active field of research. However, knowledge on the effects of muscle fatigue for specific devices is limited. We have developed a novel myoelectric human-computer interface in which subjects continuously navigate a cursor to targets by manipulating a single surface electromyography (sEMG) signal. Two-dimensional control is achieved through simultaneous adjustments of power in two frequency bands through a series of dynamic low-level muscle contractions. Here, we investigate the potential effects of muscle fatigue during the use of our interface. In the first session, eight subjects completed 300 cursor-to-target trials without breaks; four using a wrist muscle and four using a head muscle. The wrist subjects returned for a second session in which a static fatiguing exercise took place at regular intervals in-between cursor-to-target trials. In the first session we observed no declines in performance as a function of use, even after the long period of use. In the second session, we observed clear changes in cursor trajectories, paired with a target-specific decrease in hit rates. Copyright © 2016 Elsevier B.V. All rights reserved.
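The two-dimensional control scheme described above derives one feature per cursor axis: the power of a single sEMG channel in each of two frequency bands. A minimal sketch of that feature extraction via an FFT; the band edges here are assumptions, not the paper's:

```python
# Sketch of the two-band power feature: FFT one sEMG window and sum the
# spectral power inside two frequency bands. Band edges (60-100 Hz and
# 130-170 Hz) are illustrative assumptions, not the study's values.
import numpy as np

def band_powers(window, fs, bands=((60, 100), (130, 170))):
    spec = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    return [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

fs = 1000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 80 * t)       # synthetic tone inside the first band
p_low, p_high = band_powers(sig, fs)   # nearly all power lands in band one
```

Each band's power would then be mapped (after smoothing and normalization) to one cursor axis, so contracting in ways that shift spectral content steers the cursor.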

  2. Using the Electrocorticographic Speech Network to Control a Brain-Computer Interface in Humans

    Science.gov (United States)

    Leuthardt, Eric C.; Gaona, Charles; Sharma, Mohit; Szrama, Nicholas; Roland, Jarod; Freudenberg, Zac; Solis, Jamie; Breshears, Jonathan; Schalk, Gerwin

    2013-01-01

    Electrocorticography (ECoG) has emerged as a new signal platform for brain-computer interface (BCI) systems. Classically, the cortical physiology that has been commonly investigated and utilized for device control in humans has been brain signals from sensorimotor cortex. Hence, it was unknown whether other neurophysiological substrates, such as the speech network, could be used to further improve on or complement existing motor-based control paradigms. We demonstrate here for the first time that ECoG signals associated with different overt and imagined phoneme articulation can enable invasively monitored human patients to control a one-dimensional computer cursor rapidly and accurately. This phonetic content was distinguishable within higher gamma frequency oscillations and enabled users to achieve final target accuracies between 68 and 91% within 15 minutes. Additionally, one of the patients achieved robust control using recordings from a microarray consisting of 1 mm spaced microwires. These findings suggest that the cortical network associated with speech could provide an additional cognitive and physiologic substrate for BCI operation and that these signals can be acquired from a cortical array that is small and minimally invasive. PMID:21471638

  3. Text understanding for computers

    NARCIS (Netherlands)

    Kenter, T.M.

    2017-01-01

    A long-standing challenge for computers communicating with humans is to pass the Turing test, i.e., to communicate in such a way that it is impossible for humans to determine whether they are talking to a computer or another human being. The field of natural language understanding — which studies

  4. DESIGN OF A COMPUTER AIDED SYSTEM FOR ANALYZING HUMAN ERROR IN INDONESIAN RAILWAYS

    Directory of Open Access Journals (Sweden)

    Wiwik Budiawan

    2013-06-01

    the occurrence of train crashes in Indonesia. However, it is not clear how this analysis technique is performed. Human error studies conducted by the National Transportation Safety Committee (NTSC) are still relatively limited and lack a systematic method. Several human error analysis methods have been developed to date, but few have been developed for railway transportation. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was developed and adapted to the Indonesian railway system. To improve the reliability of human error analysis, HFACS was then implemented as a web-based application that can be accessed on a computer or smartphone. The results could be used by the NTSC as a railway accident analysis method, particularly for accidents associated with human error. Keywords: human error, HFACS, CAS, railways

  5. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classical ...

  6. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, S.; Suffield, S. R.; Recknagle, K. P.; Jacob, R. E.; Einstein, D. R.; Kuprat, A. P.; Carson, J. P.; Colby, S. M.; Saunders, J. H.; Hines, S. A.; Teeguarden, J. G.; Straub, T. M.; Moe, M.; Taft, S. C.; Corley, R. A.

    2016-09-01

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

  7. Computational Fluid Dynamics Modeling of Bacillus anthracis Spore Deposition in Rabbit and Human Respiratory Airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, Senthil; Suffield, Sarah R.; Recknagle, Kurtis P.; Jacob, Rick E.; Einstein, Daniel R.; Kuprat, Andrew P.; Carson, James P.; Colby, Sean M.; Saunders, James H.; Hines, Stephanie; Teeguarden, Justin G.; Straub, Tim M.; Moe, M.; Taft, Sarah; Corley, Richard A.

    2016-09-30

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. The highest exposure concentration was modeled in the rabbit based upon prior acute inhalation studies. For comparison, human simulation was also conducted at the same concentration. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways compared to the human at the same air concentration of anthrax spores. As a result, higher particle deposition was predicted in the conducting airways and deep lung of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology.

  8. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human ...

  9. Simple, accurate equations for human blood O2 dissociation computations.

    Science.gov (United States)

    Severinghaus, J W

    1979-03-01

    Hill's equation can be slightly modified to fit the standard human blood O2 dissociation curve to within ±0.0055 fractional saturation (S) over the range 0 < S < 1. Other modifications of Hill's equation may be used to compute PO2 (Torr) from S (Eq. 2) and the temperature coefficient of PO2 (Eq. 3). Variation of the Bohr coefficient with PO2 is given by Eq. 4.

    S = (23,400 (PO2^3 + 150 PO2)^(-1) + 1)^(-1)   (1)
    ln PO2 = 0.385 ln (S^(-1) - 1)^(-1) + 3.32 - (72 S)^(-1) - 0.17 S^6   (2)
    Δln PO2/ΔT = 0.058 ((0.243 PO2/100)^3.88 + 1)^(-1) + 0.013   (3)
    Δln PO2/ΔpH = (PO2/26.6)^0.184 - 2.2   (4)

    Procedures are described to determine PO2 and S of blood iteratively after extraction or addition of a defined amount of O2 and to compute P50 of blood from a single sample after measuring PO2, pH, and S.
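
    Eqs. 1 and 2 of this abstract can be transcribed directly; a sketch (variable and function names are ours):

```python
import math

def severinghaus_saturation(po2_torr):
    """Fractional O2 saturation S from PO2 (Torr) via Eq. 1,
    Severinghaus' modified Hill equation."""
    return 1.0 / (23400.0 / (po2_torr**3 + 150.0 * po2_torr) + 1.0)

def severinghaus_po2(s):
    """Inverse relation: PO2 (Torr) from saturation S via Eq. 2."""
    return math.exp(0.385 * math.log(1.0 / (1.0 / s - 1.0))
                    + 3.32 - 1.0 / (72.0 * s) - 0.17 * s**6)
```

    As a sanity check, both equations place P50 (the PO2 at half saturation) near the classical 26-27 Torr, and saturation at PO2 = 100 Torr comes out close to 0.98.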

  10. Supervised learning from human performance at the computationally hard problem of optimal traffic signal control on a network of junctions.

    Science.gov (United States)

    Box, Simon

    2014-12-01

    Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signallized junctions are simulated. A computer game interface is used to enable a human 'player' to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers can be used to capture human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable.
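
    The supervised-learning step described above captures a human player's control strategy with simple neural network classifiers. A minimal sketch under synthetic, assumed conditions (the features, labels, and "serve the longest queue" stand-in player are our illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for game logs: each row holds normalized queue
# occupancies on a junction's four approaches; the label is the signal
# stage a human "player" chose. The synthetic player serves the longest queue.
X = rng.random((500, 4))
y = X.argmax(axis=1)

# Single-layer softmax classifier (the simplest neural-network classifier),
# trained by full-batch gradient descent on cross-entropy loss.
W = np.zeros((4, 4))
b = np.zeros(4)
T = np.eye(4)[y]                          # one-hot targets
for _ in range(3000):
    Z = X @ W + b
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)     # softmax probabilities
    G = (P - T) / len(X)                  # cross-entropy gradient wrt Z
    W -= X.T @ G
    b -= G.sum(axis=0)

accuracy = ((X @ W + b).argmax(axis=1) == y).mean()
```

    After training, the classifier reproduces the synthetic player's stage choices on most of the logged states, which is the essence of cloning a human strategy for machine control.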

  11. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] ... discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications.

  12. High School Students' Written Argumentation Qualities with Problem-Based Computer-Aided Material (PBCAM) Designed about Human Endocrine System

    Science.gov (United States)

    Vekli, Gülsah Sezen; Çimer, Atilla

    2017-01-01

    This study investigated development of students' scientific argumentation levels in the applications made with Problem-Based Computer-Aided Material (PBCAM) designed about Human Endocrine System. The case study method was used: The study group was formed of 43 students in the 11th grade of the science high school in Rize. Human Endocrine System…

  13. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general.

    Science.gov (United States)

    Zander, Thorsten O; Kothe, Christian

    2011-04-01

    Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.

  14. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

    2015-01-01

    Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells often ... or proteome using human leukocyte antigen binding predictions, and made a web-accessible software implementation freely available at http://met-hilab.cbs.dtu.dk/blockcons/.

  15. Computational voxel phantom, associated to anthropometric and anthropomorphic real phantom for dosimetry in human male pelvis radiotherapy

    International Nuclear Information System (INIS)

    Silva, Cleuza Helena Teixeira; Campos, Tarcisio Passos Ribeiro de

    2005-01-01

    This paper addresses a computational voxel model built with the MCNP5 code and the experimental development of an anthropometric and anthropomorphic phantom for dosimetry in human male pelvis brachytherapy, focusing on prostatic tumors. For the elaboration of the computational model of the human male pelvis, anatomical section images from the Visible Man Project were applied. The selected digital images were each associated with a numeric representation, one per section. Each computational representation of an anatomical section was transformed into a bi-dimensional mesh of equivalent tissue, and the group of bi-dimensional meshes was concatenated to form the three-dimensional voxel model used by the MCNP5 code. In association with the anatomical information, density and chemical composition data for the basic elements representative of the organs and tissues involved were set up in a material database for MCNP5. The model will be applied to dosimetric evaluations in irradiation of the human male pelvis; coupled to the MCNP5 particle transport code, the 3D voxel model allows future simulations. A physical human male pelvis phantom was also constructed, based on anthropometric and anthropomorphic data and on equivalent tissues representative of skin, fatty, muscular and glandular tissue, as well as the bony structure. This part of the work was developed in stages: the bony cast was built first, then the muscular structures and internal organs, which were jointly mounted and inserted in the skin cast. The component representing the fatty tissue was incorporated and the final retouches to the skin were made. The final result represents two important tools for computational and experimental dosimetry, usable for calibrating pre-existing radiotherapy protocols as well as for testing new ones.

  16. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of communicability methodology in graphics and animation components for interface design, called CAN (Communicability, Acceptability and Novelty). This methodology has been under development between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  17. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    International Nuclear Information System (INIS)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N.

    1991-01-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy

  18. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

    NARCIS (Netherlands)

    Bekker, M.M.; Long, J.B.

    1998-01-01

    This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

  19. Direct estimation of human trabecular bone stiffness using cone beam computed tomography.

    Science.gov (United States)

    Klintström, Eva; Klintström, Benjamin; Pahr, Dieter; Brismar, Torkel B; Smedby, Örjan; Moreno, Rodrigo

    2018-04-10

    The aim of this study was to evaluate the possibility of estimating the biomechanical properties of trabecular bone through finite element simulations by using dental cone beam computed tomography data. Fourteen human radius specimens were scanned in 3 cone beam computed tomography devices: 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan), NewTom 5 G (QR Verona, Verona, Italy), and Verity (Planmed, Helsinki, Finland). The imaging data were segmented by using 2 different methods. Stiffness (Young modulus), shear moduli, and the size and shape of the stiffness tensor were studied. Corresponding evaluations by using micro-CT were regarded as the reference standard. The 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan) showed good performance in estimating stiffness and shear moduli but was sensitive to the choice of segmentation method. NewTom 5 G (QR Verona, Verona, Italy) and Verity (Planmed, Helsinki, Finland) yielded good correlations, but they were not as strong as Accuitomo 80 (J. Morita MFG., Kyoto, Japan). The cone beam computed tomography devices overestimated both stiffness and shear compared with the micro-CT estimations. Finite element-based calculations of biomechanics from cone beam computed tomography data are feasible, with strong correlations for the Accuitomo 80 scanner (J. Morita MFG., Kyoto, Japan) combined with an appropriate segmentation method. Such measurements might be useful for predicting implant survival by in vivo estimations of bone properties. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computer that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting software ...

  1. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  2. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    International Nuclear Information System (INIS)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  3. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  4. Brain-Computer Symbiosis

    Science.gov (United States)

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  5. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    ... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope ...

  6. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

    Science.gov (United States)

    Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

    2016-11-01

    High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm2) arranged from the pylorus to mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased 0.2 mm/s per mm of tissue, from a mean of 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s (P ...). Human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.
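
    A quick arithmetic check of the figures reported in this abstract (our calculation, not the authors'): the high-velocity region spans roughly 28 - 6 = 22 mm, so a rise from 3.3 to 7.5 mm/s implies a gradient close to the stated 0.2 mm/s per mm:

```python
# Values as reported in the abstract
region_mm = 28 - 6          # length of the high-velocity region (mm)
v_start, v_end = 3.3, 7.5   # mean slow-wave velocity at region start/end (mm/s)

gradient = (v_end - v_start) / region_mm   # mm/s per mm of tissue, ~0.19
```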

  7. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    Energy Technology Data Exchange (ETDEWEB)

    Simicevic, Neven [Center for Applied Physics Studies, Louisiana Tech University, Ruston, LA 71272 (United States)], E-mail: neven@phys.latech.edu

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.
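
    The Debye model referred to here represents each tissue's complex relative permittivity with one or more relaxation terms plus an ionic conductivity term. A single-pole sketch with illustrative, water-like parameters (not the paper's fitted eye-tissue values):

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def debye_permittivity(freq_hz, eps_inf, delta_eps, tau_s, sigma_s_m=0.0):
    """Single-pole Debye model: eps_inf + delta_eps/(1 + j*w*tau) + sigma/(j*w*eps0)."""
    w = 2 * math.pi * freq_hz
    return eps_inf + delta_eps / (1 + 1j * w * tau_s) + sigma_s_m / (1j * w * EPS0)

# Illustrative parameters only -- not fitted to any specific eye tissue.
eps = debye_permittivity(10e9, eps_inf=4.0, delta_eps=70.0, tau_s=7e-12, sigma_s_m=1.5)
print(eps.real, -eps.imag)  # relative permittivity and loss term at 10 GHz
```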

  8. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses.

    Science.gov (United States)

    Simicevic, Neven

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

  9. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    International Nuclear Information System (INIS)

    Simicevic, Neven

    2008-01-01

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

  10. The mind-writing pupil: A human-computer interface based on decoding of covert attention through pupillometry

    NARCIS (Netherlands)

    Mathôt, Sebastiaan; Melmi, Jean Baptiste; Van Der Linden, Lotje; Van Der Stigchel, Stefan

    2016-01-01

    We present a new human-computer interface that is based on decoding of attention through pupillometry. Our method builds on the recent finding that covert visual attention affects the pupillary light response: Your pupil constricts when you covertly (without looking at it) attend to a bright,

  11. Rana computatrix to human language: towards a computational neuroethology of language evolution.

    Science.gov (United States)

    Arbib, Michael A

    2003-10-15

    Walter's Machina speculatrix inspired the name Rana computatrix for a family of models of visuomotor coordination in the frog, which contributed to the development of computational neuroethology. We offer here an 'evolutionary' perspective on models in the same tradition for rat, monkey and human. For rat, we show how the frog-like taxon affordance model provides a basis for the spatial navigation mechanisms that involve the hippocampus and other brain regions. For monkey, we recall two models of neural mechanisms for visuomotor coordination. The first, for saccades, shows how interactions between the parietal and frontal cortex augment superior colliculus seen as the homologue of frog tectum. The second, for grasping, continues the theme of parieto-frontal interactions, linking parietal affordances to motor schemas in premotor cortex. It further emphasizes the mirror system for grasping, in which neurons are active both when the monkey executes a specific grasp and when it observes a similar grasp executed by others. The model of human-brain mechanisms is based on the mirror-system hypothesis of the evolution of the language-ready brain, which sees the human Broca's area as an evolved extension of the mirror system for grasping.

  12. A soft-contact model for computing safety margins in human prehension.

    Science.gov (United States)

    Singh, Tarkeshwar; Ambike, Satyajit

    2017-10-01

    The soft human digit tip forms contact with grasped objects over a finite area and applies a moment about an axis normal to the area. These moments are important for ensuring stability during precision grasping. However, the contribution of these moments to grasp stability is rarely investigated in prehension studies. The more popular hard-contact model assumes that the digits exert a force vector but no free moment on the grasped object. Many sensorimotor studies use this model and show that humans estimate friction coefficients to scale the normal force to grasp objects stably, i.e. the smoother the surface, the tighter the grasp. The difference between the applied normal force and the minimal normal force needed to prevent slipping is called safety margin and this index is widely used as a measure of grasp planning. Here, we define and quantify safety margin using a more realistic contact model that allows digits to apply both forces and moments. Specifically, we adapt a soft-contact model from robotics and demonstrate that the safety margin thus computed is a more accurate and robust index of grasp planning than its hard-contact variant. Previously, we have used the soft-contact model to propose two indices of grasp planning that show how humans account for the shape and inertial properties of an object. A soft-contact based safety margin offers complementary insights by quantifying how humans may account for surface properties of the object and skin tissue during grasp planning and execution. Copyright © 2017 Elsevier B.V. All rights reserved.
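
    Under the hard-contact model, the minimal normal force that prevents slip is the tangential load divided by the friction coefficient, so the safety margin is f_n − f_t/μ. A soft-contact variant can fold the normal moment into an elliptic friction limit surface, as is common in robotics. A sketch under those assumptions (the coupling parameter `e` and all numbers are illustrative, not the paper's exact formulation):

```python
import math

def safety_margin_hard(f_n, f_t, mu):
    """Hard contact: slip is prevented when f_n >= f_t / mu."""
    return f_n - f_t / mu

def safety_margin_soft(f_n, f_t, m_n, mu, e):
    """Soft contact: elliptic limit surface f_t^2 + (m_n/e)^2 <= (mu*f_n)^2,
    where e (units of length) couples the force and moment limits."""
    return f_n - math.sqrt(f_t ** 2 + (m_n / e) ** 2) / mu

# A digit applying a normal moment needs more normal force than the
# hard-contact model admits, so the soft-contact margin is smaller.
hard = safety_margin_hard(f_n=5.0, f_t=2.0, mu=0.8)
soft = safety_margin_soft(f_n=5.0, f_t=2.0, m_n=0.01, mu=0.8, e=0.005)
print(hard, soft)  # hard margin 2.5 N; soft margin is smaller
```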

  13. Comparison between a Computational Seated Human Model and Experimental Verification Data

    Directory of Open Access Journals (Sweden)

    Christian G. Olesen

    2014-01-01

    Full Text Available Sitting-acquired deep tissue injuries (SADTI) are the most serious type of pressure ulcers. In order to investigate the aetiology of SADTI a new approach is under development: a musculo-skeletal model which can predict forces between the chair and the human body at different seated postures. This study focuses on comparing results from a model developed in the AnyBody Modeling System with data collected from an experimental setup. A chair with force-measuring equipment was developed, an experiment was conducted with three subjects, and the experimental results were compared with the predictions of the computational model. The results show that the model predicted the reaction forces for different chair postures well. The correlation coefficients between experiment and model for the seat angle, backrest angle, and footrest height were 0.93, 0.96, and 0.95, respectively. The study shows good agreement between experimental data and model predictions of the forces between a human body and a chair. The model can in the future be used in designing wheelchairs or automotive seats.
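
    The agreement figures quoted (0.93, 0.96, 0.95) are correlation coefficients between measured and model-predicted reaction forces. A minimal Pearson correlation sketch with hypothetical force data (the numbers below are not the study's):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical seat reaction forces (N) for five postures; illustrative only.
measured = [102.0, 118.0, 131.0, 149.0, 160.0]
predicted = [100.0, 121.0, 128.0, 152.0, 158.0]
r = pearson_r(measured, predicted)
print(round(r, 3))  # close to 1 for well-matched model and experiment
```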

  14. The use of computers to teach human anatomy and physiology to allied health and nursing students

    Science.gov (United States)

    Bergeron, Valerie J.

    Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to develop answers to these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes and understanding of human anatomy and physiology. In this investigation two sites of the same community college with seemingly similar student populations, seven miles apart, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.™ (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas that were studied. For both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials, observations of laboratory facilities at both sites, and a cost-benefit analysis led to the development of seven recommendations.
The recommendations call for action at the level of the

  15. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    Science.gov (United States)

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

  16. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single-pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next-generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  17. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  18. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    International Nuclear Information System (INIS)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley; Aris, John P; Shifrin, Roger Y

    2011-01-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR™ and then imported to the 3D modeling software package Rhinoceros™ for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in

  19. [Geomagnetic storm decreases coherence of electric oscillations of human brain while working at the computer].

    Science.gov (United States)

    Novik, O B; Smirnov, F A

    2013-01-01

    The effect of geomagnetic storms at the latitude of Moscow on the electric oscillations of the human brain cerebral cortex was studied. Electroencephalogram measurements showed that when volunteers aged 18-23 years performed tasks using a computer during a moderate magnetic storm, or no later than 24 h after it, the value of the coherence function of electric oscillations of the human brain in the frontal and occipital areas in the range of 4.0-7.9 Hz (the so-called theta rhythm of the human brain) decreased by a factor of two or more, sometimes reaching zero, although arterial blood pressure, respiratory rate and the electrocardiogram registered during the electroencephalogram measurements remained within the standard values.
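
    The coherence function used here measures, per frequency, how consistently two signals maintain a fixed phase and amplitude relation (1 = fully coherent, 0 = unrelated). A minimal Welch-style estimate for two synthetic EEG-like channels sharing a 6 Hz theta component (illustrative only; not the study's recording or analysis pipeline):

```python
import numpy as np

def msc(x, y, fs, nperseg=256):
    """Magnitude-squared coherence via Welch-style averaging of windowed FFT segments."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    sxx = syy = 0.0
    sxy = 0.0 + 0.0j
    for start in range(0, len(x) - nperseg + 1, step):
        fx = np.fft.rfft(win * x[start:start + nperseg])
        fy = np.fft.rfft(win * y[start:start + nperseg])
        sxx = sxx + np.abs(fx) ** 2
        syy = syy + np.abs(fy) ** 2
        sxy = sxy + fx * np.conj(fy)
    return np.fft.rfftfreq(nperseg, 1 / fs), np.abs(sxy) ** 2 / (sxx * syy)

# Two synthetic channels sharing a 6 Hz (theta-band) component plus independent noise.
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / 256)                        # 30 s sampled at 256 Hz
theta = np.sin(2 * np.pi * 6 * t)
frontal = theta + 0.5 * rng.standard_normal(t.size)
occipital = theta + 0.5 * rng.standard_normal(t.size)
freqs, coh = msc(frontal, occipital, fs=256)
print(coh[np.argmin(np.abs(freqs - 6))])             # close to 1 at 6 Hz
```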

  20. Sustaining Economic Exploitation of Complex Ecosystems in Computational Models of Coupled Human-Natural Networks

    OpenAIRE

    Martinez, Neo D.; Tonin, Perrine; Bauer, Barbara; Rael, Rosalyn C.; Singh, Rahul; Yoon, Sangyuk; Yoon, Ilmi; Dunne, Jennifer A.

    2012-01-01

    Understanding ecological complexity has stymied scientists for decades. Recent elucidation of the famously coined "devious strategies for stability in enduring natural systems" has opened up a new field of computational analyses of complex ecological networks where the nonlinear dynamics of many interacting species can be more realistically modeled and understood. Here, we describe the first extension of this field to include coupled human-natural systems. This extension elucidates new strat...

  1. Computer program for assessing the human dose due to stationary release of tritium

    International Nuclear Information System (INIS)

    Saito, Masahiro; Raskob, Wolfgang

    2003-01-01

    The computer program TriStat (Tritium dose assessment for stationary release) has been developed to assess the dose to humans assuming a stationary release of tritium as HTO and/or HT from nuclear facilities. A Gaussian dispersion model describes the behavior of HT gas and HTO vapor in the atmosphere. Tritium concentrations in soil, vegetables and forage were estimated on the basis of specific tritium concentrations in the free water component and the organic component. The uptake of contamination via food by humans was modeled by assuming a forage compartment, a vegetable compartment, and an animal compartment. A standardized vegetable and a standardized animal with the relative content of major nutrients, i.e. proteins, lipids and carbohydrates, representing a standard Japanese diet, were included. A standardized forage was defined in a similar manner by using the forage composition for typical farm animals. These standard feed- and foodstuffs are useful to simplify the tritium dosimetry and the food chain related to the tritium transfer to the human body. (author)
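
    The atmospheric dispersion step can be sketched with the standard ground-reflected Gaussian plume formula; the parameters below (source strength, wind speed, release height, dispersion widths) are illustrative assumptions, not TriStat's parametrization:

```python
import math

def plume_concentration(q_bq_s, u_m_s, y_m, z_m, h_m, sigma_y_m, sigma_z_m):
    """Ground-reflected Gaussian plume concentration (Bq/m^3) at a receptor.
    q: release rate, u: wind speed, h: effective release height;
    sigma_y/sigma_z: dispersion widths at the receptor's downwind distance."""
    lateral = math.exp(-y_m ** 2 / (2 * sigma_y_m ** 2))
    vertical = (math.exp(-(z_m - h_m) ** 2 / (2 * sigma_z_m ** 2))
                + math.exp(-(z_m + h_m) ** 2 / (2 * sigma_z_m ** 2)))
    return q_bq_s / (2 * math.pi * u_m_s * sigma_y_m * sigma_z_m) * lateral * vertical

# Illustrative numbers only: 1 GBq/s HTO release at 30 m, receptor at 1.5 m height
# on the plume centerline, with dispersion widths typical of ~500 m downwind.
c = plume_concentration(q_bq_s=1e9, u_m_s=3.0, y_m=0.0, z_m=1.5,
                        h_m=30.0, sigma_y_m=36.0, sigma_z_m=18.0)
print(c)  # air concentration in Bq/m^3
```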

  2. A Single Camera Motion Capture System for Human-Computer Interaction

    Science.gov (United States)

    Okada, Ryuzo; Stenger, Björn

    This paper presents a method for markerless human motion capture using a single camera. It uses tree-based filtering to efficiently propagate a probability distribution over poses of a 3D body model. The pose vectors and associated shapes are arranged in a tree, which is constructed by hierarchical pairwise clustering, in order to efficiently evaluate the likelihood in each frame. A new likelihood function based on silhouette matching is proposed that improves the pose estimation of thinner body parts, i.e. the limbs. The dynamic model takes self-occlusion into account by increasing the variance of occluded body parts, thus allowing for recovery when the body part reappears. We present two applications of our method that work in real-time on a Cell Broadband Engine™: a computer game and a virtual clothing application.

  3. The effect of repeated freeze-thaw cycles on human muscle tissue visualized by postmortem computed tomography (PMCT)

    NARCIS (Netherlands)

    Klop, Anthony C.; Vester, Marloes E. M.; Colman, Kerri L.; Ruijter, Jan M.; van Rijn, Rick R.; Oostra, Roelof-Jan

    2017-01-01

    The aim of this study was to determine whether effects of repetitive freeze-thaw cycles, with various thawing temperatures, on human muscle tissue can be quantified using postmortem computed tomography (PMCT) technology. An additional objective was to determine the preferred thawing temperature for

  4. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a flyby mission to Mars or an extended stay aboard the International Space Station, affects the human body, in particular the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model with the updated Lagrangian formulation of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  5. Computers and conversation

    CERN Document Server

    Luff, Paul; Gilbert, Nigel G

    1986-01-01

    In the past few years a branch of sociology, conversation analysis, has begun to have a significant impact on the design of human-computer interaction (HCI). The investigation of human-human dialogue has emerged as a fruitful foundation for interactive system design. This book includes eleven original chapters by leading researchers who are applying conversation analysis to HCI. The fundamentals of conversation analysis are outlined, a number of systems are described, and a critical view of their value for HCI is offered. Computers and Conversation will be of interest to all concerned...

  6. Investigation on human serum albumin and Gum Tragacanth interactions using experimental and computational methods.

    Science.gov (United States)

    Moradi, Sajad; Taran, Mojtaba; Shahlaei, Mohsen

    2018-02-01

    The study of the interaction between human serum albumin and Gum Tragacanth, a biodegradable bio-polymer, has been undertaken. For this purpose, several experimental and computational methods were used. Thermodynamic parameters and the mode of interaction were investigated using fluorescence spectroscopy at 300 and 310 K. Fourier transform infrared spectroscopy and synchronous fluorescence spectroscopy were also performed. To give detailed insight into the possible interactions, docking and molecular dynamics simulations were also applied. Results show that the interaction is based on hydrogen bonding and van der Waals forces. Structural analysis indicates no adverse change in protein conformation upon binding of GT. Furthermore, the computational methods provide some evidence of secondary-structure enhancement of the protein in the presence of Gum Tragacanth. Copyright © 2017 Elsevier B.V. All rights reserved.
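
    Thermodynamic parameters from binding constants at two temperatures typically come from the van't Hoff relation; negative ΔH and ΔS are the signature commonly read as hydrogen bonding plus van der Waals forces. A sketch with hypothetical binding constants (not the paper's fitted values):

```python
import math

R_GAS = 8.314  # J/(mol*K)

def vant_hoff(k1, t1, k2, t2):
    """Return (dH, dS, dG at t1) from binding constants at two temperatures."""
    dh = R_GAS * math.log(k2 / k1) / (1 / t1 - 1 / t2)  # van't Hoff enthalpy
    dg1 = -R_GAS * t1 * math.log(k1)                    # free energy at t1
    ds = (dh - dg1) / t1                                # entropy from dG = dH - T*dS
    return dh, ds, dg1

# Hypothetical binding constants chosen so K falls with temperature (exothermic).
dh, ds, dg = vant_hoff(k1=5.0e4, t1=300.0, k2=3.0e4, t2=310.0)
print(dh / 1000, ds, dg / 1000)  # dH (kJ/mol), dS (J/mol/K), dG (kJ/mol)
```

Negative values for all three, as here, would indicate spontaneous, enthalpy-driven binding.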

  7. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Full Text Available Circular Dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with the main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (Molecular Dynamics, Semiempirical Quantum Mechanics, Time-Dependent Density Functional Theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures and molecular dynamics (MD) could help in achieving a better agreement between the computed and experimental protein spectra and provide some unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  8. A truly human interface: Interacting face-to-face with someone whose words are determined by a computer program

    Directory of Open Access Journals (Sweden)

    Kevin eCorti

    2015-05-01

    Full Text Available We use speech shadowing to create situations wherein people converse in person with a human whose words are determined by a conversational agent computer program. Speech shadowing involves a person (the shadower) repeating vocal stimuli originating from a separate communication source in real time. Humans shadowing for conversational agent sources (e.g., chat bots) become hybrid agents (echoborgs) capable of face-to-face interlocution. We report three studies that investigated people's experiences interacting with echoborgs and the extent to which echoborgs pass as autonomous humans. First, participants in a Turing Test spoke with a chat bot via either a text interface or an echoborg. Human shadowing did not improve the chat bot's chance of passing but did increase interrogators' ratings of how human-like the chat bot seemed. In our second study, participants had to decide whether their interlocutor produced words generated by a chat bot or simply pretended to be one. Compared to those who engaged a text interface, participants who engaged an echoborg were more likely to perceive their interlocutor as pretending to be a chat bot. In our third study, participants were naïve to the fact that their interlocutor produced words generated by a chat bot. Unlike those who engaged a text interface, the vast majority of participants who engaged an echoborg neither sensed nor suspected a robotic interaction. These findings have implications for android science, the Turing Test paradigm, and human-computer interaction. The human body, as the delivery mechanism of communication, fundamentally alters the social psychological dynamics of interactions with machine intelligence.

  9. Challenges for Virtual Humans in Human Computing

    NARCIS (Netherlands)

    Reidsma, Dennis; Ruttkay, Z.M.; Huang, T; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    The vision of Ambient Intelligence (AmI) presumes a plethora of embedded services and devices that all endeavor to support humans in their daily activities as unobtrusively as possible. Hardware gets distributed throughout the environment, occupying even the fabric of our clothing. The environment

  10. Digital Humanities

    DEFF Research Database (Denmark)

    Brügger, Niels

    2016-01-01

    Digital humanities is an umbrella term for theories, methodologies, and practices related to humanities scholarship that use the digital computer as an integrated and essential part of its research and teaching activities. The computer can be used for establishing, finding, collecting, and preserving material to study, as an object of study in its own right, as an analytical tool, or for collaborating, and for disseminating results. The term "digital humanities" was coined around 2001, and gained currency within academia in the following years. However, computers had been used within...

  11. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Full Text Available Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  12. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

    Directory of Open Access Journals (Sweden)

    Bundschuh Bettina B

    2011-11-01

    Full Text Available Abstract Background Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare IT in German hospitals, focused on the users' point of view. Methods To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper has been put on suitability for the task, training effort and conformity with user expectations, differentiated by information systems. Effectiveness has been evaluated with the focus on interoperability and functionality of different IT systems. Results 4521 persons from 371 hospitals visited the start page of the study, while 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies.

  13. Machine takeover the growing threat to human freedom in a computer-controlled society

    CERN Document Server

    George, Frank Honywill

    1977-01-01

    Machine Takeover: The Growing Threat to Human Freedom in a Computer-Controlled Society discusses the implications of technological advancement. The title identifies the changes in society that no one is aware of, along with what these changes entail. The text first covers information science, particularly the aspect of an automated system for information processing. Next, the selection deals with social implications of information science, such as information pollution. The text also tackles the concerns in the utilization of technology in order to manipulate the lives of people without th

  14. Application of computer-assisted imaging technology in human musculoskeletal joint research

    Directory of Open Access Journals (Sweden)

    Xudong Liu

    2014-01-01

    Full Text Available Computer-assisted imaging analysis technology has been widely used in musculoskeletal joint biomechanics research in recent years. Imaging techniques can accurately reconstruct the anatomic features of the target joint and reproduce its in vivo motion characteristics. These data have greatly improved our understanding of normal joint function, joint injury mechanisms, and surgical treatment, and can provide a foundation for using reverse-engineering methods to develop biomimetic artificial joints. In this paper, we systematically review the investigation of in vivo kinematics of the human knee, shoulder, lumbar spine, and ankle using advanced imaging technologies, especially those using a dual fluoroscopic imaging system (DFIS). We also briefly discuss future development of imaging analysis technology in musculoskeletal joint research.

  15. Micro-Computed Tomography Evaluation of Human Fat Grafts in Nude Mice

    Science.gov (United States)

    Chung, Michael T.; Hyun, Jeong S.; Lo, David D.; Montoro, Daniel T.; Hasegawa, Masakazu; Levi, Benjamin; Januszyk, Michael; Longaker, Michael T.

    2013-01-01

    Background Although autologous fat grafting has revolutionized the field of soft tissue reconstruction and augmentation, long-term maintenance of fat grafts is unpredictable. Recent studies have reported survival rates of fat grafts to vary anywhere between 10% and 80% over time. The present study evaluated the long-term viability of human fat grafts in a murine model using a novel imaging technique allowing for in vivo volumetric analysis. Methods Human fat grafts were prepared from lipoaspirate samples using the Coleman technique. Fat was injected subcutaneously into the scalp of 10 adult Crl:NU-Foxn1nu CD-1 male mice. Micro-computed tomography (CT) was performed immediately following injection and then weekly thereafter. Fat volume was rendered by reconstructing a three-dimensional (3D) surface through cubic-spline interpolation. Specimens were also harvested at various time points and sections were prepared and stained with hematoxylin and eosin (H&E), for macrophages using CD68 and for the cannabinoid receptor 1 (CB1). Finally, samples were explanted at 8- and 12-week time points to validate calculated micro-CT volumes. Results Weekly CT scanning demonstrated progressive volume loss over the time course. However, volumetric analysis at the 8- and 12-week time points stabilized, showing an average of 62.2% and 60.9% survival, respectively. Gross analysis showed the fat graft to be healthy and vascularized. H&E analysis and staining for CD68 showed minimal inflammatory reaction with viable adipocytes. Immunohistochemical staining with anti-human CB1 antibodies confirmed human origin of the adipocytes. Conclusions Studies assessing the fate of autologous fat grafts in animals have focused on nonimaging modalities, including histological and biochemical analyses, which require euthanasia of the animals. In this study, we have demonstrated the ability to employ micro-CT for 3D reconstruction and volumetric analysis of human fat grafts in a mouse model. 

  16. USING OLFACTORY DISPLAYS AS A NONTRADITIONAL INTERFACE IN HUMAN COMPUTER INTERACTION

    Directory of Open Access Journals (Sweden)

    Alper Efe

    2017-07-01

    Full Text Available Smell has its limitations and disadvantages as a display medium, but it also has its strengths, and many have recognized its potential. At present, in communications and virtual technologies, smell is either forgotten or improperly stimulated, because uncontrolled odorants are present in the physical space surrounding the user. Nonetheless, a controlled presentation of olfactory information can give advantages in various application fields. Therefore, two enabling technologies, electronic noses and especially olfactory displays, are reviewed. Scenarios of usage are discussed together with relevant psycho-physiological issues. End-to-end systems including olfactory interfaces are quantitatively characterised in many respects. Recent work done by the authors in this field is reported. The article touches briefly on the control of scent emissions, an important factor to consider when building scented computer systems. As a sample application, the SUBSMELL system is investigated. A look at areas of human-computer interaction where olfactory output may prove useful is presented. The article finishes with some brief conclusions and discusses some shortcomings and gaps in the topic. In particular, the addition of olfactory cues to a virtual environment increased the user's sense of presence and memory of the environment. The article also discusses the educational aspect of SUBSMELL systems.

  17. POLYAR, a new computer program for prediction of poly(A sites in human sequences

    Directory of Open Access Journals (Sweden)

    Qamar Raheel

    2010-11-01

    Full Text Available Abstract Background mRNA polyadenylation is an essential step of pre-mRNA processing in eukaryotes. Accurate prediction of the pre-mRNA 3'-end cleavage/polyadenylation sites is important for defining gene boundaries and understanding gene expression mechanisms. Results 28761 human mapped poly(A) sites have been classified into three classes containing different known forms of the polyadenylation signal (PAS) or none of them (PAS-strong, PAS-weak and PAS-less, respectively), and a new computer program, POLYAR, for the prediction of poly(A) sites of each class was developed. In comparison with polya_svm (to date the most accurate computer program for prediction of poly(A) sites) while searching for PAS-strong poly(A) sites in human sequences, POLYAR had a significantly higher prediction sensitivity (80.8% versus 65.7%) and specificity (66.4% versus 51.7%). However, when a similar search was conducted for PAS-weak and PAS-less poly(A) sites, both programs had a very low prediction accuracy, which indicates that our knowledge about the factors involved in the determination of poly(A) sites is not sufficient to identify such polyadenylation regions. Conclusions We present a new classification of polyadenylation sites into three classes and a novel computer program, POLYAR, for prediction of poly(A) sites/regions of each class. In tests, POLYAR shows high accuracy of prediction of PAS-strong poly(A) sites; its efficiency in searching for PAS-weak and PAS-less poly(A) sites is not very high but is comparable to other available programs. These findings suggest that additional characteristics of such poly(A) sites remain to be elucidated. The POLYAR program, with a stand-alone version for downloading, is available at http://cub.comsats.edu.pk/polyapredict.htm.
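    The PAS-strong class above hinges on the presence of a known signal hexamer upstream of the cleavage site. As a rough illustration only (this is not POLYAR's actual algorithm, and the function name, signal set, and 10-40 nt cleavage-window convention are assumptions), a scan for the two canonical signals might look like:

    ```python
    # Hypothetical sketch: scan a DNA sequence for canonical polyadenylation
    # signal hexamers (AATAAA/ATTAAA) and report, for each hit, the downstream
    # window where cleavage typically occurs (assumed 10-40 nt after the PAS).
    def find_pas_candidates(seq, signals=("AATAAA", "ATTAAA")):
        seq = seq.upper()
        hits = []
        for i in range(len(seq) - 6 + 1):
            hexamer = seq[i:i + 6]
            if hexamer in signals:
                hits.append({"pos": i, "signal": hexamer,
                             "cleavage_window": (i + 16, i + 46)})
        return hits

    hits = find_pas_candidates("GGGAATAAAGGGTTT")  # one hit at position 3
    ```

    Real predictors such as POLYAR weigh many additional sequence features; a plain hexamer scan is only the PAS-strong starting point.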

  18. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL (United States); Aris, John P [Department of Anatomy and Cell Biology, University of Florida, Gainesville, FL (United States); Shifrin, Roger Y, E-mail: wbolch@ufl.edu [Department of Radiology, University of Florida, Gainesville, FL (United States)

    2011-08-07

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations

  19. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 (United States); Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun [Philips Healthcare, Cleveland, Ohio 44143 (United States); Wilson, David L., E-mail: dlw@case.edu [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 and Department of Radiology, Case Western Reserve University, Cleveland, Ohio 44106 (United States)

    2015-10-15

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit
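    The core of a channelized Hotelling observer is simple: each image is reduced to a handful of channel outputs, and detectability d′ is computed from the class-conditional statistics of those outputs. The sketch below is a minimal Laguerre-Gauss CHO without the paper's internal-noise models; the channel width `a` and the five-channel count follow common practice but are assumptions here, not the study's exact parameters.

    ```python
    import numpy as np
    from math import comb, factorial

    def laguerre(n, x):
        # Laguerre polynomial L_n evaluated elementwise on array x
        return sum((-1)**k * comb(n, k) * x**k / factorial(k) for k in range(n + 1))

    def lg_channels(size, a, n_channels=5):
        # Unit-norm Laguerre-Gauss channel profiles on a size x size grid
        y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
        r2 = x**2 + y**2
        U = np.empty((size * size, n_channels))
        for n in range(n_channels):
            u = np.exp(-np.pi * r2 / a**2) * laguerre(n, 2 * np.pi * r2 / a**2)
            U[:, n] = (u / np.linalg.norm(u)).ravel()
        return U

    def cho_dprime(signal_imgs, noise_imgs, U):
        # Channel outputs per image: shape (n_images, n_channels)
        vs = signal_imgs.reshape(len(signal_imgs), -1) @ U
        vn = noise_imgs.reshape(len(noise_imgs), -1) @ U
        dv = vs.mean(axis=0) - vn.mean(axis=0)   # mean channel-output difference
        S = 0.5 * (np.cov(vs.T) + np.cov(vn.T))  # pooled channel covariance
        w = np.linalg.solve(S, dv)               # Hotelling template in channel space
        return float(np.sqrt(dv @ w))            # detectability index d'
    ```

    With internal noise added (as in the study), extra variance terms would be injected into `S` or into the decision variable before computing d′.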

  20. "Teaching students how to wear their Computer"

    DEFF Research Database (Denmark)

    Guglielmi, Michel; Johannesen, Hanne Louise

    2005-01-01

    This paper presents the goal, results and methodology of a workshop run in collaboration with Visual Culture (humanities), University of Copenhagen, the Danish Academy of Design in Copenhagen and Media Lab Aalborg, University of Aalborg. The workshop was related to a design competition and set out to address this question through the angle of what we called ‘Physical Computing’; we asked ourselves and the students whether new fields like ‘tangible media’ or ‘wearable computers’ can contribute to improvements of life, and whose life improvement we are aiming for. Computers are a ubiquitous part... Through the workshop the students were encouraged to disrupt the myth of how a computer should be used and to focus on human-human interaction (HHI) through the computer rather than human-computer interaction (HCI). The physical computing approach furthermore offered a unique opportunity to break down...

  1. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

    Science.gov (United States)

    Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

    2017-10-01

    Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

  2. A computer model of the biosphere, to estimate stochastic and non-stochastic effects of radionuclides on humans

    International Nuclear Information System (INIS)

    Laurens, J.M.

    1985-01-01

    A computer code was written to model food chains in order to estimate the internal and external doses, for stochastic and non-stochastic effects, to humans (adults and infants). Results are given for 67 radionuclides, for unit concentration in water (1 Bq/L) and in the atmosphere (1 Bq/m³).

  3. A computer-simulated liver phantom (virtual liver phantom) for multidetector computed tomography evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Funama, Yoshinori [Kumamoto University, Department of Radiological Sciences, School of Health Sciences, Kumamoto (Japan); Awai, Kazuo; Nakayama, Yoshiharu; Liu, Da; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Miyazaki, Osamu; Goto, Taiga [Hitachi Medical Corporation, Tokyo (Japan); Hori, Shinichi [Gate Tower Institute of Image Guided Therapy, Osaka (Japan)

    2006-04-15

    The purpose of this study was to develop a computer-simulated liver phantom for hepatic CT studies. A computer-simulated liver phantom was mathematically constructed on a computer workstation. The computer-simulated phantom was calibrated using real CT images acquired by an actual four-detector CT. We added an inhomogeneous texture to the simulated liver by referring to CT images of chronically damaged human livers. The mean CT number of the simulated liver was 60 HU, and we added numerous 5- to 10-mm structures with 60±10 HU/mm. To mimic liver tumors we added nodules measuring 8, 10, and 12 mm in diameter with CT numbers of 60±10, 60±15, and 60±20 HU. Five radiologists visually evaluated the similarity of the texture of the computer-simulated liver phantom and a real human liver to confirm the appropriateness of the virtual liver images, using a five-point scale. The total score was 44 for two radiologists, and 42, 41, and 39 for one radiologist each. They judged the textures of the virtual liver to be comparable to those of a human liver. Our computer-simulated liver phantom is a promising tool for the evaluation of the image quality and diagnostic performance of hepatic CT imaging. (orig.)
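    The construction described (uniform background at the liver's mean CT number, random texture structures, low-contrast nodules) can be sketched in a few lines. This toy 2D version is only an illustration of the idea; the study's phantom is 3D, more elaborate, and calibrated against real scanner data, and all sizes here are in pixels rather than millimetres.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical 2D slice of a simulated liver: 60 HU background plus
    # scattered texture structures and one low-contrast nodule.
    size = 256
    phantom = np.full((size, size), 60.0)        # mean liver CT number, HU

    # scatter small blob-like texture elements (stand-in for the 5- to 10-mm
    # structures described in the abstract)
    for _ in range(300):
        cy, cx = rng.integers(10, size - 10, size=2)
        r = rng.integers(3, 6)                   # blob radius in pixels
        y, x = np.ogrid[:size, :size]
        mask = (y - cy)**2 + (x - cx)**2 <= r**2
        phantom[mask] += rng.normal(0.0, 10.0)   # texture amplitude ~±10 HU

    # add one nodule with 10 HU negative contrast (analogous to the
    # 60-10 HU nodules used to mimic tumors)
    y, x = np.ogrid[:size, :size]
    nodule = (y - 128)**2 + (x - 128)**2 <= 8**2
    phantom[nodule] = 50.0
    ```

    Feeding such a phantom through a CT simulator (projection, noise, reconstruction) would then yield the images actually evaluated by readers.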

  4. Accuracy of computer-guided implantation in a human cadaver model.

    Science.gov (United States)

    Yatzkair, Gustavo; Cheng, Alice; Brodie, Stan; Raviv, Eli; Boyan, Barbara D; Schwartz, Zvi

    2015-10-01

    To examine the accuracy of computer-guided implantation using a human cadaver model with reduced experimental variability. Twenty-eight (28) dental implants representing 12 clinical cases were placed in four cadaver heads using a static guided implantation template. All planning and surgeries were performed by one clinician. All radiographs and measurements were performed by two examiners. The distance of the implants from buccal and lingual bone and mesial implant or tooth was analyzed at the apical and coronal levels, and measurements were compared to the planned values. No significant differences were seen between planned and implanted measurements. Average deviation of an implant from its planning radiograph was 0.8 mm, which is within the range of variability expected from CT analysis. Guided implantation can be used safely with a margin of error of 1 mm. © 2014 The Authors. Clinical Oral Implants Research Published by John Wiley & Sons Ltd.

  5. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  6. Three-dimensional computer-aided human factors engineering analysis of a grafting robot.

    Science.gov (United States)

    Chiu, Y C; Chen, S; Wu, G J; Lin, Y H

    2012-07-01

    The objective of this research was to conduct a human factors engineering analysis of a grafting robot design using computer-aided 3D simulation technology. A prototype tubing-type grafting robot for fruits and vegetables was the subject of a series of case studies. To facilitate the incorporation of human models into the operating environment of the grafting robot, I-DEAS graphic software was applied to establish individual models of the grafting robot in line with Jack ergonomic analysis. Six human models (95th percentile, 50th percentile, and 5th percentile by height for both males and females) were employed to simulate the operating conditions and working postures in a real operating environment. The lower back and upper limb stresses of the operators were analyzed using the lower back analysis (LBA) and rapid upper limb assessment (RULA) functions in Jack. The experimental results showed that if a leg space is introduced under the robot, the operator can sit closer to the robot, which reduces the operator's level of lower back and upper limb stress. The proper environmental layout for Taiwanese operators, for minimum levels of lower back and upper limb stress, is to set the grafting operation 23.2 cm away from the operator at a height of 85 cm and with 45 cm between the rootstock and scion units.

  7. Institutionalizing human-computer interaction for global health.

    Science.gov (United States)

    Gulliksen, Jan

    2017-06-01

    Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses and new movements in the society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and in what way the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit the contextual needs, the need for localization and for the development of new digital innovations. The research methodology is mostly qualitative following an action research paradigm in which the actual change process that the digitalization is evoking is equally important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing the society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world. But it needs to be developed based on local practices, it needs international support and to not be limited by any technological constraints. Particularly digitalization to support global health requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable.

  8. Histogram analysis for age change of human lung with computed tomography

    International Nuclear Information System (INIS)

    Shirabe, Ichiju

    1990-01-01

    In order to evaluate physiological changes of the normal lung with aging by computed tomography (CT), the peak position (PP) and full width at half maximum (FWHM) of the CT histogram were studied in 77 normal human lungs. Above 30 years of age, PP tended to be found at lower attenuation values with advancing age, yielding the following equation: CT attenuation value of PP = -0.87 × age - 815. The peak position shifted to the range of higher CT attenuation in the 30's. FWHM did not change with advancing age. There were no differences in peak value and FWHM among the upper, middle and lower lung fields. In this study, physiological changes of the lung were evaluated quantitatively. Furthermore, this study is considered to be useful for diagnosis and treatment of lung diseases. (author)
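    Both histogram measures are easy to extract from a set of lung CT numbers: PP is the attenuation at the histogram maximum, and FWHM is the width of the region at or above half that maximum. A sketch on synthetic voxel values (the -850 HU mean, 40 HU spread, and 5 HU bin width are illustrative assumptions, not the study's parameters):

    ```python
    import numpy as np

    def histogram_peak_fwhm(hu_values, bins=np.arange(-1000, -400, 5)):
        # CT-number histogram of lung voxels; returns peak position (HU)
        # and full width at half maximum (HU)
        counts, edges = np.histogram(hu_values, bins=bins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        peak_idx = int(np.argmax(counts))
        half = counts[peak_idx] / 2.0
        above = np.where(counts >= half)[0]     # bins at or above half maximum
        fwhm = centers[above[-1]] - centers[above[0]]
        return centers[peak_idx], fwhm

    rng = np.random.default_rng(0)
    lung = rng.normal(-850, 40, 100_000)        # synthetic "normal lung" voxels
    pp, fwhm = histogram_peak_fwhm(lung)
    ```

    With real data, PP computed this way per subject could then be regressed against age to reproduce a relation of the form PP = -0.87 × age - 815.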

  9. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  10. Computationally Modeling Interpersonal Trust

    Directory of Open Access Journals (Sweden)

    Jin Joo eLee

    2013-12-01

    Full Text Available We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind’s readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to incorporate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust.
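    The HMM comparison described above rests on a standard likelihood computation: a sequence of observed cues is scored under models fit at different trust levels, and the higher-scoring model labels the sequence. A minimal discrete-observation forward algorithm illustrates this (the cue coding, state count, and all probabilities below are invented for the example; the paper's features and learned models differ):

    ```python
    import numpy as np

    def forward_log_likelihood(obs, pi, A, B):
        """Log-likelihood of a discrete observation sequence under an HMM.

        obs: sequence of observation indices
        pi:  (S,) initial state probabilities
        A:   (S, S) transitions, A[i, j] = P(next state j | state i)
        B:   (S, O) emissions, B[s, o] = P(observation o | state s)
        """
        alpha = pi * B[:, obs[0]]
        log_p = np.log(alpha.sum())
        alpha = alpha / alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]  # predict, then weight by emission
            c = alpha.sum()                # scale to avoid underflow
            log_p += np.log(c)
            alpha = alpha / c
        return log_p

    # Two toy models over 3 coded nonverbal cues (indices 0-2); a sequence
    # is labeled with whichever model scores it higher.
    pi = np.array([0.5, 0.5])
    A_trust = np.array([[0.8, 0.2], [0.3, 0.7]])
    B_trust = np.array([[0.7, 0.2, 0.1], [0.2, 0.3, 0.5]])
    A_distrust = np.array([[0.5, 0.5], [0.5, 0.5]])
    B_distrust = np.array([[0.1, 0.2, 0.7], [0.3, 0.3, 0.4]])

    seq = [0, 0, 1, 0]
    label = ("trusting" if forward_log_likelihood(seq, pi, A_trust, B_trust)
             > forward_log_likelihood(seq, pi, A_distrust, B_distrust)
             else "distrusting")
    ```

    In practice the model parameters would be learned from annotated interaction data rather than written down by hand.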

  11. Computer users at risk: Health disorders associated with prolonged computer use

    OpenAIRE

    Abida Ellahi; M. Shahid Khalil; Fouzia Akram

    2011-01-01

    In keeping with ISO standards, which emphasize assessing the use of a product, this research aims to assess prolonged use of computers and its effects on human health. The objective of this study was to investigate the association between the extent of computer use (per day) and carpal tunnel syndrome, computer stress syndrome, computer vision syndrome and musculoskeletal problems. The second objective was to investigate the extent of simultaneous occurrence of carpal tunnel syndr...

  12. From Computational Thinking to Computational Empowerment: A 21st Century PD Agenda

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Smith, Rachel Charlotte; Dindler, Christian

    2018-01-01

    We propose computational empowerment as an approach, and a Participatory Design response, to challenges related to the digitalization of society and the emerging need for digital literacy in K12 education. Our approach extends the current focus on computational thinking to include contextual, human-centred and societal challenges and impacts involved in students’ creative and critical engagement with digital technology. Our research is based on the FabLab@School project, in which a PD approach to computational empowerment provided opportunities as well as further challenges for the complex agenda of digital technology in education. We argue that PD has the potential to drive a computational empowerment agenda in education, by connecting political PD with contemporary visions for addressing a future digitalized labor market and society.

  13. Affective Computing used in an imaging interaction paradigm

    DEFF Research Database (Denmark)

    Schultz, Nette

    2003-01-01

    This paper combines affective computing with an imaging interaction paradigm. An imaging interaction paradigm means that human and computer communicate primarily by images. Images evoke emotions in humans, so the computer must be able to behave in an emotionally intelligent way. An affective image selection...

  14. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.

    Science.gov (United States)

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-06-23

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles.
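    Schematically, the two forehead channels (horizontal and vertical) lend themselves to a simple amplitude rule for mapping an EOG epoch to a gaze command. The toy classifier below is not the paper's algorithm; the 50 µV threshold, the signed-peak feature, and the four-direction command set are all assumptions made for illustration.

    ```python
    import numpy as np

    def classify_eye_movement(horizontal, vertical, threshold=50.0):
        """Toy threshold rule on a two-channel EOG epoch (microvolts).

        Picks the signed peak of each channel; below threshold on both
        channels means fixation, otherwise the dominant channel's sign
        gives the direction.
        """
        h = horizontal[np.argmax(np.abs(horizontal))]  # signed peak, H channel
        v = vertical[np.argmax(np.abs(vertical))]      # signed peak, V channel
        if max(abs(h), abs(v)) < threshold:
            return "fixation"
        if abs(h) >= abs(v):
            return "right" if h > 0 else "left"
        return "up" if v > 0 else "down"
    ```

    A real-time system such as the one described would add filtering, drift compensation, and blink handling before any such decision rule.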

  15. HCI in Mobile and Ubiquitous Computing

    OpenAIRE

    椎尾, 一郎; 安村, 通晃; 福本, 雅明; 伊賀, 聡一郎; 増井, 俊之

    2003-01-01

    This paper provides some perspectives on human-computer interaction in mobile and ubiquitous computing. The review covers an overview of ubiquitous computing, mobile computing and wearable computing. It also summarizes HCI topics in these fields, including real-world oriented interfaces, multi-modal interfaces, context awareness and invisible computers. Finally, we discuss killer applications for the coming ubiquitous computing era.

  16. An experimental and computational framework to build a dynamic protein atlas of human cell division

    OpenAIRE

    Kavur, Marina; Kavur, Marina; Kavur, Marina; Ellenberg, Jan; Peters, Jan-Michael; Ladurner, Rene; Martinic, Marina; Kueblbeck, Moritz; Nijmeijer, Bianca; Wachsmuth, Malte; Koch, Birgit; Walther, Nike; Politi, Antonio; Heriche, Jean-Karim; Hossain, M.

    2017-01-01

    Essential biological functions of human cells, such as division, require the tight coordination of the activity of hundreds of proteins in space and time. While live cell imaging is a powerful tool to study the distribution and dynamics of individual proteins after fluorescence tagging, it has not yet been used to map protein networks due to the lack of systematic and quantitative experimental and computational approaches. Using the cell and nuclear boundaries as landmarks, we generated a 4D ...

  17. Computers and Creativity.

    Science.gov (United States)

    Ten Dyke, Richard P.

    1982-01-01

    A traditional question is whether or not computers shall ever think like humans. This question is redirected to a discussion of whether computers shall ever be truly creative. Creativity is defined and a program is described that is designed to creatively complete a series problem in mathematics. (MP)

  18. Is the corticomedullary index valid to distinguish human from nonhuman bones: a multislice computed tomography study.

    Science.gov (United States)

    Rérolle, Camille; Saint-Martin, Pauline; Dedouit, Fabrice; Rousseau, Hervé; Telmon, Norbert

    2013-09-10

    The first step in the identification process of bone remains is to determine whether they are of human or nonhuman origin. This issue may arise when only a fragment of bone is available, as the species of origin is usually easily determined on a complete bone. The present study aims to assess the validity of a morphometric method used by French forensic anthropologists to determine the species of origin: the corticomedullary index (CMI), defined as the ratio of the diameter of the medullary cavity to the total diameter of the bone. We studied the constancy of the CMI from measurements made on computed tomography images (CT scans) of different human bones, and compared our measurements with reference values selected from the literature. The measurements obtained on CT scans at three different sites of 30 human femurs, 24 tibias, and 24 fibulas were compared with one another and with the CMI reference values for humans, pigs, dogs, and sheep. Our results differed significantly from these reference values, with three exceptions: the proximal quarter of the femur and mid-fibular measurements for the human CMI, and the proximal quarter of the tibia for the sheep CMI. Mid-tibial, mid-femoral, and mid-fibular measurements also differed significantly from one another. Only 22.6% of CT scans of human bones were correctly identified as human. We concluded that the CMI is not an effective method for determining the human origin of bone remains. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
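    The CMI the study evaluates is a plain ratio; a minimal sketch of how it could be computed and checked against a species reference interval (the function names and the interval values below are illustrative assumptions, not the paper's published reference data):

    ```python
    def corticomedullary_index(medullary_diameter_mm: float,
                               total_diameter_mm: float) -> float:
        """CMI = medullary cavity diameter / total bone diameter."""
        if total_diameter_mm <= 0 or medullary_diameter_mm < 0:
            raise ValueError("diameters must be positive")
        if medullary_diameter_mm > total_diameter_mm:
            raise ValueError("medullary cavity cannot exceed total diameter")
        return medullary_diameter_mm / total_diameter_mm

    # Hypothetical reference intervals per species/site (NOT the published values).
    REFERENCE_CMI = {
        "human_femur_midshaft": (0.40, 0.60),
        "sheep_tibia_proximal": (0.25, 0.40),
    }

    def matches_reference(cmi: float, species_site: str) -> bool:
        """True if the measured CMI falls inside the reference interval."""
        lo, hi = REFERENCE_CMI[species_site]
        return lo <= cmi <= hi

    cmi = corticomedullary_index(14.0, 28.0)  # 0.5
    ```

    The study's negative result can be read directly off such a scheme: if measured human CMIs frequently fall outside the human interval (only 22.6% matched), the lookup is not a reliable species discriminator.
    
    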

  19. Human Face as human single identity

    OpenAIRE

    Warnars, Spits

    2014-01-01

    A human face, as a physical characteristic, can be used as a unique identity: a computer can recognize a human by transforming the face, with a face-recognition algorithm, into a simple text number that can serve as that person's primary key. Making the human face a single identity for humans will require building a huge worldwide central human face database, where human faces around the world are recorded from time to time and from generation to generation. The database architecture will be divided into human face image ...

  20. Know Your Personal Computer

    Indian Academy of Sciences (India)


  1. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  2. Teaching natural language to computers

    OpenAIRE

    Corneli, Joseph; Corneli, Miriam

    2016-01-01

    "Natural Language," whether spoken and attended to by humans, or processed and generated by computers, requires networked structures that reflect creative processes in semantic, syntactic, phonetic, linguistic, social, emotional, and cultural modules. Being able to produce novel and useful behavior following repeated practice gets to the root of both artificial intelligence and human language. This paper investigates the modalities involved in language-like applications that computers -- and ...

  3. Computational dosimetry for grounded and ungrounded human models due to contact current

    International Nuclear Information System (INIS)

    Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

    2013-01-01

    This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appears not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm². (paper)
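    The closing equation relating contact current, tissue conductivity, and induced field follows from the quasi-static relations J = I/A and E = J/σ; a back-of-the-envelope sketch (the conductivity value below is an assumed round number for soft tissue, not taken from the paper):

    ```python
    def induced_field_v_per_m(current_a: float,
                              cross_section_m2: float,
                              conductivity_s_per_m: float) -> float:
        """E = J / sigma, with current density J = I / A (quasi-static)."""
        current_density = current_a / cross_section_m2   # J in A/m^2
        return current_density / conductivity_s_per_m    # E in V/m

    # 1 mA through a finger cross-section of 1 cm^2 = 1e-4 m^2,
    # assuming an illustrative bulk tissue conductivity of 0.5 S/m:
    e_field = induced_field_v_per_m(1e-3, 1e-4, 0.5)  # ~20 V/m
    ```

    This is exactly why discrepancies concentrate in the extremities: the small cross-section A of a finger drives the local current density, and hence the induced field, up for a fixed injected current.
    
    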

  4. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    Science.gov (United States)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.

  5. Developing human technology curriculum

    Directory of Open Access Journals (Sweden)

    Teija Vainio

    2012-10-01

    Full Text Available During the past ten years, expertise in human-computer interaction has shifted from humans interacting with desktop computers to individual human beings or groups of human beings interacting with embedded or mobile technology. Thus, humans are not only interacting with computers but with technology. Obviously, this shift should be reflected in how we educate human-technology interaction (HTI) experts today and in the future. We tackle this educational challenge first by analysing current Master’s-level education in collaboration with two universities and second, by discussing postgraduate education in the international context. As a result, we identified core studies that should be included in the HTI curriculum. Furthermore, we discuss some practical challenges and new directions for international HTI education.

  6. A collaborative brain-computer interface for improving human performance.

    Directory of Open Access Journals (Sweden)

    Yijun Wang

    Full Text Available Electroencephalogram (EEG)-based brain-computer interfaces (BCIs) have been studied since the 1970s. Currently, the main focus of BCI research lies on clinical use, which aims to provide a new communication channel to patients with motor disabilities to improve their quality of life. However, BCI technology can also be used to improve human performance for normal healthy users. Although this application has been proposed for a long time, little progress has been made in real-world practice due to technical limits of EEG. To overcome the bottleneck of low single-user BCI performance, this study proposes a collaborative paradigm to improve overall BCI performance by integrating information from multiple users. To test the feasibility of a collaborative BCI, this study quantitatively compares the classification accuracies of collaborative and single-user BCIs applied to EEG data collected from 20 subjects in a movement-planning experiment. This study also explores three different methods for fusing and analyzing EEG data from multiple subjects: (1) event-related potential (ERP) averaging, (2) feature concatenating, and (3) voting. In a demonstration system using the voting method, the classification accuracy of predicting movement directions (reaching left vs. reaching right) was enhanced substantially from 66% to 80%, 88%, 93%, and 95% as the number of subjects increased from 1 to 5, 10, 15, and 20, respectively. Furthermore, the decision of reaching direction could be made around 100-250 ms earlier than the subject's actual motor response by decoding the ERP activities arising mainly from the posterior parietal cortex (PPC), which are related to the processing of visuomotor transformation. Taken together, these results suggest that a collaborative BCI can effectively fuse brain activities of a group of people to improve the overall performance of natural human behavior.
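    The voting fusion method can be sketched as a majority vote over per-subject binary classifier decisions (the tie-breaking behaviour here, first-seen label wins, is an implementation choice, not something the abstract specifies):

    ```python
    from collections import Counter

    def majority_vote(decisions):
        """Fuse per-subject binary decisions ('left'/'right') by majority.

        Ties fall to the first-listed label -- an arbitrary choice for
        this sketch; the paper does not specify its tie handling.
        """
        counts = Counter(decisions)
        return counts.most_common(1)[0][0]

    # Five simulated per-subject classifier outputs for one trial:
    votes = ["left", "right", "left", "left", "right"]
    fused = majority_vote(votes)  # 'left'
    ```

    The abstract's accuracy curve (66% alone, 95% with 20 subjects) is the usual ensemble effect: independent, better-than-chance voters make the majority increasingly reliable as the group grows.
    
    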

  7. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  8. Computational analysis of human miRNAs phylogenetics

    African Journals Online (AJOL)

    User

    2011-05-02

    ... including human, chimpanzee, orangutan, and macaque, and find that miRNAs were ...

  9. Parallel Computing for Brain Simulation.

    Science.gov (United States)

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have allowed the creation of the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models, and hybrid models. This review includes the current applications of these works, as well as future trends. It focuses on works that pursue advances in Neuroscience and on others that seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  10. Computer-aided diagnosis in phase contrast imaging X-ray computed tomography for quantitative characterization of ex vivo human patellar cartilage.

    Science.gov (United States)

    Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Glaser, Christian; Wismuller, Axel

    2013-10-01

    Visualization of ex vivo human patellar cartilage matrix through phase contrast imaging X-ray computed tomography (PCI-CT) has been previously demonstrated. Such studies revealed osteoarthritis-induced changes to chondrocyte organization in the radial zone. This study investigates the application of texture analysis to characterizing such chondrocyte patterns in the presence and absence of osteoarthritic damage. Texture features derived from Minkowski functionals (MF) and gray-level co-occurrence matrices (GLCM) were extracted from 842 regions of interest (ROI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. These texture features were subsequently used in a machine learning task with support vector regression to classify ROIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver operating characteristic curve (AUC). The best classification performance was observed with the MF features "Perimeter" (AUC: 0.94 ± 0.08) and "Euler characteristic" (AUC: 0.94 ± 0.07), and the GLCM-derived feature "Correlation" (AUC: 0.93 ± 0.07). These results suggest that such texture features can provide a detailed characterization of the chondrocyte organization in the cartilage matrix, enabling classification of cartilage as healthy or osteoarthritic with high accuracy.
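    The GLCM "Correlation" feature the study ranks highly can be sketched from scratch: build a co-occurrence matrix for one pixel offset, then take the normalized covariance of its row/column indices. The quantization, offsets, and averaging the paper actually used are not specified here, so treat this as an assumption-laden illustration:

    ```python
    import numpy as np

    def glcm(image: np.ndarray, levels: int) -> np.ndarray:
        """Symmetric, normalized gray-level co-occurrence matrix
        for the single horizontal offset (0, 1)."""
        m = np.zeros((levels, levels), dtype=float)
        for row in image:
            for a, b in zip(row[:-1], row[1:]):
                m[a, b] += 1
                m[b, a] += 1  # symmetrize
        return m / m.sum()

    def glcm_correlation(p: np.ndarray) -> float:
        """Haralick 'Correlation': covariance of (i, j) under p,
        normalized by the marginal standard deviations.
        Assumes the image is not constant (sd > 0)."""
        levels = p.shape[0]
        i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
        mu_i, mu_j = (p * i).sum(), (p * j).sum()
        sd_i = np.sqrt((p * (i - mu_i) ** 2).sum())
        sd_j = np.sqrt((p * (j - mu_j) ** 2).sum())
        return float((p * (i - mu_i) * (j - mu_j)).sum() / (sd_i * sd_j))

    # Toy 4-level "image" with blocky structure -> positive correlation.
    img = np.array([[0, 0, 1, 1],
                    [0, 0, 1, 1],
                    [2, 2, 3, 3],
                    [2, 2, 3, 3]])
    corr = glcm_correlation(glcm(img, levels=4))
    ```

    A production pipeline would typically use a library implementation (e.g. scikit-image's co-occurrence utilities) and average the feature over several offsets and directions.
    
    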

  11. Distribution of recombination hotspots in the human genome--a comparison of computer simulations with real data.

    Directory of Open Access Journals (Sweden)

    Dorota Mackiewicz

    Full Text Available Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After sufficiently long time of simulations, the structure of chromosomes reached a dynamic equilibrium, in which number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar.

  12. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. A Computer Clone of Human Expert for Mobility Management Scheme (E-MMS): Step toward Green Transportation

    Science.gov (United States)

    Resdiansyah; O. K Rahmat, R. A.; Ismail, A.

    2018-03-01

    Green transportation refers to sustainable transport that has the least social and environmental impact while still being able to supply energy sources globally; it includes the deployment of non-motorized transport strategies to promote healthy lifestyles, also known as a Mobility Management Scheme (MMS). As construction of road infrastructure alone cannot solve the problem of congestion, past research has shown that MMS is an effective measure to mitigate congestion and to achieve green transportation. MMS consists of different strategies and policies that are subdivided into categories according to how they are able to influence travel behaviour. Appropriate selection of mobility strategies will ensure their effectiveness in mitigating congestion problems. Nevertheless, determining appropriate strategies requires a human expert and depends on a number of success factors. This research has successfully developed a computer clone system based on a human expert, called E-MMS. The process of knowledge acquisition for MMS strategies, and the subsequent process of strategy selection, has been encoded in a knowledge-based system using an expert system shell. The newly developed computer cloning system was successfully verified, validated and evaluated (VV&E) by comparing its output with the recommendations of a real transportation expert.

  14. A Computer Simulation Approach to the Study of Effects of Deck Surface Compliance on Initial Impact Impulse Forces in Human Gait

    National Research Council Canada - National Science Library

    Bretz, David

    2000-01-01

    .... One proposal for reducing knee disorders is to install more compliant decking. The goal of this thesis is to develop a computer model of the human gait that estimates the transarticulation forces...

  15. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

    CERN Document Server

    Tan, Desney S

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

  16. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science, and Internet of Things (IoT) devices. The proposed model for the human brain assumes a strong similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving, and eliminating adversaries. The brain pursues these goals, and in addition the being's movement, actions, and speech are sustained by the five vital senses and equilibrium. The chess game strategy helps us understand the human brain better and replicate it more easily in the proposed 'Software and Hardware' (SAH) Model.

  17. The challenge of computer mathematics.

    Science.gov (United States)

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on, the famous results of Gödel and Turing. In this way, statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations deal with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. There are also very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
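    The define/compute/prove loop described here, where the human supplies definitions and proofs and the machine checks them, looks like this in a modern proof assistant (an illustrative Lean 4 fragment, not drawn from the paper):

    ```lean
    -- Human-supplied definition: the checker verifies it is well formed.
    def double (n : Nat) : Nat := n + n

    -- Human-supplied proof: `rfl` succeeds because both sides
    -- compute to the same value, so the machine accepts the theorem.
    theorem double_eq (n : Nat) : double n = n + n := rfl

    -- Computation inside the same system:
    #eval double 21  -- 42
    ```

    The same checker validates all three artifacts: the definition's well-formedness, the proof's correctness, and the computation's result, which is exactly the integration the abstract describes.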

  18. Computer-assisted machine-to-human protocols for authentication of a RAM-based embedded system

    Science.gov (United States)

    Idrissa, Abdourhamane; Aubert, Alain; Fournel, Thierry

    2012-06-01

    Mobile readers used for optical identification of manufactured products can be tampered with in different ways: with a hardware Trojan or by powering up with fake configuration data. How can a human verifier authenticate the reader to be handled for goods verification? In this paper, two cryptographic protocols are proposed to achieve the verification of a RAM-based system through a trusted auxiliary machine. Such a system is assumed to be composed of a RAM memory and a secure block (in practice an FPGA or a configurable microcontroller). The system is connected to an input/output interface and contains a Non-Volatile Memory where the configuration data are stored. Here, except for the secure block, all the blocks are exposed to attacks. At the registration stage of the first protocol, the MAC of both the secret and the configuration data, denoted M0, is computed by the mobile device without saving it, and then transmitted to the user in a secure environment. At the verification stage, the reader, which is challenged with nonces, sends MACs/HMACs of both the nonces and the MAC M0 (to be recomputed), keyed with the secret. These responses are verified by the user through a trusted auxiliary MAC computation unit. Here the verifier does not need to track a (long) list of challenge/response pairs. This makes the protocol tractable for a human verifier, as their participation in the authentication process is increased. In counterpart, the secret has to be shared with the auxiliary unit. This constraint is relaxed in a second protocol directly derived from Fiat-Shamir's scheme.
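    The first protocol's two stages can be sketched with standard HMAC primitives (Python's hmac/hashlib stand in for the paper's unspecified MAC; key distribution and challenge transport are simplified assumptions):

    ```python
    import hashlib
    import hmac
    import secrets

    def mac(key: bytes, *parts: bytes) -> bytes:
        """Keyed MAC over the concatenated parts (HMAC-SHA256 here)."""
        return hmac.new(key, b"".join(parts), hashlib.sha256).digest()

    # --- Registration stage (secure environment) ---
    secret = secrets.token_bytes(32)      # also shared with the trusted MAC unit
    config = b"reader configuration data"
    m0 = mac(secret, config)              # M0 = MAC(secret, config), handed to the user

    # --- Verification stage ---
    nonce = secrets.token_bytes(16)       # challenge sent to the reader
    # The genuine reader recomputes M0 from its stored config and answers:
    response = mac(secret, nonce, m0)

    # The trusted auxiliary MAC unit verifies on the user's behalf:
    expected = mac(secret, nonce, mac(secret, config))
    authentic = hmac.compare_digest(response, expected)  # True for a genuine reader
    ```

    A reader booted with fake configuration data cannot recompute M0 and therefore fails the challenge, which is the property the protocol relies on; the cost, as the abstract notes, is that the secret must also live in the auxiliary unit.
    
    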

  19. Dynamics of Information as Natural Computation

    Directory of Open Access Journals (Sweden)

    Gordana Dodig Crnkovic

    2011-08-01

    Full Text Available Processes considered to render information dynamics have been studied in, among others: questions and answers, observations, communication, learning, belief revision, logical inference, game-theoretic interactions, and computation. This article puts the computational approaches into the broader context of natural computation, where information dynamics is found not only in human communication and computational machinery but throughout the entirety of nature. Information is understood as representing the world (reality) as an informational web for a cognizing agent, while information dynamics (information processing, computation) realizes physical laws through which all changes of informational structures unfold. Computation as it appears in the natural world is more general than the human process of calculation modeled by the Turing machine. Natural computing is epitomized through the interactions of concurrent, in general asynchronous, computational processes, which are adequately represented by what Abramsky names "the second-generation models of computation" [1], which we argue to be the most general representation of information dynamics.

  20. Labels, Cognomes and Cyclic Computation: An Ethological Perspective

    Directory of Open Access Journals (Sweden)

    Elliot eMurphy

    2015-06-01

    Full Text Available For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

  1. Labels, cognomes, and cyclic computation: an ethological perspective.

    Science.gov (United States)

    Murphy, Elliot

    2015-01-01

    For the past two decades, it has widely been assumed by linguists that there is a single computational operation, Merge, which is unique to language, distinguishing it from other cognitive domains. The intention of this paper is to progress the discussion of language evolution in two ways: (i) survey what the ethological record reveals about the uniqueness of the human computational system, and (ii) explore how syntactic theories account for what ethology may determine to be human-specific. It is shown that the operation Label, not Merge, constitutes the evolutionary novelty which distinguishes human language from non-human computational systems; a proposal lending weight to a Weak Continuity Hypothesis and leading to the formation of what is termed Computational Ethology. Some directions for future ethological research are suggested.

  2. South African sign language human-computer interface in the context of the national accessibility portal

    CSIR Research Space (South Africa)

    Olivrin, GJ

    2006-02-01

    Full Text Available (for example, between a deaf person who can sign and an able person, or a person with a different disability, who cannot sign). METHODOLOGY: A signing avatar is set up to work together with a chatterbot. The chatterbot is a natural language dialogue interface... Replies are then offered in sign language as they are interpreted by a signing avatar, a living character that can reproduce human-like gestures and expressions. To make South African Sign Language (SASL) available digitally, computational models of the language...

  3. Integrative approaches to computational biomedicine

    Science.gov (United States)

    Coveney, Peter V.; Diaz-Zuccarini, Vanessa; Graf, Norbert; Hunter, Peter; Kohl, Peter; Tegner, Jesper; Viceconti, Marco

    2013-01-01

    The new discipline of computational biomedicine is concerned with the application of computer-based techniques and particularly modelling and simulation to human health. Since 2007, this discipline has been synonymous, in Europe, with the name given to the European Union's ambitious investment in integrating these techniques with the eventual aim of modelling the human body as a whole: the virtual physiological human. This programme and its successors are expected, over the next decades, to transform the study and practice of healthcare, moving it towards the priorities known as ‘4P's’: predictive, preventative, personalized and participatory medicine.

  4. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  5. Computer-Mediated Communication Systems

    Directory of Open Access Journals (Sweden)

    Bin Yu

    2011-10-01

    Full Text Available The essence of communication is to exchange and share information. Computers provide a new medium for human communication. The CMC system, composed of humans and computers, absorbs and then extends the advantages of all earlier formats of communication, embracing the instant interaction of oral communication, the abstract logic of printed dissemination, and the vivid images of film and television. It also creates a series of new communication formats, such as hypertext and multimedia, which are new methods of organizing information and cross-space patterns of message delivery. Benefiting from the continuous development of techniques and mechanisms, computer-mediated communication makes the dream of transmitting information across space and time come true, which will certainly have a great impact on our social lives.

  6. Neural computations mediating one-shot learning in the human brain.

    Directory of Open Access Journals (Sweden)

    Sang Wan Lee

    2015-04-01

    Full Text Available Incremental learning, in which new knowledge is acquired gradually through trial and error, can be distinguished from one-shot learning, in which the brain learns rapidly from only a single pairing of a stimulus and a consequence. Very little is known about how the brain transitions between these two fundamentally different forms of learning. Here we test a computational hypothesis that uncertainty about the causal relationship between a stimulus and an outcome induces rapid changes in the rate of learning, which in turn mediates the transition between incremental and one-shot learning. By using a novel behavioral task in combination with functional magnetic resonance imaging (fMRI data from human volunteers, we found evidence implicating the ventrolateral prefrontal cortex and hippocampus in this process. The hippocampus was selectively "switched" on when one-shot learning was predicted to occur, while the ventrolateral prefrontal cortex was found to encode uncertainty about the causal association, exhibiting increased coupling with the hippocampus for high-learning rates, suggesting this region may act as a "switch," turning on and off one-shot learning as required.
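The computational hypothesis in this record, that uncertainty about the causal association modulates the learning rate, can be illustrated with a toy model. The sketch below is not the authors' model; it is a generic Pearce-Hall-style update in which an associability term, tracking recent unsigned prediction error, sets the learning rate, so a single surprising pairing produces a near one-shot jump while well-predicted outcomes are learned incrementally. The value of `gamma` is an arbitrary assumption.

```python
def update(value, outcome, associability, gamma=0.6):
    """One trial of a Pearce-Hall-style learner.

    value:         current estimate of the stimulus-outcome association
    outcome:       observed outcome on this trial (0 or 1)
    associability: running estimate of uncertainty, used as the learning rate
    gamma:         how fast associability tracks recent surprise (assumed value)
    """
    error = outcome - value
    value = value + associability * error   # large error + high uncertainty -> large step
    associability = (1 - gamma) * associability + gamma * abs(error)
    return value, associability

# Starting maximally uncertain, a single surprising pairing produces a
# one-shot-like jump; later, well-predicted trials change the estimate little.
v, a = 0.0, 1.0
v, a = update(v, 1.0, a)
print(v)  # the estimate reaches 1.0 after a single pairing
```

In this framing, the hippocampal "switch" described above would correspond to gating the high-associability regime on and off.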

  7. Emission computed tomography

    International Nuclear Information System (INIS)

    Budinger, T.F.; Gullberg, G.T.; Huesman, R.H.

    1979-01-01

    This chapter is devoted to the methods of computer assisted tomography for determination of the three-dimensional distribution of gamma-emitting radionuclides in the human body. The major applications of emission computed tomography are in biological research and medical diagnostic procedures. The objectives of these procedures are to make quantitative measurements of in vivo biochemical and hemodynamic functions

  8. Human activity recognition and prediction

    CERN Document Server

    2016-01-01

    This book provides a unique view of human activity recognition, especially fine-grained human activity structure learning, human-interaction recognition, RGB-D data based action recognition, temporal decomposition, and causality learning in unconstrained human activity videos. The techniques discussed give readers tools that provide a significant improvement over existing methodologies of video content understanding by taking advantage of activity recognition. It links multiple popular research fields in computer vision, machine learning, human-centered computing, human-computer interaction, image classification, and pattern recognition. In addition, the book includes several key chapters covering multiple emerging topics in the field. Contributed by top experts and practitioners, the chapters present key topics from different angles and blend both methodology and application, composing a solid overview of human activity recognition techniques.

  9. Resistance to change and resurgence in humans engaging in a computer task.

    Science.gov (United States)

    Kuroda, Toshikazu; Cançado, Carlos R X; Podlesnik, Christopher A

    2016-04-01

    The relation between persistence, as measured by resistance to change, and resurgence has been examined with nonhuman animals but not systematically with humans. The present study examined persistence and resurgence with undergraduate students engaging in a computer task for points exchangeable for money. In Phase 1, a target response was maintained on a multiple variable-interval (VI) 15-s (Rich) VI 60-s (Lean) schedule of reinforcement. In Phase 2, the target response was extinguished while an alternative response was reinforced at equal rates in both schedule components. In Phase 3, the target and the alternative responses were extinguished. In an additional test of persistence (Phase 4), target responding was reestablished as in Phase 1 and then disrupted by access to videos in both schedule components. In Phases 2 and 4, target responding was more persistent in the Rich than in the Lean component. Also, resurgence generally was greater in the Rich than in the Lean component in Phase 3. The present findings with humans extend the generality of those obtained with nonhuman animals showing that higher reinforcement rates produce both greater persistence and resurgence, and suggest that common processes underlie response persistence and relapse. Copyright © 2016 Elsevier B.V. All rights reserved.
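For readers unfamiliar with the schedules named above: a variable-interval (VI) schedule arms reinforcement at unpredictable times averaging the stated interval. The sketch below is illustrative only, not the study's software; the exponential inter-arming distribution and steady one-response-per-second behavior are assumptions. It shows why the VI 15-s "Rich" component delivers more reinforcers than the VI 60-s "Lean" component over the same session.

```python
import random

def simulate_vi(mean_interval_s, session_s, response_period_s=1.0, seed=0):
    """Count reinforcers earned on a VI schedule with steady responding.

    Reinforcement is armed after exponentially distributed waits averaging
    mean_interval_s, and is collected by the next response (one per period).
    """
    rng = random.Random(seed)
    t, reinforcers = 0.0, 0
    next_armed = rng.expovariate(1.0 / mean_interval_s)
    while t < session_s:
        t += response_period_s                 # one response per tick
        if t >= next_armed:                    # a reinforcer was waiting
            reinforcers += 1
            next_armed = t + rng.expovariate(1.0 / mean_interval_s)
    return reinforcers

rich = simulate_vi(15, 3600)   # VI 15-s "Rich" component, 1-hour session
lean = simulate_vi(60, 3600)   # VI 60-s "Lean" component, 1-hour session
print(rich > lean)             # the Rich component earns more reinforcers
```

The roughly fourfold difference in obtained reinforcement rate is what the study links to greater persistence and resurgence in the Rich component.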

  10. ORION: a computer code for evaluating environmental concentrations and dose equivalent to human organs or tissue from airborne radionuclides

    International Nuclear Information System (INIS)

    Shinohara, K.; Nomura, T.; Iwai, M.

    1983-05-01

    The computer code ORION has been developed to evaluate the environmental concentrations and the dose equivalent to human organs or tissue from air-borne radionuclides released from multiple nuclear installations. The modified Gaussian plume model is applied to calculate the dispersion of the radionuclide. Gravitational settling, dry deposition, precipitation scavenging and radioactive decay are considered to be the causes of depletion and deposition on the ground or on vegetation. ORION is written in the FORTRAN IV language and can be run on IBM 360, 370, 303X, 43XX and FACOM M-series computers. 8 references, 6 tables
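ORION's specific modifications to the plume model are not given in this record; as a hedged illustration, the standard Gaussian plume equation on which such dispersion codes are based can be sketched as below. The dispersion parameters sigma_y and sigma_z would normally come from atmospheric stability-class tables; here they, and all the numeric inputs, are assumed values for illustration.

```python
import math

def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
    """Standard Gaussian plume concentration (activity per m^3) at a receptor.

    Q: release rate (e.g. Bq/s), u: wind speed (m/s), H: effective release
    height (m), sigma_y/sigma_z: dispersion parameters (m) at the receptor's
    downwind distance, y: crosswind offset (m), z: receptor height (m).
    The second vertical term models reflection of the plume at the ground.
    """
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative numbers: ground-level concentration on and off the centerline.
on_axis = plume_concentration(Q=1e9, u=5.0, sigma_y=80.0, sigma_z=40.0,
                              y=0.0, z=0.0, H=50.0)
off_axis = plume_concentration(Q=1e9, u=5.0, sigma_y=80.0, sigma_z=40.0,
                               y=100.0, z=0.0, H=50.0)
print(on_axis > off_axis)  # concentration falls off away from the centerline
```

Depletion terms (settling, deposition, scavenging, decay), which ORION adds on top of this kernel, are omitted from the sketch.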

  11. Computational Modeling of Space Physiology

    Science.gov (United States)

    Lewandowski, Beth E.; Griffin, Devon W.

    2016-01-01

    The Digital Astronaut Project (DAP), within NASA's Human Research Program, develops and implements computational modeling for use in the mitigation of human health and performance risks associated with long duration spaceflight. Over the past decade, DAP developed models to provide insights into space flight related changes to the central nervous system, cardiovascular system and the musculoskeletal system. Examples of the models and their applications include biomechanical models applied to advanced exercise device development, bone fracture risk quantification for mission planning, accident investigation, bone health standards development, and occupant protection. The International Space Station (ISS), in its role as a testing ground for long duration spaceflight, has been an important platform for obtaining human spaceflight data. DAP has used preflight, in-flight and post-flight data from short and long duration astronauts for computational model development and validation. Examples include preflight and post-flight bone mineral density data, muscle cross-sectional area, and muscle strength measurements. Results from computational modeling supplement space physiology research by informing experimental design. Using these computational models, DAP personnel can easily identify both important factors associated with a phenomenon and areas where data are lacking. This presentation will provide examples of DAP computational models, the data used in model development and validation, and applications of the model.

  12. Machine Understanding of Human Behavior

    NARCIS (Netherlands)

    Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas

    2007-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

  13. The Dimensions of the Orbital Cavity Based on High-Resolution Computed Tomography of Human Cadavers

    DEFF Research Database (Denmark)

    Felding, Ulrik Ascanius; Bloch, Sune Land; Buchwald, Christian von

    2016-01-01

    for surface area. To the authors' knowledge, this study is the first to have measured the entire surface area of the orbital cavity. The volume and surface area of the orbital cavity were estimated in computed tomography scans of 11 human cadavers using unbiased stereological sampling techniques. The mean (± SD......) total volume and total surface area of the orbital cavities were 24.27 ± 3.88 cm³ and 32.47 ± 2.96 cm², respectively. There was no significant difference in volume (P = 0.315) or surface area (P = 0.566) between the 2 orbital cavities. The stereological technique proved to be a robust and unbiased method...... that may be used as a gold standard for comparison with automated computer software. Future imaging studies in blow-out fracture patients may be based on individual and relative calculation involving both herniated volume and fractured surface area in relation to the total volume and surface area...

  14. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
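The record does not reproduce the INL schema itself, so the element and attribute names below (`procedure`, `step`, `type`) are hypothetical. The sketch only illustrates the idea described above: each step's attributes determine the kind of functionality the CBPS generates for it, without reprogramming the system.

```python
import xml.etree.ElementTree as ET

# Hypothetical procedure fragment; the real schema is not given in the record.
PROCEDURE_XML = """
<procedure id="OP-123">
  <step id="1" type="instruction">Verify pump A is running.</step>
  <step id="2" type="decision">Is discharge pressure above 2 MPa?</step>
  <step id="3" type="input">Record discharge pressure.</step>
</procedure>
"""

def render_steps(xml_text):
    """Yield (step id, behavior) pairs a CBPS could generate per step type."""
    behaviors = {"instruction": "display text",
                 "decision": "request a yes/no decision",
                 "input": "accept a value from the user"}
    root = ET.fromstring(xml_text)
    for step in root.findall("step"):
        yield step.get("id"), behaviors[step.get("type")]

for step_id, behavior in render_steps(PROCEDURE_XML):
    print(step_id, behavior)
```

An XSD layered over such a vocabulary would then validate procedure files before the CBPS consumes them.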

  15. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  16. Diagnostic Accuracy of Periapical Radiography and Cone-beam Computed Tomography in Identifying Root Canal Configuration of Human Premolars.

    Science.gov (United States)

    Sousa, Thiago Oliveira; Haiter-Neto, Francisco; Nascimento, Eduarda Helena Leandro; Peroni, Leonardo Vieira; Freitas, Deborah Queiroz; Hassan, Bassam

    2017-07-01

    The aim of this study was to assess the diagnostic accuracy of periapical radiography (PR) and cone-beam computed tomographic (CBCT) imaging in the detection of the root canal configuration (RCC) of human premolars. PR and CBCT imaging of 114 extracted human premolars were evaluated by 2 oral radiologists. RCC was recorded according to Vertucci's classification. Micro-computed tomographic imaging served as the gold standard to determine RCC. Accuracy, sensitivity, specificity, and predictive values were calculated. The Friedman test compared both PR and CBCT imaging with the gold standard. CBCT imaging showed higher values for all diagnostic tests compared with PR. Accuracy was 0.55 and 0.89 for PR and CBCT imaging, respectively. There was no difference between CBCT imaging and the gold standard, whereas PR differed from both CBCT and micro-computed tomographic imaging (P < .0001). CBCT imaging was more accurate than PR for evaluating different types of RCC individually. Canal configuration types III, VII, and "other" were poorly identified on CBCT imaging with a detection accuracy of 50%, 0%, and 43%, respectively. With PR, all canal configurations except type I were poorly visible. PR presented low performance in the detection of RCC in premolars, whereas CBCT imaging showed no difference compared with the gold standard. Canals with complex configurations were less identifiable using both imaging methods, especially PR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
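The diagnostic measures reported above follow the standard confusion-matrix definitions; the counts in the sketch below are illustrative only, since the study's raw confusion matrix is not given in the record.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity and predictive values from counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative counts only, chosen to show the computation.
m = diagnostic_metrics(tp=45, fp=5, tn=44, fn=6)
print(m["accuracy"])  # 0.89 for these illustrative counts
```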

  17. Computer-aided diagnosis for phase-contrast X-ray computed tomography: quantitative characterization of human patellar cartilage with high-dimensional geometric features.

    Science.gov (United States)

    Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Glaser, Christian; Wismüller, Axel

    2014-02-01

    Phase-contrast computed tomography (PCI-CT) has shown tremendous potential as an imaging modality for visualizing human cartilage with high spatial resolution. Previous studies have demonstrated the ability of PCI-CT to visualize (1) structural details of the human patellar cartilage matrix and (2) changes to chondrocyte organization induced by osteoarthritis. This study investigates the use of high-dimensional geometric features in characterizing such chondrocyte patterns in the presence or absence of osteoarthritic damage. Geometrical features derived from the scaling index method (SIM) and statistical features derived from gray-level co-occurrence matrices were extracted from 842 regions of interest (ROI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. These features were subsequently used in a machine learning task with support vector regression to classify ROIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver-operating characteristic curve (AUC). SIM-derived geometrical features exhibited the best classification performance (AUC, 0.95 ± 0.06) and were most robust to changes in ROI size. These results suggest that such geometrical features can provide a detailed characterization of the chondrocyte organization in the cartilage matrix in an automated and non-subjective manner, while also enabling classification of cartilage as healthy or osteoarthritic with high accuracy. Such features could potentially serve as imaging markers for evaluating osteoarthritis progression and its response to different therapeutic intervention strategies.
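Of the two feature families named above, the gray-level co-occurrence matrix is the simpler to sketch. The minimal pure-Python version below uses a horizontal offset of one pixel and an invented 4x4 ROI, computing co-occurrence counts and the derived contrast statistic; the SIM geometrical features and the support vector machinery used in the study are not reproduced.

```python
def glcm(image, levels):
    """Co-occurrence counts for pixel pairs one step to the right."""
    counts = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
    return counts

def contrast(counts):
    """GLCM contrast: (i - j)^2 weighted by the normalized co-occurrence."""
    total = sum(sum(row) for row in counts)
    return sum((i - j) ** 2 * c / total
               for i, row in enumerate(counts) for j, c in enumerate(row))

# Invented 4x4 region of interest with four gray levels.
roi = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
print(round(contrast(glcm(roi, levels=4)), 3))
```

Feature vectors like this, computed per ROI, are what a classifier would then score as healthy or osteoarthritic.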

  18. 76 FR 14669 - Privacy Act of 1974; CMS Computer Match No. 2011-02; HHS Computer Match No. 1007

    Science.gov (United States)

    2011-03-17

    ... 1974; CMS Computer Match No. 2011-02; HHS Computer Match No. 1007 AGENCY: Department of Health and Human Services (HHS), Centers for Medicare & Medicaid Services (CMS). ACTION: Notice of computer... notice establishes a computer matching agreement between CMS and the Department of Defense (DoD). We have...

  19. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  20. Human-Centric Interfaces for Ambient Intelligence

    CERN Document Server

    Aghajan, Hamid; Delgado, Ramon Lopez-Cozar

    2009-01-01

    To create truly effective human-centric ambient intelligence systems, both engineering and computing methods are needed. This is the first book to bridge data processing and intelligent reasoning methods for the creation of human-centered ambient intelligence systems. Interdisciplinary in nature, the book covers topics such as multi-modal interfaces, human-computer interaction, smart environments and pervasive computing, addressing principles, paradigms, methods and applications. This book will be an ideal reference for university researchers, R&D engineers, computer engineers, and graduate students.

  1. Advanced Computational Methods in Bio-Mechanics.

    Science.gov (United States)

    Al Qahtani, Waleed M S; El-Anwar, Mohamed I

    2018-04-15

    A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with less trauma, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have an impact on surgery similar to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering: computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research projects funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.

  2. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of ligand interaction with this receptor, is a subject of increasing interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus suggests a reliable model of DOR, which could be used for further in silico experiments and should allow faster and more accurate design of selective and effective ligands for the δ-opioid receptor.

  3. Human-Centred Computing for Assisting Nuclear Safeguards

    International Nuclear Information System (INIS)

    Szoke, I.

    2015-01-01

    With the rapid evolution of enabling hardware and software, technologies including 3D simulation, virtual reality (VR), augmented reality (AR), advanced user interfaces (UI), and geographical information systems (GIS) are increasingly employed in many aspects of modern life. In line with this, the nuclear industry is rapidly adopting emerging technologies to improve efficiency and safety by supporting planning and optimization of maintenance and decommissioning work, as well as for knowledge management, surveillance, training and briefing field operatives, education, etc. For many years, the authors have been involved in research and development (R&D) into the application of 3D simulation, VR, and AR, for mobile, desktop, and immersive 3D systems, to provide a greater sense of presence and situation awareness, for training, briefing, and in situ work by field operators. This work has resulted in a unique software base and experience (documented in numerous reports) from evaluating the effects of the design of training programmes and briefing sessions on human performance and training efficiency when applying various emerging technologies. In addition, the authors are involved in R&D into the use of 3D simulation, advanced UIs, mobile computing, and GIS systems to support realistic visualization of the combined radiological and geographical environment, as well as acquisition, analysis, visualization and sharing of radiological and other data, within nuclear installations and their surroundings. The toolkit developed by the authors, and the associated knowledge base, has been successfully applied to various aspects of the nuclear industry, and has great potential within the safeguards domain. It can be used to train safeguards inspectors, brief inspectors before inspections, assist inspectors in situ (data registration, analysis, and communication), support the design and verification of safeguards systems, conserve data and experience, educate future safeguards

  4. Human Perception, SBS Sympsoms and Performance of Office Work during Exposure to Air Polluted by Building Materials and Personal Computers

    DEFF Research Database (Denmark)

    Bako-Biro, Zsolt

    The present thesis deals with the impact of polluted air from building materials and personal computers on human perception, Sick Building Syndrome (SBS) symptoms and performance of office work. These effects have been studied in a series of experiments that are described in two different chapters...

  5. Distribution of Recombination Hotspots in the Human Genome – A Comparison of Computer Simulations with Real Data

    Science.gov (United States)

    Mackiewicz, Dorota; de Oliveira, Paulo Murilo Castro; Moss de Oliveira, Suzana; Cebrat, Stanisław

    2013-01-01

    Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After sufficiently long time of simulations, the structure of chromosomes reached a dynamic equilibrium, in which number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar. PMID:23776462

  6. Human-Avatar Symbiosis for the Treatment of Auditory Verbal Hallucinations in Schizophrenia through Virtual/Augmented Reality and Brain-Computer Interfaces.

    Science.gov (United States)

    Fernández-Caballero, Antonio; Navarro, Elena; Fernández-Sotos, Patricia; González, Pascual; Ricarte, Jorge J; Latorre, José M; Rodriguez-Jimenez, Roberto

    2017-01-01

    This perspective paper considers the future of alternative treatments that take a social and cognitive approach, alongside pharmacological therapy, to auditory verbal hallucinations (AVH) in patients with schizophrenia. AVH, the perception of voices in the absence of auditory stimulation, represent a severe mental health symptom. Virtual/augmented reality (VR/AR) and brain-computer interfaces (BCI) are technologies increasingly used in medical and psychological applications. Our position is that their combined use in computer-based therapies offers still unforeseen possibilities for the treatment of physical and mental disabilities. The paper therefore encourages researchers and clinicians to pursue a pathway toward human-avatar symbiosis for AVH by taking full advantage of new technologies. This outlook entails addressing challenging issues in the understanding of non-pharmacological treatment of schizophrenia-related disorders and the exploitation of VR/AR and BCI to achieve a real human-avatar symbiosis.

  7. CAPTCHA Based on Human Cognitive Factor

    OpenAIRE

    Chowdhury, Mohammad Jabed Morshed; Chakraborty, Narayan Ranjan

    2013-01-01

    A CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) is an automatic security mechanism used to determine whether the user is a human or a malicious computer program. It is a program that generates and grades tests that are human-solvable but intended to be beyond the capabilities of current computer programs. A CAPTCHA should be designed to be very easy for humans but very hard for machines. Unfortunately, the existing CAPTCHA systems, while trying to maximize ...

  8. Single photon emission computed tomography study of human pulmonary perfusion: preliminary findings

    Energy Technology Data Exchange (ETDEWEB)

    Carratu, L; Sofia, M [Naples Univ. (Italy). Facolta di Medicina e Chirurgia; Salvatore, M; Muto, P; Ariemma, G [Istituto Nazionale per la Prevenzione, Lo Studio e La Cura dei Tumori Fondazione Pascale, Naples (Italy); Lopez-Majano, V [Cook County Hospital, Chicago, IL (USA). Nuclear Medicine Div.

    1984-02-01

    Single photon emission computed tomography (SPECT) was performed with 99mTc-albumin macroaggregates to study human pulmonary perfusion in healthy subjects and patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) and lung neoplasms. The reconstructed SPECT data were displayed in coronal, transverse and sagittal plane sections and compared to conventional perfusion scans. The SPECT data gave more complete anatomical information about the extent of damage and the morphology of the pulmonary vascular bed. In healthy subjects and COPD patients, qualitative and quantitative assessment of pulmonary perfusion could be obtained from serial SPECT scans with respect to the distribution and relative concentration of the injected radiopharmaceutical. Furthermore, SPECT of pulmonary perfusion has been useful in detecting the extent of damage to the pulmonary circulation, which supports the preoperative evaluation and staging of lung cancer.

  9. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul

    2016-01-01

    In this study, we identify emerging types of team errors, particularly in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Even though individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors made by the team are seldom easily detected and properly recovered by the team itself. Note that a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to dealing with the team factors of human errors, and we suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of digitalized control rooms. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and we are conducting effectiveness tests to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  10. New Human-Machine Interfaces for a Computer-Based Procedure System to Reduce Team Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this study, we identify emerging types of team errors, particularly in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Even though individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors made by the team are seldom easily detected and properly recovered by the team itself. Note that a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to dealing with the team factors of human errors, and we suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of digitalized control rooms. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and we are conducting effectiveness tests to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  11. Brain-machine and brain-computer interfaces.

    Science.gov (United States)

    Friehs, Gerhard M; Zerris, Vasilios A; Ojakangas, Catherine L; Fellows, Mathew R; Donoghue, John P

    2004-11-01

    The idea of connecting the human brain to a computer or machine directly is not novel and its potential has been explored in science fiction. With the rapid advances in the areas of information technology, miniaturization and neurosciences there has been a surge of interest in turning fiction into reality. In this paper the authors review the current state-of-the-art of brain-computer and brain-machine interfaces including neuroprostheses. The general principles and requirements to produce a successful connection between human and artificial intelligence are outlined and the authors' preliminary experience with a prototype brain-computer interface is reported.

  12. Chips challenging champions games, computers and artificial intelligence

    CERN Document Server

    Schaeffer, J

    2002-01-01

    One of the earliest dreams of the fledgling field of artificial intelligence (AI) was to build computer programs that could play games as well as or better than the best human players. Despite early optimism in the field, the challenge proved to be surprisingly difficult. However, the 1990s saw amazing progress. Computers are now better than humans in checkers, Othello and Scrabble; are at least as good as the best humans in backgammon and chess; and are rapidly improving at hex, go, poker, and shogi. This book documents the progress made in computers playing games and puzzles. The book is the ...

  13. Imaging cellular and subcellular structure of human brain tissue using micro computed tomography

    Science.gov (United States)

    Khimchenko, Anna; Bikis, Christos; Schweighauser, Gabriel; Hench, Jürgen; Joita-Pacureanu, Alexandra-Teodora; Thalmann, Peter; Deyhle, Hans; Osmani, Bekim; Chicherova, Natalia; Hieber, Simone E.; Cloetens, Peter; Müller-Gerbl, Magdalena; Schulz, Georg; Müller, Bert

    2017-09-01

    Brain tissues have been an attractive subject for investigations in neuropathology, neuroscience, and neurobiology. Nevertheless, existing imaging methodologies have intrinsic limitations in three-dimensional (3D) label-free visualisation of extended tissue samples down to the (sub)cellular level. For a long time, these morphological features were visualised by electron or light microscopy. In addition to being time-consuming, microscopic investigation includes specimen fixation, embedding, sectioning, staining, and imaging, with the associated artefacts. Moreover, optical microscopy remains hampered by a fundamental limit in spatial resolution imposed by the diffraction of visible light. In contrast, various tomography approaches do not require complex specimen preparation and can now reach true (sub)cellular resolution. Even laboratory-based micro computed tomography in absorption-contrast mode of formalin-fixed paraffin-embedded (FFPE) human cerebellum yields an image contrast comparable to conventional histological sections. Data of superior image quality were obtained by means of synchrotron radiation-based single-distance X-ray phase-contrast tomography, enabling the visualisation of non-stained Purkinje cells down to the subcellular level and automated cell counting. The question arises whether the data quality of hard X-ray tomography can be superior to optical microscopy. Herein, we discuss the label-free investigation of human brain ultramorphology by means of synchrotron radiation-based hard X-ray magnified phase-contrast in-line tomography at the nano-imaging beamline ID16A (ESRF, Grenoble, France). As an example, we present images of an FFPE human cerebellum block. Hard X-ray tomography can provide detailed information on human tissues in health and disease with a spatial resolution below the optical limit, improving understanding of neurodegenerative diseases.

  14. [Activities of Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2001-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center, Moffett Field, California. RIACS research focuses on the three cornerstones of IT research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems: techniques are being developed to enable spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention; such craft will be equipped to independently solve problems as they arise and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing: many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities. (3) High Performance Computing and Networking: advances in the performance of computing and networking continue to have a major impact on a variety of NASA endeavors, ranging from modeling and simulation to analysis of large scientific datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply IT research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, visiting scientist programs and student summer programs, designed to encourage and facilitate collaboration between the university and NASA IT research communities.

  15. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis. This paper proposes and elaborates on a novel model for use in computer profiling: the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  16. Numerical simulation of human biped locomotion

    International Nuclear Information System (INIS)

    Ishiguro, Misako; Fujisaki, Masahide

    1988-04-01

    This report describes the numerical simulation of the motion of a human-like robot, one of the research themes of the Human Acts Simulation Program (HASP) begun at the Computing Center of JAERI in 1987. The purpose of the theme is to model human motion using robotics kinematic/kinetic equations and to obtain the joint angles as the solution. As a first trial, we treat biped locomotion (walking), the most fundamental human motion. We implemented a computer program on the FACOM M-780 computer, based on the book by M. Vukobratovic (Yugoslavia), and made a graphic program to draw a walking shot sequence. Mainly described here are the mathematical model of biped locomotion, the implementation of the computer program, input data for the basic walking pattern, computed results and their validation, and the graphic representation of the human walking image. A literature survey on robotics equations and biped locomotion is also included. (author)
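    The idea of solving kinematic equations for joint angles can be illustrated, under strong simplifying assumptions, with a planar two-link leg: given a foot target relative to the hip, the knee angle follows from the law of cosines and the hip angle from the remaining geometry. This sketch is not the Vukobratovic model used in the report; the link lengths and target are invented for illustration.

```python
import math

L1, L2 = 0.45, 0.45  # thigh and shank lengths in metres (assumed values)

def leg_ik(x, y, l1=L1, l2=L2):
    """Planar two-link inverse kinematics with the hip at the origin.

    Returns (hip, knee) joint angles in radians for a foot target (x, y).
    """
    d2 = x * x + y * y
    # law of cosines for the knee flexion angle
    c = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, c)))
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# forward-kinematics check: the angles reproduce the requested foot position
hip, knee = leg_ik(0.3, -0.7)
fx = L1 * math.cos(hip) + L2 * math.cos(hip + knee)
fy = L1 * math.sin(hip) + L2 * math.sin(hip + knee)
print(round(fx, 3), round(fy, 3))  # recovers the target (0.3, -0.7)
```

    A walking pattern generator would call such a solver once per time step, feeding it foot trajectories for the basic walking pattern mentioned above.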

  17. Towards distributed multiscale computing for the VPH

    NARCIS (Netherlands)

    Hoekstra, A.G.; Coveney, P.

    2010-01-01

    Multiscale modeling is fundamental to the Virtual Physiological Human (VPH) initiative. Most detailed three-dimensional multiscale models lead to prohibitive computational demands. As a possible solution we present MAPPER, a computational science infrastructure for Distributed Multiscale Computing.

  18. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

    Face recognition for a single face does not take long to process, but if we implement an attendance or security system for a company with many faces to be recognized, it will take a long time. Cloud computing is a computing service performed not on a local device but on an internet-connected data center infrastructure. Cloud computing also provides a scalability solution, since it can increase the resources needed when processing larger amounts of data. This research applies eigenfaces, while training data are collected using the REST concept to provide resources, so that the server can process the data according to the existing stages. After the research and development of this application, it can be concluded that, by implementing eigenfaces and applying the REST concept as an endpoint for exchanging the related information used as a resource, a model can be formed to perform face recognition.
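    A minimal eigenface pipeline (PCA over flattened face images, then nearest neighbour in the reduced space) can be sketched with NumPy. This is not the authors' implementation: the 8x8 random "faces" are toy stand-ins for a real training gallery, and the REST/cloud layer is omitted entirely.

```python
import numpy as np

def train_eigenfaces(faces, k=4):
    """PCA on flattened face images: returns the mean face and top-k eigenfaces."""
    X = faces.reshape(len(faces), -1).astype(float)
    mean = X.mean(axis=0)
    # rows of Vt from the SVD of the centred data are the principal axes
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def project(face, mean, eigenfaces):
    """Project one face onto the eigenface basis (its weight vector)."""
    return eigenfaces @ (face.ravel().astype(float) - mean)

def recognize(face, mean, eigenfaces, gallery_weights):
    """Index of the nearest gallery face in eigenface space."""
    w = project(face, mean, eigenfaces)
    return int(np.argmin(np.linalg.norm(gallery_weights - w, axis=1)))

rng = np.random.default_rng(0)
gallery = rng.random((10, 8, 8))                  # toy training gallery
mean, eig = train_eigenfaces(gallery, k=4)
weights = np.array([project(f, mean, eig) for f in gallery])
match = recognize(gallery[3], mean, eig, weights)  # a training image matches itself
print(match)
```

    In the cloud setting described above, training and `recognize` would run server-side, with clients submitting images over REST.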

  19. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

  20. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

  1. Electrophysiological properties of computational human ventricular cell action potential models under acute ischemic conditions.

    Science.gov (United States)

    Dutta, Sara; Mincholé, Ana; Quinn, T Alexander; Rodriguez, Blanca

    2017-10-01

    Acute myocardial ischemia is one of the main causes of sudden cardiac death. The mechanisms have been investigated primarily in experimental and computational studies using different animal species, but human studies remain scarce. In this study, we assess the ability of four human ventricular action potential models (ten Tusscher and Panfilov, 2006; Grandi et al., 2010; Carro et al., 2011; O'Hara et al., 2011) to simulate key electrophysiological consequences of acute myocardial ischemia in single cell and tissue simulations. We specifically focus on evaluating the effect of extracellular potassium concentration and activation of the ATP-sensitive inward-rectifying potassium current on action potential duration, post-repolarization refractoriness, and conduction velocity, as the most critical factors in determining reentry vulnerability during ischemia. Our results show that the Grandi and O'Hara models required modifications to reproduce expected ischemic changes, specifically modifying the intracellular potassium concentration in the Grandi model and the sodium current in the O'Hara model. With these modifications, the four human ventricular cell AP models analyzed in this study reproduce the electrophysiological alterations in repolarization, refractoriness, and conduction velocity caused by acute myocardial ischemia. However, quantitative differences are observed between the models and overall, the ten Tusscher and modified O'Hara models show closest agreement to experimental data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
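    One of the ischemic factors evaluated above, raised extracellular potassium, shifts the potassium reversal potential according to the Nernst equation and thereby depolarizes the resting membrane. A small sketch with illustrative concentration values (not taken from the study):

```python
import math

def nernst_K(K_out, K_in=140.0, T=310.0):
    """Nernst potential for K+ in mV: E_K = (RT/zF) * ln([K]o / [K]i)."""
    R, F, z = 8.314, 96485.0, 1
    return 1000.0 * R * T / (z * F) * math.log(K_out / K_in)

# normal vs. ischemic hyperkalemic extracellular potassium (mM, illustrative)
e_normal = nernst_K(5.4)   # approx. -87 mV
e_ischemic = nernst_K(9.0)  # approx. -73 mV: depolarized resting potential
print(round(e_normal, 1), round(e_ischemic, 1))
```

    This resting depolarization inactivates sodium channels, which is the mechanistic link to the slowed conduction velocity and post-repolarization refractoriness that the simulations probe.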

  2. Neural Computations Mediating One-Shot Learning in the Human Brain

    Science.gov (United States)

    Lee, Sang Wan; O’Doherty, John P.; Shimojo, Shinsuke

    2015-01-01

    Incremental learning, in which new knowledge is acquired gradually through trial and error, can be distinguished from one-shot learning, in which the brain learns rapidly from only a single pairing of a stimulus and a consequence. Very little is known about how the brain transitions between these two fundamentally different forms of learning. Here we test a computational hypothesis that uncertainty about the causal relationship between a stimulus and an outcome induces rapid changes in the rate of learning, which in turn mediates the transition between incremental and one-shot learning. By using a novel behavioral task in combination with functional magnetic resonance imaging (fMRI) data from human volunteers, we found evidence implicating the ventrolateral prefrontal cortex and hippocampus in this process. The hippocampus was selectively “switched” on when one-shot learning was predicted to occur, while the ventrolateral prefrontal cortex was found to encode uncertainty about the causal association, exhibiting increased coupling with the hippocampus for high-learning rates, suggesting this region may act as a “switch,” turning on and off one-shot learning as required. PMID:25919291
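    The computational hypothesis described above, that causal uncertainty drives rapid changes in the learning rate, can be caricatured with a delta rule whose step size is tied to uncertainty. This is a toy formalisation for intuition only, not the fitted model from the paper.

```python
def update_estimate(v, outcome, uncertainty):
    """Delta rule whose learning rate grows with causal uncertainty.

    With uncertainty near 1 the update approaches one-shot learning
    (the estimate jumps to the observed outcome); near 0 it reduces
    to slow incremental learning.
    """
    alpha = uncertainty                # learning rate tied to uncertainty
    return v + alpha * (outcome - v)

v_incremental = update_estimate(0.0, 1.0, uncertainty=0.1)   # small step
v_one_shot = update_estimate(0.0, 1.0, uncertainty=0.95)     # near-complete jump
print(v_incremental, v_one_shot)
```

    In the paper's terms, the hippocampal "switch" corresponds to the high-uncertainty regime where a single pairing nearly determines the estimate.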

  3. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

    International Nuclear Information System (INIS)

    Wilpert, B.; Maimer, H.; Loroff, C.

    2000-01-01

    The project's objective was to evaluate the reliability of identifying human factors as contributing factors with a computer-supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step comprised interviews with experts from the nuclear power industry and an evaluation of existing computer-supported event analysis methods. This information was combined into a requirement profile for the CEA software. The next step was the implementation of the software in an iterative process of evaluation. The project concluded with the testing of the CEA software. The testing demonstrated that contributing factors can be validly identified with CEA. In addition, CEA received very positive feedback from the experts. (orig.) [de

  4. Integrating Human and Computer Intelligence. Technical Report No. 32.

    Science.gov (United States)

    Pea, Roy D.

    This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

  5. 1995 CERN school of computing. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Vandoni, C E [ed.

    1995-10-25

    These proceedings contain a written account of the majority of the lectures given at the 1995 CERN School of Computing. The Scientific Programme was articulated on 8 main themes: Human Computer Interfaces; Collaborative Software Engineering; Information Super Highways; Trends in Computer Architecture/Industry; Parallel Architectures (MPP); Mathematical Computing; Data Acquisition Systems; World-Wide Web for Physics. A number of lectures dealt with general aspects of computing, in particular in the area of Human Computer Interfaces (computer graphics, user interface tools and virtual reality). The application of computer graphics in HEP (event display) was the subject of two lectures. The main theme of Mathematical Computing covered Mathematica and the usage of statistics packages. The important subject of Data Acquisition Systems was covered by lectures on switching techniques and simulation and modelling tools. A series of lectures dealt with the Information Super Highways and World-Wide Web technology and its applications to High Energy Physics. Different aspects of Object Oriented Information Engineering Methodology and Object Oriented Programming in HEP were dealt with in detail, also in connection with data acquisition systems. On the theme `Trends in Computer Architecture and Industry`, lectures were given on ATM switching, FORTRAN90 and High Performance FORTRAN. The Computer Parallel Architectures (MPP) lectures dealt with very large scale open systems, the history and future of computer system architecture, the message passing paradigm, and features of PVM and MPI. (orig.).

  6. 1995 CERN school of computing. Proceedings

    International Nuclear Information System (INIS)

    Vandoni, C.E.

    1995-01-01

    These proceedings contain a written account of the majority of the lectures given at the 1995 CERN School of Computing. The Scientific Programme was articulated on 8 main themes: Human Computer Interfaces; Collaborative Software Engineering; Information Super Highways; Trends in Computer Architecture/Industry; Parallel Architectures (MPP); Mathematical Computing; Data Acquisition Systems; World-Wide Web for Physics. A number of lectures dealt with general aspects of computing, in particular in the area of Human Computer Interfaces (computer graphics, user interface tools and virtual reality). The application of computer graphics in HEP (event display) was the subject of two lectures. The main theme of Mathematical Computing covered Mathematica and the usage of statistics packages. The important subject of Data Acquisition Systems was covered by lectures on switching techniques and simulation and modelling tools. A series of lectures dealt with the Information Super Highways and World-Wide Web technology and its applications to High Energy Physics. Different aspects of Object Oriented Information Engineering Methodology and Object Oriented Programming in HEP were dealt with in detail, also in connection with data acquisition systems. On the theme 'Trends in Computer Architecture and Industry', lectures were given on ATM switching, FORTRAN90 and High Performance FORTRAN. The Computer Parallel Architectures (MPP) lectures dealt with very large scale open systems, the history and future of computer system architecture, the message passing paradigm, and features of PVM and MPI. (orig.)

  7. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence, (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units, (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey, and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRT units, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRT units, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for the calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8% increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist stakeholders and health planners in designing an appropriate strategy for meeting the future radiotherapy needs of Switzerland. (orig.) [de
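    The projected growth quoted in the record can be checked directly from the stated patient counts; the radiotherapy utilization fraction stays essentially constant while absolute demand grows by about 9.8%:

```python
# figures quoted in the record
patients_2015 = 30999   # patients requiring radiotherapy in 2015
patients_2020 = 34041   # projected patients requiring radiotherapy in 2020

# relative increase in radiotherapy demand over the 5 years
increase = (patients_2020 / patients_2015 - 1) * 100
print(f"{increase:.1f}%")  # matches the 9.8% increase reported

# RTU-style check: fraction of incident cancer patients needing radiotherapy
rtu_2015 = patients_2015 / 45903
rtu_2020 = patients_2020 / 50427
print(round(rtu_2015, 3), round(rtu_2020, 3))  # both about 0.675
```

    The growth in demand is thus driven by rising cancer incidence, not by a changing utilization rate, which is why the staffing gap scales roughly with incidence.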

  8. Remotely Telling Humans and Computers Apart: An Unsolved Problem

    Science.gov (United States)

    Hernandez-Castro, Carlos Javier; Ribagorda, Arturo

    The ability to tell humans and computers apart is imperative to protect many services from misuse and abuse. For this purpose, tests called CAPTCHAs or HIPs have been designed and put into production. Recent history shows that most (if not all) can be broken given enough time and commercial interest: CAPTCHA design seems to be a much more difficult problem than previously thought. The assumption that difficult AI problems can be easily converted into valid CAPTCHAs is misleading. There are also some extrinsic problems that do not help, especially the large number of in-house designs that are put into production without any prior public critique. In this paper we present a state-of-the-art survey of current HIPs, including proposals that are now in production. We classify them according to their basic design ideas. We discuss current attacks as well as future attack paths, and we also present common errors in design and show how implementation flaws can transform a not-necessarily-bad idea into a weak CAPTCHA. We present examples of these flaws using specific well-known CAPTCHAs. In a more theoretical vein, we discuss the threat model: confronted risks and countermeasures. Finally, we introduce and discuss some desirable properties that new HIPs should have, concluding with some proposals for future work, including methodologies for design, implementation and security assessment.

  9. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  10. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered…

  11. Computational Foundations of Natural Intelligence.

    Science.gov (United States)

    van Gerven, Marcel

    2017-01-01

    New developments in AI and neuroscience are revitalizing the quest to understand natural intelligence, offering insight into how to equip machines with human-like capabilities. This paper reviews some of the computational principles relevant for understanding natural intelligence and, ultimately, achieving strong AI. After reviewing basic principles, a variety of computational modeling approaches is discussed. Subsequently, I concentrate on the use of artificial neural networks as a framework for modeling cognitive processes. The paper ends by outlining some of the challenges that remain to fulfill the promise of machines that show human-like intelligence.

  12. A computer vision system for rapid search inspired by surface-based attention mechanisms from human perception.

    Science.gov (United States)

    Mohr, Johannes; Park, Jong-Han; Obermayer, Klaus

    2014-12-01

    Humans are highly efficient at visual search tasks by focusing selective attention on a small but relevant region of a visual scene. Recent results from biological vision suggest that surfaces of distinct physical objects form the basic units of this attentional process. The aim of this paper is to demonstrate how such surface-based attention mechanisms can speed up a computer vision system for visual search. The system uses fast perceptual grouping of depth cues to represent the visual world at the level of surfaces. This representation is stored in short-term memory and updated over time. A top-down guided attention mechanism sequentially selects one of the surfaces for detailed inspection by a recognition module. We show that the proposed attention framework requires little computational overhead (about 11 ms), but enables the system to operate in real-time and leads to a substantial increase in search efficiency. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

Full Text Available Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors (NNRTIs), are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  14. MRI Reconstructions of Human Phrenic Nerve Anatomy and Computational Modeling of Cryoballoon Ablative Therapy.

    Science.gov (United States)

    Goff, Ryan P; Spencer, Julianne H; Iaizzo, Paul A

    2016-04-01

The primary goal of this computational modeling study was to better quantify the relative distance of the phrenic nerves to areas where cryoballoon ablations may be applied within the left atria. Phrenic nerve injury can be a significant complication of applied ablative therapies for treatment of drug refractory atrial fibrillation. To date, published reports suggest that such injuries may occur more frequently in cryoballoon ablations than in radiofrequency therapies. Ten human heart-lung blocs were prepared in an end-diastolic state, scanned with MRI, and analyzed using Mimics software as a means to make anatomical measurements. Next, generated computer models of Arctic Front cryoballoons (23, 28 mm) were mated with reconstructed pulmonary vein ostia to determine relative distances between the phrenic nerves and projected balloon placements, simulating pulmonary vein isolation. The effects of deep-seating the balloons were also investigated. Interestingly, the relative anatomical differences in placement of 23 and 28 mm cryoballoons were quite small, e.g., the determined difference between mid spline distance to the phrenic nerves between the two cryoballoon sizes was only 1.7 ± 1.2 mm. Furthermore, the right phrenic nerves were commonly closer to the pulmonary veins than the left, and, surprisingly, the tips of the balloons were farther from the nerves, yet balloon size choice did not significantly alter the calculated distance to the nerves. Such computational modeling is considered a useful tool for both clinicians and device designers to better understand these associated anatomies that, in turn, may lead to optimization of therapeutic treatments.
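
Distance measurements of this kind reduce to minimum Euclidean distances between two 3-D point sets (a sampled nerve path versus points on the mated balloon surface). The sketch below is generic and uses synthetic coordinates; the nerve path, the balloon stand-in, and every dimension are invented for illustration, not taken from the study:

```python
import numpy as np

def min_distance(points_a, points_b):
    """Minimum pairwise Euclidean distance between two 3-D point clouds (mm)."""
    a = np.asarray(points_a, float)[:, None, :]   # shape (Na, 1, 3)
    b = np.asarray(points_b, float)[None, :, :]   # shape (1, Nb, 3)
    return float(np.sqrt(((a - b) ** 2).sum(axis=2)).min())

# Synthetic example: a straight nerve path and the equatorial ring of a
# 23 mm sphere as a crude balloon stand-in.
nerve = [(x, 15.0, 0.0) for x in np.linspace(-20, 20, 81)]
theta = np.linspace(0, 2 * np.pi, 200)
balloon = [(11.5 * np.cos(t), 11.5 * np.sin(t), 0.0) for t in theta]
print(min_distance(nerve, balloon))  # ~3.5 mm for this synthetic geometry
```

For a real balloon model one would sample its full 3-D surface mesh rather than a single equatorial ring; the reduction to a pairwise minimum is the same.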

  15. Challenges in human behavior understanding

    NARCIS (Netherlands)

    Salah, A.A.; Gevers, T.; Sebe, N.; Vinciarelli, A.

    2010-01-01

Recent advances in pattern recognition have allowed computer scientists and psychologists to jointly address the automatic analysis of human behavior via computers. The Workshop on Human Behavior Understanding at the International Conference on Pattern Recognition explores a number of different

  16. Comparison of computational to human observer detection for evaluation of CT low dose iterative reconstruction

    Science.gov (United States)

    Eck, Brendan; Fahmi, Rachid; Brown, Kevin M.; Raihani, Nilgoun; Wilson, David L.

    2014-03-01

Model observers were created and compared to human observers for the detection of low contrast targets in computed tomography (CT) images reconstructed with an advanced, knowledge-based, iterative image reconstruction method for low x-ray dose imaging. A 5-channel Laguerre-Gauss Hotelling Observer (CHO) was used with internal noise added to the decision variable (DV) and/or channel outputs (CO). Models were defined by parameters: (k1) DV-noise with standard deviation (std) proportional to DV std; (k2) DV-noise with constant std; (k3) CO-noise with constant std across channels; and (k4) CO-noise in each channel with std proportional to CO variance. Four-alternative forced choice (4AFC) human observer studies were performed on sub-images extracted from phantom images with and without a "pin" target. Model parameters were estimated using maximum likelihood comparison to human probability correct (PC) data. PC in human and all model observers increased with dose, contrast, and size, and was much higher for advanced iterative reconstruction (IMR) as compared to filtered back projection (FBP). Detection in IMR was better than FBP at 1/3 dose, suggesting significant dose savings. Model(k1,k2,k3,k4) gave the best overall fit to humans across independent variables (dose, size, contrast, and reconstruction) at fixed display window. However, Model(k1) performed better when considering model complexity using the Akaike information criterion. Model(k1) fit the extraordinary detectability difference between IMR and FBP, despite the different noise quality. It is anticipated that the model observer will predict results from iterative reconstruction methods having similar noise characteristics, enabling rapid comparison of methods.
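
The internal-noise variants compared above can be illustrated numerically. The sketch below is not the authors' implementation: the channel statistics are invented, only the Model(k1) variant (decision-variable noise with std proportional to the DV std) is shown, and performance is scored as two-alternative rather than four-alternative forced choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, k1 = 10000, 5, 0.5

# Synthetic channel outputs for signal-absent and signal-present classes.
mean_signal = np.array([1.0, 0.6, 0.3, 0.1, 0.05])  # invented channel means
cov = np.eye(n_channels)                             # invented channel covariance
absent = rng.multivariate_normal(np.zeros(n_channels), cov, n_trials)
present = rng.multivariate_normal(mean_signal, cov, n_trials)

# Hotelling template: inverse covariance times the mean difference.
w = np.linalg.solve(cov, mean_signal)
dv_absent, dv_present = absent @ w, present @ w

# Model(k1): internal noise with std proportional to the DV std.
sigma_int = k1 * dv_absent.std()
dv_absent = dv_absent + rng.normal(0, sigma_int, n_trials)
dv_present = dv_present + rng.normal(0, sigma_int, n_trials)

# Two-alternative forced-choice proportion correct over paired trials.
pc = np.mean(dv_present > dv_absent)
print(f"2AFC proportion correct: {pc:.3f}")
```

Increasing k1 degrades pc toward the 0.5 chance level, which is how such noise parameters are tuned to match human PC data.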

  17. Child-Computer Interaction: ICMI 2012 special session

    NARCIS (Netherlands)

    Nijholt, Antinus; Morency, L.P.; Bohus, L.; Aghajan, H.; Nijholt, Antinus; Cassell, J.; Epps, J.

    2012-01-01

    This is a short introduction to the special session on child computer interaction at the International Conference on Multimodal Interaction 2012 (ICMI 2012). In human-computer interaction users have become participants in the design process. This is not different for child computer interaction

  18. 78 FR 39730 - Privacy Act of 1974; CMS Computer Match No. 2013-11; HHS Computer Match No. 1302

    Science.gov (United States)

    2013-07-02

    ... 1974; CMS Computer Match No. 2013-11; HHS Computer Match No. 1302 AGENCY: Centers for Medicare & Medicaid Services (CMS), Department of Health and Human Services (HHS). ACTION: Notice of Computer Matching... notice announces the establishment of a CMP that CMS intends to conduct with State-based Administering...

  19. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

  20. Brain-Computer Interface Controlled Cyborg: Establishing a Functional Information Transfer Pathway from Human Brain to Cockroach Brain.

    Science.gov (United States)

    Li, Guangye; Zhang, Dingguo

    2016-01-01

An all-chain-wireless brain-to-brain system (BTBS), which enabled motion control of a cyborg cockroach via the human brain, was developed in this work. A steady-state visual evoked potential (SSVEP) based brain-computer interface (BCI) was used in this system for recognizing human motion intention, and an optimization algorithm was proposed for the SSVEP to improve the online performance of the BCI. The cyborg cockroach was developed by surgically integrating a portable microstimulator that could generate invasive electrical nerve stimulation. Through Bluetooth communication, specific electrical pulse trains could be triggered from the microstimulator by BCI commands and were sent through the antenna nerve to stimulate the brain of the cockroach. Serial experiments were designed and conducted to test the overall performance of the BTBS with six human subjects and three cockroaches. The experimental results showed that the online classification accuracy of the three-mode BCI increased from 72.86% to 78.56% (by 5.70%) using the optimization algorithm, and the mean response accuracy of the cyborgs using this system reached 89.5%. Moreover, the results also showed that the cyborg could be navigated by the human brain to complete walking along an S-shaped track with a success rate of about 20%, suggesting the proposed BTBS established a feasible functional information transfer pathway from the human brain to the cockroach brain.
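
The abstract does not specify the SSVEP recognition algorithm or its optimization; a common minimal baseline scores each candidate stimulation frequency by its spectral power in the recorded EEG and picks the maximum. The sketch below runs on synthetic single-channel data with invented signal parameters:

```python
import numpy as np

def ssvep_classify(eeg, fs, candidate_freqs):
    """Pick the candidate stimulation frequency with the largest spectral power.
    eeg: 1-D signal array; fs: sampling rate in Hz."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

rng = np.random.default_rng(1)
fs = 250
t = np.arange(0, 4, 1 / fs)        # 4 s of data at 250 Hz
target = 12.0                      # subject attends the 12 Hz stimulus
eeg = np.sin(2 * np.pi * target * t) + rng.normal(0, 1.0, t.size)
print(ssvep_classify(eeg, fs, [8.0, 10.0, 12.0]))
```

Practical SSVEP decoders usually use multichannel methods such as canonical correlation analysis; the single-bin power score above is only the simplest workable baseline.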

  1. Computer vision syndrome (CVS) - Thermographic Analysis

    Science.gov (United States)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

The use of computers has grown exponentially in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their great acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great efforts, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of them are: blurred vision, visual fatigue and Dry Eye Syndrome (DES) due to inadequate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to displays of computers, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  2. Can human experts predict solubility better than computers?

    Science.gov (United States)

    Boobier, Samuel; Osbourn, Anne; Mitchell, John B O

    2017-12-13

In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R² of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R² of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
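
The headline statistics (RMSE and R² in log S units) and the consensus-by-median scheme are straightforward to reproduce. A minimal sketch with invented predictions, not the study's data:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error (here, in log S units)."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

def r_squared(y_true, y_pred):
    """Coefficient of determination R^2."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

def consensus(predictions):
    """Combine individual predictors by taking the per-compound median."""
    return np.median(np.asarray(predictions, float), axis=0)

# Illustrative data: 3 predictors, 4 compounds (log S values).
truth = [-2.0, -3.5, -1.0, -4.2]
preds = [[-1.8, -3.9, -0.5, -4.0],
         [-2.5, -3.0, -1.2, -4.6],
         [-2.1, -3.6, -0.9, -4.1]]
cons = consensus(preds)
print(rmse(truth, cons), r_squared(truth, cons))
```

The consensus RMSE typically sits at or below the median individual RMSE, which is the "wisdom of crowds" effect the authors report.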

  3. Suggested Approaches to the Measurement of Computer Anxiety.

    Science.gov (United States)

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  4. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context information modeling and processing, establishing relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and implementing the visualization of the adaptive interface with the aid of interface tools technology.

  5. 78 FR 50419 - Privacy Act of 1974; CMS Computer Match No. 2013-10; HHS Computer Match No. 1310

    Science.gov (United States)

    2013-08-19

    ... 1974; CMS Computer Match No. 2013-10; HHS Computer Match No. 1310 AGENCY: Centers for Medicare & Medicaid Services (CMS), Department of Health and Human Services (HHS). ACTION: Notice of Computer Matching... notice announces the establishment of a CMP that CMS plans to conduct with the Department of Homeland...

  6. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  7. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  8. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA requires task analysis as a preliminary step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To address it, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  9. X-ray micro computed tomography for the visualization of an atherosclerotic human coronary artery

    Science.gov (United States)

    Matviykiv, Sofiya; Buscema, Marzia; Deyhle, Hans; Pfohl, Thomas; Zumbuehl, Andreas; Saxer, Till; Müller, Bert

    2017-06-01

    Atherosclerosis refers to narrowing or blocking of blood vessels that can lead to a heart attack, chest pain or stroke. Constricted segments of diseased arteries exhibit considerably increased wall shear stress, compared to the healthy ones. One of the possibilities to improve patient’s treatment is the application of nano-therapeutic approaches, based on shear stress sensitive nano-containers. In order to tailor the chemical composition and subsequent physical properties of such liposomes, one has to know precisely the morphology of critically stenosed arteries at micrometre resolution. It is often obtained by means of histology, which has the drawback of offering only two-dimensional information. Additionally, it requires the artery to be decalcified before sectioning, which might lead to deformations within the tissue. Micro computed tomography (μCT) enables the three-dimensional (3D) visualization of soft and hard tissues at micrometre level. μCT allows lumen segmentation that is crucial for subsequent flow simulation analysis. In this communication, tomographic images of a human coronary artery before and after decalcification are qualitatively and quantitatively compared. We analyse the cross section of the diseased human coronary artery before and after decalcification, and calculate the lumen area of both samples.

  10. A Situative Space Model for Mobile Mixed-Reality Computing

    DEFF Research Database (Denmark)

    Pederson, Thomas; Janlert, Lars-Erik; Surie, Dipak

    2011-01-01

This article proposes a situative space model that links the physical and virtual realms and sets the stage for complex human-computer interaction defined by what a human agent can see, hear, and touch, at any given point in time.

  11. 21 CFR 870.1425 - Programmable diagnostic computer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Programmable diagnostic computer. 870.1425 Section 870.1425 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... diagnostic computer. (a) Identification. A programmable diagnostic computer is a device that can be...

  12. Computationally efficient analysis of particle transport and deposition in a human whole-lung-airway model. Part I: Theory and model validation.

    Science.gov (United States)

    Kolanjiyil, Arun V; Kleinstreuer, Clement

    2016-12-01

Computational predictions of aerosol transport and deposition in the human respiratory tract can assist in evaluating detrimental or therapeutic health effects when inhaling toxic particles or administering drugs. However, the sheer complexity of the human lung, featuring a total of 16 million tubular airways, prohibits detailed computer simulations of the fluid-particle dynamics for the entire respiratory system. Thus, in order to obtain useful and efficient particle deposition results, an alternative modeling approach is necessary where the whole-lung geometry is approximated and physiological boundary conditions are implemented to simulate breathing. In Part I, the present new whole-lung-airway model (WLAM) represents the actual lung geometry via a basic 3-D mouth-to-trachea configuration while all subsequent airways are lumped together, i.e., reduced to an exponentially expanding 1-D conduit. The diameter for each generation of the 1-D extension can be obtained on a subject-specific basis from the calculated total volume which represents each generation of the individual. The alveolar volume was added based on the approximate number of alveoli per generation. A wall-displacement boundary condition was applied at the bottom surface of the first-generation WLAM, so that any breathing pattern due to the negative alveolar pressure can be reproduced. Specifically, different inhalation/exhalation scenarios (rest, exercise, etc.) were implemented by controlling the wall/mesh displacements to simulate realistic breathing cycles in the WLAM. Total and regional particle deposition results agree with experimental lung deposition results. The outcomes provide critical insight into, and quantitative results for, aerosol deposition in human whole-lung airways with modest computational resources. Hence, the WLAM can be used in analyzing human exposure to toxic particulate matter or it can assist in estimating pharmacological effects of administered drug-aerosols. As a practical
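
The "exponentially expanding 1-D conduit" can be illustrated with the classic Weibel symmetric-branching assumption, in which each generation doubles the number of airways while the per-branch diameter shrinks by a factor of 2^(1/3). Note that the WLAM itself derives generation diameters from subject-specific volumes, so the fixed scaling and tracheal diameter below are illustrative assumptions only:

```python
import math

def airway_generation(d0_mm: float, n: int):
    """Return (diameter_mm, airway_count, total_area_mm2) for generation n,
    assuming symmetric dichotomous branching with Weibel 2^(-1/3) diameter scaling."""
    diameter = d0_mm * 2.0 ** (-n / 3.0)   # per-branch diameter shrinks
    count = 2 ** n                          # branch count doubles each generation
    area = count * math.pi * (diameter / 2.0) ** 2
    return diameter, count, area

d0 = 18.0  # illustrative tracheal diameter in mm
for n in (0, 5, 10, 16):
    d, c, a = airway_generation(d0, n)
    print(f"gen {n:2d}: d = {d:6.2f} mm, branches = {c:6d}, total area = {a:9.1f} mm^2")
```

The total cross-sectional area grows as 2^(n/3), i.e., exponentially with generation number, which is the lumped-airway expansion the WLAM approximates with a single 1-D conduit.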

  13. Computer Simulation of Reading.

    Science.gov (United States)

    Leton, Donald A.

    In recent years, coding and decoding have been claimed to be the processes for converting one language form to another. But there has been little effort to locate these processes in the human learner or to identify the nature of the internal codes. Computer simulation of reading is useful because the similarities in the human reception and…

  14. Flow velocity-driven differentiation of human mesenchymal stromal cells in silk fibroin scaffolds: A combined experimental and computational approach.

    Directory of Open Access Journals (Sweden)

    Jolanda Rita Vetsch

    Full Text Available Mechanical loading plays a major role in bone remodeling and fracture healing. Mimicking the concept of mechanical loading of bone has been widely studied in bone tissue engineering by perfusion cultures. Nevertheless, there is still debate regarding the in-vitro mechanical stimulation regime. This study aims at investigating the effect of two different flow rates (vlow = 0.001m/s and vhigh = 0.061m/s on the growth of mineralized tissue produced by human mesenchymal stromal cells cultured on 3-D silk fibroin scaffolds. The flow rates applied were chosen to mimic the mechanical environment during early fracture healing or during bone remodeling, respectively. Scaffolds cultured under static conditions served as a control. Time-lapsed micro-computed tomography showed that mineralized extracellular matrix formation was completely inhibited at vlow compared to vhigh and the static group. Biochemical assays and histology confirmed these results and showed enhanced osteogenic differentiation at vhigh whereas the amount of DNA was increased at vlow. The biological response at vlow might correspond to the early stage of fracture healing, where cell proliferation and matrix production is prominent. Visual mapping of shear stresses, simulated by computational fluid dynamics, to 3-D micro-computed tomography data revealed that shear stresses up to 0.39mPa induced a higher DNA amount and shear stresses between 0.55mPa and 24mPa induced osteogenic differentiation. This study demonstrates the feasibility to drive cell behavior of human mesenchymal stromal cells by the flow velocity applied in agreement with mechanical loading mimicking early fracture healing (vlow or bone remodeling (vhigh. These results can be used in the future to tightly control the behavior of human mesenchymal stromal cells towards proliferation or differentiation. Additionally, the combination of experiment and simulation presented is a strong tool to link biological responses to
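
The reported shear-stress bands (up to 0.39 mPa associated with a higher DNA amount, i.e. proliferation; 0.55-24 mPa with osteogenic differentiation) suggest a simple mapping from simulated wall shear stress to expected cell response. A sketch of such a lookup; the gap between 0.39 and 0.55 mPa is left indeterminate because the abstract does not assign it:

```python
def expected_response(shear_mpa: float) -> str:
    """Map wall shear stress (in mPa) to the cell response reported in the study."""
    if shear_mpa <= 0.39:
        return "proliferation"                 # higher DNA amount (early fracture healing regime)
    if 0.55 <= shear_mpa <= 24.0:
        return "osteogenic differentiation"    # bone remodeling regime
    return "indeterminate"                     # unassigned gap or outside the reported range

for tau in (0.1, 0.39, 0.5, 1.0, 24.0, 30.0):
    print(tau, "->", expected_response(tau))
```

Such a lookup only restates the study's observed correlations; applying it to new scaffolds or cell types would require fresh experimental calibration.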

  15. Research Institute for Advanced Computer Science

    Science.gov (United States)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. 
In addition, RIACS collaborates with NASA scientists to apply information technology research to a

  16. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  17. Computational Foundations of Natural Intelligence

    Directory of Open Access Journals (Sweden)

    Marcel van Gerven

    2017-12-01

Full Text Available New developments in AI and neuroscience are revitalizing the quest to understand natural intelligence, offering insight into how to equip machines with human-like capabilities. This paper reviews some of the computational principles relevant for understanding natural intelligence and, ultimately, achieving strong AI. After reviewing basic principles, a variety of computational modeling approaches is discussed. Subsequently, I concentrate on the use of artificial neural networks as a framework for modeling cognitive processes. This paper ends by outlining some of the challenges that remain to fulfill the promise of machines that show human-like intelligence.

  18. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond its input to PRA

  19. Human thyroid specimen imaging by fluorescent x-ray computed tomography with synchrotron radiation

    Science.gov (United States)

    Takeda, Tohoru; Yu, Quanwen; Yashiro, Toru; Yuasa, Tetsuya; Hasegawa, Yasuo; Itai, Yuji; Akatsuka, Takao

    1999-09-01

Fluorescent x-ray computed tomography (FXCT) is being developed to detect non-radioactive contrast materials in living specimens. The FXCT system consists of a silicon (111) channel-cut monochromator, an x-ray slit and a collimator for fluorescent x-ray detection, a scanning table for the target organ, and detectors for the fluorescent and transmitted x-rays. To reduce the Compton scattering overlapping the fluorescent Kα line, the incident monochromatic x-ray energy was set at 37 keV. The FXCT clearly imaged a human thyroid gland, and iodine content was estimated quantitatively. In a case of hyperthyroidism, the two-dimensional distribution of iodine content was not uniform, and thyroid cancer had a small amount of iodine. FXCT can be used to detect iodine within the thyroid gland quantitatively and to delineate its distribution.

  20. Flat panel computed tomography of human ex vivo heart and bone specimens: initial experience

    Energy Technology Data Exchange (ETDEWEB)

    Nikolaou, Konstantin; Becker, Christoph R.; Reiser, Maximilian F. [Ludwig-Maximilians-University, Department of Clinical Radiology, Munich (Germany); Flohr, Thomas; Stierstorfer, Karl [CT Division, Siemens Medical Solutions, Forchheim (Germany)

    2005-02-01

The aim of this technical investigation was the detailed description of a prototype flat panel detector computed tomography system (FPCT) and its initial evaluation in an ex vivo setting. The prototype FPCT scanner consists of a conventional radiographic flat panel detector, mounted on a multi-slice CT scanner gantry. Explanted human ex vivo heart and foot specimens were examined. Images were reformatted with various reconstruction algorithms and were evaluated for high-resolution anatomic information. For comparison purposes, the ex vivo specimens were also scanned with a conventional 16-detector-row CT scanner (Sensation 16, Siemens Medical Solutions, Forchheim, Germany). With the FPCT prototype used, a 1,024 x 768 resolution matrix can be obtained, resulting in an isotropic voxel size of 0.25 x 0.25 x 0.25 mm at the iso-center. Due to the high spatial resolution, very small structures such as trabecular bone or third-order distal branches of coronary arteries could be visualized. This first evaluation showed that flat panel detector systems can be used in a cone-beam computed tomography scanner and that very high spatial resolutions can be achieved. However, there are limitations for in vivo use due to constraints in low contrast resolution and slow scan speed. (orig.)

  1. A computational method for probabilistic safety assessment of I and C systems and human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2006-01-01

To make probabilistic safety assessment (PSA) more realistic, improvements to human reliability analysis (HRA) are essential. However, current HRA methods have many limitations, including the lack of consideration of the interdependency between instrumentation and control (I and C) systems and human operators, and the lack of a theoretical basis for the situation assessment of human operators. To overcome these limitations, we propose a new method for the quantitative safety assessment of I and C systems and human operators. The proposed method is developed based on computational models for the knowledge-driven monitoring and situation assessment of human operators, with consideration of the interdependency between I and C systems and human operators. The application of the proposed method to an example situation demonstrates that the quantitative description by the proposed method for a probable scenario matches well with the qualitative description of the scenario. It is also demonstrated that the proposed method can probabilistically consider all possible scenarios and can be used to quantitatively evaluate the effects of various context factors on the safety of nuclear power plants. In our opinion, the proposed method can be used as the basis for the development of advanced HRA methods
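
A knowledge-driven situation assessment of the kind proposed is often formalized as Bayesian belief updating over candidate plant states as indications arrive. The sketch below is a generic illustration, not the authors' model; the states, indications, and every probability are invented:

```python
# Bayesian situation assessment sketch: beliefs over candidate plant
# situations are updated as each indication arrives (all numbers hypothetical).
priors = {"normal": 0.90, "sg_tube_rupture": 0.05, "loca": 0.05}

# P(indication | situation) for two hypothetical indications.
likelihoods = {
    "radiation_in_secondary": {"normal": 0.01, "sg_tube_rupture": 0.90, "loca": 0.05},
    "pressurizer_level_drop": {"normal": 0.05, "sg_tube_rupture": 0.70, "loca": 0.80},
}

def update(beliefs, indication):
    """One Bayes step: multiply by the likelihood of the indication, renormalize."""
    post = {s: beliefs[s] * likelihoods[indication][s] for s in beliefs}
    z = sum(post.values())
    return {s: p / z for s, p in post.items()}

beliefs = priors
for obs in ("radiation_in_secondary", "pressurizer_level_drop"):
    beliefs = update(beliefs, obs)
print(beliefs)
```

After both indications the posterior concentrates on the steam-generator tube rupture state, illustrating how such a model yields a probabilistic rather than single-scenario assessment.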

  2. Human Technology and Human Affects

    DEFF Research Database (Denmark)

    Fausing, Bent

    2009-01-01

    Human Technology and Human Affects. This year Samsung introduced a mobile phone with "Soul". It was made with a human touch and itself included a magical touch. What function do technology and affects acquire in everyday aesthetics like this, its images and interactions included? This presentation will ask and try to answer that question. The mobile phone and its devices are depicted as being able to create a unique human presence, interaction, and affect. The medium, the technology, is a necessary helper for reaching this very special and lost humanity. Without the technology, no special humanity, no soul. The paper will investigate how technology, humanity, affects, and synaesthesia are presented and combined, with examples from everyday aesthetics, e.g. an early computer TV commercial and net commercials for mobile phones. The conclusion is that technology and affects point towards a forgotten pre-human and not he...

  3. Computing the influences of different Intraocular Pressures on the human eye components using computational fluid-structure interaction model.

    Science.gov (United States)

    Karimi, Alireza; Razaghi, Reza; Navidbakhsh, Mahdi; Sera, Toshihiro; Kudo, Susumu

    2017-01-01

    Intraocular Pressure (IOP) is defined as the pressure of the aqueous humour in the eye. Among ophthalmologists, the normal range of IOP is reported to be 10-20 mmHg, with an average of 15.50 mmHg. Keratoconus is a non-inflammatory eye disorder in which the weakened cornea is unable to preserve its normal structure against the IOP in the eye. Consequently, the cornea bulges outward and assumes a conical shape, followed by distorted vision. In addition, it is known that any alteration in the structure and composition of the lens and cornea induces a change in the eye ball as well as in the mechanical and optical properties of the eye. Understanding the precise alteration of the stresses and deformations of the eye components due to different IOPs could help elucidate etiology and pathogenesis and support the development of treatments not only for keratoconus but also for other diseases of the eye. In this study, the stresses and deformations of the human eye components were quantified at three different IOPs (10, 20, and 30 mmHg) using a Three-Dimensional (3D) computational Fluid-Structure Interaction (FSI) model of the human eye. The results revealed the highest von Mises stress, 245 kPa, in the bulged region of the cornea at an IOP of 30 mmHg. The lens also showed its highest von Mises stress, 19.38 kPa, at an IOP of 30 mmHg. In addition, increasing the IOP from 10 to 30 mmHg increased the radius of curvature of the cornea and lens accordingly. In contrast, the sclera showed its highest stress at the IOP of 10 mmHg due to an overpressure phenomenon. The variation of IOP had little influence on the stress in, and the resultant displacement of, the optic nerve. These results can be used to understand the stresses and deformations in the human eye components under different IOPs, and to clarify the significant role of IOP in the radius of curvature of the cornea and the lens.
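    The peak values above are reported as von Mises stresses. As a reminder of how that equivalent stress is obtained from the three principal stresses, here is a minimal sketch (not the authors' FSI code; the sample principal-stress values below are purely illustrative):

```python
import math

def von_mises(s1, s2, s3):
    """Equivalent (von Mises) stress from the three principal stresses.

    Units follow the inputs (e.g. kPa in, kPa out).
    """
    return math.sqrt(((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2.0)

# Illustrative principal stresses in kPa (hypothetical, not from the paper):
print(round(von_mises(250.0, 20.0, 5.0), 1))
```

    For a uniaxial stress state the equivalent stress reduces to the applied stress itself, which is a convenient sanity check on the formula.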

  4. Computation of emotions in man and machines.

    Science.gov (United States)

    Robinson, Peter; el Kaliouby, Rana

    2009-12-12

    The importance of emotional expression as part of human communication has been understood since Aristotle, and the subject has been explored scientifically since Charles Darwin and others in the nineteenth century. Advances in computer technology now allow machines to recognize and express emotions, paving the way for improved human-computer and human-human communications. Recent advances in psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. We can now consider how these advances relate to each other and how they can be brought together to influence future research in perception, attention, learning, memory, communication, decision-making and other applications. The computation of emotions includes both recognition and synthesis, using channels such as facial expressions, non-verbal aspects of speech, posture, gestures, physiology, brain imaging and general behaviour. The combination of new results in psychology with new techniques of computation is leading to new technologies with applications in commerce, education, entertainment, security, therapy and everyday life. However, there are important issues of privacy and personal expression that must also be considered.

  5. Variation in the human ribs geometrical properties and mechanical response based on X-ray computed tomography images resolution.

    Science.gov (United States)

    Perz, Rafał; Toczyski, Jacek; Subit, Damien

    2015-01-01

    Computational models of the human body are commonly used for injury prediction in automobile safety research. To create these models, the geometry of the human body is typically obtained from segmentation of medical images such as computed tomography (CT) images that have a resolution between 0.2 and 1 mm/pixel. While the accuracy of the geometrical and structural information obtained from these images depends greatly on their resolution, the effect of image resolution on the estimation of the ribs' geometrical properties has yet to be established. To do so, each of the thirty-four sections of ribs obtained from a Post Mortem Human Surrogate (PMHS) was imaged using three different CT modalities: standard clinical CT (clinCT), high resolution clinical CT (HRclinCT), and microCT. The images were processed to estimate the rib cross-section geometry and mechanical properties, and the results obtained from clinCT and HRclinCT were compared to those obtained from the microCT images by computing the 'deviation factor', a metric that quantifies their relative difference. Overall, clinCT images gave a deviation greater than 100%, and were therefore deemed inadequate for the purpose of this study. HRclinCT overestimated the rib cross-sectional area by 7.6%, the moments of inertia by about 50%, and the cortical shell area by 40.2%, while underestimating the trabecular area by 14.7%. Next, a parametric analysis was performed to quantify how the variations in the estimate of the geometrical properties affected the rib's predicted mechanical response under antero-posterior loading. A variation of up to 45% for the predicted peak force and up to 50% for the predicted stiffness was observed. These results provide a quantitative estimate of the sensitivity of the response of the FE model to the resolution of the images used to generate it. They also suggest that a correction factor could be derived from the comparison between microCT and
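    The record describes the 'deviation factor' only as a relative difference between clinCT/HRclinCT estimates and the microCT reference; assuming the common percent-relative-difference form (the paper's exact definition may differ), it can be sketched as:

```python
def deviation_factor(value, reference):
    """Percent relative difference of a clinCT/HRclinCT estimate from the
    microCT reference value (assumed form; the paper's exact definition
    may differ)."""
    return abs(value - reference) / abs(reference) * 100.0

# HRclinCT overestimating a cross-sectional area by 7.6% relative to microCT:
print(round(deviation_factor(107.6, 100.0), 1))  # 7.6
```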

  6. An audio-visual dataset of human-human interactions in stressful situations

    NARCIS (Netherlands)

    Lefter, I.; Burghouts, G.J.; Rothkrantz, L.J.M.

    2014-01-01

    Stressful situations are likely to occur at human-operated service desks, as well as at human-computer interfaces used in the public domain. Automatic surveillance can help notify when extra assistance is needed. Human communication is inherently multimodal, e.g. speech, gestures, facial expressions.

  7. 21 CFR 870.1110 - Blood pressure computer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Blood pressure computer. 870.1110 Section 870.1110 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... computer. (a) Identification. A blood pressure computer is a device that accepts the electrical signal from...

  8. Where computers disappear, virtual humans appear

    NARCIS (Netherlands)

    Nijholt, Antinus; Sourin, A.

    2004-01-01

    In this paper, we survey the role of virtual humans (or embodied conversational agents) in smart and ambient intelligence environments. Research in this area can profit from research done earlier in virtual reality environments and research on verbal and nonverbal interaction. We discuss virtual

  9. Optimal design method for a digital human–computer interface based on human reliability in a nuclear power plant. Part 3: Optimization method for interface task layout

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Wang, Yiqun; Zhang, Li; Xie, Tian; Li, Min; Peng, Yuyuan; Wu, Daqing; Li, Peiyao; Ma, Congmin; Shen, Mengxu; Wu, Xing; Weng, Mengyun; Wang, Shiwei; Xie, Cen

    2016-01-01

    Highlights: • The authors present an optimization algorithm for interface task layout. • The process of executing the proposed algorithm is depicted. • The performance evaluation adopts a neural network method. • The optimized layouts of an event's interface tasks were obtained by experiment. - Abstract: This is the last in a series of papers describing the optimal design of a digital human–computer interface of a nuclear power plant (NPP) from three different points of view based on human reliability. The purpose of this series is to propose optimization methods from varying perspectives to decrease human factor events that arise from defects of a human–computer interface. The present paper mainly addresses how to effectively lay out interface tasks across different screens. Its purpose is to decrease human errors by reducing the distance that an operator moves among different screens in each operation. To resolve the problem, the authors propose an optimization process for interface task layout for the digital human–computer interface of a NPP. To automatically lay out each interface task onto one of the screens in each operation, the paper presents a shortest moving path optimization algorithm with a dynamic flag based on human reliability. To test the algorithm's performance, the evaluation method uses a neural network based on human reliability. The lower the human error probabilities, the better the interface task layout among the different screens. Thus, by analyzing the performance of each interface task layout, the optimization result is obtained. Finally, the optimized layouts of the spurious safety injection event interface tasks of the NPP were obtained in an experiment; the proposed method has good accuracy and stability.
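    The record names a 'shortest moving path optimization algorithm with dynamic flag' but does not specify it. Purely as an illustration of the underlying idea (assigning tasks to screens so as to reduce the distance an operator moves), here is a hypothetical greedy sketch; the task names, screen indices, and greedy rule are all assumptions, and a greedy assignment is not guaranteed to be optimal:

```python
def total_switch_distance(task_screens):
    """Total inter-screen distance traversed for a task sequence, with
    screens indexed left-to-right (a simplification of the paper's metric)."""
    return sum(abs(b - a) for a, b in zip(task_screens, task_screens[1:]))

def greedy_layout(tasks, candidates):
    """Assign each task in order to whichever candidate screen minimizes the
    incremental movement -- a hypothetical stand-in for the paper's
    'shortest moving path' algorithm, which is not specified in this record."""
    layout, current = [], None
    for task in tasks:
        # First task: any candidate costs 0; afterwards, prefer nearby screens.
        best = min(candidates[task],
                   key=lambda s: 0 if current is None else abs(s - current))
        layout.append(best)
        current = best
    return layout

# Hypothetical tasks, each performable on one of several screens:
candidates = {"check_pumps": [1, 3], "open_valve": [2, 3], "confirm": [3]}
layout = greedy_layout(["check_pumps", "open_valve", "confirm"], candidates)
print(layout, total_switch_distance(layout))
```

    A global search (the role the paper's neural-network evaluation presumably plays) could find layouts the greedy rule misses, e.g. keeping all three tasks on screen 3 here.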

  10. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.
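    The staffing computation follows ESTRO-QUARTS-style norms that relate annual patient load to required equipment and staff. The sketch below uses assumed round-number ratios (patients per TRT unit, per RO, per MP), not the guideline's actual values, which vary with tumour mix and fractionation:

```python
import math

# Illustrative QUARTS-style staffing norms (assumed round numbers; the
# actual guideline values differ by tumour mix and fractionation schedule):
PATIENTS_PER_TRT = 450
PATIENTS_PER_RO = 200
PATIENTS_PER_MP = 450

def required_staff(rt_patients_per_year):
    """Required TRT units, radiation oncologists (RO) and medical
    physicists (MP) for a given annual radiotherapy patient load."""
    return {
        "TRT": math.ceil(rt_patients_per_year / PATIENTS_PER_TRT),
        "RO": math.ceil(rt_patients_per_year / PATIENTS_PER_RO),
        "MP": math.ceil(rt_patients_per_year / PATIENTS_PER_MP),
    }

# 2020 projection from the abstract: 34,041 patients needing radiotherapy.
print(required_staff(34041))
```

    Subtracting the existing units and staff from such totals gives the deficits the study reports; the dynamic model proposed by the authors additionally adjusts the ratios for anticipated changes in practice.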

  11. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

    Directory of Open Access Journals (Sweden)

    Deyan Luan

    2007-07-01

    The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from the literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa) activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
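    The 'overall state sensitivity coefficients' were obtained with a Monte Carlo strategy, but the exact estimator is not given in this record. A generic sketch of the idea, with a toy two-parameter model standing in for the 92-protein cascade, averages a normalized finite-difference sensitivity over randomly sampled parameter sets:

```python
import random

def model_output(params):
    """Toy stand-in for a simulated cascade output (e.g. peak thrombin);
    the real model has 92 proteins and 148 interactions."""
    k1, k2 = params
    return k1 / (k1 + k2)

def sensitivity(index, n_samples=1000, eps=0.01, seed=0):
    """Monte Carlo average of the normalized finite-difference sensitivity
    of the output to parameter `index` (a generic estimator, assumed form;
    not necessarily the paper's exact coefficient)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        params = [rng.uniform(0.5, 2.0), rng.uniform(0.5, 2.0)]
        base = model_output(params)
        perturbed = list(params)
        perturbed[index] *= 1.0 + eps  # relative perturbation of one parameter
        total += abs(model_output(perturbed) - base) / (abs(base) * eps)
    return total / n_samples

# A fragile (sensitive) mechanism shows a larger coefficient than a robust one:
print(sensitivity(0), sensitivity(1))
```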

  12. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  13. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  14. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words "systems analysis" are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the sense in which computer scientists use the term is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  15. The latest science and human

    International Nuclear Information System (INIS)

    Kim, Sang Il; Lee, Hae Du; Lee, Geun Hui

    1985-04-01

    The book is a collection of reports on science and humans. Its contents cover life ethics and technology ethics, the conception of the human and human science, biotechnology, the Tower of Babel in the computer age, the human brain and robots, new media and communication innovation, the status of computer engineering, the current state of development of new media, mass media and violence, crime and the scientification of terror, the conditions of life and peace, the age of the machine and literature, religious prophecy and scientific prophecy, and the hi-tech age and science education.

  16. The latest science and human

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang Il; Lee, Hae Du; Lee, Geun Hui

    1985-04-15

    The book is a collection of reports on science and humans. Its contents cover life ethics and technology ethics, the conception of the human and human science, biotechnology, the Tower of Babel in the computer age, the human brain and robots, new media and communication innovation, the status of computer engineering, the current state of development of new media, mass media and violence, crime and the scientification of terror, the conditions of life and peace, the age of the machine and literature, religious prophecy and scientific prophecy, and the hi-tech age and science education.

  17. Augmenting digital displays with computation

    Science.gov (United States)

    Liu, Jing

    As we inevitably step deeper and deeper into a world connected via the Internet, more and more information will be exchanged digitally. Displays are the interface between digital information and each individual. Naturally, one fundamental goal of displays is to reproduce information as realistically as possible since humans still care a lot about what happens in the real world. Human eyes are the receiving end of such information exchange; therefore it is impossible to study displays without studying the human visual system. In fact, the design of displays is closely coupled with what human eyes are capable of perceiving. For example, we are less interested in building displays that emit light in the invisible spectrum. This dissertation explores how we can augment displays with computation, taking both display hardware and the human visual system into consideration. Four novel projects on display technologies are included in this dissertation: First, we propose a software-based approach to driving multiview autostereoscopic displays. Our display algorithm can dynamically assign views to hardware display zones based on multiple observers' current head positions, substantially reducing crosstalk and stereo inversion. Second, we present a dense projector array that creates a seamless 3D viewing experience for multiple viewers. We smoothly interpolate the set of viewer heights and distances on a per-vertex basis across the array's field of view, reducing image distortion, crosstalk, and artifacts from tracking errors. Third, we propose a method for high dynamic range display calibration that takes into account the variation of the chrominance error over luminance. We propose a data structure for enabling efficient representation and querying of the calibration function, which also allows user-guided balancing between memory consumption and the amount of computation. Fourth, we present user studies that demonstrate that the ˜ 60 Hz critical flicker fusion

  18. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology, by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees), and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization, and the requirements of both customers and employees. In this article we will: 1) demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) define and discuss the place of negotiated interactions in airline operations, identifying how these

  19. Selection of suitable hand gestures for reliable myoelectric human computer interface.

    Science.gov (United States)

    Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

    2015-04-09

    A myoelectric-controlled prosthetic hand requires machine-based identification of hand gestures using surface electromyogram (sEMG) recorded from the forearm muscles. This study observed that a sub-set of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize sensitivity and specificity. Experiments were conducted in which sEMG was recorded from the muscles of the forearm while subjects performed hand gestures; the recordings were then classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated from a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected, and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%): Hand open, Hand close, Little finger flexion, Ring finger flexion, Middle finger flexion and Thumb flexion. This work has shown that reliable myoelectric-based human computer interface systems require careful selection of the gestures to be recognized; without such selection, the reliability is poor.
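    The PNM itself is not defined in this record, but it is generated from confusion matrices, and the quantities reported (sensitivity and specificity) can be computed per gesture class from such a matrix. A minimal sketch with a hypothetical 3-gesture confusion matrix:

```python
def per_class_sens_spec(conf, cls):
    """Sensitivity and specificity of class `cls` from a square confusion
    matrix `conf` (rows: true gesture, columns: predicted gesture)."""
    n = len(conf)
    tp = conf[cls][cls]
    fn = sum(conf[cls]) - tp                          # missed detections of cls
    fp = sum(conf[r][cls] for r in range(n)) - tp     # other gestures called cls
    tn = sum(sum(row) for row in conf) - tp - fn - fp
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 3-gesture confusion matrix (counts), not from the paper:
conf = [[18, 1, 1],
        [2, 17, 1],
        [0, 1, 19]]
sens, spec = per_class_sens_spec(conf, 0)
print(round(sens, 3), round(spec, 3))
```

    Averaging such per-class figures over all gestures gives overall sensitivity/specificity values like those the study reports; the PNM presumably combines them into a single ranking score.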

  20. Monochromatic computed tomography of the human brain using synchrotron x rays: Technical feasibility

    International Nuclear Information System (INIS)

    Nachaliel, E.; Dilmanian, F.A.; Garrett, R.F.; Thomlinson, W.C.; Chapman, L.D.; Gmuer, N.F.; Lazarz, N.M.; Moulin, H.R.; Rivers, M.L.; Rarback, H.; Stefan, P.M.; Spanne, P.; Luke, P.N.; Pehl, R.; Thompson, A.C.; Miller, M.

    1991-01-01

    A monochromatic computed tomography (CT) scanner is being developed at the X17 superconducting wiggler beamline at the National Synchrotron Light Source (NSLS), Brookhaven National Laboratory, to image the human head and neck. The system configuration is one of a horizontal fan beam and an upright, seated, rotating subject. The purposes of the project are to demonstrate the improvement in image contrast and in quantitative image accuracy that can be obtained with monochromatic CT, and to apply the system to specific clinical research programs in neuroradiology. This paper describes the first phantom studies carried out with a prototype system, using the dual photon absorptiometry (DPA) method at energies of 20 and 39 keV. The results show that improvements in image contrast and quantitative accuracy are possible with monochromatic DPA CT. Estimates of the clinical performance of the planned CT system are made on the basis of these initial results.

  1. Computer modelling of the chemical speciation of Americium (III) in human body fluids

    International Nuclear Information System (INIS)

    Jiang, Shu-bin; Lei, Jia-rong; Wang, He-yi; Zhong, Zhi-jing; Yang, Yong; Du, Yang

    2008-01-01

    A multi-phase equilibrium model consisting of multiple metal ions and low-molecular-mass ligands in human body fluids has been constructed to discuss the speciation of Am3+ in gastric juice, sweat, interstitial fluid, intracellular fluid and urine of the human body, respectively. Computer simulations indicated that, in the gastric juice model with [Am] = 1 x 10^-7 mol/L, the major Am(III) species were Am3+, [AmCl]2+ and [AmH2PO4]2+ at pH < 4, AmPO4 became dominant at higher pH values, and the percentage of AmPO4 increased with [Am]. In the sweat system, Am(III) existed as soluble species at pH 4.2 - pH 7.5 when [Am] = 1 x 10^-7 mol/L, and existed as Am3+ and [AmOH]2+ at pH 6.5 when [Am] < 1 x 10^-10 mol/L or [Am] > 5 x 10^-8 mol/L. With the addition of EDTA, Am(III) existed as soluble [AmEDTA]-, whereas it existed as insoluble AmPO4 when [Am] > 1 x 10^-12 mol/L in interstitial fluid. The major Am(III) species was AmPO4 at pH 7.0 and [Am] = 4 x 10^-12 mol/L in intracellular fluid, which implies that Am(III) exhibits strong cell toxicity. The increased percentage of soluble Am(III) species at lower pH hints that Am(III) in aerosol form, ingested by macrophages, could be released into interstitial fluid and bring strong toxicity to the skeletal system. The soluble Am(III) species was dominant at pH < 4 but not at pH > 4.5 when [Am] = 1 x 10^-10 mol/L in human urine, so it is favorable to excrete Am(III) from the kidney by taking acidic materials. (author)

  2. Computational modeling of turn-taking dynamics in spoken conversations

    OpenAIRE

    Chowdhury, Shammur Absar

    2017-01-01

    The study of human interaction dynamics has been at the center of multiple research disciplines, including computer and social sciences, conversational analysis and psychology, for decades. Recent interest has been shown in designing computational models to improve human-machine interaction systems as well as to support humans in their decision-making processes. Turn-taking is one of the key aspects of conversational dynamics in dyadic conversations and is an integral part of hu...

  3. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

    DEFF Research Database (Denmark)

    Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

    2018-01-01

    Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library, collected using keyword search, to determine prevailing approaches, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term 'creativity'; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

  4. Computed aided system for separation and classification of the abnormal erythrocytes in human blood

    Science.gov (United States)

    Wąsowicz, Michał; Grochowski, Michał; Kulka, Marek; Mikołajczyk, Agnieszka; Ficek, Mateusz; Karpieńko, Katarzyna; Cićkiewicz, Maciej

    2017-12-01

    Human peripheral blood consists of cells (red cells, white cells, and platelets) suspended in plasma. In the following research, the team assessed the influence of nanodiamond particles on blood elements over various periods of time. The material used in the study consisted of samples taken from ten healthy humans of various ages, different blood types and both sexes. The measurements were carried out by adding unmodified and oxidation-modified nanodiamonds to the blood. The blood was exposed to two diamond concentrations: 20 μl and 100 μl. The number of abnormal cells increased with time. The percentage of echinocytes resulting from interaction with nanodiamonds in the various time intervals was small for the individual specimens. The impact of the two diamond types on red blood cells had no clinical importance. It is supposed that, because of the function of these cells, dehydration of red cells takes place as a result of long-lasting exposure. The analysis of the influence of nanodiamond particles on blood elements was supported by a computer system designed for automatic counting and classification of Red Blood Cells (RBC). The system utilizes advanced image processing methods for RBC separation and counting, and the Eigenfaces method coupled with neural networks for classifying RBCs into normal and abnormal cells.

  5. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after participating in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than it was at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions in the course website over the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings might hint at the need to develop new assessment tools for learners and trainees that take human-computer interaction into consideration when measuring self-efficacy variables.

  6. About possibility of temperature trace observing on a human skin through clothes by using computer processing of IR image

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

    2017-05-01

    One of the urgent security problems is the detection of objects placed inside the human body. Obviously, for safety reasons, X-rays cannot be used widely and often for such object detection. For this purpose, we propose to use a THz camera and an IR camera. Here we continue to investigate the possibility of using an IR camera for the detection of a temperature trace on a human body. In contrast to a passive THz camera, the IR camera does not allow the object under clothing to be seen very clearly. Of course, this is a big disadvantage for a security solution based on the IR camera. To find possible ways of overcoming this disadvantage, we perform experiments with an IR camera produced by the FLIR Company and develop a novel approach to computer processing of the images it captures. This allows us to increase the effective temperature resolution of the IR camera as well as the effective sensitivity of the human eye viewing its images. As a consequence, it becomes possible to see a change of human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and we follow the temperature trace on the skin caused by temperature changes inside the body. Some experiments also observe the temperature trace of objects placed behind a thick coverall. The demonstrated results are very important for the detection of forbidden objects, concealed inside the human body, using non-destructive control without X-rays.

  7. The Human/Machine Humanities: A Proposal

    Directory of Open Access Journals (Sweden)

    Ollivier Dyens

    2016-03-01

    Full Text Available What does it mean to be human in the 21st century? The pull of engineering on every aspect of our lives, the impact of machines on how we represent ourselves, the influence of computers on our understanding of free will, individuality and species, and the effect of microorganisms on our behaviour are so great that one cannot discourse on humanity and the humanities without considering their entanglement with technology and with the multiple new dimensions of reality that it opens up. The future of the humanities should take into account AI, bacteria, software, viruses (both organic and inorganic), hardware, machine language, parasites, big data, monitors, pixels, swarm systems and the Internet. One cannot think of humanity and the humanities as distinct from technology anymore.

  8. Inferring Human Activity in Mobile Devices by Computing Multiple Contexts.

    Science.gov (United States)

    Chen, Ruizhi; Chu, Tianxing; Liu, Keqiang; Liu, Jingbin; Chen, Yuwei

    2015-08-28

    This paper introduces a framework for inferring human activities in mobile devices by computing spatial contexts, temporal contexts, spatiotemporal contexts, and user contexts. A spatial context is a significant location that is defined as a geofence, which can be a node associated with a circle, or a polygon; a temporal context contains time-related information that can be, e.g., a local time tag, a time difference between geographical locations, or a timespan; a spatiotemporal context is defined as a dwelling length at a particular spatial context; and a user context includes user-related information such as the user's mobility contexts, environmental contexts, psychological contexts, or social contexts. Using the measurements of the built-in sensors and radio signals in mobile devices, we can snapshot a contextual tuple every second comprising the aforementioned contexts. Given a contextual tuple, the framework evaluates the posterior probability of each candidate activity in real-time using a Naïve Bayes classifier. A large dataset containing 710,436 contextual tuples was recorded over one week in an experiment carried out at Texas A&M University Corpus Christi with three participants. The test results demonstrate that the multi-context solution significantly outperforms the spatial-context-only solution. A classification accuracy of 61.7% is achieved for the spatial-context-only solution, while 88.8% is achieved for the multi-context solution.
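    The per-second classification step described above can be illustrated with a toy Naïve Bayes model over discrete contextual tuples. Everything below (feature values, activities, tuples, smoothing choice) is invented for illustration and is not the study's 710,436-tuple dataset.

```python
import math
from collections import defaultdict

# Hypothetical contextual tuples (spatial, temporal, spatiotemporal, user)
# paired with activity labels.
data = [
    (("gym", "evening", "long", "moving"), "exercising"),
    (("gym", "evening", "short", "moving"), "exercising"),
    (("office", "morning", "long", "still"), "working"),
    (("office", "afternoon", "long", "still"), "working"),
    (("cafe", "noon", "short", "still"), "eating"),
    (("cafe", "evening", "short", "still"), "eating"),
]

# Per-activity priors, per-slot likelihood counts, and per-slot vocabularies.
priors = defaultdict(int)
counts = defaultdict(int)            # (activity, slot, value) -> count
vocab = [set() for _ in range(4)]
for feats, act in data:
    priors[act] += 1
    for i, v in enumerate(feats):
        counts[(act, i, v)] += 1
        vocab[i].add(v)

def classify(feats):
    """Naive Bayes posterior (log-space, Laplace-smoothed) over activities."""
    best, best_lp = None, -math.inf
    for act, n in priors.items():
        lp = math.log(n / len(data))
        for i, v in enumerate(feats):
            lp += math.log((counts[(act, i, v)] + 1) / (n + len(vocab[i])))
        if lp > best_lp:
            best, best_lp = act, lp
    return best

pred = classify(("gym", "evening", "long", "moving"))
```

In the framework above, this classification would be repeated once per second on each newly snapshotted tuple.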

  9. How should Fitts' Law be applied to human-computer interaction?

    Science.gov (United States)

    Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.

    1992-01-01

    The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
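    The models the paper compares are variants of Fitts' Law that differ only in which extent of the text object is taken as the target size W. A small sketch using the common Shannon formulation MT = a + b·log2(A/W + 1), with illustrative coefficients and geometry (the paper does not report these exact values):

```python
import math

# Shannon formulation of Fitts' Law: MT = a + b * log2(A / W + 1),
# where A is movement amplitude and W the effective target size.
def fitts_mt(a, b, amplitude, width):
    return a + b * math.log2(amplitude / width + 1)

# The paper's point: the effective W depends on the task. For a 100x20 px
# text object, point-click pointing fits best with the diagonal extent in the
# direction of pointing, point-drag pointing with the vertical size.
w_click = math.hypot(100, 20)   # diagonal of the text object
w_drag = 20.0                   # vertical size of the text object

a, b, A = 0.1, 0.15, 600.0      # intercept (s), slope (s/bit), amplitude (px)
mt_click = fitts_mt(a, b, A, w_click)
mt_drag = fitts_mt(a, b, A, w_drag)
# A smaller effective width -> higher index of difficulty -> longer MT,
# consistent with pointing being slower in the point-drag sequence.
```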

  10. Promises and perils of computational thinking

    DEFF Research Database (Denmark)

    Gad, Christopher; Douglas-Jones, Rachel

    Proponents of computational thinking use the concept to account for what they perceive as important generalizable aspects of human thought (Wing 2011, National Research Council USA 2010, 2011). Simultaneously, the concept is employed to designate an ambitious pedagogical programme, in which computational thinking can be taught as a skill for the digitally literate 21st century (ibid.). As such, CT is seen both as an innate human capacity and a programme for developing future-oriented skills, both for individuals and for populations at large. This paper explores what we perceive as conceptual slippage within the computational thinking concept, as it moves between the descriptive and promotional modes described above. We consider the implications of this slippage through various conceptual apparatuses available within STS, since these approaches are already critical of distinctions between......

  11. Introduction to This Special Issue on Context-Aware Computing.

    Science.gov (United States)

    Moran, Thomas P.; Dourish, Paul

    2001-01-01

    Discusses pervasive, or ubiquitous, computing; explains the notion of context; and defines context-aware computing as the key to disperse and enmesh computation into our lives. Considers context awareness in human-computer interaction and describes the broad topic areas of the essays included in this special issue. (LRW)

  12. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis.
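    As a concrete (and heavily simplified) instance of the kind of pipeline the review surveys, the sketch below extracts two classic EMG features — RMS amplitude and zero-crossing rate — from synthetic signal windows and trains a logistic-regression unit, a minimal stand-in for the neural-network component of an HSCS. All signals and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic EMG windows: "rest" = low-amplitude noise, "contraction" = high.
def emg_window(active, rng, n=256):
    amp = 1.0 if active else 0.2
    return amp * rng.standard_normal(n)

def features(x):
    rms = np.sqrt(np.mean(x ** 2))                          # amplitude feature
    zc = np.mean(np.signbit(x[:-1]) != np.signbit(x[1:]))   # zero-crossing rate
    return np.array([rms, zc])

X = np.array([features(emg_window(i % 2 == 1, rng)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])

# Logistic regression trained by batch gradient descent.
w = np.zeros(2)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.5 * (X.T @ g) / len(y)
    b -= 0.5 * g.mean()

pred = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)
acc = (pred == y).mean()
```

A real HSCS would replace or augment this classifier with the fuzzy, evolutionary, or swarm-based components the review compares.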

  13. Computational Fluid Dynamics Ventilation Study for the Human Powered Centrifuge at the International Space Station

    Science.gov (United States)

    Son, Chang H.

    2012-01-01

    The Human Powered Centrifuge (HPC) is a facility planned to be installed on board the International Space Station (ISS) to enable crew exercise under artificial gravity conditions. The HPC equipment includes a "bicycle" for long-term exercise of a crewmember that provides power for rotation of the HPC at a speed of 30 rpm. A crewmember exercising vigorously on the centrifuge generates about twice as much carbon dioxide as a crewmember under ordinary conditions. The goal of the study is to analyze the airflow and carbon dioxide distribution within the Pressurized Multipurpose Module (PMM) cabin when the HPC is operating. A fully unsteady formulation is used for CFD-based modeling of airflow and CO2 transport, with the so-called sliding mesh concept: the HPC equipment with the adjacent Bay 4 cabin volume is considered in the rotating reference frame, while the rest of the cabin volume is considered in the stationary reference frame. The rotating part of the computational domain also includes a human body model. Localized effects of carbon dioxide dispersion are examined. The strong influence of the rotating HPC equipment on the detected CO2 distribution is discussed.
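    While the study resolves local CO2 concentrations with full CFD, the cabin-averaged trend can be sketched with a zero-dimensional well-mixed mass balance. All numbers below (cabin volume, ventilation flow, generation rates) are illustrative placeholders, not values from the study; only the "twice the CO2 generation during exercise" factor comes from the abstract.

```python
# Well-mixed cabin CO2 balance: V dC/dt = G + Q * (C_in - C),
# with V cabin volume, G crew CO2 generation, Q ventilation exchange flow.
V = 70.0            # m^3, illustrative cabin volume
Q = 0.1             # m^3/s, illustrative ventilation flow
C_in = 0.0004       # inlet CO2 volume fraction (~400 ppm, illustrative)
G_rest = 5e-6       # m^3/s, illustrative resting CO2 generation
G_ex = 2 * G_rest   # vigorous exercise roughly doubles generation

def simulate(G, hours=2.0, dt=1.0):
    """Explicit Euler integration of the well-mixed balance."""
    C = C_in
    for _ in range(int(hours * 3600 / dt)):
        C += dt * (G + Q * (C_in - C)) / V
    return C

c_rest = simulate(G_rest)
c_ex = simulate(G_ex)
# The steady state is C_in + G/Q: doubling G doubles the excess over inlet.
```

The CFD study's point is precisely that this well-mixed average hides the localized accumulation near the rotating equipment.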

  14. Neuroscience, brains, and computers

    Directory of Open Access Journals (Sweden)

    Giorno Maria Innocenti

    2013-07-01

    Full Text Available This paper addresses the role of the neurosciences in establishing what the brain is and how states of the brain relate to states of the mind. The brain is viewed as a computational device performing operations on symbols. However, the brain is a special-purpose computational device designed by evolution and development for survival and reproduction, in close interaction with the environment. The hardware of the brain (its structure) is very different from that of man-made computers. The computational style of the brain is also very different from traditional computers: the computational algorithms, instead of being sets of external instructions, are embedded in brain structure. Concerning the relationships between brain and mind, a number of questions lie ahead. One of them is why and how only the human brain grasped the notion of God, probably only at the evolutionary stage attained by Homo sapiens.

  15. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

    Science.gov (United States)

    Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

    2016-08-01

    Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges towards developing algorithms that should not only process the data in real-time (for the BCI to be responsive) but also be robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real-time to inform a BCI.
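    To make the real-time constraint concrete, the sketch below trains a ridge-regression decoder offline on simulated 96-channel firing rates and then decodes each time bin with a single matrix-vector product — cheap enough to keep up with a high-rate stream. This is an illustrative stand-in, not the paper's actual decoding algorithm or data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated training data: 96 channels of binned firing rates driven by a
# hidden 2-D movement intent, plus sensor noise.
n_ch, n_bins = 96, 500
true_W = 0.1 * rng.standard_normal((n_ch, 2))
intent = rng.standard_normal((n_bins, 2))                    # target kinematics
rates = intent @ true_W.T + 0.1 * rng.standard_normal((n_bins, n_ch))

# Ridge regression fit: W = (X^T X + lambda I)^-1 X^T Y.
lam = 1.0
W = np.linalg.solve(rates.T @ rates + lam * np.eye(n_ch), rates.T @ intent)

# "Real-time" use: each new bin is decoded with one matrix-vector product.
decoded = rates @ W
corr = np.corrcoef(decoded[:, 0], intent[:, 0])[0, 1]
```

Robustness to the non-stationarities the paper describes would require, e.g., periodic refitting or adaptive decoders on top of this baseline.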

  16. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    Science.gov (United States)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For the human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined the formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration in the vicinity of Earth, the Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into the interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  17. Human computer interaction and communication aids for hearing-impaired, deaf and deaf-blind people: Introduction to the special thematic session

    DEFF Research Database (Denmark)

    Bothe, Hans-Heinrich

    2008-01-01

    This paper gives an overview of and extends the Special Thematic Session (STS) on research and development of technologies for hearing-impaired, deaf, and deaf-blind people. The topics of the session focus on special equipment or services to improve communication and human computer interaction. The papers are related to visual communication using captions, sign language, speech-reading, to vibro-tactile stimulation, or to general services for hearing-impaired persons.

  18. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Full Text Available Computer operational parameters have a vital influence on operator efficiency from the readability viewpoint. Four parameters, namely font, text/background color, viewing angle, and viewing distance, are analyzed. A text reading task, in the form of English text presented on the computer screen, was given to the participating subjects, and their performance, measured in terms of the number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters to maximize operators’ efficiency in performing the readability task. Two levels of each parameter have been considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) were employed to investigate the operators’ performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40-degree viewing angle, and a 60 cm viewing distance, the subjects were quite comfortable and efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% towards the stated objective, followed by font type at 18.17%, viewing distance at 7.04%, and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
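    The Taguchi analysis above ranks parameter levels by the signal-to-noise ratio; since NWRPM is a larger-the-better response, the appropriate form is S/N = -10·log10(mean(1/y²)). A minimal sketch with hypothetical replicate readings (the paper's raw data are not reported in the abstract):

```python
import numpy as np

# Larger-the-better S/N ratio used in Taguchi analysis:
# S/N = -10 * log10( mean(1 / y_i^2) ), in dB; higher is better.
def sn_larger_better(y):
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicate NWRPM readings for two font levels.
times_roman = [182, 190, 186]
other_font = [150, 155, 148]

sn_tr = sn_larger_better(times_roman)
sn_other = sn_larger_better(other_font)
# The level with the higher S/N ratio is preferred at that factor --
# here Times Roman, matching the study's conclusion.
```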

  19. Prospects of a mathematical theory of human behavior in complex man-machine systems tasks. [time sharing computer analogy of automobile driving

    Science.gov (United States)

    Johannsen, G.; Rouse, W. B.

    1978-01-01

    A hierarchy of human activities is derived by analyzing automobile driving in general terms. A structural description leads to a block diagram and a time-sharing computer analogy. The range of applicability of existing mathematical models is considered with respect to the hierarchy of human activities in actual complex tasks. Other mathematical tools so far not often applied to man-machine systems are also discussed. The mathematical descriptions at least briefly considered here include utility, estimation, control, queueing, and fuzzy set theory as well as artificial intelligence techniques. Some thoughts are given as to how these methods might be integrated and how further work might be pursued.

  20. SU-F-J-174: A Series of Computational Human Phantoms in DICOM-RT Format for Normal Tissue Dose Reconstruction in Epidemiological Studies

    International Nuclear Information System (INIS)

    Pyakuryal, A; Moroz, B; Lee, C; Pelletier, C; Jung, J; Lee, C

    2016-01-01

    Purpose: Epidemiological studies of second cancer risk in radiotherapy patients often require individualized dose estimates for normal tissues. Prior to 3D conformal radiation therapy planning, patient anatomy information was mostly limited to 2D radiological images or not available at all. Generic patient CT images are therefore often used in commercial radiotherapy treatment planning systems (TPS) to reconstruct normal tissue doses. The objective of the current work was to develop a series of reference-size computational human phantoms in DICOM-RT format for direct use in dose reconstruction in TPS. Methods: Contours of 93 organs and tissues were extracted from a series of pediatric and adult hybrid computational human phantoms (newborn, 1-, 5-, 10-, 15-year-old, and adult males and females) using Rhinoceros software. A MATLAB script was created to convert the contours into the DICOM-RT structure format. Simulated CT images with a resolution of 1×1×3 mm3 were also generated from the binary phantom format and coupled with the DICOM-structure files. Accurate organ volumes were obtained through precise delineation of the contours in the converted format. Due to the complex geometry of some organs, a higher resolution (1×1×1 mm3) was found to be more efficient for the conversion of the newborn and 1-year-old phantoms. Results: Contour sets were efficiently converted into DICOM-RT structures in a relatively short time (about 30 minutes for each phantom). A good agreement was observed between the volumes of the original phantoms and the converted contours for large organs (NRMSD<1.0%) and small organs (NRMSD<7.7%). Conclusion: A comprehensive series of computational human phantoms in DICOM-RT format was created to support epidemiological studies of second cancer risks in radiotherapy patients. We confirmed that the DICOM-RT phantoms were successfully imported into the TPS programs of major vendors.
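    The volume agreement reported above can be quantified with an NRMSD-style metric. The sketch below uses one common definition (RMS of the relative volume differences, in percent); the organ volumes are hypothetical, and the paper may normalise differently.

```python
import numpy as np

# Hypothetical organ volumes (cm^3): original phantom vs. volumes recomputed
# from the converted DICOM-RT contours. Not values from the study.
original = np.array([1450.0, 320.0, 28.0, 6.5])
converted = np.array([1442.0, 318.5, 27.6, 6.3])

def nrmsd_percent(ref, est):
    """RMS of the per-organ relative differences, expressed in percent."""
    rel = (ref - est) / ref
    return 100 * np.sqrt(np.mean(rel ** 2))

value = nrmsd_percent(original, converted)
# Smaller organs contribute larger relative errors, consistent with the
# looser NRMSD bound the study reports for small organs.
```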

  1. SU-F-J-174: A Series of Computational Human Phantoms in DICOM-RT Format for Normal Tissue Dose Reconstruction in Epidemiological Studies

    Energy Technology Data Exchange (ETDEWEB)

    Pyakuryal, A; Moroz, B [National Cancer Institute, National Institutes of Health, Rockville, MD (United States); Lee, C [University of Michigan, Ann Arbor, MI (United States); Pelletier, C; Jung, J [East Carolina University Greenville, NC (United States); Lee, C [National Cancer Institute, Rockville, MD (United States)

    2016-06-15

    Purpose: Epidemiological studies of second cancer risk in radiotherapy patients often require individualized dose estimates for normal tissues. Prior to 3D conformal radiation therapy planning, patient anatomy information was mostly limited to 2D radiological images or not available at all. Generic patient CT images are therefore often used in commercial radiotherapy treatment planning systems (TPS) to reconstruct normal tissue doses. The objective of the current work was to develop a series of reference-size computational human phantoms in DICOM-RT format for direct use in dose reconstruction in TPS. Methods: Contours of 93 organs and tissues were extracted from a series of pediatric and adult hybrid computational human phantoms (newborn, 1-, 5-, 10-, 15-year-old, and adult males and females) using Rhinoceros software. A MATLAB script was created to convert the contours into the DICOM-RT structure format. Simulated CT images with a resolution of 1×1×3 mm3 were also generated from the binary phantom format and coupled with the DICOM-structure files. Accurate organ volumes were obtained through precise delineation of the contours in the converted format. Due to the complex geometry of some organs, a higher resolution (1×1×1 mm3) was found to be more efficient for the conversion of the newborn and 1-year-old phantoms. Results: Contour sets were efficiently converted into DICOM-RT structures in a relatively short time (about 30 minutes for each phantom). A good agreement was observed between the volumes of the original phantoms and the converted contours for large organs (NRMSD<1.0%) and small organs (NRMSD<7.7%). Conclusion: A comprehensive series of computational human phantoms in DICOM-RT format was created to support epidemiological studies of second cancer risks in radiotherapy patients. We confirmed that the DICOM-RT phantoms were successfully imported into the TPS programs of major vendors.

  2. Dual-Energy Computed Tomography Gemstone Spectral Imaging: A Novel Technique to Determine Human Cardiac Calculus Composition.

    Science.gov (United States)

    Cheng, Ching-Li; Chang, Hsiao-Huang; Ko, Shih-Chi; Huang, Pei-Jung; Lin, Shan-Yang

    2016-01-01

    Understanding the chemical composition of any calculus in different human organs is essential for choosing the best treatment strategy for patients. The purpose of this study was to assess the capability of determining the chemical composition of a human cardiac calculus using gemstone spectral imaging (GSI) mode on a single-source dual-energy computed tomography (DECT) scanner in vitro. The cardiac calculus was scanned directly on the Discovery CT750 HD FREEdom Edition using GSI mode, in vitro. A portable fiber-optic Raman spectrometer was also applied to verify the quantitative accuracy of the DECT measurements. The results of the spectral DECT measurements indicate that the effective Z values at 3 designated positions in this calculus were 15.02 to 15.47, close to the values of 15.74 to 15.86 corresponding to the effective Z values of calcium apatite and hydroxyapatite. This was also reflected in the Raman spectral data by the predominant Raman peak at 960 cm⁻¹ for hydroxyapatite and the minor peak at 875 cm⁻¹ for calcium apatite. A single-source DECT with GSI mode was used for the first time to examine the morphological characteristics and chemical composition of a giant human cardiac calculus, in vitro. The CT results were consistent with the Raman spectral data, suggesting that spectral CT imaging techniques could be used to accurately diagnose and characterize the compositional materials of cardiac calculi.
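    The effective Z values quoted above can be reproduced from first principles with the Mayneord power-law formula. The sketch below computes Z_eff for hydroxyapatite, Ca10(PO4)6(OH)2; the exponent 2.94 is the conventional choice for this formula, though the scanner software may use a different definition.

```python
# Mayneord power-law estimate of the effective atomic number:
#   Z_eff = ( sum_i f_i * Z_i**2.94 )**(1/2.94),
# where f_i is the fractional electron contribution of element i.
# Hydroxyapatite, Ca10(PO4)6(OH)2 -> Ca:10, P:6, O:26, H:2 atoms per unit.
composition = {20: 10, 15: 6, 8: 26, 1: 2}   # atomic number Z -> atom count

electrons = {Z: Z * n for Z, n in composition.items()}   # electrons per element
total = sum(electrons.values())
z_eff = sum((e / total) * Z ** 2.94 for Z, e in electrons.items()) ** (1 / 2.94)
# z_eff comes out near 15.9, consistent with the 15.74-15.86 range quoted
# for hydroxyapatite in the abstract.
```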

  3. [The current state of the brain-computer interface problem].

    Science.gov (United States)

    Shurkhay, V A; Aleksandrova, E V; Potapov, A A; Goryainov, S A

    2015-01-01

    It was only 40 years ago that the first PC appeared. Over this period, rather short in historical terms, we have witnessed revolutionary changes in the lives of individuals and the entire society. Computer technologies are tightly connected with every field, either directly or indirectly. We can currently claim that computers are far superior to the human mind in a number of parameters; however, machines lack the key feature: they are incapable of independent thinking (like a human). The key to the successful development of humankind, however, is collaboration between the brain and the computer rather than competition. Such collaboration, in which a computer broadens, supplements, or replaces some brain functions, is known as the brain-computer interface. Our review focuses on real-life implementations of this collaboration.

  4. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions of place in human geography and recent research in interaction design focusing on embodied interaction. Central themes in this grounding revolve around place and its relation to embodiment and practice, as well as the social, cultural and material aspects conditioning the enactment of place. Selected...

  5. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  6. Vascular tissue engineering by computer-aided laser micromachining.

    Science.gov (United States)

    Doraiswamy, Anand; Narayan, Roger J

    2010-04-28

    Many conventional technologies for fabricating tissue engineering scaffolds are not suitable for fabricating scaffolds with patient-specific attributes. For example, many conventional technologies do not provide control over the overall scaffold geometry or over cell position within the scaffold. In this study, the use of computer-aided laser micromachining to create scaffolds for vascular tissue networks was investigated. Computer-aided laser micromachining was used to construct patterned surfaces in agarose or in silicon, which were used for differential adherence and growth of cells into vascular tissue networks. Concentric three-ring structures were fabricated on agarose hydrogel substrates, in which the inner ring contained human aortic endothelial cells, the middle ring contained HA587 human elastin, and the outer ring contained human aortic vascular smooth muscle cells. Basement membrane matrix containing vascular endothelial growth factor and heparin was used to promote proliferation of human aortic endothelial cells within the vascular tissue networks. Computer-aided laser micromachining provides a unique approach to fabricating small-diameter blood vessels for bypass surgery as well as other artificial tissues with complex geometries.

  7. Development and validation of a new dynamic computer-controlled model of the human stomach and small intestine.

    Science.gov (United States)

    Guerra, Aurélie; Denis, Sylvain; le Goff, Olivier; Sicardi, Vincent; François, Olivier; Yao, Anne-Françoise; Garrait, Ghislain; Manzi, Aimé Pacifique; Beyssac, Eric; Alric, Monique; Blanquet-Diot, Stéphanie

    2016-06-01

    For ethical, regulatory, and economic reasons, in vitro human digestion models are increasingly used as an alternative to in vivo assays. This study aims to present the new Engineered Stomach and small INtestine (ESIN) model and its validation for pharmaceutical applications. This dynamic computer-controlled system reproduces, according to in vivo data, the complex physiology of the human stomach and small intestine, including pH, transit times, chyme mixing, digestive secretions, and passive absorption of digestion products. Its innovative design allows a progressive meal intake and the differential gastric emptying of solids and liquids. The pharmaceutical behavior of two model drugs (paracetamol immediate release form and theophylline sustained release tablet) was studied in ESIN during liquid digestion. The results were compared to those found with a classical compendial method (paddle apparatus) and in human volunteers. Paracetamol and theophylline tablets showed similar absorption profiles in ESIN and in healthy subjects. For theophylline, a level A in vitro-in vivo correlation could be established between the results obtained in ESIN and in humans. Interestingly, using a pharmaceutical basket, the swelling and erosion of the theophylline sustained release form was followed during transit throughout ESIN. ESIN emerges as a relevant tool for pharmaceutical studies but once further validated may find many other applications in nutritional, toxicological, and microbiological fields. Biotechnol. Bioeng. 2016;113: 1325-1335. © 2015 Wiley Periodicals, Inc.
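    A Level A in vitro-in vivo correlation, as established for theophylline above, is a point-to-point linear relation between the fraction dissolved in vitro and the fraction absorbed in vivo at matching times. A minimal sketch with hypothetical data points (not the study's measurements):

```python
import numpy as np

# Hypothetical matched time-point fractions: dissolved in vitro (ESIN) vs.
# absorbed in vivo. A Level A IVIVC expects these to fall on a line.
frac_dissolved = np.array([0.05, 0.20, 0.45, 0.70, 0.88, 0.97])
frac_absorbed = np.array([0.04, 0.18, 0.43, 0.69, 0.90, 0.98])

# Least-squares line and coefficient of determination R^2.
slope, intercept = np.polyfit(frac_dissolved, frac_absorbed, 1)
pred = slope * frac_dissolved + intercept
ss_res = np.sum((frac_absorbed - pred) ** 2)
ss_tot = np.sum((frac_absorbed - frac_absorbed.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
# A slope near 1 and a high R^2 support a Level A correlation.
```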

  8. Flaws in current human training protocols for spontaneous Brain-Computer Interfaces: lessons learned from instructional design

    Directory of Open Access Journals (Sweden)

    Fabien eLotte

    2013-09-01

    Full Text Available While recent research on Brain-Computer Interfaces (BCIs) has highlighted their potential for many applications, they remain barely used outside laboratories. The main reason is their lack of robustness: with current BCIs, mental state recognition is usually slow and often incorrect. Spontaneous BCIs (i.e., mental imagery-based BCIs) often rely on mutual learning efforts by the user and the machine, with BCI users learning to produce stable EEG patterns (spontaneous BCI control being widely acknowledged as a skill) while the computer learns to automatically recognize these EEG patterns using signal processing. Most research so far has focused on signal processing, mostly neglecting the human in the loop. However, how well the user masters the BCI skill is also a key element explaining BCI robustness: if the user is not able to produce stable and distinct EEG patterns, then no signal processing algorithm will be able to recognize them. Unfortunately, despite the importance of BCI training protocols, they have been scarcely studied so far and used mostly unchanged for years. In this paper, we argue that current human training approaches for spontaneous BCIs are most likely inappropriate. We study the instructional design literature in order to identify the key requirements and guidelines for a successful training procedure that promotes good and efficient skill learning. This literature study highlights that current spontaneous BCI user training procedures satisfy very few of these requirements and hence are likely to be suboptimal. We therefore identify the flaws in BCI training protocols according to instructional design principles, at several levels: in the instructions provided to the user, in the tasks he or she has to perform, and in the feedback provided. For each level, we propose new research directions that are theoretically expected to address some of these flaws and to help users learn the BCI skill more efficiently.

  9. Workplace Spirituality, Computer Self-Efficacy And Emotional ...

    African Journals Online (AJOL)

    There should therefore, be an ongoing facilitation of self-development for lecturers through opportunities for computer skills acquisition, role identification and role performance to manage emotional labour and the cold fact that spirituality is a key player in human functioning. Keywords: Workplace Spirituality, Computer ...

  10. Concerned with computer games

    DEFF Research Database (Denmark)

    Chimiri, Niklas Alexander; Andersen, Mads Lund; Jensen, Tine

    2018-01-01

    In this chapter, we focus on a particular matter of concern within computer gaming practices: the concern of being or not being a gamer. This matter of concern emerged from within our collective investigations of gaming practices across various age groups. The empirical material under scrutiny...... was generated across a multiplicity of research projects, predominantly conducted in Denmark. The question of being versus not being a gamer, we argue, exemplifies interesting enactments of how computer game players become both concerned with and concerned about their gaming practices. As a collective...... of researchers writing from the field of psychology and inspired by neo-materialist theories, we are particularly concerned with (human) subjectivity and processes of social and subjective becoming. Our empirical examples show that concerns/worries about computer games and being engaged with computer game...

  11. Human Systems Design Criteria

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1982-01-01

    This paper deals with the problem of designing more humanised computer systems. This problem can be formally described as the need for defining human design criteria, which — if used in the design process - will secure that the systems designed get the relevant qualities. That is not only...... the necessary functional qualities but also the needed human qualities. The author's main argument is, that the design process should be a dialectical synthesis of the two points of view: Man as a System Component, and System as Man's Environment. Based on a man's presentation of the state of the art a set...... of design criteria is suggested and their relevance discussed. The point is to focus on the operator rather than on the computer. The crucial question is not to program the computer to work on its own conditions, but to “program” the operator to function on human conditions....

  12. Exercises in molecular computing.

    Science.gov (United States)

    Stojanovic, Milan N; Stefanovic, Darko; Rudchenko, Sergei

    2014-06-17

    CONSPECTUS: The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word "computer" now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem-loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is

  13. [Computational prediction of human immunodeficiency resistance to reverse transcriptase inhibitors].

    Science.gov (United States)

    Tarasova, O A; Filimonov, D A; Poroikov, V V

    2017-10-01

    Human immunodeficiency virus (HIV) causes acquired immunodeficiency syndrome (AIDS) and leads to over one million deaths annually. Highly active antiretroviral treatment (HAART) is the gold standard in HIV/AIDS therapy. Nucleoside and non-nucleoside inhibitors of HIV reverse transcriptase (RT) are an important component of HAART, but their effect depends on HIV susceptibility/resistance. HIV resistance mainly occurs due to mutations leading to conformational changes in the three-dimensional structure of HIV RT. The aim of our work was to develop and test a computational method for prediction of HIV resistance associated with mutations in HIV RT. Earlier we developed a method for prediction of HIV type 1 (HIV-1) resistance based on the use of position-specific descriptors. These descriptors are generated from a particular amino acid residue and its position, with the position of each residue determined in a multiple alignment. The training set consisted of more than 1900 sequences of HIV RT from the Stanford HIV Drug Resistance Database; for these HIV RT variants, experimental data on their resistance to ten inhibitors are available. Balanced accuracy of prediction varies from 80% to 99% depending on the classification method (support vector machine, naive Bayes, random forest, convolutional neural networks) and the drug for which resistance is predicted. Maximal balanced accuracy was obtained for prediction of resistance to zidovudine, stavudine, didanosine and efavirenz by the random forest classifier. Average accuracy of prediction is 89%.
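    The pipeline this abstract describes (position-specific descriptors fed to a classifier, scored by balanced accuracy) can be sketched in a few lines. The sequences, the toy decision rule standing in for the random forest, and the resistance-determining position below are all illustrative, not taken from the Stanford database:

```python
# Sketch of position-specific descriptor encoding and balanced-accuracy
# scoring. A trivial single-feature rule stands in for the random forest;
# all sequence data below is invented for illustration.

def position_specific_descriptors(sequence):
    """Encode a sequence as a set of (position, residue) features."""
    return {(i, aa) for i, aa in enumerate(sequence)}

def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall (sensitivity and specificity for 2 classes)."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, y in enumerate(y_true) if y == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(hits / len(idx))
    return sum(recalls) / len(recalls)

# Toy data: "resistance" (label 1) is driven by residue 'V' at position 2.
train = [("AKV", 1), ("AKI", 0), ("TKV", 1), ("AQI", 0)]

def predict(seq):
    return 1 if (2, "V") in position_specific_descriptors(seq) else 0

y_true = [label for _, label in train]
y_pred = [predict(seq) for seq, _ in train]
print(balanced_accuracy(y_true, y_pred))  # 1.0 on this separable toy set
```

    In the paper's setting the descriptor sets are aligned first, so a position index means the same column across all HIV RT variants; the toy rule above is where an actual random forest would be trained.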

  14. Computational Modeling of Human Multiple-Task Performance

    National Research Council Canada - National Science Library

    Kieras, David E; Meyer, David

    2005-01-01

    This is the final report for a project that was a continuation of an earlier, long-term project on the development and validation of the EPIC cognitive architecture for modeling human cognition and performance...

  15. Tracking the PhD Students' Daily Computer Use

    Science.gov (United States)

    Sim, Kwong Nui; van der Meer, Jacques

    2015-01-01

    This study investigated PhD students' computer activities in their daily research practice. Software that tracks computer usage (Manic Time) was installed on the computers of nine PhD students, who were at their early, mid and final stage in doing their doctoral research in four different discipline areas (Commerce, Humanities, Health Sciences and…

  16. Future Computing Technology (3/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software developments that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. On the third day of the Future Computing Technology series, we will touch on societal aspects of the future of computing. Our perception of computers may at times seem passive, but in reality we are a vital link in the feedback loop. Human-computer interaction, innovative forms of computers, privacy, process automation, threats and medica...

  17. Time-of-Flight Cameras in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2010-01-01

    Computer Graphics, Computer Vision and Human Machine Interaction (HMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real-time geometry...

  18. Face Recognition in Humans and Machines

    Science.gov (United States)

    O'Toole, Alice; Tistarelli, Massimo

    The study of human face recognition by psychologists and neuroscientists has run parallel to the development of automatic face recognition technologies by computer scientists and engineers. In both cases, there are analogous steps of data acquisition, image processing, and the formation of representations that can support the complex and diverse tasks we accomplish with faces. These processes can be understood and compared in the context of their neural and computational implementations. In this chapter, we present the essential elements of face recognition by humans and machines, taking a perspective that spans psychological, neural, and computational approaches. From the human side, we overview the methods and techniques used in the neurobiology of face recognition, the underlying neural architecture of the system, the role of visual attention, and the nature of the representations that emerges. From the computational side, we discuss face recognition technologies and the strategies they use to overcome challenges to robust operation over viewing parameters. Finally, we conclude the chapter with a look at some recent studies that compare human and machine performances at face recognition.

  19. NURBS-based 3-d anthropomorphic computational phantoms for radiation dosimetry applications

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lodwick, Daniel; Lee, Choonik; Bolch, Wesley E.

    2007-01-01

    Computational anthropomorphic phantoms are computer models used in the evaluation of absorbed dose distributions within the human body. Currently, two classes of computational phantoms have been developed and widely utilised for dosimetry calculation: (1) stylized (equation-based) and (2) voxel (image-based) phantoms, describing human anatomy through the use of mathematical surface equations and 3-D voxel matrices, respectively. However, stylized phantoms have limitations in defining realistic organ contours and positioning as compared to voxel phantoms, which are themselves based on medical images of human subjects. In turn, voxel phantoms that have been developed through medical image segmentation have limitations in describing organs that present in low contrast within either magnetic resonance or computed tomography images. The present paper reviews the advantages and disadvantages of these existing classes of computational phantoms and introduces a hybrid approach to computational phantom construction based on non-uniform rational B-spline (NURBS) surface animation technology, which takes advantage of the most desirable features of the former two phantom types. (authors)
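    The NURBS surfaces mentioned here are rational B-splines; the core computation is the Cox-de Boor basis recursion plus a weighted rational combination of control points. A minimal 2-D curve version is sketched below (the surface case adds a second parameter); the knot vector and control points are arbitrary examples, not anatomical data:

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d = knots[i + p] - knots[i]
    if d > 0:
        left = (u - knots[i]) / d * basis(i, p - 1, u, knots)
    d = knots[i + p + 1] - knots[i + 1]
    if d > 0:
        right = (knots[i + p + 1] - u) / d * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, p):
    """Point on a 2-D NURBS curve: rational, weight-blended control points."""
    x = y = den = 0.0
    for i, ((cx, cy), w) in enumerate(zip(ctrl, weights)):
        b = w * basis(i, p, u, knots)
        x += b * cx
        y += b * cy
        den += b
    return (x / den, y / den)

# Quadratic curve, clamped knot vector; unit weights reduce it to a Bezier arc.
knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
weights = [1.0, 1.0, 1.0]
print(nurbs_point(0.0, ctrl, weights, knots, 2))  # (0.0, 0.0): starts at first control point
print(nurbs_point(0.5, ctrl, weights, knots, 2))  # (1.0, 1.0)
```

    With non-unit weights the same routine pulls the curve toward the heavier control points, which is how NURBS represent conic shapes exactly; this smooth, editable geometry is what the hybrid phantoms exploit over fixed voxel grids.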

  20. Mathematics and Computer Science: The Interplay

    OpenAIRE

    Madhavan, Veni CE

    2005-01-01

    Mathematics has been an important intellectual preoccupation of man for a long time. Computer science as a formal discipline is about seven decades young. However, one thing in common between all users and producers of mathematical thought is the almost involuntary use of computing. In this article, we bring to the fore the many close connections and parallels between the two sciences of mathematics and computing. We show that, unlike in the other branches of human inquiry where mathematics is me...

  1. Affordances and Cognitive Walkthrough for Analyzing Human-Virtual Human Interaction

    NARCIS (Netherlands)

    Ruttkay, Z.M.; op den Akker, Hendrikus J.A.; Esposito, A.; Bourbakis, N.; Avouris, N.; Hatzilygeroudis, I.

    2008-01-01

    This study investigates how the psychological notion of affordance, known from human computer interface design, can be adopted for the analysis and design of communication of a user with a Virtual Human (VH), as a novel interface. We take as starting point the original notion of affordance, used to

  2. Neural and cortisol responses during play with human and computer partners in children with autism

    Science.gov (United States)

    Edmiston, Elliot Kale; Merkle, Kristen

    2015-01-01

    Children with autism spectrum disorder (ASD) exhibit impairment in reciprocal social interactions, including play, which can manifest as failure to show social preference or discrimination between social and nonsocial stimuli. To explore mechanisms underlying these deficits, we collected salivary cortisol from 42 children aged 8-12 years with ASD or typical development during a playground interaction with a confederate child. Participants underwent functional MRI during a prisoner's dilemma game requiring cooperation or defection with a human (confederate) or computer partner. Region-of-interest analyses were based on previous research (e.g., insula, amygdala, temporal parietal junction - TPJ). There were significant group differences in neural activation based on partner and response pattern. When playing with a human partner, children with ASD showed limited engagement of a social salience brain circuit during defection. Reduced insula activation during defection in the ASD children relative to TD children, regardless of partner type, was also a prominent finding. Insula and TPJ BOLD during defection was also associated with stress responsivity and behavior in the ASD group under playground conditions. Children with ASD engage social salience networks less than TD children during conditions of social salience, supporting a fundamental disturbance of social engagement. PMID:25552572
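    For readers unfamiliar with the task, the prisoner's dilemma used in the scanner reduces to a 2x2 payoff matrix over cooperate/defect choices. The point values below follow the canonical temptation > reward > punishment > sucker ordering and are illustrative only; the study's exact payoffs are not given in the abstract:

```python
# Minimal sketch of a prisoner's dilemma round. Payoffs are the textbook
# T=5, R=3, P=1, S=0 values, assumed here for illustration.

PAYOFF = {  # (my move, partner move) -> (my points, partner points)
    ("C", "C"): (3, 3),   # mutual cooperation (reward R)
    ("C", "D"): (0, 5),   # I cooperate, partner defects (sucker S vs temptation T)
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),   # mutual defection (punishment P)
}

def play_round(my_move, partner_move):
    return PAYOFF[(my_move, partner_move)]

# Tally a short game: a defect-heavy player vs. a mostly cooperating partner.
rounds = [("D", "C"), ("D", "C"), ("D", "D")]
me = sum(play_round(m, p)[0] for m, p in rounds)
partner = sum(play_round(m, p)[1] for m, p in rounds)
print(me, partner)  # 11 1
```

    The dilemma structure (defection pays most against a cooperator, yet mutual defection is worse than mutual cooperation) is what makes cooperate/defect choices socially salient, which is the contrast the fMRI analysis exploits.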

  3. COMPUTATIONAL MODELING OF AIRFLOW IN NONREGULAR SHAPED CHANNELS

    Directory of Open Access Journals (Sweden)

    A. A. Voronin

    2013-05-01

    Full Text Available The basic approaches to computational modeling of airflow in the human nasal cavity are analyzed. Different models of turbulent flow which may be used in order to calculate air velocity and pressure are discussed. Experimental measurement results of airflow temperature are illustrated. Geometrical model of human nasal cavity reconstructed from computer-aided tomography scans and numerical simulation results of airflow inside this model are also given. Spatial distributions of velocity and temperature for inhaled and exhaled air are shown.
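    A small worked example of the regime check behind the choice among the turbulence models discussed here: the Reynolds number of the airflow determines whether a laminar or turbulent model is appropriate. The geometry and flow values below are assumed for illustration, not taken from the paper:

```python
# Reynolds number check for internal airflow. Property values are typical
# for air at room temperature; the nasal-passage dimensions are assumed.

def reynolds(density, velocity, length, viscosity):
    """Re = rho * v * L / mu for a characteristic length L (SI units)."""
    return density * velocity * length / viscosity

# Air (~1.2 kg/m^3, mu ~1.8e-5 Pa.s) through a ~6 mm passage at 2 m/s.
re = reynolds(density=1.2, velocity=2.0, length=6e-3, viscosity=1.8e-5)
print(round(re))  # 800: well below ~2300, so a laminar model is plausible
```

    At higher inhalation rates the Reynolds number rises toward the transitional range, which is why the paper compares several turbulence models rather than assuming one regime.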

  4. Animal-Computer Interaction (ACI) : An analysis, a perspective, and guidelines

    NARCIS (Netherlands)

    van den Broek, E.L.

    2016-01-01

    Animal-Computer Interaction (ACI)’s founding elements are discussed in relation to its overarching discipline Human-Computer Interaction (HCI). Its basic dimensions are identified: agent, computing machinery, and interaction, and their levels of processing: perceptual, cognitive, and affective.

  5. Practical applications of soft computing in engineering

    CERN Document Server

    2001-01-01

    Soft computing has been presented not only through theoretical developments but also through a large variety of realistic applications to consumer products and industrial systems. Application of soft computing has provided the opportunity to integrate human-like vagueness and real-life uncertainty into an otherwise hard computer program. This book highlights some of the recent developments in practical applications of soft computing to engineering problems. All the chapters have been carefully designed and revised by international experts to achieve wide but in-depth coverage.

  6. Design of Computer Fault Diagnosis and Troubleshooting System ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Dec 1, 2013 ... 2Department of Computer Science. Cross River University ... owners in dealing with their computer problems especially when the time is limited and human expert is not ..... questions with the system responding to each of the ...

  7. Usability Studies in Virtual and Traditional Computer Aided Design Environments for Fault Identification

    Science.gov (United States)

    2017-08-08

    communicate their subjective opinions. Keywords: Usability Analysis; CAVE™ (Cave Automatic Virtual Environments); Human Computer Interface (HCI)...the differences in interaction when compared with traditional human-computer interfaces. This paper provides analysis via usability study methods

  8. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

    ...domain communication. Users familiar with problem domains but inexperienced with computers... Intelligent support systems... Creating better HCI software will have a...

  9. Computational anthropomorphic phantoms for radiation protection dosimetry: evolution and prospects

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lee, Jaiki

    2006-01-01

    Computational anthropomorphic phantoms are computer models of human anatomy used in the calculation of radiation dose distribution in the human body upon exposure to a radiation source. Depending on the manner to represent human anatomy, they are categorized into two classes: stylized and tomographic phantoms. Stylized phantoms, which have mainly been developed at the Oak Ridge National Laboratory (ORNL), describe human anatomy by using simple mathematical equations of analytical geometry. Several improved stylized phantoms such as male and female adults, pediatric series, and enhanced organ models have been developed following the first hermaphrodite adult stylized phantom, Medical Internal Radiation Dose (MIRD)-5 phantom. Although stylized phantoms have significantly contributed to dosimetry calculation, they provide only approximations of the true anatomical features of the human body and the resulting organ dose distribution. An alternative class of computational phantom, the tomographic phantom, is based upon three-dimensional imaging techniques such as Magnetic Resonance (MR) imaging and Computed Tomography (CT). The tomographic phantoms represent the human anatomy with a large number of voxels that are assigned tissue type and organ identity. To date, a total of around 30 tomographic phantoms including male and female adults, pediatric phantoms, and even a pregnant female, have been developed and utilized for realistic radiation dosimetry calculation. They are based on MRI/CT images or sectional color photos from patients, volunteers or cadavers. Several investigators have compared tomographic phantoms with stylized phantoms, and demonstrated the superiority of tomographic phantoms in terms of realistic anatomy and dosimetry calculation. This paper summarizes the history and current status of both stylized and tomographic phantoms, including Korean computational phantoms. Advantages, limitations, and future prospects are also discussed

  10. 78 FR 73195 - Privacy Act of 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching...

    Science.gov (United States)

    2013-12-05

    ... 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312 AGENCY: Centers for Medicare & Medicaid Services (CMS), Department of Health and Human Services (HHS... Privacy Act of 1974 (5 U.S.C. 552a), as amended, this notice announces the renewal of a CMP that CMS plans...

  11. Historical Overview, Current Status, and Future Trends in Human-Computer Interfaces for Process Control

    International Nuclear Information System (INIS)

    Owre, Fridtjov

    2003-01-01

    Approximately 25 years ago, the first computer-based process control systems, including computer-generated displays, appeared. It is remarkable how slowly the human-computer interfaces (HCIs) of such systems have developed over the years. The display design approach in those early days had its roots in the topology of the process; usually, the information came from the piping and instrumentation diagrams. Later, some important additional functions were added to the basic system, such as alarm and trend displays. Today, these functions are still the basic ones, and the end-user displays have not changed much except for improved display quality in terms of colors, font types and sizes, resolution, and object shapes, resulting from improved display hardware. Today, there are two schools of display design competing for supremacy in the process control segment of the HCI community: one can be characterized by extension and integration of current practice, while the other is more revolutionary. The extension of current practice can be described in terms of added system functionality and integration. This means that important functions for the plant operator - such as signal validation, plant overview information, safety parameter displays, procedures, prediction of future states, and plant performance optimization - are added to the basic functions and integrated into a total unified HCI for the plant operator. The revolutionary approach, however, takes as its starting point the design process itself. The functioning of the plant is described in terms of the plant goals and subgoals, as well as the means available to reach these goals. Displays are then designed to represent this functional structure - in clear contrast to the earlier plant-topology representation. Depending on the design approach used, the corresponding displays have various designations, e.g., function-oriented, task-oriented, or ecological displays. This paper gives a historical overview of past

  12. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, helping to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can automatically convert CT/segmented sectioned images into computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  13. Performance of human observers and an automatic 3-dimensional computer-vision-based locomotion scoring method to detect lameness and hoof lesions in dairy cows

    NARCIS (Netherlands)

    Schlageter-Tello, Andrés; Hertem, Van Tom; Bokkers, Eddie A.M.; Viazzi, Stefano; Bahr, Claudia; Lokhorst, Kees

    2018-01-01

    The objective of this study was to determine if a 3-dimensional computer vision automatic locomotion scoring (3D-ALS) method was able to outperform human observers for classifying cows as lame or nonlame and for detecting cows affected and nonaffected by specific type(s) of hoof lesion. Data

  14. Application of a computational situation assessment model to human system interface design and experimental validation of its effectiveness

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Koh, Kwang-Yong; Seong, Poong-Hyun

    2013-01-01

    Highlights: ► We validate the effectiveness of the proposed procedure through an experiment. ► The proposed procedure addresses salient coding of the key information. ► Salience coding was found to affect operators' attention significantly. ► First observation of the key information quickly guided operators to correct situation awareness. ► The proposed procedure was validated as effective for better situation awareness. - Abstract: To evaluate the effects of human cognitive characteristics on situation awareness, a computational situation assessment model of nuclear power plant operators has been developed, as well as a procedure for applying the developed model to the design of human system interfaces (HSIs). The idea behind the proposed procedure is to identify the key information source which, when operators attend to it, is expected to guarantee fast and accurate diagnosis. The developed computational model is used to search the diagnostic paths and the key information source. In this study, an experiment with twelve trained participants was conducted to validate the effectiveness of the proposed procedure. Eighteen scenarios covering various accidents were administered twice for each subject, and experimental data were collected and analyzed. The data analysis validated that the salience level of information sources significantly influences operators' attention, and that first observation of the key information sources leads operators to a quick and correct situation assessment. We therefore conclude that the proposed procedure for applying the developed model to HSI design is effective.

  15. Eco-toxicity and human estrogenic exposure risks from "·OH-initiated photochemical transformation of four phthalates in water: A computational study

    International Nuclear Information System (INIS)

    Gao, Yanpeng; An, Taicheng; Ji, Yuemeng; Li, Guiying; Zhao, Cunyuan

    2015-01-01

    Transformation products (TPs) of emerging organic contaminants (EOCs) in water are still rarely considered in environmental risk assessment, although some have been found to be of concern. •OH is believed to be an important reactive species both in indirect phototransformation and in advanced oxidation technologies. Thus, the eco-toxicity and human estrogenic exposure risks of four phthalates and their TPs during the •OH-initiated photochemical process were investigated using a computational approach. The four phthalates can be degraded through •OH-addition and H-transfer pathways. The •OH-addition TPs were predominant for dimethyl phthalate, while H-transfer TPs were predominant for the other three phthalates. Compared with the phthalates, •OH-addition TPs (o-OH-phthalates) were one level more toxic to aquatic organisms, and m-OH-phthalates exhibit higher estrogenic activity. Although H-transfer TPs were less harmful than •OH-addition TPs, some of them still show aquatic toxicity and estrogenic activity. Therefore, more attention should be paid to photochemical TPs as well as the original EOCs, particularly those exhibiting high estrogenic activity to humans. - Highlights: • Phthalates can be degraded via •OH-addition and H-transfer pathways. • •OH-addition products are mainly formed during DMP transformation. • H-transfer products were predominant for the transformation of DEP, DPP and DBP. • o-OH-addition products have greater eco-toxicity than the corresponding phthalates. • m-OH-addition products have higher estrogenic activity than the corresponding phthalates. - The computational approach could provide valuable information on the mechanisms, kinetics, eco-toxicity and human estrogenic exposure risks of EOCs and their transformation products.

  16. Achievement report for fiscal 1999 on specified international cooperative research project. Research on anthropometry using computer mannequin; 1999 nendo computer mannequin ni kansuru jintai model no keitai suitei gijutsu no kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Studies were conducted on anthropometry using a computer mannequin for assessing the compatibility of virtual commodities and environments, fabricated by CAD (computer-aided design) or the like, with human beings; the fiscal 1999 achievements are compiled here. In the study of multidimensional correlation equations for shape estimation, calculation of basic human model dimensions from 43 measurements became feasible. To construct a human model with a computer mannequin, joint positions and distances between joints have to be determined from measurement data covering the surfaces of the human model. This necessitates a shape-estimation algorithm, and a basic physique calculation module and a shape estimation module have been developed. Verification was conducted using a computer mannequin, and it was found that much information can be determined quantitatively prior to the manufacture of real goods, covering, for example, the mechanism of a chair, the behavior of a person seated in it, and the relation of that behavior to peripheral equipment. (NEDO)
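    The multidimensional correlation equations described here amount to regression models that estimate unmeasured body dimensions from measured ones. A single-predictor least-squares sketch with synthetic data (the coefficients and measurements are invented, not NEDO's) is:

```python
# Hedged sketch of correlation-based shape estimation: fit a linear equation
# predicting one body dimension from another. Data below is synthetic and
# constructed to lie exactly on the line y = 0.45*x - 1.5.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (single predictor)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# Synthetic subjects: stature (cm) -> arm length (cm).
stature = [150.0, 160.0, 170.0, 180.0]
arm = [66.0, 70.5, 75.0, 79.5]

a, b = fit_linear(stature, arm)
print(round(a, 3), round(b, 3))  # 0.45 -1.5
print(round(a * 175.0 + b, 2))   # 77.25: estimated arm length for 175 cm stature
```

    The report's equations use many predictors at once (up to 43 measurements), so the practical version is multivariate, but the fitting principle is the same least-squares idea.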

  17. International Conference on Computational Intelligence 2015

    CERN Document Server

    Saha, Sujan

    2017-01-01

    This volume comprises the proceedings of the International Conference on Computational Intelligence 2015 (ICCI15). This book aims to bring together work from leading academicians, scientists, researchers and research scholars from across the globe on all aspects of computational intelligence. The work is composed mainly of original and unpublished results of conceptual, constructive, empirical, experimental, or theoretical work in all areas of computational intelligence. Specifically, the major topics covered include classical computational intelligence models and artificial intelligence, neural networks and deep learning, evolutionary, swarm and particle algorithms, hybrid systems, optimization, constraint programming, human-machine interaction, computational intelligence for web analytics, robotics, computational neurosciences, neurodynamics, bio-inspired and biomorphic algorithms, cross-disciplinary topics and applications. The contents of this volume will be of use to researchers and professionals alike....

  18. Design of a compact low-power human-computer interaction equipment for hand motion

    Science.gov (United States)

    Wu, Xianwei; Jin, Wenguang

    2017-01-01

    Human-Computer Interaction (HCI) raises demands for convenience, endurance, responsiveness and naturalness. This paper describes the design of a compact, wearable, low-power HCI device applied to gesture recognition. The system combines multi-modal sensing signals, a vision signal and a motion signal, the equipment being fitted with a depth camera and a motion sensor. The dimensions (40 mm × 30 mm) and structure are compact and portable after tight integration. The system is built on a layered module framework, which supports real-time collection (60 fps), processing and transmission by fusing synchronous with asynchronous concurrent collection and wireless Bluetooth 4.0 transmission. To minimize the equipment's energy consumption, the system uses low-power components, manages peripheral state dynamically, switches into idle mode intelligently, applies pulse-width modulation (PWM) to the NIR LEDs of the depth camera, and optimizes the algorithm driven by the motion sensor. To test the equipment's function and performance, a gesture recognition algorithm was run on the system. The results show that overall energy consumption can be as low as 0.5 W.

  19. Evolved Representation and Computational Creativity

    Directory of Open Access Journals (Sweden)

    Ashraf Fouad Hafez Ismail

    2001-01-01

    Full Text Available Advances in science and technology have influenced designing activity in architecture throughout its history. Observing the fundamental changes to architectural designing due to the substantial influences of the advent of the computing era, we now witness our design environment gradually changing from conventional pencil and paper to digital multi-media. Although designing is considered to be a unique human activity, there has always been a great dependency on design aid tools. One of the greatest aids to architectural design, amongst the many conventional and widely accepted computational tools, is the computer-aided object modeling and rendering tool, commonly known as a CAD package. But even though conventional modeling tools have provided designers with fast and precise object handling capabilities that were not available in the pencil-and-paper age, they normally show weaknesses and limitations in covering the whole design process. In any kind of design activity, the design worked on has to be represented in some way. For a human designer, designs are for example represented using models, drawings, or verbal descriptions. If a computer is used for design work, designs are usually represented by groups of pixels (paintbrush programs), lines and shapes (general-purpose CAD programs) or higher-level objects like ‘walls’ and ‘rooms’ (purpose-specific CAD programs). A human designer usually has a large number of representations available, and can use the representation most suitable for what he or she is working on. Humans can also introduce new representations and thereby represent objects that are not part of the world they experience with their sensory organs, for example vector representations of four- and five-dimensional objects. In design computing, on the other hand, the representation or representations used have to be explicitly defined. Many different representations have been suggested, often optimized for specific design domains

  20. University Students and Ethics of Computer Technology Usage: Human Resource Development

    Science.gov (United States)

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  1. A human-assisted computer generated LA-grammar for simple ...

    African Journals Online (AJOL)

    Southern African Linguistics and Applied Language Studies ... of computer programs to generate Left Associative Grammars (LAGs) for natural languages is described. The generation proceeds from examples of correct sentences and needs ...

  2. Resource Management in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Andrei IONESCU

    2015-01-01

    Full Text Available Mobile cloud computing is a major research topic in Information Technology & Communications. It integrates cloud computing, mobile computing and wireless networks. While mainly built on cloud computing, it has to operate using more heterogeneous resources, with implications for how these resources are managed and used. Managing the resources of a mobile cloud is not a trivial task, involving vastly different architectures. The process is outside the scope of human users. Using the resources by applications at both the platform and software tiers comes with its own challenges. This paper presents different approaches in use for managing cloud resources at the infrastructure and platform levels.

  3. Generation of a suite of 3D computer-generated breast phantoms from a limited set of human subject data

    International Nuclear Information System (INIS)

    Hsu, Christina M. L.; Palmeri, Mark L.; Segars, W. Paul; Veress, Alexander I.; Dobbins, James T. III

    2013-01-01

    Purpose: The authors previously reported on a three-dimensional computer-generated breast phantom, based on empirical human image data, including a realistic finite-element based compression model that was capable of simulating multimodality imaging data. The computerized breast phantoms are a hybrid of two phantom generation techniques, combining empirical breast CT (bCT) data with flexible computer graphics techniques. However, to date, these phantoms have been based on single human subjects. In this paper, the authors report on a new method to generate multiple phantoms, simulating additional subjects from the limited set of original dedicated breast CT data. The authors developed an image morphing technique to construct new phantoms by gradually transitioning between two human subject datasets, with the potential to generate hundreds of additional pseudoindependent phantoms from the limited bCT cases. The authors conducted a preliminary subjective assessment with a limited number of observers (n = 4) to illustrate how realistic the simulated images generated with the pseudoindependent phantoms appeared. Methods: Several mesh-based geometric transformations were developed to generate distorted breast datasets from the original human subject data. Segmented bCT data from two different human subjects were used as the “base” and “target” for morphing. Several combinations of transformations were applied to morph between the “base” and “target” datasets, such as changing the breast shape, rotating the glandular data, and changing the distribution of the glandular tissue. Following the morphing, regions of skin and fat were assigned to the morphed dataset in order to appropriately assign mechanical properties during the compression simulation. The resulting morphed breast was compressed using a finite element algorithm and simulated mammograms were generated using techniques described previously. Sixty-two simulated mammograms, generated from morphing
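The morphing step described in this record, gradually transitioning between a “base” and a “target” subject, can be illustrated with a minimal sketch. The vertex lists, coordinates, and the simple linear blend below are invented for illustration; the actual phantoms use mesh-based geometric transformations and finite-element compression not shown here.

```python
def morph(base, target, alpha):
    """Linearly interpolate corresponding vertices.

    base, target: lists of (x, y, z) tuples with one-to-one correspondence.
    alpha = 0.0 reproduces the base shape, alpha = 1.0 the target shape.
    """
    return [
        tuple((1.0 - alpha) * b + alpha * t for b, t in zip(bv, tv))
        for bv, tv in zip(base, target)
    ]

# Two tiny hypothetical "meshes"; intermediate alphas yield new pseudo-subjects.
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
target = [(0.0, 0.0, 2.0), (3.0, 0.0, 0.0)]
mid = morph(base, target, 0.5)  # -> [(0.0, 0.0, 1.0), (2.0, 0.0, 0.0)]
```

Sweeping alpha across (0, 1) is what gives a family of pseudoindependent shapes from just two source datasets.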

  4. Micro-computed tomography assessment of human alveolar bone: bone density and three-dimensional micro-architecture.

    Science.gov (United States)

    Kim, Yoon Jeong; Henkin, Jeffrey

    2015-04-01

    Micro-computed tomography (micro-CT) is a valuable means to evaluate and secure information related to bone density and quality in human necropsy samples and small live animals. The aim of this study was to assess the bone density of the alveolar jaw bones in human cadavers, using micro-CT. The correlation between bone density and the three-dimensional micro-architecture of trabecular bone was evaluated. Thirty-four human cadaver jaw bone specimens were harvested. Each specimen was scanned with micro-CT at a resolution of 10.5 μm. The bone volume fraction (BV/TV) and the bone mineral density (BMD) value within a volume of interest were measured. The three-dimensional micro-architecture of trabecular bone was assessed. All the parameters in the maxilla and the mandible were subject to comparison. The variables for bone density and three-dimensional micro-architecture were analyzed for nonparametric correlation using Spearman's rho at the significance level of p < .05. The micro-architecture parameters were consistently higher in the mandible, up to 3.3 times greater than those in the maxilla. The most linear correlation was observed between BV/TV and BMD, with Spearman's rho = 0.99 (p = .01). Both BV/TV and BMD were highly correlated with all micro-architecture parameters, with Spearman's rho above 0.74 (p = .01). Two aspects of bone density using micro-CT, the BV/TV and BMD, are highly correlated with three-dimensional micro-architecture parameters, which represent the quality of trabecular bone. This noninvasive method may adequately enhance evaluation of the alveolar bone. © 2013 Wiley Periodicals, Inc.
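As an illustration of the statistic used in this study, Spearman's rho is the Pearson correlation of the ranks of the two variables. The BV/TV and BMD values below are invented for demonstration (they are not the study's data); for a perfectly monotone sample rho is exactly 1.0, close to the 0.99 the study reports.

```python
def rank(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a tie group
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical specimens: BV/TV and BMD rise together (monotone relationship).
bv_tv = [0.12, 0.35, 0.22, 0.48, 0.30]
bmd = [210, 540, 330, 700, 450]
rho = spearman_rho(bv_tv, bmd)  # -> 1.0 for this monotone sample
```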

  5. Applications of X-ray Computed Tomography and Emission Computed Tomography

    International Nuclear Information System (INIS)

    Seletchi, Emilia Dana; Sutac, Victor

    2005-01-01

    Computed Tomography is a non-destructive imaging method that allows visualization of internal features within non-transparent objects such as sedimentary rocks. Filtering techniques have been applied to circumvent artifacts and achieve high-quality images for quantitative analysis. High-resolution X-ray computed tomography (HRXCT) can be used to identify the position of the growth axis in speleothems by detecting subtle changes in calcite density between growth bands. HRXCT imagery reveals the three-dimensional variability of coral banding, providing information on coral growth and climate over the past several centuries. The Nuclear Medicine imaging technique uses a radioactive tracer, several radiation detectors, and sophisticated computer technologies to understand the biochemical basis of normal and abnormal functions within the brain. The goal of Emission Computed Tomography (ECT) is to accurately determine the three-dimensional radioactivity distribution resulting from radiopharmaceutical uptake inside the patient, instead of the attenuation coefficient distribution of different tissues as obtained from X-ray Computed Tomography. ECT is a very useful tool for investigating cognitive functions. Because of the low radiation doses associated with Positron Emission Tomography (PET), this technique has been applied in clinical research, allowing the direct study of human neurological diseases. (authors)

  6. Modeling aspects of human memory for scientific study.

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, Thomas P. (University of New Mexico); Watson, Patrick (University of Illinois - Champaign-Urbana Beckman Institute); McDaniel, Mark A. (Washington University); Eichenbaum, Howard B. (Boston University); Cohen, Neal J. (University of Illinois - Champaign-Urbana Beckman Institute); Vineyard, Craig Michael; Taylor, Shawn Ellis; Bernard, Michael Lewis; Morrow, James Dan; Verzi, Stephen J.

    2009-10-01

    Working with leading experts in the field of cognitive neuroscience and computational intelligence, SNL has developed a computational architecture that represents neurocognitive mechanisms associated with how humans remember experiences in their past. The architecture represents how knowledge is organized and updated through information from individual experiences (episodes) via the cortical-hippocampal declarative memory system. We compared the simulated behavioral characteristics with those of humans measured under well-established experimental standards, controlling for unmodeled aspects of human processing, such as perception. We used this knowledge to create robust simulations of human memory behaviors that should help move the scientific community closer to understanding how humans remember information. These behaviors were experimentally validated against actual human subjects, and the results were published. An important outcome of the validation process will be the joining of specific experimental testing procedures from the field of neuroscience with computational representations from the field of cognitive modeling and simulation.

  7. Human-computer interaction for alert warning and attention allocation systems of the multimodal watchstation

    Science.gov (United States)

    Obermayer, Richard W.; Nugent, William A.

    2000-11-01

    The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, and stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human-computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.

  8. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches

    KAUST Repository

    Jiang, Hanlun

    2016-12-06

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.

  9. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches.

    Science.gov (United States)

    Jiang, Hanlun; Zhu, Lizhe; Héliou, Amélie; Gao, Xin; Bernauer, Julie; Huang, Xuhui

    2017-01-01

    MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.
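The Markov state models mentioned in these two records are, at their core, row-normalized transition matrices estimated from trajectories discretized into conformational states. The toy two-state trajectory below is hypothetical and only illustrates the counting-and-normalizing step; real MSM construction from MD data additionally involves clustering, lag-time selection, and reversibility constraints not shown here.

```python
from collections import Counter

def msm_transition_matrix(dtraj, n_states, lag=1):
    """Estimate a row-normalized transition matrix from a discretized
    trajectory: T[i][j] = P(state j at t+lag | state i at t)."""
    counts = Counter(zip(dtraj[:-lag], dtraj[lag:]))
    T = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        row_total = sum(counts[(i, j)] for j in range(n_states))
        if row_total:
            for j in range(n_states):
                T[i][j] = counts[(i, j)] / row_total
    return T

# Invented 2-state trajectory (e.g. two coarse AGO conformations, 0 and 1).
dtraj = [0, 0, 1, 1, 0, 0, 0, 1]
T = msm_transition_matrix(dtraj, n_states=2)  # -> [[0.6, 0.4], [0.5, 0.5]]
```

Each row of T sums to 1 (when the state was visited), which is what lets the model be propagated to biologically relevant timescales by repeated matrix multiplication.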

  10. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  11. Computational intelligence, medicine and biology selected links

    CERN Document Server

    Zaitseva, Elena

    2015-01-01

    This book contains an interesting and state-of-the-art collection of chapters presenting several examples of attempts to develop modern tools utilizing computational intelligence in different real-life problems encountered by humans. Reasoning, prediction, modeling, optimization, decision making, etc. need modern, soft and intelligent algorithms, methods and methodologies to solve, in efficient ways, problems appearing in human activity. The contents of the book are divided into two parts. Part I, consisting of four chapters, is devoted to selected links between computational intelligence, medicine, health care and biomechanics. Several problems are considered: estimation of healthcare system reliability, classification of ultrasound thyroid images, application of fuzzy logic to measure weight status and central fatness, and deriving kinematics directly from video records. Part II, also consisting of four chapters, is devoted to selected links between computational intelligence and biology. The common denominato...

  12. Mirror neurons and imitation: a computationally guided review.

    Science.gov (United States)

    Oztop, Erhan; Kawato, Mitsuo; Arbib, Michael

    2006-04-01

    Neurophysiology reveals the properties of individual mirror neurons in the macaque while brain imaging reveals the presence of 'mirror systems' (not individual neurons) in the human. Current conceptual models attribute high level functions such as action understanding, imitation, and language to mirror neurons. However, only the first of these three functions is well-developed in monkeys. We thus distinguish current opinions (conceptual models) on mirror neuron function from more detailed computational models. We assess the strengths and weaknesses of current computational models in addressing the data and speculations on mirror neurons (macaque) and mirror systems (human). In particular, our mirror neuron system (MNS), mental state inference (MSI) and modular selection and identification for control (MOSAIC) models are analyzed in more detail. Conceptual models often overlook the computational requirements for posited functions, while too many computational models adopt the erroneous hypothesis that mirror neurons are interchangeable with imitation ability. Our meta-analysis underlines the gap between conceptual and computational models and points out the research effort required from both sides to reduce this gap.

  13. Technology for Large-Scale Translation of Clinical Practice Guidelines: A Pilot Study of the Performance of a Hybrid Human and Computer-Assisted Approach.

    Science.gov (United States)

    Van de Velde, Stijn; Macken, Lieve; Vanneste, Koen; Goossens, Martine; Vanschoenbeek, Jan; Aertgeerts, Bert; Vanopstal, Klaar; Vander Stichele, Robert; Buysschaert, Joost

    2015-10-09

    The construction of EBMPracticeNet, a national electronic point-of-care information platform in Belgium, began in 2011 to optimize quality of care by promoting evidence-based decision making. The project involved, among other tasks, the translation of 940 EBM Guidelines of Duodecim Medical Publications from English into Dutch and French. Considering the scale of the translation process, it was decided to make use of computer-aided translation performed by certificated translators with limited expertise in medical translation. Our consortium used a hybrid approach, involving a human translator supported by a translation memory (using SDL Trados Studio), terminology recognition (using SDL MultiTerm terminology databases) from medical terminology databases, and support from online machine translation. This resulted in a validated translation memory, which is now in use for the translation of new and updated guidelines. The objective of this experiment was to evaluate the performance of the hybrid human and computer-assisted approach in comparison with translation unsupported by translation memory and terminology recognition. A comparison was also made with the translation efficiency of an expert medical translator. We conducted a pilot study in which two sets of 30 new and 30 updated guidelines were randomized to one of three groups. Comparable guidelines were translated (1) by certificated junior translators without medical specialization using the hybrid method, (2) by an experienced medical translator without this support, and (3) by the same junior translators without the support of the validated translation memory. A medical proofreader, who was blinded to the translation procedure, evaluated the translated guidelines for acceptability and adequacy. Translation speed was measured by recording translation and post-editing time. The human translation edit rate was calculated as a metric to evaluate the quality of the translation. A further evaluation was made of
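The human translation edit rate used in this study is, in essence, the word-level edit distance between a draft translation and its post-edited reference, normalized by reference length. The sketch below, with invented example sentences, shows the basic computation; production TER tools also handle phrase shifts and tokenization details omitted here.

```python
def edit_distance(a, b):
    """Word-level Levenshtein distance (insertions, deletions, substitutions)."""
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        cur = [i]
        for j, wb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (wa != wb)))  # substitution
        prev = cur
    return prev[-1]

def hter(draft, post_edited):
    """Edits needed to turn the draft into its post-edit, per reference word."""
    hyp, ref = draft.split(), post_edited.split()
    return edit_distance(hyp, ref) / len(ref)

# Hypothetical draft vs. proofread version: one substitution + one insertion.
score = hter("the patient have fever", "the patient has a fever")  # -> 0.4
```

A lower score means the draft needed less post-editing, so translations produced with memory and terminology support can be compared directly against unsupported ones.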

  14. 6th International Conference on Computer Science and its Applications

    CERN Document Server

    Stojmenovic, Ivan; Jeong, Hwa; Yi, Gangman

    2015-01-01

    The 6th FTRA International Conference on Computer Science and its Applications (CSA-14) will be held in Guam, USA, Dec. 17 - 19, 2014. CSA-14 presents a comprehensive conference focused on the various aspects of advances in engineering systems in computer science and applications, including ubiquitous computing, U-Health care systems, Big Data, UI/UX for human-centric computing, computing services, bioinformatics and bio-inspired computing, and will show recent advances in various aspects of computing technology, ubiquitous computing services and their applications.

  15. Applying virtual and augmented reality in cultural computing

    NARCIS (Netherlands)

    Bartneck, C.; Hu, J.; Salem, B.I.; Cristescu, R.; Rauterberg, G.W.M.

    2008-01-01

    We are exploring a new application of virtual and augmented reality for a novel direction in human-computer interaction named 'cultural computing', which aims to provide a new medium for cultural translation and unconscious metamorphosis. In this application both virtual and robotic agents are

  16. Concept and computation of radiation dose at high energies

    International Nuclear Information System (INIS)

    Sarkar, P.K.

    2010-01-01

    Computational dosimetry, a subdiscipline of computational physics devoted to radiation metrology, is the determination of absorbed dose and other dose-related quantities by numerical means. Computations are done separately for external and internal dosimetry. The methodology used in external beam dosimetry is necessarily a combination of experimental radiation dosimetry and theoretical dose computation, since it is not feasible to plan any physical dose measurement from inside a living human body

  17. Human machine interaction: The special role for human unconscious emotional information processing

    NARCIS (Netherlands)

    Noort, M.W.M.L. van den; Hugdahl, K.; Bosch, M.P.C.

    2005-01-01

    The nature of (un)conscious human emotional information processing remains a great mystery. On the one hand, classical models view human conscious emotional information processing as computation among the brain’s neurons but fail to address its enigmatic features. On the other hand, quantum

  18. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and the human workload predicted by the system.
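The discrete event simulation approach described here can be sketched in a few lines: the toy single-operator alarm queue below, with invented arrival-rate and handling-time parameters, shows how such a model predicts waiting time as a crude workload proxy. The actual study used a dedicated task simulation tool; this is only an illustration of the modeling idea.

```python
import random

def simulate_alarms(rate, handle_time, duration, seed=1):
    """Single-operator alarm queue: Poisson alarm arrivals, fixed handling time.

    Returns (number of alarms handled, mean wait before handling begins).
    """
    rng = random.Random(seed)
    t, free_at, waits = 0.0, 0.0, []
    while True:
        t += rng.expovariate(rate)  # next alarm arrival
        if t >= duration:
            break
        start = max(t, free_at)       # operator may still be busy
        waits.append(start - t)       # time the alarm sat unhandled
        free_at = start + handle_time
    return len(waits), (sum(waits) / len(waits) if waits else 0.0)

# Hypothetical shift: 0.5 alarms/minute, 1.2 min to handle each, 480 min shift.
n, mean_wait = simulate_alarms(rate=0.5, handle_time=1.2, duration=480.0)
```

Re-running the model with different alarm generation patterns or handling times is how analysts compare operator workload across conditions and scenarios.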

  19. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    Science.gov (United States)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    Along with the development of the agricultural economy, farm machinery product types increase gradually, and the ergonomics question is also getting more and more prominent. The widespread application of computer-aided machinery design makes it possible for farm machinery design to be intuitive, flexible and convenient. At present, because existing computer-aided ergonomics software lacks a human body database suitable for farm machinery design in China, farm machinery designs show deviations in ergonomics analysis. This article proposes using the open database interface in CATIA to establish a human body database aimed at farm machinery design; reading the human body data into the ergonomics module of CATIA can produce a practical virtual body, and by using the human posture analysis and human activity analysis modules to analyze the ergonomics of farm machinery, a computer-aided farm machinery designing method based on ergonomics can be realized.

  20. 3rd International Conference on Computer & Communication Technologies

    CERN Document Server

    Bhateja, Vikrant; Raju, K; Janakiramaiah, B

    2017-01-01

    The book is a compilation of high-quality scientific papers presented at the 3rd International Conference on Computer & Communication Technologies (IC3T 2016). The individual papers address cutting-edge technologies and applications of soft computing, artificial intelligence and communication. In addition, a variety of further topics are discussed, which include data mining, machine intelligence, fuzzy computing, sensor networks, signal and image processing, human-computer interaction, web intelligence, etc. As such, it offers readers a valuable and unique resource.