#### Sample records for human computer qhc

1. Human Computation

CERN Multimedia

CERN. Geneva

2008-01-01

What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

2. The mathematics of a quantum Hamiltonian computing half adder Boolean logic gate

International Nuclear Information System (INIS)

Dridi, G; Julien, R; Hliwa, M; Joachim, C

2015-01-01

The mathematics behind the quantum Hamiltonian computing (QHC) approach of designing Boolean logic gates with a quantum system are given. Using the quantum eigenvalue repulsion effect, the QHC AND, NAND, OR, NOR, XOR, and NXOR Hamiltonian Boolean matrices are constructed. This is applied to the construction of a QHC half adder Hamiltonian matrix requiring only six quantum states to fulfil a half Boolean logical truth table. The QHC design rules open a nano-architectronic way of constructing Boolean logic gates inside a single molecule or atom by atom at the surface of a passivated semiconductor. (paper)

3. The mathematics of a quantum Hamiltonian computing half adder Boolean logic gate.

Science.gov (United States)

Dridi, G; Julien, R; Hliwa, M; Joachim, C

2015-08-28

The mathematics behind the quantum Hamiltonian computing (QHC) approach of designing Boolean logic gates with a quantum system are given. Using the quantum eigenvalue repulsion effect, the QHC AND, NAND, OR, NOR, XOR, and NXOR Hamiltonian Boolean matrices are constructed. This is applied to the construction of a QHC half adder Hamiltonian matrix requiring only six quantum states to fulfil a half Boolean logical truth table. The QHC design rules open a nano-architectronic way of constructing Boolean logic gates inside a single molecule or atom by atom at the surface of a passivated semiconductor.
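The eigenvalue repulsion effect this abstract relies on can be seen in a minimal two-level sketch (not the paper's actual six-state gate matrices): coupling two degenerate states opens a spectral gap of 2|c|, and QHC encodes logical outputs in such input-controlled spectral shifts.

```python
import numpy as np

def two_level_gap(eps, c):
    """Eigenvalue gap of H = [[eps, c], [c, -eps]], i.e. 2*sqrt(eps**2 + c**2)."""
    H = np.array([[eps, c], [c, -eps]], dtype=float)
    lo, hi = np.linalg.eigvalsh(H)  # eigenvalues in ascending order
    return hi - lo

# Uncoupled degenerate levels: no gap.
print(two_level_gap(0.0, 0.0))  # 0.0
# Any coupling c "repels" the eigenvalues, opening a gap of 2*|c|.
print(two_level_gap(0.0, 0.5))  # 1.0
```

A gate design then amounts to choosing matrix entries so that the spectrum (and hence the measurable response) differs between logical input configurations.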

4. Qubits and quantum Hamiltonian computing performances for operating a digital Boolean 1/2-adder

Science.gov (United States)

Dridi, Ghassen; Faizy Namarvar, Omid; Joachim, Christian

2018-04-01

Quantum Boolean (1 + 1) digits 1/2-adders are designed with 3 qubits for the quantum computing (Qubits) and 4 quantum states for the quantum Hamiltonian computing (QHC) approaches. Detailed analytical solutions are provided to analyse the time operation of those different 1/2-adder gates. QHC is more robust to noise than Qubits and requires about the same amount of energy for running its 1/2-adder logical operations. QHC is faster in time than Qubits but its logical output measurement takes longer.
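For reference, the Boolean (1 + 1)-digit half adder that both the Qubit and QHC designs implement has a four-row truth table, with the sum given by XOR and the carry by AND; a minimal classical sketch:

```python
# Classical half adder: SUM = A XOR B, CARRY = A AND B.
def half_adder(a, b):
    return a ^ b, a & b  # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"A={a} B={b} -> SUM={s} CARRY={c}")
```

This is the truth table that either 3 qubits or, in the QHC approach, 4 quantum states must reproduce.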

5. Human Computer Music Performance

OpenAIRE

Dannenberg, Roger B.

2012-01-01

Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

6. Ubiquitous human computing.

Science.gov (United States)

Zittrain, Jonathan

2008-10-28

Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

7. When computers were human

CERN Document Server

Grier, David Alan

2013-01-01

Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

8. Handbook of human computation

CERN Document Server

Michelucci, Pietro

2013-01-01

This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Reade

9. Making IBM's Computer, Watson, Human

Science.gov (United States)

Rachlin, Howard

2012-01-01

This essay uses the recent victory of an IBM computer (Watson) in the TV game, "Jeopardy," to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered…

10. Minimal mobile human computer interaction

NARCIS (Netherlands)

el Ali, A.

2013-01-01

In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

11. Humans, computers and wizards human (simulated) computer interaction

CERN Document Server

Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

2013-01-01

Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

12. Artificial Intelligence for Human Computing

NARCIS (Netherlands)

Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

2007-01-01

This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

13. Guest Editorial Special Issue on Human Computing

NARCIS (Netherlands)

Pantic, Maja; Santos, E.; Pentland, A.; Nijholt, Antinus

2009-01-01

The seven articles in this special issue focus on human computing. Most focus on two challenging issues in human computing, namely, machine analysis of human behavior in group interactions and context-sensitive modeling.

14. Human ear recognition by computer

CERN Document Server

Bhanu, Bir; Chen, Hui

2010-01-01

Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden with hair, scarf and jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with age and facial expressions. "Human Ear Recognition by Computer" is the first book o

15. Human-centered Computing: Toward a Human Revolution

OpenAIRE

Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

2007-01-01

Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

16. Cooperation in human-computer communication

OpenAIRE

Kronenberg, Susanne

2000-01-01

The goal of this thesis is to simulate cooperation in human-computer communication, modelling the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in which coherence is maintained between the contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

17. Human Computing and Machine Understanding of Human Behavior: A Survey

NARCIS (Netherlands)

Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas; Quek, F.; Yang, Yie

2006-01-01

A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

18. Language evolution and human-computer interaction

Science.gov (United States)

Grudin, Jonathan; Norman, Donald A.

1991-01-01

Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

19. Occupational stress in human computer interaction.

Science.gov (United States)

Smith, M J; Conway, F T; Karsh, B T

1999-04-01

There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

20. Human Adaptation to the Computer.

Science.gov (United States)

1986-09-01

Only fragments of this report survive in the record: front-matter headings (a chapter on "technostress", VII. Conclusions, List of References, Bibliography) and passages of the abstract. The recoverable argument is that smooth human adaptation to the computer has not developed; instead, what has developed is a "modern disease of adaptation" called "technostress," a phrase coined by Craig Brod, who holds that managers have been implementing computers in ways that contribute directly to this stress.

1. Challenges for Virtual Humans in Human Computing

NARCIS (Netherlands)

Reidsma, Dennis; Ruttkay, Z.M.; Huang, T; Nijholt, Antinus; Pantic, Maja; Pentland, A.

The vision of Ambient Intelligence (AmI) presumes a plethora of embedded services and devices that all endeavor to support humans in their daily activities as unobtrusively as possible. Hardware gets distributed throughout the environment, occupying even the fabric of our clothing. The environment

2. Object categorization: computer and human vision perspectives

National Research Council Canada - National Science Library

Dickinson, Sven J

2009-01-01

The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience...

3. Human law and computer law comparative perspectives

CERN Document Server

Hildebrandt, Mireille

2014-01-01

This book probes the epistemological and hermeneutic implications of data science and artificial intelligence for democracy and the Rule of Law, and the challenges posed by computing technologies to traditional legal thinking and the regulation of human affairs.

4. Fundamentals of human-computer interaction

CERN Document Server

Monk, Andrew F

1985-01-01

Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

5. Approaching Engagement towards Human-Engaged Computing

DEFF Research Database (Denmark)

Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

2018-01-01

Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the ever increasingly intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to...

6. Modeling multimodal human-computer interaction

NARCIS (Netherlands)

Obrenovic, Z.; Starcevic, D.

2004-01-01

Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

7. Human computing and machine understanding of human behavior: A survey

NARCIS (Netherlands)

Pentland, Alex; Huang, Thomas S.; Nijholt, Antinus; Pantic, Maja

2007-01-01

A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing should be about anticipatory user interfaces

8. Pilots of the future - Human or computer?

Science.gov (United States)

Chambers, A. B.; Nagel, D. C.

1985-01-01

In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

9. Parallel structures in human and computer memory

Science.gov (United States)

Kanerva, Pentti

1986-08-01

If we think of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library: We recognize a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. This paper is about how to construct a computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. The paper concludes that the frame problem of artificial intelligence could be solved by the use of such a memory if we were able to encode information about the world properly.
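The cue-tolerant retrieval described here (recognizing a familiar tune on a strange instrument) can be sketched as best-match lookup over stored binary patterns; the pattern sizes and noise level below are illustrative assumptions, not Kanerva's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)
patterns = rng.integers(0, 2, size=(100, 256))  # 100 stored 256-bit patterns

def recall(cue):
    """Return the stored pattern nearest to the cue in Hamming distance."""
    dists = (patterns != cue).sum(axis=1)
    return patterns[dists.argmin()]

# A cue with ~20% of its bits flipped still retrieves the original pattern,
# because unrelated random patterns sit at distance ~128 on average.
original = patterns[42].copy()
noisy = original.copy()
noisy[rng.choice(256, size=51, replace=False)] ^= 1
print((recall(noisy) == original).all())  # True
```

The point of the comparison in the abstract is that a memory addressed by content in this way degrades gracefully under approximate cues, unlike a conventional memory addressed by exact location.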

10. Applying Human Computation Methods to Information Science

Science.gov (United States)

Harris, Christopher Glenn

2013-01-01

Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

11. Feedback Loops in Communication and Human Computing

NARCIS (Netherlands)

op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas S.

Building systems that are able to analyse communicative behaviours or take part in conversations requires a sound methodology in which the complex organisation of conversations is understood and tested on real-life samples. The data-driven approaches to human computing not only have a value for the

12. Human Memory Organization for Computer Programs.

Science.gov (United States)

Norcio, A. F.; Kerst, Stephen M.

1983-01-01

Results of study investigating human memory organization in processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance organizational process of recall of statements in five Fortran…

13. Computational Complexity and Human Decision-Making.

Science.gov (United States)

Bossaerts, Peter; Murawski, Carsten

2017-12-01

The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology.
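The "difficulty of finding the best option" is easy to make concrete: even a toy best-bundle choice (a 0/1 knapsack with hypothetical values and weights) forces a naive decision-maker to examine 2^n candidate bundles:

```python
from itertools import combinations

# Toy "choose the best bundle" decision: 0/1 knapsack, solved by brute force.
values  = [60, 100, 120, 40, 70]
weights = [10,  20,  30, 15, 25]
budget  = 50

best = 0
items = range(len(values))
for r in range(len(values) + 1):
    for subset in combinations(items, r):
        if sum(weights[i] for i in subset) <= budget:
            best = max(best, sum(values[i] for i in subset))

print(best)  # 220, found after checking all 2**5 = 32 subsets
```

Five options are harmless, but the candidate count doubles with every added option, which is the kind of exponential blow-up CCT uses to argue that exhaustive rational choice is implausible for real decision-makers.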

14. Introduction to human-computer interaction

CERN Document Server

Booth, Paul

2014-01-01

Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

15. Proxemics in Human-Computer Interaction

OpenAIRE

Greenberg, Saul; Hornbæk, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

2014-01-01

In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

16. Human-Computer Interaction in Smart Environments

Science.gov (United States)

Paravati, Gianluca; Gatteschi, Valentina

2015-01-01

Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

17. Human-computer interaction : Guidelines for web animation

OpenAIRE

2006-01-01

Human-computer interaction in the large is an interdisciplinary area which attracts researchers, educators, and practitioners from many different fields. Human-computer interaction studies a human and a machine in communication; it draws from supporting knowledge on both the machine and the human side. This paper is related to the human side of human-computer interaction and focuses on animations. The growing use of animation in Web pages testifies to the increasing ease with which such multim...

18. Brain-Computer Interfaces Revolutionizing Human-Computer Interaction

CERN Document Server

Graimann, Bernhard; Allison, Brendan

2010-01-01

A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

19. Human-Computer Interaction in Smart Environments

Directory of Open Access Journals (Sweden)

Gianluca Paravati

2015-08-01

Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

20. Human-Computer Interaction The Agency Perspective

CERN Document Server

Oliveira, José

2012-01-01

Agent-centric theories, approaches and technologies are contributing to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents), the book covers a wealth of novel, original and fully updated material, offering: a coherent, in-depth, and timely treatment of the agency perspective in HCI; an authoritative treatment of the subject matter presented by carefully selected authors; a balanced and broad coverage of the subject area, including human, organizational, social, as well as technological concerns; and hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of resea...

1. Measuring Multimodal Synchrony for Human-Computer Interaction

NARCIS (Netherlands)

Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

2010-01-01

Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their

2. Human computer interaction using hand gestures

CERN Document Server

Premaratne, Prashan

2014-01-01

Human computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards and mouse or remotes do alienate old and new generations alike from control interfaces. Hand Gesture Recognition systems bring hope of connecting people with machines in a natural way. This will lead to consumers being able to use their hands naturally to communicate with any electronic equipment in their 'lounge.' This monograph includes the state-of-the-art hand gesture recognition approaches and how they evolved from their inception. The author also details his research in this area over the past 8 years and how the future might turn out using HCI. This monograph will serve as a valuable guide for researchers who would endeavour into the world of HCI.

3. Human Computation An Integrated Approach to Learning from the Crowd

CERN Document Server

Law, Edith

2011-01-01

Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

4. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

CERN Document Server

Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

2014-01-01

The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. Therefore this book includes the various theories and practical applications in human-centric computing and embedded and multimedia computing.

5. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

Science.gov (United States)

Ehrlich, Heyward

1991-01-01

Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

6. The epistemology and ontology of human-computer interaction

NARCIS (Netherlands)

Brey, Philip A.E.

2005-01-01

This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic:

7. 2012 International Conference on Human-centric Computing

CERN Document Server

Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

2012-01-01

The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

8. Computational Analysis of Human Blood Flow

Science.gov (United States)

Panta, Yogendra; Marie, Hazel; Harvey, Mark

2009-11-01

Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming the blood flow as laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with SolidWorks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branched and angle-shaped, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
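As a back-of-envelope cross-check of the kind such CFD runs refine, steady laminar (Hagen-Poiseuille) pipe flow gives a wall shear stress of tau = 4*mu*Q/(pi*R^3); the viscosity, flow rate, and vessel radius below are rough textbook-scale assumptions, not values taken from this study:

```python
import math

mu = 3.5e-3           # blood dynamic viscosity, Pa*s (assumed)
Q  = 5.0 / 60 / 1000  # ~5 L/min cardiac output, converted to m^3/s
R  = 0.0125           # aortic radius, m (assumed)

# Hagen-Poiseuille wall shear stress for steady laminar flow in a rigid tube.
tau = 4 * mu * Q / (math.pi * R ** 3)
print(f"wall shear stress ~ {tau:.2f} Pa")
```

Steady Poiseuille flow in a rigid tube understates pulsatile aortic shear, which is one reason a time-dependent, compliant-wall CFD analysis is needed in the first place.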

9. Human-Computer Interaction and Information Management Research Needs

Data.gov (United States)

Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

10. Computer Modeling of Human Delta Opioid Receptor

Directory of Open Access Journals (Sweden)

Tatyana Dzimbova

2013-04-01

The development of selective agonists of the δ-opioid receptor, and models of how ligands interact with this receptor, are subjects of increasing interest. In the absence of crystal structures of opioid receptors, 3D homology models built on different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, among recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All were evaluated with PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by several approaches: it has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). Its scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation therefore suggests a reliable model of DOR, which can be used in further in silico experiments and should enable faster and more accurate design of selective and effective ligands for the δ-opioid receptor.
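The model-selection criterion above is a Pearson correlation between a docking score and measured efficacy. A minimal sketch of that computation follows; the compound values are made-up illustrative numbers, not the study's data (the paper reports r = -0.7368 for its own series).

```python
# Pearson correlation between docking Fitness scores and in vitro
# efficacy (erel). All data below are hypothetical illustrations.
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical enkephalin analogues: on a scale where a lower Fitness
# value means better docking, better binders pair with higher efficacy,
# giving a strongly negative correlation.
fitness = [52.1, 48.7, 45.3, 41.0, 38.2, 35.9]
erel = [0.21, 0.35, 0.48, 0.62, 0.74, 0.88]

r = pearson_r(fitness, erel)
print(f"Pearson r = {r:.4f}")
```

In practice one would also report a p-value (e.g. via `scipy.stats.pearsonr`) as the study does; the hand-rolled version here only shows the coefficient itself.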

11. Multimodal Information Presentation for High-Load Human Computer Interaction

NARCIS (Netherlands)

Cao, Y.

2011-01-01

This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

12. Stereo Vision for Unrestricted Human-Computer Interaction

OpenAIRE

Eldridge, Ross; Rudolph, Heiko

2008-01-01

Human-computer interfaces have come a long way in recent years, but the goal of a computer interpreting unrestricted human movement remains elusive. The use of stereo vision in this field has enabled the development of systems that begin to approach this goal. As computer technology advances, we come ever closer to a system that can react to the ambiguities of human movement in real time. In the foreseeable future, stereo computer vision is not likely to replace the keyboard or mouse. There is at...

13. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

OpenAIRE

Walter Ritter

2011-01-01

A lot of effort has been directed toward enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing before outlining a new approach that adds analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

14. Human computer confluence applied in healthcare and rehabilitation.

Science.gov (United States)

Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen

2012-01-01

Human-computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual and augmented reality, and it offers great potential for applications in medicine and rehabilitation.

15. From Human-Computer Interaction to Human-Robot Social Interaction

OpenAIRE

2014-01-01

Human-robot social interaction has become one of the most active research fields, with researchers from different areas proposing solutions and directives that help robots improve their interactions with humans. In this paper we introduce work in both human-robot interaction and human-computer interaction and build a bridge between them, i.e., we integrate the robot's emotions and capabilities into a human-computer model so that it becomes adequate for human-robot interaction, and discuss chall...

16. Safety Metrics for Human-Computer Controlled Systems

Science.gov (United States)

Leveson, Nancy G; Hatanaka, Iwao

2000-01-01

The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques used for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic, model-based framework for analyzing risk in safety-critical systems where both computers and humans control safety-critical functions. A new system accident model is developed, based on modern systems theory and human cognitive processes, to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

17. Human performance models for computer-aided engineering

Science.gov (United States)

Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

1989-01-01

This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

18. Image Visual Realism: From Human Perception to Machine Computation.

Science.gov (United States)

Fan, Shaojing; Ng, Tian-Tsong; Koenig, Bryan L; Herberg, Jonathan S; Jiang, Ming; Shen, Zhiqi; Zhao, Qi

2017-08-30

Visual realism is defined as the extent to which an image appears to people as a photo rather than computer generated. Assessing visual realism is important in applications like computer graphics rendering and photo retouching. However, current realism evaluation approaches use either labor-intensive human judgments or automated algorithms largely dependent on comparing renderings to reference images. We develop a reference-free computational framework for visual realism prediction to overcome these constraints. First, we construct a benchmark dataset of 2520 images with comprehensive human annotated attributes. From statistical modeling on this data, we identify image attributes most relevant for visual realism. We propose both empirically-based (guided by our statistical modeling of human data) and CNN-learned features to predict visual realism of images. Our framework has the following advantages: (1) it creates an interpretable and concise empirical model that characterizes human perception of visual realism; (2) it links computational features to latent factors of human image perception.

19. Computational Intelligence in a Human Brain Model

Directory of Open Access Journals (Sweden)

Viorel Gaftea

2016-06-01

This paper focuses on current trends in the brain research domain and the current stage of development of software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science, and Internet of Things (IoT) devices. The proposed model of the human brain assumes a key similarity between human intelligence and the thinking process of the chess game: tactical and strategic reasoning and the need to follow the rules of the game closely resemble the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving, and eliminating adversaries. The brain pursues these goals, and in addition, a being's movement, actions, and speech are sustained by the five vital senses and equilibrium. Chess-game strategy helps us understand the human brain better and replicate it more easily in the proposed 'Software and Hardware' (SAH) Model.

20. The Next Wave: Humans, Computers, and Redefining Reality

Science.gov (United States)

Little, William

2018-01-01

The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of human-computer interface, human-computer interaction, augmented reality, virtual reality, and mixed reality are defined, and examples of work being done in these fields in the AVR Lab are given. Current and future work in computer vision, speech recognition, and artificial intelligence is also outlined.

1. Accident sequence analysis of human-computer interface design

International Nuclear Information System (INIS)

Fan, C.-F.; Chen, W.-H.

2000-01-01

It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with two analysis techniques: an Augmented Fault Tree Analysis and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify potential weak points in a software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can then enumerate possible accident sequences due to these weak points.
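Quantitative fault-tree evaluation combines basic-event probabilities through AND/OR gates. The sketch below is a minimal generic illustration of that arithmetic, assuming independent events; the event names and probabilities are hypothetical and are not taken from this paper, which is concerned with the qualitative identification of weak points.

```python
# Minimal fault-tree gate arithmetic for a hypothetical human-computer
# interaction hazard. Independence of basic events is assumed.

def and_gate(probs):
    """All inputs must fail: product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Any input failing suffices: 1 - product of survival probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical weak points: a misleading display state combined with
# either an operator slip or a software mode confusion.
p_misleading_display = 1e-3
p_operator_slip = 5e-2
p_mode_confusion = 1e-2

p_erroneous_action = or_gate([p_operator_slip, p_mode_confusion])
p_accident_sequence = and_gate([p_misleading_display, p_erroneous_action])
print(f"P(top event) ~= {p_accident_sequence:.2e}")
```

The concurrent event tree described in the abstract would then order such contributing events in time to enumerate distinct accident sequences.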

2. Human-computer interaction and management information systems

CERN Document Server

Galletta, Dennis F

2014-01-01

""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

3. Mobile human-computer interaction perspective on mobile learning

CSIR Research Space (South Africa)

2010-10-01

Applying a Mobile Human-Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...

4. Cognition beyond the brain computation, interactivity and human artifice

CERN Document Server

Cowley, Stephen J

2013-01-01

Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

5. Computers, the Human Mind, and My In-Laws' House.

Science.gov (United States)

Esque, Timm J.

1996-01-01

Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

6. The Emotiv EPOC interface paradigm in Human-Computer Interaction

OpenAIRE

Ancău Dorina; Roman Nicolae-Marius; Ancău Mircea

2017-01-01

Numerous studies have suggested using decoded error potentials from the brain to improve human-computer communication. Alongside state-of-the-art scientific equipment, experiments have also tested instruments whose performance is, for the time being, more limited, such as the Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for er...

7. Where computers disappear, virtual humans appear

NARCIS (Netherlands)

Nijholt, Antinus; Sourin, A.

2004-01-01

In this paper, we survey the role of virtual humans (or embodied conversational agents) in smart and ambient intelligence environments. Research in this area can profit from research done earlier in virtual reality environments and research on verbal and nonverbal interaction. We discuss virtual

8. Audio Technology and Mobile Human Computer Interaction

DEFF Research Database (Denmark)

2017-01-01

Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives in this area. It is not only the technical systems that are developing; novel approaches to the design and understanding of audio-based mobile systems are also evolving, offering new perspectives on interaction and design and supporting the application of such systems in areas such as the humanities.

9. Object recognition in images by human vision and computer vision

NARCIS (Netherlands)

Chen, Q.; Dijkstra, J.; Vries, de B.

2010-01-01

Object recognition plays a major role in human behaviour research in the built environment. Computer based object recognition techniques using images as input are challenging, but not an adequate representation of human vision. This paper reports on the differences in object shape recognition

10. A Perspective on Computational Human Performance Models as Design Tools

Science.gov (United States)

Jones, Patricia M.

2010-01-01

The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

11. Home Computer Use and the Development of Human Capital

Science.gov (United States)

Malamud, Ofer; Pop-Eleches, Cristian

2012-01-01

This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
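The identification strategy above is a sharp regression discontinuity: eligibility for the voucher switches at an income cutoff, so the treatment effect can be read off as the jump in outcomes at that threshold. The sketch below simulates such a design; the cutoff, bandwidth, and the built-in -0.3 grade effect are all made-up illustrative values, not estimates from the Romanian program.

```python
# Sketch of a sharp regression discontinuity estimate. Data are
# simulated; 'income' is the running variable, and households below
# the (hypothetical) cutoff win a computer voucher.
import random

random.seed(0)
CUTOFF = 100.0  # hypothetical eligibility threshold

def simulate(n=2000):
    rows = []
    for _ in range(n):
        income = random.uniform(50, 150)
        voucher = income < CUTOFF  # sharp assignment rule
        # Outcome: smooth in income, plus a -0.3 treatment effect on grades.
        grades = (7.0 + 0.01 * income
                  + (-0.3 if voucher else 0.0)
                  + random.gauss(0, 0.2))
        rows.append((income, voucher, grades))
    return rows

def rd_estimate(rows, bandwidth=5.0):
    """Difference in mean outcome just below vs just above the cutoff."""
    below = [g for x, v, g in rows if CUTOFF - bandwidth <= x < CUTOFF]
    above = [g for x, v, g in rows if CUTOFF <= x < CUTOFF + bandwidth]
    return sum(below) / len(below) - sum(above) / len(above)

effect = rd_estimate(simulate())
print(f"estimated effect of voucher on grades: {effect:.2f}")
```

A naive difference of means within a narrow bandwidth, as here, carries a small bias from the outcome's slope in the running variable; published RD studies typically fit local linear regressions on each side of the cutoff instead.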

12. The UK Human Genome Mapping Project online computing service.

Science.gov (United States)

Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

1992-04-01

This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

13. A Research Roadmap for Computation-Based Human Reliability Analysis

Energy Technology Data Exchange (ETDEWEB)

Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

2015-08-01

The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

14. A Research Roadmap for Computation-Based Human Reliability Analysis

International Nuclear Information System (INIS)

Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

2015-01-01

The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

15. The Past, Present and Future of Human Computer Interaction

KAUST Repository

Churchill, Elizabeth

2018-01-16

Human Computer Interaction (HCI) focuses on how people interact with, and are transformed by, computation. Our current technology landscape is changing rapidly. Interactive applications, devices, and services are increasingly embedded in our environments, from our homes to the urban and rural spaces we traverse every day. We are increasingly able to, and often required to, manage and configure multiple interconnected devices and program their interactions. Artificial intelligence (AI) techniques are being used to create dynamic services that learn about us and others, that draw conclusions about our intents and affiliations, and that mould our digital interactions based on predictions about our actions and needs, nudging us toward certain behaviors. Computation is also increasingly embedded in our bodies. Understanding human interactions in everyday digital and physical contexts is therefore central. During this lecture, Elizabeth Churchill, Director of User Experience at Google, will talk about how this emerging landscape invites us to revisit old methods and tactics for understanding how people interact with computers and computation, and how it challenges us to think about new methods and frameworks for understanding the future of human-centered computation.

16. The Emotiv EPOC interface paradigm in Human-Computer Interaction

Directory of Open Access Journals (Sweden)

Ancău Dorina

2017-01-01

Numerous studies have suggested using decoded error potentials from the brain to improve human-computer communication. Alongside state-of-the-art scientific equipment, experiments have also tested instruments whose performance is, for the time being, more limited, such as the Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

17. From humans to computers cognition through visual perception

CERN Document Server

Alexandrov, Viktor Vasilievitch

1991-01-01

This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

18. An intelligent multi-media human-computer dialogue system

Science.gov (United States)

Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

1988-01-01

Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

19. A Human-Centred Tangible approach to learning Computational Thinking

Directory of Open Access Journals (Sweden)

Tommaso Turchi

2016-08-01

Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates the thinking skills integral to solving complex problems using a computer and is thus widely applicable in our society. It is influencing research across many disciplines and is also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

20. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

Science.gov (United States)

Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

2016-03-01

Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.
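The study's headline numbers are within-subject contrasts (each patient plays both the AR and the PC version). A minimal sketch of that kind of paired comparison follows; the scores are simulated with a built-in ~20% AR advantage and are not the patients' data.

```python
# Paired within-subject comparison of hypothetical AR vs PC game
# scores, illustrating the kind of contrast the stroke study reports.
import math
import random

random.seed(1)

# Each of 18 simulated patients plays both versions; AR scores are
# drawn roughly 20% higher on average than the same patient's PC score.
pc_scores = [random.gauss(100, 15) for _ in range(18)]
ar_scores = [s * random.gauss(1.20, 0.05) for s in pc_scores]

def paired_t(xs, ys):
    """Paired t statistic on within-subject differences."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

pct_gain = 100 * (sum(ar_scores) / sum(pc_scores) - 1)
t = paired_t(ar_scores, pc_scores)
print(f"AR vs PC: {pct_gain:.1f}% higher scores, paired t = {t:.2f}")
```

Pairing each patient with themselves removes between-patient variability from the comparison, which is why within-subject designs like this one can detect modest differences with only 18 participants.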

1. Plants and Human Affairs: Educational Enhancement Via a Computer.

Science.gov (United States)

Crovello, Theodore J.; Smith, W. Nelson

To enhance both teaching and learning in an advanced undergraduate elective course on the interrelationships of plants and human affairs, the computer was used for information retrieval, multiple-choice course review, and the running of three simulation models of plant-related systems (e.g., the rise in world coffee prices after the 1975 freeze in…

2. Humor in Human-Computer Interaction : A Short Survey

NARCIS (Netherlands)

Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

2017-01-01

This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

3. A Software Framework for Multimodal Human-Computer Interaction Systems

NARCIS (Netherlands)

Shen, Jie; Pantic, Maja

2009-01-01

This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

4. Computational 3-D Model of the Human Respiratory System

Science.gov (United States)

We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

5. Why computer games can be essential for human flourishing

NARCIS (Netherlands)

Fröding, B.; Peterson, M.B.

2013-01-01

Traditionally, playing computer games and engaging in other online activities has been seen as a threat to well-being, health and long-term happiness. It is feared that spending many hours per day in front of the screen leads the individual to forsake other, more worthwhile activities, such as human

6. Homo ludens in the loop playful human computation systems

CERN Document Server

Krause, Markus

2014-01-01

The human mind is incredible. It solves problems with ease that will elude machines even for the next decades. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how machines and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

7. Computational Fluid and Particle Dynamics in the Human Respiratory System

CERN Document Server

2013-01-01

Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

8. A novel polar-based human face recognition computational model

Directory of Open Access Journals (Sweden)

Y. Zana

2009-07-01

Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance on FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

9. Human-Computer Interaction, Tourism and Cultural Heritage

Science.gov (United States)

Cipolla Ficarra, Francisco V.

We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in some cities of the European Mediterranean. We analyze the main problems deriving from training treated as a business, which can derail the continued growth of HCI, the new technologies, and the tourism industry. Through a semiotic and epistemological study, the current mistakes in the context of the interrelations of the formal and factual sciences are detected, as well as the human factors that influence the professionals devoted to the development of interactive systems for safeguarding and boosting cultural heritage.

10. Computer aided systems human engineering: A hypermedia tool

Science.gov (United States)

Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

1992-01-01

The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

11. Overview Electrotactile Feedback for Enhancing Human Computer Interface

Science.gov (United States)

Pamungkas, Daniel S.; Caesarendra, Wahyu

2018-04-01

To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has been increasingly utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user’s interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on its surface. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, it describes the sensory receptors within the skin that sense tactile stimuli and electric currents, as well as several factors that influence how an electric signal is transmitted to the brain via human skin.

12. Human-computer systems interaction backgrounds and applications 3

CERN Document Server

Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

2014-01-01

This book contains an interesting, state-of-the-art collection of papers on recent progress in Human-Computer System Interaction (H-CSI). It provides a thorough description of the current status of the H-CSI field and a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the particular papers and their organization into parts make this volume fascinating reading, giving the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all deep issues that ...

13. Computer simulation of human motion in sports biomechanics.

Science.gov (United States)

Vaughan, C L

1984-01-01

This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive, and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often, unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: the power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. Memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

14. Electromagnetic Modeling of Human Body Using High Performance Computing

Science.gov (United States)

Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of wirelessly powering implanted devices through coupling from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom that is characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom has been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

15. Intermittent control: a computational theory of human control.

Science.gov (United States)

Gawthrop, Peter; Loram, Ian; Lakie, Martin; Gollee, Henrik

2011-02-01

The paradigm of continuous control using internal models has advanced understanding of human motor control. However, this paradigm ignores some aspects of human control, including intermittent feedback, serial ballistic control, triggered responses and refractory periods. It is shown that event-driven intermittent control provides a framework to explain the behaviour of the human operator under a wider range of conditions than continuous control. Continuous control is included as a special case, but sampling, system matched hold, an intermittent predictor and an event trigger allow serial open-loop trajectories using intermittent feedback. The implementation here may be described as "continuous observation, intermittent action". Beyond explaining unimodal regulation distributions in common with continuous control, these features naturally explain refractoriness and bimodal stabilisation distributions observed in double stimulus tracking experiments and quiet standing, respectively. Moreover, given that human control systems contain significant time delays, a biological-cybernetic rationale favours intermittent over continuous control: intermittent predictive control is computationally less demanding than continuous predictive control. A standard continuous-time predictive control model of the human operator is used as the underlying design method for an event-driven intermittent controller. It is shown that when event thresholds are small and sampling is regular, the intermittent controller can masquerade as the underlying continuous-time controller and thus, under these conditions, the continuous-time and intermittent controller cannot be distinguished. This explains why the intermittent control hypothesis is consistent with the continuous control hypothesis for certain experimental conditions.
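The "continuous observation, intermittent action" idea can be illustrated with a toy event-triggered loop. The scalar plant, gain, threshold, and hold below are illustrative assumptions, not the authors' model: control is recomputed only when the observed state leaves a dead band, and the system-matched hold is simplified to a zero-order hold between events.

```python
# Toy event-driven intermittent control of a scalar integrator plant
# dx/dt = u + d (constant disturbance d). The state is observed at every
# step, but the control action is recomputed only at trigger events.
def simulate(threshold=0.2, dt=0.01, steps=2000, k=2.0, d=0.3):
    x, u = 1.0, 0.0
    events = 0
    for _ in range(steps):
        # Continuous observation: the state is monitored at every step.
        if abs(x) > threshold:   # event trigger
            u = -k * x           # intermittent action: recompute control
            events += 1
        # Between events, u is held fixed (zero-order hold), i.e. the
        # trajectory is open-loop until the next event.
        x += (u + d) * dt
    return x, events

final_x, events = simulate()
```

A continuous controller would update `u` at all 2000 steps; here the update count stays far below that while the state remains bounded near the dead band.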

16. Computed tomography of human joints and radioactive waste drums

International Nuclear Information System (INIS)

Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

1999-01-01

X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed: (1) our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted; computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity

17. Identification of Enhancers In Human: Advances In Computational Studies

KAUST Repository

Kleftogiannis, Dimitrios A.

2016-03-24

Roughly 50% of the human genome consists of noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well-studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations: the function of enhancers is clarified, but their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers’ functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers’ content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

18. CHI '13 Extended Abstracts on Human Factors in Computing Systems

DEFF Research Database (Denmark)

also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We...

19. Code system to compute radiation dose in human phantoms

International Nuclear Information System (INIS)

Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

1986-01-01

Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods

20. Shape perception in human and computer vision an interdisciplinary perspective

CERN Document Server

Dickinson, Sven J

2013-01-01

This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

1. Virtual reality/ augmented reality technology : the next chapter of human-computer interaction

OpenAIRE

Huang, Xing

2015-01-01

No matter how many different sizes and shapes computers take, their basic components are still the same. If we look at the development of computer history from the user's perspective, we can surprisingly find that it is the input/output devices that lead the development of the industry; in one word, human-computer interaction changes the development of computer history. Human-computer interaction has gone through three stages; the first stage relies on the inpu...

2. My4Sight: A Human Computation Platform for Improving Flu Predictions

OpenAIRE

Akupatni, Vivek Bharath

2015-01-01

While many human computation (human-in-the-loop) systems exist in the field of Artificial Intelligence (AI) to solve problems that can't be solved by computers alone, comparatively fewer platforms exist for collecting human knowledge and for evaluating the various techniques for harnessing human insights to improve forecasting models for infectious diseases such as Influenza and Ebola. In this thesis, we present the design and implementation of My4Sight, a human computation system develope...

3. Aspects of computer control from the human engineering standpoint

International Nuclear Information System (INIS)

Huang, T.V.

1979-03-01

A computer control system includes data acquisition, information display and output control signals. In order to design such a system effectively we must first determine the required operational mode: automatic control (closed loop), computer assisted (open loop), or hybrid control. The choice of operating mode will depend on the nature of the plant, the complexity of the operation, the funds available, and the technical expertise of the operating staff, among many other factors. Once the mode has been selected, consideration must be given to the method (man/machine interface) by which the operator interacts with the system. The human engineering factors are of prime importance to achieving high operating efficiency, and very careful attention must be given to this aspect of the work if full operator acceptance is to be achieved. This paper discusses these topics and draws on experience gained in setting up the computer control system in the Main Control Center for Stanford University's Accelerator Center (a high energy physics research facility)

4. Evidence Report: Risk of Inadequate Human-Computer Interaction

Science.gov (United States)

Holden, Kritina; Ezer, Neta; Vos, Gordon

2013-01-01

5. Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction

Directory of Open Access Journals (Sweden)

Shishkin S. L.

2017-09-01

Full Text Available Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain single output channels with their typical constraints imposed by the motor system’s information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of “eye mouse” superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer (EBCI) interface will have a chance to enable natural, fluent, and the

6. Human-computer interface incorporating personal and application domains

Science.gov (United States)

Anderson, Thomas G [Albuquerque, NM

2011-03-29

The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

7. A computational model of human auditory signal processing and perception

DEFF Research Database (Denmark)

Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

2008-01-01

A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997...... discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key...... properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications....

8. Human-computer interface glove using flexible piezoelectric sensors

Science.gov (United States)

Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

2017-05-01

In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.

9. Simple, accurate equations for human blood O2 dissociation computations.

Science.gov (United States)

Severinghaus, J W

1979-03-01

Hill's equation can be slightly modified to fit the standard human blood O2 dissociation curve to within ±0.0055 fractional saturation (S) over 0 < S < 1. Other modifications of Hill's equation may be used to compute Po2 (Torr) from S (Eq. 2) and the temperature coefficient of Po2 (Eq. 3); the variation of the Bohr coefficient with Po2 is given by Eq. 4:

S = (23,400 / (Po2^3 + 150 Po2) + 1)^-1   (1)

ln Po2 = 0.385 ln (S^-1 - 1)^-1 + 3.32 - (72 S)^-1 - 0.17 S^6   (2)

delta ln Po2 / delta T = 0.058 ((0.243 Po2 / 100)^3.88 + 1)^-1 + 0.013   (3)

delta ln Po2 / delta pH = (Po2 / 26.6)^0.184 - 2.2   (4)

Procedures are described to determine Po2 and S of blood iteratively after extraction or addition of a defined amount of O2, and to compute P50 of blood from a single sample after measuring Po2, pH, and S.
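Equations 1 and 2 of this record translate directly into code; a minimal sketch (using the identity (S^-1 - 1)^-1 = S/(1 - S)):

```python
import math

def sat_from_po2(po2):
    """Fractional O2 saturation S from Po2 in Torr (Eq. 1)."""
    return 1.0 / (23400.0 / (po2**3 + 150.0 * po2) + 1.0)

def po2_from_sat(s):
    """Po2 in Torr from fractional saturation S (Eq. 2)."""
    ln_po2 = (0.385 * math.log(s / (1.0 - s))   # (S^-1 - 1)^-1 = S/(1-S)
              + 3.32 - 1.0 / (72.0 * s) - 0.17 * s**6)
    return math.exp(ln_po2)
```

At Po2 = 26.6 Torr (the conventional P50), Eq. 1 gives S ≈ 0.49, and the two fits round-trip at Po2 = 100 Torr to within about 1 Torr, consistent with the stated fit accuracy.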

10. Assessing Human Judgment of Computationally Generated Swarming Behavior

Directory of Open Access Journals (Sweden)

John Harvey

2018-02-01

Full Text Available Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized it as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we assess subjects' judgments of the behavior of a simplified version of Reynolds' model. Factors that affect the identification of swarming are discussed and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.
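A simplified boids update of the kind such studies build on can be sketched as follows; the neighborhood radius and rule weights are illustrative assumptions, not the paper's parameters:

```python
# Minimal 2-D boids step in the spirit of Reynolds' 1987 model: each boid
# steers by three local rules (separation, alignment, cohesion).
import math

def step(boids, r=5.0, w_sep=0.05, w_ali=0.05, w_coh=0.005, vmax=1.0):
    """boids: list of [x, y, vx, vy]; returns the next flock state."""
    out = []
    for i, (x, y, vx, vy) in enumerate(boids):
        sep = [0.0, 0.0]; ali = [0.0, 0.0]; coh = [0.0, 0.0]; n = 0
        for j, (x2, y2, vx2, vy2) in enumerate(boids):
            if i == j:
                continue
            dx, dy = x2 - x, y2 - y
            if math.hypot(dx, dy) < r:           # neighbor within radius
                n += 1
                sep[0] -= dx; sep[1] -= dy       # steer away from neighbors
                ali[0] += vx2; ali[1] += vy2     # match neighbors' velocity
                coh[0] += dx; coh[1] += dy       # steer toward local center
        if n:
            vx += w_sep * sep[0] + w_ali * (ali[0] / n - vx) + w_coh * coh[0] / n
            vy += w_sep * sep[1] + w_ali * (ali[1] / n - vy) + w_coh * coh[1] / n
        speed = math.hypot(vx, vy)
        if speed > vmax:                          # clamp speed
            vx, vy = vx / speed * vmax, vy / speed * vmax
        out.append([x + vx, y + vy, vx, vy])
    return out
```

Tuning the weights and radius is exactly what makes such a model "finely tunable" for probing when observers judge the motion to be swarming.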

11. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

Science.gov (United States)

Rimland, Jeffrey C.

2013-01-01

In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

12. Mutations that Cause Human Disease: A Computational/Experimental Approach

Energy Technology Data Exchange (ETDEWEB)

Beernink, P; Barsky, D; Pesavento, B

2006-01-11

International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which

13. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

Science.gov (United States)

This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. Human Pacman combines physical role-playing computer fantasy with real human-social and mobile gaming, emphasizing collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer-graphics fantasy-reality provided by the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

14. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

Science.gov (United States)

Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

2016-01-01

A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

15. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

Directory of Open Access Journals (Sweden)

Alonso-Valerdi Luz María

2017-01-01

Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities.

16. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

Science.gov (United States)

Mitchell, Christine M.

1998-01-01

A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single-pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument System (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next-generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

17. A Human/Computer Learning Network to Improve Biodiversity Conservation and Research

OpenAIRE

Kelling, Steve; Gerbracht, Jeff; Fink, Daniel; Lagoze, Carl; Wong, Weng-Keen; Yu, Jun; Damoulas, Theodoros; Gomes, Carla

2012-01-01

In this paper we describe eBird, a citizen-science project that takes advantage of the human observational capacity to identify birds to species, which is then used to accurately represent patterns of bird occurrences across broad spatial and temporal extents. eBird employs artificial intelligence techniques such as machine learning to improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a Human-Computer Learning Network,...

18. A collaborative brain-computer interface for improving human performance.

Directory of Open Access Journals (Sweden)

Yijun Wang

Full Text Available Electroencephalogram (EEG) based brain-computer interfaces (BCIs) have been studied since the 1970s. Currently, the main focus of BCI research lies on clinical use, which aims to provide a new communication channel to patients with motor disabilities to improve their quality of life. However, BCI technology can also be used to improve human performance for normal, healthy users. Although this application has been proposed for a long time, little progress has been made in real-world practice due to the technical limits of EEG. To overcome the bottleneck of low single-user BCI performance, this study proposes a collaborative paradigm to improve overall BCI performance by integrating information from multiple users. To test the feasibility of a collaborative BCI, this study quantitatively compares the classification accuracies of collaborative and single-user BCIs applied to EEG data collected from 20 subjects in a movement-planning experiment. This study also explores three different methods for fusing and analyzing EEG data from multiple subjects: (1) event-related potential (ERP) averaging, (2) feature concatenating, and (3) voting. In a demonstration system using the voting method, the classification accuracy of predicting movement directions (reaching left vs. reaching right) was enhanced substantially from 66% to 80%, 88%, 93%, and 95% as the number of subjects increased from 1 to 5, 10, 15, and 20, respectively. Furthermore, the decision of reaching direction could be made around 100-250 ms earlier than the subject's actual motor response by decoding the ERP activities arising mainly from the posterior parietal cortex (PPC), which are related to the processing of visuomotor transformation. Taken together, these results suggest that a collaborative BCI can effectively fuse the brain activities of a group of people to improve the overall performance of natural human behavior.
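The voting method described above amounts to a majority vote over independent single-user classifiers. The sketch below is illustrative only: subject outputs are synthesized at roughly the reported 66% single-user accuracy rather than decoded from real EEG, and all names and sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-subject predictions: each subject's single-user classifier
# labels 100 trials as 0 (reach left) or 1 (reach right).
n_trials, n_subjects = 100, 20
truth = rng.integers(0, 2, n_trials)

def subject_predictions(truth, acc, rng):
    """Simulate one subject: each trial is flipped with probability 1 - acc."""
    flip = rng.random(truth.size) > acc
    return np.where(flip, 1 - truth, truth)

preds = np.stack([subject_predictions(truth, 0.66, rng) for _ in range(n_subjects)])

def vote(preds):
    """Group decision per trial: the majority label across subjects."""
    return (preds.mean(axis=0) >= 0.5).astype(int)

for k in (1, 5, 10, 20):
    acc = (vote(preds[:k]) == truth).mean()
    print(f"{k:2d} subjects: accuracy {acc:.2f}")
```

With independent errors, the majority vote's accuracy grows quickly with the number of subjects, which mirrors the qualitative trend reported in the abstract.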

19. [Computational prediction of human immunodeficiency resistance to reverse transcriptase inhibitors].

Science.gov (United States)

Tarasova, O A; Filimonov, D A; Poroikov, V V

2017-10-01

Human immunodeficiency virus (HIV) causes acquired immunodeficiency syndrome (AIDS) and leads to over one million deaths annually. Highly active antiretroviral treatment (HAART) is the gold standard in HIV/AIDS therapy. Nucleoside and non-nucleoside inhibitors of HIV reverse transcriptase (RT) are an important component of HAART, but their effect depends on HIV susceptibility/resistance. HIV resistance mainly occurs due to mutations leading to conformational changes in the three-dimensional structure of HIV RT. The aim of our work was to develop and test a computational method for prediction of HIV resistance associated with mutations in HIV RT. Earlier we developed a method for prediction of HIV type 1 (HIV-1) resistance based on position-specific descriptors. These descriptors are generated from the particular amino acid residue and its position, the position of each residue being determined in a multiple alignment. The training set consisted of more than 1900 sequences of HIV RT from the Stanford HIV Drug Resistance Database; for these HIV RT variants, experimental data on their resistance to ten inhibitors are available. Balanced accuracy of prediction varies from 80% to 99% depending on the method of classification (support vector machine, naive Bayes, random forest, convolutional neural networks) and the drug for which resistance is predicted. Maximal balanced accuracy was obtained for prediction of resistance to zidovudine, stavudine, didanosine and efavirenz by the random forest classifier. Average accuracy of prediction is 89%.
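A minimal sketch of the position-specific descriptor idea, assuming a one-hot (residue, alignment position) encoding fed to a random forest. The sequences, labels, and fragment length are invented purely for illustration and are far smaller than the Stanford HIVDB training set; the paper's actual descriptor construction may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Toy stand-in for aligned RT fragments (length 6), labelled
# 1 = resistant, 0 = susceptible. All data here is invented.
sequences = ["MVKLTA", "MVKMTA", "MIKLTA", "MVRLSA", "MIRMSA", "MVKLSA"]
labels    = [0,        1,        0,        1,        1,        0]

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def position_specific_descriptors(seq):
    """One-hot encode (residue, alignment position) pairs."""
    vec = np.zeros(len(seq) * len(AMINO_ACIDS))
    for pos, aa in enumerate(seq):
        vec[pos * len(AMINO_ACIDS) + AMINO_ACIDS.index(aa)] = 1.0
    return vec

X = np.array([position_specific_descriptors(s) for s in sequences])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(position_specific_descriptors("MVKMTA")[None, :]))
```

Because each feature encodes a specific residue at a specific alignment position, tree-based classifiers can pick out individual resistance-associated mutations directly.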

20. Institutionalizing human-computer interaction for global health.

Science.gov (United States)

Gulliksen, Jan

2017-06-01

Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses, and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways in which the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization, and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that digitalization evokes is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world, but it needs to be developed based on local practices, it needs international support, and it should not be limited by technological constraints. In particular, digitalization to support global health requires a profound understanding of the users and their context, arguing for user-centred systems design methodologies as particularly suitable.

1. Remotely Telling Humans and Computers Apart: An Unsolved Problem

Science.gov (United States)

Hernandez-Castro, Carlos Javier; Ribagorda, Arturo

The ability to tell humans and computers apart is imperative to protect many services from misuse and abuse. For this purpose, tests called CAPTCHAs or HIPs have been designed and put into production. Recent history shows that most (if not all) can be broken given enough time and commercial interest: CAPTCHA design seems to be a much more difficult problem than previously thought. The assumption that difficult AI problems can be easily converted into valid CAPTCHAs is misleading. There are also some extrinsic problems that do not help, especially the large number of in-house designs that are put into production without any prior public critique. In this paper we present a state-of-the-art survey of current HIPs, including proposals that are now in production. We classify them by their basic design ideas. We discuss current attacks as well as future attack paths, and we also present common errors in design and how implementation flaws can transform a not necessarily bad idea into a weak CAPTCHA. We present examples of these flaws, using specific well-known CAPTCHAs. In a more theoretical vein, we discuss the threat model: confronted risks and countermeasures. Finally, we introduce and discuss some desirable properties that new HIPs should have, concluding with some proposals for future work, including methodologies for design, implementation, and security assessment.

2. Inferring Human Activity in Mobile Devices by Computing Multiple Contexts.

Science.gov (United States)

Chen, Ruizhi; Chu, Tianxing; Liu, Keqiang; Liu, Jingbin; Chen, Yuwei

2015-08-28

This paper introduces a framework for inferring human activities in mobile devices by computing spatial contexts, temporal contexts, spatiotemporal contexts, and user contexts. A spatial context is a significant location that is defined as a geofence, which can be a node associated with a circle, or a polygon; a temporal context contains time-related information such as a local time tag, a time difference between geographical locations, or a timespan; a spatiotemporal context is defined as a dwelling length at a particular spatial context; and a user context includes user-related information such as the user's mobility contexts, environmental contexts, psychological contexts, or social contexts. Using the measurements of the built-in sensors and radio signals in mobile devices, we can snapshot a contextual tuple every second covering the aforementioned contexts. Given a contextual tuple, the framework evaluates the posterior probability of each candidate activity in real time using a Naïve Bayes classifier. A large dataset containing 710,436 contextual tuples was recorded over one week in an experiment carried out at Texas A&M University Corpus Christi with three participants. The test results demonstrate that the multi-context solution significantly outperforms the spatial-context-only solution: a classification accuracy of 61.7% is achieved for the spatial-context-only solution, while 88.8% is achieved for the multi-context solution.
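The posterior evaluation over discrete contextual tuples can be sketched with a hand-rolled naive Bayes classifier. The context values, activity labels, and the smoothing assumption (Laplace smoothing with a nominal four values per slot) are all hypothetical; the paper's actual feature set and probability model may differ.

```python
from collections import Counter, defaultdict

# Hypothetical contextual tuples: (spatial, temporal, spatiotemporal, user)
# context values with an activity label, invented for illustration.
training = [
    (("library", "morning",   "long",  "still"),  "studying"),
    (("library", "afternoon", "long",  "still"),  "studying"),
    (("gym",     "evening",   "short", "moving"), "exercising"),
    (("gym",     "morning",   "short", "moving"), "exercising"),
    (("cafe",    "noon",      "short", "still"),  "eating"),
    (("cafe",    "noon",      "long",  "still"),  "eating"),
]

def train_naive_bayes(data, alpha=1.0):
    prior = Counter(label for _, label in data)
    cond = defaultdict(Counter)            # (slot, label) -> value counts
    for tup, label in data:
        for slot, value in enumerate(tup):
            cond[(slot, label)][value] += 1
    return prior, cond, alpha

def posterior(model, tup):
    """Normalized posterior P(activity | contextual tuple)."""
    prior, cond, alpha = model
    total = sum(prior.values())
    scores = {}
    for label, n in prior.items():
        p = n / total
        for slot, value in enumerate(tup):
            counts = cond[(slot, label)]
            # Laplace smoothing, assuming ~4 possible values per slot.
            p *= (counts[value] + alpha) / (sum(counts.values()) + alpha * 4)
        scores[label] = p
    z = sum(scores.values())
    return {label: s / z for label, s in scores.items()}

model = train_naive_bayes(training)
print(posterior(model, ("gym", "morning", "short", "moving")))
```

In the framework above, this posterior would be recomputed once per second as a new contextual tuple is snapshotted.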

3. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

Energy Technology Data Exchange (ETDEWEB)

Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

2013-11-20

“Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits is the creation of a problem-solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

4. Human-Centred Computing for Assisting Nuclear Safeguards

International Nuclear Information System (INIS)

Szoke, I.

2015-01-01

With the rapid evolution of enabling hardware and software, technologies including 3D simulation, virtual reality (VR), augmented reality (AR), advanced user interfaces (UI), and geographical information systems (GIS) are increasingly employed in many aspects of modern life. In line with this, the nuclear industry is rapidly adopting emerging technologies to improve efficiency and safety by supporting planning and optimization of maintenance and decommissioning work, as well as for knowledge management, surveillance, training and briefing field operatives, education, etc. For many years, the authors have been involved in research and development (R&D) into the application of 3D simulation, VR, and AR, for mobile, desktop, and immersive 3D systems, to provide a greater sense of presence and situation awareness, for training, briefing, and in situ work by field operators. This work has resulted in a unique software base and experience (documented in numerous reports) from evaluating the effects of the design of training programmes and briefing sessions on human performance and training efficiency when applying various emerging technologies. In addition, the authors are involved in R&D into the use of 3D simulation, advanced UIs, mobile computing, and GIS systems to support realistic visualization of the combined radiological and geographical environment, as well as acquisition, analysis, visualization, and sharing of radiological and other data, within nuclear installations and their surroundings. The toolkit developed by the authors, and the associated knowledge base, has been successfully applied to various aspects of the nuclear industry, and has great potential within the safeguards domain. It can be used to train safeguards inspectors, brief inspectors before inspections, assist inspectors in situ (data registration, analysis, and communication), support the design and verification of safeguards systems, conserve data and experience, educate future safeguards

5. L'ordinateur a visage humain (The Computer in Human Guise).

Science.gov (United States)

Otman, Gabriel

1986-01-01

Discusses the tendency of humans to describe parts and functions of a computer with terminology that refers to human characteristics; for example, parts of the body (electronic brain), intellectual activities (optical memory), and physical activities (command). Computers are also described through metaphors, connotations, allusions, and analogies…

6. Computer science security research and human subjects: emerging considerations for research ethics boards.

Science.gov (United States)

Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

2011-06-01

This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

7. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

Science.gov (United States)

Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

2007-01-01

In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

8. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

OpenAIRE

Perrussel , Ronan; Voyer , Damien; Nicolas , Laurent; Scorretti , Riccardo; Burais , Noël

2011-01-01

International audience; Computation of electromagnetic fields in high-resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom derived from the Visible Human.

9. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

OpenAIRE

Witt, Hendrik

2007-01-01

The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

10. Appearance-based human gesture recognition using multimodal features for human computer interaction

Science.gov (United States)

Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

2011-03-01

The use of gesture as a natural interface plays a crucially important role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous work has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework, which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative, and positive meanings from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting the different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from the single modalities are fused at a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition and that decision-level fusion performs better than feature-level fusion.
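The feature-level fusion strategy (weighted concatenation of feature groups followed by an LDA projection) might be sketched as follows. The feature dimensions, the group weights, and the synthetic data are assumptions standing in for real face and hand features; in practice the weights would be tuned on validation data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Synthetic stand-ins for the two feature groups: 40 samples, 2 gesture
# classes, with class-dependent means so the classes are separable.
n = 40
y = np.repeat([0, 1], n // 2)
face_feats = rng.normal(loc=y[:, None] * 1.5, scale=1.0, size=(n, 8))
hand_feats = rng.normal(loc=y[:, None] * 0.5, scale=1.0, size=(n, 12))

# Early (feature-level) fusion: weight each group, then concatenate.
w_face, w_hand = 0.7, 0.3    # assumed weights, not from the paper
fused = np.hstack([w_face * face_feats, w_hand * hand_feats])

# LDA projects the fused vector onto a discriminative subspace
# (one dimension for two classes) and classifies in that space.
lda = LinearDiscriminantAnalysis(n_components=1)
z = lda.fit_transform(fused, y)
print("training accuracy:", lda.score(fused, y))
```

Decision-level fusion would instead classify each feature group separately and combine the weighted per-modality decisions afterwards.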

11. Integrating Human and Computer Intelligence. Technical Report No. 32.

Science.gov (United States)

Pea, Roy D.

This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

12. Developing Educational Computer Animation Based on Human Personality Types

Science.gov (United States)

Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

2015-01-01

Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered…

13. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

Science.gov (United States)

Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

2017-01-01

In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

14. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

Science.gov (United States)

Hatanaka, Iwao

2000-01-01

The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

15. Constructing a Computer Model of the Human Eye Based on Tissue Slice Images

OpenAIRE

Dai, Peishan; Wang, Boliang; Bao, Chunbo; Ju, Ying

2010-01-01

Computer simulation of the biomechanical and biological heat transfer in ophthalmology greatly relies on having a reliable computer model of the human eye. This paper proposes a novel method for constructing a geometric model of the human eye based on tissue slice images. Slice images were obtained from an in vitro Chinese human eye through an embryo specimen processing method. A level set algorithm was used to extract contour points of eye tissues while a principal component analysi...

16. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

International Nuclear Information System (INIS)

Anon.

1990-01-01

This book discusses the following topics: expert systems and knowledge engineering-I; verification and validation of software; methods for modeling human/computer performance; human/computer interaction problems in producing procedures-1-2; progress and problems with automation-1-2; experience with electronic presentation of procedures-2; intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids

17. Identification of Enhancers In Human: Advances In Computational Studies

KAUST Repository

Kleftogiannis, Dimitrios A.

2016-01-01

Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

18. Human face recognition using eigenface in cloud computing environment

Science.gov (United States)

Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

2018-02-01

Doing face recognition for one single face does not take long to process, but an attendance system or security system at a company with many faces to recognize will take a long time. Cloud computing is a computing service performed not on a local device but on an Internet-connected data center infrastructure. Cloud computing also provides a scalability solution, in that resources can be increased when larger data processing is needed. This research applies the eigenface method, with training data collected through a REST interface that provides the resources, after which the server processes the data according to the stages described. After the research and development of this application, it can be concluded that by implementing eigenface and applying the REST concept as an endpoint for giving or receiving the related information used as a resource, a model can be built to perform face recognition.
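The eigenface method itself reduces to PCA on centered face vectors plus nearest-neighbour matching in the projected space. A minimal sketch, with random arrays standing in for camera images (the cloud/REST layer from the abstract is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in "face images": 12 flattened 16x16 grayscale images. In a real
# system these come from the camera; here they are random data used only
# to show the eigenface pipeline.
faces = rng.random((12, 16 * 16))

# 1. Center the training faces on the mean face.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# 2. Eigenfaces are the top principal components of the centered data;
#    SVD avoids forming the large covariance matrix explicitly.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 6
eigenfaces = vt[:k]                      # each row is one eigenface

# 3. Represent every training face by its k projection coefficients.
weights = centered @ eigenfaces.T

def recognize(image):
    """Return the index of the nearest training face in eigenface space."""
    w = (image - mean_face) @ eigenfaces.T
    return int(np.argmin(np.linalg.norm(weights - w, axis=1)))

print(recognize(faces[3]))
```

In the cloud setting described above, the training images would be uploaded via the REST endpoint and steps 1-3 would run server-side, with only the query projection and lookup performed per request.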

19. Computational analysis of human miRNAs phylogenetics

African Journals Online (AJOL)

User

2011-05-02

May 2, 2011 ... [table excerpt: BLAST hits against human DNA, including clone AL138714 (RP11-121J7 on chromosome 13q32.1-32.3, containing the 3' end of a novel gene and the 5' end of the GPC5 gene for glypican 5)] ..... including human, chimpanzee, orangutan, and macaque, and find that miRNAs were ...

20. Activity-based computing: computational management of activities reflecting human intention

DEFF Research Database (Denmark)

Bardram, Jakob E; Jeuris, Steven; Houben, Steven

2015-01-01

paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

1. Advancements in Violin-Related Human-Computer Interaction

DEFF Research Database (Denmark)

Overholt, Daniel

2014-01-01

of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

2. Applying systemic-structural activity theory to design of human-computer interaction systems

CERN Document Server

Bedny, Gregory Z; Bedny, Inna

2015-01-01

Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

3. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

International Nuclear Information System (INIS)

Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

2011-01-01

As computer-based design features such as computer-based procedures (CBPs), soft controls (SCs), and integrated information systems are being adopted in the main control rooms (MCRs) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have drawn some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. First, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially CBPs and SCs. In the case of the computer-based procedure, as opposed to the paper-based procedure, structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Second, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs

4. Computational Modeling of Human Multiple-Task Performance

National Research Council Canada - National Science Library

Kieras, David E; Meyer, David

2005-01-01

This is the final report for a project that was a continuation of an earlier, long-term project on the development and validation of the EPIC cognitive architecture for modeling human cognition and performance...

5. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

Science.gov (United States)

1989-01-01


6. Supporting Human Activities - Exploring Activity-Centered Computing

DEFF Research Database (Denmark)

Christensen, Henrik Bærbak; Bardram, Jakob

2002-01-01

In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

7. Can human experts predict solubility better than computers?

Science.gov (United States)

Boobier, Samuel; Osbourn, Anne; Mitchell, John B O

2017-12-13

In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R² of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R² of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
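The consensus-by-median idea is easy to sketch. The numbers below are synthetic stand-ins for predictor outputs, not the study's data; they only illustrate why the per-compound median across predictors tends to beat individual predictors when errors are partly independent.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical log S values for 25 compounds, plus 10 noisy "predictors"
# (standing in for either human experts or ML models); all values invented.
true_logS = rng.uniform(-6, 0, 25)
predictors = (true_logS
              + rng.normal(0.0, 1.0, size=(10, 25))    # per-prediction noise
              + rng.normal(0.0, 0.3, size=(10, 1)))    # per-predictor bias

def rmse(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

individual = [rmse(p, true_logS) for p in predictors]

# Consensus predictor: per-compound median across all predictors.
consensus = np.median(predictors, axis=0)

print("best individual RMSE:", min(individual))
print("consensus RMSE:     ", rmse(consensus, true_logS))
```

The median is used rather than the mean because it is robust to the occasional wildly wrong prediction, which matters when pooling heterogeneous experts.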

8. Experimental evaluation of multimodal human computer interface for tactical audio applications

NARCIS (Netherlands)

Obrenovic, Z.; Starcevic, D.; Jovanov, E.; Oy, S.

2002-01-01

Mission critical and information overwhelming applications require careful design of the human computer interface. Typical applications include night vision or low visibility mission navigation, guidance through a hostile territory, and flight navigation and orientation. Additional channels of

9. Design Science in Human-Computer Interaction: A Model and Three Examples

Science.gov (United States)

Prestopnik, Nathan R.

2013-01-01

Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

10. Eyewear Computing - Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

OpenAIRE

Bulling, Andreas; Cakmakci, Ozan; Kunze, Kai; Rehg, James M.

2016-01-01

The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays. The seminar welcomed 30 academic and industry researchers from Europe, the US, and Asia with a diverse background, including wearable and ubiquitous computing, computer vision, developmental psychology, optics, and human-computer interaction. In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to...

11. The Human Genome Project: Biology, Computers, and Privacy.

Science.gov (United States)

Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

12. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

Directory of Open Access Journals (Sweden)

Leanne M. Hirshfield

2014-01-01

In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for measuring user experience during human-computer interactions.

13. Recent Advances in Computational Mechanics of the Human Knee Joint

Science.gov (United States)

Kazemi, M.; Dabiri, Y.; Li, L. P.

2013-01-01

Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

14. Computational simulation of chromosome breaks in human liver

International Nuclear Information System (INIS)

Yang Jianshe; Li Wenjian; Jin Xiaodong

2006-01-01

An easy method was established for computing chromosome breaks in cells exposed to heavy charged particles. The chromosome break value for cells irradiated with ¹²C⁶⁺ ions was calculated theoretically and tested against experimental data on chromosome breaks obtained using a premature chromosome condensation technique. The theoretical chromosome break value agreed well with the experimental data. The higher relative biological effectiveness of the heavy ions was closely correlated with their physical characteristics. In addition, the chromosome break value can be predicted offline. (authors)

15. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

DEFF Research Database (Denmark)

Sonnenwald, Diane H.

1988-01-01

A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

16. Distinguishing humans from computers in the game of go: A complex network approach

Science.gov (United States)

Coquidé, C.; Georgeot, B.; Giraud, O.

2017-08-01

We compare complex networks built from the game of go, obtained from databases of human-played games, with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.
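One way to realize such a comparison is to build a move-transition network from each corpus of game records and compare simple statistics, such as out-degree distributions, between the human and computer corpora. The sketch below is a hypothetical, minimal construction under assumed representations (moves as hashable labels); the paper's actual network definition and estimators are more elaborate.

```python
from collections import Counter

def transition_counts(games):
    """Count directed transitions between consecutive moves (or local
    board patterns) across a corpus of games; each game is a sequence
    of hashable labels."""
    counts = Counter()
    for game in games:
        counts.update(zip(game, game[1:]))
    return counts

def out_degree_distribution(counts):
    """Distribution of the number of distinct successors per node; a
    candidate statistic for discriminating human- from computer-played
    corpora (nodes with no outgoing edges are omitted)."""
    successors = {}
    for (a, b) in counts:
        successors.setdefault(a, set()).add(b)
    return Counter(len(s) for s in successors.values())
```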

17. MoCog1: A computer simulation of recognition-primed human decision making

Science.gov (United States)

Gevarter, William B.

1991-01-01

The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

18. Computer modelling of HT gas metabolism in humans

International Nuclear Information System (INIS)

Peterman, B.F.

1982-01-01

A mathematical model was developed to simulate the metabolism of HT gas in humans. The rate constants of the model were estimated by fitting the calculated curves to the experimental data by Pinson and Langham in 1957. The calculations suggest that the oxidation of HT gas (which probably occurs as a result of the enzymatic action of hydrogenase present in bacteria of human gut) occurs at a relatively low rate with a half-time of 10-12 hours. The inclusion of the dose due to the production of the HT oxidation product (HTO) in the soft tissues lowers the value of derived air concentration by about 50%. Furthermore the relationship between the concentration of HTO in urine and the dose to the lung from HT in the air in lungs is linear after short HT exposures, and hence HTO concentrations in urine can be used to estimate the upper limits on the lung dose from HT exposures. (author)
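The first-order oxidation described here can be written as a simple exponential: with half-time T½, the HT remaining after t hours is A(t) = A₀·2^(−t/T½). A minimal sketch, using an assumed mid-range half-time of 11 h (the paper reports 10-12 h) and ignoring subsequent HTO elimination, which the full model also tracks:

```python
import math

def ht_remaining(a0, t_hours, half_time_hours=11.0):
    """Amount of HT left after first-order oxidation for t hours.
    The default half-time is an assumed mid-range value."""
    rate = math.log(2) / half_time_hours
    return a0 * math.exp(-rate * t_hours)

def hto_produced(a0, t_hours, half_time_hours=11.0):
    """HTO generated so far by oxidation of the initial HT burden
    (no HTO clearance modeled in this sketch)."""
    return a0 - ht_remaining(a0, t_hours, half_time_hours)
```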

19. Measuring Human Performance within Computer Security Incident Response Teams

Energy Technology Data Exchange (ETDEWEB)

McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

2015-09-01

Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

20. Computer simulation of mucosal waves on vibrating human vocal folds

Czech Academy of Sciences Publication Activity Database

Vampola, T.; Horáček, Jaromír; Klepáček, I.

2016-01-01

Roč. 36, č. 3 (2016), s. 451-465. ISSN 0208-5216. R&D Projects: GA ČR GA16-01246S; GA ČR(CZ) GAP101/12/1306. Institutional support: RVO:61388998. Keywords: biomechanics of human voice; 3D FE model of human larynx; finite element method; proper orthogonal decomposition analysis. Subject RIV: BI - Acoustics. Impact factor: 1.031, year: 2016. http://ac.els-cdn.com/S0208521616300298/1-s2.0-S0208521616300298-main.pdf

1. Transnational HCI: Humans, Computers and Interactions in Global Contexts

DEFF Research Database (Denmark)

Vertesi, Janet; Lindtner, Silvia; Shklovski, Irina

2011-01-01

…, but as evolving in relation to global processes, boundary crossings, frictions and hybrid practices. In doing so, we expand upon existing research in HCI to consider the effects, implications for individuals and communities, and design opportunities in times of increased transnational interactions. We hope to broaden the conversation around the impact of technology in global processes by bringing together scholars from HCI and from related humanities, media arts and social sciences disciplines.

2. Computational Human Performance Modeling For Alarm System Design

Energy Technology Data Exchange (ETDEWEB)

Jacques Hugo

2012-07-01

3. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

NARCIS (Netherlands)

Nikkilä, J.; Vos, de W.M.

2010-01-01

GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

4. The Socioemotional Effects of a Computer-Simulated Animal on Children's Empathy and Humane Attitudes

Science.gov (United States)

Tsai, Yueh-Feng Lily; Kaufman, David M.

2009-01-01

This study investigated the potential of using a computer-simulated animal in a handheld virtual pet videogame to improve children's empathy and humane attitudes. Also investigated was whether sex differences existed in children's development of empathy and humane attitudes resulting from play, as well as their feelings for a virtual pet. The…

5. Operational characteristics optimization of human-computer system

Directory of Open Access Journals (Sweden)

Zulquernain Mallick

2010-09-01

Computer operational parameters have a vital influence on operators' efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. The text reading task, in the form of English text, was presented on the computer screen to the participating subjects and their performance, measured in terms of number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters that maximize operators' efficiency in the readability task. Two levels of each parameter were considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed to investigate the operators' performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40-degree viewing angle and a 60 cm viewing distance, the subjects were most comfortable and efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% toward the stated objective, followed by font type at 18.17%, viewing distance at 7.04% and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
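For a maximization objective such as NWRPM, the Taguchi "larger-the-better" signal-to-noise ratio has a standard closed form. A minimal sketch of that formula follows; the study's specific orthogonal array and ANOVA steps are not reproduced here.

```python
import math

def sn_larger_is_better(replicates):
    """Taguchi larger-the-better S/N ratio in dB:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) ).
    Higher S/N means higher (and more consistent) readings."""
    n = len(replicates)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in replicates) / n)
```

In a Taguchi analysis, the mean S/N across all runs containing a given factor level is computed for each level, and the level with the highest mean S/N is selected; this is how settings such as black-on-white text emerge as optimal.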

6. Computational modeling of human oral bioavailability: what will be next?

Science.gov (United States)

Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

2018-06-01

The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability has opened new avenues for developing promising tools for oral bioavailability prediction.

7. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

International Nuclear Information System (INIS)

Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

1997-01-01

Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology, using blinded peer review, to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine whether the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice.

8. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

Science.gov (United States)

Krajíček, Jiří

This paper presents cross-disciplinary research that connects medical/psychological evidence on human abilities with the need in informatics to update current models in computer science to support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we continue the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle relative to present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) model as its basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence and Sheldrake's theory of "Nature as Alive" [22].

9. The Study on Human-Computer Interaction Design Based on the Users’ Subconscious Behavior

Science.gov (United States)

Li, Lingyuan

2017-09-01

Human-computer interaction is human-centered. An excellent interaction design should focus on the study of user experience, which largely comes from the consistency between the design and human behavioral habits. However, users’ behavioral habits often result from subconsciousness. Therefore, it is smart to utilize users’ subconscious behavior to achieve the design’s intention and maximize the value of a product’s functions, which is gradually becoming a new trend in this field.

10. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

OpenAIRE

Lopes, Arminda Guerra

2016-01-01

Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, ...

11. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

International Nuclear Information System (INIS)

Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

2017-01-01

Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick convergence search method is proposed. • An affinity error probability mapping function is proposed to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for a digital human-computer interface of a nuclear power plant (NPP) from three different viewpoints based on human reliability. The purpose of this series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. During monitoring, the quantity of components places a heavy burden on operators, so human errors are easily triggered. To solve this problem, the authors propose an optimization process, a quick convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick convergence search method for fuzzy segments divided by component quantity outperforms the general algorithm.

12. Computational Modelling of the Human Islet Amyloid Polypeptide

DEFF Research Database (Denmark)

Skeby, Katrine Kirkeby

2014-01-01

…to interpret results correctly. Computational studies, and molecular dynamics (MD) simulations in particular, have become important tools in the effort to understand biological mechanisms. The strength of these methods is the high resolution in time and space, and the ability to specifically design the system. … Using MD simulations we have investigated the binding of 13 different imaging agents to a fibril segment. Using clustering analysis and binding energy calculations we have identified a common binding mode for the 13 agents in the surface grooves of the fibril, which are present on all amyloid fibrils. This information, combined with specific knowledge about the AD amyloid fibril, is the building block for the design of highly specific amyloid imaging agents. We have also used MD simulations to study the interaction between hIAPP and a phospholipid membrane. At neutral pH, we find that the attraction is mainly…

13. Computing Stability Effects of Mutations in Human Superoxide Dismutase 1

DEFF Research Database (Denmark)

Kepp, Kasper Planeta

2014-01-01

Protein stability is affected in several diseases and is of substantial interest in efforts to correlate genotypes to phenotypes. Superoxide dismutase 1 (SOD1) is a suitable test case for such correlations due to its abundance, stability, available crystal structures and thermochemical data, and physiological importance. In this work, stability changes of SOD1 mutations were computed with five methods, CUPSAT, I-Mutant2.0, I-Mutant3.0, PoPMuSiC, and SDM, with emphasis on structural sensitivity as a potential issue in structure-based protein calculation. The large correlation between experimental literature data of SOD1 dimers and monomers (r = 0.82) suggests that mutations in separate protein monomers are mostly additive. PoPMuSiC was most accurate (typical MAE ∼ 1 kcal/mol, r ∼ 0.5). The relative performance of the methods was not very structure-dependent, and the more accurate methods also…

Science.gov (United States)

Bhardwaj, Rajneesh; Ziegler, Kimberly; Seo, Jung Hee; Ramesh, K T; Nguyen, Thao D

2014-01-01

Ocular injuries from blast have increased in recent wars, but the injury mechanism associated with the primary blast wave is unknown. We employ a three-dimensional fluid-structure interaction computational model to understand the stresses and deformations incurred by the globe due to blast overpressure. Our numerical results demonstrate that blast wave reflections off the facial features around the eye increase the pressure loading on and around the eye. The blast wave produces asymmetric loading on the eye, which causes globe distortion. The deformation response of the globe under blast loading was evaluated, and regions of high stresses and strains inside the globe were identified. Our numerical results show that the blast loading results in globe distortion and large deviatoric stresses in the sclera. These large deviatoric stresses may be an indicator of the risk of interfacial failure between the tissues of the sclera and the orbit.

15. Human anatomy nomenclature rules for the computer age.

Science.gov (United States)

Neumann, Paul E; Baud, Robert; Sprumont, Pierre

2017-04-01

Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, is as short and simple as possible, and is machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.

16. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

CERN Document Server

Tan, Desney S

2010-01-01

For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

17. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

NARCIS (Netherlands)

Tan, Desney S.; Nijholt, Antinus

2010-01-01

For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

CERN Document Server

Raux, Antoine; Lane, Ian; Misu, Teruhisa

2016-01-01

This book provides a survey of the state-of-the-art in the practical implementation of Spoken Dialog Systems for applications in everyday settings. It includes contributions on key topics in situated dialog interaction from a number of leading researchers and offers a broad spectrum of perspectives on research and development in the area. In particular, it presents applications in robotics, knowledge access and communication, and covers the following topics: dialog for interacting with robots; language understanding and generation; dialog architectures and modeling; core technologies; and the analysis of human discourse and interaction. The contributions are adapted and expanded versions of papers presented at the 2014 International Workshop on Spoken Dialog Systems (IWSDS 2014), where researchers and developers from industry and academia alike met to discuss and compare their implementation experiences, analyses and empirical findings.

19. Computational model of soft tissues in the human upper airway.

Science.gov (United States)

Pelteret, J-P V; Reddy, B D

2012-01-01

This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.

20. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

Science.gov (United States)

Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

2013-01-01

The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of the employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

1. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

Directory of Open Access Journals (Sweden)

Antonios Liapis

2016-07-01

Full Text Available This article discusses the impact of artificially intelligent computers on the process of design, play and educational activities. A computational process which has the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus which triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrain the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimal one. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.

2. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

CERN Document Server

Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

2013-01-01

The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic, from August 29 to August 31, 2011. This conference was the third in the series, following IHCI 2009 and IHCI 2010, held in January at IIIT Allahabad, India. Human-computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion about the presented topics. It can be useful for researchers working on various aspects of human-computer interaction. Topics covered in this book include user interface and interaction, the theoretical background and applications of HCI, and also data mining and knowledge discovery as a support of HCI applications.

3. Treatment of human-computer interface in a decision support system

International Nuclear Information System (INIS)

Heger, A.S.; Duran, F.A.; Cox, R.G.

1992-01-01

One of the most challenging applications facing the computer community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. The application of this discipline to environmental restoration and waste management is further complicated by the nature of environmental data. The information that is required to manage the environmental impacts of human activity is fundamentally complex. This paper discusses the efforts at Sandia National Laboratories in developing an adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System is also discussed.

4. Investigation and evaluation into the usability of human-computer interfaces using a typical CAD system

Energy Technology Data Exchange (ETDEWEB)

Rickett, J D

1987-01-01

This research program covers three topics relating to the human-computer interface: voice recognition, tools and techniques for evaluation, and user and interface modeling. An investigation into the implementation of voice-recognition technologies examines how voice recognizers may be evaluated in commercial software. A prototype system was developed with the collaboration of FEMVIEW Ltd. (marketing a CAD package). A theoretical approach to evaluation leads to the hypothesis that human-computer interaction is affected by personality, influencing types of dialogue, preferred methods for providing help, etc. A user model based on personality traits, or habitual-behavior patterns (HBP), is presented. Finally, a practical framework is provided for the evaluation of human-computer interfaces. It suggests that evaluation is an integral part of design and that the iterative use of evaluation techniques throughout the conceptualization, design, implementation and post-implementation stages will ensure systems that satisfy the needs of the users and fulfill the goal of usability.

5. Distribution of absorbed dose in human eye simulated by SRNA-2KG computer code

International Nuclear Information System (INIS)

Ilic, R.; Pesic, M.; Pavlovic, R.; Mostacci, D.

2003-01-01

The rapidly increasing performance of personal computers and the development of codes for proton transport based on Monte Carlo methods will soon allow the introduction of computer-planned proton therapy as a normal activity in regular hospital procedures. A description of the SRNA code used for such applications and results of calculated distributions of proton-absorbed dose in the human eye are given in this paper. (author)
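
The record above describes Monte Carlo proton dose calculation only at a high level. As a loose, purely hypothetical illustration of the principle, here is a toy 1-D depth-dose tally with made-up numbers and distributions; nothing below is taken from the SRNA code itself:

```python
import random

def simulate_dose(n_protons=10000, depth_bins=10, seed=42):
    """Toy 1-D Monte Carlo dose tally.

    Each simulated proton is assigned a random stopping depth and deposits
    energy in every bin it traverses, more heavily near its stopping point
    (crudely mimicking a Bragg-peak-like profile). Real transport codes
    such as SRNA instead track physical interactions event by event.
    """
    random.seed(seed)
    dose = [0.0] * depth_bins
    for _ in range(n_protons):
        # hypothetical stopping depth: triangular distribution peaking at 80% range
        stop = random.triangular(0, depth_bins, depth_bins * 0.8)
        for d in range(min(int(stop) + 1, depth_bins)):
            # deposit more energy the closer the bin is to the stopping point
            dose[d] += 1.0 + 4.0 * (d / stop if stop > 0 else 0)
    return dose

dose = simulate_dose()
```

The tally converges as `n_protons` grows, which is the same statistical mechanism that makes faster personal computers decisive for clinical planning times.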

6. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

CERN Document Server

Sears, Andrew

2007-01-01

This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

7. Digital image processing and analysis human and computer vision applications with CVIPtools

CERN Document Server

Umbaugh, Scott E

2010-01-01

Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Reading)

8. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

Science.gov (United States)

2016-09-06

Preece, Alun (Cardiff University, email: PreeceAD@cardiff.ac.uk); William...; Emerging Technology Services, IBM United Kingdom Ltd, Hursley Park, Winchester, UK; US Army Research Laboratory. Excerpt: "...conversational agent with information exchange disabled until the end of the experiment run. The meaning of the indicator in the top-right of the agent..."

9. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

DEFF Research Database (Denmark)

Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

2003-01-01

A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around...... of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments....

10. Developing Human-Computer Interface Models and Representation Techniques (Dialogue Management as an Integral Part of Software Engineering)

OpenAIRE

Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

1987-01-01

The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

11. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

Science.gov (United States)

Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

2000-10-01

In the 1980s, the visual display terminal (VDT) was introduced into workplaces in many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat-panel display or notebook personal computer (PC) became the most remarkable feature in modern workplaces with VDTs and even in homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems arising from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs in terms of machine layout, workstation design and lighting conditions, among others, should differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee produced a set of guidelines for notebook PC use following exploratory discussions of its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction under the auspices of the International Ergonomics Association worked towards the international issuance of the guidelines. This paper unveils the result of this collaborative effort.

12. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

Science.gov (United States)

Nehm, Ross H.; Haertig, Hendrik

2012-01-01

Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

13. A hybrid approach to the computational aeroacoustics of human voice production

Czech Academy of Sciences Publication Activity Database

Šidlof, Petr; Zörner, S.; Huppe, A.

2015-01-01

Roč. 14, č. 3 (2015), s. 473-488 ISSN 1617-7959 R&D Projects: GA ČR(CZ) GAP101/11/0207 Institutional support: RVO:61388998 Keywords : computational aeroacoustics * parallel CFD * human voice * vocal folds * ventricular folds Subject RIV: BI - Acoustics Impact factor: 3.032, year: 2015

14. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

NARCIS (Netherlands)

Shen, Jie; Wenzhe, Shi; Pantic, Maja

In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

15. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

NARCIS (Netherlands)

Shen, Jie; Pantic, Maja

2013-01-01

This paper presents a novel software framework for development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI^2 Framework, is built upon a publish/subscribe (P/S) architecture. It implements a
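
Both HCI^2 records name publish/subscribe (P/S) as the core architecture. A minimal sketch of that pattern (class, topic and message names are hypothetical illustrations, not the actual HCI^2 API):

```python
from collections import defaultdict

class Broker:
    """Minimal publish/subscribe broker: modules exchange messages by topic,
    never referencing each other directly."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # register a callback to receive every future message on this topic
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # deliver the message to every subscriber of the topic
        for callback in self._subscribers[topic]:
            callback(message)

# e.g. a fusion module subscribing to two input modalities
broker = Broker()
events = []
broker.subscribe("face", lambda m: events.append(("face", m)))
broker.subscribe("speech", lambda m: events.append(("speech", m)))
broker.publish("face", {"expression": "smile"})
broker.publish("speech", {"text": "hello"})
```

The decoupling is the point: adding a new input modality is just another `subscribe` call, with no change to existing modules, which is why P/S suits multimodal systems assembled from independent components.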

16. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

Science.gov (United States)

The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

17. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

Science.gov (United States)

Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

2015-01-01

This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

18. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

Science.gov (United States)

2010-01-01

The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

19. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

Science.gov (United States)

Sato, Naoyuki; Yamaguchi, Yoko

Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

20. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

NARCIS (Netherlands)

Snijders, C.C.P.

2007-01-01

Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

1. Human brain as the model of a new computer system. II

Energy Technology Data Exchange (ETDEWEB)

Holtz, K; Langheld, E

1981-12-09

For Pt. I see ibid., Vol. 29, No. 22, p. 13 (1981). The authors describe the self-generating system of connections of a self-teaching, program-free associative computer. The self-generating systems of connections are regarded as simulation models of the human brain and compared with the brain structure. The system hardware comprises a microprocessor, PROM, memory, VDU and keyboard unit.

2. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

International Nuclear Information System (INIS)

Hara, Fumio

1988-01-01

This paper remarks on the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of plant operation to operators through cartoon-like, colored graphs in the form of faces whose different facial expressions show the plant safety status. (orig.)

3. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

Science.gov (United States)

Gevarter, William B.

1992-01-01

The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

4. Heuristic and optimal policy computations in the human brain during sequential decision-making.

Science.gov (United States)

Korn, Christoph W; Bach, Dominik R

2018-01-23

Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging, in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. fMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
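
As a sketch of the kind of computation the abstract contrasts (all task parameters below are invented, not those of the study): the optimal policy is obtained by backward induction over energy states across the five-trial horizon, while an expected-value heuristic looks only one step ahead.

```python
from functools import lru_cache

# Hypothetical foraging block: 5 trials, starting energy 3.
# The safe option loses 1 unit for certain; the risky option loses 0 or 3
# with equal probability (a worse expected value, 1.5). Survival means
# keeping energy above zero until the block ends.
SAFE_LOSS = 1
RISKY_LOSSES = (0, 3)
TRIALS, START = 5, 3

@lru_cache(maxsize=None)
def optimal(energy, trials_left):
    """Survival probability under the optimal policy (backward induction)."""
    if energy <= 0:
        return 0.0              # starved
    if trials_left == 0:
        return 1.0              # survived the block
    p_safe = optimal(energy - SAFE_LOSS, trials_left - 1)
    p_risky = sum(0.5 * optimal(energy - loss, trials_left - 1)
                  for loss in RISKY_LOSSES)
    return max(p_safe, p_risky)

def heuristic(energy, trials_left):
    """One-step expected-value heuristic: always take the option with the
    smaller expected loss, which here is always the safe option."""
    if energy <= 0:
        return 0.0
    if trials_left == 0:
        return 1.0
    return heuristic(energy - SAFE_LOSS, trials_left - 1)

p_opt = optimal(START, TRIALS)
p_heur = heuristic(START, TRIALS)
```

In this toy setting the heuristic starves with certainty (3 units of energy cannot cover 5 guaranteed losses), while the optimal policy gambles and keeps a positive survival probability: a simple case of the heuristic-optimal discrepancy the study manipulates.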

5. Computer-based personality judgments are more accurate than those made by humans

Science.gov (United States)

Youyou, Wu; Kosinski, Michal; Stillwell, David

2015-01-01

Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

6. Computer-based personality judgments are more accurate than those made by humans.

Science.gov (United States)

Youyou, Wu; Kosinski, Michal; Stillwell, David

2015-01-27

Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.
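
The headline result is a pair of Pearson correlations with self-reported scores (r = 0.56 for the computer vs. r = 0.49 for friends). A minimal sketch of that comparison with invented data; the study itself derived the computer judgments from Facebook Likes via regression models, which is not reproduced here:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical self-reported trait scores and two judges' estimates
self_report = [3.1, 4.2, 2.8, 3.9, 4.5, 2.2]
computer    = [3.0, 4.0, 3.0, 3.8, 4.4, 2.5]   # tracks self-report closely
friend      = [3.5, 3.6, 3.4, 3.0, 4.0, 3.2]   # noisier judgments

r_computer = pearson_r(self_report, computer)
r_friend = pearson_r(self_report, friend)
```

Accuracy here is simply which judge's scores covary more strongly with the self-reports; the study applied the same logic per trait, averaged across the Big Five.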

7. Development and evaluation of a computer-aided system for analyzing human error in railway operations

International Nuclear Information System (INIS)

Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

2010-01-01

As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts in finding multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.
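
Finding "multiple levels of error causes" through "predefined links" amounts to traversing a cause graph. A hypothetical sketch of that idea (the factor names and links are invented for illustration, not CAS-HEAR's actual taxonomy):

```python
# hypothetical predefined links from an error/factor to its candidate causes
links = {
    "signal passed at danger": ["attention lapse", "misread signal"],
    "attention lapse": ["fatigue", "monotonous task"],
    "fatigue": ["shift scheduling"],
}

def causal_chains(factor, links, chain=None):
    """Enumerate multi-level cause chains by depth-first traversal of the
    predefined links, stopping when a factor has no further causes."""
    chain = (chain or []) + [factor]
    causes = links.get(factor, [])
    if not causes:
        return [chain]
    result = []
    for cause in causes:
        result.extend(causal_chains(cause, links, chain))
    return result

chains = causal_chains("signal passed at danger", links)
```

Each returned chain is one candidate explanation path from the observed error down to a root contextual factor, which is the structure an analyst would review and prune.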

8. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

Science.gov (United States)

Taboureau, Olivier; Audouze, Karine

2017-01-01

During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure rarely have been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.
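
The disease-disease associations described above can be derived from chemical-disease links by counting shared chemical associations, with the count acting as a crude link score. A toy sketch with invented associations (not data drawn from the TDDB source):

```python
from itertools import combinations
from collections import defaultdict

# hypothetical chemical -> associated-disease sets (illustrative toy data)
chemical_disease = {
    "bisphenol A": {"behavioral disorders", "type 2 diabetes"},
    "chlordane": {"olfactory alteration", "type 2 diabetes"},
    "arsenic": {"type 2 diabetes", "skin lesions"},
}

def disease_links(chem2dis):
    """Score disease-disease links by the number of chemicals whose
    association sets contain both diseases."""
    scores = defaultdict(int)
    for diseases in chem2dis.values():
        # every pair of diseases linked to the same chemical gains one count
        for d1, d2 in combinations(sorted(diseases), 2):
            scores[(d1, d2)] += 1
    return dict(scores)

links = disease_links(chemical_disease)
```

The full EDN model additionally weights each edge by the level of evidence of the underlying toxicant-disease relationships, rather than treating all associations equally as this sketch does.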

9. Cognitive engineering in the design of human-computer interaction and expert systems

International Nuclear Information System (INIS)

Salvendy, G.

1987-01-01

The 68 papers contributing to this book cover the following areas: Theories of Interface Design; Methodologies of Interface Design; Applications of Interface Design; Software Design; Human Factors in Speech Technology and Telecommunications; Design of Graphic Dialogues; Knowledge Acquisition for Knowledge-Based Systems; Design, Evaluation and Use of Expert Systems. This demonstrates the dual role of cognitive engineering. On the one hand, cognitive engineering is utilized to design computing systems which are compatible with human cognition and can be effectively and easily utilized by all individuals. On the other hand, cognitive engineering is utilized to transfer human cognition into the computer for the purpose of building expert systems. Two papers are of interest to INIS

10. Human factors with nonhumans - Factors that affect computer-task performance

Science.gov (United States)

Washburn, David A.

1992-01-01

There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

11. The role of beliefs in lexical alignment: evidence from dialogs with humans and computers.

Science.gov (United States)

Branigan, Holly P; Pickering, Martin J; Pearson, Jamie; McLean, Janet F; Brown, Ash

2011-10-01

Five experiments examined the extent to which speakers' alignment (i.e., convergence) on words in dialog is mediated by beliefs about their interlocutor. To do this, we told participants that they were interacting with another person or a computer in a task in which they alternated between selecting pictures that matched their 'partner's' descriptions and naming pictures themselves (though in reality all responses were scripted). In both text- and speech-based dialog, participants tended to repeat their partner's choice of referring expression. However, they showed a stronger tendency to align with 'computer' than with 'human' partners, and with computers that were presented as less capable than with computers that were presented as more capable. The tendency to align therefore appears to be mediated by beliefs, with the relevant beliefs relating to an interlocutor's perceived communicative capacity. Copyright © 2011 Elsevier B.V. All rights reserved.

12. A conceptual and computational model of moral decision making in human and artificial agents.

Science.gov (United States)

Wallach, Wendell; Franklin, Stan; Allen, Colin

2010-07-01

Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

13. Human Computer Confluence in Rehabilitation: Digital Media Plasticity and Human Performance Plasticity

DEFF Research Database (Denmark)

Brooks, Anthony Lewis

2013-01-01

Digital media plasticity evocative to embodied interaction is presented as a utilitarian tool when mixed and matched to target human performance potentials specific to nuance of development for those with impairment. A distinct intervention strategy trains via alternative channeling of external s...

14. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

International Nuclear Information System (INIS)

Aristovich, K Y; Khan, S H

2010-01-01

Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI). The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used in a wide range of methods of analysis, such as the finite element method (FEM), the Boundary Element Method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

15. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general.

Science.gov (United States)

Zander, Thorsten O; Kothe, Christian

2011-04-01

Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.

16. Design of a computer-aided system for analyzing human error in Indonesian railways [Perancangan Computer Aided System dalam Menganalisa Human Error di Perkeretaapian Indonesia]

Directory of Open Access Journals (Sweden)

Wiwik Budiawan

2013-06-01

the occurrence of a train crash in Indonesia. However, it is not clear how this analysis technique is done. Studies of human error made by the National Transportation Safety Committee (NTSC) are still relatively limited and are not equipped with a systematic method. Several methods have been developed to date, but few have been developed for railway transportation. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was developed and adapted to the Indonesian railway system. To improve the reliability of human error analysis, HFACS was then developed into a web-based application that can be accessed on a computer or smartphone. The results could be used by the NTSC as a railway accident analysis method, particularly for accidents associated with human error. Keywords: human error, HFACS, CAS, railways

17. The data base management system alternative for computing in the human services.

Science.gov (United States)

Sircar, S; Schkade, L L; Schoech, D

1983-01-01

The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.

18. Cross-cultural human-computer interaction and user experience design a semiotic perspective

CERN Document Server

Brejcha, Jan

2015-01-01

This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

19. Human-computer interfaces applied to numerical solution of the Plateau problem

Science.gov (United States)

Elias Fabris, Antonio; Soares Bandeira, Ivana; Ramos Batista, Valério

2015-09-01

In this work we present Matlab code to solve the Plateau problem numerically, and the code includes a human-computer interface. The Plateau problem has applications in areas of knowledge such as computer graphics. The solution method is the same as that of the Surface Evolver, but the difference is a complete graphical interface with the user. This will enable us to implement other kinds of interfaces, such as an ocular mouse, voice, touch, etc. To date, Evolver does not include any graphical interface, which restricts its use by the scientific community. In particular, its use is practically impossible for most physically challenged people.

20. Is the corticomedullary index valid to distinguish human from nonhuman bones: a multislice computed tomography study.

Science.gov (United States)

Rérolle, Camille; Saint-Martin, Pauline; Dedouit, Fabrice; Rousseau, Hervé; Telmon, Norbert

2013-09-10

The first step in the identification process of bone remains is to determine whether they are of human or nonhuman origin. This issue may arise when only a fragment of bone is available, as the species of origin is usually easily determined on a complete bone. The present study aims to assess the validity of a morphometric method used by French forensic anthropologists to determine the species of origin: the corticomedullary index (CMI), defined as the ratio of the diameter of the medullary cavity to the total diameter of the bone. We studied the constancy of the CMI from measurements made on computed tomography (CT) images of different human bones, and compared our measurements with reference values selected from the literature. The measurements obtained on CT scans at three different sites of 30 human femurs, 24 tibias, and 24 fibulas were compared among themselves and with the CMI reference values for humans, pigs, dogs, and sheep. Our results differed significantly from these reference values, with three exceptions: the proximal quarter of the femur and mid-fibular measurements for the human CMI, and the proximal quarter of the tibia for the sheep CMI. Mid-tibial, mid-femoral, and mid-fibular measurements also differed significantly among themselves. Only 22.6% of CT scans of human bones were correctly identified as human. We concluded that the CMI is not an effective method for determining the human origin of bone remains. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
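The CMI itself is simple arithmetic: the medullary-cavity diameter divided by the total bone diameter at the same measurement site, then compared against per-species reference ranges. A minimal sketch of that comparison logic (the reference ranges below are invented placeholders, not the literature values the study tested):

```python
def corticomedullary_index(medullary_diameter_mm: float, total_diameter_mm: float) -> float:
    """CMI = medullary cavity diameter / total bone diameter, measured at the same site."""
    if not 0.0 < medullary_diameter_mm < total_diameter_mm:
        raise ValueError("medullary diameter must be positive and smaller than total diameter")
    return medullary_diameter_mm / total_diameter_mm

# Hypothetical per-species reference ranges (illustrative only; the study's point
# is precisely that such ranges classified only 22.6% of human bones correctly).
REFERENCE_RANGES = {
    "human": (0.40, 0.55),
    "sheep": (0.25, 0.35),
}

def candidate_species(cmi: float) -> list[str]:
    """Species whose reference range contains the measured CMI."""
    return [s for s, (lo, hi) in REFERENCE_RANGES.items() if lo <= cmi <= hi]

cmi = corticomedullary_index(15.0, 30.0)  # e.g. a mid-femoral measurement, in mm
```

A CMI of 0.5 would fall only in the hypothetical human range here; the study's finding is that on real CT data this kind of range test is unreliable.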

1. Human Factors Principles in Design of Computer-Mediated Visualization for Robot Missions

Energy Technology Data Exchange (ETDEWEB)

David I Gertman; David J Bruemmer

2008-12-01

With increased use of robots as a resource in missions supporting countermine, improvised explosive devices (IEDs), and chemical, biological, radiological nuclear and conventional explosives (CBRNE), fully understanding the best means by which to complement the human operator’s underlying perceptual and cognitive processes could not be more important. Consistent with control and display integration practices in many other high technology computer-supported applications, current robotic design practices rely highly upon static guidelines and design heuristics that reflect the expertise and experience of the individual designer. In order to use what we know about human factors (HF) to drive human robot interaction (HRI) design, this paper reviews underlying human perception and cognition principles and shows how they were applied to a threat detection domain.

2. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation.

Directory of Open Access Journals (Sweden)

Jiang Shu

Full Text Available MicroRNAs have long been considered to be synthesized endogenously, until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported into human blood and tissues has created a high volume of interest in the fundamental questions of which and how exogenous microRNAs can be transferred into human circulation and possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, 117 of which have sequences identical to their homologs in human and 73 of which are known to be associated with exosomes. Through a milk feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details.

3. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation

Science.gov (United States)

Shu, Jiang; Chiang, Kevin; Zempleni, Janos; Cui, Juan

2015-01-01

MicroRNAs have long been considered to be synthesized endogenously, until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported into human blood and tissues has created a high volume of interest in the fundamental questions of which and how exogenous microRNAs can be transferred into human circulation and possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, 117 of which have sequences identical to their homologs in human and 73 of which are known to be associated with exosomes. Through a milk feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details. PMID:26528912

4. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

Science.gov (United States)

Yip, George W; Rajendran, Kanagasuntheram

2008-06-01

Computer-aided instruction materials are becoming increasingly popular in medical education, particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing a three-dimensional visualization of human structure that is essential to applications in clinical practice and to the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

5. Engageability: a new sub-principle of the learnability principle in human-computer interaction

Directory of Open Access Journals (Sweden)

B Chimbo

2011-12-01

Full Text Available The learnability principle relates to improving the usability of software, as well as users' performance and productivity. A gap has been identified, as the current definition of the principle does not distinguish between users of different ages. To determine the extent of the gap, this article compares the ways in which two user groups, adults and children, learn how to use an unfamiliar software application. In doing this, we bring together the research areas of human-computer interaction (HCI), adult and child learning, learning theories and strategies, usability evaluation, and interaction design. A literature survey conducted on learnability and learning processes considered the meaning of learnability of software applications across generations. In an empirical investigation, users aged from 9 to 12 and from 35 to 50 were observed in a usability laboratory while learning to use educational software applications. Insights that emerged from data analysis showed different tactics and approaches that children and adults use when learning unfamiliar software. Eye tracking data was also recorded. Findings indicated that a subtle re-interpretation of the learnability principle and its associated sub-principles was required. An additional sub-principle, namely engageability, was proposed to incorporate aspects of learnability that are not covered by the existing sub-principles. Our re-interpretation of the learnability principle and the resulting design recommendations should help designers to fulfil the varying needs of different-aged users and improve the learnability of their designs. Keywords: Child computer interaction, Design principles, Eye tracking, Generational differences, Human-computer interaction, Learning theories, Learnability, Engageability, Software applications, Usability Disciplines: Human-Computer Interaction (HCI) Studies, Computer science, Observational Studies

6. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

Directory of Open Access Journals (Sweden)

Tatyana G Karabencheva-Christova

Full Text Available Circular dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with the main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (molecular dynamics, semiempirical quantum mechanics, time-dependent density functional theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and to understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures, and molecular dynamics (MD) could help in achieving better agreement between the computed and experimental protein spectra and provide unique insight into the dynamic nature of the mechanisms of chromophore interactions.

7. Sustaining Economic Exploitation of Complex Ecosystems in Computational Models of Coupled Human-Natural Networks

OpenAIRE

Martinez, Neo D.; Tonin, Perrine; Bauer, Barbara; Rael, Rosalyn C.; Singh, Rahul; Yoon, Sangyuk; Yoon, Ilmi; Dunne, Jennifer A.

2012-01-01

Understanding ecological complexity has stymied scientists for decades. Recent elucidation of the famously coined "devious strategies for stability in enduring natural systems" has opened up a new field of computational analyses of complex ecological networks where the nonlinear dynamics of many interacting species can be more realistically modeled and understood. Here, we describe the first extension of this field to include coupled human-natural systems. This extension elucidates new strat...

8. Computer-assisted image analysis assay of human neutrophil chemotaxis in vitro

DEFF Research Database (Denmark)

Jensen, P; Kharazmi, A

1991-01-01

We have developed a computer-based image analysis system to measure in-filter migration of human neutrophils in the Boyden chamber. This method is compared with the conventional manual counting techniques. Neutrophils from healthy individuals and from patients with reduced chemotactic activity were… Another advantage of the assay is that it can be used to show the migration pattern of different populations of neutrophils from both healthy individuals and patients.

9. An experimental and computational framework to build a dynamic protein atlas of human cell division

OpenAIRE

Kavur, Marina; Ellenberg, Jan; Peters, Jan-Michael; Ladurner, Rene; Martinic, Marina; Kueblbeck, Moritz; Nijmeijer, Bianca; Wachsmuth, Malte; Koch, Birgit; Walther, Nike; Politi, Antonio; Heriche, Jean-Karim; Hossain, M.

2017-01-01

Essential biological functions of human cells, such as division, require the tight coordination of the activity of hundreds of proteins in space and time. While live cell imaging is a powerful tool to study the distribution and dynamics of individual proteins after fluorescence tagging, it has not yet been used to map protein networks due to the lack of systematic and quantitative experimental and computational approaches. Using the cell and nuclear boundaries as landmarks, we generated a 4D ...

10. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

Energy Technology Data Exchange (ETDEWEB)

Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

1996-09-30

A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an HCI style guide specific to Army weapon systems. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. Its purpose is to provide HCI design guidance for RT/NRT Army systems across the weapon system domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing its own domain-specific style guide, which will be used to guide the development of future systems within that domain.

11. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

Directory of Open Access Journals (Sweden)

Lamia Gaouar

2018-06-01

Full Text Available From the human-computer interface perspective, the challenges to be faced relate to the consideration of new, multiple interactions and to the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning, …) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

12. An Efficient and Secure m-IPS Scheme of Mobile Devices for Human-Centric Computing

Directory of Open Access Journals (Sweden)

Young-Sik Jeong

2014-01-01

Full Text Available Recent rapid developments in wireless and mobile IT technologies have led to their application in many real-life areas, such as disasters, home networks, mobile social networks, medical services, industry, schools, and the military. Business and work environments have become wired/wireless, integrated with wireless networks. Although the increasing use of mobile devices on wireless networks improves work efficiency and provides greater convenience, wireless access to networks represents a security threat. Currently, wireless intrusion prevention systems (IPSs) are used to prevent wireless security threats. However, these are not an ideal security measure for businesses that utilize mobile devices, because they do not take account of temporal-spatial and role information factors. Therefore, in this paper, an efficient and secure mobile IPS (m-IPS) is proposed for businesses utilizing mobile devices in mobile environments for human-centric computing. The m-IPS system incorporates temporal-spatial awareness in human-centric computing with various mobile devices and checks users' temporal-spatial information, profiles, and role information to provide precise access control. The application of m-IPS can also be extended to the Internet of Things (IoT), one of the important advanced technologies for fully supporting human-centric computing environments, for real ubiquitous fields with mobile devices.

13. COMPUTING

CERN Multimedia

M. Kasemann

Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

14. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

Science.gov (United States)

Mitchell, Christine M.

1993-01-01

This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

15. [Geomagnetic storm decreases coherence of electric oscillations of human brain while working at the computer].

Science.gov (United States)

Novik, O B; Smirnov, F A

2013-01-01

The effect of geomagnetic storms at the latitude of Moscow on the electric oscillations of the human cerebral cortex was studied. Electroencephalogram measurements showed that when volunteers aged 18-23 years performed tasks on a computer during a moderate magnetic storm, or no later than 24 h after it, the value of the coherence function of electric oscillations of the human brain in the frontal and occipital areas in the range of 4.0-7.9 Hz (the so-called theta rhythm) decreased by a factor of two or more, sometimes reaching zero, although arterial blood pressure, respiratory rate and the electrocardiogram registered during the electroencephalogram measurements remained within standard values.
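The coherence function referred to in the abstract can be reproduced in outline as Welch-averaged magnitude-squared coherence between two EEG channels, averaged over the 4.0-7.9 Hz theta band. A sketch on synthetic signals (the sampling rate, channel construction, and noise level are assumptions for illustration; SciPy's `signal.coherence` performs the spectral estimation):

```python
import numpy as np
from scipy.signal import coherence

fs = 250  # Hz, assumed EEG sampling rate
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)  # 30 s of data

# Two synthetic channels sharing a 6 Hz theta component plus independent noise,
# standing in for frontal and occipital electrode recordings.
theta = np.sin(2 * np.pi * 6.0 * t)
frontal = theta + 0.5 * rng.standard_normal(t.size)
occipital = theta + 0.5 * rng.standard_normal(t.size)

# Welch-averaged magnitude-squared coherence (values in [0, 1] per frequency bin)
f, cxy = coherence(frontal, occipital, fs=fs, nperseg=1024)

# Mean coherence over the 4.0-7.9 Hz theta band, as in the study
band = (f >= 4.0) & (f <= 7.9)
theta_coherence = cxy[band].mean()
```

Coherence near 1 at 6 Hz indicates a strongly phase-locked shared rhythm between the two channels; the study reports this band-averaged quantity dropping by half or more during geomagnetic storms.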

16. Simulation-based computation of dose to humans in radiological environments

Energy Technology Data Exchange (ETDEWEB)

Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

1996-03-01

The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
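The accumulate-dose-during-simulation idea described above reduces to integrating a position-dependent dose rate over the time steps of the workcell simulation. A deliberately simplified sketch (point source, inverse-square falloff, invented numbers; REMS itself looks up exposure rates from radiation-transport-code or measured databases rather than computing them analytically):

```python
# Simplified dose accumulation over a simulated worker trajectory.
# Assumption: a point source with inverse-square falloff and a scalar
# shielding factor; real REMS rates come from a transport code or measurements.

def dose_rate(rate_at_1m: float, distance_m: float, shielding_factor: float = 1.0) -> float:
    """Dose rate (arbitrary units/s) at a given distance from a point source."""
    return rate_at_1m * shielding_factor / distance_m ** 2

DT = 1.0  # simulation time step, seconds (assumed)
trajectory_m = [5.0, 4.0, 3.0, 2.0, 3.0]  # worker's distance from source at each step

# Accumulate dose step by step, as the simulation would during playback
accumulated_dose = sum(dose_rate(10.0, d) * DT for d in trajectory_m)
```

Timing, distance, and shielding all enter the sum exactly as the abstract lists them; swapping the analytic `dose_rate` for a database lookup keyed on position would mirror the REMS design.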

17. Computational Thermodynamics Analysis of Vaporizing Fuel Droplets in the Human Upper Airways

Science.gov (United States)

Zhang, Zhe; Kleinstreuer, Clement

The detailed knowledge of air flow structures as well as particle transport and deposition in the human lung for typical inhalation flow rates is an important precursor for dosimetry-and-health-effect studies of toxic particles as well as for targeted drug delivery of therapeutic aerosols. Focusing on highly toxic JP-8 fuel aerosols, 3-D airflow and fluid-particle thermodynamics in a human upper airway model starting from mouth to Generation G3 (G0 is the trachea) are simulated using a user-enhanced and experimentally validated finite-volume code. The temperature distributions and their effects on airflow structures, fuel vapor deposition and droplet motion/evaporation are discussed. The computational results show that the thermal effect on vapor deposition is minor, but it may greatly affect droplet deposition in human airways.

18. Simulation-based computation of dose to humans in radiological environments

International Nuclear Information System (INIS)

Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

1996-03-01

The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

19. COMPUTING

CERN Multimedia

I. Fisk

2011-01-01

Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

20. COMPUTING

CERN Multimedia

P. McBride

The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

1. COMPUTING

CERN Multimedia

M. Kasemann

Overview During the past three months, activities focused on data operations; testing and reinforcing shift and operational procedures for data production and transfer; MC production; and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009-2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at Fermilab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

2. The use of computers to teach human anatomy and physiology to allied health and nursing students

Science.gov (United States)

Bergeron, Valerie J.

Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to answer these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes toward and understanding of human anatomy and physiology. In this investigation two sites of the same community college, seven miles apart and with seemingly similar student populations, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.® (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas studied. For both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials, observations of laboratory facilities at both sites, and a cost-benefit analysis led to the development of seven recommendations. The recommendations call for action at the level of the

3. Computational study of depth completion consistent with human bi-stable perception for ambiguous figures.

Science.gov (United States)

Mitsukura, Eiichi; Satoh, Shunji

2018-03-01

We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because RDS provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, the mathematical solutions related to binocular disparities are either not unique or indefinite. We examine a mathematical description of depth completion that is consistent with human perception of depth in ambiguous regions. Using computer simulation, we demonstrate that the resultant depth maps qualitatively reproduce human depth perception of two kinds. The depth maps produced using our model depend on the initial depth in the ambiguous region. Considering this dependence from a psychological viewpoint, we conjecture that humans perceive completed surfaces that are affected by prior stimuli corresponding to the initial condition of depth. We conducted psychological experiments to verify the model prediction. An ambiguous stimulus was presented after a prior stimulus that removed the ambiguity, with an inter-stimulus interval (ISI) inserted between the prior stimulus and post-stimulus. Results show that the correlation of perception between the prior stimulus and post-stimulus depends on the ISI duration. The correlation is positive, negative, and nearly zero in the respective cases of short (0-200 ms), medium (200-400 ms), and long ISI (>400 ms). Furthermore, based on our model, we propose a computational account of this dependence. Copyright © 2017 Elsevier Ltd. All rights reserved.
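The dependence on initial conditions described above can be illustrated with a minimal toy model (not the authors' actual model): depth in the ambiguous region relaxes toward the mean of neighbouring depths, with the known boundary depths clamped. All names and values here are hypothetical.

```python
import numpy as np

def complete_depth(left, right, init, n_iter):
    """Toy depth completion in a 1-D 'ambiguous' region with no disparity:
    interior depths relax toward the mean of their neighbours (a discrete
    diffusion / Laplace relaxation), with the known depths at both ends
    clamped. Hypothetical sketch, not the authors' model."""
    d = np.array([left] + list(init) + [right], dtype=float)
    for _ in range(n_iter):
        d[1:-1] = 0.5 * (d[:-2] + d[2:])
    return d[1:-1]

# Two different "prior stimuli" seed different initial depths in the gap.
near = complete_depth(0.0, 0.0, [+1.0] * 9, n_iter=5)   # primed "near"
far = complete_depth(0.0, 0.0, [-1.0] * 9, n_iter=5)    # primed "far"

# After only a few relaxation steps the completed surface still reflects
# the prior (cf. the positive correlation at short ISI); with many steps
# it converges to the boundary-determined surface regardless of the prior.
print(near.mean() > 0, far.mean() < 0)
```

The number of relaxation steps plays a role loosely analogous to the ISI: a short settling time preserves the influence of the prior, a long one erases it.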

4. Computing the influences of different Intraocular Pressures on the human eye components using computational fluid-structure interaction model.

Science.gov (United States)

Karimi, Alireza; Razaghi, Reza; Navidbakhsh, Mahdi; Sera, Toshihiro; Kudo, Susumu

2017-01-01

Intraocular Pressure (IOP) is defined as the pressure of the aqueous humour in the eye. The normal range of IOP is reported to be 10-20 mmHg, with an average of 15.50 mmHg. Keratoconus is a non-inflammatory eye disorder in which the weakened cornea is unable to preserve its normal structure against the IOP in the eye. Consequently, the cornea bulges outward into a conical shape, resulting in distorted vision. In addition, it is known that any alteration in the structure and composition of the lens and cornea induces a change in the shape of the eyeball as well as in the mechanical and optical properties of the eye. Understanding the precise alteration of the stresses and deformations of the eye components due to different IOPs could help elucidate etiology and pathogenesis and support the development of treatments not only for keratoconus but also for other diseases of the eye. In this study, the stresses and deformations of the human eye components were quantified at three different IOPs (10, 20, and 30 mmHg) using a Three-Dimensional (3D) computational Fluid-Structure Interaction (FSI) model of the human eye. The results revealed the highest von Mises stress, 245 kPa, in the bulged region of the cornea at an IOP of 30 mmHg. The lens showed a von Mises stress of 19.38 kPa at an IOP of 30 mmHg. In addition, as the IOP increased from 10 to 30 mmHg, the radius of curvature of the cornea and lens increased accordingly. In contrast, the sclera showed its highest stress at an IOP of 10 mmHg due to an over-pressure phenomenon. Varying the IOP had little influence on the stress and the resultant displacement of the optic nerve. These results can be used to understand the stresses and deformations of the human eye components under different IOPs, and to clarify the significant role of IOP in determining the radius of curvature of the cornea and the lens.

5. Histomorphometric quantification of human pathological bones from synchrotron radiation 3D computed microtomography

International Nuclear Information System (INIS)

Nogueira, Liebert P.; Braz, Delson

2011-01-01

Conventional bone histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed microtomography is a noninvasive technique that can be used to evaluate histomorphometric indices of trabecular bone (BV/TV, BS/BV, Tb.N, Tb.Th, Tb.Sp). In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional method, in which quantification is performed on 2D slices and extrapolated to the 3D case. In this work, histomorphometric quantification using synchrotron 3D X-ray computed microtomography was performed on pathological samples of human bone. Samples of human bone were cut into small blocks (8 mm x 8 mm x 10 mm) with a precision saw and then imaged. The computed microtomographies were obtained at the SYRMEP (Synchrotron Radiation for MEdical Physics) beamline of the ELETTRA synchrotron radiation facility (Italy). The obtained 3D images yielded excellent resolution and detail of intra-trabecular bone structures, including the marrow present inside trabeculae. The histomorphometric quantification was also compared with values from the literature. (author)
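Two of the indices listed above can be computed directly from a binary 3D voxel volume; the sketch below uses a crude voxel-face count as a stand-in for the surface-estimation methods used in practice, and all names and values are illustrative only.

```python
import numpy as np

def bone_volume_fraction(vol):
    """BV/TV: fraction of voxels segmented as bone in a binary volume."""
    return vol.sum() / vol.size

def surface_to_volume(vol, voxel=1.0):
    """Rough BS/BV: count bone-voxel faces exposed to background along each
    axis (including faces on the volume border), convert the face count to
    an area, and divide by the bone volume. A crude voxel-face estimate,
    not the marching-cubes-style surfaces used by real tools."""
    faces = 0
    for ax in range(3):
        d = np.diff(vol.astype(np.int8), axis=ax)
        faces += np.abs(d).sum()
        # bone voxels touching the edge of the volume expose a face too
        faces += np.take(vol, 0, axis=ax).sum() + np.take(vol, -1, axis=ax).sum()
    bs = faces * voxel ** 2
    bv = vol.sum() * voxel ** 3
    return bs / bv

# 10x10x10 volume containing a solid 4x4x4 "trabecular" cube
vol = np.zeros((10, 10, 10), dtype=bool)
vol[3:7, 3:7, 3:7] = True
print(bone_volume_fraction(vol))   # 64/1000 = 0.064
print(surface_to_volume(vol))      # 6*16 faces / 64 voxels = 1.5
```

For a real scan, `vol` would be the thresholded microtomography stack rather than a synthetic cube.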

6. COMPUTING

CERN Multimedia

I. Fisk

2013-01-01

Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operations. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

7. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

Energy Technology Data Exchange (ETDEWEB)

Kabilan, S.; Suffield, S. R.; Recknagle, K. P.; Jacob, R. E.; Einstein, D. R.; Kuprat, A. P.; Carson, J. P.; Colby, S. M.; Saunders, J. H.; Hines, S. A.; Teeguarden, J. G.; Straub, T. M.; Moe, M.; Taft, S. C.; Corley, R. A.

2016-09-01

Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

8. Computational Fluid Dynamics Modeling of Bacillus anthracis Spore Deposition in Rabbit and Human Respiratory Airways

Energy Technology Data Exchange (ETDEWEB)

Kabilan, Senthil; Suffield, Sarah R.; Recknagle, Kurtis P.; Jacob, Rick E.; Einstein, Daniel R.; Kuprat, Andrew P.; Carson, James P.; Colby, Sean M.; Saunders, James H.; Hines, Stephanie; Teeguarden, Justin G.; Straub, Tim M.; Moe, M.; Taft, Sarah; Corley, Richard A.

2016-09-30

Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. The highest exposure concentration was modeled in the rabbit based upon prior acute inhalation studies. For comparison, human simulation was also conducted at the same concentration. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways compared to the human at the same air concentration of anthrax spores. As a result, higher particle deposition was predicted in the conducting airways and deep lung of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology.

9. A Chinese Visible Human-based computational female pelvic phantom for radiation dosimetry simulation

International Nuclear Information System (INIS)

Nan, H.; Jinlu, S.; Shaoxiang, Z.; Qing, H.; Li-wen, T.; Chengjun, G.; Tang, X.; Jiang, S. B.; Xiano-lin, Z.

2010-01-01

An accurate voxel phantom is needed for dosimetric simulation in radiation therapy for malignant tumors of the female pelvic region. However, most existing voxel phantoms are constructed on the basis of Caucasian or other non-Chinese populations. Materials and Methods: A computational framework for constructing a female pelvic voxel phantom for radiation dosimetry was developed based on the Chinese Visible Human datasets. First, several organs within the pelvic region were segmented from the Chinese Visible Human datasets. Then, polygonization and voxelization were performed on the segmented organs, and a 3D computational phantom was built in the form of a set of voxel arrays. Results: The generated phantom can be converted and loaded into a treatment planning system for radiation dosimetry calculation. From the dosimetric results for these organs and structures, we can evaluate their absorbed dose and carry out simulation studies. Conclusion: A voxel female pelvic phantom was developed from the Chinese Visible Human datasets. It can be utilized for dosimetry evaluation and planning simulation, which would be very helpful for improving clinical performance and reducing radiation toxicity to organs at risk.
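The voxelization step described above can be sketched as follows, using an analytic sphere as a hypothetical stand-in for a segmented organ (the real pipeline rasterizes polygonized organ surfaces reconstructed from the Chinese Visible Human slices); all names and values are illustrative.

```python
import numpy as np

def voxelize_sphere(shape, centre, radius, organ_id, grid=None):
    """Toy voxelization: mark every voxel whose index falls inside a
    spherical 'organ' with that organ's integer ID. Illustrative stand-in
    for rasterizing a polygonized organ surface into a labelled voxel
    array; later organs overwrite earlier ones where they overlap."""
    if grid is None:
        grid = np.zeros(shape, dtype=np.uint8)
    z, y, x = np.indices(shape)
    inside = ((x - centre[2]) ** 2 + (y - centre[1]) ** 2
              + (z - centre[0]) ** 2) <= radius ** 2
    grid[inside] = organ_id
    return grid

# Build a tiny two-organ "phantom": organ 1 and organ 2 in a 32^3 grid
phantom = voxelize_sphere((32, 32, 32), centre=(16, 16, 16), radius=8, organ_id=1)
phantom = voxelize_sphere((32, 32, 32), centre=(16, 16, 24), radius=4,
                          organ_id=2, grid=phantom)
print(np.unique(phantom))  # organ IDs present: 0 (background), 1, 2
```

A labelled array of this form, tagged with per-organ material and density, is what a treatment planning system would ingest for dose calculation.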

10. Computer graphics of SEM images facilitate recognition of chromosome position in isolated human metaphase plates.

Science.gov (United States)

Hodge, L D; Barrett, J M; Welter, D A

1995-04-01

There is general agreement that at the time of mitosis chromosomes occupy precise positions and that these positions likely affect subsequent nuclear function in interphase. However, before such ideas can be investigated in human cells, it is necessary to determine first the precise position of each chromosome with regard to its neighbors. It has occurred to us that stereo images, produced by scanning electron microscopy, of isolated metaphase plates could form the basis whereby these positions could be ascertained. In this paper we describe a computer graphic technique that permits us to keep track of individual chromosomes in a metaphase plate and to compare chromosome positions in different metaphase plates. Moreover, the computer graphics provide permanent, easily manipulated, rapid recall of stored chromosome profiles. These advantages are demonstrated by a comparison of the relative position of group A-specific and groups D- and G-specific chromosomes to the full complement of chromosomes in metaphase plates isolated from a nearly triploid human-derived cell (HeLa S3) to a hypo-diploid human fetal lung cell.

11. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making; TOPICAL

International Nuclear Information System (INIS)

SENGLAUB, MICHAEL E.; HARRIS, DAVID L.; RAYBOURN, ELAINE M.

2001-01-01

In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We now have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and for capturing the behavioral characteristics of decision makers arising from cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce and define sets of fundamental reasoning behaviors that could be captured in the mathematical constructs of these newer technologies and could interact in an agent-based framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition.

12. Direct estimation of human trabecular bone stiffness using cone beam computed tomography.

Science.gov (United States)

Klintström, Eva; Klintström, Benjamin; Pahr, Dieter; Brismar, Torkel B; Smedby, Örjan; Moreno, Rodrigo

2018-04-10

The aim of this study was to evaluate the possibility of estimating the biomechanical properties of trabecular bone through finite element simulations using dental cone beam computed tomography data. Fourteen human radius specimens were scanned in 3 cone beam computed tomography devices: the 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan), the NewTom 5G (QR Verona, Verona, Italy), and the Verity (Planmed, Helsinki, Finland). The imaging data were segmented using 2 different methods. Stiffness (Young modulus), shear moduli, and the size and shape of the stiffness tensor were studied. Corresponding evaluations using micro-CT were regarded as the reference standard. The Accuitomo 80 showed good performance in estimating stiffness and shear moduli but was sensitive to the choice of segmentation method. The NewTom 5G and the Verity yielded good correlations, but not as strong as those of the Accuitomo 80. All cone beam computed tomography devices overestimated both stiffness and shear moduli compared with the micro-CT estimations. Finite element-based calculations of biomechanics from cone beam computed tomography data are thus feasible, with strong correlations for the Accuitomo 80 scanner combined with an appropriate segmentation method. Such measurements might be useful for predicting implant survival through in vivo estimation of bone properties. Copyright © 2018 Elsevier Inc. All rights reserved.
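The study derives stiffness from full finite element simulations on segmented volumes. As a far simpler illustration of how a scalar measure of trabecular bone relates to apparent stiffness, the widely used family of empirical density power laws can be sketched as below; the coefficients are placeholders, not values from this paper.

```python
def stiffness_from_bvtv(bvtv, e_tissue=15000.0, exponent=2.0):
    """Illustrative empirical power law: apparent Young's modulus (MPa)
    scales as tissue modulus times bone volume fraction to some power.
    Both e_tissue and exponent are hypothetical placeholders; published
    fits vary by site and study, and this is not the paper's FE method."""
    return e_tissue * bvtv ** exponent

# A bone volume fraction of 20% under these placeholder coefficients:
print(round(stiffness_from_bvtv(0.2), 1))  # → 600.0 (MPa)
```

A voxel-based FE model, by contrast, captures the full anisotropic stiffness tensor rather than a single scalar modulus, which is why the study compares tensor size and shape against micro-CT.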

13. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

Science.gov (United States)

Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

2011-10-01

Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from functional interactions at the whole-atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and for understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. First, a diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with heterogeneous atrial action potential (AP) models and used to simulate AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. The contributions of different tissue properties to the mechanisms of normal rhythm and of arrhythmogenesis were investigated. Notably, the simulations showed that tissue heterogeneity caused the break-down of normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves, and that tissue anisotropy resulted in a further break-down of the spiral waves into the multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into a torso model to simulate body-surface ECG patterns under normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi

14. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

Directory of Open Access Journals (Sweden)

Nasoz Fatma

2004-01-01

Full Text Available We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims at recognizing its users' emotions and responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss the possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
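As a minimal sketch of the supervised-learning step described above (the paper's actual algorithms, features, and data differ), a nearest-centroid classifier can map synthetic (GSR, heart rate, temperature) feature vectors to emotion labels; all cluster centres and values here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (GSR, heart-rate, temperature) cluster centres per emotion;
# these numbers are made up for illustration, not measured data.
centres = {"sadness": (2.0, 65.0, 36.5),
           "anger":   (8.0, 95.0, 37.2),
           "fear":    (9.0, 110.0, 36.0)}

def make_samples(n=30):
    """Draw n noisy synthetic feature vectors around each emotion centre."""
    X, y = [], []
    for label, c in centres.items():
        X.append(rng.normal(c, (0.5, 3.0, 0.1), size=(n, 3)))
        y += [label] * n
    return np.vstack(X), y

def fit_centroids(X, y):
    """Training: the model is simply the per-class mean feature vector."""
    labels = sorted(set(y))
    return {l: X[[i for i, t in enumerate(y) if t == l]].mean(axis=0)
            for l in labels}

def predict(model, x):
    """Classify a signal as the emotion with the nearest centroid."""
    return min(model, key=lambda l: np.linalg.norm(model[l] - x))

X, y = make_samples()
model = fit_centroids(X, y)
preds = [predict(model, x) for x in X]
acc = sum(p == t for p, t in zip(preds, y)) / len(y)
print(acc)  # well-separated synthetic clusters give accuracy near 1.0
```

Real physiological data would of course need per-subject normalisation and a held-out test split; this only shows the train-then-classify shape of the pipeline.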

15. Investigation on human serum albumin and Gum Tragacanth interactions using experimental and computational methods.

Science.gov (United States)

2018-02-01

The interaction of human serum albumin with Gum Tragacanth (GT), a biodegradable biopolymer, has been studied. For this purpose, several experimental and computational methods were used. The thermodynamic parameters and mode of interaction were determined using fluorescence spectroscopy at 300 and 310 K. Fourier transform infrared spectra and synchronous fluorescence spectroscopy were also recorded. To give detailed insight into the possible interactions, docking and molecular dynamics simulations were also applied. The results show that the interaction is based on hydrogen bonding and van der Waals forces. Structural analysis indicates no adverse change in protein conformation upon binding of GT. Furthermore, the computational methods provide evidence of an enhancement of the protein's secondary structure in the presence of Gum Tragacanth. Copyright © 2017 Elsevier B.V. All rights reserved.

16. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

Directory of Open Access Journals (Sweden)

Lucianna Helene Santos

2015-11-01

Full Text Available Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus type 1 (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors (NNRTIs), are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and are indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT are discussed, covering methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction. Successful applications of these methodologies are also highlighted.

17. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

International Nuclear Information System (INIS)

Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N.

1991-01-01

Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV, or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from those in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in an amelioration of the SPECT findings, while CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy.

18. U.S. Army weapon systems human-computer interface style guide. Version 2

Energy Technology Data Exchange (ETDEWEB)

Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

1997-12-31

A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Among the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army employs a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop a weapon-systems-specific HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2, with the intent of updating some of the research and incorporating some enhancements. This document provides that revision. Its purpose is to provide HCI design guidance for the RT/NRT Army system domain across the weapon-systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing its own domain-specific style guide, which will guide the development of future systems within that subdomain.

19. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

Directory of Open Access Journals (Sweden)

Charles eTimchalk

2015-05-01

Full Text Available Quantitative exposure data are important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics, and provides additional insight into species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanisms by which xenobiotics leave the blood and enter saliva are paracellular transport, passive transcellular diffusion, and transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm, which calculates partitioning based upon tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have a significant impact on partitioning, with species-dependent differences arising from physiological variance. Future strategies focus on an in vitro salivary acinar cell-based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human
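The pH- and protein-binding-dependent partitioning described above can be illustrated with the classical pH-partition (Henderson-Hasselbalch) estimate of the saliva:plasma ratio for a weak acid. This textbook simplification is not the Schmitt algorithm itself (which also accounts for tissue composition), and the example drug values are hypothetical.

```python
def saliva_plasma_ratio_acid(pKa, fu_plasma, fu_saliva, pH_s=6.5, pH_p=7.4):
    """Classical pH-partition estimate of the saliva:plasma total
    concentration ratio for a weak acid, assuming only the unbound,
    un-ionised species crosses by passive diffusion. The ionised
    fraction on each side follows Henderson-Hasselbalch."""
    return ((1 + 10 ** (pH_s - pKa))
            / (1 + 10 ** (pH_p - pKa))) * (fu_plasma / fu_saliva)

# Hypothetical weak acid: pKa 4.0, 90% plasma protein-bound, unbound in saliva.
r = saliva_plasma_ratio_acid(pKa=4.0, fu_plasma=0.1, fu_saliva=1.0)
print(round(r, 3))  # → 0.013
```

Because saliva is more acidic than plasma, a weak acid is less ionised (and so less "trapped") in saliva, and strong plasma protein-binding depresses the ratio further; this is why the review flags pKa and protein-binding as the sensitive parameters.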

20. COMPUTING

CERN Multimedia

I. Fisk

2010-01-01

Introduction It has been a very active quarter in Computing, with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released, and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing-model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

1. COMPUTING

CERN Multimedia

M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

2. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

Science.gov (United States)

Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

2004-01-01

Eyetracking is one of the latest technologies that have shown potential in several areas, including human-computer interaction for people with and without disabilities, and in the noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

3. Human-Computer Interaction Handbook Fundamentals, Evolving Technologies, and Emerging Applications

CERN Document Server

Jacko, Julie A

2012-01-01

The third edition of a groundbreaking reference, The Human--Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications raises the bar for handbooks in this field. It is the largest, most complete compilation of HCI theories, principles, advances, case studies, and more that exist within a single volume. The book captures the current and emerging sub-disciplines within HCI related to research, development, and practice that continue to advance at an astonishing rate. It features cutting-edge advances to the scientific knowledge base as well as visionary perspe

4. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

DEFF Research Database (Denmark)

...a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...

5. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

Science.gov (United States)

Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results about iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

6. Observation of human tissue with phase-contrast x-ray computed tomography

Science.gov (United States)

Momose, Atsushi; Takeda, Tohoru; Itai, Yuji; Tu, Jinhong; Hirano, Keiichi

1999-05-01

Human tissues obtained from cancerous kidneys fixed in formalin were observed with phase-contrast X-ray computed tomography (CT) using 17.7-keV synchrotron X-rays. By measuring the distributions of the X-ray phase shift caused by the samples with an X-ray interferometer, sectional images mapping the distribution of the refractive index were reconstructed. Because of the high sensitivity of phase-contrast X-ray CT, a cancerous lesion was differentiated from normal tissue and a variety of other structures were revealed without the need for staining.

7. Machine takeover the growing threat to human freedom in a computer-controlled society

CERN Document Server

George, Frank Honywill

1977-01-01

Machine Takeover: The Growing Threat to Human Freedom in a Computer-Controlled Society discusses the implications of technological advancement. The title identifies changes in society that no one is aware of, along with what these changes entail. The text first covers information science, particularly the aspect of an automated system for information processing. Next, the selection deals with the social implications of information science, such as information pollution. The text also tackles concerns about the use of technology to manipulate the lives of people without th

8. South African sign language human-computer interface in the context of the national accessibility portal

CSIR Research Space (South Africa)

Olivrin, GJ

2006-02-01

Full Text Available example, between a deaf person who can sign and an able person or a person with a different disability who cannot sign). METHODOLOGY A signing avatar is set up to work together with a chatterbot. The chatterbot is a natural language dialogue interface... are then offered in sign language as the replies are interpreted by a signing avatar, a living character that can reproduce human-like gestures and expressions. To make South African Sign Language (SASL) available digitally, computational models of the language...

9. A structural approach to constructing perspective efficient and reliable human-computer interfaces

International Nuclear Information System (INIS)

Balint, L.

1989-01-01

The principles of human-computer interface (HCI) realizations are investigated with the aim of getting closer to a general framework and thus to a more or less solid background for constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to the possibilities of human error occurrence. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error-tolerance, error-recovery and error-correction), as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well-defined architecture in the HCI components' internal structure. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental device for achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

10. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

Science.gov (United States)

Kraemer, Sara; Carayon, Pascale

2007-03-01

This paper describes human errors and violations of end users and network administration in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio-taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

Science.gov (United States)

Townsend, Molly T.; Sarigul-Klijn, Nesrin

2018-04-01

Living in reduced gravitational environments for a prolonged duration, such as a flyby mission to Mars or an extended stay at the International Space Station, affects the human body, in particular the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this spaceflight-adapted spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

12. Impact of familiarity on information complexity in human-computer interfaces

Directory of Open Access Journals (Sweden)

Bakaev Maxim

2016-01-01

Full Text Available A quantitative measure of information complexity remains very much desirable in the HCI field, since it may aid in the optimization of user interfaces, especially in human-computer systems for controlling complex objects. Our paper is dedicated to exploration of the subjective (subject-dependent) aspect of complexity, conceptualized as information familiarity. Although research on familiarity in human cognition and behaviour has been done in several fields, the accepted models in HCI, such as the Human Processor or the Hick-Hyman law, do not generally consider this issue. In our experimental study the subjects performed search and selection of digits and letters, whose familiarity was conceptualized as frequency of occurrence in numbers and texts. The analysis showed a significant effect of information familiarity on selection time and throughput in regression models, although the R2 values were somewhat low. Still, we hope that our results might aid in the quantification of information complexity and its further application for optimizing interaction in human-machine systems.
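The Hick-Hyman law mentioned above predicts choice reaction time growing with the logarithm of the number of equally likely alternatives. A minimal sketch of the relation (the intercept and slope values below are illustrative assumptions, not fitted constants from any study):

```python
import math

def hick_hyman_time(n_alternatives, a=0.2, b=0.15):
    """Predicted selection time (s) for n equally likely alternatives.

    T = a + b * log2(n + 1); a (intercept) and b (slope) are illustrative.
    The '+ 1' accounts for the uncertainty of whether to respond at all.
    """
    return a + b * math.log2(n_alternatives + 1)

# Doubling the number of alternatives adds a roughly constant time increment,
# which is why menu depth/breadth trade-offs appear in interface optimization.
t4 = hick_hyman_time(4)
t8 = hick_hyman_time(8)
```

Familiarity effects such as those studied in the paper would show up as per-stimulus variation that this subject-independent model cannot capture.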

13. COMPUTING

CERN Multimedia

P. McBride

It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

14. COMPUTING

CERN Multimedia

M. Kasemann

Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise, and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real-data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

15. COMPUTING

CERN Multimedia

I. Fisk

2011-01-01

Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

16. COMPUTING

CERN Multimedia

I. Fisk

2012-01-01

Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

17. COMPUTING

CERN Multimedia

M. Kasemann

CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

18. COMPUTING

CERN Multimedia

I. Fisk

2010-01-01

Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

19. COMPUTING

CERN Multimedia

M. Kasemann

Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

20. Using the Electrocorticographic Speech Network to Control a Brain-Computer Interface in Humans

Science.gov (United States)

Leuthardt, Eric C.; Gaona, Charles; Sharma, Mohit; Szrama, Nicholas; Roland, Jarod; Freudenberg, Zac; Solis, Jamie; Breshears, Jonathan; Schalk, Gerwin

2013-01-01

Electrocorticography (ECoG) has emerged as a new signal platform for brain-computer interface (BCI) systems. Classically, the cortical physiology that has been commonly investigated and utilized for device control in humans has been brain signals from sensorimotor cortex. Hence, it was unknown whether other neurophysiological substrates, such as the speech network, could be used to further improve on or complement existing motor-based control paradigms. We demonstrate here for the first time that ECoG signals associated with different overt and imagined phoneme articulation can enable invasively monitored human patients to control a one-dimensional computer cursor rapidly and accurately. This phonetic content was distinguishable within higher gamma frequency oscillations and enabled users to achieve final target accuracies between 68 and 91% within 15 minutes. Additionally, one of the patients achieved robust control using recordings from a microarray consisting of 1 mm spaced microwires. These findings suggest that the cortical network associated with speech could provide an additional cognitive and physiologic substrate for BCI operation and that these signals can be acquired from a cortical array that is small and minimally invasive. PMID:21471638

1. Effects of muscle fatigue on the usability of a myoelectric human-computer interface.

Science.gov (United States)

Barszap, Alexander G; Skavhaug, Ida-Maria; Joshi, Sanjay S

2016-10-01

Electromyography-based human-computer interface development is an active field of research. However, knowledge on the effects of muscle fatigue for specific devices is limited. We have developed a novel myoelectric human-computer interface in which subjects continuously navigate a cursor to targets by manipulating a single surface electromyography (sEMG) signal. Two-dimensional control is achieved through simultaneous adjustments of power in two frequency bands through a series of dynamic low-level muscle contractions. Here, we investigate the potential effects of muscle fatigue during the use of our interface. In the first session, eight subjects completed 300 cursor-to-target trials without breaks; four using a wrist muscle and four using a head muscle. The wrist subjects returned for a second session in which a static fatiguing exercise took place at regular intervals in-between cursor-to-target trials. In the first session we observed no declines in performance as a function of use, even after the long period of use. In the second session, we observed clear changes in cursor trajectories, paired with a target-specific decrease in hit rates. Copyright © 2016 Elsevier B.V. All rights reserved.
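The interface above derives two control axes from power in two frequency bands of a single sEMG signal. A self-contained sketch of the underlying band-power computation, using a naive one-sided DFT (a real-time system would use windowed FFTs over a sliding buffer; the band edges and test tone here are arbitrary illustrative choices, not the study's):

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` (sampled at fs Hz) in the band [f_lo, f_hi] Hz.

    Naive DFT over positive-frequency bins; adequate for illustration only.
    """
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):                 # positive frequencies, skip DC
        if f_lo <= k * fs / n <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

# A 50 Hz test tone sampled at 1 kHz: its power lands in a 40-60 Hz band and
# not in a 100-200 Hz band; contrasts like this drive the two cursor axes.
fs = 1000
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(fs)]
low = band_power(sig, fs, 40, 60)
high = band_power(sig, fs, 100, 200)
```

Fatigue typically shifts the sEMG spectrum toward lower frequencies, which is one plausible route by which it would distort a two-band control scheme like this.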

2. Eye Tracking Based Control System for Natural Human-Computer Interaction

Directory of Open Access Journals (Sweden)

Xuebai Zhang

2017-01-01

Full Text Available Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user’s eyes. The usage flow of the proposed system is designed to closely follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing a multimedia web page) were conducted to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

3. Eye Tracking Based Control System for Natural Human-Computer Interaction.

Science.gov (United States)

Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

2017-01-01

Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to closely follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing a multimedia web page) were conducted to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

4. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

Energy Technology Data Exchange (ETDEWEB)

Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

2015-05-27

The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species dependent differences based upon physiological variance between
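The role the abstract assigns to pH, pKa and protein binding can be sketched with the classic Henderson-Hasselbalch pH-partition estimate of the saliva:plasma ratio: only the unbound, un-ionized species diffuses across the gland epithelium. This is a deliberate simplification of the tissue-composition-based Schmitt algorithm the authors actually use, and all parameter values below are illustrative assumptions:

```python
def saliva_plasma_ratio(pKa, fu_plasma, fu_saliva=1.0,
                        pH_plasma=7.4, pH_saliva=6.5, acid=True):
    """Steady-state saliva:plasma concentration ratio (pH-partition sketch).

    fu_* are unbound fractions (protein binding lowers fu_plasma).  The
    ionized species is trapped on the side where it forms, so the ratio
    follows the total/un-ionized factor (1 + 10^(pH - pKa)) on each side.
    """
    if acid:   # weak acid: ionization increases with pH
        ion_p = 1 + 10 ** (pH_plasma - pKa)
        ion_s = 1 + 10 ** (pH_saliva - pKa)
    else:      # weak base: ionization decreases with pH
        ion_p = 1 + 10 ** (pKa - pH_plasma)
        ion_s = 1 + 10 ** (pKa - pH_saliva)
    return (ion_s / ion_p) * (fu_plasma / fu_saliva)

# Weak acids stay in the less acidic plasma (ratio < 1), and stronger
# protein binding (lower fu_plasma) lowers the ratio further.
r = saliva_plasma_ratio(pKa=4.0, fu_plasma=0.1)
```

This already reproduces the sensitivity-analysis finding qualitatively: small shifts in pKa, pH or protein binding move the exponent or the unbound fraction and hence the predicted partitioning.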

5. Distributed and grid computing projects with research focus in human health.

Science.gov (United States)

Diomidous, Marianna; Zikos, Dimitrios

2012-01-01

Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus in human health. Eleven active projects, each with more than 2000 Processing Units (PUs), were found and presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not in all cases explicitly stated, common target diseases include HIV, dengue, Duchenne dystrophy, Parkinson's disease, various types of cancer and influenza; other diseases include malaria, anthrax and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise citizens' awareness and encourage participation, creating a culture of internet volunteering altruism.

6. Multi-step EMG Classification Algorithm for Human-Computer Interaction

Science.gov (United States)

A three-electrode human-computer interaction system, based on digital processing of the Electromyogram (EMG) signal, is presented. This system can effectively help disabled individuals paralyzed from the neck down to interact with computers or communicate with people through computers using point-and-click graphic interfaces. The three electrodes are placed on the right frontalis, the left temporalis and the right temporalis muscles in the head, respectively. The signal processing algorithm used translates the EMG signals during five kinds of facial movements (left jaw clenching, right jaw clenching, eyebrows up, eyebrows down, simultaneous left & right jaw clenching) into five corresponding types of cursor movements (left, right, up, down and left-click), to provide basic mouse control. The classification strategy is based on three principles: the EMG energy of one channel is typically larger than the others during one specific muscle contraction; the spectral characteristics of the EMG signals produced by the frontalis and temporalis muscles during different movements are different; the EMG signals from adjacent channels typically have correlated energy profiles. The algorithm is evaluated on 20 pre-recorded EMG signal sets, using Matlab simulations. The results show that this method provides improvements and is more robust than other previous approaches.
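The three classification principles above can be condensed into a rule-based sketch operating on per-channel band energies. The thresholds, tuple ordering and output labels are illustrative assumptions, not the system's actual values, and the spectral features that separate "eyebrows up" from "eyebrows down" are only noted, not implemented:

```python
def classify_movement(energies, threshold=1.0):
    """Map per-channel EMG band energies to a cursor command (illustrative).

    `energies` = (frontalis, left_temporalis, right_temporalis).  The rules
    mirror the stated principle that one channel's energy dominates during a
    specific contraction, with simultaneous left/right dominance as a click.
    """
    front, left, right = energies
    if max(energies) < threshold:
        return "rest"                 # no channel active enough
    if left >= threshold and right >= threshold:
        return "left-click"           # simultaneous left & right jaw clench
    if left > front and left > right:
        return "left"                 # left jaw clench
    if right > front and right > left:
        return "right"                # right jaw clench
    # frontalis dominates: separating "eyebrows up" from "eyebrows down"
    # would additionally need the spectral differences the abstract describes
    return "eyebrow"
```

A correlation check across adjacent channels (the third principle) would sit in front of these rules to reject crosstalk-driven false positives.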

7. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

Energy Technology Data Exchange (ETDEWEB)

Simicevic, Neven [Center for Applied Physics Studies, Louisiana Tech University, Ruston, LA 71272 (United States)], E-mail: neven@phys.latech.edu

2008-03-21

With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.
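The leapfrog update at the heart of the FDTD method can be shown in a minimal 1-D free-space sketch (normalized units, Courant number 1). The actual eye-exposure computation is 3-D, with 0.1 mm cells and Debye dispersive tissue models; none of that is reproduced here:

```python
import math

def fdtd_1d(steps=120, n=200, src=100):
    """Minimal 1-D free-space FDTD (Yee) sketch at the magic time step.

    With Courant number 1 a pulse travels exactly one cell per step.  E and
    H live on staggered grids and are updated alternately (leapfrog).
    """
    ez = [0.0] * n                    # electric field
    hy = [0.0] * n                    # magnetic field (staggered half-cell)
    for t in range(steps):
        for k in range(n - 1):        # update H from the spatial change of E
            hy[k] += ez[k + 1] - ez[k]
        for k in range(1, n):         # update E from the spatial change of H
            ez[k] += hy[k] - hy[k - 1]
        ez[src] += math.exp(-((t - 30) / 10.0) ** 2)   # additive Gaussian source
    return ez
```

A Gaussian pulse like this carries a broad spectrum in one run, which is exactly why FDTD suits the UWB exposure problem: one simulation covers the whole pulse bandwidth instead of one CW frequency at a time.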

8. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses.

Science.gov (United States)

Simicevic, Neven

2008-03-21

With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

9. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

International Nuclear Information System (INIS)

Simicevic, Neven

2008-01-01

With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

10. MRI Reconstructions of Human Phrenic Nerve Anatomy and Computational Modeling of Cryoballoon Ablative Therapy.

Science.gov (United States)

Goff, Ryan P; Spencer, Julianne H; Iaizzo, Paul A

2016-04-01

The primary goal of this computational modeling study was to better quantify the relative distance of the phrenic nerves to areas where cryoballoon ablations may be applied within the left atria. Phrenic nerve injury can be a significant complication of applied ablative therapies for treatment of drug refractory atrial fibrillation. To date, published reports suggest that such injuries may occur more frequently in cryoballoon ablations than in radiofrequency therapies. Ten human heart-lung blocs were prepared in an end-diastolic state, scanned with MRI, and analyzed using Mimics software as a means to make anatomical measurements. Next, generated computer models of Arctic Front cryoballoons (23, 28 mm) were mated with reconstructed pulmonary vein ostia to determine relative distances between the phrenic nerves and projected balloon placements, simulating pulmonary vein isolation. The effects of deep seating balloons were also investigated. Interestingly, the relative anatomical differences in placement of 23 and 28 mm cryoballoons were quite small, e.g., the determined difference between mid spline distance to the phrenic nerves between the two cryoballoon sizes was only 1.7 ± 1.2 mm. Furthermore, the right phrenic nerves were commonly closer to the pulmonary veins than the left, and surprisingly the tips of the balloons were farther from the nerves, yet balloon size choice did not significantly alter calculated distance to the nerves. Such computational modeling is considered a useful tool for both clinicians and device designers to better understand these associated anatomies that, in turn, may lead to optimization of therapeutic treatments.
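The study's core measurement, balloon-to-nerve distance, can be illustrated with a simplified geometry: if the mated cryoballoon is approximated as a sphere, the surface-to-nerve distance is the centre-to-point distance minus the balloon radius. The spherical approximation and the coordinates below are assumptions for illustration, not the study's CAD-based mating method.

```python
import numpy as np

def surface_to_nerve_mm(center, radius_mm, nerve_points):
    """Distances (mm) from a spherical balloon surface to digitised
    phrenic-nerve points: |p - c| - r for each point p.

    center: (3,) balloon centre; radius_mm: 11.5 or 14.0 for the
    23 mm and 28 mm balloons; nerve_points: (n, 3) nerve coordinates.
    Negative values would indicate nerve points inside the balloon.
    """
    pts = np.asarray(nerve_points, dtype=float)
    d = np.linalg.norm(pts - np.asarray(center, dtype=float), axis=1)
    return d - radius_mm
```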

11. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

Energy Technology Data Exchange (ETDEWEB)

Maynard, Matthew R; Geyer, John W; Bolch, Wesley [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL (United States); Aris, John P [Department of Anatomy and Cell Biology, University of Florida, Gainesville, FL (United States); Shifrin, Roger Y, E-mail: wbolch@ufl.edu [Department of Radiology, University of Florida, Gainesville, FL (United States)

2011-08-07

Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations

12. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

International Nuclear Information System (INIS)

Maynard, Matthew R; Geyer, John W; Bolch, Wesley; Aris, John P; Shifrin, Roger Y

2011-01-01

Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in

13. POLYAR, a new computer program for prediction of poly(A sites in human sequences

Directory of Open Access Journals (Sweden)

Qamar Raheel

2010-11-01

14. Dual-Modality Imaging of the Human Finger Joint Systems by Using Combined Multispectral Photoacoustic Computed Tomography and Ultrasound Computed Tomography

Directory of Open Access Journals (Sweden)

Yubin Liu

2016-01-01

We developed a homemade dual-modality imaging system that combines multispectral photoacoustic computed tomography and ultrasound computed tomography for reconstructing the structural and functional information of human finger joint systems. The fused multispectral photoacoustic-ultrasound computed tomography (MPAUCT) system was examined by phantom and in vivo experimental tests. The imaging results indicate that hard tissues such as the bones and soft tissues including the blood vessels, the tendon, the skin, and the subcutaneous tissues in the finger joint systems can be effectively recovered by using our multimodality MPAUCT system. The developed MPAUCT system is able to provide us with more comprehensive information about the human finger joints, which shows its potential for characterization and diagnosis of bone or joint diseases.

15. High School Students' Written Argumentation Qualities with Problem-Based Computer-Aided Material (PBCAM) Designed about Human Endocrine System

Science.gov (United States)

Vekli, Gülsah Sezen; Çimer, Atilla

2017-01-01

This study investigated development of students' scientific argumentation levels in the applications made with Problem-Based Computer-Aided Material (PBCAM) designed about Human Endocrine System. The case study method was used: The study group was formed of 43 students in the 11th grade of the science high school in Rize. Human Endocrine System…

16. Human-Computer Interaction and Sociological Insight: A Theoretical Examination and Experiment in Building Affinity in Small Groups

Science.gov (United States)

Oren, Michael Anthony

2011-01-01

The juxtaposition of classic sociological theory and the, relatively, young discipline of human-computer interaction (HCI) serves as a powerful mechanism for both exploring the theoretical impacts of technology on human interactions as well as the application of technological systems to moderate interactions. It is the intent of this dissertation…

17. Comparison of computational to human observer detection for evaluation of CT low dose iterative reconstruction

Science.gov (United States)

Eck, Brendan; Fahmi, Rachid; Brown, Kevin M.; Raihani, Nilgoun; Wilson, David L.

2014-03-01

Model observers were created and compared to human observers for the detection of low contrast targets in computed tomography (CT) images reconstructed with an advanced, knowledge-based, iterative image reconstruction method for low x-ray dose imaging. A 5-channel Laguerre-Gauss Hotelling Observer (CHO) was used with internal noise added to the decision variable (DV) and/or channel outputs (CO). Models were defined by parameters: (k1) DV-noise with standard deviation (std) proportional to DV std; (k2) DV-noise with constant std; (k3) CO-noise with constant std across channels; and (k4) CO-noise in each channel with std proportional to CO variance. Four-alternative forced choice (4AFC) human observer studies were performed on sub-images extracted from phantom images with and without a "pin" target. Model parameters were estimated using maximum likelihood comparison to human probability correct (PC) data. PC in human and all model observers increased with dose, contrast, and size, and was much higher for advanced iterative reconstruction (IMR) as compared to filtered back projection (FBP). Detection in IMR was better than FBP at 1/3 dose, suggesting significant dose savings. Model(k1,k2,k3,k4) gave the best overall fit to humans across independent variables (dose, size, contrast, and reconstruction) at fixed display window. However, Model(k1) performed better when considering model complexity using the Akaike information criterion. Model(k1) fit the extraordinary detectability difference between IMR and FBP, despite the different noise quality. It is anticipated that the model observer will predict results from iterative reconstruction methods having similar noise characteristics, enabling rapid comparison of methods.
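The study's headline quantity, probability correct (PC) in a 4AFC task, can be estimated by Monte Carlo for a Gaussian decision variable. The `k1` internal-noise term (DV noise with std proportional to the DV std) is modelled below in a simplified form; this is an assumed reading of Model(k1), not the paper's exact parameterisation.

```python
import numpy as np

def pc_4afc(dprime, k1=0.0, n_trials=200_000, seed=0):
    """Monte Carlo proportion correct in a 4AFC detection task.

    The decision variable (DV) has unit-variance external noise; k1 adds
    internal noise with std proportional to the DV std (simplified
    Model(k1)). The observer picks the alternative with the largest DV;
    a trial is correct when that is the signal alternative.
    """
    rng = np.random.default_rng(seed)
    std = np.sqrt(1.0 + k1 ** 2)          # external + internal DV noise
    signal = rng.normal(dprime, std, n_trials)
    noise = rng.normal(0.0, std, (n_trials, 3))
    return float(np.mean(signal > noise.max(axis=1)))
```

At fixed d′, increasing the internal noise pushes PC down toward the 4AFC chance level of 0.25, which is how internal noise lets the model match human (sub-ideal) performance.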

18. COMPUTING

CERN Multimedia

2010-01-01

Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

19. COMPUTING

CERN Multimedia

Contributions from I. Fisk

2012-01-01

Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

20. COMPUTING

CERN Multimedia

M. Kasemann

Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transfer¬ring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

1. COMPUTING

CERN Multimedia

Matthias Kasemann

Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

2. COMPUTING

CERN Multimedia

P. MacBride

The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

3. COMPUTING

CERN Multimedia

I. Fisk

2013-01-01

Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

4. COMPUTING

CERN Multimedia

I. Fisk

2012-01-01

Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

5. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

Science.gov (United States)

Lee, Seungcheol Austin; Liang, Yuhua Jake

2015-04-01

Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

6. Computer program for assessing the human dose due to stationary release of tritium

International Nuclear Information System (INIS)

2003-01-01

The computer program TriStat (Tritium dose assessment for stationary release) has been developed to assess the dose to humans assuming a stationary release of tritium as HTO and/or HT from nuclear facilities. A Gaussian dispersion model describes the behavior of HT gas and HTO vapor in the atmosphere. Tritium concentrations in soil, vegetables and forage were estimated on the basis of specific tritium concentrations in the free water component and the organic component. The uptake of contamination via food by humans was modeled by assuming a forage compartment, a vegetable component, and an animal compartment. A standardized vegetable and a standardized animal with the relative content of major nutrients, i.e. proteins, lipids and carbohydrates, representing a standard Japanese diet, were included. A standardized forage was defined in a similar manner by using the forage composition for typical farm animals. These standard feed- and foodstuffs are useful to simplify the tritium dosimetry and the food chain related to the tritium transfer to the human body. (author)
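TriStat's atmospheric step, a Gaussian dispersion model, has a standard textbook form that can be sketched directly. The ground-reflection (image source) term and parameter names below are the usual conventions, not details taken from the program itself.

```python
import math

def plume_concentration(Q, u, sigma_y, sigma_z, y, z, H):
    """Ground-reflected Gaussian plume concentration (e.g. Bq/m^3).

    Q: release rate (Bq/s); u: wind speed (m/s); sigma_y, sigma_z:
    dispersion parameters (m) evaluated at the downwind distance of
    interest; y: crosswind offset (m); z: receptor height (m);
    H: effective release height (m).
    """
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))  # image source
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The resulting air concentration of HTO/HT is what then drives the soil, forage, vegetable and animal compartments in the food-chain part of the model.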

7. Three-dimensional computer-aided human factors engineering analysis of a grafting robot.

Science.gov (United States)

Chiu, Y C; Chen, S; Wu, G J; Lin, Y H

2012-07-01

The objective of this research was to conduct a human factors engineering analysis of a grafting robot design using computer-aided 3D simulation technology. A prototype tubing-type grafting robot for fruits and vegetables was the subject of a series of case studies. To facilitate the incorporation of human models into the operating environment of the grafting robot, I-DEAS graphic software was applied to establish individual models of the grafting robot in line with Jack ergonomic analysis. Six human models (95th percentile, 50th percentile, and 5th percentile by height for both males and females) were employed to simulate the operating conditions and working postures in a real operating environment. The lower back and upper limb stresses of the operators were analyzed using the lower back analysis (LBA) and rapid upper limb assessment (RULA) functions in Jack. The experimental results showed that if a leg space is introduced under the robot, the operator can sit closer to the robot, which reduces the operator's level of lower back and upper limb stress. The proper environmental layout for Taiwanese operators for minimum levels of lower back and upper limb stress is to set the grafting operation at 23.2 cm away from the operator at a height of 85 cm and with 45 cm between the rootstock and scion units.

8. Rana computatrix to human language: towards a computational neuroethology of language evolution.

Science.gov (United States)

Arbib, Michael A

2003-10-15

Walter's Machina speculatrix inspired the name Rana computatrix for a family of models of visuomotor coordination in the frog, which contributed to the development of computational neuroethology. We offer here an 'evolutionary' perspective on models in the same tradition for rat, monkey and human. For rat, we show how the frog-like taxon affordance model provides a basis for the spatial navigation mechanisms that involve the hippocampus and other brain regions. For monkey, we recall two models of neural mechanisms for visuomotor coordination. The first, for saccades, shows how interactions between the parietal and frontal cortex augment superior colliculus seen as the homologue of frog tectum. The second, for grasping, continues the theme of parieto-frontal interactions, linking parietal affordances to motor schemas in premotor cortex. It further emphasizes the mirror system for grasping, in which neurons are active both when the monkey executes a specific grasp and when it observes a similar grasp executed by others. The model of human-brain mechanisms is based on the mirror-system hypothesis of the evolution of the language-ready brain, which sees the human Broca's area as an evolved extension of the mirror system for grasping.

9. Task analysis and computer aid development for human reliability analysis in nuclear power plants

Energy Technology Data Exchange (ETDEWEB)

Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

2001-04-01

The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative terms, is growing as the effects of human error on system safety become better recognized. HRA requires task analysis as a prerequisite step, but existing task analysis techniques have the problem that the collection of information about the situation in which a human error occurs depends entirely on the HRA analysts, making the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed structural information analysis (SIA), which helps analysts examine a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the CASIA database. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data accumulate in the CASIA database, HRA analysts will be able to share and disseminate their analysis experience, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)

10. Electrophysiological properties of computational human ventricular cell action potential models under acute ischemic conditions.

Science.gov (United States)

Dutta, Sara; Mincholé, Ana; Quinn, T Alexander; Rodriguez, Blanca

2017-10-01

Acute myocardial ischemia is one of the main causes of sudden cardiac death. The mechanisms have been investigated primarily in experimental and computational studies using different animal species, but human studies remain scarce. In this study, we assess the ability of four human ventricular action potential models (ten Tusscher and Panfilov, 2006; Grandi et al., 2010; Carro et al., 2011; O'Hara et al., 2011) to simulate key electrophysiological consequences of acute myocardial ischemia in single cell and tissue simulations. We specifically focus on evaluating the effect of extracellular potassium concentration and activation of the ATP-sensitive inward-rectifying potassium current on action potential duration, post-repolarization refractoriness, and conduction velocity, as the most critical factors in determining reentry vulnerability during ischemia. Our results show that the Grandi and O'Hara models required modifications to reproduce expected ischemic changes, specifically modifying the intracellular potassium concentration in the Grandi model and the sodium current in the O'Hara model. With these modifications, the four human ventricular cell AP models analyzed in this study reproduce the electrophysiological alterations in repolarization, refractoriness, and conduction velocity caused by acute myocardial ischemia. However, quantitative differences are observed between the models and overall, the ten Tusscher and modified O'Hara models show closest agreement to experimental data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
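One driver of the ischemic changes the paper evaluates, elevated extracellular potassium, shifts the potassium reversal potential according to the Nernst equation. A quick calculation (with an assumed textbook intracellular concentration, not a parameter taken from any of the four models) shows the resulting depolarisation of the resting membrane potential:

```python
import math

def nernst_potassium_mV(k_out_mM, k_in_mM=140.0, temp_c=37.0):
    """Potassium reversal potential E_K = (R*T/F) * ln([K]o/[K]i), in mV.

    [K]i = 140 mM and T = 37 C are assumed textbook values.
    """
    R, F = 8.314, 96485.0      # gas constant J/(mol*K), Faraday C/mol
    T = temp_c + 273.15
    return 1000.0 * (R * T / F) * math.log(k_out_mM / k_in_mM)
```

Raising [K]o from a normal 5.4 mM toward ischemic levels around 9 mM moves E_K from roughly -87 mV to roughly -73 mV, depolarising the resting potential; this partial sodium-channel inactivation underlies the slowed conduction and post-repolarization refractoriness the study examines.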

11. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

Energy Technology Data Exchange (ETDEWEB)

Eck, Brendan L.; Fahmi, Rachid; Miao, Jun [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 (United States); Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun [Philips Healthcare, Cleveland, Ohio 44143 (United States); Wilson, David L., E-mail: dlw@case.edu [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 and Department of Radiology, Case Western Reserve University, Cleveland, Ohio 44106 (United States)

2015-10-15

Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P{sub C}. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit

12. COMPUTING

CERN Multimedia

I. Fisk

2011-01-01

Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed and deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

13. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

Science.gov (United States)

Bhasin, Kul; Hayden, Jeffrey L.

2005-01-01

For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration, in the vicinity of Earth, Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

14. Imaging cellular and subcellular structure of human brain tissue using micro computed tomography

Science.gov (United States)

Khimchenko, Anna; Bikis, Christos; Schweighauser, Gabriel; Hench, Jürgen; Joita-Pacureanu, Alexandra-Teodora; Thalmann, Peter; Deyhle, Hans; Osmani, Bekim; Chicherova, Natalia; Hieber, Simone E.; Cloetens, Peter; Müller-Gerbl, Magdalena; Schulz, Georg; Müller, Bert

2017-09-01

Brain tissues have been an attractive subject for investigations in neuropathology, neuroscience, and neurobiology. Nevertheless, existing imaging methodologies have intrinsic limitations in three-dimensional (3D) label-free visualisation of extended tissue samples down to (sub)cellular level. For a long time, these morphological features were visualised by electron or light microscopies. In addition to being time-consuming, microscopic investigation includes specimen fixation, embedding, sectioning, staining, and imaging with the associated artefacts. Moreover, optical microscopy remains hampered by a fundamental limit in the spatial resolution that is imposed by the diffraction of the visible light wavefront. In contrast, various tomography approaches do not require a complex specimen preparation and can now reach a true (sub)cellular resolution. Even laboratory-based micro computed tomography in the absorption-contrast mode of formalin-fixed paraffin-embedded (FFPE) human cerebellum yields an image contrast comparable to conventional histological sections. Data of a superior image quality was obtained by means of synchrotron radiation-based single-distance X-ray phase-contrast tomography enabling the visualisation of non-stained Purkinje cells down to the subcellular level and automated cell counting. The question arises, whether the data quality of the hard X-ray tomography can be superior to optical microscopy. Herein, we discuss the label-free investigation of the human brain ultramorphology by means of synchrotron radiation-based hard X-ray magnified phase-contrast in-line tomography at the nano-imaging beamline ID16A (ESRF, Grenoble, France). As an example, we present images of an FFPE human cerebellum block. Hard X-ray tomography can provide detailed information on human tissues in health and disease with a spatial resolution below the optical limit, improving understanding of the neuro-degenerative diseases.

15. A soft-contact model for computing safety margins in human prehension.

Science.gov (United States)

Singh, Tarkeshwar; Ambike, Satyajit

2017-10-01

The soft human digit tip forms contact with grasped objects over a finite area and applies a moment about an axis normal to the area. These moments are important for ensuring stability during precision grasping. However, the contribution of these moments to grasp stability is rarely investigated in prehension studies. The more popular hard-contact model assumes that the digits exert a force vector but no free moment on the grasped object. Many sensorimotor studies use this model and show that humans estimate friction coefficients to scale the normal force to grasp objects stably, i.e. the smoother the surface, the tighter the grasp. The difference between the applied normal force and the minimal normal force needed to prevent slipping is called safety margin and this index is widely used as a measure of grasp planning. Here, we define and quantify safety margin using a more realistic contact model that allows digits to apply both forces and moments. Specifically, we adapt a soft-contact model from robotics and demonstrate that the safety margin thus computed is a more accurate and robust index of grasp planning than its hard-contact variant. Previously, we have used the soft-contact model to propose two indices of grasp planning that show how humans account for the shape and inertial properties of an object. A soft-contact based safety margin offers complementary insights by quantifying how humans may account for surface properties of the object and skin tissue during grasp planning and execution. Copyright © 2017 Elsevier B.V. All rights reserved.
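The difference between the hard-contact and soft-contact safety margins can be illustrated with a minimal sketch. The elliptic limit-surface form below is one common soft-contact approximation from the robotics literature, with an assumed torsional friction coefficient `mu_m`; it is not the study's exact formulation.

```python
import math

def safety_margin_hard(fn, ft, mu):
    # Hard contact: slipping begins when |ft| exceeds mu * fn, so the
    # minimal normal force to prevent slip is |ft| / mu.
    return fn - abs(ft) / mu

def safety_margin_soft(fn, ft, mn, mu, mu_m):
    # Soft contact: the digit also transmits a free moment mn about the
    # contact normal. One common elliptic limit-surface approximation
    # (an assumption here, not the paper's exact model):
    #   (|ft| / mu)^2 + (|mn| / mu_m)^2 <= fn^2  at the slip boundary
    fn_min = math.hypot(ft / mu, mn / mu_m)
    return fn - fn_min
```

With no normal-axis moment (`mn = 0`), the two margins coincide; any nonzero moment lowers the soft-contact margin, which is why the hard-contact model can overestimate grasp safety.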

16. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

International Nuclear Information System (INIS)

Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

2015-01-01

Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
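A hypothetical XML step element of the kind described can be sketched with Python's standard library; the tag and attribute names below are invented for illustration and are not the Idaho National Laboratory schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical CBPS step element; tag and attribute names are invented
# for illustration and are not the INL schema.
STEP_XML = """
<step id="3.1" type="decision">
  <instruction>Verify pump discharge pressure is within the band.</instruction>
  <input source="plant-data" tag="PT-101" unit="psig"/>
  <branch when="true" goto="3.2"/>
  <branch when="false" goto="3.5"/>
</step>
"""

step = ET.fromstring(STEP_XML)
# The 'type' attribute determines what functionality the CBPS generates
print(step.get("type"))                                 # decision
print([b.get("goto") for b in step.findall("branch")])  # ['3.2', '3.5']
```

Because each step carries its own attributes and child elements, the CBPS can render referential information, request a decision, or accept plant data without the procedure writer reprogramming the system.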

17. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

Energy Technology Data Exchange (ETDEWEB)

Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

2015-02-01

Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

18. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

International Nuclear Information System (INIS)

Knee, H.E.; Haas, P.M.

1985-01-01

A computer model has been developed, sensitivity tested, and evaluated that is capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures including mean time for successful task performance by a maintenance team and maintenance team probability of task success. These two measures are particularly important for input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model along with its generous input parameters and output variables allows its usefulness to extend beyond its input to PRA
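The task-oriented, simulation-type structure of a tool like MAPPS can be illustrated with a toy Monte Carlo model; the subtask count, success probability, and time distribution below are invented for illustration and are not MAPPS parameters.

```python
import random

def simulate_maintenance_task(n_trials=10000, p_subtask=0.95, n_subtasks=8,
                              t_mean=6.0, t_sd=1.5, seed=1):
    # Toy task-oriented simulation in the spirit of MAPPS (not its model):
    # a maintenance task succeeds only if every subtask succeeds, and task
    # time is the sum of per-subtask durations. All parameters are
    # illustrative assumptions.
    rng = random.Random(seed)
    times, successes = [], 0
    for _ in range(n_trials):
        if all(rng.random() < p_subtask for _ in range(n_subtasks)):
            successes += 1
            times.append(sum(max(0.0, rng.gauss(t_mean, t_sd))
                             for _ in range(n_subtasks)))
    p_success = successes / n_trials          # team probability of task success
    mean_time = sum(times) / len(times)       # mean time for successful tasks
    return p_success, mean_time
```

Repeated trials yield exactly the two PRA-relevant outputs the record highlights: the probability of task success and the mean time for successful task performance.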

19. Human-computer interaction for alert warning and attention allocation systems of the multimodal watchstation

Science.gov (United States)

Obermayer, Richard W.; Nugent, William A.

2000-11-01

The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.

20. Single photon emission computed tomography study of human pulmonary perfusion: preliminary findings

Energy Technology Data Exchange (ETDEWEB)

Carratu, L; Sofia, M [Naples Univ. (Italy). Facolta di Medicina e Chirurgia; Salvatore, M; Muto, P; Ariemma, G [Istituto Nazionale per la Prevenzione, Lo Studio e La Cura dei Tumori Fondazione Pascale, Naples (Italy); Lopez-Majano, V [Cook County Hospital, Chicago, IL (USA). Nuclear Medicine Div.

1984-02-01

Single photon emission computed tomography (SPECT) was performed with 99mTc-albumin macroaggregates to study human pulmonary perfusion in healthy subjects and patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) and lung neoplasms. The reconstructed SPECT data were displayed in coronal, transverse, and sagittal plane sections and compared to conventional perfusion scans. The SPECT data gave more detailed anatomical information about the extent of damage and morphology of the pulmonary vascular bed. In healthy subjects and COPD patients, qualitative and quantitative assessment of pulmonary perfusion could be obtained from serial SPECT scans with respect to distribution and relative concentration of the injected radiopharmaceutical. Furthermore, SPECT of pulmonary perfusion has been useful in detecting the extent of damage to the pulmonary circulation. This is useful for the preoperative evaluation and staging of lung cancer.

1. A Single Camera Motion Capture System for Human-Computer Interaction

Science.gov (United States)

This paper presents a method for markerless human motion capture using a single camera. It uses tree-based filtering to efficiently propagate a probability distribution over poses of a 3D body model. The pose vectors and associated shapes are arranged in a tree, which is constructed by hierarchical pairwise clustering, in order to efficiently evaluate the likelihood in each frame. A new likelihood function based on silhouette matching is proposed that improves the pose estimation of thinner body parts, i.e., the limbs. The dynamic model takes self-occlusion into account by increasing the variance of occluded body parts, thus allowing for recovery when the body part reappears. We present two applications of our method that work in real-time on a Cell Broadband Engine™: a computer game and a virtual clothing application.

2. Human thyroid specimen imaging by fluorescent x-ray computed tomography with synchrotron radiation

Science.gov (United States)

Takeda, Tohoru; Yu, Quanwen; Yashiro, Toru; Yuasa, Tetsuya; Hasegawa, Yasuo; Itai, Yuji; Akatsuka, Takao

1999-09-01

Fluorescent x-ray computed tomography (FXCT) is being developed to detect non-radioactive contrast materials in living specimens. The FXCT system consists of a silicon (111) channel-cut monochromator, an x-ray slit and a collimator for fluorescent x-ray detection, a scanning table for the target organ, and an x-ray detector for fluorescent and transmission x rays. To reduce Compton scattering overlapping the fluorescent Kα line, the incident monochromatic x-ray energy was set at 37 keV. The FXCT clearly imaged a human thyroid gland and iodine content was estimated quantitatively. In a case of hyperthyroidism, the two-dimensional distribution of iodine content was not uniform, and thyroid cancer had a small amount of iodine. FXCT can be used to detect iodine within the thyroid gland quantitatively and to delineate its distribution.

3. Application of computer-assisted imaging technology in human musculoskeletal joint research

Directory of Open Access Journals (Sweden)

Xudong Liu

2014-01-01

Full Text Available Computer-assisted imaging analysis technology has been widely used in the musculoskeletal joint biomechanics research in recent years. Imaging techniques can accurately reconstruct the anatomic features of the target joint and reproduce its in vivo motion characters. The data has greatly improved our understanding of normal joint function, joint injury mechanism, and surgical treatment, and can provide foundations for using reverse-engineering methods to develop biomimetic artificial joints. In this paper, we systematically reviewed the investigation of in vivo kinematics of the human knee, shoulder, lumbar spine, and ankle using advanced imaging technologies, especially those using a dual fluoroscopic imaging system (DFIS. We also briefly discuss future development of imaging analysis technology in musculoskeletal joint research.

4. Histogram analysis for age change of human lung with computed tomography

International Nuclear Information System (INIS)

Shirabe, Ichiju

1990-01-01

In order to evaluate physiological changes of the normal lung with aging by computed tomography (CT), the peak position (PP) and full width at half maximum (FWHM) of the CT histogram were studied in 77 normal human lungs. Above 30 years of age, PP tended to shift toward lower attenuation values with advancing age, yielding the following equation: CT attenuation value of PP = -0.87 x age - 815. The peak position reached the range of highest CT attenuation in subjects in their 30s. FWHM did not change with advancing age. There were no differences in peak value and FWHM among the upper, middle, and lower lung fields. In this study, physiological changes of the lung were evaluated quantitatively. Furthermore, this study is considered to be useful for diagnosis and treatment of lung diseases. (author)
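The reported regression for the histogram peak position above age 30 can be expressed directly:

```python
def predicted_peak_position(age):
    # Regression reported in the record for subjects above 30 years:
    #   CT attenuation value of PP (HU) = -0.87 * age - 815
    return -0.87 * age - 815

# e.g. a 60-year-old subject: roughly -867 HU
pp_60 = predicted_peak_position(60)
```

The negative slope captures the shift of the histogram peak toward lower attenuation values with advancing age, about 0.87 HU per year.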

5. Accuracy of computer-guided implantation in a human cadaver model.

Science.gov (United States)

Yatzkair, Gustavo; Cheng, Alice; Brodie, Stan; Raviv, Eli; Boyan, Barbara D; Schwartz, Zvi

2015-10-01

To examine the accuracy of computer-guided implantation using a human cadaver model with reduced experimental variability. Twenty-eight (28) dental implants representing 12 clinical cases were placed in four cadaver heads using a static guided implantation template. All planning and surgeries were performed by one clinician. All radiographs and measurements were performed by two examiners. The distance of the implants from buccal and lingual bone and mesial implant or tooth was analyzed at the apical and coronal levels, and measurements were compared to the planned values. No significant differences were seen between planned and implanted measurements. Average deviation of an implant from its planning radiograph was 0.8 mm, which is within the range of variability expected from CT analysis. Guided implantation can be used safely with a margin of error of 1 mm. © 2014 The Authors. Clinical Oral Implants Research Published by John Wiley & Sons Ltd.

6. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

Science.gov (United States)

Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

2017-04-01

We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

7. Monochromatic computed tomography of the human brain using synchrotron x rays: Technical feasibility

International Nuclear Information System (INIS)

Nachaliel, E.; Dilmanian, F.A.; Garrett, R.F.; Thomlinson, W.C.; Chapman, L.D.; Gmuer, N.F.; Lazarz, N.M.; Moulin, H.R.; Rivers, M.L.; Rarback, H.; Stefan, P.M.; Spanne, P.; Luke, P.N.; Pehl, R.; Thompson, A.C.; Miller, M.

1991-01-01

A monochromatic computed tomography (CT) scanner is being developed at the X17 superconducting wiggler beamline at the National Synchrotron Light Source (NSLS), Brookhaven National Laboratory, to image the human head and neck. The system configuration is one of a horizontal fan beam and an upright seated rotating subject. The purposes of the project are to demonstrate improvement in the image contrast and in the image quantitative accuracy that can be obtained in monochromatic CT and to apply the system to specific clinical research programs in neuroradiology. This paper describes the first phantom studies carried out with a prototype system, using the dual photon absorptiometry (DPA) method at energies of 20 and 39 keV. The results show that improvements in image contrast and quantitative accuracy are possible with monochromatic DPA CT. Estimates of the clinical performance of the planned CT system are made on the basis of these initial results.

8. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

DEFF Research Database (Denmark)

Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

2018-01-01

Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library collected using keyword search to determine prevailing approaches, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term ‘creativity’; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

9. Radiotherapy infrastructure and human resources in Switzerland : Present status and projected computations for 2020.

Science.gov (United States)

Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

2016-09-01

10. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

International Nuclear Information System (INIS)

Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

2016-01-01

11. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

Directory of Open Access Journals (Sweden)

Deyan Luan

2007-07-01

Full Text Available The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
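The overall state sensitivity calculation can be illustrated with a generic Monte Carlo sketch: sample parameter sets around nominal values, take central differences with respect to a relative perturbation of each parameter, and average the magnitudes. The toy model and sampling range below are assumptions for illustration, not the 92-protein coagulation model of the paper.

```python
import numpy as np

def mc_sensitivity(model, p_nominal, n_samples=200, delta=0.05, seed=0):
    # Monte Carlo overall sensitivity coefficients: sample parameter sets
    # around the nominal values, apply central differences with respect to
    # a relative perturbation of each parameter, and average the
    # magnitudes. Sampling range is an illustrative assumption.
    rng = np.random.default_rng(seed)
    p_nominal = np.asarray(p_nominal, dtype=float)
    sens = np.zeros_like(p_nominal)
    for _ in range(n_samples):
        p = p_nominal * rng.uniform(0.5, 2.0, p_nominal.size)
        for j in range(p.size):
            up, dn = p.copy(), p.copy()
            up[j] *= 1 + delta
            dn[j] *= 1 - delta
            sens[j] += abs(model(up) - model(dn)) / (2 * delta)
    return sens / n_samples

# Toy three-parameter cascade: output dominated by p[0], nearly
# insensitive to p[2] -- in the paper's terms, p[0] is a point of
# fragility while p[2] is robust.
toy = lambda p: p[0] * p[1] / (p[1] + 1.0) + 0.01 * p[2]
s = mc_sensitivity(toy, [1.0, 1.0, 1.0])
```

Ranking the averaged coefficients separates fragile mechanisms (candidate therapeutic targets) from robust ones, which is the screening logic the record describes.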

12. COMPUTING

CERN Multimedia

M. Kasemann

CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

13. Integrated multimodal human-computer interface and augmented reality for interactive display applications

Science.gov (United States)

Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

2000-08-01

We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eyetracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
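The 'modality server' pattern, in which each interaction modality is wrapped behind a socket and exposed through a high-level interface, can be sketched minimally as follows; the JSON request/response format and the speech-recognition stand-in handler are assumptions for illustration, not the MMWS implementation.

```python
import json
import socket
import threading

def start_modality_server(handler):
    # Minimal 'modality server' sketch: one capability wrapped behind a
    # socket, exposing a high-level request/response interface so clients
    # stay independent of the underlying software package.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))            # ephemeral port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve_one():
        conn, _ = srv.accept()
        with conn:
            req = b""
            while chunk := conn.recv(4096):   # read until client shuts down
                req += chunk
            conn.sendall(json.dumps(handler(json.loads(req.decode()))).encode())
        srv.close()

    threading.Thread(target=serve_one, daemon=True).start()
    return port

# A hypothetical speech-recognition modality behind the generic interface
port = start_modality_server(lambda req: {"transcript": "ok", "echo": req["cmd"]})

cli = socket.socket()
cli.connect(("127.0.0.1", port))
cli.sendall(json.dumps({"cmd": "recognize"}).encode())
cli.shutdown(socket.SHUT_WR)              # signal end of request
reply = json.loads(cli.recv(4096).decode())
cli.close()
```

Because the client speaks only the high-level interface, a better recognizer can be swapped in behind the server without any client-side changes, which is the stated motivation for the architecture.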

14. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

Directory of Open Access Journals (Sweden)

Andrea Stocco

2018-04-01

Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

15. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

Science.gov (United States)

Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

2018-04-01

This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.
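A full-grid search of a model's parameter space, as described in the record, can be sketched generically; the parameter names and scoring function below are invented for illustration and are not the paper's model code.

```python
import itertools

def grid_search(param_space, score):
    # Exhaustive sweep over the Cartesian product of parameter values,
    # keeping the best-scoring combination.
    best, best_s = None, float("-inf")
    for combo in itertools.product(*param_space.values()):
        params = dict(zip(param_space.keys(), combo))
        s = score(params)
        if s > best_s:
            best, best_s = params, s
    return best, best_s

# Hypothetical two-parameter space with a known optimum at (0.2, 1.0)
space = {"alpha": [0.1, 0.2, 0.4], "temperature": [0.5, 1.0, 2.0]}
best, best_score = grid_search(
    space,
    lambda p: -(p["alpha"] - 0.2) ** 2 - (p["temperature"] - 1.0) ** 2,
)
```

Because each grid point is scored independently, the product can be partitioned into chunks and evaluated in parallel, which is how the repository's parallelization code works in spirit.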

16. USING OLFACTORY DISPLAYS AS A NONTRADITIONAL INTERFACE IN HUMAN COMPUTER INTERACTION

Directory of Open Access Journals (Sweden)

Alper Efe

2017-07-01

Full Text Available Smell has its limitations and disadvantages as a display medium, but it also has its strengths, and many have recognized its potential. At present, in communications and virtual technologies, smell is either forgotten or improperly stimulated, because uncontrolled odorants are present in the physical space surrounding the user. Nonetheless, a controlled presentation of olfactory information can give advantages in various application fields. Therefore, two enabling technologies, electronic noses and especially olfactory displays, are reviewed. Scenarios of usage are discussed together with relevant psycho-physiological issues, and end-to-end systems including olfactory interfaces are characterised quantitatively in many respects. Recent work by the authors in the field is reported. The article touches briefly on the control of scent emissions, an important factor to consider when building scented computer systems, and the SUBSMELL system is investigated as a sample application. A look at areas of human-computer interaction where olfactory output may prove useful is presented. Notably, the addition of olfactory cues to a virtual environment has been found to increase the user's sense of presence and memory of the environment. The article also discusses the educational aspect of SUBSMELL systems, and finishes with some brief conclusions and a discussion of shortcomings and gaps in the topic.

17. Flat panel computed tomography of human ex vivo heart and bone specimens: initial experience

Energy Technology Data Exchange (ETDEWEB)

Nikolaou, Konstantin; Becker, Christoph R.; Reiser, Maximilian F. [Ludwig-Maximilians-University, Department of Clinical Radiology, Munich (Germany); Flohr, Thomas; Stierstorfer, Karl [CT Division, Siemens Medical Solutions, Forchheim (Germany)

2005-02-01

The aim of this technical investigation was the detailed description of a prototype flat panel detector computed tomography system (FPCT) and its initial evaluation in an ex vivo setting. The prototype FPCT scanner consists of a conventional radiographic flat panel detector, mounted on a multi-slice CT scanner gantry. Explanted human ex vivo heart and foot specimens were examined. Images were reformatted with various reconstruction algorithms and were evaluated for high-resolution anatomic information. For comparison purposes, the ex vivo specimens were also scanned with a conventional 16-detector-row CT scanner (Sensation 16, Siemens Medical Solutions, Forchheim, Germany). With the FPCT prototype used, a 1,024 x 768 resolution matrix can be obtained, resulting in an isotropic voxel size of 0.25 x 0.25 x 0.25 mm at the iso-center. Due to the high spatial resolution, very small structures such as trabecular bone or third-degree, distal branches of coronary arteries could be visualized. This first evaluation showed that flat panel detector systems can be used in a cone-beam computed tomography scanner and that very high spatial resolutions can be achieved. However, there are limitations for in vivo use due to constraints in low contrast resolution and slow scan speed. (orig.)

18. The Dimensions of the Orbital Cavity Based on High-Resolution Computed Tomography of Human Cadavers

DEFF Research Database (Denmark)

Felding, Ulrik Ascanius; Bloch, Sune Land; Buchwald, Christian von

2016-01-01

for surface area. To authors' knowledge, this study is the first to have measured the entire surface area of the orbital cavity.The volume and surface area of the orbital cavity were estimated in computed tomography scans of 11 human cadavers using unbiased stereological sampling techniques. The mean (± SD......) total volume and total surface area of the orbital cavities was 24.27 ± 3.88 cm and 32.47 ± 2.96 cm, respectively. There was no significant difference in volume (P = 0.315) or surface area (P = 0.566) between the 2 orbital cavities.The stereological technique proved to be a robust and unbiased method...... that may be used as a gold standard for comparison with automated computer software. Future imaging studies in blow-out fracture patients may be based on individual and relative calculation involving both herniated volume and fractured surface area in relation to the total volume and surface area...

19. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.

Science.gov (United States)

Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

2017-06-23

Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles.
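A real-time classifier over a two-channel (horizontal/vertical) forehead EOG can be sketched as a simple amplitude-threshold rule. The threshold value and the coarse label set below are illustrative assumptions, not the algorithm parameters used in the paper.

```python
# Minimal sketch of threshold-based eye-movement classification from two
# EOG channels, assuming peak amplitudes in microvolts. The 50 uV threshold
# is hypothetical; the paper derives its rules from forehead EOG morphology.
THRESHOLD_UV = 50.0

def classify(h_amp, v_amp):
    """Return a coarse eye-movement label from peak channel amplitudes."""
    if abs(h_amp) < THRESHOLD_UV and abs(v_amp) < THRESHOLD_UV:
        return "fixation"
    if abs(h_amp) >= abs(v_amp):          # horizontal channel dominates
        return "right" if h_amp > 0 else "left"
    return "up" if v_amp > 0 else "down"  # vertical channel dominates

print(classify(80.0, 10.0))   # right
print(classify(5.0, -70.0))   # down
```

A deployed system would add band-pass filtering, drift removal, and debouncing before thresholding, which this sketch omits.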

20. Human-Computer Systems Interaction: Backgrounds and Applications 2, Part 2

CERN Document Server

Kulikowski, Juliusz; Mroczek, Teresa

2012-01-01

This volume of the book contains a collection of chapters selected from the papers which originally (in shortened form) were presented at the 3rd International Conference on Human-Systems Interaction held in Rzeszow, Poland, in 2010. The chapters are divided into four sections concerning: IV. Environment monitoring and robotic systems, V. Diagnostic systems, VI. Educational systems, and VII. General problems. The novel concepts and realizations of humanoid robots, talking robots and orthopedic surgical robots, as well as those of direct brain-computer interfaces, are examples of particularly interesting topics presented in Sec. IV. In Sec. V the problems of skin cancer recognition, colonoscopy diagnosis, and brain stroke diagnosis, as well as the more general problem of ontology design for medical diagnostic knowledge, are presented. An example of an industrial diagnostic system and a concept of a new algorithm for edge detection in computer-analyzed images are also presented in this section. Among the edu...

1. Direct Monte Carlo dose calculation using polygon-surface computational human model

International Nuclear Information System (INIS)

Jeong, Jong Hwi; Kim, Chan Hyeong; Yeom, Yeon Su; Cho, Sungkoo; Chung, Min Suk; Cho, Kun-Woo

2011-01-01

In the present study, a voxel-type computational human model was converted to a polygon-surface model, after which it was imported directly to the Geant4 code without using a voxelization process, that is, without converting back to a voxel model. The original voxel model was also imported to the Geant4 code, in order to compare the calculated dose values and the computational speed. The average polygon size of the polygon-surface model was ∼0.5 cm², whereas the voxel resolution of the voxel model was 1.981 × 1.981 × 2.0854 mm³. The results showed a good agreement between the calculated dose values of the two models. The polygon-surface model was, however, slower than the voxel model by a factor of 6–9 for the photon energies and irradiation geometries considered in the present study, which nonetheless is considered acceptable, considering that direct use of the polygon-surface model does not require a separate voxelization process. (author)

2. Neural and cortisol responses during play with human and computer partners in children with autism

Science.gov (United States)

Edmiston, Elliot Kale; Merkle, Kristen

2015-01-01

Children with autism spectrum disorder (ASD) exhibit impairment in reciprocal social interactions, including play, which can manifest as failure to show social preference or discrimination between social and nonsocial stimuli. To explore mechanisms underlying these deficits, we collected salivary cortisol from 42 children aged 8–12 years with ASD or typical development during a playground interaction with a confederate child. Participants underwent functional MRI during a prisoner’s dilemma game requiring cooperation or defection with a human (confederate) or computer partner. Region of interest analyses were based on previous research (e.g. insula, amygdala, temporal parietal junction—TPJ). There were significant group differences in neural activation based on partner and response pattern. When playing with a human partner, children with ASD showed limited engagement of a social salience brain circuit during defection. Reduced insula activation during defection in the ASD children relative to TD children, regardless of partner type, was also a prominent finding. Insula and TPJ BOLD during defection was also associated with stress responsivity and behavior in the ASD group under playground conditions. Children with ASD engage social salience networks less than TD children during conditions of social salience, supporting a fundamental disturbance of social engagement. PMID:25552572

3. X-ray micro computed tomography for the visualization of an atherosclerotic human coronary artery

Science.gov (United States)

Matviykiv, Sofiya; Buscema, Marzia; Deyhle, Hans; Pfohl, Thomas; Zumbuehl, Andreas; Saxer, Till; Müller, Bert

2017-06-01

Atherosclerosis refers to narrowing or blocking of blood vessels that can lead to a heart attack, chest pain or stroke. Constricted segments of diseased arteries exhibit considerably increased wall shear stress compared to healthy ones. One possibility for improving patients' treatment is the application of nano-therapeutic approaches based on shear-stress-sensitive nano-containers. In order to tailor the chemical composition and the resulting physical properties of such liposomes, the morphology of critically stenosed arteries must be known precisely at micrometre resolution. This information is often obtained by means of histology, which has the drawback of offering only two-dimensional information. Additionally, histology requires the artery to be decalcified before sectioning, which might introduce deformations within the tissue. Micro computed tomography (μCT) enables the three-dimensional (3D) visualization of soft and hard tissues at the micrometre level. μCT allows lumen segmentation, which is crucial for subsequent flow simulation analysis. In this communication, tomographic images of a human coronary artery before and after decalcification are qualitatively and quantitatively compared. We analyse the cross section of the diseased human coronary artery before and after decalcification, and calculate the lumen area of both samples.
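Once the lumen is segmented in a μCT slice, its cross-sectional area follows directly from the pixel count and the known pixel size. This is a minimal sketch with a toy binary mask (1 = lumen) and an assumed pixel size, not the actual segmentation pipeline from the study.

```python
# Sketch: lumen cross-sectional area from one segmented uCT slice.
# Assumes a binary mask (1 = lumen pixel) and an isotropic pixel size.
def lumen_area_mm2(mask, pixel_size_mm):
    """Area = number of lumen pixels x area of a single pixel."""
    n_pixels = sum(sum(row) for row in mask)
    return n_pixels * pixel_size_mm ** 2

# Toy 3x4 mask with 8 lumen pixels; a real slice would be far larger.
mask = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]
print(lumen_area_mm2(mask, 0.01))  # 8 pixels x (0.01 mm)^2 = 0.0008 mm^2
```

Summing such slice areas times the slice spacing gives a lumen volume, which is the quantity fed to flow simulations.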

4. Computational Fluid Dynamics Ventilation Study for the Human Powered Centrifuge at the International Space Station

Science.gov (United States)

Son, Chang H.

2012-01-01

The Human Powered Centrifuge (HPC) is a facility that is planned to be installed on board the International Space Station (ISS) to enable crew exercises under artificial gravity conditions. The HPC equipment includes a "bicycle" for long-term exercises of a crewmember that provides power for rotation of the HPC at a speed of 30 rpm. A crewmember exercising vigorously on the centrifuge generates about twice as much carbon dioxide as a crewmember under ordinary conditions. The goal of the study is to analyze the airflow and carbon dioxide distribution within the Pressurized Multipurpose Module (PMM) cabin when the HPC is operating. A full unsteady formulation is used for airflow and CO2 transport CFD-based modeling with the so-called sliding mesh concept: the HPC equipment with the adjacent Bay 4 cabin volume is considered in the rotating reference frame while the rest of the cabin volume is considered in the stationary reference frame. The rotating part of the computational domain also includes a human body model. Localized effects of carbon dioxide dispersion are examined, and the strong influence of the rotating HPC equipment on the detected CO2 distribution is discussed.

5. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

Science.gov (United States)

Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

2015-01-01

Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

6. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

Science.gov (United States)

Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

2013-01-01

Proteins can move from blood circulation into salivary glands through active transportation, passive diffusion or ultrafiltration, some of which are then released into saliva and hence can potentially serve as biomarkers for diseases if accurately identified. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physicochemical and sequence features we found to discriminate between human proteins known to be movable from circulation to saliva and proteins deemed not to be in saliva. A classifier was trained based on these features using a support-vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data may not be highly reliable (i.e., proteins predicted not to be in saliva), we have also trained a ranking method, aiming to rank the known salivary proteins from circulation as the highest among the proteins in the general background, based on the same features. This prediction capability can be used to predict potential biomarker proteins for specific human diseases when coupled with the information of differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552
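The 10-fold cross-validation protocol behind the recall/precision figures above can be sketched as follows. The fold-splitting and the metric definitions are standard; the feature extraction and the support-vector machine itself are left out, so this shows only the evaluation scaffolding.

```python
# Sketch of k-fold evaluation with precision/recall metrics, as used to
# score the saliva-secretion classifier. The SVM and features are assumed
# to come from the reader's toolkit and are not reproduced here.

def k_fold_indices(n, k=10):
    """Yield (train, test) index lists for k roughly equal interleaved folds."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test

def precision_recall(y_true, y_pred):
    """Binary precision and recall (positive class = 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

print(precision_recall([1, 1, 0, 0], [1, 0, 1, 0]))  # (0.5, 0.5)
```

Averaging the per-fold precision and recall over the 10 folds yields the summary numbers reported in the abstract.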

7. Neural mechanisms of transient neocortical beta rhythms: Converging evidence from humans, computational modeling, monkeys, and mice

Science.gov (United States)

Sherman, Maxwell A.; Lee, Shane; Law, Robert; Haegens, Saskia; Thorn, Catherine A.; Hämäläinen, Matti S.; Moore, Christopher I.; Jones, Stephanie R.

2016-01-01

Human neocortical 15–29-Hz beta oscillations are strong predictors of perceptual and motor performance. However, the mechanistic origin of beta in vivo is unknown, hindering understanding of its functional role. Combining human magnetoencephalography (MEG), computational modeling, and laminar recordings in animals, we present a new theory that accounts for the origin of spontaneous neocortical beta. In our MEG data, spontaneous beta activity from somatosensory and frontal cortex emerged as noncontinuous beta events typically lasting <150 ms. Modeling showed that such events could emerge from the integration of nearly synchronous bursts of excitatory synaptic drive targeting proximal and distal dendrites of pyramidal neurons, where the defining feature of a beta event was a strong distal drive that lasted one beta period (∼50 ms). This beta mechanism rigorously accounted for the beta event profiles; several other mechanisms did not. The spatial location of synaptic drive in the model to supragranular and infragranular layers was critical to the emergence of beta events and led to the prediction that beta events should be associated with a specific laminar current profile. Laminar recordings in somatosensory neocortex from anesthetized mice and awake monkeys supported these predictions, suggesting this beta mechanism is conserved across species and recording modalities. These findings make several predictions about optimal states for perceptual and motor performance and guide causal interventions to modulate beta for optimal function. PMID:27469163

8. Computed aided system for separation and classification of the abnormal erythrocytes in human blood

Science.gov (United States)

Wąsowicz, Michał; Grochowski, Michał; Kulka, Marek; Mikołajczyk, Agnieszka; Ficek, Mateusz; Karpieńko, Katarzyna; Cićkiewicz, Maciej

2017-12-01

The human peripheral blood consists of cells (red cells, white cells, and platelets) suspended in plasma. In the following research the team assessed the influence of nanodiamond particles on blood elements over various periods of time. The material used in the study consisted of samples taken from ten healthy humans of various ages, different blood types and both sexes. The measurements were carried out by adding unmodified and oxidation-modified nanodiamonds to the blood. The blood was exposed to two nanodiamond doses: 20 μl and 100 μl. The number of abnormal cells increased with time, but the percentage of echinocytes resulting from interaction with nanodiamonds at the various time intervals for individual specimens was scarce, and the impact of the two diamond types had no clinical importance for red blood cells. It is supposed that long-lasting exposure leads to dehydration of the red cells, owing to the function of these cells. The analysis of the influence of nanodiamond particles on blood elements was supported by a computer system designed for automatic counting and classification of red blood cells (RBCs). The system utilizes advanced image processing methods for RBC separation and counting, and the Eigenfaces method coupled with neural networks for classifying RBCs into normal and abnormal cells.
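The Eigenfaces idea mentioned above amounts to projecting flattened cell images onto the top principal components of the training set and classifying in that low-dimensional subspace. The sketch below uses random stand-in data; the actual RBC images, component count, and downstream neural network are not reproduced here.

```python
import numpy as np

# Sketch of Eigenfaces-style feature extraction for cell-image patches:
# centre the flattened images and project onto the top principal
# directions obtained via SVD ("eigen-cells").
def eigen_features(images, n_components=2):
    """images: (n_samples, n_pixels) array. Returns projections, mean, basis."""
    mean = images.mean(axis=0)
    centered = images - mean
    # Rows of vt are the principal directions of the image set.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    return centered @ basis.T, mean, basis

rng = np.random.default_rng(0)
imgs = rng.normal(size=(6, 16))  # six 4x4 "cell" patches, flattened
proj, mean, basis = eigen_features(imgs)
print(proj.shape)   # (6, 2): 2 features per patch
print(basis.shape)  # (2, 16)
```

The resulting low-dimensional features would then be fed to a classifier (a neural network in the paper's system) to separate normal cells from echinocytes and other abnormal shapes.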

9. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

Science.gov (United States)

Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

2015-09-01

Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.

10. Comparison between a Computational Seated Human Model and Experimental Verification Data

Directory of Open Access Journals (Sweden)

Christian G. Olesen

2014-01-01

Full Text Available Sitting-acquired deep tissue injuries (SADTI) are the most serious type of pressure ulcers. In order to investigate the aetiology of SADTI a new approach is under development: a musculo-skeletal model which can predict forces between the chair and the human body at different seated postures. This study focuses on comparing results from a model developed in the AnyBody Modeling System with data collected from an experimental setup. A chair with force-measuring equipment was developed, an experiment was conducted with three subjects, and the experimental results were compared with the predictions of the computational model. The results show that the model predicted the reaction forces for different chair postures well: the correlation coefficients between experiment and model for the seat angle, backrest angle and footrest height were 0.93, 0.96, and 0.95, respectively. The study shows good agreement between experimental data and model predictions of the forces between a human body and a chair. The model can in the future be used in designing wheelchairs or automotive seats.
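The agreement check reported above is a Pearson correlation between measured and predicted reaction forces. A minimal implementation, with illustrative force values rather than the study's data:

```python
# Sketch of the model-vs-experiment agreement metric: Pearson correlation
# between predicted and measured seat reaction forces. The force values
# below are illustrative, not the experimental data.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

measured  = [120.0, 150.0, 180.0, 210.0]   # N, hypothetical
predicted = [118.0, 155.0, 176.0, 214.0]   # N, hypothetical
print(round(pearson(measured, predicted), 3))  # 0.994
```

Values near 1.0, like the 0.93–0.96 coefficients in the study, indicate the model tracks the measured forces closely across postures.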

11. Resistance to change and resurgence in humans engaging in a computer task.

Science.gov (United States)

Kuroda, Toshikazu; Cançado, Carlos R X; Podlesnik, Christopher A

2016-04-01

The relation between persistence, as measured by resistance to change, and resurgence has been examined with nonhuman animals but not systematically with humans. The present study examined persistence and resurgence with undergraduate students engaging in a computer task for points exchangeable for money. In Phase 1, a target response was maintained on a multiple variable-interval (VI) 15-s (Rich) VI 60-s (Lean) schedule of reinforcement. In Phase 2, the target response was extinguished while an alternative response was reinforced at equal rates in both schedule components. In Phase 3, the target and the alternative responses were extinguished. In an additional test of persistence (Phase 4), target responding was reestablished as in Phase 1 and then disrupted by access to videos in both schedule components. In Phases 2 and 4, target responding was more persistent in the Rich than in the Lean component. Also, resurgence generally was greater in the Rich than in the Lean component in Phase 3. The present findings with humans extend the generality of those obtained with nonhuman animals showing that higher reinforcement rates produce both greater persistence and resurgence, and suggest that common processes underlie response persistence and relapse. Copyright © 2016 Elsevier B.V. All rights reserved.

12. Population of 224 realistic human subject-based computational breast phantoms

Energy Technology Data Exchange (ETDEWEB)

2016-01-15

Purpose: To create a database of highly realistic and anatomically variable 3D virtual breast phantoms based on dedicated breast computed tomography (bCT) data. Methods: A tissue classification and segmentation algorithm was used to create realistic and detailed 3D computational breast phantoms based on 230+ dedicated bCT datasets from normal human subjects. The breast volume was identified using a coarse three-class fuzzy C-means segmentation algorithm which accounted for and removed motion blur at the breast periphery. Noise in the bCT data was reduced through application of a postreconstruction 3D bilateral filter. A 3D adipose nonuniformity (bias field) correction was then applied followed by glandular segmentation using a 3D bias-corrected fuzzy C-means algorithm. Multiple tissue classes were defined including skin, adipose, and several fractional glandular densities. Following segmentation, a skin mask was produced which preserved the interdigitated skin, adipose, and glandular boundaries of the skin interior. Finally, surface modeling was used to produce digital phantoms with methods complementary to the XCAT suite of digital human phantoms. Results: After rejecting some datasets due to artifacts, 224 virtual breast phantoms were created which emulate the complex breast parenchyma of actual human subjects. The volume breast density (with skin) ranged from 5.5% to 66.3% with a mean value of 25.3% ± 13.2%. Breast volumes ranged from 25.0 to 2099.6 ml with a mean value of 716.3 ± 386.5 ml. Three breast phantoms were selected for imaging with digital compression (using finite element modeling) and simple ray-tracing, and the results show promise in their potential to produce realistic simulated mammograms. Conclusions: This work provides a new population of 224 breast phantoms based on in vivo bCT data for imaging research. Compared to previous studies based on only a few prototype cases, this dataset provides a rich source of new cases spanning a wide range
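The fuzzy C-means step used twice in the pipeline above alternates between soft membership updates and weighted centroid updates. This is a minimal 1-D sketch (the paper uses three-class, bias-corrected 3D variants) on toy intensities, with the standard fuzzifier m = 2.

```python
import numpy as np

# Minimal 1-D fuzzy C-means sketch: only the core membership/centroid
# updates, on toy data. Deterministic initialization across the data range.
def fuzzy_cmeans(x, k=2, m=2.0, iters=50):
    centers = np.linspace(x.min(), x.max(), k)
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12  # (k, n) distances
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=0)                          # soft memberships
        um = u ** m
        centers = (um * x).sum(axis=1) / um.sum(axis=1)    # weighted means
    return np.sort(centers), u

# Two well-separated "tissue intensity" clusters around 0.1 and 5.1.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
centers, memberships = fuzzy_cmeans(x)
print(np.round(centers, 2))  # approximately [0.1, 5.1]
```

In the phantom pipeline the same update runs over 3D voxel intensities with a bias-field term folded into the distance computation.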

13. Computational dosimetry for grounded and ungrounded human models due to contact current

International Nuclear Information System (INIS)

Chan, Kwok Hung; Hattori, Junya; Laakso, Ilkka; Hirata, Akimasa; Taki, Masao

2013-01-01

This study presents the computational dosimetry of contact currents for grounded and ungrounded human models. The uncertainty of the quasi-static (QS) approximation of the in situ electric field induced in a grounded/ungrounded human body due to the contact current is first estimated. Different scenarios of cylindrical and anatomical human body models are considered, and the results are compared with the full-wave analysis. In the QS analysis, the induced field in the grounded cylindrical model is calculated by the QS finite-difference time-domain (QS-FDTD) method, and compared with the analytical solution. Because no analytical solution is available for the grounded/ungrounded anatomical human body model, the results of the QS-FDTD method are then compared with those of the conventional FDTD method. The upper frequency limit for the QS approximation in the contact current dosimetry is found to be 3 MHz, with a relative local error of less than 10%. The error increases above this frequency, which can be attributed to the neglect of the displacement current. The QS or conventional FDTD method is used for the dosimetry of induced electric field and/or specific absorption rate (SAR) for a contact current injected into the index finger of a human body model in the frequency range from 10 Hz to 100 MHz. The in situ electric fields or SAR are compared with the basic restrictions in the international guidelines/standards. The maximum electric field or the 99th percentile value of the electric fields appear not only in the fat and muscle tissues of the finger, but also around the wrist, forearm, and the upper arm. Some discrepancies are observed between the basic restrictions for the electric field and SAR and the reference levels for the contact current, especially in the extremities. These discrepancies are shown by an equation that relates the current density, tissue conductivity, and induced electric field in the finger with a cross-sectional area of 1 cm². (paper)
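The relation mentioned at the end, between current density J, tissue conductivity σ, and induced electric field E, follows from Ohm's law in a conductor: E = J/σ = I/(σA) for a current I crossing a cross-section A. A worked example with illustrative values (not the paper's numbers):

```python
# Worked example of E = J / sigma = I / (sigma * A) for a contact current
# crossing a finger cross-section. All numeric values are illustrative.
def electric_field_v_per_m(current_a, conductivity_s_per_m, area_m2):
    """In situ electric field from a current through a conductive cross-section."""
    return current_a / (conductivity_s_per_m * area_m2)

current = 0.5e-3   # 0.5 mA contact current (hypothetical)
sigma   = 0.2      # S/m, order of magnitude for tissue at low frequency
area    = 1e-4     # 1 cm^2 cross-section, as stated in the abstract

print(electric_field_v_per_m(current, sigma, area))  # 25.0 V/m
```

The small cross-section of the finger is why the induced field there can exceed basic restrictions even when the contact current complies with its reference level.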

14. Micro-Computed Tomography Evaluation of Human Fat Grafts in Nude Mice

Science.gov (United States)

Chung, Michael T.; Hyun, Jeong S.; Lo, David D.; Montoro, Daniel T.; Hasegawa, Masakazu; Levi, Benjamin; Januszyk, Michael; Longaker, Michael T.

2013-01-01

Background Although autologous fat grafting has revolutionized the field of soft tissue reconstruction and augmentation, long-term maintenance of fat grafts is unpredictable. Recent studies have reported survival rates of fat grafts to vary anywhere between 10% and 80% over time. The present study evaluated the long-term viability of human fat grafts in a murine model using a novel imaging technique allowing for in vivo volumetric analysis. Methods Human fat grafts were prepared from lipoaspirate samples using the Coleman technique. Fat was injected subcutaneously into the scalp of 10 adult Crl:NU-Foxn1nu CD-1 male mice. Micro-computed tomography (micro-CT) was performed immediately following injection and then weekly thereafter. Fat volume was rendered by reconstructing a three-dimensional (3D) surface through cubic-spline interpolation. Specimens were also harvested at various time points and sections were prepared and stained with hematoxylin and eosin (H&E), for macrophages using CD68 and for the cannabinoid receptor 1 (CB1). Finally, samples were explanted at 8- and 12-week time points to validate calculated micro-CT volumes. Results Weekly micro-CT scanning demonstrated progressive volume loss over the time course. However, volumetric analysis at the 8- and 12-week time points stabilized, showing an average of 62.2% and 60.9% survival, respectively. Gross analysis showed the fat graft to be healthy and vascularized. H&E analysis and staining for CD68 showed minimal inflammatory reaction with viable adipocytes. Immunohistochemical staining with anti-human CB1 antibodies confirmed human origin of the adipocytes. Conclusions Studies assessing the fate of autologous fat grafts in animals have focused on nonimaging modalities, including histological and biochemical analyses, which require euthanasia of the animals. In this study, we have demonstrated the ability to employ micro-CT for 3D reconstruction and volumetric analysis of human fat grafts in a mouse model. Importantly

15. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

Science.gov (United States)

Hamilton, George S.; Williams, Jermaine C.

1998-01-01

This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber-coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken: the HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate-fidelity, usable tool which will run on current notebook computers.
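The core of the simplified approach, restricting each joint of the human model to the suit's range of motion, can be sketched as clamping joint angles to per-joint limits. The joint names and limit values below are hypothetical illustrations, not the actual suit data.

```python
# Sketch of suit-ROM restriction: clamp each joint angle of a posture into
# the spacesuit's allowed range. Joint names and limits are hypothetical.
SUIT_ROM_DEG = {
    "shoulder_flexion": (-45.0, 120.0),
    "elbow_flexion": (0.0, 110.0),
}

def clamp_pose(pose):
    """Return the pose with every listed joint clamped into the suit's range."""
    return {joint: max(lo, min(hi, pose[joint]))
            for joint, (lo, hi) in SUIT_ROM_DEG.items()}

# An unsuited posture that exceeds the suit's limits on both joints:
print(clamp_pose({"shoulder_flexion": 150.0, "elbow_flexion": -10.0}))
```

A full conversion would also re-scale segment geometry (the enlarged body parts mentioned above) and account for suit joints that are not collocated with human joints, which simple clamping cannot capture.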

16. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

DEFF Research Database (Denmark)

Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

2015-01-01

Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells ofte...... or proteome using human leukocyte antigen binding predictions and made a web-accessible software implementation freely available at http://met-hilab.cbs.dtu.dk/blockcons/....
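The published tool itself is not reproduced here, but the first stage it describes, finding protein regions conserved across pathogen variants before running HLA-binding predictions, can be sketched as a window scan over aligned sequences. The function, sequences, and window length below are illustrative assumptions, not the tool's actual implementation.

```python
def conserved_blocks(aligned_seqs, k=9):
    """Return start positions of length-k windows that are identical
    across all aligned pathogen variants (candidate T cell targets)."""
    n = len(aligned_seqs[0])
    hits = []
    for i in range(n - k + 1):
        window = {s[i:i + k] for s in aligned_seqs}
        # keep only gap-free windows shared by every variant
        if len(window) == 1 and '-' not in aligned_seqs[0][i:i + k]:
            hits.append(i)
    return hits

variants = ["MKTIIALSYIFCLVFA",
            "MKTIIALSYIFCLVFG",
            "MKTIIALSYIFCLVHA"]
print(conserved_blocks(variants))  # start positions of conserved 9-mers
```

Each conserved 9-mer would then be passed to an HLA-binding predictor to score it as a vaccine target candidate.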

17. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

Science.gov (United States)

Wales, Roxana; O'Neill, John; Mirmalek, Zara

2003-01-01

The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these

18. Historical Overview, Current Status, and Future Trends in Human-Computer Interfaces for Process Control

International Nuclear Information System (INIS)

Owre, Fridtjov

2003-01-01

Approximately 25 yr ago, the first computer-based process control systems, including computer-generated displays, appeared. It is remarkable how slowly the human-computer interfaces (HCIs) of such systems have developed over the years. The display design approach in those early days had its roots in the topology of the process. Usually, the information came from the piping and instrumentation diagrams. Later, some important additional functions were added to the basic system, such as alarm and trend displays. Today, these functions are still the basic ones, and the end-user displays have not changed much except for improved display quality in terms of colors, font types and sizes, resolution, and object shapes, resulting from improved display hardware. Today, there are two schools of display design competing for supremacy in the process control segment of the HCI community. One can be characterized by extension and integration of current practice, while the other is more revolutionary. The extension of the current practice approach can be described in terms of added system functionality and integration. This means that important functions for the plant operator - such as signal validation, plant overview information, safety parameter displays, procedures, prediction of future states, and plant performance optimization - are added to the basic functions and integrated in a total unified HCI for the plant operator. The revolutionary approach, however, takes as its starting point the design process itself. The functioning of the plant is described in terms of the plant goals and subgoals, as well as the means available to reach these goals. Then, displays are designed representing this functional structure - in clear contrast to the earlier plant topology representation. Depending on the design approach used, the corresponding displays have various designations, e.g., function-oriented, task-oriented, or ecological displays. This paper gives a historical overview of past

19. Computational analysis of histidine mutations on the structural stability of human tyrosinases leading to albinism insurgence.

Science.gov (United States)

Hassan, Mubashir; Abbas, Qamar; Raza, Hussain; Moustafa, Ahmed A; Seo, Sung-Yum

2017-07-25

Misfolding and structural alteration in proteins lead to serious malfunctions and cause various diseases in humans. Mutations at the active binding site in tyrosinase impair structural stability and cause lethal albinism by abolishing copper binding. To evaluate the histidine mutational effect, all mutated structures were built using homology modelling. The protein sequence was retrieved from the UniProt database, and 3D models of original and mutated human tyrosinase sequences were predicted by changing the residual positions within the target sequence separately. Structural and mutational analyses were performed to interpret the significance of mutated residues (N180, R202, Q202, R211, Y363, R367, Y367 and D390) at the active binding site of tyrosinases. CSpritz analysis depicted that 23.25% of residues actively participate in the instability of tyrosinase. The accuracy of predicted models was confirmed through the online servers ProSA-web, ERRAT score and VERIFY 3D values. The theoretical pI and GRAVY results also showed the accuracy of the predicted models. The CCA negative correlation results depicted that the replacement of mutated residues at His within the active binding site disturbs the structural stability of tyrosinases. The predicted CCA scores of Tyr367 (-0.079) and Q/R202 (0.032) revealed that both mutations have more potential to disturb the structural stability. MD simulation analyses of all predicted models justified that Gln202, Arg202, Tyr367 and D390 replacement made the protein structures more susceptible to destabilization. Mutational results showed that the replacement of His with Q/R202 and Y/R363 has a lethal effect and may cause melanin-associated diseases such as OCA1. Taken together, our computational analysis depicts that mutated residues such as Q/R202 and Y/R363 actively participate in instability and misfolding of tyrosinases, which may govern OCA1 through disturbing the melanin biosynthetic pathway.

20. Sex determination of human mandible using metrical parameters by computed tomography: A prospective radiographic short study

Directory of Open Access Journals (Sweden)

Basavaraj N Kallalli

2016-01-01

Introduction: Sex determination of unidentified human remains is very important in forensic medicine, medicolegal cases, and forensic anthropology. The mandible is the largest and hardest facial bone that commonly resists postmortem damage and forms an important source of personal identification. Additional studies have demonstrated the applicability of facial reconstruction using three-dimensional computed tomography scan (3D-CT) for the purpose of individual identification. Aim: To determine the sex of human mandible using metrical parameters by CT. Materials and Methods: The study included thirty subjects (15 males and 15 females), with age group ranging between 10 and 60 years, obtained from the outpatient department of Oral Medicine and Radiology, Narsinhbhai Patel Dental College and Hospital. CT scan was performed on all the subjects, and the data obtained were reconstructed for 3D viewing. After obtaining the 3D-CT scan, a total of seven mandibular measurements, i.e., gonial angle (G-angle), ramus length (Ramus-L), minimum ramus breadth, gonion-gnathion length (G-G-L), bigonial breadth, bicondylar breadth (BIC-Br), and coronoid length (CO-L), were measured; the collected data were analyzed using the SPSS statistical analysis program by Student's t-test. Results: The result of the study showed that out of the seven parameters, G-angle, Ramus-L, G-G-L, BIC-Br, and CO-L showed a statistically significant difference (P < 0.05), with an overall accuracy of 86% for males and 82% for females. Conclusion: Personal identification using the mandible by conventional methods has already been proved but with variable efficacies. Advanced imaging modalities can aid in personal identification with much higher accuracy than conventional methods.
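The per-parameter screening described above, testing each mandibular measurement for a significant male-female difference with Student's t-test, can be sketched with SciPy. The gonial-angle values below are hypothetical illustrative numbers, not the study's data.

```python
from scipy import stats

# Hypothetical gonial-angle measurements (degrees), for illustration only;
# the study's actual measurements are not reproduced here.
male_g_angle   = [118.2, 120.1, 119.5, 121.0, 117.8, 119.9, 120.4, 118.7]
female_g_angle = [124.6, 126.2, 125.1, 127.0, 124.9, 125.8, 126.5, 125.3]

# Two-sample t-test: P < 0.05 flags the parameter as sexually dimorphic
t, p = stats.ttest_ind(male_g_angle, female_g_angle)
print(f"t = {t:.2f}, p = {p:.4f}")
```

Parameters that pass this screen would then feed a discriminant model to classify sex with the reported accuracies.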

1. Human Perception, SBS Symptoms and Performance of Office Work during Exposure to Air Polluted by Building Materials and Personal Computers

DEFF Research Database (Denmark)

Bako-Biro, Zsolt

The present thesis deals with the impact of polluted air from building materials and personal computers on human perception, Sick Building Syndrome (SBS) symptoms and performance of office work. These effects have been studied in a series of experiments that are described in two different chapters...

2. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

NARCIS (Netherlands)

Bekker, M.M.; Long, J.B.

1998-01-01

This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

3. A computer model of the biosphere, to estimate stochastic and non-stochastic effects of radionuclides on humans

International Nuclear Information System (INIS)

Laurens, J.M.

1985-01-01

A computer code was written to model food chains in order to estimate the internal and external doses, for stochastic and non-stochastic effects, on humans (adults and infants). Results are given for 67 radionuclides, for unit concentration in water (1 Bq/L) and in the atmosphere (1 Bq/m³).

4. The effect of repeated freeze-thaw cycles on human muscle tissue visualized by postmortem computed tomography (PMCT)

NARCIS (Netherlands)

Klop, Anthony C.; Vester, Marloes E. M.; Colman, Kerri L.; Ruijter, Jan M.; van Rijn, Rick R.; Oostra, Roelof-Jan

2017-01-01

The aim of this study was to determine whether effects of repetitive freeze-thaw cycles, with various thawing temperatures, on human muscle tissue can be quantified using postmortem computed tomography (PMCT) technology. An additional objective was to determine the preferred thawing temperature for

5. The mind-writing pupil : A human-computer interface based on decoding of covert attention through pupillometry

NARCIS (Netherlands)

Mathôt, Sebastiaan; Melmi, Jean Baptiste; Van Der Linden, Lotje; Van Der Stigchel, Stefan

2016-01-01

We present a new human-computer interface that is based on decoding of attention through pupillometry. Our method builds on the recent finding that covert visual attention affects the pupillary light response: Your pupil constricts when you covertly (without looking at it) attend to a bright,

6. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches

KAUST Repository

Jiang, Hanlun

2016-12-06

MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.
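The MSM step described above reduces, at its core, to estimating a row-stochastic transition matrix from a discretized MD trajectory and extracting its stationary distribution. The toy two-state trajectory below is an illustrative stand-in for clustered AGO conformations, not data from the chapter.

```python
import numpy as np

def estimate_msm(dtraj, n_states, lag=1):
    """Row-normalized transition matrix from a discrete state trajectory."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(dtraj[:-lag], dtraj[lag:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Toy two-state trajectory (e.g., two coarse AGO conformational states)
dtraj = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0]
T = estimate_msm(dtraj, 2)

# Stationary distribution: eigenvector of T^T with eigenvalue 1
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print(T.round(2), pi.round(2))
```

In practice MSM packages add maximum-likelihood reversible estimation and lag-time validation; this sketch shows only the counting-and-normalizing core.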

7. Neural Computations Mediating One-Shot Learning in the Human Brain

Science.gov (United States)

Lee, Sang Wan; O’Doherty, John P.; Shimojo, Shinsuke

2015-01-01

Incremental learning, in which new knowledge is acquired gradually through trial and error, can be distinguished from one-shot learning, in which the brain learns rapidly from only a single pairing of a stimulus and a consequence. Very little is known about how the brain transitions between these two fundamentally different forms of learning. Here we test a computational hypothesis that uncertainty about the causal relationship between a stimulus and an outcome induces rapid changes in the rate of learning, which in turn mediates the transition between incremental and one-shot learning. By using a novel behavioral task in combination with functional magnetic resonance imaging (fMRI) data from human volunteers, we found evidence implicating the ventrolateral prefrontal cortex and hippocampus in this process. The hippocampus was selectively “switched” on when one-shot learning was predicted to occur, while the ventrolateral prefrontal cortex was found to encode uncertainty about the causal association, exhibiting increased coupling with the hippocampus for high-learning rates, suggesting this region may act as a “switch,” turning on and off one-shot learning as required. PMID:25919291
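The computational hypothesis above, that causal uncertainty rapidly modulates the learning rate, can be caricatured as a delta-rule update whose rate is gated by uncertainty. This is a minimal sketch of the idea, not the paper's actual model; the function and parameter names are hypothetical.

```python
def update_value(v, outcome, uncertainty, alpha_min=0.1):
    """Delta-rule update whose learning rate is gated by causal
    uncertainty: high uncertainty pushes alpha toward 1 (one-shot),
    low uncertainty keeps learning slow and incremental."""
    alpha = alpha_min + (1.0 - alpha_min) * uncertainty  # uncertainty in [0, 1]
    return v + alpha * (outcome - v)

print(update_value(0.0, 1.0, uncertainty=1.0))  # 1.0: full one-shot jump
print(update_value(0.0, 1.0, uncertainty=0.0))  # 0.1: small incremental step
```

Under this sketch, the "switch" behavior attributed to the hippocampus corresponds to alpha saturating near 1 when uncertainty is high.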

8. Design of a compact low-power human-computer interaction equipment for hand motion

Science.gov (United States)

Wu, Xianwei; Jin, Wenguang

2017-01-01

Human-Computer Interaction (HCI) raises demands for convenience, endurance, responsiveness and naturalness. This paper describes the design of a compact wearable low-power HCI equipment applied to gesture recognition. The system combines multi-mode sense signals, the vision sense signal and the motion sense signal, and the equipment is equipped with a depth camera and a motion sensor. The dimensions (40 mm × 30 mm) and structure are compact and portable after tight integration. The system is built on a module-layered framework, which contributes to real-time collection (60 fps), processing and transmission via synchronous fusion with asynchronous concurrent collection and wireless Bluetooth 4.0 transmission. To minimize the equipment's energy consumption, the system makes use of low-power components, managing peripheral state dynamically, switching into idle mode intelligently, pulse-width modulation (PWM) of the NIR LEDs of the depth camera and algorithm optimization by the motion sensor. To test the equipment's function and performance, a gesture recognition algorithm is applied to the system. As the results show, general energy consumption can be as low as 0.5 W.

9. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation

Directory of Open Access Journals (Sweden)

Saleem Jason J

2011-11-01

Background Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, a workload assessment instrument, and a workflow integration survey. We also collected qualitative data on both designs. Results Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements, with specific suggestions for improving the reminder further. Conclusions This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice.

10. Spectral and computational features of the binding between riparins and human serum albumin

Science.gov (United States)

Camargo, Cintia Ramos; Caruso, Ícaro Putinhon; Gutierrez, Stanley Juan Chavez; Fossey, Marcelo Andres; Filho, José Maria Barbosa; Cornélio, Marinônio Lopes

2018-02-01

The green Brazilian bay leaf, a spice much prized in local cuisine (Aniba riparia, Lauraceae), contains benzoyl-derivative chemical compounds named riparins, which have anti-inflammatory, antimicrobial and anxiolytic properties. However, it is unclear what kind of interaction riparins perform with any molecular target. As a profitable target, human serum albumin (HSA) is one of the principal extracellular proteins, with an exceptional capacity to interact with several molecules, and it also plays a crucial role in the transport, distribution, and metabolism of a wide variety of endogenous and exogenous ligands. To outline the HSA-riparin interaction mechanism, spectroscopy and computational methods were synergistically applied. An evaluation through fluorescence spectroscopy showed that the emission, attributed to Trp214, at 346 nm decreased with titrations of riparins. A static quenching mechanism was observed in the binding of riparins to HSA. Fluorescence experiments performed at 298, 308 and 318 K made it possible to conduct a thermodynamic analysis indicating a spontaneous reaction in the complex formation (ΔG < 0), modulating the interaction between riparins and HSA. Site marker competitive experiments indicated Site I as being the most suitable, and the molecular modeling tools reinforced the experimental results, detailing the participation of residues.

11. Interaction of promethazine and adiphenine to human hemoglobin: A comparative spectroscopic and computational analysis

Science.gov (United States)

Maurya, Neha; ud din Parray, Mehraj; Maurya, Jitendra Kumar; Kumar, Amit; Patel, Rajan

2018-06-01

The binding nature of amphiphilic drugs, viz. promethazine hydrochloride (PMT) and adiphenine hydrochloride (ADP), with human hemoglobin (Hb) was unraveled by fluorescence, absorbance, time-resolved fluorescence, fluorescence resonance energy transfer (FRET) and circular dichroism (CD) spectral techniques in combination with molecular docking and molecular dynamics simulation methods. The steady-state fluorescence spectra indicated that both PMT and ADP quench the fluorescence of Hb through a static quenching mechanism, which was further confirmed by time-resolved fluorescence spectra. UV-Vis spectroscopy suggested ground-state complex formation. The activation energy (Ea) was higher for the Hb-ADP than for the Hb-PMT interaction system. The FRET results indicate a high probability of energy transfer from the β Trp37 residue of Hb to PMT (r = 2.02 nm) and ADP (r = 2.33 nm). The thermodynamic data reveal that the binding of PMT with Hb is exothermic in nature, involving hydrogen bonding and van der Waals interactions, whereas in the case of ADP hydrophobic forces play the major role and the binding process is endothermic in nature. The CD results show that both PMT and ADP induced secondary structural changes of Hb and unfolded the protein, with a large loss of helical content, the effect being more pronounced with ADP. Additionally, we also utilized computational approaches for deeper insight into the binding of these drugs with Hb, and the results match our experimental results well.
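The donor-acceptor distances quoted above follow from the standard Förster relation E = R0^6 / (R0^6 + r^6), inverted to r = R0·(1/E − 1)^(1/6). The sketch below uses this textbook formula; the Förster radius value is an illustrative assumption, not taken from the paper.

```python
def fret_efficiency(r, R0):
    """Forster transfer efficiency for donor-acceptor distance r."""
    return R0**6 / (R0**6 + r**6)

def fret_distance(E, R0):
    """Invert the Forster relation to recover the distance from E."""
    return R0 * (1.0 / E - 1.0) ** (1.0 / 6.0)

R0 = 2.5  # nm, assumed Forster radius for the Trp/drug pair (illustrative)
for r in (2.02, 2.33):  # distances reported in the abstract
    E = fret_efficiency(r, R0)
    print(f"r = {r} nm -> E = {E:.2f}")
```

The two functions are exact inverses, so a measured efficiency maps back to a unique distance for a given R0.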

12. Controlling a human-computer interface system with a novel classification method that uses electrooculography signals.

Science.gov (United States)

Wu, Shang-Lin; Liao, Lun-De; Lu, Shao-Wei; Jiang, Wei-Ling; Chen, Shi-An; Lin, Chin-Teng

2013-08-01

Electrooculography (EOG) signals can be used to control human-computer interface (HCI) systems, if properly classified. The ability to measure and process these signals may help HCI users to overcome many of the physical limitations and inconveniences in daily life. However, there are currently no effective multidirectional classification methods for monitoring eye movements. Here, we describe a classification method used in a wireless EOG-based HCI device for detecting eye movements in eight directions. This device includes wireless EOG signal acquisition components, wet electrodes and an EOG signal classification algorithm. The EOG classification algorithm is based on extracting features from the electrical signals corresponding to eight directions of eye movement (up, down, left, right, up-left, down-left, up-right, and down-right) and blinking. The recognition and processing of these eight different features were achieved in real-life conditions, demonstrating that this device can reliably measure the features of EOG signals. This system and its classification procedure provide an effective method for identifying eye movements. Additionally, it may be applied to study eye functions in real-life conditions in the near future.
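The eight-direction scheme described above can be caricatured as thresholding the horizontal and vertical EOG deflections and combining the two labels. This is a hypothetical sketch of the decision structure, not the paper's classification algorithm; the threshold and signal values are invented for illustration.

```python
def classify_direction(h, v, thresh=50.0):
    """Map horizontal/vertical EOG deflections (uV) to one of eight
    directions or a neutral label (illustrative threshold)."""
    horiz = "right" if h > thresh else "left" if h < -thresh else ""
    vert  = "up"    if v > thresh else "down" if v < -thresh else ""
    if horiz and vert:
        return f"{vert}-{horiz}"       # diagonal eye movement
    return horiz or vert or "neutral"  # cardinal movement or none

print(classify_direction(120, 0))   # right
print(classify_direction(-80, 90))  # up-left
```

A real classifier would first extract features from the raw signal (amplitude, slope, duration) and also detect blinks, which this sketch omits.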

13. Computer simulation of leadership, consensus decision making and collective behaviour in humans.

Directory of Open Access Journals (Sweden)

Song Wu

The aim of this study is to evaluate the reliability of a crowd simulation model developed by the authors by reproducing Dyer et al.'s experiments (published in Philosophical Transactions in 2009) on human leadership and consensus decision making in a computer-based environment. The theoretical crowd model of the simulation environment is presented, and its results are compared and analysed against Dyer et al.'s original experiments. It is concluded that the simulation results are largely consistent with the experiments, which demonstrates the reliability of the crowd model. Furthermore, the simulation data also reveal several additional new findings, namely: (1) the phenomenon of sacrificing accuracy to reach a quicker consensus decision found in ant colonies was also discovered in the simulation; (2) the ability to reach consensus in groups has a direct impact on the time and accuracy of arriving at the target position; (3) the positions of the informed individuals or leaders in the crowd can have a significant impact on the overall crowd movement; and (4) the simulation also confirmed Dyer et al.'s anecdotal evidence of the proportion of leadership in large crowds and its effect on crowd movement. The potential applications of these findings are highlighted in the final discussion of this paper.

14. How should Fitts' Law be applied to human-computer interaction?

Science.gov (United States)

Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.

1992-01-01

The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
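The models compared above all share the classic Fitts form MT = a + b·log2(2A/W); the paper's point is that the task determines which extent of the target should serve as W. A minimal least-squares fit of that form, on synthetic illustrative timing data rather than the paper's measurements, looks like this:

```python
import numpy as np

def fitts_id(A, W):
    """Index of difficulty, classic Fitts form: log2(2A / W)."""
    return np.log2(2.0 * np.asarray(A, dtype=float) / np.asarray(W, dtype=float))

# Synthetic pointing times (ms) for amplitude/width pairs (illustrative)
A  = [64, 128, 256, 512]
W  = [16, 16, 16, 16]
MT = [380, 460, 540, 620]

ID = fitts_id(A, W)          # 3, 4, 5, 6 bits for these pairs
b, a = np.polyfit(ID, MT, 1)  # slope b (ms/bit), intercept a (ms)
print(f"MT = {a:.0f} + {b:.0f} * ID")
```

Substituting the diagonal or vertical extent of the text object for W, as the experiments suggest, changes only the ID computation, not the fitting step.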

15. Dedicated mobile volumetric cone-beam computed tomography for human brain imaging: A phantom study.

Science.gov (United States)

Ryu, Jong-Hyun; Kim, Tae-Hoon; Jeong, Chang-Won; Jun, Hong-Young; Heo, Dong-Woon; Lee, Jinseok; Kim, Kyong-Woo; Yoon, Kwon-Ha

2015-01-01

Mobile computed tomography (CT) with a cone-beam source is increasingly used in the clinical field. Mobile cone-beam CT (CBCT) has great merits; however, its clinical utility for brain imaging has been limited due to problems including scan time and image quality. The aim of this study was to develop a dedicated mobile volumetric CBCT for obtaining brain images, and to optimize the imaging protocol using a brain phantom. The mobile volumetric CBCT system was evaluated with regards to scan time and image quality, measured as signal-to-noise-ratio (SNR), contrast-to-noise-ratio (CNR), spatial resolution (10% MTF), and effective dose. Brain images were obtained using a CT phantom. The CT scan took 5.14 s at 360 projection views. SNR and CNR were 5.67 and 14.5 at 120 kV/10 mA. SNR and CNR values showed slight improvement as the x-ray voltage and current increased (p < 0.001). Effective dose and 10% MTF were 0.92 mSv and 360 μm at 120 kV/10 mA. Various intracranial structures were clearly visible in the brain phantom images. Using this CBCT under optimal imaging acquisition conditions, it is possible to obtain human brain images with low radiation dose, reproducible image quality, and fast scan time.
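The SNR and CNR figures quoted above follow standard region-of-interest definitions: SNR as the ROI mean over its standard deviation, and CNR as the absolute difference of two ROI means over the background noise. Exact ROI conventions vary between studies; the pixel values below are illustrative, not the phantom data.

```python
import numpy as np

def snr(roi):
    """Signal-to-noise ratio of a region of interest."""
    return roi.mean() / roi.std(ddof=1)

def cnr(roi_a, roi_b, background):
    """Contrast-to-noise ratio between two ROIs, noise from background."""
    return abs(roi_a.mean() - roi_b.mean()) / background.std(ddof=1)

# Illustrative ROI pixel values (HU) as might be sampled from a phantom scan
tissue = np.array([40.0, 42.0, 41.0, 39.0, 43.0])
lesion = np.array([80.0, 78.0, 82.0, 79.0, 81.0])
bg     = np.array([-2.0, 1.0, 0.0, 2.0, -1.0])
print(round(snr(tissue), 1), round(cnr(lesion, tissue, bg), 1))
```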

16. Creating computer aided 3D model of spleen and kidney based on Visible Human Project

International Nuclear Information System (INIS)

2005-01-01

To investigate the efficacy of computer-aided 3-dimensional (3D) reconstruction technique on visualization and modeling of gross anatomical structures with an affordable methodology applied on the spleen and kidney. From the Visible Human Project Dataset cryosection images, developed by the National Library of Medicine, the spleen and kidney sections were preferred due to their highly distinct contours. The software used for the reconstruction were Surf Driver 3.5.3 for Mac and Cinema 4D XL version 7.1 for Mac OS X. This study was carried out in May 2004 at the Department of Anatomy, Hacettepe University, Ankara, Turkey. As a result of this study, it is determined that these 2 programs can be effectively used both for 3D modeling of the mentioned organs and for volumetric analyses on these models. It is also seen that it is possible to produce physical models of these digital gross anatomical structures with the stereolithography technique, by means of the data exchange file format provided by the program, and to present such images as anaglyphs. Surf Driver 3.5.3 for Mac OS and Cinema 4D XL version 7.1 for Mac OS X can be used effectively for reconstruction of gross anatomical structures from serial parallel sections with distinct contours, such as the spleen and kidney, and for the animation of models. These software packages constitute a highly effective way of obtaining volumetric calculations, spatial relations and morphometrical measurements of reconstructed structures. (author)

17. Computational Modelling of Gas-Particle Flows with Different Particle Morphology in the Human Nasal Cavity

Directory of Open Access Journals (Sweden)

Kiao Inthavong

2009-01-01

This paper summarises current studies related to numerical gas-particle flows in the human nasal cavity. Of interest are the numerical modelling requirements to consider the effects of particle morphology for a variety of particle shapes and sizes, such as very small particle sizes (nanoparticles), elongated shapes (asbestos fibres), rough shapes (pollen), and porous light-density particles (drug particles). It was shown that important physical phenomena needed to be addressed for different particle characteristics. This included Brownian diffusion for submicron particles. Computational results for the nasal capture efficiency for nanoparticles and various breathing rates in the laminar regime were found to correlate well with the ratio of particle diffusivity to the breathing rate. For micron particles, particle inertia is the most significant property and the use of sufficient drag laws is important. Drag correlations for fibrous and rough-surfaced particles were investigated to enable particle tracking. Based on the simulated results, semi-empirical correlations for particle deposition were fitted in terms of the Peclet number and the inertial parameter for nanoparticles and micron particles, respectively.
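The Brownian diffusivity that enters the Peclet-number correlation above is conventionally computed from the Stokes-Einstein relation with the Cunningham slip correction: D = kB·T·Cc/(3πμd). The sketch below uses that standard formula with approximate air properties; it is an illustrative calculation, not the paper's deposition model.

```python
import math

KB = 1.380649e-23   # Boltzmann constant, J/K
MU_AIR = 1.81e-5    # dynamic viscosity of air, Pa*s (~293 K)
MFP = 68e-9         # mean free path of air molecules, m (approximate)

def cunningham(d):
    """Cunningham slip correction factor for particle diameter d (m)."""
    kn = 2.0 * MFP / d  # Knudsen number
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def diffusivity(d, T=293.0):
    """Brownian diffusion coefficient (m^2/s), Stokes-Einstein with slip."""
    return KB * T * cunningham(d) / (3.0 * math.pi * MU_AIR * d)

for d in (1e-9, 10e-9, 100e-9):  # nanoparticle diameters
    print(f"{d*1e9:4.0f} nm -> D = {diffusivity(d):.2e} m^2/s")
```

With an airway velocity U and length scale L, the Peclet number follows as Pe = U·L/D, so smaller particles (larger D) diffuse to the walls more readily.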

18. Neural computations mediating one-shot learning in the human brain.

Directory of Open Access Journals (Sweden)

Sang Wan Lee

2015-04-01

Incremental learning, in which new knowledge is acquired gradually through trial and error, can be distinguished from one-shot learning, in which the brain learns rapidly from only a single pairing of a stimulus and a consequence. Very little is known about how the brain transitions between these two fundamentally different forms of learning. Here we test a computational hypothesis that uncertainty about the causal relationship between a stimulus and an outcome induces rapid changes in the rate of learning, which in turn mediates the transition between incremental and one-shot learning. By using a novel behavioral task in combination with functional magnetic resonance imaging (fMRI) data from human volunteers, we found evidence implicating the ventrolateral prefrontal cortex and hippocampus in this process. The hippocampus was selectively "switched" on when one-shot learning was predicted to occur, while the ventrolateral prefrontal cortex was found to encode uncertainty about the causal association, exhibiting increased coupling with the hippocampus for high-learning rates, suggesting this region may act as a "switch," turning on and off one-shot learning as required.

19. Selection of suitable hand gestures for reliable myoelectric human computer interface.

Science.gov (United States)

Castro, Maria Claudia F; Arjunan, Sridhar P; Kumar, Dinesh K

2015-04-09

A myoelectric-controlled prosthetic hand requires machine-based identification of hand gestures using the surface electromyogram (sEMG) recorded from the forearm muscles. This study observed that a subset of the hand gestures has to be selected for accurate automated hand gesture recognition, and reports a method to select these gestures to maximize the sensitivity and specificity. Experiments were conducted in which sEMG was recorded from the muscles of the forearm while subjects performed hand gestures, and the recordings were then classified off-line. The performances of ten gestures were ranked using the proposed Positive-Negative Performance Measurement Index (PNM), generated by a series of confusion matrices. When using all ten gestures, the sensitivity and specificity were 80.0% and 97.8%. After ranking the gestures using the PNM, six gestures were selected, and these gave sensitivity and specificity greater than 95% (96.5% and 99.3%): Hand open, Hand close, Little finger flexion, Ring finger flexion, Middle finger flexion and Thumb flexion. This work has shown that reliable myoelectric-based human computer interface systems require careful selection of the gestures to be recognized; without such selection, the reliability is poor.
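The PNM index itself is the paper's own construction and is not reproduced here, but the per-gesture sensitivity and specificity that such a ranking builds on come straight from the confusion matrix. A minimal sketch, with an invented three-gesture matrix for illustration:

```python
import numpy as np

def per_class_sens_spec(cm):
    """Per-class sensitivity and specificity from a confusion matrix
    with rows = true gesture, columns = predicted gesture."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                       # correct classifications
    fn = cm.sum(axis=1) - tp               # missed instances of the class
    fp = cm.sum(axis=0) - tp               # other classes mistaken for it
    tn = cm.sum() - tp - fn - fp
    return tp / (tp + fn), tn / (tn + fp)

cm = [[18,  2,  0],   # e.g., Hand open
      [ 1, 17,  2],   # Hand close
      [ 0,  3, 17]]   # Thumb flexion
sens, spec = per_class_sens_spec(cm)
print(sens.round(2), spec.round(2))
```

Ranking gestures by such per-class scores and dropping the worst performers mirrors the selection step the abstract describes.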

20. The Human Factors and Ergonomics of P300-Based Brain-Computer Interfaces

Directory of Open Access Journals (Sweden)

J. Clark Powers

2015-08-01

Full Text Available Individuals with severe neuromuscular impairments face many challenges in communication and manipulation of the environment. Brain-computer interfaces (BCIs show promise in presenting real-world applications that can provide such individuals with the means to interact with the world using only brain waves. Although there has been a growing body of research in recent years, much relates only to technology, and not to technology in use—i.e., real-world assistive technology employed by users. This review examined the literature to highlight studies that implicate the human factors and ergonomics (HFE of P300-based BCIs. We assessed 21 studies on three topics to speak directly to improving the HFE of these systems: (1 alternative signal evocation methods within the oddball paradigm; (2 environmental interventions to improve user performance and satisfaction within the constraints of current BCI systems; and (3 measures and methods of measuring user acceptance. We found that HFE is central to the performance of P300-based BCI systems, although researchers do not often make explicit this connection. Incorporation of measures of user acceptance and rigorous usability evaluations, increased engagement of disabled users as test participants, and greater realism in testing will help progress the advancement of P300-based BCI systems in assistive applications.

1. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches.

Science.gov (United States)

Jiang, Hanlun; Zhu, Lizhe; Héliou, Amélie; Gao, Xin; Bernauer, Julie; Huang, Xuhui

2017-01-01

MicroRNA (miRNA) and Argonaute (AGO) protein together form the RNA-induced silencing complex (RISC) that plays an essential role in the regulation of gene expression. Elucidating the underlying mechanism of AGO-miRNA recognition is thus of great importance not only for the in-depth understanding of miRNA function but also for inspiring new drugs targeting miRNAs. In this chapter we introduce a combined computational approach of molecular dynamics (MD) simulations, Markov state models (MSMs), and protein-RNA docking to investigate AGO-miRNA recognition. Constructed from MD simulations, MSMs can elucidate the conformational dynamics of AGO at biologically relevant timescales. Protein-RNA docking can then efficiently identify the AGO conformations that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in detail. This combined approach holds great promise to complement experiments in unraveling the mechanisms of molecular recognition between large, flexible, and complex biomolecules.
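The MSM step mentioned above reduces long MD trajectories to transitions between discrete conformational states. A minimal sketch of that estimation step (real MSM tools additionally enforce detailed balance and validate the lag time; the toy trajectory below is an assumption for illustration):

```python
import numpy as np

def msm_transition_matrix(dtraj, n_states, lag=1):
    """Row-stochastic transition matrix estimated by counting
    transitions at a fixed lag time in a discretized trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1
    # Normalize each row so rows sum to 1 (transition probabilities).
    return counts / counts.sum(axis=1, keepdims=True)

# Toy discretized trajectory over 2 conformational states.
dtraj = [0, 0, 1, 1, 0, 0, 0, 1]
T = msm_transition_matrix(dtraj, n_states=2)
```

Conformations drawn from the metastable states of such a model are then the candidates passed to the docking stage.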

2. An object-oriented computational model to study cardiopulmonary hemodynamic interactions in humans.

Science.gov (United States)

Ngo, Chuong; Dahlmanns, Stephan; Vollmer, Thomas; Misgeld, Berno; Leonhardt, Steffen

2018-06-01

This work introduces an object-oriented computational model to study cardiopulmonary interactions in humans. Modeling was performed in the object-oriented programming language Matlab Simscape, where model components are connected with each other through physical connections. Constitutive and phenomenological equations of model elements are implemented based on their non-linear pressure-volume or pressure-flow relationship. The model includes more than 30 physiological compartments, which belong either to the cardiovascular or respiratory system. The model considers non-linear behaviors of veins, pulmonary capillaries, collapsible airways, alveoli, and the chest wall. Model parameters were derived based on literature values. Model validation was performed by comparing simulation results with clinical and animal data reported in the literature. The model is able to provide quantitative values of alveolar, pleural, interstitial, aortic and ventricular pressures, as well as heart and lung volumes during spontaneous breathing and mechanical ventilation. Results of the baseline simulation demonstrate the consistency of the assigned parameters. Simulation results during mechanical ventilation with PEEP trials can be directly compared with animal and clinical data given in the literature. Object-oriented programming languages can be used to model interconnected systems including model non-linearities. The model provides a useful tool to investigate cardiopulmonary activity during spontaneous breathing and mechanical ventilation. Copyright © 2018 Elsevier B.V. All rights reserved.

3. A morphometric study on regeneration of the human liver following hepatectomy by computed tomography

International Nuclear Information System (INIS)

Okamoto, Eizo; Yamanaka, Naoki

1983-01-01

A morphometric study has been carried out on the restoration of remnant hepatic volume (RHV) after various extents of hepatectomy in humans by serial computed tomography in 15 non-cirrhotics, 14 cirrhotics and 7 hepatic failures. Restoration of RHV was observed only in patients with more than 10% hepatectomy. In non-cirrhotics with major hepatectomy (RHV less than 600 cm³), an early rapid increasing phase was followed by a subsequent decreasing phase and then a slow increasing phase. The decreasing phase was absent in most non-cirrhotics with moderate hepatectomy (RHV 600-1000 cm³) and in cirrhotics. The daily increase rate of RHV during the first posthepatectomy month was inversely proportional to the RHV at operation, invariably restoring to 800-900 cm³ at the end of this month. Termination of regeneration was within 6 months in non-cirrhotics with moderate hepatectomy and from 6 to 12 months in those with major hepatectomy; it was delayed in cirrhotics. RHV finally attained an average of 90% of preoperative hepatic volume in non-cirrhotics and 81% in cirrhotics. The restoration of RHV was extremely poor in hepatic failures. (author)

4. Computer modelling of the chemical speciation of Americium (III) in human body fluids

International Nuclear Information System (INIS)

Jiang, Shu-bin; Lei, Jia-rong; Wang, He-yi; Zhong, Zhi-jing; Yang, Yong; Du, Yang

2008-01-01

A multi-phase equilibrium model consisting of multiple metal ions and low-molecular-mass ligands in human body fluids has been constructed to discuss the speciation of Am³⁺ in gastric juice, sweat, interstitial fluid, intracellular fluid and urine, respectively. Computer simulations indicated that in the gastric juice model the major Am(III) species were Am³⁺, [AmCl]²⁺ and [AmH₂PO₄]²⁺ at pH < 4, that AmPO₄ became dominant at higher pH when [Am] = 1 × 10⁻⁷ mol/L, and that the percentage of AmPO₄ increased with [Am]. In the sweat system, Am(III) existed as soluble species at pH 4.2∼7.5 when [Am] = 1 × 10⁻⁷ mol/L, and as Am³⁺ and [AmOH]²⁺ at pH 6.5 when [Am] < 10⁻¹⁰ mol/L or [Am] > 5 × 10⁻⁸ mol/L. With the addition of EDTA, Am(III) existed as soluble [AmEDTA]⁻, whereas it existed as insoluble AmPO₄ when [Am] > 1 × 10⁻¹² mol/L in interstitial fluid. The major Am(III) species was AmPO₄ at pH 7.0 and [Am] = 4 × 10⁻¹² mol/L in intracellular fluid, which implied that Am(III) exhibits strong cell toxicity. That the percentage of soluble Am(III) species increased at lower pH suggests that Am(III), in the form of aerosol ingested by macrophages, could be released into interstitial fluid and bring strong toxicity to the skeletal system. The soluble Am(III) species was dominant when pH < 4 and when pH > 4.5 when [Am] = 1 × 10⁻¹⁰ mol/L in human urine, so it was favorable to excrete Am(III) from the kidney by taking acidic materials. (author)

5. Saliency of color image derivatives: a comparison between computational models and human perception

NARCIS (Netherlands)

Vazquez, E.; Gevers, T.; Lucassen, M.; van de Weijer, J.; Baldrich, R.

2010-01-01

In this paper, computational methods are proposed to compute color edge saliency based on the information content of color edges. The computational methods are evaluated on bottom-up saliency in a psychophysical experiment, and on a more complex task of salient object detection in real-world images.

6. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

Energy Technology Data Exchange (ETDEWEB)

Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); Zwahlen, Daniel [Kantonsspital Graubuenden, Department of Radiotherapy, Chur (Switzerland); Bodis, Stephan [KSA-KSB, Kantonsspital Aarau, RadioOnkologieZentrum, Aarau (Switzerland); University Hospital Zurich, Department of Radiation Oncology, Zurich (Switzerland)

2016-09-15

7. Computer-assisted design and synthesis of a highly selective smart adsorbent for extraction of clonazepam from human serum.

Science.gov (United States)

2013-01-01

A computational approach was applied to screen functional monomers and polymerization solvents for the rational design of molecularly imprinted polymers (MIPs) as smart adsorbents for the solid-phase extraction of clonazepam (CLO) from human serum. The computed binding energies of the complexes formed between the template and the functional monomers were compared. The primary computational results were corrected by taking into account both the basis set superposition error (BSSE) and the effect of the polymerization solvent, using the counterpoise (CP) correction and the polarizable continuum model, respectively. Based on the theoretical calculations, trifluoromethyl acrylic acid (TFMAA) and acrylonitrile (ACN) were found to be the best and the worst functional monomers, respectively. To test the accuracy of the computational results, three MIPs were synthesized with different functional monomers and their Langmuir-Freundlich (LF) isotherms were studied. The experimental results confirmed the computational results and indicated that the MIP synthesized using TFMAA had the highest affinity for CLO in human serum despite the presence of a vast spectrum of ions. Copyright © 2012 Elsevier B.V. All rights reserved.
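The counterpoise (CP) correction mentioned above amounts to simple arithmetic on fragment energies: each fragment is re-evaluated in the full complex basis, and the artificial stabilization is removed from the raw binding energy. A minimal sketch; the energies below are hypothetical placeholders, not values from the paper:

```python
def cp_corrected_binding_energy(e_complex, e_template, e_monomer,
                                e_template_in_dimer_basis,
                                e_monomer_in_dimer_basis):
    """Counterpoise-corrected binding energy (all energies in hartree).
    Raw:  dE = E(AB) - E(A) - E(B)
    BSSE: each fragment gains spurious stabilization when it borrows
    the partner's basis functions; the CP correction adds that back."""
    raw = e_complex - e_template - e_monomer
    bsse = ((e_template - e_template_in_dimer_basis) +
            (e_monomer - e_monomer_in_dimer_basis))
    return raw + bsse

# Hypothetical energies for one template-monomer complex.
de = cp_corrected_binding_energy(
    e_complex=-150.010, e_template=-100.000, e_monomer=-50.000,
    e_template_in_dimer_basis=-100.002, e_monomer_in_dimer_basis=-50.001)
```

Ranking monomers by this corrected energy is the screening step the abstract describes.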

8. Body composition of the human lower extremity observed by computed tomography

International Nuclear Information System (INIS)

Suzuki, Masataka; Hasegawa, Makiko; Wu, Chung-Lei; Mimaru, Osamu

1987-01-01

Using computed tomography images, the body composition of the lower extremity was observed in 24 adult humans (10 male, 14 female). CT images were taken at a proximal section (upper third of the thigh), a distal section (lower third of the thigh) and a leg section (upper third of the leg), and the quantities determined from the images included the areas of the total cross-section, muscle, subcutaneous fat, connective tissue and bone in each cross-section. The ratios of each component to the total area were surveyed. The age-related changes and the differences between the three body types, defined by Rohrer's index, were discussed for both sexes. The following results were obtained. 1. The ratio of each component to the total sectional area at the three section levels was generally highest for muscle in males, followed in order by subcutaneous fat, connective tissue and bone. In females, on the other hand, subcutaneous fat was higher than muscle in the proximal section for the A and C body types, but muscle was higher than subcutaneous fat for the D body type in this section and for all body types in the distal and leg sections. 2. Concerning the correlations between the component ratios in the section and Rohrer's index or age, positive relations were found for the ratios of subcutaneous fat and connective tissue, and a negative relation for the ratio of muscle, in the femoral sections in males. 3. Decreases of muscular area with age were found under age 50 in the extensors, at age 50 in the adductors and at about age 60 in the flexors in the proximal section, and at age 50 in the extensors, after age 55 in the adductors and at about age 60 in the flexors in the distal section in males. In the leg section, the decreasing tendency with age was predominant in the flexors in males and was also found after age 50 in females. (author)

9. VX hydrolysis by human serum paraoxonase 1: a comparison of experimental and computational results.

Directory of Open Access Journals (Sweden)

Matthew W Peterson

Full Text Available Human serum paraoxonase 1 (HuPON1) is an enzyme that has been shown to hydrolyze a variety of chemicals including the nerve agent VX. While wildtype HuPON1 does not exhibit sufficient activity against VX to be used as an in vivo countermeasure, it has been suggested that increasing HuPON1's organophosphorus hydrolase activity by one or two orders of magnitude would make the enzyme suitable for this purpose. The binding interaction between HuPON1 and VX has recently been modeled, but the mechanism for VX hydrolysis is still unknown. In this study, we created a transition state model for VX hydrolysis (VXts) in water using quantum mechanical/molecular mechanical simulations, and docked the transition state model to 22 experimentally characterized HuPON1 variants using AutoDock Vina. The HuPON1-VXts complexes were grouped by reaction mechanism using a novel clustering procedure. The average Vina interaction energies for different clusters were compared to the experimentally determined activities of HuPON1 variants to determine which computational procedures best predict how well HuPON1 variants will hydrolyze VX. The analysis showed that only conformations which have the attacking hydroxyl group of VXts coordinated by the sidechain oxygen of D269 have a significant correlation with experimental results. The results from this study can be used for further characterization of how HuPON1 hydrolyzes VX and the design of HuPON1 variants with increased activity against VX.

10. Human factors design of nuclear power plant control rooms including computer-based operator aids

International Nuclear Information System (INIS)

Bastl, W.; Felkel, L.; Becker, G.; Bohr, E.

1983-01-01

The scientific handling of human factors problems in control rooms began around 1970 on the basis of safety considerations. Some recent research work deals with the development of computerized systems like plant balance calculation, safety parameter display, alarm reduction and disturbance analysis. For disturbance analysis purposes it is necessary to homogenize the information presented to the operator according to the actual plant situation, in order to supply the operator with the information he most urgently needs at the time. Different approaches for solving this problem are discussed, and an overview is given of what is being done. Other research projects concentrate on the detailed analysis of operators' diagnosis strategies in unexpected situations, in order to obtain a better understanding of their mental processes and the influences upon them when such situations occur. This project involves the use of a simulator and sophisticated recording and analysis methods. Control rooms are currently designed with the aid of mock-ups. They enable operators to contribute their experience to the optimization of the arrangement of displays and controls. Modern control rooms are characterized by increasing use of process computers and CRT (Cathode Ray Tube) displays. A general concept for the integration of the new computerized systems and the conventional control panels is needed. The technical changes modify operators' tasks, and future ergonomic work in nuclear plants will need to consider the re-allocation of function between man and machine, the incorporation of task changes in training programmes, and the optimal design of information presentation using CRTs. Aspects of developments in control room design are detailed, typical research results are dealt with, and a brief forecast of the ergonomic contribution to be made in the Federal Republic of Germany is given.

11. Supervised learning from human performance at the computationally hard problem of optimal traffic signal control on a network of junctions.

Science.gov (United States)

Box, Simon

2014-12-01

Optimal switching of traffic lights on a network of junctions is a computationally intractable problem. In this research, road traffic networks containing signalized junctions are simulated. A computer game interface is used to enable a human 'player' to control the traffic light settings on the junctions within the simulation. A supervised learning approach, based on simple neural network classifiers, can be used to capture the human player's strategies in the game and thus develop a human-trained machine control (HuTMaC) system that approaches human levels of performance. Experiments conducted within the simulation compare the performance of HuTMaC to two well-established traffic-responsive control systems that are widely deployed in the developed world and also to a temporal difference learning-based control method. In all experiments, HuTMaC outperforms the other control methods in terms of average delay and variance over delay. The conclusion is that these results add weight to the suggestion that HuTMaC may be a viable alternative, or supplemental method, to approximate optimization for some practical engineering control problems where the optimal strategy is computationally intractable.
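The supervised-learning idea above boils down to logging (traffic state, human action) pairs from the game and fitting a classifier to imitate the player. A toy sketch: a nearest-centroid rule stands in for the paper's neural network, and the two-approach junction state is an assumption for illustration:

```python
import numpy as np

class NearestCentroid:
    """Illustrative stand-in for a neural-network classifier: learns
    which signal stage a human player chose for each observed traffic
    state (here, queue lengths on each approach)."""
    def fit(self, states, actions):
        self.labels = np.unique(actions)
        # One centroid per action: mean state in which it was chosen.
        self.centroids = np.array(
            [states[actions == a].mean(axis=0) for a in self.labels])
        return self

    def predict(self, states):
        d = np.linalg.norm(states[:, None, :] - self.centroids[None], axis=2)
        return self.labels[np.argmin(d, axis=1)]

# Toy log of (queue_NS, queue_EW) states and the stage the human chose:
# stage 0 = green to north-south, stage 1 = green to east-west.
states = np.array([[8., 1.], [7., 2.], [1., 9.], [2., 7.]])
actions = np.array([0, 0, 1, 1])
policy = NearestCentroid().fit(states, actions)
```

The trained policy can then drive the simulated signals without the human in the loop.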

12. ORION: a computer code for evaluating environmental concentrations and dose equivalent to human organs or tissue from airborne radionuclides

International Nuclear Information System (INIS)

Shinohara, K.; Nomura, T.; Iwai, M.

1983-05-01

The computer code ORION has been developed to evaluate the environmental concentrations and the dose equivalent to human organs or tissue from airborne radionuclides released from multiple nuclear installations. The modified Gaussian plume model is applied to calculate the dispersion of the radionuclide. Gravitational settling, dry deposition, precipitation scavenging and radioactive decay are considered to be the causes of depletion and deposition on the ground or on vegetation. ORION is written in the FORTRAN IV language and can be run on IBM 360, 370, 303X, 43XX and FACOM M-series computers. 8 references, 6 tables
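The abstract does not specify ORION's modifications, but the underlying standard ground-reflected Gaussian plume formula can be sketched as follows (dispersion parameters are taken as given here; a real code derives them from downwind distance and atmospheric stability class, then applies the depletion and decay terms listed above):

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration at crosswind
    offset y (m) and height z (m), for source strength q (e.g. Bq/s),
    wind speed u (m/s) and effective release height h (m).
    sigma_y, sigma_z (m) are the dispersion parameters at the
    downwind distance of interest."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # Second exponential is the mirror-image term for ground reflection.
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

At ground level on the plume centerline with a ground-level release, this reduces to q/(π·u·σy·σz), a useful sanity check.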

13. Three-dimensional evaluation of human jaw bone microarchitecture: correlation between the microarchitectural parameters of cone beam computed tomography and micro-computer tomography.

Science.gov (United States)

Kim, Jo-Eun; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Huh, Kyung-Hoe

2015-12-01

To evaluate the potential feasibility of cone beam computed tomography (CBCT) in the assessment of trabecular bone microarchitecture. Sixty-eight specimens from four pairs of human jaw were scanned using both micro-computed tomography (micro-CT) of 19.37-μm voxel size and CBCT of 100-μm voxel size. The correlation of 3-dimensional parameters between CBCT and micro-CT was evaluated. All parameters, except bone-specific surface and trabecular thickness, showed linear correlations between the 2 imaging modalities (P < .05). Among the parameters, bone volume, percent bone volume, trabecular separation, and degree of anisotropy (DA) of CBCT images showed strong correlations with those of micro-CT images. DA showed the strongest correlation (r = 0.693). Most microarchitectural parameters from CBCT were correlated with those from micro-CT. Some microarchitectural parameters, especially DA, could be used as strong predictors of bone quality in the human jaw. Copyright © 2015 Elsevier Inc. All rights reserved.
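The reported correlations between CBCT and micro-CT parameters (e.g., r = 0.693 for DA) are Pearson coefficients; a minimal sketch of that computation over paired specimen measurements:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements,
    as used to compare a microarchitectural parameter measured by
    CBCT against the micro-CT reference for the same specimens."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

Values near +1 indicate the CBCT parameter tracks the micro-CT reference linearly; the study additionally tests each correlation for significance (P < .05).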

14. Designing Computer Agents With Facial Personality To Improve Human-Machine Collaboration

National Research Council Canada - National Science Library

Tidball, Brian E

2006-01-01

.... This study examined whether people perceive personality in static digital faces that portray expressions of emotion, and if the digital faces would influence human performance on a simple human...

15. Using Tablet PCs in Classroom for Teaching Human-Computer Interaction: An Experience in High Education

Science.gov (United States)

da Silva, André Constantino; Marques, Daniela; de Oliveira, Rodolfo Francisco; Noda, Edgar

2014-01-01

The use of computers in the teaching and learning process is investigated by many researchers and, nowadays, due to the available diversity of computing devices, tablets are becoming popular in classrooms too. So what are the advantages and disadvantages of using tablets in the classroom? How can we shape the teaching and learning activities to get the best of…

16. University Students and Ethics of Computer Technology Usage: Human Resource Development

Science.gov (United States)

2012-01-01

The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

17. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

International Nuclear Information System (INIS)

Wilpert, B.; Maimer, H.; Loroff, C.

2000-01-01

The project's objective is the evaluation of the reliability of identifying Human Factors as contributing factors with a computer supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step included interviews with experts from the nuclear power industry and the evaluation of existing computer supported event analysis methods. This information was combined into a requirement profile for the CEA software. The next step comprised the implementation of the software in an iterative process of evaluation. The project was completed by testing the CEA software. The testing demonstrated that it is possible to validly identify contributing factors with CEA. In addition, CEA received very positive feedback from the experts. (orig.)

18. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

Directory of Open Access Journals (Sweden)

Qing Xue

2014-01-01

Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context information modeling and processing, establishing relationships between contexts and interface design knowledge through adaptive knowledge reasoning, and implementing the visualization of the adaptive interface with the aid of interface tool technology.

19. Computational voxel phantom, associated to anthropometric and anthropomorphic real phantom for dosimetry in human male pelvis radiotherapy

International Nuclear Information System (INIS)

Silva, Cleuza Helena Teixeira; Campos, Tarcisio Passos Ribeiro de

2005-01-01

This paper addresses a computational voxel model built with the MCNP5 code and the experimental development of an anthropometric and anthropomorphic phantom for dosimetry in human male pelvis brachytherapy focusing on prostatic tumors. For the elaboration of the computational model of the human male pelvis, anatomical section images from the Visible Man Project were used. The selected digital images were each associated with a numeric representation, one for each section. This computational representation of the anatomical sections was transformed into a bi-dimensional mesh of equivalent tissue, and the group of bi-dimensional meshes was concatenated to form the three-dimensional voxel model to be used by the MCNP5 code. In association with the anatomical information, data on the density and chemical composition of the basic elements representative of the organs and tissues involved were set up in a material database for MCNP5. The model will be applied to dosimetric evaluations in situations of irradiation of the human masculine pelvis. This 3D voxel model is associated with the particle transport code MCNP5, allowing future simulations. The construction of a human masculine pelvis phantom was also carried out, based on anthropometric and anthropomorphic data and on the use of equivalent tissues representative of the skin, fatty, muscular and glandular tissues, as well as the bony structure. This part of the work was developed in stages: the bony cast was built first, followed by the muscular structures and internal organs. They were then mounted together and inserted in the skin cast. The representative component of the fatty tissue was incorporated and the final retouching of the skin was accomplished. The final result represents the development of two important tools essential for computational and experimental dosimetry. Thus, it is possible to use them in calibrations of pre-existent protocols in radiotherapy, as well as for tests of new protocols, besides

20. Ontology for assessment studies of human-computer-interaction in surgery.

Science.gov (United States)

Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen

2015-02-01

New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer-interactions (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies for HCI in surgery. The investigation model was formalized in Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction of "information model" and "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The results of the comparison show that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions, concerning at least one of the major implementation requirements. Therefore, the GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate through the whole study process in the form of a kind of standard or good clinical practice, based on the involved foundational frameworks

1. A Novel Feature Optimization for Wearable Human-Computer Interfaces Using Surface Electromyography Sensors

Directory of Open Access Journals (Sweden)

Han Sun

2018-03-01

Full Text Available The novel human-computer interface (HCI) using bioelectrical signals as input is a valuable tool to improve the lives of people with disabilities. In this paper, surface electromyography (sEMG) signals induced by four classes of wrist movements were acquired from four sites on the lower arm with our designed system. Forty-two features were extracted from the time, frequency and time-frequency domains. Optimal channels were determined from the single-channel classification performance rank. The optimal-feature selection was according to modified entropy criteria (EC) and Fisher discrimination (FD) criteria. The feature selection results were evaluated by four different classifiers, and compared with other conventional feature subsets. In online tests, the wearable system acquired real-time sEMG signals. The selected features and trained classifier model were used to control a telecar through four different paradigms in a designed environment with simple obstacles. Performance was evaluated based on travel time (TT) and recognition rate (RR). The results of the hardware evaluation verified the feasibility of our acquisition systems and ensured signal quality. Single-channel analysis results indicated that the channel located on the extensor carpi ulnaris (ECU) performed best, with a mean classification accuracy of 97.45% for all movement pairs. Channels placed on the ECU and the extensor carpi radialis (ECR) were selected according to the accuracy rank. Experimental results showed that the proposed FD method was better than other feature selection methods and single-type features. The combination of FD and random forest (RF) performed best in offline analysis, with 96.77% multi-class RR. Online results illustrated that the state-machine paradigm with a 125 ms window had the highest maneuverability and was closest to real-life control. Subjects could accomplish online sessions by three sEMG-based paradigms, with average times of 46.02, 49.06 and 48.08 s
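The Fisher discrimination (FD) criterion used for feature ranking can be sketched per feature as between-class scatter over within-class scatter; a feature scores high when the movement classes are far apart relative to their spread. The toy data below is an assumption for illustration, not the paper's 42 features:

```python
import numpy as np

def fisher_score(feature, labels):
    """Fisher discrimination criterion for a single feature:
    between-class scatter divided by within-class scatter.
    Higher means the feature separates the classes better."""
    feature = np.asarray(feature, dtype=float)
    labels = np.asarray(labels)
    overall = feature.mean()
    between = within = 0.0
    for c in np.unique(labels):
        xc = feature[labels == c]
        between += len(xc) * (xc.mean() - overall) ** 2
        within += len(xc) * xc.var()
    return between / within

# Two toy sEMG features over two wrist movements: f1 separates the
# classes cleanly, f2 barely at all, so f1 should rank higher.
labels = np.array([0, 0, 0, 1, 1, 1])
f1 = np.array([0.1, 0.2, 0.15, 0.9, 1.0, 0.95])
f2 = np.array([0.5, 0.9, 0.1, 0.6, 0.4, 0.55])
```

Sorting all features by this score and keeping the top-ranked subset is the selection step evaluated against the classifiers in the study.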

2. About possibility of temperature trace observing on a human skin through clothes by using computer processing of IR image

Science.gov (United States)

Trofimov, Vyacheslav A.; Trofimov, Vladislav V.; Shestakov, Ivan L.; Blednov, Roman G.

2017-05-01

One urgent security problem is the detection of objects placed inside the human body. Obviously, for safety reasons one cannot use X-rays widely and often for such object detection. For this purpose, we propose to use a THz camera and an IR camera. Here we continue to examine the possibility of using an IR camera for the detection of a temperature trace on a human body. In contrast to a passive THz camera, the IR camera does not allow an object under clothing to be seen distinctly. Of course, this is a big disadvantage for a security solution based on the IR camera. To find possible ways of overcoming this disadvantage, we performed experiments with an IR camera produced by FLIR Company and developed a novel approach for computer processing of the images it captures. This allows us to increase the effective temperature resolution of the IR camera as well as the effective sensitivity of the human eye. As a consequence, it becomes possible to see a change of human body temperature through clothing. We analyze IR images of a person who drinks water and eats chocolate, and follow the temperature trace on the skin of the human body caused by temperature changes inside the body. Some experiments were made observing the temperature trace from objects placed behind a thick overall. The demonstrated results are very important for the detection of forbidden objects concealed inside the human body, using non-destructive control without X-rays.

3. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

International Nuclear Information System (INIS)

Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie; Mandelli, Diego; Smith, Curtis Lee

2015-01-01

The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS 'pathways,' or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

4. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

Energy Technology Data Exchange (ETDEWEB)

Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Herberger, Sarah Elizabeth Marie [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

2015-09-01

The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

5. A truly human interface: Interacting face-to-face with someone whose words are determined by a computer program

Directory of Open Access Journals (Sweden)

Kevin eCorti

2015-05-01

We use speech shadowing to create situations wherein people converse in person with a human whose words are determined by a conversational agent computer program. Speech shadowing involves a person (the shadower) repeating vocal stimuli originating from a separate communication source in real time. Humans shadowing for conversational agent sources (e.g., chat bots) become hybrid agents (echoborgs) capable of face-to-face interlocution. We report three studies that investigated people’s experiences interacting with echoborgs and the extent to which echoborgs pass as autonomous humans. First, participants in a Turing Test spoke with a chat bot via either a text interface or an echoborg. Human shadowing did not improve the chat bot’s chance of passing but did increase interrogators’ ratings of how human-like the chat bot seemed. In our second study, participants had to decide whether their interlocutor produced words generated by a chat bot or simply pretended to be one. Compared to those who engaged a text interface, participants who engaged an echoborg were more likely to perceive their interlocutor as pretending to be a chat bot. In our third study, participants were naïve to the fact that their interlocutor produced words generated by a chat bot. Unlike those who engaged a text interface, the vast majority of participants who engaged an echoborg neither sensed nor suspected a robotic interaction. These findings have implications for android science, the Turing Test paradigm, and human-computer interaction. The human body, as the delivery mechanism of communication, fundamentally alters the social psychological dynamics of interactions with machine intelligence.

6. Computer-aided diagnosis for phase-contrast X-ray computed tomography: quantitative characterization of human patellar cartilage with high-dimensional geometric features.

Science.gov (United States)

Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Glaser, Christian; Wismüller, Axel

2014-02-01

Phase-contrast computed tomography (PCI-CT) has shown tremendous potential as an imaging modality for visualizing human cartilage with high spatial resolution. Previous studies have demonstrated the ability of PCI-CT to visualize (1) structural details of the human patellar cartilage matrix and (2) changes to chondrocyte organization induced by osteoarthritis. This study investigates the use of high-dimensional geometric features in characterizing such chondrocyte patterns in the presence or absence of osteoarthritic damage. Geometrical features derived from the scaling index method (SIM) and statistical features derived from gray-level co-occurrence matrices were extracted from 842 regions of interest (ROI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. These features were subsequently used in a machine learning task with support vector regression to classify ROIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver-operating characteristic curve (AUC). SIM-derived geometrical features exhibited the best classification performance (AUC, 0.95 ± 0.06) and were most robust to changes in ROI size. These results suggest that such geometrical features can provide a detailed characterization of the chondrocyte organization in the cartilage matrix in an automated and non-subjective manner, while also enabling classification of cartilage as healthy or osteoarthritic with high accuracy. Such features could potentially serve as imaging markers for evaluating osteoarthritis progression and its response to different therapeutic intervention strategies.
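The SIM features and cartilage data behind this record are not public, so the following sketch only illustrates the evaluation pipeline on synthetic data: score regions of interest with a learned discriminant (a simple linear score stands in for the paper's support vector regression stage) and measure classification performance as the area under the ROC curve, computed here via the Mann-Whitney rank identity:

```python
import numpy as np

def auc_score(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank identity."""
    labels, scores = np.asarray(labels), np.asarray(scores)
    pos, neg = scores[labels == 1], scores[labels == 0]
    # Probability that a random positive outscores a random negative (ties = 1/2).
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Synthetic stand-in for SIM-derived geometric features: class 1
# (osteoarthritic) is shifted relative to class 0 (healthy).
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 842)                      # 842 ROIs, as in the study
X = rng.standard_normal((842, 4)) + 1.2 * y[:, None]

# A linear discriminant (difference of class centroids) stands in for
# the paper's SVR scoring stage.
w = X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0)
auc = auc_score(y, X @ w)
```

An AUC of 0.5 corresponds to chance-level scoring; 1.0 to perfect separation, the scale on which the paper's 0.95 ± 0.06 result is reported.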

7. Simple computational modeling for human extracorporeal irradiation using the BNCT facility of the RA-3 Reactor

International Nuclear Information System (INIS)

Farias, Ruben; Gonzalez, S.J.; Bellino, A.; Sztenjberg, M.; Pinto, J.; Thorp, Silvia I.; Gadan, M.; Pozzi, Emiliano; Schwint, Amanda E.; Heber, Elisa M.; Trivillin, V.A.; Zarza, Leandro G.; Estryk, Guillermo; Miller, M.; Bortolussi, S.; Soto, M.S.; Nigg, D.W.

2009-01-01

We present a simple computational model of the RA-3 reactor developed using the Monte Carlo transport code MCNP. The model parameters are adjusted to reproduce experimentally measured points in air, and the source validation is performed in an acrylic phantom. Performance analysis is carried out using computational models of animal extracorporeal irradiation in liver and lung. Analysis is also performed inside a neutron-shielded receptacle used for the irradiation of rats with a model of hepatic metastases. The computational model reproduces the experimental behavior in all the analyzed cases with a maximum difference of 10 percent. (author)

8. A human-assisted computer generated LA-grammar for simple ...

African Journals Online (AJOL)

Southern African Linguistics and Applied Language Studies ... of computer programs to generate Left Associative Grammars (LAGs) for natural languages is described. The generation proceeds from examples of correct sentences and needs ...

9. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

Science.gov (United States)

2011-03-21

throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about...

10. Human brain as the model for a new computer system. I

Energy Technology Data Exchange (ETDEWEB)

Holtz, K; Langheld, E

1981-11-25

The authors discuss ideas for the design of a self-teaching, program-free associative computer. A simple system, requiring no programming, is based on the construction of a system of concepts with self-organizing data.

11. A computational method for probabilistic safety assessment of I and C systems and human operators in nuclear power plants

International Nuclear Information System (INIS)

Kim, Man Cheol; Seong, Poong Hyun

2006-01-01

To make probabilistic safety assessment (PSA) more realistic, improvements to human reliability analysis (HRA) are essential. Current HRA methods have many limitations, including the lack of consideration of the interdependency between instrumentation and control (I and C) systems and human operators, and the lack of a theoretical basis for the situation assessment of human operators. To overcome these limitations, we propose a new method for the quantitative safety assessment of I and C systems and human operators. The proposed method is developed based on computational models of the knowledge-driven monitoring and situation assessment of human operators, with consideration of the interdependency between I and C systems and human operators. The application of the proposed method to an example situation demonstrates that its quantitative description of a probable scenario matches well with the qualitative description of that scenario. It is also demonstrated that the proposed method can probabilistically consider all possible scenarios and can be used to quantitatively evaluate the effects of various context factors on the safety of nuclear power plants. In our opinion, the proposed method can be used as the basis for the development of advanced HRA methods.

12. Development and application of the Chinese adult female computational phantom Rad-HUMAN

International Nuclear Information System (INIS)

Wu, Yican; Cheng, Mengyun; Wang, Wen; Fan, Yanchang; Zhao, Kai; He, Tao; Pei, Xi; Shang, Leiming; Chen, Chaobin; Long, Pengcheng; Cao, Ruifen; Wang, Guozhong; Zhou, Shaoheng; Yu, Shengpeng; Hu, Liqin; Zeng, Q.

2013-01-01

Rad-HUMAN is a whole-body numerical phantom of a Chinese adult woman which contains 46 organs and tissues; it was created with the MCAM6 software using the color photographs of the Chinese Visible Human dataset. This dataset was obtained from a 22-year-old Chinese female cadaver judged to represent normal human anatomy as closely as possible. The densities and elemental compositions recommended in ICRP Publication 89 and ICRU Report 44 were assigned to the organs and tissues of Rad-HUMAN for radiation protection purposes. The last step was to implement the anatomical data into a Monte Carlo code. Rad-HUMAN contains more than 28.8 billion tiny volume units, which produces an accurate whole-body numerical phantom of a Chinese adult female.

13. Realization of a quantum Hamiltonian Boolean logic gate on the Si(001):H surface.

Science.gov (United States)

Kolmer, Marek; Zuzak, Rafal; Dridi, Ghassen; Godlewski, Szymon; Joachim, Christian; Szymonski, Marek

2015-08-07

The design and construction of the first prototypical QHC (Quantum Hamiltonian Computing) atomic scale Boolean logic gate is reported using scanning tunnelling microscope (STM) tip-induced atom manipulation on an Si(001):H surface. The NOR/OR gate truth table was confirmed by dI/dU STS (Scanning Tunnelling Spectroscopy) tracking how the surface states of the QHC quantum circuit on the Si(001):H surface are shifted according to the input logical status.

14. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

Science.gov (United States)

Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

2013-01-01

Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

15. Efficient computation of electrograms and ECGs in human whole heart simulations using a reaction-eikonal model.

Science.gov (United States)

Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot

2017-10-01

Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium, as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms, and ECGs at the body surface with high fidelity, and offers vast computational savings of more than three orders of magnitude. Due to their efficiency, R-E models are ideally suited for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.

16. Functional physiology of the human terminal antrum defined by high-resolution electrical mapping and computational modeling.

Science.gov (United States)

Berry, Rachel; Miyagawa, Taimei; Paskaranandavadivel, Niranchan; Du, Peng; Angeli, Timothy R; Trew, Mark L; Windsor, John A; Imai, Yohsuke; O'Grady, Gregory; Cheng, Leo K

2016-11-01

High-resolution (HR) mapping has been used to study gastric slow-wave activation; however, the specific characteristics of antral electrophysiology remain poorly defined. This study applied HR mapping and computational modeling to define functional human antral physiology. HR mapping was performed in 10 subjects using flexible electrode arrays (128-192 electrodes; 16-24 cm²) arranged from the pylorus to mid-corpus. Anatomical registration was by photographs and anatomical landmarks. Slow-wave parameters were computed, and the resultant data were incorporated into a computational fluid dynamics (CFD) model of gastric flow to calculate the impact on gastric mixing. In all subjects, extracellular mapping demonstrated normal aboral slow-wave propagation and a region of increased amplitude and velocity in the prepyloric antrum. On average, the high-velocity region commenced 28 mm proximal to the pylorus, and activation ceased 6 mm from the pylorus. Within this region, velocity increased 0.2 mm/s per mm of tissue, from a mean of 3.3 ± 0.1 mm/s to 7.5 ± 0.6 mm/s. These results show that human terminal antral contraction is controlled by a short region of rapid high-amplitude slow-wave activity. Distal antral wave acceleration plays a major role in antral flow and mixing, increasing particle strain and trituration. Copyright © 2016 the American Physiological Society.

17. Diagnostic Accuracy of Periapical Radiography and Cone-beam Computed Tomography in Identifying Root Canal Configuration of Human Premolars.

Science.gov (United States)

Sousa, Thiago Oliveira; Haiter-Neto, Francisco; Nascimento, Eduarda Helena Leandro; Peroni, Leonardo Vieira; Freitas, Deborah Queiroz; Hassan, Bassam

2017-07-01

The aim of this study was to assess the diagnostic accuracy of periapical radiography (PR) and cone-beam computed tomographic (CBCT) imaging in the detection of the root canal configuration (RCC) of human premolars. PR and CBCT imaging of 114 extracted human premolars were evaluated by 2 oral radiologists. RCC was recorded according to Vertucci's classification. Micro-computed tomographic imaging served as the gold standard to determine RCC. Accuracy, sensitivity, specificity, and predictive values were calculated. The Friedman test compared both PR and CBCT imaging with the gold standard. CBCT imaging showed higher values for all diagnostic tests compared with PR. Accuracy was 0.55 and 0.89 for PR and CBCT imaging, respectively. There was no difference between CBCT imaging and the gold standard, whereas PR differed from both CBCT and micro-computed tomographic imaging (P < .0001). CBCT imaging was more accurate than PR for evaluating different types of RCC individually. Canal configuration types III, VII, and "other" were poorly identified on CBCT imaging with a detection accuracy of 50%, 0%, and 43%, respectively. With PR, all canal configurations except type I were poorly visible. PR presented low performance in the detection of RCC in premolars, whereas CBCT imaging showed no difference compared with the gold standard. Canals with complex configurations were less identifiable using both imaging methods, especially PR. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
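The diagnostic quantities reported in this record (accuracy, sensitivity, specificity, and predictive values) all derive from a 2x2 confusion matrix against the micro-CT gold standard. A small sketch with illustrative counts (the paper reports accuracies of 0.55 for PR and 0.89 for CBCT, not the underlying confusion matrices, so the numbers below are invented):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 confusion-matrix measures used in diagnostic accuracy studies."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Illustrative counts only, for a hypothetical 114-tooth sample.
m = diagnostic_metrics(tp=45, fp=5, fn=8, tn=56)
```

Note that predictive values, unlike sensitivity and specificity, depend on the prevalence of the condition in the sample.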

18. Computational modeling of blast wave interaction with a human body and assessment of traumatic brain injury

Science.gov (United States)

Tan, X. G.; Przekwas, A. J.; Gupta, R. K.

2017-11-01

The modeling of human body biomechanics resulting from blast exposure poses great challenges because of the complex geometry and the substantial material heterogeneity. We developed a detailed human body finite element model representing both the geometry and the materials realistically. The model includes the detailed head (face, skull, brain and spinal cord), the neck, the skeleton, air cavities (lungs) and the tissues. Hence, it can be used to properly model the stress wave propagation in the human body subjected to blast loading. The blast loading on the human body was generated from a simulated C4 explosion. We used the highly scalable solvers in the multi-physics code CoBi for both the blast simulation and the human body biomechanics. The meshes generated for these simulations are of good quality, so that relatively large time-step sizes can be used without resorting to artificial time scaling treatments. The coupled gas dynamics and biomechanics solutions were validated against shock tube test data. The human body models were used to conduct parametric simulations to find the biomechanical response and the brain injury mechanism due to blasts impacting the human body. Under the same blast loading condition, we showed the importance of including the whole body.

19. Computer-aided diagnosis in phase contrast imaging X-ray computed tomography for quantitative characterization of ex vivo human patellar cartilage.

Science.gov (United States)

Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Glaser, Christian; Wismuller, Axel

2013-10-01

Visualization of the ex vivo human patellar cartilage matrix through phase contrast imaging X-ray computed tomography (PCI-CT) has been previously demonstrated. Such studies revealed osteoarthritis-induced changes to chondrocyte organization in the radial zone. This study investigates the application of texture analysis to characterizing such chondrocyte patterns in the presence and absence of osteoarthritic damage. Texture features derived from Minkowski functionals (MF) and gray-level co-occurrence matrices (GLCM) were extracted from 842 regions of interest (ROI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. These texture features were subsequently used in a machine learning task with support vector regression to classify ROIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver operating characteristic curve (AUC). The best classification performance was observed with the MF features perimeter (AUC: 0.94 ± 0.08) and "Euler characteristic" (AUC: 0.94 ± 0.07), and the GLCM-derived feature "Correlation" (AUC: 0.93 ± 0.07). These results suggest that such texture features can provide a detailed characterization of the chondrocyte organization in the cartilage matrix, enabling classification of cartilage as healthy or osteoarthritic with high accuracy.

20. Quality of human-computer interaction - results of a national usability survey of hospital-IT in Germany

Directory of Open Access Journals (Sweden)

Bundschuh Bettina B

2011-11-01

Background: Due to the increasing functionality of medical information systems, it is hard to imagine day-to-day work in hospitals without IT support. Therefore, the design of dialogues between humans and information systems is one of the most important issues to be addressed in health care. This survey presents an analysis of the current quality level of human-computer interaction of healthcare IT in German hospitals, focused on the users' point of view. Methods: To evaluate the usability of clinical IT according to the design principles of EN ISO 9241-10, the IsoMetrics Inventory, an assessment tool, was used. The focus of this paper is on suitability for the task, training effort, and conformity with user expectations, differentiated by information system. Effectiveness was evaluated with a focus on the interoperability and functionality of different IT systems. Results: 4521 persons from 371 hospitals visited the start page of the study, and 1003 persons from 158 hospitals completed the questionnaire. The results show relevant variations between different information systems. Conclusions: Specialised information systems with defined functionality received better assessments than clinical information systems in general. This could be attributed to the improved customisation of these specialised systems for specific working environments. The results can be used as reference data for the evaluation and benchmarking of human-computer engineering in the clinical health IT context in future studies.

1. Brain-Computer Interface Controlled Cyborg: Establishing a Functional Information Transfer Pathway from Human Brain to Cockroach Brain.

Science.gov (United States)

Li, Guangye; Zhang, Dingguo

2016-01-01

An all-chain-wireless brain-to-brain system (BTBS), which enables motion control of a cyborg cockroach via the human brain, was developed in this work. A steady-state visual evoked potential (SSVEP) based brain-computer interface (BCI) was used in this system for recognizing human motion intention, and an optimization algorithm was proposed to improve the online performance of the BCI. The cyborg cockroach was developed by surgically integrating a portable microstimulator that could generate invasive electrical nerve stimulation. Through Bluetooth communication, specific electrical pulse trains could be triggered from the microstimulator by BCI commands and sent through the antenna nerve to stimulate the brain of the cockroach. A series of experiments was designed and conducted to test the overall performance of the BTBS with six human subjects and three cockroaches. The experimental results showed that the online classification accuracy of the three-mode BCI increased from 72.86% to 78.56% (a gain of 5.70 percentage points) using the optimization algorithm, and the mean response accuracy of the cyborgs using this system reached 89.5%. Moreover, the results also showed that the cyborg could be navigated by the human brain to complete walking along an S-shaped track with a success rate of about 20%, suggesting the proposed BTBS established a feasible functional information transfer pathway from the human brain to the cockroach brain.
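The paper's SSVEP optimization algorithm is not described in the abstract. The standard baseline for a three-mode SSVEP BCI is to pick the candidate flicker frequency carrying the most spectral power in the recorded EEG. A sketch on simulated data (sampling rate, frequencies, and noise level are all illustrative assumptions):

```python
import numpy as np

def classify_ssvep(signal, fs, candidate_freqs):
    """Pick the stimulus frequency with the largest spectral power.

    This is the classic FFT baseline for SSVEP decoding; the paper's own
    optimization algorithm is not specified in the abstract.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # Power at the spectral bin nearest each candidate flicker frequency.
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(powers))]

# Simulate 2 s of EEG at 250 Hz: a 10 Hz SSVEP response buried in noise.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(7)
eeg = np.sin(2 * np.pi * 10 * t) + 0.8 * rng.standard_normal(t.size)
command = classify_ssvep(eeg, fs, [8.0, 10.0, 12.0])  # three-mode BCI
```

A 2 s window gives 0.5 Hz frequency resolution, enough to separate 8, 10, and 12 Hz targets; real systems often use canonical correlation analysis instead of single-bin power.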

2. Computer Models of the Human Body Signature for Sensing Through the Wall Radar Applications

National Research Council Canada - National Science Library

Dogaru, Traian; Nguyen, Lam; Le, Calvin

2007-01-01

.... We analyze the radar cross section (RCS) of the human body in different configurations as a function of aspect angle, frequency, and polarization, drawing important conclusions in terms of the magnitude, variability, and statistics...

National Research Council Canada - National Science Library

Meyer, David

2004-01-01

... (Executive-Process/Interactive Control) was developed, applied to several types of tasks to accurately represent human performance, and inspired to collection of new data that cast new light on the scientific analysis of key phenomena...

4. Computational Integration of Human Genetic Data to Evaluate AOP-Specific Susceptibility

Science.gov (United States)

There is a need for approaches to efficiently evaluate human genetic variability and susceptibility related to environmental chemical exposure. Direct estimation of the genetic contribution to variability in susceptibility to environmental chemicals is only possible in special ca...

5. Computational comparison of β-mannosidases of animals, humans, microbes, and plants

OpenAIRE

2011-01-01

The β-mannosidase (MANB) enzyme is involved in removing mannose residues from the nonreducing end, and its impaired activity leads to β-mannosidosis. MANB amino acid sequences of humans, other mammals, plants, fungi, and bacteria were compared to determine their similarities, differences, and predicted 3D structures. Our cloned MANB DNA sequence showed a 99% similarity to a previously reported human MANB DNA sequence, but 16 nucleotide differences were observed, showing the polymorphic nature o...

6. Computational comparison of β-mannosidases of animals, humans, microbes, and plants

OpenAIRE

2014-01-01

The β-mannosidase (MANB) enzyme is involved in removing mannose residues from the nonreducing end, and its impaired activity leads to β-mannosidosis. MANB amino acid sequences of humans, other mammals, plants, fungi, and bacteria were compared to determine their similarities, differences, and predicted 3D structures. Our cloned MANB DNA sequence showed a 99% similarity to a previously reported human MANB DNA sequence, but 16 nucleotide differences were observed, showing the polymorphic nature o...

7. Robotic lower limb prosthesis design through simultaneous computer optimizations of human and prosthesis costs

Science.gov (United States)

Handford, Matthew L.; Srinivasan, Manoj

2016-02-01

Robotic lower limb prostheses can improve the quality of life for amputees. Development of such devices, currently dominated by long prototyping periods, could be sped up by predictive simulations. In contrast to some amputee simulations which track experimentally determined non-amputee walking kinematics, here, we explicitly model the human-prosthesis interaction to produce a prediction of the user's walking kinematics. We obtain simulations of an amputee using an ankle-foot prosthesis by simultaneously optimizing human movements and prosthesis actuation, minimizing a weighted sum of human metabolic and prosthesis costs. The resulting Pareto optimal solutions predict that increasing prosthesis energy cost, decreasing prosthesis mass, and allowing asymmetric gaits all decrease human metabolic rate for a given speed and alter human kinematics. The metabolic rates increase monotonically with speed. Remarkably, by performing an analogous optimization for a non-amputee human, we predict that an amputee walking with an appropriately optimized robotic prosthesis can have a lower metabolic cost than a non-amputee, even when the non-amputee's ankle torques are assumed to be cost-free.
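The weighted-sum optimization described in this record can be illustrated on a toy design space: sweeping the scalarization weight between human metabolic cost and prosthesis cost traces out a family of Pareto-optimal designs. The quadratic cost curves below are purely illustrative stand-ins, not the paper's gait-dynamics model:

```python
import numpy as np

# Toy stand-in costs over a 1-D prosthesis design parameter x (the real
# model optimizes full gait dynamics; these quadratics are illustrative).
x = np.linspace(0.0, 1.0, 201)
human_cost = (x - 0.8) ** 2 + 1.0       # metabolic cost, minimized near x = 0.8
prosthesis_cost = (x - 0.2) ** 2 + 0.5  # actuator cost, minimized near x = 0.2

# Sweep the scalarization weight w to trace out Pareto-optimal designs:
# each w gives the design minimizing w*human + (1-w)*prosthesis cost.
pareto = []
for w in np.linspace(0.0, 1.0, 11):
    total = w * human_cost + (1 - w) * prosthesis_cost
    pareto.append(float(x[int(np.argmin(total))]))
```

As the weight shifts from prosthesis cost to human cost, the optimal design moves monotonically between the two single-objective optima; weighted-sum scalarization recovers the full Pareto front here because both toy costs are convex.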

8. Human Computer Interaction (HCI) and Internet Residency: Implications for Both Personal Life and Teaching/Learning

Science.gov (United States)

Crearie, Linda

2016-01-01

Technological advances over the last decade have had a significant impact on the teaching and learning experiences students encounter today. We now take technologies such as Web 2.0, mobile devices, cloud computing, podcasts, social networking, super-fast broadband, and connectedness for granted. So what about the student use of these types of…

9. A common currency for the computation of motivational values in the human striatum

NARCIS (Netherlands)

Sescousse, G.T.; Li, Y.; Dreher, J.C.

2015-01-01

Reward comparison in the brain is thought to be achieved through the use of a 'common currency', implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

10. A common currency for the computation of motivational values in the human striatum

NARCIS (Netherlands)

Sescousse, G.T.; Li, Y.; Dreher, J.C.

2014-01-01

Reward comparison in the brain is thought to be achieved through the use of a ‘common currency’, implying that reward value representations are computed on a unique scale in the same brain regions regardless of the reward type. Although such a mechanism has been identified in the ventro-medial

11. Human Computing in the Life Sciences: What does the future hold?

NARCIS (Netherlands)

Fikkert, F.W.

2007-01-01

In future computing environments you will be surrounded and supported by all kinds of technologies. Characteristically, you can interact with them in a natural way: you can speak to, point at, or even frown at some piece of presented information, and the environment understands your intent.

12. Development of a Computer-Assisted Cranial Nerve Simulation from the Visible Human Dataset

Science.gov (United States)

Yeung, Jeffrey C.; Fung, Kevin; Wilson, Timothy D.

2011-01-01

Advancements in technology and personal computing have allowed for the development of novel teaching modalities such as online web-based modules. These modules are currently being incorporated into medical curricula and, in some paradigms, have been shown to be superior to classroom instruction. We believe that these modules have the potential of…

13. Toward affective brain-computer interfaces : exploring the neurophysiology of affect during human media interaction

NARCIS (Netherlands)

Mühl, C.

2012-01-01

Affective Brain-Computer Interfaces (aBCI), the sensing of emotions from brain activity, seems a fantasy from the realm of science fiction. But unlike faster-than-light travel or teleportation, aBCI seems almost within reach due to novel sensor technologies, the advancement of neuroscience, and the

14. A computer vision system for rapid search inspired by surface-based attention mechanisms from human perception.

Science.gov (United States)

Mohr, Johannes; Park, Jong-Han; Obermayer, Klaus

2014-12-01

Humans are highly efficient at visual search tasks by focusing selective attention on a small but relevant region of a visual scene. Recent results from biological vision suggest that surfaces of distinct physical objects form the basic units of this attentional process. The aim of this paper is to demonstrate how such surface-based attention mechanisms can speed up a computer vision system for visual search. The system uses fast perceptual grouping of depth cues to represent the visual world at the level of surfaces. This representation is stored in short-term memory and updated over time. A top-down guided attention mechanism sequentially selects one of the surfaces for detailed inspection by a recognition module. We show that the proposed attention framework requires little computational overhead (about 11 ms), but enables the system to operate in real-time and leads to a substantial increase in search efficiency.

15. Computational determination of the effects of virulent Escherichia coli and salmonella bacteriophages on human gut.

Science.gov (United States)

2016-10-01

Salmonella and Escherichia coli are bacteria that cause food poisoning in humans. In the elderly, infants, and people with chronic conditions, it is very dangerous if Salmonella or E. coli enters the bloodstream; such infections may then be treated with phage therapy. Treating Salmonella and E. coli by phage therapy affects the gut flora. This paper presents a system for detecting the effects of virulent E. coli and Salmonella bacteriophages on the human gut. A method based on the Domain-Domain Interactions (DDIs) model is implemented in the proposed system to determine the interactions between the proteins of human gut bacteria and the proteins of the bacteriophages that infect virulent E. coli and Salmonella. The system helps gastroenterologists assess the effect of injecting these bacteriophages on the human gut. Tested on Enterobacteria phage 933W, Enterobacteria phage VT2-Sa, and Enterobacteria phage P22, the system found four interactions between the proteins of the bacteriophages that infect E. coli O157:H7, E. coli O104:H4, and Salmonella typhimurium and the proteins of human gut bacterium strains. Several effects were detected, such as: antibacterial activity against a number of bacterial species in the human gut; regulation of cellular differentiation and organogenesis during gut, lung, and heart development; ammonia assimilation in bacteria, yeasts, and plants; and activation of the defense system and its function in the detoxification of lipopolysaccharide and in the prevention of bacterial translocation in the human gut.
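A hedged sketch of the DDI-based matching idea described in this record: given domain annotations for phage proteins and gut-bacterial proteins, plus a table of known domain-domain interactions, a protein-level interaction is predicted for every cross-set pair that shares at least one interacting domain pair. All protein and domain identifiers below are invented placeholders, not the system's actual data.

```python
# Hedged sketch of Domain-Domain Interaction (DDI) matching: two proteins
# are predicted to interact if any of their domains form a known DDI pair.
# All protein and domain identifiers are invented placeholders.

# Known interacting domain pairs (order-independent).
known_ddis = {frozenset({"PF_lysin", "PF_peptidoglycan"}),
              frozenset({"PF_tail_fiber", "PF_receptor"})}

# Domain annotations: phage proteins vs. gut-bacterium proteins.
phage_proteins = {"phageP1": {"PF_lysin"}, "phageP2": {"PF_capsid"}}
gut_proteins = {"gutP1": {"PF_peptidoglycan"}, "gutP2": {"PF_transport"}}

def predicted_interactions(set_a, set_b, ddis):
    """All cross-set protein pairs linked by at least one known DDI."""
    return [(a, b)
            for a, doms_a in set_a.items()
            for b, doms_b in set_b.items()
            if any(frozenset({da, db}) in ddis
                   for da in doms_a for db in doms_b)]

print(predicted_interactions(phage_proteins, gut_proteins, known_ddis))
# [('phageP1', 'gutP1')]
```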

16. A Computer Simulation Approach to the Study of Effects of Deck Surface Compliance on Initial Impact Impulse Forces in Human Gait

National Research Council Canada - National Science Library

Bretz, David

2000-01-01

.... One proposal for reducing knee disorders is to install more compliant decking. The goal of this thesis is to develop a computer model of the human gait that estimates the transarticulation forces...

17. Big data challenges in decoding cortical activity in a human with quadriplegia to inform a brain computer interface.

Science.gov (United States)

Friedenberg, David A; Bouton, Chad E; Annetta, Nicholas V; Skomrock, Nicholas; Mingming Zhang; Schwemmer, Michael; Bockbrader, Marcia A; Mysiw, W Jerry; Rezai, Ali R; Bresler, Herbert S; Sharma, Gaurav

2016-08-01

Recent advances in Brain Computer Interfaces (BCIs) have created hope that one day paralyzed patients will be able to regain control of their paralyzed limbs. As part of an ongoing clinical study, we have implanted a 96-electrode Utah array in the motor cortex of a paralyzed human. The array generates almost 3 million data points from the brain every second. This presents several big data challenges towards developing algorithms that should not only process the data in real-time (for the BCI to be responsive) but are also robust to temporal variations and non-stationarities in the sensor data. We demonstrate an algorithmic approach to analyze such data and present a novel method to evaluate such algorithms. We present our methodology with examples of decoding human brain data in real-time to inform a BCI.
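The abstract's figure of "almost 3 million data points from the brain every second" is consistent with a 96-channel array sampled at roughly 30 kHz per channel, a rate typical of such recordings; the per-channel rate is an assumption here, as the abstract states only the aggregate. A quick sanity check:

```python
# Sanity check on the aggregate data rate of a 96-electrode array.
# The 30 kHz per-channel sampling rate is an assumption (typical for
# Utah-array recordings); the abstract gives only the total figure.
n_channels = 96
sampling_rate_hz = 30_000  # assumed, not stated in the abstract

samples_per_second = n_channels * sampling_rate_hz
print(samples_per_second)  # prints 2880000, i.e. "almost 3 million"
```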

18. Human factors guidelines and methodology in the design of a user computer interface: a case study

International Nuclear Information System (INIS)

Richards, R.E.; Gilmore, W.E.; Haney, L.N.

1986-01-01

In this case study, human factors personnel were requested to participate in a project team of programmers and operations specialists to design a cathode ray tube (CRT) display system for a complex process control application. This presentation describes the process and benefits obtained by incorporating human factors guidelines and methods in system design. Standard human engineering guidelines and techniques were utilized by the project team. In addition, previously published documents and research findings sponsored by the US Nuclear Regulatory Commission (USNRC) were used. Preliminary tasks involved a review of the draft plant procedures. Then, interviews with operators were conducted to establish the initial information for the displays. These initial requirements were evaluated against existing guidelines and criteria to determine the optimal presentation formats. Detailed steps of the approaches used, design decisions made, and tradeoffs that resulted in the final user acceptable design are discussed. 7 refs., 2 figs

19. Plutonium detection in humans using octagonal computer-generated color patterns

International Nuclear Information System (INIS)

Phillips, W.G.; Curtis, S.P.

1985-01-01

Routine analysis of humans for plutonium lung burdens is accomplished with two phoswich low-energy gamma detectors. The analysis of data from each detector provides the spectroscopist with a total of eight parameters. These parameters are normalized and displayed as an octagonal histogram overlaid against the historical analyses of uncontaminated humans similar in body geometry, i.e., weight, height, and chest thickness. Subjects containing lung burdens of plutonium within one standard deviation of the historical average yield data which are displayed on a color graphics terminal as a green octagon. Analyses which yield values greater than 1 sigma above the historical average produce a distorted yellow, orange, or red display. Thus, through color and pattern recognition, the analyst may see at a glance whether the current data statistically indicate human contamination.
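A minimal sketch of the color-coding logic described above, assuming the display stays green within 1 sigma of the historical mean and escalates through yellow, orange, and red at larger deviations; the abstract specifies only the 1 sigma green boundary, so the higher thresholds are illustrative assumptions.

```python
# Hedged sketch: classify eight normalized detector parameters by their
# worst-case deviation from historical means of uncontaminated subjects.
# The yellow/orange/red thresholds are illustrative assumptions; only the
# 1-sigma green boundary comes from the record above.

def octagon_color(values, means, stdevs):
    """Return a display color from the worst-case z-score of 8 parameters."""
    z_max = max(abs(v - m) / s for v, m, s in zip(values, means, stdevs))
    if z_max <= 1.0:
        return "green"   # within historical variation
    elif z_max <= 2.0:
        return "yellow"  # assumed threshold
    elif z_max <= 3.0:
        return "orange"  # assumed threshold
    return "red"

means = [10.0] * 8
stdevs = [2.0] * 8
print(octagon_color([10.5] * 8, means, stdevs))          # green
print(octagon_color([10.0] * 7 + [15.0], means, stdevs))  # orange (z = 2.5)
```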

20. A heuristic model for computational prediction of human branch point sequence.

Science.gov (United States)

Wen, Jia; Wang, Jue; Zhang, Qing; Guo, Dianjing

2017-10-24

Pre-mRNA splicing is the removal of introns from precursor mRNAs (pre-mRNAs) and the concurrent ligation of the flanking exons to generate mature mRNA. This process is catalyzed by the spliceosome, where splicing factor 1 (SF1) specifically recognizes the seven-nucleotide branch point sequence (BPS) and the U2 snRNP later displaces SF1 and binds to the BPS. In mammals, the degeneracy of BPS motifs, together with the lack of a large set of experimentally verified BPSs, complicates the task of BPS prediction in silico. In this paper, we develop a simple yet efficient heuristic model for human BPS prediction based on a novel scoring scheme, which quantifies the splicing strength of putative BPSs. Each candidate BPS is restricted exclusively to a defined BPS search region to avoid the influence of other elements in the intron, and the prediction accuracy is thereby improved. Moreover, using two types of relative frequencies for human BPS prediction, we demonstrate that our model outperforms other current implementations on experimentally verified human introns. We propose that the binding energy contributes to the molecular recognition involved in human pre-mRNA splicing. In addition, a genome-wide human BPS prediction is carried out. The characteristics of the predicted BPSs are in accordance with experimentally verified human BPSs, and branch site positions relative to the 3'ss and the 5' end of the shortened AGEZ are consistent with the results of published papers. A web server for the BPS predictor is freely available at http://biocomputer.bio.cuhk.edu.hk/BPS.
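The paper's actual scoring scheme (two types of relative frequencies) is not reproduced here; as a hedged illustration of the general idea, the sketch below scores every 7-nt window inside a restricted search region with a log-odds position weight matrix and keeps the best candidate. The per-position frequencies are toy values favoring a branch-point adenosine, not the paper's data.

```python
import math

# Hedged illustration of BPS scoring: a log-odds position weight matrix
# over 7-nt windows. The frequencies below are toy values, NOT the
# relative frequencies used in the paper; only the idea (score every
# candidate inside a restricted search region, keep the best) is kept.

BACKGROUND = 0.25
# Toy per-position frequencies favoring an A at 0-based position 5,
# loosely mimicking the branch-point adenosine of the human consensus.
TOY_FREQS = [
    {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},  # position 0
    {"A": 0.10, "C": 0.30, "G": 0.10, "T": 0.50},  # 1: pyrimidine-rich
    {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},  # 2
    {"A": 0.10, "C": 0.40, "G": 0.10, "T": 0.40},  # 3: pyrimidine-rich
    {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25},  # 4
    {"A": 0.85, "C": 0.05, "G": 0.05, "T": 0.05},  # 5: branch-point A
    {"A": 0.10, "C": 0.40, "G": 0.10, "T": 0.40},  # 6: pyrimidine-rich
]

def score_window(seq7):
    """Log-odds splicing-strength score of one 7-nt candidate."""
    return sum(math.log2(TOY_FREQS[i][b] / BACKGROUND)
               for i, b in enumerate(seq7))

def best_bps(search_region):
    """Best-scoring 7-nt window inside the restricted search region."""
    windows = [search_region[i:i + 7]
               for i in range(len(search_region) - 6)]
    return max(windows, key=score_window)

print(best_bps("GGGTACTAACTTTT"))
```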

1. Contrasting human- and computer-generated english: the case of football match report

OpenAIRE

Viluckas, Paulius

2017-01-01

This paper aims to compare two kinds of football reports: those written by football reporters and those automatically generated by a video game. The corpus of real reports consists of match reports from the BBC website, while the corpus of computer-generated language has been compiled from the video game Football Manager 2017. The aim of the study is to compare two varieties of football discourse by applying the lexical bundle approach (Biber et al. 2004). More specifically, the analysis involves a compariso...

2. Maximal thickness of the normal human pericardium assessed by electron-beam computed tomography

International Nuclear Information System (INIS)

Delille, J.P.; Hernigou, A.; Sene, V.; Chatellier, G.; Boudeville, J.C.; Challande, P.; Plainfosse, M.C.

1999-01-01

The purpose of this study was to determine the maximal value of normal pericardial thickness with an electron-beam computed tomography unit allowing fast scan times of 100 ms to reduce cardiac motion artifacts. Electron-beam computed tomography was performed in 260 patients with hypercholesterolemia and/or hypertension, as these pathologies have no effect on pericardial thickness. The pixel size was 0.5 mm. Measurements could be performed in front of the right ventricle, the right atrioventricular groove, the right atrium, the left ventricle, and the interventricular groove. Maximal thickness of normal pericardium was defined at the 95th percentile. Inter-observer and intra-observer reproducibility studies were assessed from additional CT scans by the Bland and Altman method [24]. The maximal thickness of the normal pericardium was 2 mm in 95 % of cases. For the reproducibility studies, there was no significant relationship between the inter-observer and intra-observer measurements, but all pericardial thickness measurements were ≤ 1.6 mm. Using electron-beam computed tomography, which substantially reduces cardiac motion artifacts, the threshold for detection of thickened pericardium is statistically established as 2 mm for 95 % of the patients with hypercholesterolemia and/or hypertension. However, the available spatial resolution prevents a reproducible measure of the real thickness of thin pericardium. (orig.)
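The 2 mm cutoff above is defined as the 95th percentile of the measured thicknesses. A minimal sketch of how such a threshold is derived, using the simple nearest-rank method; the thickness values below are invented for illustration, not the study's data.

```python
import math

# Hedged sketch: deriving a normality threshold as the 95th percentile
# of a set of measurements (nearest-rank method). The thickness values
# are illustrative only, not the study's data.

def percentile(data, p):
    """Nearest-rank percentile: smallest value with >= p% of data at or below it."""
    ordered = sorted(data)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

thicknesses_mm = [0.8, 1.0, 1.1, 1.2, 1.2, 1.3, 1.4, 1.5, 1.6, 2.0]
print(percentile(thicknesses_mm, 95))  # prints 2.0 for this toy sample
```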

3. Maximal thickness of the normal human pericardium assessed by electron-beam computed tomography

Energy Technology Data Exchange (ETDEWEB)

Delille, J.P.; Hernigou, A.; Sene, V.; Chatellier, G.; Boudeville, J.C.; Challande, P.; Plainfosse, M.C. [Service de Radiologie Centrale, Hopital Broussais, Paris (France)

1999-08-01

The purpose of this study was to determine the maximal value of normal pericardial thickness with an electron-beam computed tomography unit allowing fast scan times of 100 ms to reduce cardiac motion artifacts. Electron-beam computed tomography was performed in 260 patients with hypercholesterolemia and/or hypertension, as these pathologies have no effect on pericardial thickness. The pixel size was 0.5 mm. Measurements could be performed in front of the right ventricle, the right atrioventricular groove, the right atrium, the left ventricle, and the interventricular groove. Maximal thickness of normal pericardium was defined at the 95th percentile. Inter-observer and intra-observer reproducibility studies were assessed from additional CT scans by the Bland and Altman method [24]. The maximal thickness of the normal pericardium was 2 mm in 95 % of cases. For the reproducibility studies, there was no significant relationship between the inter-observer and intra-observer measurements, but all pericardial thickness measurements were ≤ 1.6 mm. Using electron-beam computed tomography, which substantially reduces cardiac motion artifacts, the threshold for detection of thickened pericardium is statistically established as 2 mm for 95 % of the patients with hypercholesterolemia and/or hypertension. However, the available spatial resolution prevents a reproducible measure of the real thickness of thin pericardium. (orig.) With 6 figs., 1 tab., 31 refs.

4. Examining human behavior in video games: The development of a computational model to measure aggression.

Science.gov (United States)

Lamb, Richard; Annetta, Leonard; Hoston, Douglas; Shapiro, Marina; Matthews, Benjamin

2018-06-01

Video games with violent content have raised considerable concern in popular media and within academia. Recently, there has been considerable attention regarding the claimed relationship between aggression and video game play. The authors of this study propose the use of a new class of tools, developed via computational models, to allow examination of the question of whether there is a relationship between violent video games and aggression. The purpose of this study is to computationally model and compare the General Aggression Model with the Diathesis Model of Aggression related to the play of violent content in video games. A secondary purpose is to provide a method of measuring and examining individual aggression arising from video game play. A total of N = 1065 participants were examined. This study occurs in three phases. Phase 1 is the development and quantification of the profile combination of traits via latent class profile analysis. Phase 2 is the training of the artificial neural network. Phase 3 is the comparison of each model as a computational model with and without the presence of video game violence. Results suggest that a combination of environmental factors and genetic predispositions trigger aggression related to video games.

5. Development and validation of a new dynamic computer-controlled model of the human stomach and small intestine.

Science.gov (United States)

Guerra, Aurélie; Denis, Sylvain; le Goff, Olivier; Sicardi, Vincent; François, Olivier; Yao, Anne-Françoise; Garrait, Ghislain; Manzi, Aimé Pacifique; Beyssac, Eric; Alric, Monique; Blanquet-Diot, Stéphanie

2016-06-01

For ethical, regulatory, and economic reasons, in vitro human digestion models are increasingly used as an alternative to in vivo assays. This study aims to present the new Engineered Stomach and small INtestine (ESIN) model and its validation for pharmaceutical applications. This dynamic computer-controlled system reproduces, according to in vivo data, the complex physiology of the human stomach and small intestine, including pH, transit times, chyme mixing, digestive secretions, and passive absorption of digestion products. Its innovative design allows a progressive meal intake and the differential gastric emptying of solids and liquids. The pharmaceutical behavior of two model drugs (paracetamol immediate release form and theophylline sustained release tablet) was studied in ESIN during liquid digestion. The results were compared to those found with a classical compendial method (paddle apparatus) and in human volunteers. Paracetamol and theophylline tablets showed similar absorption profiles in ESIN and in healthy subjects. For theophylline, a level A in vitro-in vivo correlation could be established between the results obtained in ESIN and in humans. Interestingly, using a pharmaceutical basket, the swelling and erosion of the theophylline sustained release form was followed during transit throughout ESIN. ESIN emerges as a relevant tool for pharmaceutical studies but once further validated may find many other applications in nutritional, toxicological, and microbiological fields. Biotechnol. Bioeng. 2016;113: 1325-1335. © 2015 Wiley Periodicals, Inc.

6. Computer Modeling and Simulation of Bullet Impact to the Human Thorax

National Research Council Canada - National Science Library

Jolly, Johannes

2000-01-01

.... The objective of the study was to create a viable finite element model of the human thorax. The model was validated by comparing the results of tests of body armor systems conducted on cadavers to results obtained from finite element analysis...

7. Elucidating Mechanisms of Molecular Recognition Between Human Argonaute and miRNA Using Computational Approaches

KAUST Repository

Jiang, Hanlun; Zhu, Lizhe; Héliou, Amélie; Gao, Xin; Bernauer, Julie; Huang, Xuhui

2016-01-01

that are geometrically accessible to miRNA. Using our recent work on human AGO2 as an example, we explain the rationale and the workflow of our method in details. This combined approach holds great promise to complement experiments in unraveling the mechanisms

8. Exploring the Human Element of Computer-Assisted Language Learning: An Iranian Context

Science.gov (United States)

Fatemi Jahromi, Seyed Abolghassem; Salimi, Farimah

2013-01-01

Based on various theories of human agency (Ajzen, I. (2005). "Attitudes, personality and behavior" (2nd ed.). London: Open University Press; Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. "MIS Quarterly", 13, 319-340; Rogers, E.M. (1983). "Diffusion of…

9. An Investigative Laboratory Course in Human Physiology Using Computer Technology and Collaborative Writing

Science.gov (United States)

FitzPatrick, Kathleen A.

2004-01-01

Active investigative student-directed experiences in laboratory science are being encouraged by national science organizations. A growing body of evidence from classroom assessment supports their effectiveness. This study describes four years of implementation and assessment of an investigative laboratory course in human physiology for 65…

10. Computing the English Middle Ages: A Sociotechnical Study of Medievalists' Engagement with Digital Humanities

Science.gov (United States)

Simpson, Grant Leyton

2017-01-01

With few exceptions, digital humanities projects and objects have been described rather than studied. This dissertation attempts to advance that discourse by empirically studying, from a sociotechnical point of view, DH projects and the products they produce, specifically those within the realm of Old and Middle English language and literature.…

11. ERP Human Enhancement Progress Report : Use case and computational model for adaptive maritime automation

NARCIS (Netherlands)

Kleij, R. van der; Broek, J. van den; Brake, G.M. te; Rypkema, J.A.; Schilder, C.M.C.

2015-01-01

Automation is often applied in order to increase the cost-effectiveness, reliability and safety of maritime ship and offshore operations. Automation of operator tasks, has not, however, eliminated human error so much as created opportunities for new kinds of error. The ambition of the Adaptive

12. Human-Computer Systems Interaction: Backgrounds and Applications 2, Part 1

CERN Document Server

Kulikowski, Juliusz; Mroczek, Teresa

2012-01-01

The main contemporary human-system interaction (H-SI) problems consist in the design and/or improvement of tools for the effective exchange of information between individual humans or human groups and the technical systems created to aid humans in reaching their vital goals. This book is the second issue in a series devoted to the novel H-SI results and contributions achieved in recent years by many research groups in European and extra-European countries. Preliminary (usually shortened) versions of the chapters were presented as conference papers at the 3rd International Conference on H-SI held in Rzeszow, Poland, in 2010. The large number of valuable papers selected for publication made it necessary to publish the book in two volumes. The given, 1st volume consists of sections devoted to: I. Decision Supporting Systems, II. Distributed Knowledge Bases and WEB Systems, and III. Impaired Persons Aiding Systems. The decision supporting systems concern various application areas, like enterprises mana...

13. The Use of Computational Human Performance Modeling as a Task Analysis Tool

Energy Technology Data Exchange (ETDEWEB)

Jacques Hugo; David Gertman

2012-07-01

14. Computational Breakthrough of Natural Lead Hits from the Genus of Arisaema against Human Respiratory Syncytial Virus.

Science.gov (United States)

Kant, Kamal; Lal, Uma Ranjan; Ghosh, Manik

2018-01-01

To date, efforts for the prevention and treatment of human respiratory syncytial virus (RSV) infection have been in vain, and there is no safe, effective, clinically accepted vaccine. The genus Arisaema is claimed to have various traditional bioactivities, but scientific assessments are quite limited. This encouraged us to carry out the present study on around 60 phytoconstituents of different Arisaema species as natural inhibitors against human RSV. The 60 selected phytochemical entities were evaluated for their docking behavior against the human RSV receptor (PDB: 4UCC) using Maestro 9.3 (Schrödinger, LLC, Cambridge, USA). Furthermore, the kinetic properties and toxicity of the top-ranked ligands were analyzed with the QikProp and ProTox tools. Notably, rutin (glide score: -8.49), schaftoside (glide score: -8.18), and apigenin-6,8-di-C-β-D-galactoside (glide score: -7.29) emerged as promising natural lead hits with an ideal range of kinetic descriptor values. The ProTox tool (oral rodent toxicity) indicated the likely toxicity targets of the top-ranked ligands. Finally, this work can be explored further as a model whose anti-human-RSV potential is to be confirmed with wet laboratory experiments. Rutin, schaftoside, and apigenin-6,8-di-C-β-D-galactoside showed a promising docking profile against human respiratory syncytial virus. Moreover, the absorption, distribution, metabolism, and excretion properties (QikProp) of the top hits were within an ideal range of kinetic descriptors. The ProTox tool highlighted toxicity class ranges, LD50 values, and possible toxicity targets of the top-ranked tested ligands. Abbreviations used: RSV: respiratory syncytial virus; PRRSV: porcine respiratory and reproductive syndrome virus; ADME-T: absorption, distribution, metabolism, excretion, and toxicity.

15. Distribution of recombination hotspots in the human genome--a comparison of computer simulations with real data.

Directory of Open Access Journals (Sweden)

Dorota Mackiewicz

Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After sufficiently long time of simulations, the structure of chromosomes reached a dynamic equilibrium, in which number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar.

16. Distribution of Recombination Hotspots in the Human Genome – A Comparison of Computer Simulations with Real Data

Science.gov (United States)

Mackiewicz, Dorota; de Oliveira, Paulo Murilo Castro; Moss de Oliveira, Suzana; Cebrat, Stanisław

2013-01-01

Recombination is the main cause of genetic diversity. Thus, errors in this process can lead to chromosomal abnormalities. Recombination events are confined to narrow chromosome regions called hotspots in which characteristic DNA motifs are found. Genomic analyses have shown that both recombination hotspots and DNA motifs are distributed unevenly along human chromosomes and are much more frequent in the subtelomeric regions of chromosomes than in their central parts. Clusters of motifs roughly follow the distribution of recombination hotspots whereas single motifs show a negative correlation with the hotspot distribution. To model the phenomena related to recombination, we carried out computer Monte Carlo simulations of genome evolution. Computer simulations generated uneven distribution of hotspots with their domination in the subtelomeric regions of chromosomes. They also revealed that purifying selection eliminating defective alleles is strong enough to cause such hotspot distribution. After sufficiently long time of simulations, the structure of chromosomes reached a dynamic equilibrium, in which number and global distribution of both hotspots and defective alleles remained statistically unchanged, while their precise positions were shifted. This resembles the dynamic structure of human and chimpanzee genomes, where hotspots change their exact locations but the global distributions of recombination events are very similar. PMID:23776462

17. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

Science.gov (United States)

Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

2013-12-01

Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013 at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together.

18. Application of a computational situation assessment model to human system interface design and experimental validation of its effectiveness

International Nuclear Information System (INIS)

Lee, Hyun-Chul; Koh, Kwang-Yong; Seong, Poong-Hyun

2013-01-01

Highlights:
- We validate the effectiveness of the proposed procedure through an experiment.
- The proposed procedure addresses the salient coding of the key information.
- Salience coding was found to affect operators' attention significantly.
- A first observation of the key information quickly guided operators to correct situation awareness.
- The proposed procedure was validated as effective for better situation awareness.

Abstract: To evaluate the effects of human cognitive characteristics on situation awareness, a computational situation assessment model of nuclear power plant operators has been developed, as well as a procedure for applying the developed model to the design of human-system interfaces (HSIs). The concept of the proposed procedure is to identify the key information source which, when operators attend to it, is expected to guarantee fast and accurate diagnosis. The developed computational model is used to search the diagnostic paths and the key information source. In this study, an experiment with twelve trained participants was executed to validate the effectiveness of the proposed procedure. Eighteen scenarios covering various accidents were administered twice for each subject, and experimental data were collected and analyzed. The data analysis validated that the salience level of information sources significantly influences the attention of operators, and that a first observation of the key information sources leads operators to a quick and correct situation assessment. Therefore, we conclude that the proposed procedure for applying the developed model to HSI design is effective.

19. A Computer Clone of Human Expert for Mobility Management Scheme (E-MMS): Step toward Green Transportation

Science.gov (United States)

Resdiansyah; O. K Rahmat, R. A.; Ismail, A.

2018-03-01

Green transportation refers to sustainable transport that has the least social and environmental impact while still being able to supply energy sources globally; it includes the deployment of non-motorized transport strategies to promote healthy lifestyles, also known as a Mobility Management Scheme (MMS). As construction of road infrastructure cannot solve the problem of congestion, past research has shown that MMS is an effective measure to mitigate congestion and to achieve green transportation. MMS consists of different strategies and policies, subdivided into categories according to how they are able to influence travel behaviour. Appropriate selection of mobility strategies will ensure their effectiveness in mitigating congestion problems. Nevertheless, determining appropriate strategies requires a human expert and depends on a number of success factors. This research has successfully developed a computer clone of a human expert, called E-MMS. The knowledge acquisition process for MMS strategies, and the subsequent strategy selection process, have been encoded in a knowledge-based system using an expert system shell. The newly developed system was verified, validated, and evaluated (VV&E) by comparing its output with the recommendations of a real transportation expert.

20. Computational Fluid Dynamics Modeling of the Human Pulmonary Arteries with Experimental Validation.

Science.gov (United States)

Bordones, Alifer D; Leroux, Matthew; Kheyfets, Vitaly O; Wu, Yu-An; Chen, Chia-Yuan; Finol, Ender A

2018-05-21

Pulmonary hypertension (PH) is a chronic progressive disease characterized by elevated pulmonary arterial pressure, caused by an increase in pulmonary arterial impedance. Computational fluid dynamics (CFD) can be used to identify metrics representative of the stage of PH disease. However, experimental validation of CFD models is often not pursued due to the geometric complexity of the model or uncertainties in the reproduction of the required flow conditions. The goal of this work is to validate experimentally a CFD model of a pulmonary artery phantom using a particle image velocimetry (PIV) technique. Rapid prototyping was used for the construction of the patient-specific pulmonary geometry, derived from chest computed tomography angiography images. CFD simulations were performed with the pulmonary model with a Reynolds number matching those of the experiments. Flow rates, the velocity field, and shear stress distributions obtained with the CFD simulations were compared to their counterparts from the PIV flow visualization experiments. Computationally predicted flow rates were within 1% of the experimental measurements for three of the four branches of the CFD model. The mean velocities in four transversal planes of study were within 5.9 to 13.1% of the experimental mean velocities. Shear stresses were qualitatively similar between the two methods with some discrepancies in the regions of high velocity gradients. The fluid flow differences between the CFD model and the PIV phantom are attributed to experimental inaccuracies and the relative compliance of the phantom. This comparative analysis yielded valuable information on the accuracy of CFD predicted hemodynamics in pulmonary circulation models.
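The branch-flow agreement quoted above ("within 1% of the experimental measurements for three of the four branches") is a relative difference between simulated and measured flow rates. A hedged sketch of that comparison; the branch names and flow values below are invented for illustration, not the study's data.

```python
# Hedged sketch: percent difference between CFD-predicted and
# PIV-measured branch flow rates. All branch names and numbers are
# illustrative placeholders, not the study's data.

def percent_diff(cfd, piv):
    """Relative difference of a CFD prediction from a PIV measurement, in %."""
    return abs(cfd - piv) / piv * 100.0

# Invented flow rates (L/min), (CFD, PIV), for four model branches.
branches = {"branch1": (2.01, 2.00), "branch2": (1.99, 2.00),
            "branch3": (0.995, 1.00), "branch4": (1.06, 1.00)}

for name, (cfd, piv) in branches.items():
    print(f"{name}: {percent_diff(cfd, piv):.1f}%")
```

With these toy numbers, three branches fall within 1% and one does not, mirroring the pattern reported in the abstract.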

1. Enhanced operational safety of BWRs by advanced computer technology and human engineering

International Nuclear Information System (INIS)

Tomizawa, T.; Fukumoto, A.; Neda, T.; Toda, Y.; Takizawa, Y.

1984-01-01

In BWR nuclear power plants, where unit capacity is increasing and the demand for assured safety is growing, it has become important for the information interface between man and machine to work smoothly. Efforts to improve man-machine communication have been going on for the past ten years in Japan. Computer facilities and colour CRT display systems are amongst the most useful new methods. Advanced computer technology has been applied to operating plants and found to be very helpful for safe operation. A display monitoring system (DMS) is in operation in a 1100 MW(e) BWR plant. A total combination test was successfully completed on the 'plant operation by displayed information and automation' system (PODIA) in February 1983 before shipment to the site. The objective of this test was to verify the improved qualification of the newly developed advanced PODIA man-machine system by this enlarged fabrication test concept. In addition, the development of special graphics displays for the main control room and technical support centre to assist operators in assessing plant safety and diagnosing problems is required to meet post-TMI regulations. For this purpose, a prototype safety parameter display system (called Toshiba SPDS) with two colour CRT displays and a computer (TOSBAC-7/70) was developed in 1981 as an independent safety monitoring system. The PODIA and SPDS are now independent systems, but their combination has been found to be more useful and valuable for nuclear power plant safety. The paper discusses supervisory and operational concepts in the advanced main control room including SPDS, and describes the PODIA and SPDS verification tests including the valuable experience obtained after improvements in the qualification of these systems had been made to satisfactory operational safety levels. (author)

2. AirDraw: Leveraging Smart Watch Motion Sensors for Mobile Human Computer Interactions

OpenAIRE

Sajjadi, Seyed A; Moazen, Danial; Nahapetian, Ani

2017-01-01

Wearable computing is one of the fastest growing technologies today. Smart watches are poised to take over at least half of the wearable device market in the near future. Smart watch screen size, however, is a limiting factor for growth, as it restricts practical text input. On the other hand, wearable devices have some features, such as consistent user interaction and hands-free, heads-up operation, which pave the way for gesture-recognition methods of text entry. This paper proposes a new...

3. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

Science.gov (United States)

Amat-Ur-Rasool, Hafsa; Ahmed, Mehboob

2015-01-01

4. Human versus Computer Controlled Selection of Ventilator Settings: An Evaluation of Adaptive Support Ventilation and Mid-Frequency Ventilation

Directory of Open Access Journals (Sweden)

Eduardo Mireles-Cabodevila

2012-01-01

Background. There are modes of mechanical ventilation that can select ventilator settings with computer-controlled algorithms (targeting schemes). Two examples are adaptive support ventilation (ASV) and mid-frequency ventilation (MFV). We studied how clinician-chosen ventilator settings differ from those selected by these computer algorithms under different scenarios. Methods. A survey of critical care clinicians provided reference ventilator settings for a 70 kg paralyzed patient in five clinical/physiological scenarios. The survey-derived values for minute ventilation and minute alveolar ventilation were used as goals for ASV and MFV, respectively. A lung simulator programmed with each scenario's respiratory system characteristics was ventilated using the clinician, ASV, and MFV settings. Results. Tidal volumes ranged from 6.1 to 8.3 mL/kg for the clinicians, 6.7 to 11.9 mL/kg for ASV, and 3.5 to 9.9 mL/kg for MFV. Inspiratory pressures were lower for ASV and MFV. Clinician-selected tidal volumes were similar to the ASV settings for all scenarios except asthma, in which the tidal volumes were larger for ASV and MFV. MFV delivered the same alveolar minute ventilation with higher end-expiratory and lower end-inspiratory volumes. Conclusions. There are differences and similarities among initial ventilator settings selected by humans and computers for various clinical scenarios. The ventilation outcomes are the result of the lung's physiological characteristics and their interaction with the targeting scheme.

5. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

Directory of Open Access Journals (Sweden)

Hafsa Amat-Ur-Rasool

6. Computer-Aided Design (CAD) Tools to Support the Human Factors Design Teams

Science.gov (United States)

Null, Cynthia H.; Jackson, Mariea D.; Perry, Trey; Quick, Jason C.; Stokes, Jack W.

2014-01-01

The scope of this assessment was to develop a library of basic 1-Gravity (G) human posture and motion elements used to construct complex virtual simulations of ground processing and maintenance tasks for spaceflight vehicles, including launch vehicles, crewed spacecraft, robotic spacecraft, satellites, and other payloads. The report herein describes the task, its purpose, performance, findings, NASA Engineering and Safety Center (NESC) recommendations, and conclusions in the definition and assemblage of the postures and motions database (PMD).

7. Computational Modeling of Human Metabolism and Its Application to Systems Biomedicine.

Science.gov (United States)

Aurich, Maike K; Thiele, Ines

2016-01-01

Modern high-throughput techniques offer immense opportunities to investigate whole-systems behavior, such as those underlying human diseases. However, the complexity of the data presents challenges in interpretation, and new avenues are needed to address the complexity of both diseases and data. Constraint-based modeling is one formalism applied in systems biology. It relies on a genome-scale reconstruction that captures extensive biochemical knowledge regarding an organism. The human genome-scale metabolic reconstruction is increasingly used to understand normal cellular and disease states because metabolism is an important factor in many human diseases. The application of human genome-scale reconstruction ranges from mere querying of the model as a knowledge base to studies that take advantage of the model's topology and, most notably, to functional predictions based on cell- and condition-specific metabolic models built based on omics data. An increasing number and diversity of biomedical questions are being addressed using constraint-based modeling and metabolic models. One of the most successful biomedical applications to date is cancer metabolism, but constraint-based modeling also holds great potential for inborn errors of metabolism or obesity. In addition, it offers great prospects for individualized approaches to diagnostics and the design of disease prevention and intervention strategies. Metabolic models support this endeavor by providing easy access to complex high-throughput datasets. Personalized metabolic models have been introduced. Finally, constraint-based modeling can be used to model whole-body metabolism, which will enable the elucidation of metabolic interactions between organs and disturbances of these interactions as either causes or consequence of metabolic diseases. This chapter introduces constraint-based modeling and describes some of its contributions to systems biomedicine.
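
At the core of constraint-based modeling is the steady-state constraint S·v = 0, where S is the stoichiometric matrix and v the flux vector. A minimal sketch on a toy three-reaction network (hypothetical, nothing like the genome-scale human reconstruction) shows how admissible steady-state fluxes fall in the null space of S:

```python
import numpy as np

# Steady-state constraint of constraint-based modeling: fluxes v with
# S @ v = 0. Toy 2-metabolite, 3-reaction network (an assumption for
# illustration):  R1: -> A    R2: A -> B    R3: B ->
S = np.array([[ 1, -1,  0],   # metabolite A balance
              [ 0,  1, -1]])  # metabolite B balance

# Null space of S: rows of Vt beyond the rank of S.
_, _, Vt = np.linalg.svd(S)
rank = np.linalg.matrix_rank(S)
basis = Vt[rank:]

v = basis[0]
print(v / v[0])  # normalized steady-state flux vector -> [1. 1. 1.]
```

Here every steady-state flux distribution has all three reactions carrying equal flux, which is what the linear chain R1-R2-R3 demands; genome-scale tools add objective functions and flux bounds on top of this constraint.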

8. Computational study of aggregation mechanism in human lysozyme[D67H].

Directory of Open Access Journals (Sweden)

Dharmeshkumar Patel

Aggregation of proteins is an undesirable phenomenon that affects both human health and bioengineered products such as therapeutic proteins. Finding preventative measures could be facilitated by a molecular-level understanding of dimer formation, which is the first step in aggregation. Here we present a molecular dynamics (MD) study of dimer formation propensity in human lysozyme and its D67H variant. Because the latter protein aggregates while the former does not, they offer an ideal system for testing the feasibility of the proposed MD approach, which comprises three stages: (i) partially unfolded conformers involved in dimer formation are generated via high-temperature MD simulations, (ii) potential dimer structures are searched using docking and refined with MD, (iii) free energy calculations are performed to find the most stable dimer structure. Our results provide a detailed explanation for how a single mutation (D67H) turns human lysozyme from a non-aggregating into an aggregating protein. Conversely, the proposed method can be used to identify the residues causing aggregation in a protein, which can then be mutated to prevent it.

9. Detection of small traumatic hemorrhages using a computer-generated average human brain CT.

Science.gov (United States)

Afzali-Hashemi, Liza; Hazewinkel, Marieke; Tjepkema-Cloostermans, Marleen C; van Putten, Michel J A M; Slump, Cornelis H

2018-04-01

Computed tomography is a standard diagnostic imaging technique for patients with traumatic brain injury (TBI). A limitation is the poor-to-moderate sensitivity for small traumatic hemorrhages. A pilot study using an automatic method to detect hemorrhages [Formula: see text] in diameter in patients with TBI is presented. We have created an average image from 30 normal noncontrast CT scans that were automatically aligned using deformable image registration as implemented in Elastix software. Subsequently, the average image was aligned to the scans of TBI patients, and the hemorrhages were detected by a voxelwise subtraction of the average image from the CT scans of nine TBI patients. An experienced neuroradiologist and a radiologist in training assessed the presence of hemorrhages in the final images and determined the false positives and false negatives. The 9 CT scans contained 67 small hemorrhages, of which 97% were correctly detected by our system. The neuroradiologist detected three false positives, and the radiologist in training found two false positives. For one patient, our method showed a hemorrhagic contusion that was originally missed. Comparing individual CT scans with a computed average may assist physicians in detecting small traumatic hemorrhages in patients with TBI.
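
The detection step above is a voxelwise subtraction of a computed average from the patient scan, followed by thresholding. A minimal 1-D sketch with synthetic data (the scan size, intensity scale, seed, and threshold are all illustrative assumptions) captures the idea:

```python
import numpy as np

# Voxelwise subtraction of a computed average from a patient scan.
# Synthetic 1-D "scans" stand in for registered CT volumes.
rng = np.random.default_rng(0)
normals = rng.normal(40.0, 2.0, size=(30, 100))   # 30 normal scans, HU-like
average = normals.mean(axis=0)                    # computed average image

patient = average + rng.normal(0.0, 2.0, size=100)
patient[50:53] += 30.0                            # small simulated hemorrhage

difference = patient - average
candidates = np.flatnonzero(difference > 15.0)    # hyperdense voxels
print(candidates)                                 # indices near 50..52
```

In the paper the same subtraction is done in 3-D after deformable registration; the threshold here simply separates the simulated hyperdensity from registration-free noise.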

10. Computational composites

DEFF Research Database (Denmark)

Vallgårda, Anna K. A.; Redström, Johan

2007-01-01

Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view..., the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture...

11. Computer-aided training sensorimotor cortex functions in humans before the upper limb transplantation using virtual reality and sensory feedback.

Science.gov (United States)

Kurzynski, Marek; Jaskolska, Anna; Marusiak, Jaroslaw; Wolczowski, Andrzej; Bierut, Przemyslaw; Szumowski, Lukasz; Witkowski, Jerzy; Kisiel-Sajewicz, Katarzyna

2017-08-01

One of the biggest problems of upper limb transplantation is the lack of certainty as to whether a patient will be able to control voluntary movements of the transplanted hands. Based on findings of recent research on brain cortex plasticity, a premise can be drawn that mental training supported with visual and sensory feedback can cause structural and functional reorganization of the sensorimotor cortex, which leads to recovery of the functions associated with the control of movements performed by the upper limbs. In this study the authors, based on the above observations, propose the computer-aided training (CAT) system which, by generating visual and sensory stimuli, should enhance the effectiveness of mental training applied to humans before upper limb transplantation. The basis for the concept of the computer-aided training system is a virtual hand whose reaching and grasping movements the trained patient can observe on the VR headset screen (visual feedback) and whose contact with virtual objects the patient can feel as a touch (sensory feedback). The computer training system is composed of three main components: (1) a system generating the 3D virtual world in which the patient sees the virtual limb from the perspective as if it were his/her own hand; (2) sensory feedback transforming information about the interaction of the virtual hand with the grasped object into mechanical vibration; (3) the therapist's panel for controlling the training course. Results of the case study demonstrate that mental training supported with visual and sensory stimuli generated by the computer system leads to a beneficial change of the brain activity related to motor control of reaching in a patient with bilateral upper limb congenital transverse deficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

12. Installation and test of new human machine interface of the HANARO control computer

International Nuclear Information System (INIS)

Kim, Min Jin; Kim, Y. K.; Choi, Y. S.; Jung, H. S.; Kim, H. K.; Wu, J. S.

2002-06-01

As a first step of the long-term replacement plan, we upgraded BCS, the HMI of the HANARO control computer. The ProcessSuite system imported this time consists of a workstation-class PC and an application program that is compatible with the MLC and runs on Windows NT 4.0. The operation data storage function, which had been disabled due to a disk drive failure in BCS, is now enabled, and a log-scale display and a secure means of entering demand power have been made available. The configuration of the ProcessSuite system was found to be mostly correct, although some discrepancies were found and corrected. Further work includes the addition of graphic display screens based on the flow diagram, selection and procurement of an alarm printer, provision for chart recorders, transmission of important operating parameters, and so on. After that, we will produce documents to prove the performance of the new system and prepare a user manual for operators.

13. Permanency analysis on human electroencephalogram signals for pervasive Brain-Computer Interface systems.

Science.gov (United States)

Sadeghi, Koosha; Junghyo Lee; Banerjee, Ayan; Sohankar, Javad; Gupta, Sandeep K S

2017-07-01

Brain-Computer Interface (BCI) systems use some permanent features of brain signals to recognize their corresponding cognitive states with high accuracy. However, these features are not perfectly permanent, and a BCI system should be continuously retrained over time, which is tedious and time consuming. Thus, analyzing the permanency of signal features is essential in determining how often to repeat training. In this paper, we monitor electroencephalogram (EEG) signals and analyze their behavior over a continuous and relatively long period of time. In our experiment, we recorded EEG signals corresponding to the rest state (eyes open and closed) from one subject every day, for three and a half months. The results show that signal features such as auto-regression coefficients remain permanent through time, while others, such as power spectral density in the 5-7 Hz frequency band, do not. In addition, eyes-open EEG data shows more permanency than eyes-closed data.
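
Tracking a feature session by session and summarizing its spread is the essence of the permanency analysis above. A minimal sketch, using 5-7 Hz band power on synthetic sinusoidal "sessions" with a drifting amplitude (sampling rate, signal model, and drift are all illustrative assumptions):

```python
import numpy as np

FS = 250  # assumed sampling rate, Hz

def band_power(x, lo=5.0, hi=7.0, fs=FS):
    """Mean spectral power of x in the [lo, hi] Hz band (FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

t = np.arange(FS * 4) / FS                      # 4 s per session
# Ten daily "sessions": a 6 Hz rhythm whose amplitude drifts day by day.
sessions = [(1.0 + 0.1 * day) * np.sin(2 * np.pi * 6.0 * t) for day in range(10)]
powers = np.array([band_power(s) for s in sessions])

# Coefficient of variation across sessions: a low CV would suggest a
# permanent feature, a high CV a feature needing frequent retraining.
cv = powers.std() / powers.mean()
print(f"band-power CV over sessions: {cv:.2f}")
```

The same summary applied to a stable feature (e.g., auto-regression coefficients in the study) would yield a much smaller CV than a drifting one.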

14. Visualization and analysis of flow patterns of human carotid bifurcation by computational fluid dynamics

International Nuclear Information System (INIS)

Xue Yunjing; Gao Peiyi; Lin Yan

2007-01-01

Objective: To investigate flow patterns at the carotid bifurcation in vivo by combining computational fluid dynamics (CFD) and MR angiography imaging. Methods: Seven subjects underwent contrast-enhanced MR angiography of the carotid artery on a Siemens 3.0 T MR scanner. Flow patterns of the carotid artery bifurcation were calculated and visualized by combining MR vascular imaging post-processing and CFD. Results: The flow patterns of the carotid bifurcations in the 7 subjects varied with the phases of the cardiac cycle. Turbulent flow and backflow occurred at the bifurcation and the proximal internal carotid artery (ICA) and external carotid artery (ECA); their occurrence and conformation varied with the phase of the cardiac cycle. The turbulent flow and backflow faded out quickly as the blood flowed to the distal ICA and ECA. Conclusion: CFD combined with MR angiography can be utilized to visualize the cyclical change of flow patterns of the carotid bifurcation during the different phases of the cardiac cycle. (authors)

15. An advanced computational bioheat transfer model for a human body with an embedded systemic circulation.

Science.gov (United States)

Coccarelli, Alberto; Boileau, Etienne; Parthimos, Dimitris; Nithiarasu, Perumal

2016-10-01

In the present work, an elaborate one-dimensional thermofluid model for a human body is presented. By contrast to the existing pure conduction-/perfusion-based models, the proposed methodology couples the arterial fluid dynamics of a human body with a multi-segmental bioheat model of surrounding solid tissues. In the present configuration, arterial flow is included through a network of elastic vessels. More than a dozen solid segments are employed to represent the heat conduction in the surrounding tissues, and each segment is constituted by a multilayered circular cylinder. Such multi-layers allow flexible delineation of the geometry and incorporation of properties of different tissue types. The coupling of solid tissue and fluid models requires subdivision of the arterial circulation into large and small arteries. The heat exchange between tissues and arterial wall occurs by convection in large vessels and by perfusion in small arteries. The core region, including the heart, provides the inlet conditions for the fluid equations. In the proposed model, shivering, sweating, and perfusion changes constitute the basis of the thermoregulatory system. The equations governing flow and heat transfer in the circulatory system are solved using a locally conservative Galerkin approach, and the heat conduction in the surrounding tissues is solved using a standard implicit backward Euler method. To investigate the effectiveness of the proposed model, temperature field evolutions are monitored at different points of the arterial tree and in the surrounding tissue layers. To study the differences due to flow-induced convection effects on thermal balance, the results of the current model are compared against those of the widely used modelling methodologies. The results show that the convection significantly influences the temperature distribution of the solid tissues in the vicinity of the arteries. Thus, the inner convection has a more predominant role in the human body heat
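
The tissue side of the model above is advanced with a standard implicit backward Euler method. A minimal sketch of that scheme for one homogeneous 1-D layer with fixed-temperature ends (grid size, diffusivity, time step, and boundary temperature are all illustrative assumptions, not the paper's multi-segment geometry):

```python
import numpy as np

# Backward (implicit) Euler step for 1-D heat conduction:
# (I - r * Laplacian) T_new = T_old, with Dirichlet boundary conditions.
N, L = 51, 0.05                 # grid points, layer thickness (m)
dx = L / (N - 1)
alpha = 1.4e-7                  # tissue-like thermal diffusivity (m^2/s)
dt = 1.0                        # time step (s)
r = alpha * dt / dx**2

A = np.eye(N) * (1 + 2 * r)
A += np.diag([-r] * (N - 1), 1) + np.diag([-r] * (N - 1), -1)
A[0, :], A[-1, :] = 0.0, 0.0    # boundary rows: identity, so ends stay fixed
A[0, 0] = A[-1, -1] = 1.0

T = np.full(N, 30.0)            # initial tissue temperature (C)
T[0] = T[-1] = 37.0             # warm boundaries (e.g., perfused side)

for _ in range(600):            # 10 minutes of simulated time
    T = np.linalg.solve(A, T)   # implicit step: unconditionally stable

print(f"mid-layer temperature after 10 min: {T[N//2]:.2f} C")
```

The implicit formulation is why the scheme tolerates a 1 s step even though the explicit stability limit for this grid would be far smaller; the full model couples many such layers to the arterial network.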

16. A computer algorithm for the differentiation between lung and gastrointestinal tract activities in the human body

International Nuclear Information System (INIS)

Mellor, R.A.; Harrington, C.L.; Bard, S.T.

1984-01-01

Proposed changes to 10CFR20 combining internal and external exposures will require accurate and precise in vivo bioassay data. One of the many uncertainties in the interpretation of in vivo bioassay data is imprecise knowledge of the location of any observed radioactivity within the body of an individual. Attempts to minimize this uncertainty have been made by collimating the field of view of a single photon detector to each organ or body system of concern. In each of these cases, full removal of any potential gamma flux from organs other than the desired organ is not achieved. In certain commercially available systems this "cross talk" may range from 20 to 40 percent. A computerized algorithm has been developed which resolves this "cross talk" for all observed radionuclides in a system composed of two high-purity germanium photon detectors separately viewing the lung and GI regions of a subject. The algorithm routinely applies cross-talk correction factors and photopeak detection efficiencies to the net spectral photopeak areas determined by a peak-search methodology. Separate lung and GI activities, corrected for cross talk, are calculated and reported. The logic utilized in the total software package, as well as the derivation of the cross-talk correction factors, will be discussed. Any limitations of the computer algorithm when applied to various radioactivity levels will also be identified. An evaluation of the cross-talk factors for potential use in differentiating surface contamination from true organ burdens will be presented. In addition, the capability to efficiently execute this software using a low-cost, portable stand-alone computer system will be demonstrated.
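
Cross-talk unfolding for two detectors reduces to solving a small linear system: each detector's count rate is its own organ's response plus a fraction of the other's. A sketch with illustrative cross-talk fractions, efficiency, and count rates (none are values from the paper):

```python
import numpy as np

# Two-detector (lung / GI) cross-talk correction:
#   observed = eff * M @ [lung_activity, gi_activity]
# so the true activities follow from one 2x2 linear solve.
f_gi_into_lung = 0.30   # fraction of GI response seen by the lung detector
f_lung_into_gi = 0.20   # fraction of lung response seen by the GI detector
eff = 0.05              # photopeak detection efficiency (counts per decay)

M = np.array([[1.0, f_gi_into_lung],
              [f_lung_into_gi, 1.0]])

observed = np.array([130.0, 120.0])          # net photopeak count rates (cps)
lung, gi = np.linalg.solve(eff * M, observed)
print(f"lung: {lung:.0f} Bq, GI: {gi:.0f} Bq")  # -> lung: 2000 Bq, GI: 2000 Bq
```

With 20-40% cross talk, skipping this unfolding would overstate both organ burdens, which is why the correction factors matter for the in vivo bioassay described above.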

17. Generation of a suite of 3D computer-generated breast phantoms from a limited set of human subject data

International Nuclear Information System (INIS)

Hsu, Christina M. L.; Palmeri, Mark L.; Segars, W. Paul; Veress, Alexander I.; Dobbins, James T. III

2013-01-01

Purpose: The authors previously reported on a three-dimensional computer-generated breast phantom, based on empirical human image data, including a realistic finite-element based compression model that was capable of simulating multimodality imaging data. The computerized breast phantoms are a hybrid of two phantom generation techniques, combining empirical breast CT (bCT) data with flexible computer graphics techniques. However, to date, these phantoms have been based on single human subjects. In this paper, the authors report on a new method to generate multiple phantoms, simulating additional subjects from the limited set of original dedicated breast CT data. The authors developed an image morphing technique to construct new phantoms by gradually transitioning between two human subject datasets, with the potential to generate hundreds of additional pseudoindependent phantoms from the limited bCT cases. The authors conducted a preliminary subjective assessment with a limited number of observers (n = 4) to illustrate how realistic the simulated images generated with the pseudoindependent phantoms appeared. Methods: Several mesh-based geometric transformations were developed to generate distorted breast datasets from the original human subject data. Segmented bCT data from two different human subjects were used as the "base" and "target" for morphing. Several combinations of transformations were applied to morph between the "base" and "target" datasets, such as changing the breast shape, rotating the glandular data, and changing the distribution of the glandular tissue. Following the morphing, regions of skin and fat were assigned to the morphed dataset in order to appropriately assign mechanical properties during the compression simulation. The resulting morphed breast was compressed using a finite element algorithm and simulated mammograms were generated using techniques described previously. Sixty-two simulated mammograms, generated from morphing
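
The core idea of "gradually transitioning between two subject datasets" can be sketched as interpolation of corresponding points between a "base" and a "target". The tiny triangle mesh below is an illustrative stand-in; the actual method uses segmented bCT volumes and several mesh-based transforms:

```python
import numpy as np

# Linear morph between corresponding nodes of a "base" and "target" mesh.
base = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])    # base mesh nodes
target = np.array([[0.0, 1.0], [12.0, 0.0], [6.0, 9.0]])  # target mesh nodes

def morph(t):
    """Intermediate shape at fraction t in [0, 1] along the base-target path."""
    return (1.0 - t) * base + t * target

halfway = morph(0.5)
print(halfway)   # each node midway between base and target
```

Sampling many values of t between distinct subject pairs is what lets a limited set of bCT cases spawn many pseudoindependent phantoms.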

18. Dual-Energy Computed Tomography Gemstone Spectral Imaging: A Novel Technique to Determine Human Cardiac Calculus Composition.

Science.gov (United States)

Cheng, Ching-Li; Chang, Hsiao-Huang; Ko, Shih-Chi; Huang, Pei-Jung; Lin, Shan-Yang

2016-01-01

Understanding the chemical composition of any calculus in different human organs is essential for choosing the best treatment strategy for patients. The purpose of this study was to assess the capability of determining the chemical composition of a human cardiac calculus using gemstone spectral imaging (GSI) mode on a single-source dual-energy computed tomography (DECT) in vitro. The cardiac calculus was directly scanned on the Discovery CT750 HD FREEdom Edition using GSI mode, in vitro. A portable fiber-optic Raman spectroscopy was also applied to verify the quantitative accuracy of the DECT measurements. The results of spectral DECT measurements indicate that effective Z values in 3 designated positions located in this calculus were 15.02 to 15.47, which are close to values of 15.74 to 15.86, corresponding to the effective Z values of calcium apatite and hydroxyapatite. The Raman spectral data were also reflected by the predominant Raman peak at 960 cm⁻¹ for hydroxyapatite and the minor peak at 875 cm⁻¹ for calcium apatite. A potential single-source DECT with GSI mode was first used to examine the morphological characteristics and chemical compositions of a giant human cardiac calculus, in vitro. The CT results were consistent with the Raman spectral data, suggesting that spectral CT imaging techniques could be accurately used to diagnose and characterize the compositional materials in the cardiac calculus.
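
The effective atomic number the DECT mode reports can be sketched with the common power-law (Mayneord-style) formula; the exponent 2.94 and the electron bookkeeping below are standard textbook assumptions, not details from the paper:

```python
import numpy as np

# Effective atomic number from electron fractions: Z_eff = (sum f_i Z_i^p)^(1/p).

def z_eff(electron_counts, zs, p=2.94):
    """Power-law effective Z from per-element electron counts."""
    f = np.asarray(electron_counts, dtype=float)
    z = np.asarray(zs, dtype=float)
    return np.sum(f / f.sum() * z**p) ** (1.0 / p)

# Electrons per formula unit of hydroxyapatite Ca10(PO4)6(OH)2:
# Ca: 10*20, P: 6*15, O: 26*8, H: 2*1.
electrons = [200, 90, 208, 2]
Z = [20, 15, 8, 1]
print(f"Z_eff(hydroxyapatite) ~ {z_eff(electrons, Z):.2f}")  # close to the 15.86 cited above
```

That the computed value lands near the 15.74-15.86 range quoted in the abstract is what makes effective Z a useful fingerprint for calcium apatite and hydroxyapatite.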

19. Proceedings of the 5th Danish Human-Computer Interaction Research Symposium

DEFF Research Database (Denmark)

Clemmensen, Torkil; Nielsen, Lene

2005-01-01

Lene Nielsen DEALING WITH REALITY - IN THEORY Gitte Skou Petersen A NEW IFIP WORKING GROUP - HUMAN WORK INTERACTION DESIGN Rikke Ørngreen, Torkil Clemmensen & Annelise Mark-Pejtersen CLASSIFICATION OF DESCRIPTIONS USED IN SOFTWARE AND INTERACTION DESIGN Georg Strøm OBSTACLES TO DESIGN IN VOLUNTEER BASED... for the symposium, of which 14 were presented orally in four panel sessions. Previously, the symposium was held at University of Aarhus 2001, University of Copenhagen 2002, Roskilde University Center 2003, and Aalborg University 2004. Torkil Clemmensen & Lene Nielsen Copenhagen, November 2005 CONTENT INTRODUCTION...

20. [Computer optical topography: a study of the repeatability of the results of human body model examination].

Science.gov (United States)

2007-01-01

The problem of repeatability of the results of examination of a plastic human body model is considered. The model was examined in 7 positions using an optical topograph for kyphosis diagnosis. The examination was performed under television camera monitoring. It was shown that variation of the model position in the camera view affected the repeatability of the results of topographic examination, especially if the model-to-camera distance was changed. A study of the repeatability of the results of optical topographic examination can help to increase the reliability of the topographic method, which is widely used for medical screening of children and adolescents.
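
The repeatability study above amounts to summarizing the spread of repeated measurements of the same object across positions. A minimal sketch with hypothetical kyphosis-angle readings (the seven values below are illustrative, not the study's data):

```python
import numpy as np

# Repeatability of repeated topographic measurements, summarized as a
# sample standard deviation across model placements.
angles = np.array([41.2, 40.8, 41.5, 40.9, 41.1, 42.3, 41.0])  # degrees

mean, sd = angles.mean(), angles.std(ddof=1)
print(f"mean {mean:.1f} deg, repeatability SD {sd:.2f} deg")
```

A noticeably larger SD for placements at a different model-to-camera distance would quantify the distance sensitivity the study reports.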

1. SLIM-MAUD - a computer based technique for human reliability assessment

International Nuclear Information System (INIS)

Embrey, D.E.

1985-01-01

The Success Likelihood Index Methodology (SLIM) is a widely applicable technique which can be used to assess human error probabilities in both proceduralized and cognitive tasks (i.e. those involving decision making, problem solving, etc.). It assumes that expert assessors are able to evaluate the relative importance (or weights) of different factors called Performance Shaping Factors (PSFs), in determining the likelihood of error for the situations being assessed. Typical PSFs are the extent to which good procedures are available, operators are adequately trained, the man-machine interface is well designed, etc. If numerical ratings are made of the PSFs for the specific tasks being evaluated, these can be combined with the weights to give a numerical index, called the Success Likelihood Index (SLI). The SLI represents, in numerical form, the overall assessment of the experts of the likelihood of task success. The SLI can be subsequently transformed to a corresponding human error probability (HEP) estimate. The latest form of the SLIM technique is implemented using a microcomputer based system called MAUD (Multi-Attribute Utility Decomposition), the resulting technique being called SLIM-MAUD. A detailed description of the SLIM-MAUD technique and case studies of applications are available. An illustrative example of the application of SLIM-MAUD in probabilistic risk assessment is given
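
The SLI computation described above is a weighted sum of PSF ratings followed by a log-linear transform to a human error probability. The PSF names, weights, ratings, and calibration constants below are all illustrative assumptions; in practice the constants come from anchor tasks with known HEPs:

```python
# SLIM sketch: SLI = sum_i w_i * r_i, then log10(HEP) = a * SLI + b.
# All numbers are hypothetical, not values from the methodology itself.

psf_weights = {"procedures": 0.40, "training": 0.35, "interface": 0.25}
psf_ratings = {"procedures": 7.0, "training": 5.0, "interface": 8.0}  # 1..9 scale

sli = sum(psf_weights[k] * psf_ratings[k] for k in psf_weights)

# Higher SLI (success more likely) must map to a lower HEP, hence a < 0.
a, b = -0.5, 0.0
hep = 10 ** (a * sli + b)
print(f"SLI = {sli:.2f}, HEP = {hep:.2e}")
```

The MAUD front end mainly helps elicit consistent weights and ratings from the expert assessors; the numeric transform itself stays this simple.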

2. Delineating the Impact of Weightlessness on Human Physiology Using Computational Models

Science.gov (United States)

2015-01-01

Microgravity environment has profound effects on several important human physiological systems. The impact of weightlessness is usually indirect as mediated by changes in the biological fluid flow and transport and alterations in the deformation and stress fields of the compliant tissues. In this context, Fluid-Structural and Fluid-Solid Interaction models provide a valuable tool in delineating the physical origins of the physiological changes so that systematic countermeasures can be devised to reduce their adverse effects. In this presentation, impact of gravity on three human physiological systems will be considered. The first case involves prediction of cardiac shape change and altered stress distributions in weightlessness. The second, presents a fluid-structural-interaction (FSI) analysis and assessment of the vestibular system and explores the reasons behind the unexpected microgravity caloric stimulation test results performed aboard the Skylab. The last case investigates renal stone development in microgravity and the possible impact of re-entry into partial gravity on the development and transport of nucleating, growing, and agglomerating renal calculi in the nephron. Finally, the need for model validation and verification and application of the FSI models to assess the effects of Artificial Gravity (AG) are also briefly discussed.

3. Computational study of ‘HUB’ microRNA in human cardiac diseases

Science.gov (United States)

Krishnan, Remya; Nair, Achuthsankar S.; Dhar, Pawan K.

2017-01-01

MicroRNAs (miRNAs) are small non-coding RNAs, ~22 nucleotides long, that do not encode proteins but have been reported to influence gene expression in normal and abnormal health conditions. Though a large body of scientific literature on miRNAs exists, their network-level profile, linking molecules with their corresponding phenotypes, is less explored. Here, we studied a network of 191 human miRNAs reported to play a role in 30 human cardiac diseases. Our aim was to study miRNA network properties, such as hubness and preferred associations, using data mining, network graph theory and statistical analysis. A total of 16 miRNAs were found to have a disease-node connectivity of >5 edges (i.e., they were linked to more than 5 diseases) and were considered hubs in the miRNA-cardiac disease network. Alternatively, when diseases were considered as hubs, more than 10 miRNAs were linked to each disease hub node. Of all the miRNAs associated with diseases, 19 miRNAs (19/24 = 79.1% of upregulated events) were found to be upregulated in atherosclerosis. The data suggest microRNAs as early-stage biological markers in cardiac conditions, with potential towards microRNA-based therapeutics. PMID:28479745
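
Hub detection in a bipartite miRNA-disease network reduces to counting each miRNA's degree and applying the >5-disease cutoff used above. The tiny edge list below is a hypothetical stand-in for the 191-miRNA / 30-disease network:

```python
from collections import Counter

# Bipartite edge list (miRNA, disease); names and links are illustrative.
edges = (
    [("miR-21", d) for d in ["HF", "MI", "ATH", "CM", "ARR", "HYP"]]
    + [("miR-1", d) for d in ["MI", "ARR"]]
    + [("miR-133", d) for d in ["CM", "HYP", "HF"]]
)

# A miRNA is a hub if its disease-node connectivity exceeds 5 edges.
degree = Counter(mirna for mirna, _ in edges)
hubs = [m for m, k in degree.items() if k > 5]
print(hubs)  # -> ['miR-21']
```

Swapping the roles of the two node sets (counting diseases' miRNA degrees instead) gives the "disease hub node" view used in the same study.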

4. Computational and experimental research on infrared trace by human being contact

Energy Technology Data Exchange (ETDEWEB)

Xiong Zonglong; Yang Kuntao; Ding Wenxiu; Zhang Nanyangsheng; Zheng Wenheng

2010-06-20

The indoor detection of the human body's thermal trace plays an important role in the fields of infrared detecting, scouting, infrared camouflage, and infrared rescuing and tracking. Currently, quantitative description and analysis for this technology are lacking due to the absence of human infrared radiation analysis. To solve this problem, we study the heating and cooling process by observing body contact and removal on an object, respectively. Through finite-element simulation and carefully designed experiments, an analytical model of the infrared trace of body contact is developed based on infrared physics and heat transfer theory. Using this model, the impact of body temperature on material thermal parameters is investigated. The sensitivity of material thermal parameters, the thermal distribution, and the changes of the thermograph's contrast are then found and analyzed. Excellent matching results achieved between the simulation and the experiments demonstrate the strong impact of temperature on material thermal parameters. Conclusively, the new model, simulation, and experimental results are beneficial to the future development and implementation of infrared trace technology.

5. Computational and experimental research on infrared trace by human being contact

International Nuclear Information System (INIS)

Xiong Zonglong; Yang Kuntao; Ding Wenxiu; Zhang Nanyangsheng; Zheng Wenheng

2010-01-01

The indoor detection of the human body's thermal trace plays an important role in infrared detection, scouting, infrared camouflage, and infrared rescue and tracking. Currently, quantitative description and analysis of this technology are lacking due to the absence of human infrared radiation analysis. To solve this problem, we study the heating and cooling processes observed when a body contacts and then leaves an object. Through finite-element simulation and carefully designed experiments, an analytical model of the infrared trace of body contact is developed based on infrared physics and heat transfer theory. Using this model, the impact of body temperature on material thermal parameters is investigated. The sensitivity of the material thermal parameters, the thermal distribution, and the changes in the thermograph's contrast are then found and analyzed. The excellent match between simulation and experiment demonstrates the strong impact of temperature on material thermal parameters. In conclusion, the new model, simulation, and experimental results are beneficial to the future development and implementation of infrared trace technology.

6. Analysis of Ion Currents Contribution to Repolarization in Human Heart Failure Using Computer Models

Energy Technology Data Exchange (ETDEWEB)

Marotta, F.; Paci, M.A.; Severi, S.; Trenor, B.

2016-07-01

The mechanisms underlying repolarization of the ventricular action potential (AP) are a subject of research for anti-arrhythmic drugs. In fact, prolongation of the AP occurs in several conditions of heart disease, such as heart failure, a major precursor of serious arrhythmias. In this study, we investigated the phenomena of the repolarization reserve, defined as the capacity of the cell to repolarize in case of a functional loss, and of all-or-none repolarization, which depends on the delicate balance of inward and outward currents in the different phases of the AP, under conditions of human heart failure (HF). To simulate HF conditions, the O'Hara et al. human AP model was modified and specific protocols for all-or-none repolarization were applied. Our results show that in early repolarization the threshold for all-or-none repolarization is not altered in HF, even though a decrease in potassium currents can be observed. To quantify the contribution of the individual ion currents to HF-induced AP prolongation, we used a novel piecewise-linear approximation approach proposed by Paci et al. In particular, INaL and ICaL are mainly responsible for APD prolongation due to HF (85 and 35 ms, respectively). Our results highlight this novel algorithm as a powerful tool for obtaining a more complete picture of the complex ionic mechanisms underlying this disease, and they confirm the important role of the late sodium current in HF repolarization. (Author)

7. Variation in the human ribs geometrical properties and mechanical response based on X-ray computed tomography images resolution.

Science.gov (United States)

Perz, Rafał; Toczyski, Jacek; Subit, Damien

2015-01-01

Computational models of the human body are commonly used for injury prediction in automobile safety research. To create these models, the geometry of the human body is typically obtained from segmentation of medical images such as computed tomography (CT) images, which have a resolution between 0.2 and 1 mm/pixel. While the accuracy of the geometrical and structural information obtained from these images depends greatly on their resolution, the effect of image resolution on the estimation of rib geometrical properties has yet to be established. To do so, each of the thirty-four rib sections obtained from a Post Mortem Human Surrogate (PMHS) was imaged using three different CT modalities: standard clinical CT (clinCT), high-resolution clinical CT (HRclinCT), and microCT. The images were processed to estimate the rib cross-section geometry and mechanical properties, and the results were compared to those obtained from the microCT images by computing the 'deviation factor', a metric that quantifies the relative difference between the results obtained from clinCT and HRclinCT and those obtained from microCT. Overall, clinCT images gave a deviation greater than 100% and were therefore deemed inadequate for the purpose of this study. HRclinCT overestimated the rib cross-sectional area by 7.6%, the moments of inertia by about 50%, and the cortical shell area by 40.2%, while underestimating the trabecular area by 14.7%. Next, a parametric analysis was performed to quantify how variations in the estimates of the geometrical properties affected the predicted mechanical response of the rib under antero-posterior loading. A variation of up to 45% in the predicted peak force and up to 50% in the predicted stiffness was observed. These results provide a quantitative estimate of the sensitivity of the response of the finite-element (FE) model to the resolution of the images used to generate it. They also suggest that a correction factor could be derived from the comparison between microCT and
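The 'deviation factor' in this record is described as a relative difference against the microCT reference. A minimal sketch under that assumption (the function name and the input values are illustrative, not the study's measurements):

```python
# Hedged sketch: a relative-difference 'deviation factor' comparing a rib
# property estimated from clinical CT against the microCT reference value.
# The numbers are made up to mirror the ~7.6% area overestimate figure.

def deviation_factor(estimate: float, reference: float) -> float:
    """Relative difference from the microCT reference, in percent."""
    return abs(estimate - reference) / reference * 100.0

micro_ct_area = 100.0    # hypothetical reference cross-sectional area (mm^2)
hr_clin_ct_area = 107.6  # hypothetical HRclinCT estimate of the same area

print(round(deviation_factor(hr_clin_ct_area, micro_ct_area), 1))  # -> 7.6
```

A deviation above 100%, as reported for clinCT, would mean the estimate differs from the reference by more than the reference value itself.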

8. Flow velocity-driven differentiation of human mesenchymal stromal cells in silk fibroin scaffolds: A combined experimental and computational approach.

Directory of Open Access Journals (Sweden)

Jolanda Rita Vetsch

Full Text Available Mechanical loading plays a major role in bone remodeling and fracture healing. Mimicking the concept of mechanical loading of bone has been widely studied in bone tissue engineering with perfusion cultures. Nevertheless, there is still debate regarding the in-vitro mechanical stimulation regime. This study aims at investigating the effect of two different flow rates (vlow = 0.001 m/s and vhigh = 0.061 m/s) on the growth of mineralized tissue produced by human mesenchymal stromal cells cultured on 3-D silk fibroin scaffolds. The flow rates applied were chosen to mimic the mechanical environment during early fracture healing or during bone remodeling, respectively. Scaffolds cultured under static conditions served as a control. Time-lapsed micro-computed tomography showed that mineralized extracellular matrix formation was completely inhibited at vlow compared to vhigh and the static group. Biochemical assays and histology confirmed these results and showed enhanced osteogenic differentiation at vhigh, whereas the amount of DNA was increased at vlow. The biological response at vlow might correspond to the early stage of fracture healing, where cell proliferation and matrix production are prominent. Visual mapping of shear stresses, simulated by computational fluid dynamics, onto 3-D micro-computed tomography data revealed that shear stresses up to 0.39 mPa induced a higher DNA amount, and shear stresses between 0.55 mPa and 24 mPa induced osteogenic differentiation. This study demonstrates the feasibility of driving the behavior of human mesenchymal stromal cells by the applied flow velocity, in agreement with mechanical loading mimicking early fracture healing (vlow) or bone remodeling (vhigh). These results can be used in the future to tightly control the behavior of human mesenchymal stromal cells towards proliferation or differentiation. Additionally, the presented combination of experiment and simulation is a strong tool to link biological responses to

9. Virtual Environment Computer Simulations to Support Human Factors Engineering and Operations Analysis for the RLV Program

Science.gov (United States)

Lunsford, Myrtis Leigh

1998-01-01

The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.

10. Nuclear power plant human computer interface design incorporating console simulation, operations personnel, and formal evaluation techniques

International Nuclear Information System (INIS)

Chavez, C.; Edwards, R.M.; Goldberg, J.H.

1993-01-01

New CRT-based information displays, which enhance the human-machine interface, are playing a very important role and are being increasingly used in control rooms, since they offer a higher degree of flexibility compared to conventional hardwired instrumentation. To prototype a new console configuration and information display system at the Experimental Breeder Reactor II (EBR-II), an iterative process of console simulation and evaluation involving operations personnel is being pursued. Entire panels, including selector switches and information displays, are simulated and driven by plant dynamical simulations with realistic responses that reproduce the actual cognitive and physical environment. Careful analysis and formal evaluation of operator interaction with the simulated console will be conducted to determine underlying principles for effective control console design for this particular group of operations personnel. Additional iterations of design, simulation, and evaluation will then be conducted as necessary.

11. Computer-based diagnostic monitoring to enhance the human-machine interface of complex processes

International Nuclear Information System (INIS)

Kim, I.S.

1992-02-01

There is a growing interest in introducing an automated, on-line diagnostic monitoring function into the human-machine interfaces (HMIs) or control rooms of complex process plants. The design of such a system should be properly integrated with the other HMI systems in the control room, such as the alarm system or the Safety Parameter Display System (SPDS). This paper provides a conceptual foundation for the development of a Plant-wide Diagnostic Monitoring System (PDMS), along with functional requirements for the system and for other advanced HMI systems. Insights into the design of an efficient and robust PDMS are presented, gained from a critical review of various methodologies developed in the nuclear power industry, the chemical process industry, and the space technology community.

12. A new model of 18F-fluoride kinetics in humans. Simulation by analogue computer

International Nuclear Information System (INIS)

Charkes, N.D.; Philips, C.M.

1977-01-01

The authors have developed a 5-compartment model of short-term fluoride bone kinetics in adult humans which fits published observations of plasma concentration, bone uptake, and urinary excretion. The compartments are physiologically meaningful. When changes in systemic blood flow were simulated, a logarithmic relationship was found between early bone uptake and flow over a limited range of flows, but no simple relationship 1-2 hours after injection, when scans are usually made. Bone appears to behave as if it were almost saturated with fluoride at that time. The authors' observations are at variance with certain theoretical requirements for measuring skeletal blood flow by the extraction method and suggest that measurements made in the past may have been erroneous. (author)
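The record does not give the model's structure or rate constants, so the following is only a generic illustration of how a linear 5-compartment kinetic model of this kind can be integrated numerically (here with forward-Euler steps); all rate constants are made up, and the topology (plasma exchanging with four peripheral compartments, with renal elimination from plasma) is an assumption.

```python
# Illustrative sketch only: a generic linear mammillary model with a plasma
# compartment exchanging tracer with 4 peripheral compartments and eliminating
# via the kidneys. Rate constants (per hour) are invented for demonstration.

def simulate(k_out, k_in, k_elim, dose=1.0, dt=0.001, t_end=2.0):
    """Forward-Euler integration; returns (plasma, peripheral amounts) at t_end."""
    plasma = dose
    periph = [0.0] * len(k_out)
    for _ in range(int(t_end / dt)):
        flux_out = [k * plasma for k in k_out]                 # plasma -> peripheral
        flux_in = [k * q for k, q in zip(k_in, periph)]        # peripheral -> plasma
        d_plasma = sum(flux_in) - sum(flux_out) - k_elim * plasma
        periph = [q + dt * (fo - fi)
                  for q, fo, fi in zip(periph, flux_out, flux_in)]
        plasma += dt * d_plasma
    return plasma, periph

plasma, periph = simulate(k_out=[0.5, 0.3, 0.2, 0.1],
                          k_in=[0.2, 0.1, 0.05, 0.02],
                          k_elim=0.4)
# Mass balance: total tracer never exceeds the dose (losses are renal only)
assert plasma + sum(periph) <= 1.0
```

The analogue computer of the original study solves the same coupled linear ODEs continuously; the digital sketch above is the discrete-time equivalent.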

13. Using virtual humans and computer animations to learn complex motor skills: a case study in karate

Directory of Open Access Journals (Sweden)

Spanlang Bernhard

2011-12-01

Full Text Available Learning motor skills is a complex task involving many cognitive processes. One of the main issues is retrieving the relevant information from the learning environment. In a traditional learning situation, a teacher gives oral explanations and performs actions to provide the learner with visual examples. Using virtual reality (VR) as a tool for learning motor tasks is promising. However, it raises questions about the type of information this kind of environment can offer. In this paper, we propose to analyze the impact of virtual humans on the perception of the learners. As a case study, we apply this research problem to karate gestures. The results of this study show no significant difference in the post-training performance of learners exposed to three different learning environments (traditional group, video, and VR).

14. New reconstruction algorithm for digital breast tomosynthesis: better image quality for humans and computers.

Science.gov (United States)

Rodriguez-Ruiz, Alejandro; Teuwen, Jonas; Vreemann, Suzan; Bouwman, Ramona W; van Engen, Ruben E; Karssemeijer, Nico; Mann, Ritse M; Gubern-Merida, Albert; Sechopoulos, Ioannis

2017-01-01

Background The image quality of digital breast tomosynthesis (DBT) volumes depends greatly on the reconstruction algorithm. Purpose To compare two DBT reconstruction algorithms used by the Siemens Mammomat Inspiration system, filtered back projection (FBP) and FBP with iterative optimizations (EMPIRE), using qualitative analysis by human readers and the detection performance of machine learning algorithms. Material and Methods Visual grading analysis was performed by four readers specialized in breast imaging who scored 100 cases reconstructed with both algorithms (70 lesions). Scoring (5-point scale: 1 = poor to 5 = excellent quality) was performed on the presence of noise and artifacts, visualization of the skin-line and Cooper's ligaments, contrast, and image quality, and, when present, lesion visibility. In parallel, a three-dimensional deep-learning convolutional neural network (3D-CNN) was trained (n = 259 patients, 51 positives with BI-RADS 3, 4, or 5 calcifications) and tested (n = 46 patients, nine positives), separately with FBP and EMPIRE volumes, to discriminate between samples with and without calcifications. The partial area under the receiver operating characteristic curve (pAUC) of each 3D-CNN was used for comparison. Results EMPIRE reconstructions showed better contrast (3.23 vs. 3.10, P = 0.010) and image quality (3.22 vs. 3.03). The EMPIRE algorithm provides DBT volumes with better contrast and image quality, fewer artifacts, and improved visibility of calcifications for human observers, as well as improved detection performance with deep-learning algorithms.

15. A common currency for the computation of motivational values in the human striatum

Science.gov (United States)

Li, Yansong; Dreher, Jean-Claude

2015-01-01

Reward comparison in the brain is thought to be achieved through the use of a 'common currency', implying that reward value representations are computed on a unique scale in the same brain regions regardless of reward type. Although such a mechanism has been identified in the ventromedial prefrontal cortex and ventral striatum in the context of decision-making, it is less clear whether it similarly applies to non-choice situations. To answer this question, we scanned 38 participants with fMRI while they were presented with single cues predicting either monetary or erotic rewards, without the need to make a decision. The ventral striatum was the main brain structure to respond to both cues while showing increasing activity with increasing expected reward intensity. Most importantly, the relative response of the striatum to monetary vs. erotic cues was correlated with the relative motivational value of these rewards as inferred from reaction times. Similar correlations were observed in a fronto-parietal network known to be involved in attentional focus and motor readiness. Together, our results suggest that striatal reward value signals not only obey a common-currency mechanism in the absence of choice but may also serve as an input for adjusting motivated behaviour accordingly. PMID:24837478

16. Design of the human computer interface on the telerobotic small emplacement excavator

International Nuclear Information System (INIS)

Thompson, D.H.; Killough, S.M.; Burks, B.L.; Draper, J.V.

1995-01-01

The small emplacement excavator (SEE) is a ruggedized military vehicle with a backhoe and front loader used by the U.S. Army for explosive ordnance disposal (EOD) and general utility excavation activities. This project resulted from a joint need of the U.S. Department of Energy (DOE) for a remotely controlled excavator for buried-waste operations and of the U.S. Department of Defense for remote EOD operations. To evaluate the feasibility of removing personnel from the SEE vehicle during high-risk excavation tasks, a development and demonstration project was initiated. Development of a telerobotic SEE (TSEE) was performed by the Oak Ridge National Laboratory in a project funded jointly by the U.S. Army and the DOE. The TSEE features teleoperated driving, a telerobotic backhoe with four degrees of freedom, and a teleoperated front loader with two degrees of freedom on the bucket. Remote capabilities include driving (forward, reverse, brake, steering), power takeoff shifting to enable digging modes, deploying stabilizers, excavation, and computer system booting.

17. The role of the oximes HI-6 and HS-6 inside human acetylcholinesterase inhibited with nerve agents: a computational study.

Science.gov (United States)

Cuya, Teobaldo; Gonçalves, Arlan da Silva; da Silva, Jorge Alberto Valle; Ramalho, Teodorico C; Kuca, Kamil; C C França, Tanos

2017-10-27

The oximes 4-carbamoyl-1-[({2-[(E)-(hydroxyimino) methyl] pyridinium-1-yl} methoxy) methyl] pyridinium (known as HI-6) and 3-carbamoyl-1-[({2-[(E)-(hydroxyimino) methyl] pyridinium-1-yl} methoxy) methyl] pyridinium (known as HS-6) are isomers differing from each other only in the position of the carbamoyl group on the pyridine ring. However, this slight difference was found to be responsible for large differences in the percentage of reactivation of acetylcholinesterase (AChE) inhibited by the nerve agents tabun, sarin, cyclosarin, and VX. To investigate the reason for this, a computational study involving molecular docking, molecular dynamics, and binding energy calculations was performed on the binding modes of HI-6 and HS-6 in human AChE (HssAChE) inhibited by those nerve agents.

18. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

DEFF Research Database (Denmark)

Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

2015-01-01

Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells often...... target diverse regions in highly variable viral pathogens and this diversity may need to be addressed through redefinition of suitable peptide targets. Methods: We have developed a method for antigen assessment and target selection for polyvalent vaccines, with which we identified immune epitopes from...... variable regions, where all variants bind HLA. These regions, although variable, can thus be considered stable in terms of HLA binding and represent valuable vaccine targets. Results: We applied this method to predict CD8+ T-cell targets in influenza A H7N9 hemagglutinin and significantly increased...

19. The effect of reinforcer magnitude on probability and delay discounting of experienced outcomes in a computer game task in humans.

Science.gov (United States)

Greenhow, Anna K; Hunt, Maree J; Macaskill, Anne C; Harper, David N

2015-09-01

Delay and uncertainty of receipt both reduce the subjective value of reinforcers. Delay has a greater impact on the subjective value of smaller reinforcers than of larger ones while the reverse is true for uncertainty. We investigated the effect of reinforcer magnitude on discounting of delayed and uncertain reinforcers using a novel approach: embedding relevant choices within a computer game. Participants made repeated choices between smaller, certain, immediate outcomes and larger, but delayed or uncertain outcomes while experiencing the result of each choice. Participants' choices were generally well described by the hyperbolic discounting function. Smaller numbers of points were discounted more steeply than larger numbers as a function of delay but not probability. The novel experiential choice task described is a promising approach to investigating both delay and probability discounting in humans. © Society for the Experimental Analysis of Behavior.
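The hyperbolic discounting function referred to in this record is conventionally written V = A / (1 + kD), where A is the reinforcer amount, D the delay, and k a fitted discounting-rate parameter. A minimal sketch of how steeper discounting of smaller reinforcers shows up in this form (the parameter values are illustrative, not the study's estimates):

```python
# Sketch of the standard hyperbolic discounting function. A larger fitted k
# means steeper discounting; the record reports steeper delay discounting for
# smaller point amounts, modelled here as k_small > k_large. Values are made up.

def hyperbolic_value(amount: float, delay: float, k: float) -> float:
    """Subjective (discounted) value of a reinforcer of size `amount`
    delivered after `delay`, with discounting rate `k`."""
    return amount / (1.0 + k * delay)

small = hyperbolic_value(amount=10, delay=30, k=0.2)    # steeper discounting
large = hyperbolic_value(amount=100, delay=30, k=0.05)  # shallower discounting

# The small reward retains ~14% of its value, the large one 40%:
print(round(small / 10, 2), round(large / 100, 2))  # -> 0.14 0.4
```

For probability discounting, the same form is typically applied with odds against receipt in place of delay, which is where the reversed magnitude effect would appear as the opposite ordering of fitted parameters.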

20. Computer-assisted machine-to-human protocols for authentication of a RAM-based embedded system

Science.gov (United States)

Idrissa, Abdourhamane; Aubert, Alain; Fournel, Thierry

2012-06-01

Mobile readers used for optical identification of manufactured products can be tampered with in different ways: with a hardware Trojan or by powering up with fake configuration data. How can a human verifier authenticate the reader before using it for goods verification? In this paper, two cryptographic protocols are proposed to verify a RAM-based system through a trusted auxiliary machine. Such a system is assumed to be composed of a RAM and a secure block (in practice an FPGA or a configurable microcontroller). The system is connected to an input/output interface and contains a non-volatile memory where the configuration data are stored. Here, except for the secure block, all the blocks are exposed to attacks. At the registration stage of the first protocol, the MAC of both the secret and the configuration data, denoted M0, is computed by the mobile device without saving it and then transmitted to the user in a secure environment. At the verification stage, the reader, which is challenged with nonces, sends MACs/HMACs of both the nonces and the MAC M0 (to be recomputed), keyed with the secret. These responses are verified by the user through a trusted auxiliary MAC computation unit. Here the verifier does not need to track a (long) list of challenge/response pairs, which makes the protocol tractable for a human verifier while increasing his or her participation in the authentication process. In counterpart, the secret has to be shared with the auxiliary unit. This constraint is relaxed in a second protocol derived directly from Fiat-Shamir's scheme.
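The challenge/response step of the first protocol can be sketched as follows. This is a hedged illustration of the general construction (nonce concatenated with a recomputed configuration MAC, keyed with the shared secret), not the paper's exact message format; all names and key material are made up.

```python
# Sketch of the verification stage: the reader holds a secret and can
# recompute M0 = MAC(secret, config); the trusted auxiliary unit, which
# shares the secret, recomputes the expected response to a fresh nonce.
import hmac
import hashlib
import os

def mac(key: bytes, msg: bytes) -> bytes:
    """HMAC-SHA256 as the MAC primitive (an assumption for this sketch)."""
    return hmac.new(key, msg, hashlib.sha256).digest()

secret = b"shared-secret-key"          # shared with the trusted auxiliary unit
config = b"reader-configuration-data"  # contents of the reader's NVM

# Registration: M0 binds the secret to the genuine configuration data
m0 = mac(secret, config)

# Verification: the reader answers a fresh nonce with MAC(secret, nonce || M0)
nonce = os.urandom(16)
reader_response = mac(secret, nonce + m0)

# The trusted unit recomputes the expectation from the registered config
expected = mac(secret, nonce + mac(secret, config))
print(hmac.compare_digest(reader_response, expected))  # -> True
```

A reader powered up with fake configuration data would recompute a different M0, so its response would fail this comparison; fresh nonces are what free the human verifier from tracking a stored challenge/response list.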