WorldWideScience

Sample records for human centered computing

  1. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  2. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  3. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  4. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...
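The paradigm sketched in this abstract — handing the same small task to several people and reconciling their answers — can be illustrated with a minimal aggregation step. The helper below is hypothetical (it is not code from the cited talk): it takes redundant human labels per task and keeps the majority answer along with the agreement rate.

```python
from collections import Counter

def aggregate_labels(answers):
    """Majority-vote aggregation of redundant human answers.

    `answers` maps a task id to the list of labels collected from
    independent human contributors; the most common label wins.
    Hypothetical helper illustrating the human-computation paradigm.
    """
    consensus = {}
    for task, labels in answers.items():
        label, votes = Counter(labels).most_common(1)[0]
        # Keep the winning label plus the fraction of workers who agreed.
        consensus[task] = (label, votes / len(labels))
    return consensus

# Three contributors label two images; the majority answer is kept.
answers = {
    "img1": ["cat", "cat", "dog"],
    "img2": ["tree", "tree", "tree"],
}
print(aggregate_labels(answers))
```

Real systems weight contributors by estimated reliability rather than counting votes equally, but the redundancy-then-reconciliation structure is the same.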

  5. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  6. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these

  7. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  8. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  9. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  10. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013 at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  12. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  13. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  14. Human Computer Music Performance

    OpenAIRE

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  15. Human Performance Research Center

    Data.gov (United States)

    Federal Laboratory Consortium — Biochemistry:Improvements in energy metabolism, muscular strength and endurance capacity have a basis in biochemical and molecular adaptations within the human body....

  16. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  17. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  18. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  19. Human-Centered Design Capability

    Science.gov (United States)

    Fitts, David J.; Howard, Robert

    2009-01-01

For NASA, human-centered design (HCD) seeks opportunities to mitigate the challenges of living and working in space in order to enhance human productivity and well-being. Direct design participation during the development stage is difficult; during project formulation, however, an HCD approach can lead to better, more cost-effective products. HCD can also help a program enter the development stage with a clear vision for product acquisition. HCD tools for clarifying design intent are listed. To infuse HCD into the spaceflight lifecycle, the Space and Life Sciences Directorate developed the Habitability Design Center. The Center has collaborated successfully with program and project design teams and with JSC's Engineering Directorate. This presentation discusses HCD capabilities and depicts the Center's design examples and capabilities.

  20. Handbook of human computation

    CERN Document Server

    Michelucci, Pietro

    2013-01-01

This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Reade

  1. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given.

  2. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the wide field of accelerator science based on high energy accelerators. Within this organization, the Applied Research Laboratory comprises four Centers, formed by integrating the previous four centers and their related sections in Tanashi, which support research activities common to the whole organization and carry out related research and development (R and D). This support covers not only general assistance but also the preparation and R and D of systems required for the promotion and future plans of the research. Computer technology is essential to this development and can be shared across the organization's various research programs. In response, the new Computing Research Center is expected to carry out its duties in cooperation with researchers, across a range from R and D on data analysis for various experiments to computational physics driven by powerful computing capacity such as supercomputers. This report describes the work and present state of the Data Processing Center of KEK (chapter 1) and the computer room of INS (chapter 2), and discusses future problems for the Computing Research Center. (G.K.)

  3. Academic Specialization and Contemporary University Humanities Centers

    Science.gov (United States)

    Brownley, Martine W.

    2012-01-01

    Given the academic specialization endemic today in humanities disciplines, some of the most important work of humanities centers has become promoting education about the humanities in general. After charting the rise of humanities centers in the US, three characteristics of centers that enable their advancement of larger concerns of the humanities…

  4. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  5. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications network... Army High Performance Computing Research Center, www.ahpcrc.org

  6. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  7. Making IBM's Computer, Watson, Human

    Science.gov (United States)

    Rachlin, Howard

    2012-01-01

    This essay uses the recent victory of an IBM computer (Watson) in the TV game, "Jeopardy," to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered…

  8. A Computer Learning Center for Environmental Sciences

    Science.gov (United States)

    Mustard, John F.

    2000-01-01

In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  9. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  10. Human Centered Hardware Modeling and Collaboration

    Science.gov (United States)

Stambolian, Damon; Lawrence, Brad; Stelges, Katrine; Henderson, Gena

    2013-01-01

In order to collaborate engineering designs among NASA Centers and customers, to include hardware and human activities from multiple remote locations, live human-centered modeling and collaboration across several sites has been successfully facilitated by Kennedy Space Center. The focus of this paper includes innovative approaches to engineering design analyses and training, along with research being conducted to apply new technologies for tracking, immersing, and evaluating humans as well as rocket, vehicle, component, or facility hardware utilizing high resolution cameras, motion tracking, ergonomic analysis, biomedical monitoring, work instruction integration, head-mounted displays, and other innovative human-system integration modeling, simulation, and collaboration applications.

  11. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other groups of users use the local batch system directly. Storage capacity is distributed across several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzeň and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources via the standard ATLAS tools in the same way as local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated to users mostly from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on TORQUE with a custom scheduler. Clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12000 cores in total.

  12. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  13. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy
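The "games with a purpose" idea summarized above can be illustrated with a minimal sketch modeled on the ESP Game: two players tag the same image independently, and a tag counts as a label only when both propose it. The function name and sample data are illustrative, not from the record:

```python
# Minimal illustration of the ESP Game matching idea: a tag becomes a
# useful label only when two independent players agree on it. Names and
# data here are illustrative assumptions, not from the record.

def agreed_tags(tags_a, tags_b, taboo=()):
    """Return the tags proposed by both players, minus any taboo words."""
    normalized_a = {t.strip().lower() for t in tags_a}
    normalized_b = {t.strip().lower() for t in tags_b}
    return sorted((normalized_a & normalized_b) - {t.lower() for t in taboo})

if __name__ == "__main__":
    player_a = ["Dog", "grass", "ball", "park"]
    player_b = ["dog", "Ball", "frisbee"]
    # "dog" is on the taboo list, so only "ball" survives as a new label.
    print(agreed_tags(player_a, player_b, taboo=("dog",)))  # ['ball']
```

The taboo list mirrors how such games retire already-confirmed tags to push players toward new labels.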

  14. NASA Human Health and Performance Center (NHHPC)

    Science.gov (United States)

    Davis, Jeffery R.

    2010-01-01

    This slide presentation reviews the purpose, potential members and participants of the NASA Human Health and Performance Center (NHHPC). Included in the overview is a brief description of the administration and current activities of the NHHPC.

  15. Human Centered Design and Development for NASA's MerBoard

    Science.gov (United States)

    Trimble, Jay

    2003-01-01

    This viewgraph presentation provides an overview of the design and development process for NASA's MerBoard. These devices are large interactive display screens whose contents can also be shown on a user's computer, allowing scientists in many locations to interpret and evaluate mission data in real time. These tools are scheduled to be used during the 2003 Mars Exploration Rover (MER) expeditions. Topics covered include: mission overview, MER Human Centered Computing, FIDO 2001 observations, and MerBoard prototypes.

  16. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  17. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  18. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  19. Artificial Intelligence for Human Computing

    NARCIS (Netherlands)

    Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

  20. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  1. Guest Editorial Special Issue on Human Computing

    NARCIS (Netherlands)

    Pantic, Maja; Santos, E.; Pentland, A.; Nijholt, Antinus

    2009-01-01

    The seven articles in this special issue focus on human computing. Most focus on two challenging issues in human computing, namely, machine analysis of human behavior in group interactions and context-sensitive modeling.

  2. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  3. Human ear recognition by computer

    CERN Document Server

    Bhanu, Bir; Chen, Hui

    2010-01-01

    Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden by hair, a scarf or jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with age and facial expressions. "Human Ear Recognition by Computer" is the first book o

  4. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  5. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2001-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook

  6. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  7. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  8. Toward human-centered algorithm design

    Directory of Open Access Journals (Sweden)

    Eric PS Baumer

    2017-07-01

    As algorithms pervade numerous facets of daily life, they are incorporated into systems for increasingly diverse purposes. These systems’ results are often interpreted differently by the designers who created them than by the lay persons who interact with them. This paper offers a proposal for human-centered algorithm design, which incorporates human and social interpretations into the design process for algorithmically based systems. It articulates three specific strategies for doing so: theoretical, participatory, and speculative. Drawing on the author’s work designing and deploying multiple related systems, the paper provides a detailed example of using a theoretical approach. It also discusses findings pertinent to participatory and speculative design approaches. The paper addresses both strengths and challenges for each strategy in helping to center the process of designing algorithmically based systems around humans.

  9. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included.

  10. South Atlantic Humanities Center Seminars -- Spring 2004

    OpenAIRE

    Elliott, Jean

    2004-01-01

    The South Atlantic Humanities Center (SAHC) at Virginia Tech is sponsoring several seminars this spring. SAHC is a partnership of the Virginia Foundation for the Humanities, Virginia Tech, and the University of Virginia. SAHC focuses on the U.S. South Atlantic from a regional and transatlantic perspective. It explores and preserves the rich heritage of a region stretching from Virginia to the Virgin Islands. It engages artists and performers, writers and filmmakers, teachers...

  11. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  12. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out

  13. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. You will also learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  14. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are also being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  15. Wings: A New Paradigm in Human-Centered Design

    Science.gov (United States)

    Schutte, Paul C.

    1997-01-01

    Many aircraft accident/incident investigations cite crew error as a causal factor (Boeing Commercial Airplane Group 1996). Human factors experts suggest that crew error has many underlying causes and should be the start of an accident investigation and not the end. One of those causes, the flight deck design, is correctable. If a flight deck design does not accommodate the human's unique abilities and deficits, crew error may simply be the manifestation of this mismatch. Pilots repeatedly report that they are "behind the aircraft", i.e., they do not know what the automated aircraft is doing or how the aircraft is doing it until after the fact. Billings (1991) promotes the concept of "human-centered automation," calling on designers to allocate appropriate control and information to the human. However, there is much ambiguity regarding what it means to be human-centered. What are often labeled as "human-centered designs" are actually designs where a human factors expert has been involved in the design process, or designs where tests have shown that humans can operate them. While such designs may be excellent, they do not represent designs that are systematically produced according to some set of prescribed methods and procedures. This paper describes a design concept, called Wings, that offers a clearer definition of human-centered design. This new design concept is radically different from current design processes in that the design begins with the human and uses the human body as a metaphor for designing the aircraft. This is not because the human is the most important part of the aircraft (certainly the aircraft would be useless without lift and thrust), but because the human is the least understood, the least programmable, and one of the more critical elements. The Wings design concept has three properties: a reversal in the design process, from aerodynamics-, structures-, and propulsion-centered to truly human-centered; a design metaphor that guides function

  16. Cooperation in human-computer communication

    OpenAIRE

    Kronenberg, Susanne

    2000-01-01

    The goal of this thesis is to simulate cooperation in human-computer communication, modeling the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in which coherence is maintained between the contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

  17. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  18. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  19. Human Computing and Machine Understanding of Human Behavior: A Survey

    NARCIS (Netherlands)

    Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas; Quek, F.; Yang, Yie

    2006-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

  20. Human-centered automation of testing, surveillance and maintenance

    International Nuclear Information System (INIS)

    Bhatt, S.C.; Sun, B.K.H.

    1991-01-01

    Manual surveillance and testing of instrumentation, control and protection systems at nuclear power plants involves system and human errors which can lead to substantial plant down time. Frequent manual testing can also contribute significantly to operation and maintenance cost. Automation technology offers potential for prudent applications at the power plant to reduce testing errors and cost. To help address the testing problems and to harness the benefits of automation, input from utilities was obtained on suitable automation approaches. This paper includes lessons from successful past experience at a few plants where some islands of automation exist. The results are summarized as a set of specifications for semi-automatic testing. A human-centered automation methodology is proposed, with guidelines given for optimal human/computer division of tasks. Implementation obstacles for significant changes of testing practices are identified, and methods acceptable to nuclear power plants for addressing these obstacles are suggested.

  1. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  2. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  3. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  4. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  5. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient software. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics.

  6. Occupational stress in human computer interaction.

    Science.gov (United States)

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction, including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  7. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that new biological sciences undergraduate and graduate programs that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  8. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center are presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements the different communities place on the hardware and software setup, the challenges of operating the cluster jointly are detailed. The benefits are an efficient use of computing and personnel resources.

  9. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    Adaptation to the computer has not developed. Instead, what has developed is a "modern disease of adaptation" called "technostress," a phrase coined by Brod. Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress [Ref. 3:p. 38]

  10. Challenges for Virtual Humans in Human Computing

    NARCIS (Netherlands)

    Reidsma, Dennis; Ruttkay, Z.M.; Huang, T; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    The vision of Ambient Intelligence (AmI) presumes a plethora of embedded services and devices that all endeavor to support humans in their daily activities as unobtrusively as possible. Hardware gets distributed throughout the environment, occupying even the fabric of our clothing. The environment

  11. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  12. Mailman Segal Center for Human Development | NSU

    Science.gov (United States)

  13. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate.

  14. Object categorization: computer and human vision perspectives

    National Research Council Canada - National Science Library

    Dickinson, Sven J

    2009-01-01

    The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience...

  15. Human law and computer law comparative perspectives

    CERN Document Server

    Hildebrandt, Mireille

    2014-01-01

    This book probes the epistemological and hermeneutic implications of data science and artificial intelligence for democracy and the Rule of Law, and the challenges posed by computing technologies to traditional legal thinking and the regulation of human affairs.

  16. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  17. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
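
    The kind of local-versus-cloud cost comparison mentioned above can be sketched as a back-of-the-envelope calculation. All figures below (capital cost, amortization period, per-core-hour price, core count) are hypothetical placeholders, not BNL's or Amazon's actual numbers:

```python
def annual_cost_local(capex: float, amortize_years: float, opex_per_year: float) -> float:
    """Amortized yearly cost of owned hardware plus yearly operations."""
    return capex / amortize_years + opex_per_year

def annual_cost_cloud(price_per_core_hour: float, cores: int, utilization: float) -> float:
    """Yearly on-demand cost for a given average utilization (0..1).
    8760 is the number of hours in a (non-leap) year."""
    return price_per_core_hour * cores * 8760 * utilization

# Hypothetical facility: $1M of hardware amortized over 4 years, $150k/yr to run.
local = annual_cost_local(capex=1_000_000, amortize_years=4, opex_per_year=150_000)
for util in (0.25, 0.5, 0.9):
    cloud = annual_cost_cloud(price_per_core_hour=0.05, cores=2000, utilization=util)
    print(f"utilization {util:.0%}: local ${local:,.0f} vs cloud ${cloud:,.0f}")
```

    Under such assumptions, on-demand cloud capacity is cheaper at low average utilization, while an owned data center wins as utilization approaches 100%, which mirrors the trade-off the abstract describes.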

  18. Approaching Engagement towards Human-Engaged Computing

    DEFF Research Database (Denmark)

    Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

    2018-01-01

    Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the ever increasingly intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to...

  19. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  20. Secure data exchange between intelligent devices and computing centers

    Science.gov (United States)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has ostensibly raised stakes for the conception of computing intensive environments using intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. These devices are employed in highly volatile environments where the secure exchange of data between these devices and their computing centers is of paramount importance. Moreover, their mission critical applications require dependable measures against the attacks like denial of service (DoS), eavesdropping, masquerading, etc. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called 'computational grid'. The notion of infosphere is used to define a digital space made up of a persistent and a volatile asset in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will likely face a proliferation of users, applications, networked devices, and their interactions on a scale never experienced before. It would be better to build in the ability to uniformly deal with these systems. As a solution, we propose a concept of virtualization of security services. We try to solve the difficult problems of implementation and maintenance of trust on the one hand, and those of security management in heterogeneous infrastructure on the other hand.

  1. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  2. Students' Ways of Experiencing Human-Centered Design

    Science.gov (United States)

    Zoltowski, Carla B.

    2010-01-01

    This study investigated the qualitatively different ways in which students experienced human-centered design. The findings of this research are important in developing effective design learning experiences and have potential impact across design education. This study provides the basis for being able to assess learning of human-centered design which…

  3. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has been recently installed. The system will be of great help to the exciting physics analyses for coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is DELL PowerEdge 1955 which contains two Intel dual-core Xeon (WoodCrest) CPUs running at 3GHz, and a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured to be RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with redundant configuration in a cost-effective way. The next major upgrade will take place in thre...
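
    As a side note on the storage described above, RAID-6 dedicates two drives' worth of capacity per array to parity, which makes the usable capacity easy to estimate. The 1 TB drive size below is an assumption for illustration; the record does not state the drive capacity:

```python
def raid6_usable(drives: int, drive_tb: float) -> float:
    """Usable capacity of a RAID-6 array in TB.

    RAID-6 stores two independent parity blocks per stripe, so two drives'
    worth of capacity goes to redundancy and the array survives any two
    simultaneous drive failures.
    """
    if drives < 4:
        raise ValueError("RAID-6 needs at least 4 drives")
    return (drives - 2) * drive_tb

# 16 drives of an assumed 1 TB each, as in one of the RAID sets described.
print(raid6_usable(16, 1.0))  # → 14.0
```

    That is, a 16-drive RAID-6 set yields 14 drives' worth of usable space while tolerating two concurrent failures.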

  4. Human computing and machine understanding of human behavior: A survey

    NARCIS (Netherlands)

    Pentland, Alex; Huang, Thomas S.; Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing should be about anticipatory user interfaces

  5. The psychology of computer displays in the modern mission control center

    Science.gov (United States)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  6. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University]

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  7. Pilots of the future - Human or computer?

    Science.gov (United States)

    Chambers, A. B.; Nagel, D. C.

    1985-01-01

    In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

  8. Parallel structures in human and computer memory

    Science.gov (United States)

    Kanerva, Pentti

    1986-08-01

    If we think of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library: We recognize a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. This paper is about how to construct a computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. The paper concludes that the frame problem of artificial intelligence could be solved by the use of such a memory if we were able to encode information about the world properly.
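
    The approximate-cue retrieval Kanerva describes can be illustrated with a toy best-match memory. This is only a minimal sketch of the general idea (return the stored pattern nearest the cue in Hamming distance), not Kanerva's actual sparse distributed memory design; the class name and bit patterns are made up:

```python
def hamming(a: str, b: str) -> int:
    """Count positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

class AssociativeMemory:
    """Content-addressable store: recall by similarity, not by exact address."""

    def __init__(self):
        self.patterns: list[str] = []

    def store(self, pattern: str) -> None:
        self.patterns.append(pattern)

    def recall(self, cue: str) -> str:
        # Return the stored pattern closest to the (possibly noisy) cue.
        return min(self.patterns, key=lambda p: hamming(p, cue))

mem = AssociativeMemory()
mem.store("1010110010")
mem.store("0001111000")

# A cue with two flipped bits still retrieves the intended pattern.
print(mem.recall("1010010011"))  # → 1010110010
```

    Real sparse distributed memory distributes each pattern across many hard locations rather than storing it verbatim, but the recall-from-a-fuzzy-cue behavior is the same in spirit.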

  9. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  10. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  11. The Past, Present and Future of Human Computer Interaction

    KAUST Repository

    Churchill, Elizabeth

    2018-01-16

    Human Computer Interaction (HCI) focuses on how people interact with, and are transformed by, computation. Our current technology landscape is changing rapidly. Interactive applications, devices and services are increasingly becoming embedded into our environments, from our homes to the urban and rural spaces we traverse every day. We are increasingly able to, and often required to, manage and configure multiple, interconnected devices and program their interactions. Artificial intelligence (AI) techniques are being used to create dynamic services that learn about us and others, that draw conclusions about our intents and affiliations, and that mould our digital interactions based on predictions about our actions and needs, nudging us toward certain behaviors. Computation is also increasingly embedded into our bodies. Understanding human interactions in this everyday digital and physical context is essential. During this lecture, Elizabeth Churchill, Director of User Experience at Google, will talk about how an emerging landscape invites us to revisit old methods and tactics for understanding how people interact with computers and computation, and how it challenges us to think about new methods and frameworks for understanding the future of human-centered computation.

  12. Human-centered incubator: beyond a design concept

    OpenAIRE

    Goossens, R H M; Willemsen, H

    2013-01-01

    We read with interest the paper by Ferris and Shepley1 on a human-centered design project with university students on neonatal incubators. It is interesting to see that in the design solutions and concepts as presented by Ferris and Shepley,1 human-centered design played an important role. In 2005, a master thesis project was carried out in the Delft University of Technology, following a similar human-centered design approach.2, 3 In that design project we also addressed the noise level insid...

  13. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype Analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well established concept, with years of running experience, sites specialized towards end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing for remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  15. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  16. Feedback Loops in Communication and Human Computing

    NARCIS (Netherlands)

    op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas S.

    Building systems that are able to analyse communicative behaviours or take part in conversations requires a sound methodology in which the complex organisation of conversations is understood and tested on real-life samples. The data-driven approaches to human computing not only have a value for the

  17. Human Memory Organization for Computer Programs.

    Science.gov (United States)

    Norcio, A. F.; Kerst, Stephen M.

    1983-01-01

    Results of study investigating human memory organization in processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance organizational process of recall of statements in five Fortran…

  18. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  19. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  20. Computational Complexity and Human Decision-Making.

    Science.gov (United States)

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
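
    A standard illustration of the intractability the authors invoke is exhaustive choice over bundles of options: a brute-force decision procedure must examine all 2^n subsets, so its work doubles with each added option. The 0/1 knapsack instance below is a textbook example, not taken from the paper:

```python
from itertools import combinations

def best_choice_brute_force(items, budget):
    """items: list of (value, cost) pairs. Exhaustively checks every one of
    the 2^n subsets; returns (best total value within budget, subsets examined)."""
    best, checked = 0, 0
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            checked += 1
            if sum(cost for _, cost in subset) <= budget:
                best = max(best, sum(value for value, _ in subset))
    return best, checked

# Three options already mean 2^3 = 8 subsets; ten options mean 1024.
items = [(60, 10), (100, 20), (120, 30)]
print(best_choice_brute_force(items, budget=50))  # → (220, 8)
```

    Dynamic programming tames this particular problem in pseudo-polynomial time, but the general point stands: a decision-maker who literally "chooses the best option" faces exponentially growing search, which is why resource-bounded models matter.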

  1. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  2. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems the book allows one to compare performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook to assess the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  3. Does Every Research Library Need a Digital Humanities Center?

    Science.gov (United States)

    Schaffner, Jennifer; Erway, Ricky

    2014-01-01

    The digital humanities (DH) are attracting considerable attention and funding at the same time that this nascent field is striving for an identity. Some research libraries are making significant investments by creating digital humanities centers. However, questions about whether such investments are warranted persist across the cultural heritage…

  4. Human-centered automation: Development of a philosophy

    Science.gov (United States)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant, the human management and control function in civil air transport.

  5. Proxemics in Human-Computer Interaction

    OpenAIRE

    Greenberg, Saul; Honbaek, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

    2014-01-01

    In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

  6. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  7. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During the period 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 equipment from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of these two CPUs (hardware and software) enables the system to switch automatically to either the first CPU or the second. At present the system covers data processing, threshold monitoring, alarm systems, display devices, periodical listing, and specific calculations concerning the process (balances, etc.), and, at a later stage, will cover automatic control of certain units of the process [fr]

  8. 1st AAU Workshop on Human-Centered Robotics

    DEFF Research Database (Denmark)

    The 2012 AAU Workshop on Human-Centered Robotics took place on 15 Nov. 2012 at Aalborg University, Aalborg. The workshop provides a platform for robotics researchers, including professors, PhD students and Master's students, to exchange their ideas and latest results. The objective is to foster closer interaction among researchers from multiple relevant disciplines in human-centered robotics and, consequently, to promote collaborations across departments of all faculties towards making our center a center of excellence in robotics. The workshop was a great success, with 13 presentations, attracting more than 45 participants from AAU, SDU, DTI and industrial companies as well. The proceedings contain 7 full papers, selected from the full papers submitted afterwards on the basis of workshop abstracts. The papers represent major research developments in robotics at AAU, including medical robots…

  9. Human-computer interaction : Guidelines for web animation

    OpenAIRE

    Galyani Moghaddam, Golnessa; Moballeghi, Mostafa

    2006-01-01

    Human-computer interaction in the large is an interdisciplinary area which attracts researchers, educators, and practitioners from many different fields. Human-computer interaction studies a human and a machine in communication; it draws from supporting knowledge on both the machine and the human side. This paper is related to the human side of human-computer interaction and focuses on animations. The growing use of animation in Web pages testifies to the increasing ease with which such multim…

  10. Brain-Computer Interfaces Revolutionizing Human-Computer Interaction

    CERN Document Server

    Graimann, Bernhard; Allison, Brendan

    2010-01-01

    A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

  11. M-center growth in alkali halides: computer simulation

    International Nuclear Information System (INIS)

    Aguilar, M.; Jaque, F.; Agullo-Lopez, F.

    1983-01-01

    The heterogeneous interstitial nucleation model previously proposed to explain F-center growth curves in irradiated alkali halides has been extended to account for M-center kinetics. The interstitials produced during the primary irradiation event are assumed to be trapped at impurities and interstitial clusters or recombine with F and M centers. For M-center formation two cases have been considered: (a) diffusion and aggregation of F centers, and (b) statistical generation and pairing of F centers. Process (b) is the only one consistent with the quadratic relationship between M and F center concentrations. However, to account for the F/M ratios experimentally observed as well as for the role of dose-rate, a modified statistical model involving random creation and association of F⁺-F pairs has been shown to be adequate. (author)

  12. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  13. Aspects of computer control from the human engineering standpoint

    International Nuclear Information System (INIS)

    Huang, T.V.

    1979-03-01

    A computer control system includes data acquisition, information display and output control signals. In order to design such a system effectively we must first determine the required operational mode: automatic control (closed loop), computer assisted (open loop), or hybrid control. The choice of operating mode will depend on the nature of the plant, the complexity of the operation, the funds available, and the technical expertise of the operating staff, among many other factors. Once the mode has been selected, consideration must be given to the method (man/machine interface) by which the operator interacts with this system. The human engineering factors are of prime importance to achieving high operating efficiency, and very careful attention must be given to this aspect of the work if full operator acceptance is to be achieved. This paper will discuss these topics and will draw on experience gained in setting up the computer control system in the Main Control Center for Stanford University's Accelerator Center (a high-energy physics research facility)

  14. Human-Centered Design Bill of Rights for Educators.

    Science.gov (United States)

    Sugar, William A.

    This paper presents a potential solution to encourage technology adoption and integration within schools by proposing a human-centered technology "bill of rights" for educators. The intention of this bill of rights is to influence educators' beliefs towards technology and to enable educators to confront with confidence the seemingly…

  15. Human-Centered Design for the Personal Satellite Assistant

    Science.gov (United States)

    Bradshaw, Jeffrey M.; Sierhuis, Maarten; Gawdiak, Yuri; Thomas, Hans; Greaves, Mark; Clancey, William J.; Swanson, Keith (Technical Monitor)

    2000-01-01

    The Personal Satellite Assistant (PSA) is a softball-sized flying robot designed to operate autonomously onboard manned spacecraft in pressurized micro-gravity environments. We describe how the Brahms multi-agent modeling and simulation environment in conjunction with a KAoS agent teamwork approach can be used to support human-centered design for the PSA.

  16. Wooden Spaceships: Human-Centered Vehicle Design for Space

    Science.gov (United States)

    Twyford, Evan

    2009-01-01

    The presentation will focus on creative, human-centered design solutions in relation to manned space vehicle design and development in the NASA culture. We will talk about the design process, iterative prototyping, mockup building, and user testing and evaluation. We will take an inside look at how new space vehicle concepts are developed and designed for real-life exploration scenarios.

  17. Human-Computer Interaction The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are contributing to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents), the book covers a wealth of novel, original and fully updated material, aiming: to provide coherent, in-depth, and timely material on the agency perspective in HCI; to offer an authoritative treatment of the subject matter presented by carefully selected authors; to offer balanced and broad coverage of the subject area, including human, organizational, social, as well as technological concerns; and to offer hands-on experience by covering representative case studies and offering essential design guidelines. The book will appeal to a broad audience of resea…

  18. Measuring Multimodal Synchrony for Human-Computer Interaction

    NARCIS (Netherlands)

    Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

    2010-01-01

    Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their…

  19. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  20. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three (and higher) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  1. Human computer interaction using hand gestures

    CERN Document Server

    Premaratne, Prashan

    2014-01-01

    Human computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards, mice and remotes alienate old and new generations alike from control interfaces. Hand gesture recognition systems bring hope of connecting people with machines in a natural way. This will lead to consumers being able to use their hands naturally to communicate with any electronic equipment in their 'lounge.' This monograph covers state-of-the-art hand gesture recognition approaches and how they evolved from their inception. The author also details his research in this area over the past eight years and how the future of HCI might turn out. This monograph will serve as a valuable guide for researchers venturing into the world of HCI.

  2. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, embedded and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grids, cloud and multimedia computing, and it provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes various theories and practical applications in human-centric computing and embedded and multimedia computing.

  3. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  4. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic:

  5. Leveraging human-centered design in chronic disease prevention.

    Science.gov (United States)

    Matheson, Gordon O; Pacione, Chris; Shultz, Rebecca K; Klügl, Martin

    2015-04-01

    Bridging the knowing-doing gap in the prevention of chronic disease requires deep appreciation and understanding of the complexities inherent in behavioral change. Strategies that have relied exclusively on the implementation of evidence-based data have not yielded the desired progress. The tools of human-centered design, used in conjunction with evidence-based data, hold much promise in providing an optimal approach for advancing disease prevention efforts. Directing the focus toward wide-scale education and application of human-centered design techniques among healthcare professionals will rapidly multiply their effective ability to bring the kind of substantial results in disease prevention that have eluded the healthcare industry for decades. This, in turn, would increase the likelihood of prevention by design. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  6. The Study on Human-Computer Interaction Design Based on the Users’ Subconscious Behavior

    Science.gov (United States)

    Li, Lingyuan

    2017-09-01

    Human-computer interaction is human-centered. An excellent interaction design should focus on the study of user experience, which largely derives from the consistency between the design and users' behavioral habits. However, users' behavioral habits often result from subconsciousness. Therefore, it is smart to utilize users' subconscious behavior to achieve the design's intention and maximize the value of a product's functions, which is gradually becoming a new trend in this field.

  7. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  8. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure

  9. Human resource management in patient-centered pharmaceutical care.

    Science.gov (United States)

    White, S J

    1994-04-01

    Under patient-centered care, pharmacists and technicians may report, either directly or in a matrix structure, to managers outside pharmacy administration. Pharmacy administrators will need to be both effective leaders and managers utilizing excellent human resource management skills. Significant creativity and innovation will be needed for the transition from department-based services to patient care team services. Changes in the traditional methods of recruiting, interviewing, hiring, training, developing, inspiring, evaluating, and disciplining are required in this new environment.

  10. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  11. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes
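    The resource-provisioning question posed in this record (how many servers are needed so that service level agreements are met) can be illustrated for a single request class with the classical Erlang-C model of an M/M/c queue. This is an assumption for illustration only; the paper's actual multi-class priority analysis is more elaborate:

    ```python
    from math import factorial

    def erlang_c(servers: int, offered_load: float) -> float:
        """Erlang-C probability that an arriving request must wait (M/M/c queue)."""
        a, c = offered_load, servers
        if a >= c:
            return 1.0  # unstable regime: the queue grows without bound
        top = a**c / factorial(c) * (c / (c - a))
        bottom = sum(a**k / factorial(k) for k in range(c)) + top
        return top / bottom

    def servers_for_sla(offered_load: float, max_wait_prob: float) -> int:
        """Smallest server count keeping the wait probability under the SLA target."""
        c = max(1, int(offered_load) + 1)  # start just above the stability threshold
        while erlang_c(c, offered_load) > max_wait_prob:
            c += 1
        return c
    ```

    For example, `servers_for_sla(0.5, 0.2)` returns 2: a single server would leave half of all requests waiting, violating a 20% target.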

  12. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  13. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  14. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with SolidWorks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models (e.g., T-branches and angle-shaped vessels) were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
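    Wall shear stress values from CFD runs like the one described in this record are often sanity-checked against the analytic laminar (Poiseuille) result τ_w = 4μQ/(πR³) for an idealized rigid straight tube. A minimal sketch with illustrative physiological values (these numbers are assumptions for illustration, not figures from the study):

    ```python
    import math

    def poiseuille_wall_shear(viscosity_pa_s: float, flow_m3_s: float, radius_m: float) -> float:
        """Wall shear stress tau_w = 4*mu*Q / (pi * R^3) for fully developed laminar pipe flow."""
        return 4.0 * viscosity_pa_s * flow_m3_s / (math.pi * radius_m**3)

    # Illustrative values: blood viscosity ~3.5 mPa*s, mean flow ~5 L/min, lumen radius ~1.25 cm
    tau = poiseuille_wall_shear(3.5e-3, 5.0 / 60000.0, 0.0125)  # result in pascals
    ```

    Unlike the compliant-wall, pulsatile CFD model, this steady rigid-tube estimate only bounds the order of magnitude, which is why it is useful as a consistency check rather than a replacement for simulation.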

  15. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction HCI and Information Management IM have the potential to enable humans to better manage their lives through the use...

  16. Conception of a computer for the nuclear medical department of the Augsburg hospital center

    International Nuclear Information System (INIS)

    Graf, G.; Heidenreich, P.

    1984-01-01

    A computer system based on the Siemens R30 process computer has been employed at the Institute of Nuclear Medicine of the Augsburg Hospital Center since early 1981. This system, including the development and testing of organ-specific evaluation programs, was used as a basis for the conception of the new computer system for the department of nuclear medicine of the Augsburg Hospital Center. The computer system was extended and installed according to this conception when the new 1400-bed hospital was opened in the 3rd phase of construction in autumn 1982. (orig.) [de

  17. Center for Computer Security newsletter. Volume 2, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-05-01

    The Fifth Computer Security Group Conference was held November 16 to 18, 1982, at the Knoxville Hilton in Knoxville, Tennessee. Attending were 183 people, representing the Department of Energy, DOE contractors, other government agencies, and vendor organizations. In these papers are abridgements of most of the papers presented in Knoxville. Less than half-a-dozen speakers failed to furnish either abstracts or full-text papers of their Knoxville presentations.

  18. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  19. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  20. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly, customer-centered, self-study approach to psychomotor skill acquisition.

  1. Geometric Modeling and Reasoning of Human-Centered Freeform Products

    CERN Document Server

    Wang, Charlie C L

    2013-01-01

    The recent trend in user-customized product design requires the shape of products to be automatically adjusted according to the shape of the human body, so that people will feel more comfortable when wearing these products.  Geometric approaches can be used to design the freeform shape of products worn by people, which can greatly improve the efficiency of design processes in various industries involving customized products (e.g., garment design, toy design, jewel design, shoe design, and design of medical devices).  These products are usually composed of very complex geometric shapes (represented by free-form surfaces), and are driven not by a parameter table but by a digital human model with free-form shapes or parts of human bodies (e.g., wrist, foot, and head models).   Geometric Modeling and Reasoning of Human-Centered Freeform Products introduces the algorithms of human body reconstruction, freeform product modeling, constraining and reconstructing freeform products, and shape optimization for improving…

  2. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    The development of selective agonists of the δ-opioid receptor, as well as models of ligand-receptor interaction, is a subject of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of residues in the most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus suggests a reliable model of DOR. The newly generated model could be used for further in silico experiments, enabling faster and more accurate design of selective and effective ligands for the δ-opioid receptor.
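    The model-selection step in this record hinges on a Pearson correlation between docking Fitness scores and measured efficacy (erel). A minimal sketch of that computation, using illustrative placeholder data rather than the published values:

    ```python
    import math

    def pearson_r(xs, ys):
        """Sample Pearson correlation coefficient between two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Placeholder data: docking Fitness scores vs. in vitro efficacy (erel).
    # A strong negative r would mirror the anticorrelation reported in the abstract.
    fitness = [42.1, 55.3, 48.7, 60.2, 51.0]
    e_rel   = [0.95, 0.40, 0.70, 0.25, 0.55]
    r = pearson_r(fitness, e_rel)
    ```

    In practice one would also compute a p-value (e.g., via a t-test on r with n-2 degrees of freedom) before trusting the correlation, as the study does.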

  3. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  4. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    Computer-based information systems (CBIS) are adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Among the software available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  5. Multimodal Information Presentation for High-Load Human Computer Interaction

    NARCIS (Netherlands)

    Cao, Y.

    2011-01-01

    This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

  6. Stereo Vision for Unrestricted Human-Computer Interaction

    OpenAIRE

    Eldridge, Ross; Rudolph, Heiko

    2008-01-01

    Human computer interfaces have come a long way in recent years, but the goal of a computer interpreting unrestricted human movement remains elusive. The use of stereo vision in this field has enabled the development of systems that begin to approach this goal. As computer technology advances, we come ever closer to a system that can react to the ambiguities of human movement in real time. In the foreseeable future stereo computer vision is not likely to replace the keyboard or mouse. There is at...

  7. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a corresponding upsurge in data center carbon emissions, costs incurred, unethical waste management, depletion of natural resources and high energy utilization. This raises the issue of attaining sustainability in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT based industries utilize data centers to provide services to staff, practitioners and end users. But it is a known fact that enterprise servers consume huge quantities of energy and incur other expenditures on cooling operations, and it is difficult to address the needs of accuracy and efficiency in data centers while still encouraging greener application practices alongside cost reduction. Thus this research study focuses on the practical application of Green computing in data centers that house servers, and presents the Green computing life cycle strategies and best practices for better management of data centers in IT based industries. Data was collected through a questionnaire from 133 respondents in industries that currently operate their own in-house data centers. The analysed data was used to verify the Green computing life cycle strategies presented in this study. Findings from the data show that each of the life cycle strategies is significant in helping IT based industries apply Green computing practices in their data centers. This study would be of interest to knowledge and data management practitioners as well as environmental managers and academicians deploying Green data centers in their organizations.

  8. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined, and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  9. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    A great deal of effort has been directed toward enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing, before we outline a new approach to add analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  10. Human computer confluence applied in healthcare and rehabilitation.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen

    2012-01-01

    Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual & augmented reality, and it provides great potential for applications in medicine and rehabilitation.

  11. From Human-Computer Interaction to Human-Robot Social Interaction

    OpenAIRE

    Toumi, Tarek; Zidani, Abdelmadjid

    2014-01-01

    Human-Robot Social Interaction has become one of the active research fields in which researchers from different areas propose solutions and directives that lead robots to improve their interactions with humans. In this paper we propose to introduce works in both human-robot interaction and human-computer interaction and to build a bridge between them, i.e. to integrate the robot's emotions and capabilities concepts into a human-computer model so that it becomes adequate for human-robot interaction, and to discuss chall...

  12. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of the increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  13. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  14. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  15. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  16. Human-centered design of the human-system interfaces of medical equipment: thyroid uptake system

    International Nuclear Information System (INIS)

    Monteiro, Jonathan K.R.; Farias, Marcos S.; Santos, Isaac J.A. Luquetti; Monteiro, Beany G.

    2013-01-01

    Technology plays an important role in modern medical centers, making healthcare increasingly complex and reliant on sophisticated technical equipment. This technical complexity is particularly noticeable in nuclear medicine. Poorly designed human-system interfaces can increase the risk of human error. The human-centered approach emphasizes developing the equipment with a deep understanding of the users' activities, current work practices, needs and abilities. An important concept of human-centered design is that the ease-of-use of the equipment can be ensured only if users are actively incorporated in all phases of the life cycle of the design process. Representative groups of users are exposed to the equipment at various stages in development, in a variety of testing, evaluation and interviewing situations. The user feedback obtained is then used to refine the design, with the result serving as input to the next iteration of the design process. A limitation of the approach is that users cannot address any particular future needs without prior experience or knowledge of the equipment's operation. The aim of this paper is to present a methodological framework that contributes to the design of human-system interfaces through an approach centered on the users and their activities. A case study is described in which the methodological framework is being applied in the development of new human-system interfaces for the thyroid uptake system. (author)

  17. Image Visual Realism: From Human Perception to Machine Computation.

    Science.gov (United States)

    Fan, Shaojing; Ng, Tian-Tsong; Koenig, Bryan L; Herberg, Jonathan S; Jiang, Ming; Shen, Zhiqi; Zhao, Qi

    2017-08-30

    Visual realism is defined as the extent to which an image appears to people as a photo rather than computer generated. Assessing visual realism is important in applications like computer graphics rendering and photo retouching. However, current realism evaluation approaches use either labor-intensive human judgments or automated algorithms largely dependent on comparing renderings to reference images. We develop a reference-free computational framework for visual realism prediction to overcome these constraints. First, we construct a benchmark dataset of 2520 images with comprehensive human annotated attributes. From statistical modeling on this data, we identify image attributes most relevant for visual realism. We propose both empirically-based (guided by our statistical modeling of human data) and CNN-learned features to predict visual realism of images. Our framework has the following advantages: (1) it creates an interpretable and concise empirical model that characterizes human perception of visual realism; (2) it links computational features to latent factors of human image perception.

  18. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science and Internet of Things (IoT) devices. The proposed model for the Human Brain assumes a fundamental similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives for a living being and for a chess player are the same: securing a position, surviving and eliminating adversaries. The brain resolves these goals; moreover, a being's movement, actions and speech are sustained by the five vital senses and equilibrium. The chess game strategy helps us understand the human brain better and replicate it more easily in the proposed ‘Software and Hardware’ SAH Model.

  19. The Next Wave: Humans, Computers, and Redefining Reality

    Science.gov (United States)

    Little, William

    2018-01-01

    The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.

  20. Japan's silver human resource centers and participant well-being.

    Science.gov (United States)

    Weiss, Robert S; Bass, Scott A; Heimovitz, Harley K; Oka, Masato

    2005-03-01

    Japan's Silver Human Resource Center (SHRC) program provides part-time, paid employment to retirement-aged men and women. We studied 393 new program participants and examined whether part-time work influenced their well-being or "ikigai." The participants were divided into those who had worked in SHRC-provided jobs in the preceding year, and those who had not. Gender-stratified regression models were fitted to determine whether SHRC employment was associated with increased well-being. For men, actively working at a SHRC job was associated with greater well-being, compared to inactive members. And men with SHRC jobs and previous volunteering experience had the greatest increase in well-being. Women SHRC job holders did not experience increased well-being at the year's end. The study concludes that there is justification for exploring the usefulness of a similar program for American retirees who desire post-retirement part-time work.

  1. Life Sciences Division and Center for Human Genome Studies 1994

    Energy Technology Data Exchange (ETDEWEB)

    Cram, L.S.; Stafford, C. [comp.

    1995-09-01

    This report summarizes the research and development activities of the Los Alamos National Laboratory's Life Sciences Division and the biological aspects of the Center for Human Genome Studies for the calendar year 1994. The technical portion of the report is divided into two parts, (1) selected research highlights and (2) research projects and accomplishments. The research highlights provide a more detailed description of a select set of projects. A technical description of all projects is presented in sufficient detail so that the informed reader will be able to assess the scope and significance of each project. Summaries useful to the casual reader desiring general information have been prepared by the group leaders and appear in each group overview. Investigators on the staff of the Life Sciences Division will be pleased to provide further information.

  2. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points.

  3. Is function-based control room design human-centered?

    International Nuclear Information System (INIS)

    Norros, L.; Savioja, P.

    2006-01-01

    Function-based approaches to system interface design appear to be an appealing possibility for helping designers and operators cope with the vast amount of information needed to control complex processes. In this paper we provide evidence from operator performance analyses showing that outcome-centered performance measures may not be sufficiently informative for design. We need analyses that indicate habitual patterns of information use: operator practices. We argue that practices reflecting functional orienting to the task support mastery of the process. They also create the potential to make use of function-based information presentation. We see that functional design is not an absolute value. Instead, such design should support communication of the functional significance of the process information to the operators in variable situations. Hence, it should facilitate the development of practices focused on interpreting this message. Successful function-based design facilitates putting operations into their contexts and is human-centered in an extended sense: it aids sense-making in a complex, dynamic and uncertain environment. (authors)

  4. The effective use of virtualization for selection of data centers in a cloud computing environment

    Science.gov (United States)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers are facilities housing networks of remote servers used to store, access and process data. Cloud computing is a technology in which users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers need to employ virtualization so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
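    The selection rule described in this abstract (sum the VM energies per server, sum the server energies per data center, route tasks to the minimum) can be sketched as follows. The function names, data layout, and sample figures are illustrative assumptions, not taken from the paper:

    ```python
    # Hedged sketch of energy-based data center selection: each server's energy
    # is the sum of its virtual machines' energies, a data center's energy is
    # the sum over its servers, and tasks go to the data center with the lowest
    # total. All names and numbers below are hypothetical.

    def server_energy(vm_energies):
        """Energy of one server = sum of the energies of its VMs."""
        return sum(vm_energies)

    def datacenter_energy(servers):
        """Total energy of a data center = sum of its servers' energies."""
        return sum(server_energy(vms) for vms in servers)

    def select_datacenter(datacenters):
        """Return the name of the data center with the least total energy."""
        return min(datacenters, key=lambda name: datacenter_energy(datacenters[name]))

    datacenters = {
        # each data center maps to a list of servers; each server is a list
        # of per-VM energy values (arbitrary units)
        "dc_east": [[1.2, 0.8], [2.0]],        # total 4.0
        "dc_west": [[0.5, 0.4], [0.6, 0.3]],   # total 1.8
    }
    print(select_datacenter(datacenters))  # -> dc_west
    ```

    A real scheduler would recompute these totals as VMs are created and destroyed, but the routing decision itself reduces to this single `min` over summed energies.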

  5. Applying Human-Centered Design Methods to Scientific Communication Products

    Science.gov (United States)

    Burkett, E. R.; Jayanty, N. K.; DeGroot, R. M.

    2016-12-01

    Knowing your users is a critical part of developing anything to be used or experienced by a human being. User interviews, journey maps, and personas are all techniques commonly employed in human-centered design practices because they have proven effective for informing the design of products and services that meet the needs of users. Many non-designers are unaware of the usefulness of personas and journey maps. Scientists who are interested in developing more effective products and communication can adopt and employ user-centered design approaches to better reach intended audiences. Journey mapping is a qualitative data-collection method that captures the story of a user's experience over time as related to the situation or product that requires development or improvement. Journey maps help define user expectations, where they are coming from, what they want to achieve, what questions they have, their challenges, and the gaps and opportunities that can be addressed by designing for them. A persona is a tool used to describe the goals and behavioral patterns of a subset of potential users or customers. The persona is a qualitative data model that takes the form of a character profile, built upon data about the behaviors and needs of multiple users. Gathering data directly from users avoids the risk of basing models on assumptions, which are often limited by misconceptions or gaps in understanding. Journey maps and user interviews together provide the data necessary to build the composite character that is the persona. Because a persona models the behaviors and needs of the target audience, it can then be used to make informed product design decisions. We share the methods and advantages of developing and using personas and journey maps to create more effective science communication products.

  6. Human viral pathogens are pervasive in wastewater treatment center aerosols.

    Science.gov (United States)

    Brisebois, Evelyne; Veillette, Marc; Dion-Dupont, Vanessa; Lavoie, Jacques; Corbeil, Jacques; Culley, Alexander; Duchaine, Caroline

    2018-05-01

    Wastewater treatment center (WTC) workers may be vulnerable to diseases caused by viruses, such as the common cold, influenza and gastro-intestinal infections. Although there is a substantial body of literature characterizing the microbial community found in wastewater, only a few studies have characterized the viral component of WTC aerosols, despite the fact that most diseases affecting WTC workers are of viral origin and that some of these viruses are transmitted through the air. In this study, we evaluated in four WTCs the presence of 11 viral pathogens of particular concern in this milieu and used a metagenomic approach to characterize the total viral community in the air of one of those WTCs. The presence of viruses in aerosols in different locations of individual WTCs was evaluated and the results obtained with four commonly used air samplers were compared. We detected four of the eleven viruses tested, including human adenovirus (hAdV), rotavirus, hepatitis A virus (HAV) and Herpes Simplex virus type 1 (HSV1). The results of the metagenomic assay uncovered very few viral RNA sequences in WTC aerosols, however sequences from human DNA viruses were in much greater relative abundance. Copyright © 2017. Published by Elsevier B.V.

  7. Human-Centered Design as an Integrating Discipline

    Directory of Open Access Journals (Sweden)

    Guy André Boy

    2017-02-01

    Full Text Available What is research today? Good research has to be indexed within appropriate mechanisms to be visible, considered and ultimately useful. These mechanisms are based on quantitative research methods and codes that are often very academic. Consequently, they impose rigorous constraints on the way results should be obtained and presented. In addition, everything people learn in academia needs to be graded. This leads to standard packaging of what should be learned and results in making people executants rather than creators or inventors. In other words, this academic standardization precludes freedom for innovation. This paper proposes Human-Centered Design (HCD) as a solution to override these limitations and roadblocks. HCD involves expertise, experience, participation, modeling and simulation, complexity analysis and qualitative research. What is education today? Education is organized in silos with little attempt to integrate individual academic disciplines. Large system integration is almost never taught in engineering schools, and Human-Systems Integration (HSI) even less. Yet real-life problem-solving requires integration skills. What is design research? We often hear that design has nothing to do with research, and vice versa. Putting design and research together, as complementary disciplines, helps combine creativity, rigorous demonstration and validation. This is, in essence, what HCD is about.

  8. Improving flight condition situational awareness through Human Centered Design.

    Science.gov (United States)

    Craig, Carol

    2012-01-01

    In aviation, there is currently a lack of accurate and timely situational information, specifically weather data, which is essential when dealing with the unpredictable complexities that can arise while flying. For example, weather conditions that require immediate evasive action by the flight crew, such as isolated heavy rain, micro bursts, and atmospheric turbulence, require that the flight crew receive near real-time and precise information about the type, position, and intensity of those conditions. Human factors issues arise in considering how to display the various sources of weather information to the users of that information and how to integrate this display into the existing environment. In designing weather information display systems, it is necessary to meet the demands of different users, which requires an examination of the way in which the users process and use weather information. Using Human Centered Design methodologies and concepts will result in a safer, more efficient and more intuitive solution. Specific goals of this approach include 1) Enabling better fuel planning; 2) Allowing better divert strategies; 3) Ensuring pilots, navigators, dispatchers and mission planners are referencing weather from the same sources; 4) Improving aircrew awareness of aviation hazards such as turbulence, icing, hail and convective activity; 5) Addressing inconsistent availability of hazard forecasts outside the United States Air Defense Identification Zone (ADIZ); and 6) Promoting goal driven approaches versus event driven (prediction).

  9. Naturalistic Cognition: A Research Paradigm for Human-Centered Design

    Directory of Open Access Journals (Sweden)

    Peter Storkerson

    2010-01-01

    Full Text Available Naturalistic thinking and knowing, the tacit, experiential, and intuitive reasoning of everyday interaction, have long been regarded as inferior to formal reason and labeled primitive, fallible, subjective, superstitious, and in some cases ineffable. But naturalistic thinking is more rational and definable than it appears. It is also relevant to design. Inquiry into the mechanisms of naturalistic thinking and knowledge can bring its resources into focus and enable designers to create better, human-centered designs for use in real-world settings. This article makes a case for the explicit, formal study of implicit, naturalistic thinking within the fields of design. It develops a framework for defining and studying naturalistic thinking and knowledge, for integrating them into design research and practice, and for developing a more integrated, consistent theory of knowledge in design. It will (a) outline historical definitions of knowledge, attitudes toward formal and naturalistic thinking, and the difficulties presented by the co-presence of formal and naturalistic thinking in design, (b) define and contrast formal and naturalistic thinking as two distinct human cognitive systems, (c) demonstrate the importance of naturalistic cognition in formal thinking and real-world judgment, (d) demonstrate methods for researching naturalistic thinking that can be of use in design, and (e) briefly discuss the impact on design theory of admitting naturalistic thinking as valid, systematic, and knowable.

  10. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    "Human-Computer Interaction and Management Information Systems: Applications" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  11. Mobile human-computer interaction perspective on mobile learning

    CSIR Research Space (South Africa)

    Botha, Adèle

    2010-10-01

    Full Text Available Applying a Mobile Human Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...

  12. Cognition beyond the brain computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  13. Computers, the Human Mind, and My In-Laws' House.

    Science.gov (United States)

    Esque, Timm J.

    1996-01-01

    Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

  14. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA ... Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center at ... atomic and molecular level, he said. He noted that "every general would like to have" a Star Trek-like holodeck, where holographic avatars could

  15. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article the problem of scientific projects support throughout their lifecycle in the computer center is considered in every aspect of support. Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of strong integration of IT infrastructure components with the use of virtualization, control of infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research projects support, the influence of the Configuration Management system is reviewed and development of the corresponding elements of the system is described in the present paper.

  16. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article the problem of supporting scientific projects throughout their lifecycle in the computer center is considered in every aspect of support. The Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  17. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    OpenAIRE

    Ancău Dorina; Roman Nicolae-Marius; Ancău Mircea

    2017-01-01

Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  18. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

The Technical Data Management Center (TDMC), which collects, packages, analyzes, and distributes information, computing technology, and data, including meteorological and other environmental transport work, is located at the Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described.

  19. Where computers disappear, virtual humans appear

    NARCIS (Netherlands)

    Nijholt, Antinus; Sourin, A.

    2004-01-01

    In this paper, we survey the role of virtual humans (or embodied conversational agents) in smart and ambient intelligence environments. Research in this area can profit from research done earlier in virtual reality environments and research on verbal and nonverbal interaction. We discuss virtual

  20. Audio Technology and Mobile Human Computer Interaction

    DEFF Research Database (Denmark)

    Chamberlain, Alan; Bødker, Mads; Hazzard, Adrian

    2017-01-01

Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives based in this area. It is not only the technical systems that are developing: novel approaches to the design and understanding of audio-based mobile systems are evolving to offer new perspectives on interaction and design, and to support the application of such systems in areas such as the humanities.

  1. Object recognition in images by human vision and computer vision

    NARCIS (Netherlands)

    Chen, Q.; Dijkstra, J.; Vries, de B.

    2010-01-01

    Object recognition plays a major role in human behaviour research in the built environment. Computer based object recognition techniques using images as input are challenging, but not an adequate representation of human vision. This paper reports on the differences in object shape recognition

  2. ErgoTMC, A New Tool For Human-Centered TMC Design

    Science.gov (United States)

    2000-04-01

    The Federal Highway Administration (FHWA) has recently made available a new tool to assist Transportation Management Center (TMC) managers and designers in incorporating human-centered design principles into their TMCs. ErgoTMC, a web site tailored t...

  3. Use of the Human Centered Design concept when designing ergonomic NPP control rooms

    International Nuclear Information System (INIS)

    Skrehot, Petr A.; Houser, Frantisek; Riha, Radek; Tuma, Zdenek

    2015-01-01

    Human-Centered Design is a concept aimed at reconciling human needs on the one hand and limitations posed by the design disposition of the room being designed on the other hand. This paper describes the main aspects of application of the Human-Centered Design concept to the design of nuclear power plant control rooms. (orig.)

  4. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    Science.gov (United States)

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.

  5. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  6. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  7. From STEM to STEAM: Toward a Human-Centered Education

    Science.gov (United States)

    Boy, Guy A.

    2013-01-01

The 20th century was based on local linear engineering of complicated systems. We made cars, airplanes and chemical plants for example. The 21st century has opened a new basis for holistic non-linear design of complex systems, such as the Internet, air traffic management and nanotechnologies. Complexity, interconnectivity, interaction and communication are major attributes of our evolving society. But, more interestingly, we have started to understand that chaos theories may be more important than reductionism, to better understand and thrive on our planet. Systems need to be investigated and tested as wholes, which requires a cross-disciplinary approach and new conceptual principles and tools. Consequently, schools cannot continue to teach isolated disciplines based on simple reductionism. Science, Technology, Engineering, and Mathematics (STEM) should be integrated together with the Arts to promote creativity together with rationalization, and move to STEAM (with an "A" for Arts). This new concept emphasizes the possibility of longer-term socio-technical futures instead of short-term financial predictions that currently lead to uncontrolled economies. Human-centered design (HCD) can contribute to improving STEAM education technologies, systems and practices. HCD not only provides tools and techniques to build useful and usable things, but also an integrated approach to learning by doing, expressing and critiquing, exploring possible futures, and understanding complex systems.

  8. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

Full Text Available This paper presents two methods for accurately computing the periodic regions' centers. One method fits the general M-sets with integer index numbers; the other fits the general M-sets with negative integer index numbers. Both methods improve the precision of computation by transforming the polynomial equations that determine the periodic regions' centers. We primarily discuss the general M-sets with negative integer index and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. By applying Newton's method to the transformed polynomial equation that determines the periodic regions' centers, we can obtain the centers' coordinates with at least 48 significant digits after the decimal point in both the real and imaginary parts. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k = 3, 4, 5, 6) for the index numbers α = −25, −24, …, −1, all of which have high numerical accuracy.
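For the classical Mandelbrot set, the centers (superstable parameters) of the period-k components are roots of the polynomial P_k(c) = f_c^k(0) with f_c(z) = z² + c, so the Newton-iteration approach described in the abstract can be sketched as follows. This is a minimal double-precision illustration for the standard M-set, not the authors' transformed equations for general M-sets with negative index:

```python
def pk_and_dpk(c, k):
    """Evaluate P_k(c) = f_c^k(0) and its derivative dP_k/dc recursively."""
    z, dz = 0j, 0j
    for _ in range(k):
        dz = 2.0 * z * dz + 1.0  # chain rule for d(z^2 + c)/dc
        z = z * z + c
    return z, dz

def newton_center(c0, k, tol=1e-13, max_iter=200):
    """Newton's method on P_k(c); converges to a nearby period-k center."""
    c = c0
    for _ in range(max_iter):
        p, dp = pk_and_dpk(c, k)
        step = p / dp
        c -= step
        if abs(step) < tol:
            break
    return c
```

Starting near c = -1.8, the iteration converges to the real period-3 center c ≈ -1.7548776662. Reaching the 48-digit precision reported in the paper would require multiprecision arithmetic (e.g. `mpmath`) rather than doubles.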

  9. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  10. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  11. It is time to talk about people: a human-centered healthcare system

    Directory of Open Access Journals (Sweden)

    Borgi Lea

    2010-11-01

Full Text Available Examining vulnerabilities within our current healthcare system, we propose borrowing two tools from the fields of engineering and design: (a) Reason's system approach [1] and (b) user-centered design [2,3]. Both approaches are human-centered in that they consider common patterns of human behavior when analyzing systems to identify problems and generate solutions. This paper examines these two human-centered approaches in the context of healthcare. We argue that maintaining a human-centered orientation in clinical care, research, training, and governance is critical to the evolution of an effective and sustainable healthcare system.

  12. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
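The regression discontinuity idea used in the abstract — comparing outcomes for households just above and just below the voucher-eligibility cutoff — can be sketched with a minimal local-linear estimator. The data, bandwidth, and simulated effect size below are illustrative assumptions, not the paper's actual specification:

```python
import numpy as np

def rd_estimate(running, outcome, cutoff, bandwidth):
    """Local-linear regression discontinuity estimate.

    Fits a separate linear trend on each side of the cutoff within the
    bandwidth and returns the jump in the fitted outcome at the cutoff.
    """
    below = (running >= cutoff - bandwidth) & (running < cutoff)
    above = (running >= cutoff) & (running <= cutoff + bandwidth)
    lo = np.polyfit(running[below], outcome[below], 1)
    hi = np.polyfit(running[above], outcome[above], 1)
    # Evaluate both fitted lines at the cutoff; their gap is the RD estimate
    return np.polyval(hi, cutoff) - np.polyval(lo, cutoff)
```

On simulated data with a true jump of 2.0 at the cutoff, the estimator recovers the effect up to sampling noise; in practice one would also choose the bandwidth data-dependently and compute standard errors.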

  13. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions of Euros – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructure...

  14. Use of computers and Internet among people with severe mental illnesses at peer support centers.

    Science.gov (United States)

    Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas

    2017-12-01

    Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than 3 quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  16. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute’s computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  17. The Role of the Radiation Safety Information Computational Center (RSICC) in Knowledge Management

    International Nuclear Information System (INIS)

    Valentine, T.

    2016-01-01

Full text: The Radiation Safety Information Computational Center (RSICC) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 packages that have been provided by contributors from various agencies. RSICC’s customers obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to help ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programmes both domestically and internationally, as the majority of RSICC’s customers are students attending U.S. universities. RSICC also supports and promotes workshops and seminars in nuclear science and technology to further the use and/or development of computational tools and data. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC’s activities, services, and systems that support knowledge management and education and training in the nuclear field. (author)

  18. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the spread of computer systems into virtually all areas of technology. The improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high-performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  19. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

Recognizing a single face does not take long to process, but an attendance or security system at a company with many faces to recognize can take a long time. Cloud computing is a computing service performed not on a local device but on data-center infrastructure connected to the Internet; it also provides a scalability solution, since cloud resources can be increased when larger data processing is required. In this research, the eigenface method is applied, and training data are collected using the REST concept to provide resources, after which the server processes the data through the defined stages. After developing this application, it can be concluded that by implementing eigenfaces and applying the REST concept as an endpoint for sending and receiving the related information used as a resource, a model can be formed to perform face recognition.
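The eigenface step described above — project centered face vectors onto the top principal components, then match a query by nearest neighbor in that subspace — can be sketched as follows. This is a minimal PCA illustration on hypothetical synthetic data, not the paper's cloud/REST implementation:

```python
import numpy as np

def train_eigenfaces(faces, n_components):
    """faces: (n_samples, n_pixels) matrix, one flattened image per row."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # Rows of Vt from the SVD of the centered data are the eigenfaces
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = vt[:n_components]
    weights = centered @ eigenfaces.T  # each enrolled face's PCA coordinates
    return mean, eigenfaces, weights

def recognize(face, mean, eigenfaces, weights, labels):
    """Project the query face and return the label of the nearest neighbor."""
    w = (face - mean) @ eigenfaces.T
    dists = np.linalg.norm(weights - w, axis=1)
    return labels[int(np.argmin(dists))]
```

In a deployment like the one the abstract describes, the training matrix would be gathered server-side and the projection/matching exposed behind a REST endpoint.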

  20. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation, such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results.

  1. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  2. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  3. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  4. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research work horse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  5. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ancău Dorina

    2017-01-01

    Full Text Available Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  6. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  7. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  8. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  9. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, and is thus widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  10. The Einstein Center for Epigenomics: studying the role of epigenomic dysregulation in human disease.

    Science.gov (United States)

    McLellan, Andrew S; Dubin, Robert A; Jing, Qiang; Maqbool, Shahina B; Olea, Raul; Westby, Gael; Broin, Pilib Ó; Fazzari, Melissa J; Zheng, Deyou; Suzuki, Masako; Greally, John M

    2009-10-01

    There is increasing interest in the role of epigenetic and transcriptional dysregulation in the pathogenesis of a range of human diseases, not just in the best-studied example of cancer. It is, however, quite difficult for an individual investigator to perform these studies, as they involve genome-wide molecular assays combined with sophisticated computational analytical approaches of very large datasets that may be generated from various resources and technologies. In 2008, the Albert Einstein College of Medicine in New York, USA established a Center for Epigenomics to facilitate the research programs of its investigators, providing shared resources for genome-wide assays and for data analysis. As a result, several avenues of research are now expanding, with cancer epigenomics being complemented by studies of the epigenomics of infectious disease and a neuroepigenomics program.

  11. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so an introduction to pipeline and parallel computing is an essential topic to be included. At the same time, the topic is among the most demanding to teach because of its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. The framework supports a constructivist learning process, enabling learners to experiment with the provided programming models, to acquire competences in modern scientific research and computational thinking, and to capture the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C was chosen for developing the programming models, with the message passing interface (MPI) and OpenMP as parallelization tools.

  12. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.

  13. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  14. Unifying Human Centered Design and Systems Engineering for Human Systems Integration

    Science.gov (United States)

Boy, Guy A.; McGovern Narkevicius, Jennifer

    2013-01-01

Despite the holistic approach of systems engineering (SE), systems still fail, sometimes spectacularly. Requirements, solutions, and the world constantly evolve and are very difficult to keep current. SE requires more flexibility: new approaches have to be developed that include creativity as an integral part and that allocate the functions of people and technology appropriately within our highly interconnected, complex organizations. Instead of disregarding complexity because it is too difficult to handle, we should take advantage of it, discovering the behavioral attractors and emerging properties that it generates. Human-centered design (HCD) provides the creativity factor that SE lacks. It promotes modeling and simulation from the early stages of design and throughout the life cycle of a product. Unifying HCD and SE will shape appropriate human-systems integration (HSI) and produce successful systems.

  15. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    Energy Technology Data Exchange (ETDEWEB)

Kirk, Bernadette Lugue [ORNL]

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries.

  16. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  17. Plants and Human Affairs: Educational Enhancement Via a Computer.

    Science.gov (United States)

    Crovello, Theodore J.; Smith, W. Nelson

    To enhance both teaching and learning in an advanced undergraduate elective course on the interrelationships of plants and human affairs, the computer was used for information retrieval, multiple choice course review, and the running of three simulation models--plant related systems (e.g., the rise in world coffee prices after the 1975 freeze in…

  18. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  19. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  20. Computational 3-D Model of the Human Respiratory System

    Science.gov (United States)

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  1. Why computer games can be essential for human flourishing

    NARCIS (Netherlands)

    Fröding, B.; Peterson, M.B.

    2013-01-01

    Traditionally, playing computer games and engaging in other online activities has been seen as a threat to well-being, health and long-term happiness. It is feared that spending many hours per day in front of the screen leads the individual to forsake other, more worthwhile activities, such as human

  2. Homo ludens in the loop playful human computation systems

    CERN Document Server

    Krause, Markus

    2014-01-01

The human mind is incredible. It solves problems with ease that will elude machines even for the next decades. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how humans and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

  3. The National Institutes of Health Center for Human Immunology, Autoimmunity, and Inflammation: history and progress.

    Science.gov (United States)

    Dickler, Howard B; McCoy, J Philip; Nussenblatt, Robert; Perl, Shira; Schwartzberg, Pamela A; Tsang, John S; Wang, Ena; Young, Neil S

    2013-05-01

    The Center for Human Immunology, Autoimmunity, and Inflammation (CHI) is an exciting initiative of the NIH intramural program begun in 2009. It is uniquely trans-NIH in support (multiple institutes) and leadership (senior scientists from several institutes who donate their time). Its goal is an in-depth assessment of the human immune system using high-throughput multiplex technologies for examination of immune cells and their products, the genome, gene expression, and epigenetic modulation obtained from individuals both before and after interventions, adding information from in-depth clinical phenotyping, and then applying advanced biostatistical and computer modeling methods for mining these diverse data. The aim is to develop a comprehensive picture of the human "immunome" in health and disease, elucidate common pathogenic pathways in various diseases, identify and validate biomarkers that predict disease progression and responses to new interventions, and identify potential targets for new therapeutic modalities. Challenges, opportunities, and progress are detailed. Published 2013. This article is a U.S. Government work and is in the public domain in the USA.

  4. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  5. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  6. Pattern Recognition as a Human Centered non-Euclidean Problem

    NARCIS (Netherlands)

    Duin, R.P.W.

    2010-01-01

    Regularities in the world are human defined. Patterns in the observed phenomena are there because we define and recognize them as such. Automatic pattern recognition tries to bridge the gap between human judgment and measurements made by artificial sensors. This is done in two steps: representation

  7. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses (21% response rate) to this survey. Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. 
This difference is likely due to the differing educational and professional demands placed on

  8. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance on FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  9. Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.

    Science.gov (United States)

    Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L

    2016-10-01

The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. Implications for Rehabilitation: Adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research. Patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.

  10. Computer Vision Syndrome among Call Center Employees at Telecommunication Company in Bandung

    Directory of Open Access Journals (Sweden)

    Ghea Nursyifa

    2016-06-01

Background: The occurrence of Computer Vision Syndrome (CVS) at the workplace has increased within decades due to the prolonged use of computers. Knowledge of CVS is necessary in order to develop an awareness of how to prevent and alleviate its prevalence. The objective of this study was to assess the knowledge of CVS among call center employees and to explore the CVS symptom most frequently experienced by the workers. Methods: A descriptive cross-sectional study was conducted during the period of September to November 2014 at a telecommunication company in Bandung, using a questionnaire consisting of 30 questions. Of the 30 questions/statements, 15 were about knowledge of CVS and the other 15 were about the occurrence of CVS and its symptoms. In this study, 125 call center employees participated as respondents, selected by consecutive sampling. The level of knowledge was divided into 3 categories: good (76–100%), fair (56–75%), and poor (<56%). The collected data were presented in frequency tabulations. Results: 74.4% of the respondents had poor knowledge of CVS. The symptom most frequently experienced by the respondents was asthenopia. Conclusions: CVS occurs in call center employees with various symptoms and signs, and this situation is not supported by good knowledge of the syndrome, which can hamper prevention programs.

  11. Modeling Remote I/O versus Staging Tradeoff in Multi-Data Center Computing

    International Nuclear Information System (INIS)

    Suslu, Ibrahim H

    2014-01-01

In multi-data center computing, the data to be processed is not always local to the computation. This is a major challenge, especially for data-intensive Cloud computing applications, since large amounts of data would need to be either moved to the local sites (staging) or accessed remotely over the network (remote I/O). Cloud application developers generally choose between staging and remote I/O intuitively, without making any scientific comparison specific to their application's data access patterns, since there is no generic model available that they can use. In this paper, we propose a generic model for Cloud application developers that helps them choose the most appropriate data access mechanism for their specific application workloads. We define the parameters that potentially affect the end-to-end performance of multi-data center Cloud applications that need to access large datasets over the network. To test and validate our models, we implemented a series of synthetic benchmark applications to simulate the most common data access patterns encountered in Cloud applications. We show that our model provides promising results in different settings with different parameters, such as network bandwidth, server and client capabilities, and data access ratio.

  12. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study is aimed at describing the experience of an oncology center with computed tomography guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsy of pulmonary lesions performed in the period between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding material appropriateness and the specific diagnoses were collected and analyzed. Results: Among 97 lung biopsies, 94 (96.9%) supplied appropriate specimens for histological analyses, with 71 (73.2%) cases being diagnosed as malignant lesions and 23 (23.7%) diagnosed as benign lesions. Specimens were inappropriate for analysis in three cases. The frequency of specific diagnosis was 83 (85.6%) cases, with high rates for both malignant lesions with 63 (88.7%) cases and benign lesions with 20 (86.7%). As regards complications, a total of 12 cases were observed as follows: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax and 2 (2.1%) cases of hemoptysis. Conclusion: Computed tomography-guided core needle biopsy of lung lesions demonstrated high rates of material appropriateness and diagnostic specificity, and low rates of complications in the present study. (author)

  13. A Theory of Human Needs Should Be Human-Centered, Not Animal-Centered: Commentary on Kenrick et al. (2010).

    Science.gov (United States)

    Kesebir, Selin; Graham, Jesse; Oishi, Shigehiro

    2010-05-01

    Kenrick et al. (2010, this issue) make an important contribution by presenting a theory of human needs within an evolutionary framework. In our opinion, however, this framework bypasses the human uniqueness that Maslow intended to capture in his theory. We comment on the unique power of culture in shaping human motivation at the phylogenetic, ontogenetic, and proximate levels. We note that culture-gene coevolution may be a more promising lead to a theory of human motivation than a mammal-centric evolutionary perspective. © The Author(s) 2010.

  14. Teaching Human-Centered Security Using Nontraditional Techniques

    Science.gov (United States)

    Renaud, Karen; Cutts, Quintin

    2013-01-01

    Computing science students amass years of programming experience and a wealth of factual knowledge in their undergraduate courses. Based on our combined years of experience, however, one of our students' abiding shortcomings is that they think there is only "one correct answer" to issues in most courses: an "idealistic"…

  15. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state-of-the-art overview of human-computer interaction aimed at tourism and cultural heritage in several cities of the European Mediterranean. The work analyzes the main problems deriving from training treated as a business, which can derail the continued growth of HCI, the new technologies, and the tourism industry. Through a semiotic and epistemological study, we detect the current mistakes in the interrelations of the formal and factual sciences, as well as the human factors that influence the professionals devoted to the development of interactive systems for safeguarding and boosting cultural heritage.

  16. Threat and vulnerability analysis and conceptual design of countermeasures for a computer center under construction

    International Nuclear Information System (INIS)

    Rozen, A.; Musacchio, J.M.

    1988-01-01

    This project involved the assessment of a new computer center to be used as the main national data processing facility of a large European bank. The building serves as the principal facility in the country, with all other branches utilizing the data processing center. As such, the building is a crucial target which may attract terrorist attacks. Threat and vulnerability assessments were performed as a basis to define an overall, fully integrated security system of passive and active countermeasures for the facility. After separately assessing the range of threats and vulnerabilities, a combined matrix of threats and vulnerabilities was used to identify the crucial combinations. A set of architectural-structural passive measures was added to the active components of the security system.

  17. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  18. The mobilize center: an NIH big data to knowledge center to advance human movement research and improve mobility.

    Science.gov (United States)

    Ku, Joy P; Hicks, Jennifer L; Hastie, Trevor; Leskovec, Jure; Ré, Christopher; Delp, Scott L

    2015-11-01

    Regular physical activity helps prevent heart disease, stroke, diabetes, and other chronic diseases, yet a broad range of conditions impair mobility at great personal and societal cost. Vast amounts of data characterizing human movement are available from research labs, clinics, and millions of smartphones and wearable sensors, but integration and analysis of this large quantity of mobility data are extremely challenging. The authors have established the Mobilize Center (http://mobilize.stanford.edu) to harness these data to improve human mobility and help lay the foundation for using data science methods in biomedicine. The Center is organized around 4 data science research cores: biomechanical modeling, statistical learning, behavioral and social modeling, and integrative modeling. Important biomedical applications, such as osteoarthritis and weight management, will focus the development of new data science methods. By developing these new approaches, sharing data and validated software tools, and training thousands of researchers, the Mobilize Center will transform human movement research. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  20. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    Science.gov (United States)

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  1. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art

  2. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  3. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries. An important activity of RSICC is its participation in international efforts on computational and experimental benchmarks. An example is the Shielding Integral Benchmarks Archival Database (SINBAD), which includes shielding benchmarks for fission, fusion and accelerators. RSICC is funded by the United States Department of Energy, Department of Homeland Security and Nuclear Regulatory Commission.

  4. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    Science.gov (United States)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has increasingly been utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, it describes the sensory receptors within the skin that sense tactile stimuli and electric currents, and explains several factors that influence how the electrical signal is transmitted to the brain via the human skin.

  5. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting and state-of-the-art collection of papers on the recent progress in Human-Computer System Interaction (H-CSI). It contributes a profound description of the current status of the H-CSI field and also provides a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the particular papers and their specific grouping into particular parts make this volume fascinating reading. This gives the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all deep issues that ...

  6. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  7. Quality Improvement Project to Improve Patient Satisfaction With Pain Management: Using Human-Centered Design.

    Science.gov (United States)

    Trail-Mahan, Tracy; Heisler, Scott; Katica, Mary

    2016-01-01

    In this quality improvement project, our health system developed a comprehensive, patient-centered approach to improving inpatient pain management and assessed its impact on patient satisfaction across 21 medical centers. Using human-centered design principles, a bundle of 6 individual and team nursing practices was developed. Patient satisfaction with pain management, as measured by the Hospital Consumer Assessment of Healthcare Providers and Systems pain composite score, increased from the 25th to just under the 75th national percentile.

  8. Human-Centered Command and Control of Future Autonomous Systems

    Science.gov (United States)

    2013-06-01

    introduce challenges with situation awareness, automation reliance, and accountability (Bainbridge, 1983). If not carefully designed and integrated...into users’ tasks, automation’s costs can quickly outweigh its benefits. A tempting solution to compensate for inherent human cognitive limitations is... Drury & Scott, 2008; Nehme, Scott, Cummings, & Furusho, 2006; Scott & Cummings, 2006). However, there have not been detailed prescriptive task

  9. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop

  10. Opportunities for Increasing Human Papillomavirus Vaccine Provision in School Health Centers

    Science.gov (United States)

    Moss, Jennifer L.; Feld, Ashley L.; O'Malley, Brittany; Entzel, Pamela; Smith, Jennifer S.; Gilkey, Melissa B.; Brewer, Noel T.

    2014-01-01

    Background: Uptake of human papillomavirus (HPV) vaccine remains low among adolescents in the United States. We sought to assess barriers to HPV vaccine provision in school health centers to inform subsequent interventions. Methods: We conducted structured interviews in the fall of 2010 with staff from all 33 school health centers in North…

  11. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  12. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  13. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: The power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness." It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  14. Coevolution between human's anticancer activities and functional foods from crop origin center in the world.

    Science.gov (United States)

    Zeng, Ya-Wen; Du, Juan; Pu, Xiao-Ying; Yang, Jia-Zhen; Yang, Tao; Yang, Shu-Ming; Yang, Xiao-Meng

    2015-01-01

    Cancer is the leading cause of death around the world. Anticancer activities of many functional food sources have been reported in recent years, but the correlation between cancer prevalence and the types of food with anticancer activities from crop origin centers in the world, as well as the link between food sources and human migration, are unclear. Hunger from food shortage drove early human evolution from Africa to Asia and later into Eurasia. The richest functional foods are found in crop origin centers, which house about 70% of the world's population. Crop origin centers have lower cancer incidence and mortality in the world, especially Central Asia, the Middle East, Southwest China, India and Ethiopia. Asia and Africa, with the richest anticancer crops, are not only the most important bases of human evolution and origin centers of anticancer functional crops, but also have the lowest mortality and incidence of cancers in the world. Cancer prevention during early human migrations was associated with functional foods from crop origin centers, especially Asia, with four centers and one subcenter of crop origin accounting for 58% of the world population. These results reveal a coevolution between humans' anticancer activities and functional foods from crop origin centers, especially in Asia and Africa.

  15. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results about iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  16. Electromagnetic Modeling of Human Body Using High Performance Computing

    Science.gov (United States)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

    Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of wirelessly powering implanted devices through coupling from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be extended to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  17. Toward a human-centered aircraft automation philosophy

    Science.gov (United States)

    Billings, Charles E.

    1989-01-01

    The evolution of automation in civil aircraft is examined in order to discern trends in the respective roles and functions of automation technology and the humans who operate these aircraft. The effects of advances in automation technology on crew reaction is considered and it appears that, though automation may well have decreased the frequency of certain types of human errors in flight, it may also have enabled new categories of human errors, some perhaps less obvious and therefore more serious than those it has alleviated. It is suggested that automation could be designed to keep the pilot closer to the control of the vehicle, while providing an array of information management and aiding functions designed to provide the pilot with data regarding flight replanning, degraded system operation, and the operational status and limits of the aircraft, its systems, and the physical and operational environment. The automation would serve as the pilot's assistant, providing and calculating data, watching for the unexpected, and keeping track of resources and their rate of expenditure.

  18. Research on operation and maintenance support system adaptive to human recognition and understanding in human-centered plant

    International Nuclear Information System (INIS)

    Numano, Masayoshi; Matsuoka, Takeshi; Mitomo, N.

    2004-01-01

    As a human-centered plant, an advanced nuclear power plant needs appropriate role sharing between humans and mobile intelligent agents. Human-machine cooperation in plant operation and maintenance activities, supported by an advanced interface, is also required. Plant maintenance is planned around mobile robots working in radiation environments in place of human beings. An operation and maintenance support system adaptive to human recognition and understanding should be developed to establish an adequate human-machine interface, so as to bring out human capabilities to the full and enable humans to take responsibility for the plant's operation. Plant operation and maintenance can be cooperative activities between humans and intelligent autonomous agents having surveillance and control functions. The infrastructure of a multi-agent simulation system for the support system has been investigated and developed based on work plans derived from the scheduler. (T. Tanaka)

  19. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model and the desire to avoid vendor lock-in. We performed a technology tracking exercise and, among many possible solutions, chose to evaluate Grid Engine as an alternative, because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: many production software components (above all, accounting and monitoring) rely on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too hard to sustain. We will provide guidelines to help understand how difficult this operation can be and how long the change may take.

  20. An Audit on the Appropriateness of Coronary Computed Tomography Angiography Referrals in a Tertiary Cardiac Center.

    Science.gov (United States)

    Alderazi, Ahmed Ali; Lynch, Mary

    2017-01-01

    In response to growing concerns regarding the overuse of coronary computed tomography angiography (CCTA) in the clinical setting, multiple societies, including the American College of Cardiology Foundation, have jointly published revised criteria regarding the appropriate use of this imaging modality. However, previous research indicates significant discrepancies in the rate of adherence to these guidelines. The aim was to assess the appropriateness of CCTA referrals in a tertiary cardiac center in Bahrain. This retrospective clinical audit examined the records of patients referred for CCTA between April 1, 2015 and December 31, 2015 at the Mohammed bin Khalifa Cardiac Center. Using information from medical records, each case was meticulously audited against the guidelines to categorize it as appropriate, inappropriate, or uncertain. Of the 234 records examined, 176 (75.2%) were appropriate, 47 (20.1%) were uncertain, and 11 (4.7%) were inappropriate. About 74.4% of all referrals were to investigate coronary artery disease (CAD). The most common indication deemed appropriate was the detection of CAD in the setting of suspected ischemic equivalent in patients with an intermediate pretest probability of CAD (65.9%). Most referrals deemed inappropriate were requested to detect CAD in asymptomatic patients at low or intermediate risk of CAD (63.6%). This audit demonstrates a relatively low rate of inappropriate CCTA referrals, indicating the appropriate and efficient use of this resource in the Mohammed bin Khalifa Cardiac Center. Agreement on and reclassification of "uncertain" cases by guideline authorities would facilitate a deeper understanding of referral appropriateness.

  1. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the unknown fears of embracing cloud computing, which stretch across dimensions such as leaders' fear of change and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  2. Human Systems Engineering for Launch processing at Kennedy Space Center (KSC)

    Science.gov (United States)

    Henderson, Gena; Stambolian, Damon B.; Stelges, Katrine

    2012-01-01

    Launch processing at Kennedy Space Center (KSC) is primarily accomplished by human users of expensive and specialized equipment. In order to reduce the likelihood of human error, personal injury, damage to hardware, and loss of mission, the design process for the hardware needs to include the human's relationship with the hardware. Just as the electrical, mechanical, and fluid aspects are addressed in design, the human aspect is just as important. The focus of this presentation is to illustrate how KSC includes the human aspect in design using human-centered hardware modeling and engineering. The presentation also explains the current and future plans for research and development aimed at improving our human factors analysis tools and processes.

  3. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
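    The two outcome measures above are commonly computed from pre- and post-instrumentation CBCT slices using Gambill's formulas. The sketch below is an illustrative assumption: the abstract does not spell out the study's exact equations (and it reports centering ability in millimetres, so its formula evidently differs from the ratio variant shown here), and the wall-thickness values are invented.

```python
# Illustrative formulas for canal transportation and centering
# (Gambill's method, a common convention; not necessarily the study's
# exact equations, and all measurement values are invented).

def transportation(m1, m2, d1, d2):
    """(m1 - m2) - (d1 - d2); 0.0 means no deviation from the original axis.

    m1, d1: mesial/distal dentin thickness before instrumentation (mm)
    m2, d2: mesial/distal dentin thickness after instrumentation (mm)
    """
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """Smaller wall removal divided by larger; 1.0 is perfectly centered."""
    a, b = m1 - m2, d1 - d2
    return 1.0 if max(a, b) == 0 else min(a, b) / max(a, b)

# One root level, measured pre/post on registered CT slices (values in mm):
t = transportation(1.20, 1.05, 0.90, 0.80)  # removed 0.15 mesially, 0.10 distally
c = centering_ratio(1.20, 1.05, 0.90, 0.80)
```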

  4. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Directory of Open Access Journals (Sweden)

    André PAGLIOSA

    2015-01-01

    Full Text Available Abstract: The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape.

  5. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    International Nuclear Information System (INIS)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio

    2015-01-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)

  6. Human-centered modeling in human reliability analysis: some trends based on case studies

    International Nuclear Information System (INIS)

    Mosneron-Dupin, F.; Reer, B.; Heslinga, G.; Straeter, O.; Gerdes, V.; Saliou, G.; Ullwer, W.

    1997-01-01

    As an informal working group of researchers from France, Germany, and The Netherlands created in 1993, the EARTH association investigates significant subjects in the field of human reliability analysis (HRA). Our initial review of cases from nuclear operating experience showed, on the one hand, that decision-based unrequired actions (DUA) contribute significantly to risk. On the other hand, our evaluation of current HRA methods showed that these methods do not cover such actions adequately; in particular, practice-oriented guidelines for their predictive identification are lacking. We assumed that a basic cause of these difficulties was that the methods use a limited representation of the stimulus-organism-response (SOR) paradigm. We proposed a human-centered model, which better highlights the active role of the operators and the importance of their culture, attitudes, and goals. This orientation was encouraged by our review of current HRA research activities. We therefore decided to pursue progress by identifying cognitive tendencies in the context of operating and simulator experience. For this purpose, advanced approaches for retrospective event analysis were discussed, and some orientations for improvements were proposed. By analyzing cases, various cognitive tendencies were identified, together with useful information about their context. Some of them match psychological findings already published in the literature; some are not covered adequately by the literature that we reviewed. Finally, this exploratory study shows that contextual and case-illustrated findings about cognitive tendencies provide useful help for the predictive identification of DUA in HRA. More research should be carried out to complement our findings and to elaborate more detailed and systematic guidelines for using them in HRA studies.

  7. Pain, Work-related Characteristics, and Psychosocial Factors among Computer Workers at a University Center.

    Science.gov (United States)

    Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia

    2014-04-01

    [Purpose] Complaint of pain is common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) underwent measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the no-rest-breaks group, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. Associations were also observed between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, high head protrusion, and shoulder angle while using the computer mouse.

  8. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new, fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right-foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R² = 0.99), with close agreement between computation methods. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
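    The agreement analysis described above can be sketched in a few lines: a Bland-Altman comparison computes the mean difference (bias) and the 95% limits of agreement between the two programs. The CPEI values below are invented for illustration; they are not the study's data.

```python
# Hedged sketch of a Bland-Altman agreement analysis between the manual
# and automated CPEI programs. All sample values are invented.

def bland_altman(a, b):
    """Return (bias, lower, upper): mean difference and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

manual    = [18.2, 21.5, 15.9, 24.1, 19.8]   # CPEI (%) from the original program
automated = [18.0, 21.7, 15.8, 24.3, 19.9]   # CPEI (%) from the automated algorithm
bias, lower, upper = bland_altman(manual, automated)
# A small bias with narrow limits indicates the two programs agree closely.
```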

  9. Intermittent control: a computational theory of human control.

    Science.gov (United States)

    Gawthrop, Peter; Loram, Ian; Lakie, Martin; Gollee, Henrik

    2011-02-01

    The paradigm of continuous control using internal models has advanced understanding of human motor control. However, this paradigm ignores some aspects of human control, including intermittent feedback, serial ballistic control, triggered responses and refractory periods. It is shown that event-driven intermittent control provides a framework to explain the behaviour of the human operator under a wider range of conditions than continuous control. Continuous control is included as a special case, but sampling, system matched hold, an intermittent predictor and an event trigger allow serial open-loop trajectories using intermittent feedback. The implementation here may be described as "continuous observation, intermittent action". Beyond explaining unimodal regulation distributions in common with continuous control, these features naturally explain refractoriness and bimodal stabilisation distributions observed in double stimulus tracking experiments and quiet standing, respectively. Moreover, given that human control systems contain significant time delays, a biological-cybernetic rationale favours intermittent over continuous control: intermittent predictive control is computationally less demanding than continuous predictive control. A standard continuous-time predictive control model of the human operator is used as the underlying design method for an event-driven intermittent controller. It is shown that when event thresholds are small and sampling is regular, the intermittent controller can masquerade as the underlying continuous-time controller and thus, under these conditions, the continuous-time and intermittent controller cannot be distinguished. This explains why the intermittent control hypothesis is consistent with the continuous control hypothesis for certain experimental conditions.
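    The "continuous observation, intermittent action" idea can be illustrated with a toy first-order tracking loop (an invented sketch, not the authors' model): the error is observed at every step, but the control signal is recomputed only when the error crosses an event threshold, and held constant otherwise.

```python
# Toy event-driven intermittent controller: the plant dx/dt = u is
# observed continuously, but the proportional action is recomputed
# only on threshold crossings (otherwise the last value is held).

def simulate(threshold=0.05, gain=2.0, dt=0.01, steps=500, target=1.0):
    x, u, updates = 0.0, 0.0, 0
    for _ in range(steps):
        error = target - x            # continuous observation
        if abs(error) > threshold:    # event trigger
            u = gain * error          # intermittent action
            updates += 1              # u is held between triggers
        x += dt * u                   # integrate the first-order plant
    return x, updates

x_final, n_updates = simulate()
# The state settles near the target with far fewer control updates than steps.
```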

  10. Computed tomography of human joints and radioactive waste drums

    International Nuclear Information System (INIS)

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-01-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life cycle of a product. Two diverse examples of CT are discussed: (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A and PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  11. The New Robotics-towards human-centered machines.

    Science.gov (United States)

    Schaal, Stefan

    2007-07-01

    Research in robotics has moved away from its primary focus on industrial applications. The New Robotics is a vision that has been developed in past years by our own university and many other national and international research institutions and addresses how increasingly more human-like robots can live among us and take over tasks where our current society has shortcomings. Elder care, physical therapy, child education, search and rescue, and general assistance in daily life situations are some of the examples that will benefit from the New Robotics in the near future. With these goals in mind, research for the New Robotics has to embrace a broad interdisciplinary approach, ranging from traditional mathematical issues of robotics to novel issues in psychology, neuroscience, and ethics. This paper outlines some of the important research problems that will need to be resolved to make the New Robotics a reality.

  12. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome consists of noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One well-studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and several approaches have been proposed to date. However, current state-of-the-art methods face limitations: the function of enhancers is clear, but their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions, and we comprehensively survey over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions, and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational…

  13. The design of neonatal incubators: a systems-oriented, human-centered approach.

    Science.gov (United States)

    Ferris, T K; Shepley, M M

    2013-04-01

    This report describes a multidisciplinary design project conducted in an academic setting reflecting a systems-oriented, human-centered philosophy in the design of neonatal incubator technologies. Graduate students in Architectural Design and Human Factors Engineering courses collaborated in a design effort that focused on supporting the needs of three user groups of incubator technologies: infant patients, family members and medical personnel. Design teams followed established human-centered design methods that included interacting with representatives from the user groups, analyzing sets of critical tasks and conducting usability studies with existing technologies. An iterative design and evaluation process produced four conceptual designs of incubators and supporting equipment that better address specific needs of the user groups. This report introduces the human-centered design approach, highlights some of the analysis findings and design solutions, and offers a set of design recommendations for future incubation technologies.

  14. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    …also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...... a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We...

  15. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.

  16. Shape perception in human and computer vision: an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci…

  17. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements in R and D of grid computing technology obtained so far. (T. Tanaka)

  18. Virtual reality/ augmented reality technology : the next chapter of human-computer interaction

    OpenAIRE

    Huang, Xing

    2015-01-01

    No matter how many different sizes and shapes computers have taken, their basic components remain the same. If we look at the development of computer history from the user's perspective, we can, perhaps surprisingly, find that it is the input/output devices that have led the industry's development; in a word, human-computer interaction has shaped the history of computing. Human-computer interaction has gone through three stages: the first stage relied on the inpu...

  19. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    Science.gov (United States)

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining portable batch scripting (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
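    The batch-processing pattern described above, fanning pipeline runs out over a PBS grid with one job per scan session, might be sketched as follows. The queue name, resource line, and `run_pipeline` command are hypothetical placeholders, not DAX's actual interface.

```python
# Sketch of generating one PBS job script per XNAT session so an
# imaging pipeline can fan out across an HPC cluster. The queue,
# resources, and `run_pipeline` command are invented placeholders.

def pbs_script(session_id, pipeline="fmri_preprocess"):
    """Build a minimal PBS job script for one scan session."""
    return "\n".join([
        "#!/bin/bash",
        "#PBS -N {}_{}".format(pipeline, session_id),
        "#PBS -l nodes=1:ppn=2,walltime=04:00:00",
        "#PBS -q standard",
        # Fetch inputs from XNAT, run the pipeline, upload the results:
        "run_pipeline --session {} --pipeline {}".format(session_id, pipeline),
    ])

jobs = [pbs_script(s) for s in ("SESS001", "SESS002", "SESS003")]
# Each script would then be handed to the scheduler with `qsub`.
```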

  20. My4Sight: A Human Computation Platform for Improving Flu Predictions

    OpenAIRE

    Akupatni, Vivek Bharath

    2015-01-01

    While many human computation (human-in-the-loop) systems exist in the field of Artificial Intelligence (AI) to solve problems that can't be solved by computers alone, comparatively few platforms exist for collecting human knowledge and for evaluating various techniques for harnessing human insights to improve forecasting models for infectious diseases, such as Influenza and Ebola. In this thesis, we present the design and implementation of My4Sight, a human computation system develope...

  1. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by…

  2. Examining human rights and mental health among women in drug abuse treatment centers in Afghanistan.

    Science.gov (United States)

    Abadi, Melissa Harris; Shamblen, Stephen R; Johnson, Knowlton; Thompson, Kirsten; Young, Linda; Courser, Matthew; Vanderhoff, Jude; Browne, Thom

    2012-01-01

    Denial of human rights, gender disparities, and living in a war zone can be associated with severe depression and poor social functioning, especially for female drug abusers. This study of Afghan women in drug abuse treatment (DAT) centers assesses (a) the extent to which these women have experienced human rights violations and mental health problems prior to entering the DAT centers, and (b) whether there are specific risk factors for human rights violations among this population. A total of 176 in-person interviews were conducted with female patients admitted to three drug abuse treatment centers in Afghanistan in 2010. Nearly all women (91%) reported limitations with social functioning. Further, 41% of the women indicated they had suicidal ideation, and 27% of the women had attempted suicide at least once in the 30 days prior to entering the DAT centers due to feelings of sadness or hopelessness. Half of the women (50%) had experienced at least one human rights violation in the year prior to entering the DAT centers. Risk factors for human rights violations among this population include marital status, ethnicity, literacy, employment status, entering treatment based on one's own desire, limited social functioning, and suicide attempts. Conclusions stemming from the results are discussed.

  3. What do we mean by Human-Centered Design of Life-Critical Systems?

    Science.gov (United States)

    Boy, Guy A

    2012-01-01

    Human-centered design is not a new approach to design. Aerospace is a good example of a life-critical systems domain where participatory design was fully integrated, involving experimental test pilots and design engineers as well as many other actors of the aerospace engineering community. This paper presents six topics that are currently part of the requirements of the Ph.D. Program in Human-Centered Design at the Florida Institute of Technology (FIT). This Human-Centered Design program offers principles, methods, and tools that support human-centered, sustainable products such as mission or process control environments, cockpits, and hospital operating rooms. It supports the education and training of design thinkers who are natural leaders and who understand the complex relationships among technology, organizations, and people. We all need to understand what we want to do with technology, how we should organize ourselves for a better life, and finally who we are and have become. Human-centered design is being developed for all these reasons and issues.

  4. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by…

  5. Evidence Report: Risk of Inadequate Human-Computer Interaction

    Science.gov (United States)

    Holden, Kritina; Ezer, Neta; Vos, Gordon

    2013-01-01

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments and from other safety-critical domains such as ground-based power plants and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes, (2) Information resources and support, (3) Allocation of attention, (4) Cognitive overload, (5) Environmentally induced perceptual changes, (6) Misperception and misinterpretation of displayed information, (7) Spatial disorientation, and (8) Displays and controls.

  6. A hypothesis on the formation of the primary ossification centers in the membranous neurocranium: a mathematical and computational model.

    Science.gov (United States)

    Garzón-Alvarado, Diego A

    2013-01-21

    This article develops a model of the appearance and location of the primary ossification centers in the calvaria. The model uses a system of reaction-diffusion equations for two molecules (BMP and Noggin) whose behavior is of the activator-substrate type; its solution produces Turing patterns, which represent the primary ossification centers. Additionally, the model includes the level of cell maturation as a function of the location of mesenchymal cells, so that mature cells can become osteoblasts through the action of BMP2. With this model we can therefore obtain two frontal primary centers, two parietal centers, and one, two, or more occipital centers. The locations of these centers in the simplified computational model are highly consistent with those found at the embryonic level. Copyright © 2012 Elsevier Ltd. All rights reserved.
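    The activator-substrate mechanism behind such models can be sketched numerically. The following is a minimal 1D Gray-Scott-type reaction-diffusion simulation, not the paper's BMP-Noggin system; all parameter values are illustrative, not taken from the article.

```python
import numpy as np

def gray_scott_1d(n=200, steps=5000, Du=0.16, Dv=0.08, F=0.035, k=0.060):
    """Minimal 1D activator-substrate (Gray-Scott) simulation.

    u is the substrate and v the autocatalytic activator; with suitable
    parameters the uniform state destabilizes into Turing-like structure.
    """
    u = np.ones(n)
    v = np.zeros(n)
    # Seed a perturbation in the middle of the domain plus a little noise.
    mid = slice(n // 2 - 10, n // 2 + 10)
    u[mid], v[mid] = 0.5, 0.25
    v += 0.01 * np.random.default_rng(0).random(n)
    for _ in range(steps):
        lap_u = np.roll(u, 1) + np.roll(u, -1) - 2 * u  # periodic Laplacian
        lap_v = np.roll(v, 1) + np.roll(v, -1) - 2 * v
        uvv = u * v * v  # autocatalytic reaction consumes substrate
        u += Du * lap_u - uvv + F * (1 - u)
        v += Dv * lap_v + uvv - (F + k) * v
    return u, v

u, v = gray_scott_1d()
print(f"activator spans [{v.min():.3f}, {v.max():.3f}] across the domain")
```

    In the article's setting, the analogous high-activator regions are where mature mesenchymal cells would become osteoblasts, marking ossification centers.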

  7. Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction

    Directory of Open Access Journals (Sweden)

    Shishkin S. L.

    2017-09-01

    Full Text Available Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain single output channels with their typical constraints imposed by the motor system’s information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of “eye mouse” superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer (EBCI) interface will have a chance to enable natural, fluent, and the

  8. Seeking Humanizing Care in Patient-Centered Care Process: A Grounded Theory Study.

    Science.gov (United States)

    Cheraghi, Mohammad Ali; Esmaeili, Maryam; Salsali, Mahvash

    Patient-centered care is both a goal in itself and a tool for enhancing health outcomes. The application of patient-centered care in health care services globally however is diverse. This article reports on a study that sought to introduce patient-centered care. The aim of this study is to explore the process of providing patient-centered care in critical care units. The study used a grounded theory method. Data were collected on 5 critical care units in Tehran University of Medical Sciences. Purposive and theoretical sampling directed the collection of data using 29 semistructured interviews with 27 participants (nurses, patients, and physician). Data obtained were analyzed according to the analysis stages of grounded theory and constant comparison to identify the concepts, context, and process of the study. The core category of this grounded theory is "humanizing care," which consisted of 4 interrelated phases, including patient acceptance, purposeful patient assessment and identification, understanding patients, and patient empowerment. A core category of humanizing care integrated the theory. Humanizing care was an outcome and process. Patient-centered care is a dynamic and multifaceted process provided according to the nurses' understanding of the concept. Patient-centered care does not involve repeating routine tasks; rather, it requires an all-embracing understanding of the patients and showing respect for their values, needs, and preferences.

  9. Examining human rights and mental health among women in drug abuse treatment centers in Afghanistan

    Directory of Open Access Journals (Sweden)

    Abadi MH

    2012-04-01

    Full Text Available Melissa Harris Abadi1, Stephen R Shamblen1, Knowlton Johnson1, Kirsten Thompson1, Linda Young1, Matthew Courser1, Jude Vanderhoff1, Thom Browne2; 1Pacific Institute for Research and Evaluation – Louisville Center, Louisville, KY, USA; 2United States Department of State, Bureau of International Narcotics and Law Enforcement, Washington, DC, USA. Abstract: Denial of human rights, gender disparities, and living in a war zone can be associated with severe depression and poor social functioning, especially for female drug abusers. This study of Afghan women in drug abuse treatment (DAT) centers assesses (a) the extent to which these women experienced human rights violations and mental health problems prior to entering the DAT centers, and (b) whether there are specific risk factors for human rights violations among this population. A total of 176 in-person interviews were conducted with female patients admitted to three drug abuse treatment centers in Afghanistan in 2010. Nearly all women (91%) reported limitations in social functioning. Further, 41% of the women reported suicidal ideation, and 27% had attempted suicide at least once, due to feelings of sadness or hopelessness, in the 30 days prior to entering the DAT centers. Half of the women (50%) experienced at least one human rights violation in the year prior to entering the DAT centers. Risk factors for human rights violations among this population include marital status, ethnicity, literacy, employment status, entering treatment of one's own desire, limited social functioning, and suicide attempts. Conclusions stemming from the results are discussed. Keywords: Afghanistan, women, human rights, mental health, drug abuse treatment

  10. Development of a framework of human-centered automation for the nuclear industry

    International Nuclear Information System (INIS)

    Nelson, W.R.; Haney, L.N.

    1993-01-01

    Introduction of automated systems into control rooms for advanced reactor designs is often justified on the basis of increased efficiency and reliability, without a detailed assessment of how the new technologies will influence the role of the operator. Such a ''technology-centered'' approach carries with it the risk that entirely new mechanisms for human error will be introduced, resulting in some unpleasant surprises when the plant goes into operation. The aviation industry has experienced some of these surprises since the introduction of automated systems into the cockpits of advanced technology aircraft. Pilot errors have actually been induced by automated systems, especially when the pilot doesn't fully understand what the automated systems are doing during all modes of operation. In order to structure the research program for investigating these problems, the National Aeronautics and Space Administration (NASA) has developed a framework for human-centered automation. This framework is described in the NASA document Human-Centered Aircraft Automation Philosophy by Charles Billings. It is the thesis of this paper that a corresponding framework of human-centered automation should be developed for the nuclear industry. Such a framework would serve to guide the design and regulation of automated systems for advanced reactor designs, and would help prevent some of the problems that have arisen in other applications that have followed a ''technology-centered'' approach

  11. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G [Albuquerque, NM

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  12. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] ... discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications.

  13. Human-computer interface glove using flexible piezoelectric sensors

    Science.gov (United States)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recoding. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.

  14. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    Full Text Available This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand, and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination, as well as the most reactive frequency band (between 8 and 30 Hz), was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs, in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks included most often in the 4-class combination were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) mental rotation, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, or imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely

  15. Simple, accurate equations for human blood O2 dissociation computations.

    Science.gov (United States)

    Severinghaus, J W

    1979-03-01

    Hill's equation can be slightly modified to fit the standard human blood O2 dissociation curve to within ±0.0055 fractional saturation (S) over the range 0 < S < 1. Other modifications of Hill's equation may be used to compute Po2 (Torr) from S (Eq. 2) and the temperature coefficient of Po2 (Eq. 3). Variation of the Bohr coefficient with Po2 is given by Eq. 4.

    S = ((Po2^3 + 150 Po2)^-1 x 23,400 + 1)^-1   (1)
    ln Po2 = 0.385 ln (S^-1 - 1)^-1 + 3.32 - (72 S)^-1 - 0.17 S^6   (2)
    delta ln Po2 / delta T = 0.058 ((0.243 x Po2/100)^3.88 + 1)^-1 + 0.013   (3)
    delta ln Po2 / delta pH = (Po2/26.6)^0.184 - 2.2   (4)

    Procedures are described to determine Po2 and S of blood iteratively after extraction or addition of a defined amount of O2, and to compute P50 of blood from a single sample after measuring Po2, pH, and S.
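    Eqs. 1 and 2 translate directly into code. A minimal sketch (function and variable names are ours, not from the paper):

```python
import math

def so2_from_po2(po2):
    """Eq. 1: fractional saturation S from Po2 in Torr."""
    return 1.0 / (23400.0 / (po2 ** 3 + 150.0 * po2) + 1.0)

def po2_from_so2(s):
    """Eq. 2: Po2 in Torr from fractional saturation S (0 < S < 1)."""
    ln_po2 = (0.385 * math.log(1.0 / (1.0 / s - 1.0))
              + 3.32 - 1.0 / (72.0 * s) - 0.17 * s ** 6)
    return math.exp(ln_po2)

print(f"S at 100 Torr: {so2_from_po2(100.0):.3f}")    # ~0.97-0.98
print(f"P50 estimate:  {po2_from_so2(0.5):.1f} Torr")  # ~27 Torr
```

    The two equations are mutually consistent only to within the stated ±0.0055 fit tolerance, so a round trip through both does not reproduce the input exactly.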

  16. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    OpenAIRE

    Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational cos...

  17. Assessing Human Judgment of Computationally Generated Swarming Behavior

    Directory of Open Access Journals (Sweden)

    John Harvey

    2018-02-01

    Full Text Available Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized it as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we assess subjects’ judgments of the behavior of a simplified version of Reynolds’ model. Factors that affect the identification of swarming are discussed, and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.

  18. Mode 2 in action. Working across sectors to create a Center for Humanities and Technology

    NARCIS (Netherlands)

    Wyatt, S.M.E.

    2015-01-01

    This article examines recent developments in Amsterdam to establish a Center for Humanities and Technology (CHAT). The project is a collaboration between public research institutions and a private partner. To date, a White Paper has been produced that sets out a shared research agenda addressing

  19. Dragons, Ladybugs, and Softballs: Girls' STEM Engagement with Human-Centered Robotics

    Science.gov (United States)

    Gomoll, Andrea; Hmelo-Silver, Cindy E.; Šabanovic, Selma; Francisco, Matthew

    2016-01-01

    Early experiences in science, technology, engineering, and math (STEM) are important for getting youth interested in STEM fields, particularly for girls. Here, we explore how an after-school robotics club can provide informal STEM experiences that inspire students to engage with STEM in the future. Human-centered robotics, with its emphasis on the…

  20. Radiological and Environmental Research Division, Center for Human Radiobiology. Annual report, July 1980-June 1981

    International Nuclear Information System (INIS)

    1982-03-01

    Separate abstracts were prepared for the 22 papers of this annual report of the Center for Human Radiobiology. Abstracts were not written for 2 appendices which contain data on the exposure and radium-induced malignancies of 2259 persons whose radium content has been determined at least once

  1. Astigmatic single photon emission computed tomography imaging with a displaced center of rotation

    International Nuclear Information System (INIS)

    Wang, H.; Smith, M.F.; Stone, C.D.; Jaszczak, R.J.

    1998-01-01

    A filtered backprojection algorithm is developed for single photon emission computed tomography (SPECT) imaging with an astigmatic collimator having a displaced center of rotation. The astigmatic collimator has two perpendicular focal lines, one that is parallel to the axis of rotation of the gamma camera and one that is perpendicular to this axis. Using SPECT simulations of projection data from a hot rod phantom and point source arrays, it is found that a lack of incorporation of the mechanical shift in the reconstruction algorithm causes errors and artifacts in reconstructed SPECT images. The collimator and acquisition parameters in the astigmatic reconstruction formula, which include focal lengths, radius of rotation, and mechanical shifts, are often partly unknown and can be determined using the projections of a point source at various projection angles. The accurate determination of these parameters by a least squares fitting technique using projection data from numerically simulated SPECT acquisitions is studied. These studies show that the accuracy of parameter determination is improved as the distance between the point source and the axis of rotation of the gamma camera is increased. The focal length to the focal line perpendicular to the axis of rotation is determined more accurately than the focal length to the focal line parallel to this axis. copyright 1998 American Association of Physicists in Medicine

  2. Establishment of computed tomography reference dose levels in Onassis Cardiac Surgery Center

    International Nuclear Information System (INIS)

    Tsapaki, V.; Kyrozi, E.; Syrigou, T.; Mastorakou, I.; Kottou, S.

    2001-01-01

    The purpose of the study was to apply European Commission (EC) reference dose levels (RDLs) to computed tomography (CT) examinations at Onassis Cardiac Surgery Center (OCSC): the weighted CT dose index (CTDIw) for a single slice and the dose-length product (DLP) for a complete examination. During the period 1998-1999, the total number of CT examinations, every type of CT examination, patient-related data, and the technical parameters of the examinations were recorded. The most frequent examinations (head, chest, abdomen, and pelvis) were chosen for investigation. CTDI measurements were performed and CTDIw and DLP were calculated. Third-quartile values of CTDIw were 43 mGy for head, 8 mGy for chest, and 22 mGy for abdomen and pelvis examinations. Third-quartile values of DLP were 740 mGy cm for head, 370 mGy cm for chest, 490 mGy cm for abdomen, and 420 mGy cm for pelvis examinations. The results confirm that OCSC successfully meets the proposed RDLs for head, chest, abdomen, and pelvis examinations in terms of radiation dose. (author)
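    The RDL methodology, taking the third quartile of a local dose survey and comparing it against a reference level, can be sketched as follows. The survey values and the reference level below are illustrative placeholders, not the OCSC measurements or the EC figures:

```python
import statistics

# Hypothetical per-patient CTDIw survey for head CT examinations (mGy).
head_ctdiw_mgy = [38, 39, 40, 41, 42, 44, 45, 46]

# statistics.quantiles with n=4 returns the three quartile cut points;
# index 2 is the third quartile conventionally used for RDL comparisons.
q3 = statistics.quantiles(head_ctdiw_mgy, n=4)[2]

HEAD_RDL_MGY = 60.0  # placeholder reference level; consult the EC guidance
print(f"third quartile = {q3:.2f} mGy; within RDL: {q3 <= HEAD_RDL_MGY}")
```

    The same comparison applies per examination type to DLP values for complete examinations.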

  3. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  4. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made to predict the effects of such mutations. The most successful of these is a computational approach called ''Sorting Intolerant From Tolerant'' (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
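    The conservation idea underlying SIFT can be illustrated in a deliberately simplified form (this is not the SIFT algorithm itself) by scoring each column of a multiple sequence alignment with Shannon entropy; perfectly conserved, and therefore likely functionally important, positions score zero:

```python
import math
from collections import Counter

def column_entropies(alignment):
    """Shannon entropy (bits) for each column of a gapless alignment."""
    length = len(alignment[0])
    entropies = []
    for i in range(length):
        counts = Counter(seq[i] for seq in alignment)
        total = sum(counts.values())
        h = -sum((c / total) * math.log2(c / total) for c in counts.values())
        entropies.append(h)
    return entropies

# Toy alignment of four hypothetical homologous protein fragments.
msa = ["MKTAY", "MKTGY", "MRTAY", "MKTAF"]
ents = column_entropies(msa)
conserved = [i for i, h in enumerate(ents) if h == 0.0]
print("conserved columns:", conserved)  # columns 0 (all M) and 2 (all T)
```

    Under this view, a SNP that changes a residue at a low-entropy (highly conserved) column is a candidate for being deleterious; SIFT refines this with substitution probabilities rather than raw entropy.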

  5. NASA Human Health and Performance Center: Open innovation successes and collaborative projects

    Science.gov (United States)

    Richard, Elizabeth E.; Davis, Jeffrey R.

    2014-11-01

    In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, setting the course for development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the successful execution of the strategy, driving organizational change through open innovation efforts and collaborative projects, including efforts of the NASA Human Health and Performance Center (NHHPC).

  6. Abstracts of the International Congress of Research Center in Sports Sciences, Health Sciences & Human Development (2016)

    Directory of Open Access Journals (Sweden)

    Vitor Reis

    2017-06-01

    Full Text Available The papers published in this book of abstracts/proceedings were submitted to the Scientific Commission of the International Congress of Research Center in Sports Sciences, Health Sciences & Human Development, held on 11 and 12 November 2016 at the University of Évora, Évora, Portugal, under the topic of Exercise and Health, Sports and Human Development. The content of the abstracts is the sole responsibility of their authors. The editors and the Scientific Committee of the International Congress of Research Center in Sports Sciences, Health Sciences & Human Development do not assume any responsibility for the opinions and statements expressed by the authors. Partial reproduction of the texts and their use for noncommercial purposes are permitted, provided the source/reference is duly cited.

  7. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    Science.gov (United States)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly within a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world, experiencing mixed computer-graphics fantasy-reality through the wearable computers they carry. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that is anchored in physicality, mobility, social interaction, and ubiquitous computing.

  8. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  9. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities.

  10. Human-computer interaction for alert warning and attention allocation systems of the multimodal watchstation

    Science.gov (United States)

    Obermayer, Richard W.; Nugent, William A.

    2000-11-01

    The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human-computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  12. A Human/Computer Learning Network to Improve Biodiversity Conservation and Research

    OpenAIRE

    Kelling, Steve; Gerbracht, Jeff; Fink, Daniel; Lagoze, Carl; Wong, Weng-Keen; Yu, Jun; Damoulas, Theodoros; Gomes, Carla

    2012-01-01

    In this paper we describe eBird, a citizen-science project that takes advantage of the human observational capacity to identify birds to species, which is then used to accurately represent patterns of bird occurrences across broad spatial and temporal extents. eBird employs artificial intelligence techniques such as machine learning to improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a Human-Computer Learning Network,...

  13. A collaborative brain-computer interface for improving human performance.

    Directory of Open Access Journals (Sweden)

    Yijun Wang

    Full Text Available Electroencephalogram (EEG) based brain-computer interfaces (BCI) have been studied since the 1970s. Currently, the main focus of BCI research lies on clinical use, which aims to provide a new communication channel to patients with motor disabilities to improve their quality of life. However, BCI technology can also be used to improve human performance for normal healthy users. Although this application has been proposed for a long time, little progress has been made in real-world practice due to the technical limits of EEG. To overcome the bottleneck of low single-user BCI performance, this study proposes a collaborative paradigm that improves overall BCI performance by integrating information from multiple users. To test the feasibility of a collaborative BCI, this study quantitatively compares the classification accuracies of collaborative and single-user BCIs applied to EEG data collected from 20 subjects in a movement-planning experiment. This study also explores three different methods for fusing and analyzing EEG data from multiple subjects: (1) event-related potential (ERP) averaging, (2) feature concatenating, and (3) voting. In a demonstration system using the voting method, the classification accuracy of predicting movement direction (reaching left vs. reaching right) was enhanced substantially from 66% to 80%, 88%, 93%, and 95% as the number of subjects increased from 1 to 5, 10, 15, and 20, respectively. Furthermore, the decision on reaching direction could be made around 100-250 ms earlier than the subject's actual motor response by decoding the ERP activities arising mainly from the posterior parietal cortex (PPC), which are related to the processing of visuomotor transmission. Taken together, these results suggest that a collaborative BCI can effectively fuse the brain activities of a group of people to improve the overall performance of natural human behavior.
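    The gain from the voting method can be sketched with a small simulation. This is a hypothetical illustration, not the study's code: it assumes each subject independently classifies the movement direction with the 66% single-user accuracy quoted above and that the group takes a simple majority vote over the predicted labels, whereas the real system fused actual EEG activity.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_votes(n_subjects, n_trials=2000, p_single=0.66):
    """Accuracy of a majority vote over n_subjects independent classifiers,
    each correct with probability p_single (independence is an assumption)."""
    truth = rng.integers(0, 2, n_trials)          # 0 = reach left, 1 = reach right
    correct = rng.random((n_subjects, n_trials)) < p_single
    preds = np.where(correct, truth, 1 - truth)   # wrong subjects flip the label
    votes = preds.sum(axis=0)                     # count of "reach right" predictions
    group = (votes * 2 > n_subjects).astype(int)  # majority vote; ties fall to "left"
    return (group == truth).mean()

for n in (1, 5, 10, 15, 20):
    print(n, round(simulate_votes(n), 2))
```

    Under these independence assumptions the simulated accuracy climbs from roughly the single-user level toward the mid-90% range at 20 subjects, the same trend the study reports; correlated brain responses across real subjects would yield smaller gains.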

  14. Transforming Education for a Transition into Human-centered Economy and Post-normal Times

    Directory of Open Access Journals (Sweden)

    Elif Çepni

    2017-10-01

    -driven. Diversity of educational models, even within a given country, is something that should be encouraged (Chuan, 2015). The main aim of this paper is to discuss and show the need for new alternative education systems that could eliminate the basic deficiencies of the current systems in post-normal times. Citing the main reasons behind the necessity of formulating new ways of thinking and using them in the formulation of new education policies is another aim of the paper. The know-how and analytical skills that have made people indispensable in the knowledge economy will no longer give them an advantage over increasingly intelligent machines. Employees in a human-centered economy will need to possess values like creativity, character, passion, and collaboration that cannot be programmed into computer traits. Our human qualities will set us apart from machines and make organizations superior (Seidman, 2014). The fundamental gap between the clear success of knowledge acquisition in the natural sciences and the rather minimal success in understanding the dynamics of the social realm is the inherent non-linearity, instability, and uncertainty of social systems' behaviour. There could be possible alternative ways of closing this gap. Today we need deep ecological ethics, especially in science. Sometimes what scientists do is not life-furthering and life-preserving, but life-destroying. The systems view of life (the whole is bigger than the sum of its parts) may overcome the Cartesian metaphor. Physics, together with chemistry, is essential to understand the behaviour of the molecules in living cells, but it is not sufficient to describe their self-organizing patterns and processes. Every system, and every part of it, is connected to every other system, at least indirectly. Systems and parts of a system interact in ways that can produce surprising and counter-intuitive results. The tendency to produce unexpected results makes predicting the outcome of systems' interaction

  15. [Computational prediction of human immunodeficiency resistance to reverse transcriptase inhibitors].

    Science.gov (United States)

    Tarasova, O A; Filimonov, D A; Poroikov, V V

    2017-10-01

    Human immunodeficiency virus (HIV) causes acquired immunodeficiency syndrome (AIDS) and leads to over one million deaths annually. Highly active antiretroviral treatment (HAART) is the gold standard in HIV/AIDS therapy. Nucleoside and non-nucleoside inhibitors of HIV reverse transcriptase (RT) are an important component of HAART, but their effect depends on HIV susceptibility/resistance. HIV resistance mainly occurs due to mutations leading to conformational changes in the three-dimensional structure of HIV RT. The aim of our work was to develop and test a computational method for predicting HIV resistance associated with mutations in HIV RT. Earlier we developed a method for predicting HIV type 1 (HIV-1) resistance; it is based on the use of position-specific descriptors. These descriptors are generated from a particular amino acid residue and its position, where the position of a given residue is determined in a multiple alignment. The training set consisted of more than 1900 sequences of HIV RT from the Stanford HIV Drug Resistance Database; for these HIV RT variants, experimental data on their resistance to ten inhibitors are available. The balanced accuracy of prediction varies from 80% to 99% depending on the classification method (support vector machine, naive Bayes, random forest, convolutional neural networks) and the drug whose resistance is predicted. Maximal balanced accuracy was obtained for prediction of resistance to zidovudine, stavudine, didanosine and efavirenz by the random forest classifier. The average accuracy of prediction is 89%.
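    The position-specific descriptors can be sketched as a one-hot encoding over alignment columns. This is a hypothetical illustration, not the authors' implementation: the sequence below is an arbitrary toy fragment, and in the real method such vectors for the >1900 aligned RT sequences are fed to classifiers such as a random forest.

```python
# Each (alignment position, amino acid residue) pair becomes one binary feature.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def position_descriptors(aligned_seq):
    """Sparse set of (position, residue) features for one aligned sequence."""
    return {(i, aa) for i, aa in enumerate(aligned_seq) if aa in AMINO}

def to_vector(features, length):
    """Dense 0/1 vector with one slot per (position, residue) combination."""
    vec = [0] * (length * len(AMINO))
    for pos, aa in features:
        vec[pos * len(AMINO) + AMINO.index(aa)] = 1
    return vec

seq = "PISPIETVPVKLKPGMDGPKVKQ"   # toy fragment, not a real RT alignment row
vec = to_vector(position_descriptors(seq), len(seq))
print(len(vec), sum(vec))          # 20 slots per position, one active per position
```

    Because every feature is tied to an alignment column, a resistance-associated mutation at a given RT position maps to a distinct input dimension, which is what lets a classifier attribute resistance to specific substitutions.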

  16. Institutionalizing human-computer interaction for global health.

    Science.gov (United States)

    Gulliksen, Jan

    2017-06-01

    Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses, and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015, what these could contribute to the development of global health, and their relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization, and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that digitalization evokes is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world, but it needs to be developed based on local practices, it needs international support, and it should not be limited by technological constraints. Digitalization to support global health in particular requires a profound understanding of users and their context, arguing for user-centred systems design methodologies as particularly suitable.

  17. Remotely Telling Humans and Computers Apart: An Unsolved Problem

    Science.gov (United States)

    Hernandez-Castro, Carlos Javier; Ribagorda, Arturo

    The ability to tell humans and computers apart is imperative to protect many services from misuse and abuse. For this purpose, tests called CAPTCHAs or HIPs have been designed and put into production. Recent history shows that most (if not all) can be broken given enough time and commercial interest: CAPTCHA design seems to be a much more difficult problem than previously thought. The assumption that difficult AI problems can be easily converted into valid CAPTCHAs is misleading. There are also some extrinsic problems that do not help, especially the large number of in-house designs that are put into production without any prior public critique. In this paper we present a state-of-the-art survey of current HIPs, including proposals that are now in production. We classify them by their basic design ideas. We discuss current attacks as well as future attack paths, and we also present common errors in design and how implementation flaws can transform a not necessarily bad idea into a weak CAPTCHA. We present examples of these flaws using specific well-known CAPTCHAs. In a more theoretical vein, we discuss the threat model: the risks confronted and the countermeasures. Finally, we introduce and discuss some desirable properties that new HIPs should have, concluding with some proposals for future work, including methodologies for design, implementation, and security assessment.

  18. Inferring Human Activity in Mobile Devices by Computing Multiple Contexts.

    Science.gov (United States)

    Chen, Ruizhi; Chu, Tianxing; Liu, Keqiang; Liu, Jingbin; Chen, Yuwei

    2015-08-28

    This paper introduces a framework for inferring human activities on mobile devices by computing spatial contexts, temporal contexts, spatiotemporal contexts, and user contexts. A spatial context is a significant location that is defined as a geofence, which can be a node associated with a circle, or a polygon; a temporal context contains time-related information such as a local time tag, a time difference between geographical locations, or a timespan; a spatiotemporal context is defined as a dwelling length at a particular spatial context; and a user context includes user-related information such as the user's mobility contexts, environmental contexts, psychological contexts, or social contexts. Using the measurements of the built-in sensors and radio signals in mobile devices, we can snapshot a contextual tuple every second that includes the aforementioned contexts. Given a contextual tuple, the framework evaluates the posterior probability of each candidate activity in real time using a Naive Bayes classifier. A large dataset containing 710,436 contextual tuples was recorded over one week in an experiment carried out at Texas A&M University Corpus Christi with three participants. The test results demonstrate that the multi-context solution significantly outperforms the spatial-context-only solution: a classification accuracy of 61.7% is achieved for the spatial-context-only solution, while 88.8% is achieved for the multi-context solution.
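    The per-second inference step can be sketched as a categorical Naive Bayes classifier over contextual tuples. This is a hypothetical sketch (toy context values, add-one smoothing, a handful of training observations), not the paper's implementation:

```python
import math
from collections import defaultdict

class NaiveBayesActivity:
    """Scores candidate activities from a contextual tuple
    (spatial, temporal, spatiotemporal, ... contexts as categorical values)."""

    def __init__(self):
        self.class_counts = defaultdict(int)
        self.feature_counts = defaultdict(lambda: defaultdict(int))

    def update(self, activity, tuple_):
        self.class_counts[activity] += 1
        for i, value in enumerate(tuple_):
            self.feature_counts[activity][(i, value)] += 1

    def posterior(self, tuple_):
        """Return the activity with the highest (log) posterior probability."""
        total = sum(self.class_counts.values())
        scores = {}
        for act, n in self.class_counts.items():
            logp = math.log(n / total)                       # prior
            for i, value in enumerate(tuple_):
                count = self.feature_counts[act][(i, value)]
                logp += math.log((count + 1) / (n + 2))      # add-one smoothing
            scores[act] = logp
        return max(scores, key=scores.get)

# Toy observations: (spatial, temporal, spatiotemporal) contexts.
nb = NaiveBayesActivity()
nb.update("studying", ("library", "afternoon", "long-dwell"))
nb.update("studying", ("library", "evening", "long-dwell"))
nb.update("commuting", ("bus-stop", "morning", "short-dwell"))
print(nb.posterior(("library", "afternoon", "long-dwell")))  # -> studying
```

    In the paper's framework one such tuple is snapshotted every second from sensor and radio measurements, so the classifier runs continuously rather than on a fixed test set.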

  19. Annual report of R and D activities in Center for Promotion of Computational Science and Engineering and Center for Computational Science and e-Systems from April 1, 2005 to March 31, 2006

    International Nuclear Information System (INIS)

    2007-03-01

    This report provides an overview of research and development activities in the Center for Computational Science and Engineering (CCSE), JAERI, in the former half of fiscal year 2005 (April 1, 2005 - Sep. 30, 2005) and those in the Center for Computational Science and e-Systems (CCSE), JAEA, in the latter half of fiscal year 2005 (Oct. 1, 2005 - March 31, 2006). In the former half term, the activities were performed by five research groups in CCSE: the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group. At the beginning of the latter half term, these five groups were integrated into two offices, the Simulation Technology Research and Development Office and the Computer Science Research and Development Office, at the moment of the unification of JNC (Japan Nuclear Cycle Development Institute) and JAERI (Japan Atomic Energy Research Institute), and the latter-half activities were operated by these two offices. A major project, the ITBL (Information Technology Based Laboratory) project, and fundamental computational research for atomic energy plants were performed mainly by two groups, the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy, in the former half term, and by their integrated office, the Computer Science Research and Development Office, in the latter half. The main result was verification by using structure analysis for a real plant executable on the Grid environment, which received an Honorable Mention in the Analytic Challenge at the conference 'Supercomputing (SC05)'. The materials science and bioinformatics work in the atomic energy research field was carried out by the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics

  20. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  1. NASA Human Health and Performance Center: Open Innovation Successes and Collaborative Projects

    Science.gov (United States)

    Davis, Jeffrey R.; Richard, Elizabeth E.

    2014-01-01

    In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, which resulted in the development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the open innovation successes and collaborative projects developed over this timeframe, including the efforts of the NASA Human Health and Performance Center (NHHPC), which was established to advance human health and performance innovations for spaceflight and societal benefit via collaboration in new markets.

  2. Human-Centered Development of an Online Social Network for Metabolic Syndrome Management.

    Science.gov (United States)

    Núñez-Nava, Jefersson; Orozco-Sánchez, Paola A; López, Diego M; Ceron, Jesus D; Alvarez-Rosero, Rosa E

    2016-01-01

    According to the International Diabetes Federation (IDF), a quarter of the world's population has Metabolic Syndrome (MS). The objective was to develop an online social network for patients who suffer from Metabolic Syndrome, based on the recommendations and requirements of Human-Centered Design, and to assess the users' degree of satisfaction with it. Following the recommendations of ISO 9241-210 for Human-Centered Design (HCD), an online social network was designed to promote physical activity and healthy nutrition. In order to guarantee the active participation of users during the development of the social network, a survey, an in-depth interview, a focus group, and usability tests were carried out with people suffering from MS. The study demonstrated how the different activities, recommendations, and requirements of ISO 9241-210 can be integrated into a traditional software development process. Early usability tests demonstrated that user acceptance and the effectiveness and efficiency of the social network are satisfactory.

  3. Toward human-centered man-machine system in nuclear power plants

    International Nuclear Information System (INIS)

    Tanabe, Fumiya

    1993-01-01

    The Japanese LWR power plants are classified into four categories from the viewpoints of the control panel in the central control room and the extent of automation. Their characteristics are outlined. The potential weaknesses inherent in the conventional approaches are discussed: the loss of applicability to unanticipated events and the loss of operator morale. The need to construct a human-centered man-machine system is emphasized in order to overcome these potential weaknesses. The most important features required of the system are, in the short term, to support operators in difficulties and, at the same time, in the long term, to assure the acquisition and conservation of the personnel's morale and potential to cope with problems. The concepts of the 'ecological interface' and the 'adaptive aiding' system are introduced as design concepts for the human-centered man-machine system. (J.P.N.)

  4. Give Design a Chance: A Case for a Human Centered Approach to Operational Art

    Science.gov (United States)

    2017-03-30

    shortcoming, organizational theorist Jamshid Gharajedaghi suggested, "design is a vehicle for enhancement of choice and holistic thinking" that goes beyond... To address this question and confront assumptions and current methods of thinking, there is a need for a holistic and human centered approach in... (MDMP). This monograph proposes a way of thinking and planning that goes beyond current Army doctrinal methodologies to address the changing

  5. A review of the design and development processes of simulation for training in healthcare - A technology-centered versus a human-centered perspective.

    Science.gov (United States)

    Persson, Johanna

    2017-01-01

    This article reviews the literature on simulation systems for training in healthcare with regard to the prevalence of human-centered approaches in the design and development of these systems, motivated by a tradition in this field of working technology-centered. The results show that the focus on human needs and context of use is limited. It is argued that reducing the focus on technical advancement in favor of the needs of the users and the healthcare community, underpinned by human factors and ergonomics theory, is preferable. Given the low number of identified articles describing or discussing human-centered approaches, it is furthermore concluded that the publication culture promotes technical descriptions and summative evaluations rather than descriptions of, and reflections on, the design and development processes. Shifting from a technology-centered approach to a human-centered one can aid in creating simulation systems for training in healthcare that are: 1) relevant to the learning objectives, 2) adapted to the needs of users, context, and task, and 3) not selected on the basis of technical or fidelity criteria. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Human resources management in fitness centers and their relationship with the organizational performance

    Directory of Open Access Journals (Sweden)

    Jerónimo García Fernández

    2014-12-01

    Full Text Available Purpose: Human capital is essential in organizations providing sports services. However, there are few studies that examine which practices are carried out and whether they help sports organizations achieve better results. The aim of this paper is therefore to analyze human resource management practices in private fitness centers and their relationship with organizational performance. Design/methodology/approach: A questionnaire was administered to 101 managers of private fitness centers in Spain; exploratory and confirmatory factor analyses and linear regressions between the variables were performed. Findings: In fitness organizations, training, reward, communication, and selection practices are positively correlated with organizational performance. Research limitations/implications: Convenience sampling in a single country limits the extrapolation of the results. Originality/value: First, the paper is a contribution given the absence of studies analyzing human resource management in sports organizations from the point of view of top managers. It also allows fitness center managers to adopt practices that improve organizational performance.

  7. Human-Centred Computing for Assisting Nuclear Safeguards

    International Nuclear Information System (INIS)

    Szoke, I.

    2015-01-01

    With the rapid evolution of enabling hardware and software, technologies including 3D simulation, virtual reality (VR), augmented reality (AR), advanced user interfaces (UI), and geographical information systems (GIS) are increasingly employed in many aspects of modern life. In line with this, the nuclear industry is rapidly adopting emerging technologies to improve efficiency and safety by supporting the planning and optimization of maintenance and decommissioning work, as well as knowledge management, surveillance, training and briefing of field operatives, education, etc. For many years, the authors have been involved in research and development (R&D) into the application of 3D simulation, VR, and AR, for mobile, desktop, and immersive 3D systems, to provide a greater sense of presence and situation awareness for training, briefing, and in situ work by field operators. This work has resulted in a unique software base and experience (documented in numerous reports) from evaluating the effects of the design of training programmes and briefing sessions on human performance and training efficiency when applying various emerging technologies. In addition, the authors are involved in R&D into the use of 3D simulation, advanced UIs, mobile computing, and GIS systems to support realistic visualization of the combined radiological and geographical environment, as well as the acquisition, analysis, visualization, and sharing of radiological and other data within nuclear installations and their surroundings. The toolkit developed by the authors, and the associated knowledge base, has been successfully applied to various aspects of the nuclear industry and has great potential within the safeguards domain. It can be used to train safeguards inspectors, brief inspectors before inspections, assist inspectors in situ (data registration, analysis, and communication), support the design and verification of safeguards systems, conserve data and experience, educate future safeguards

  8. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods that are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development, the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken: that of user-centered design, the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and, if necessary, adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered an ischemic brain stem stroke, leading to a severe motor and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in the literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficits in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  9. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  10. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 code has been migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the Cray-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration and to summarize some of the acceptance test results, which have shown that the migrated code is functioning correctly in the new environment.

  11. L'ordinateur a visage humain (The Computer in Human Guise).

    Science.gov (United States)

    Otman, Gabriel

    1986-01-01

    Discusses the tendency of humans to describe parts and functions of a computer with terminology that refers to human characteristics; for example, parts of the body (electronic brain), intellectual activities (optical memory), and physical activities (command). Computers are also described through metaphors, connotations, allusions, and analogies…

  12. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  13. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, NIC/JSC had already installed the first phase of the GCS HPC Tier-0 resources in May 2009, an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe alread...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data). In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  15. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    Science.gov (United States)

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  16. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel , Ronan; Voyer , Damien; Nicolas , Laurent; Scorretti , Riccardo; Burais , Noël

    2011-01-01

    International audience; Computation of electromagnetic fields in high resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom issued from the Visible Human.
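The combination named in this abstract, a Schwarz preconditioner driving a Krylov subspace method, can be sketched in a few lines. This is only an illustrative stand-in, not the authors' solver: a one-level additive Schwarz (non-overlapping block-Jacobi) preconditioner for a small 1D Laplacian, used with conjugate gradients from SciPy.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative sketch only: a one-level additive Schwarz (block-Jacobi)
# preconditioner for a 1D Laplacian, driving a Krylov method (CG).
# The matrix, subdomain count, and sizes are invented for demonstration.
n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Partition unknowns into contiguous subdomains and pre-factor each block.
nsub = 4
size = n // nsub
blocks = [slice(i * size, (i + 1) * size) for i in range(nsub)]
local_solvers = [spla.splu(A[s, s].tocsc()) for s in blocks]

def schwarz_apply(r):
    """Additive Schwarz: sum the local subdomain solves of the residual."""
    z = np.zeros_like(r)
    for s, solver in zip(blocks, local_solvers):
        z[s] += solver.solve(r[s])
    return z

M = spla.LinearOperator(A.shape, matvec=schwarz_apply, dtype=float)
x, info = spla.cg(A, b, M=M)
print(info)  # 0 means the Krylov iteration converged
```

In a real phantom computation the subdomains would be overlapping 3D regions and the local solves would run in parallel; the structure of the preconditioner application, however, is the same.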

  17. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computer that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  18. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    Science.gov (United States)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
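The 'modality server' pattern described in this abstract can be illustrated with a toy socket service: a stub recognizer hidden behind a network interface that any front end can query. The one-word protocol, the stub function, and all names below are invented for illustration; real modality servers would expose far richer interfaces.

```python
import socket
import threading

# Toy illustration of a "modality server": each modality (speech, gaze, ...)
# sits behind a socket and answers high-level requests, hiding the engine
# behind it. The request/reply strings here are hypothetical.

def speech_modality_stub(request: str) -> str:
    # Stand-in for a real recognizer; maps a request to a canned result.
    return "HYPOTHESIS:open map" if request == "RECOGNIZE" else "ERROR:unknown"

def serve_one(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024).decode().strip()
        conn.sendall(speech_modality_stub(request).encode())

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # OS-assigned port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_one, args=(server,), daemon=True).start()

# Client side: the front end talks to the modality only through the socket,
# so the server's internals can change without touching the client.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"RECOGNIZE")
    reply = client.recv(1024).decode()
print(reply)  # -> HYPOTHESIS:open map
```

The decoupling shown here is the point of the architecture: vendor packages can be swapped inside the server without client-side changes.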

  19. Flat panel computed tomography of human ex vivo heart and bone specimens: initial experience

    Energy Technology Data Exchange (ETDEWEB)

    Nikolaou, Konstantin; Becker, Christoph R.; Reiser, Maximilian F. [Ludwig-Maximilians-University, Department of Clinical Radiology, Munich (Germany); Flohr, Thomas; Stierstorfer, Karl [CT Division, Siemens Medical Solutions, Forchheim (Germany)

    2005-02-01

    The aim of this technical investigation was the detailed description of a prototype flat panel detector computed tomography system (FPCT) and its initial evaluation in an ex vivo setting. The prototype FPCT scanner consists of a conventional radiographic flat panel detector, mounted on a multi-slice CT scanner gantry. Explanted human ex vivo heart and foot specimens were examined. Images were reformatted with various reconstruction algorithms and were evaluated for high-resolution anatomic information. For comparison purposes, the ex vivo specimens were also scanned with a conventional 16-detector-row CT scanner (Sensation 16, Siemens Medical Solutions, Forchheim, Germany). With the FPCT prototype used, a 1,024 x 768 resolution matrix can be obtained, resulting in an isotropic voxel size of 0.25 x 0.25 x 0.25 mm at the iso-center. Due to the high spatial resolution, very small structures such as trabecular bone or third-degree, distal branches of coronary arteries could be visualized. This first evaluation showed that flat panel detector systems can be used in a cone-beam computed tomography scanner and that very high spatial resolutions can be achieved. However, there are limitations for in vivo use due to constraints in low contrast resolution and slow scan speed. (orig.)

  20. The Internet and Computer User Profile: a questionnaire for determining intervention targets in occupational therapy at mental health vocational centers.

    Science.gov (United States)

    Regev, Sivan; Hadas-Lidor, Noami; Rosenberg, Limor

    2016-08-01

    In this study, the assessment tool "Internet and Computer User Profile" questionnaire (ICUP) is presented and validated. It was developed to gather information for setting intervention goals that meet current demands. Sixty-eight subjects aged 23-68 participated in the study. The study group (n = 28) was sampled from two vocational centers. The control group consisted of 40 participants from the general population, recruited by convenience sampling based on the demographics of the study group. Subjects from both groups answered the ICUP questionnaire. Subjects of the study group also answered the General Self-Efficacy (GSE) questionnaire and performed the Assessment of Computer Task Performance (ACTP) test in order to examine the convergent validity of the ICUP. Twenty subjects from both groups retook the ICUP questionnaire in order to obtain test-retest results. Differences between groups were tested using multiple analysis of variance (MANOVA) tests. Pearson and Spearman's tests were used for calculating correlations. Cronbach's alpha coefficient and k equivalent were used to assess internal consistency. The results indicate that the questionnaire is valid and reliable. They emphasize that the layout of the ICUP items facilitates a comprehensive examination of the client's perception of his or her participation in computer and internet activities. Implications for Rehabilitation The assessment tool "Internet and Computer User Profile" (ICUP) questionnaire is a novel assessment tool that evaluates operative use and individual perception of computer activities. The questionnaire is valid and reliable for use with participants of vocational centers dealing with mental illness. It is essential to facilitate access to computers for people with mental illnesses, given that they express similar interest in computers and the internet as people from the general population of the same age. Early intervention will be particularly effective for young

  1. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    Science.gov (United States)

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.
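The planned Bayesian analysis of simulated trial outcomes can be illustrated, purely schematically, with a conjugate Beta-Binomial update comparing two simulated home configurations. Every number below (client counts, outcome rates, the uniform prior) is invented for illustration and has no connection to the iHome data.

```python
import random

# Schematic Beta-Binomial update, in the spirit of "Bayesian analysis of
# simulated trial data". All rates and counts are invented placeholders.
random.seed(1)

def simulate_outcomes(n_clients, p_positive):
    """Count simulated positive client outcomes under one configuration."""
    return sum(random.random() < p_positive for _ in range(n_clients))

def posterior_mean(successes, n, a=1.0, b=1.0):
    """Beta(a, b) prior -> Beta(a + successes, b + failures) posterior mean."""
    return (a + successes) / (a + b + n)

# Hypothetical comparison: person-centered vs. standard configuration.
n = 400
pc = simulate_outcomes(n, 0.75)    # invented "person-centered" outcome rate
std = simulate_outcomes(n, 0.50)   # invented "standard" outcome rate
print(posterior_mean(pc, n) > posterior_mean(std, n))
```

A full analysis would place priors over many configuration parameters at once; the conjugate update above is only the simplest building block of that idea.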

  2. Appearance-based human gesture recognition using multimodal features for human computer interaction

    Science.gov (United States)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a vitally important role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous works have focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework, which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused in a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.
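The two fusion strategies described in this abstract can be sketched on synthetic data. The feature dimensions, modality weights, and the use of scikit-learn's LDA below are illustrative assumptions; the paper's actual face/hand features and condensation-based classifier are not reproduced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Sketch of feature-level vs. decision-level fusion on synthetic data.
rng = np.random.default_rng(0)
n, n_face, n_hand = 300, 6, 4
y = rng.integers(0, 3, n)                          # 3 gesture classes
face = rng.normal(0, 1, (n, n_face)) + y[:, None]          # class-informative
hand = rng.normal(0, 1, (n, n_hand)) + 0.5 * y[:, None]    # weaker signal

# Feature-level (early) fusion: weight and concatenate feature groups,
# then project onto a discriminative space with LDA.
w_face, w_hand = 0.7, 0.3          # hypothetical modality weights
early = np.hstack([w_face * face, w_hand * hand])
lda = LinearDiscriminantAnalysis().fit(early, y)
acc_early = lda.score(early, y)

# Decision-level (late) fusion: classify each modality separately,
# then combine the per-class scores with the modality weights.
p_face = LinearDiscriminantAnalysis().fit(face, y).predict_proba(face)
p_hand = LinearDiscriminantAnalysis().fit(hand, y).predict_proba(hand)
late_pred = np.argmax(w_face * p_face + w_hand * p_hand, axis=1)
acc_late = np.mean(late_pred == y)
print(acc_early > 0.5 and acc_late > 0.5)  # -> True on this synthetic data
```

On real data the two strategies can rank differently, which is exactly the comparison the experiments above report.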

  3. Integrating Human and Computer Intelligence. Technical Report No. 32.

    Science.gov (United States)

    Pea, Roy D.

    This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

  4. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  5. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  6. A Cost Analysis of Day Care Centers in Pennsylvania. Center for Human Service Development Report No. 21.

    Science.gov (United States)

    Hu, Teh-Wei; Wise, Karl

    The purpose of this study is to provide day care center management and government funding agencies with empirical estimates of the costs of day care centers in Pennsylvania. Based on cost data obtained from the Department of Public Welfare and survey information from the Pennsylvania Day Care Study Project, average and marginal costs of day care…

  7. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  8. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    Science.gov (United States)

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%). All learners used tablet computers for medical references, e-Books, and board study. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on resident physicians. Further study is needed to better understand how tablet computers and other mobile devices may assist in medical education and patient care.
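Comparisons like those reported above can be sanity-checked with a 2x2 contingency test. The counts below are reconstructed from the rounded percentages (41% of 66 residents vs. 21% of 76 students for EMR access), and SciPy's chi-square test is my choice of method, not necessarily the paper's, so the p-value only roughly matches the reported p = 0.010.

```python
from scipy.stats import chi2_contingency

# Rough re-check of one reported comparison (EMR use by residents vs.
# students). Counts are reconstructed from rounded percentages, so this
# approximates rather than reproduces the published p-value.
residents_yes = round(0.41 * 66)   # ~27 of 66 residents/fellows
students_yes = round(0.21 * 76)    # ~16 of 76 medical students
table = [[residents_yes, 66 - residents_yes],
         [students_yes, 76 - students_yes]]
chi2, p, dof, expected = chi2_contingency(table)
print(p < 0.05)  # -> True
```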

  9. Constructing a Computer Model of the Human Eye Based on Tissue Slice Images

    OpenAIRE

    Dai, Peishan; Wang, Boliang; Bao, Chunbo; Ju, Ying

    2010-01-01

    Computer simulation of the biomechanical and biological heat transfer in ophthalmology greatly relies on having a reliable computer model of the human eye. This paper proposes a novel method for the construction of a geometric model of the human eye based on tissue slice images. Slice images were obtained from an in vitro Chinese human eye through embryo specimen processing methods. A level set algorithm was used to extract contour points of eye tissues while a principal component analysi...
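The principal component step mentioned in this abstract can be illustrated on a synthetic 2D contour; the ellipse below merely stands in for contour points extracted by the level set algorithm.

```python
import numpy as np

# PCA on a 2D contour: given points extracted from slice images, the
# singular value decomposition of the centered point cloud yields the
# dominant geometric axes. The elliptical "contour" here is synthetic.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 500)
contour = np.column_stack([3.0 * np.cos(t), 1.0 * np.sin(t)])  # a=3, b=1

centered = contour - contour.mean(axis=0)
# Rows of vt are the principal axes; singular values give per-axis variance.
_, s, vt = np.linalg.svd(centered, full_matrices=False)
variances = s**2 / len(contour)
print(variances[0] > variances[1])  # major axis carries more variance -> True
```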

  10. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering-I; verification and validation of software; methods for modeling human/computer performance; man/computer interaction problems in producing procedures-1-2; progress and problems with automation-1-2; experience with electronic presentation of procedures-2; intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids

  11. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    Directory of Open Access Journals (Sweden)

    Rüdiger Dillmann

    2007-01-01

    Full Text Available Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media in Karlsruhe.

  12. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    Science.gov (United States)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.
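The path-planning stage of a navigation pipeline like the one described above can be illustrated with a toy 2D occupancy-grid search. MEPHISTO's distributed, real-time 3D planner is not reproduced here; this breadth-first sketch only shows the basic idea of finding a shortest collision-free route through a world model.

```python
from collections import deque

# Toy occupancy grid: '.' free, '#' obstacle, 'S' start, 'G' goal.
grid = [
    "S..#.",
    ".#.#.",
    ".#...",
    ".#.#.",
    "...#G",
]

def bfs_path(grid):
    """Shortest 4-connected path from 'S' to 'G' via breadth-first search."""
    rows, cols = len(grid), len(grid[0])
    find = lambda ch: next((r, c) for r in range(rows)
                           for c in range(cols) if grid[r][c] == ch)
    start, goal = find("S"), find("G")
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    path, node = [], goal
    while node is not None:          # walk back from goal to start
        path.append(node)
        node = prev[node]
    return path[::-1]

path = bfs_path(grid)
print(len(path))  # -> 9 cells on the shortest route
```

A real planner would run on a fused 3D world model, replan as the model updates, and use a cost-aware search rather than plain BFS.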

  13. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  14. Computational analysis of human miRNAs phylogenetics

    African Journals Online (AJOL)

    User

    2011-05-02

    May 2, 2011 ... [fragment of a sequence-alignment table: hit AL138714 (identity 100.00, E-value 1.94E-28), human DNA sequence from clone RP11-121J7 on chromosome 13q32.1-32.3, containing the 3' end of a novel gene and the 5' end of the GPC5 gene for glypican 5] ... including human, chimpanzee, orangutan, and macaque, and find that miRNAs were ...

  15. Nurse Knowledge Exchange Plus: Human-Centered Implementation for Spread and Sustainability.

    Science.gov (United States)

    Lin, Mike; Heisler, Scott; Fahey, Linda; McGinnis, Juli; Whiffen, Teri L

    2015-07-01

    Kaiser Permanente implemented a new model of nursing communication at shift change, the bedside nursing report known as the Nurse Knowledge Exchange (NKE), in 2004, but noted variations in its spread and sustainability across medical centers five years later. The six core elements of NKEplus were as follows: team rounding in the last hour before shift changes, pre-shift patient assignments that limit the number of departing nurses at shift change, unit support for uninterrupted bedside reporting, standardization of report and safety check formats, and collaboration with patients to update in-room care boards. In January 2011 Kaiser Permanente Southern California (KPSC; Pasadena) began implementing NKEplus in 125 nursing units across 14 hospitals, using human-centered design principles: creating shared understanding of the need for change, minimum specifications, and customization by frontline staff. Champion teams on each nursing unit designed and pilot-tested unit-specific versions of NKEplus for four to eight weeks. Implementation occurred in waves and proceeded from medical/surgical units to specialty units. Traditional performance improvement strategies of accountability, measurement, and management were also applied. By the end of 2012, 100% of the 64 medical/surgical units and 47 (77.0%) of the 61 specialty units in KPSC medical centers had implemented NKEplus, as had all but one of the specialty units by May 2013. The mean KPSC score on the NKEplus nursing behavior bundle improved from 65.9% in 2010 to 71.3% in the first quarter of 2014. The mean KPSC Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) score for nurse communication improved from 73.1% in 2010 to 76.4% in the first quarter of 2014 (p < .001). Human-centered implementation appeared to help spread a new model of nursing handoffs and change the culture of professional nursing practice related to shift change.

  16. Computer-Aided Diagnosis of Breast Cancer: A Multi-Center Demonstrator

    National Research Council Canada - National Science Library

    Floyd, Carey

    2000-01-01

    .... The focus has been to gather data from multiple sites in order to verify whether the artificial neural network computer aid to the diagnosis of breast cancer can be translated between locations...

  17. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  18. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    International Nuclear Information System (INIS)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens; Universidade Federal do Espirito Santo

    2017-01-01

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  19. Cortical Activation during Landmark-Centered vs. Gaze-Centered Memory of Saccade Targets in the Human: An FMRI Study

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2017-06-01

    Full Text Available A remembered saccade target could be encoded in egocentric coordinates such as gaze-centered, or relative to some external allocentric landmark that is independent of the target or gaze (landmark-centered). In comparison to egocentric mechanisms, very little is known about such a landmark-centered representation. Here, we used an event-related fMRI design to identify brain areas supporting these two types of spatial coding (i.e., landmark-centered vs. gaze-centered) for target memory during the Delay phase, where only target location, not saccade direction, was specified. The paradigm included three tasks with identical displays of visual stimuli but different auditory instructions: Landmark Saccade (remember target location relative to a visual landmark, independent of gaze), Control Saccade (remember original target location relative to gaze fixation, independent of the landmark), and a non-spatial control, Color Report (report target color). During the Delay phase, the Control and Landmark Saccade tasks activated overlapping areas in posterior parietal cortex (PPC) and frontal cortex as compared to the color control, but with higher activation in PPC for target coding in the Control Saccade task and higher activation in temporal and occipital cortex for target coding in the Landmark Saccade task. Gaze-centered directional selectivity was observed in superior occipital gyrus and inferior occipital gyrus, whereas landmark-centered directional selectivity was observed in precuneus and midposterior intraparietal sulcus. During the Response phase, after saccade direction was specified, the parietofrontal network in the left hemisphere showed higher activation for rightward than leftward saccades. Our results suggest that cortical activation for coding saccade target direction relative to a visual landmark differs from gaze-centered directional selectivity for target memory, from the mechanisms for other types of allocentric tasks, and from the directionally

  20. Expression of human ferredoxin and assembly of the [2Fe-2S] center in Escherichia coli

    International Nuclear Information System (INIS)

    Coghlan, V.M.; Vickery, L.E.

    1989-01-01

    A cDNA fragment encoding human ferredoxin, a mitochondrial [2Fe-2S] protein, was introduced into Escherichia coli by using an expression vector based on the approach of Nagai and Thogersen. Expression was under control of the λPL promoter and resulted in production of ferredoxin as a cleavable fusion protein with an amino-terminal fragment derived from the bacteriophage λcII protein. The fusion protein was isolated from the soluble fraction of induced cells and was specifically cleaved to yield mature recombinant ferredoxin. The recombinant protein was shown to be identical in size to ferredoxin isolated from human placenta (13,546 Da) by NaDodSO4/PAGE and partial amino acid sequencing. E. coli cells expressing human ferredoxin were brown in color, and absorbance and electron paramagnetic resonance spectra of the purified recombinant protein established that the [2Fe-2S] center was assembled and incorporated into ferredoxin in vivo. Recombinant ferredoxin was active in steroid hydroxylations when reconstituted with cytochromes P-450scc and P-450 11β and exhibited rates comparable to those observed for ferredoxin isolated from human placenta. This expression system should be useful in the production of native and structurally altered forms of human ferredoxin for studies of ferredoxin structure and function

  1. Establishing and evaluating bar-code technology in blood sampling system: a model based on the human-centered design method.

    Science.gov (United States)

    Chou, Shin-Shang; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Tseng, Kuan-Jui; Kuo, Shu-Chen

    2012-01-01

    This study used a human-centered design method to develop bar-code technology for the blood sampling process. Information gathered through multilevel analysis informed the construction of the bar-code technology to verify the patient's identification, simplify the work process, and reduce medical error rates. A Technology Acceptance Model questionnaire was developed to assess the effectiveness of the system, and data on patient identification and sample errors were collected daily. The average score of the 8-item users' perceived ease of use scale was 25.21 (3.72), of the 9-item users' perceived usefulness scale 28.53 (5.00), and of the 14-item task-technology fit scale 52.24 (7.09). The rates of patient identification errors and samples with cancelled orders dropped to zero; however, a new type of error, involving the position of the bar-code stickers on the sample tubes, emerged after the new system was deployed. Overall, more than half of the nurses (62.5%) were willing to use the new system.
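Internal-consistency statistics like the Cronbach's alpha reported in questionnaire studies such as this one are simple to compute; the response matrix below is synthetic, invented purely to illustrate the formula.

```python
import numpy as np

# Cronbach's alpha for a k-item questionnaire:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# Rows are respondents, columns are items; the data below are invented.

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(3, 1, (100, 1))                 # one underlying trait
responses = latent + rng.normal(0, 0.5, (100, 8))   # 8 noisy items
alpha = cronbach_alpha(responses)
print(0.7 < alpha < 1.0)  # acceptable internal consistency -> True
```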

  2. Human-centered design of a cyber-physical system for advanced response to Ebola (CARE).

    Science.gov (United States)

    Dimitrov, Velin; Jagtap, Vinayak; Skorinko, Jeanine; Chernova, Sonia; Gennert, Michael; Padir, Taşkin

    2015-01-01

    We describe the process of designing a safe, reliable, and intuitive emergency treatment unit to give medical staff a higher degree of safety and situational awareness, leading to an increased level of patient care during an epidemic outbreak in an unprepared, underdeveloped, or disaster-stricken area. We start with a human-centered design process to understand the design challenge of working with Ebola treatment units in Western Africa during the latest Ebola outbreak, and show preliminary work toward cyber-physical technologies that could help during the next outbreak.

  3. Environmental Research Division annual report: Center for Human Radiobiology, July 1982-June 1983

    International Nuclear Information System (INIS)

    1984-03-01

    This is the fourteenth Annual Report of the Center for Human Radiobiology. New cases of bone cancer and carcinoma of head sinuses are occurring at a rate of about one per year in patients who acquired radium burdens 50 to 60 years ago. Several papers deal with dosimetry of alpha-emitting radionuclides in man, in animals, or in the environment. The report concludes with an appendix containing data on the exposure of 2312 persons whose radium content has been determined and an appendix listing the classical radium-related malignancies (osteosarcomas and carcinomas of the paranasal sinuses and mastoid)

  4. Critical remarks on Simon Caney's humanity-centered approach to global justice

    Directory of Open Access Journals (Sweden)

    Julian Culp

    2016-09-01

    The practice-independent approach to theorizing justice (PIA) holds that the social practices to which a particular conception of justice is meant to apply are of no importance for the justification of such a conception. In this paper I argue that this approach to theorizing justice is incompatible with the method of reflective equilibrium (MRE), because the MRE is antithetical to a clean separation between issues of justification and application. In particular, I maintain that this incompatibility renders Simon Caney's cosmopolitan theory of global justice inconsistent, because Caney claims to endorse both a humanity-centered PIA and the MRE.

  5. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  6. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important area of ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-computer interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  7. Measurements and predictions of the air distribution systems in high compute density (Internet) data centers

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jinkyun [HIMEC (Hanil Mechanical Electrical Consultants) Ltd., Seoul 150-103 (Korea); Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea); Lim, Taesub; Kim, Byungseon Sean [Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea)

    2009-10-15

    When equipment power density increases, a critical goal of a data center cooling system is to separate the equipment exhaust air from the equipment intake air in order to prevent the IT server from overheating. Cooling systems for data centers are primarily differentiated according to the way they distribute air. The six combinations of flooded and locally ducted air distribution make up the vast majority of all installations, except fully ducted air distribution methods. Once the air distribution system (ADS) is selected, there are other elements that must be integrated into the system design. In this research, the design parameters and IT environmental aspects of the cooling system were studied with a high heat density data center. CFD simulation analysis was carried out in order to compare the heat removal efficiencies of various air distribution systems. The IT environment of an actual operating data center is measured to validate a model for predicting the effect of different air distribution systems. A method for planning and design of the appropriate air distribution system is described. IT professionals versed in precision air distribution mechanisms, components, and configurations can work more effectively with mechanical engineers to ensure the specification and design of optimized cooling solutions. (author)
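The cooling comparison in this record ultimately rests on a sensible heat balance between the rack load and the supply air. As a back-of-envelope sketch of that sizing constraint (the constants are standard air properties and the rack figures are invented, not values from the study):

```python
# Back-of-envelope airflow sizing for a data center rack (illustrative only;
# constants are standard air properties, not values taken from the paper).
RHO_AIR = 1.2    # kg/m^3, air density at roughly room temperature
CP_AIR = 1005.0  # J/(kg K), specific heat of air at constant pressure

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to remove heat_load_w with a supply-to-return
    temperature rise of delta_t_k, from the sensible heat balance
    Q = rho * cp * V * dT."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# A hypothetical 10 kW high-density rack with an 11 K air temperature rise:
flow = required_airflow_m3s(10_000, 11.0)
print(f"{flow:.2f} m^3/s")  # → 0.75 m^3/s
```

Doubling the allowed temperature rise halves the required airflow, which is why separating hot exhaust from cold intake air (and so permitting a larger ΔT) matters as power density grows.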

  8. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    Science.gov (United States)

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, estimate the prevalence, and describe risk factors associated with Computer Vision Syndrome among operators of two call centers in São Paulo (n = 476). The methods included a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews, and questionnaires. The case definition was the presence of one or more specific ocular symptoms reported as always, often, or sometimes. Multiple logistic regression models were created using the stepwise forward likelihood method, retaining variables with significance levels below 5% (p < 0.05); the most frequently reported symptom was blurred vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. The associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7), and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). Organizational and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.
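The odds ratios above come from multiple (adjusted) logistic regression, but the crude version of such an association can be computed directly from a 2×2 exposure-outcome table. A minimal sketch with invented counts, chosen only so the crude OR lands near the reported 2.6 for female sex:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (NOT the study's data): female cases/non-cases vs male.
or_, lo, hi = odds_ratio_ci(120, 80, 100, 176)  # OR = 2.64 for these counts
```

The adjusted ORs in the abstract would additionally control for the other retained covariates, so they generally differ from the crude value.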

  9. A Human-Centered Smart Home System with Wearable-Sensor Behavior Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Jianting; Liu, Ting; Shen, Chao; Wu, Hongyu; Liu, Wenyi; Su, Man; Chen, Siyun; Jia, Zhanpei

    2016-11-17

    The smart home has recently attracted much research interest owing to its potential to improve the quality of human life. Obtaining the user's demand is the most important and challenging task for optimal appliance scheduling in a smart home, since demand is highly related to the user's unpredictable behavior. In this paper, a human-centered smart home system is proposed to identify user behavior, predict user demand, and schedule the household appliances. First, the sensor data from the user's wearable devices are monitored to profile the user's full-day behavior. Then, an appliance-demand matrix, extracted from the history of appliance load data and user behavior, is constructed to predict the user's demands on the home environment. Two simulations are designed to demonstrate user behavior identification, appliance-demand matrix construction, and generation of an optimal appliance scheduling strategy.
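The appliance-demand matrix described here can be sketched as a table of empirical on-probabilities indexed by (behavior, appliance) pairs, estimated from the joint history of behavior labels and appliance states. The behavior labels, appliance names, and observations below are invented for illustration; the paper's actual construction may differ:

```python
from collections import defaultdict

def build_demand_matrix(history):
    """Estimate P(appliance is on | behavior) from joint observations.
    history: iterable of (behavior, appliance, is_on) tuples."""
    on = defaultdict(int)
    total = defaultdict(int)
    for behavior, appliance, is_on in history:
        total[(behavior, appliance)] += 1
        on[(behavior, appliance)] += int(is_on)
    return {key: on[key] / total[key] for key in total}

# Hypothetical wearable-derived behavior log joined with appliance states:
history = [
    ("cooking", "oven", True), ("cooking", "oven", True),
    ("cooking", "tv", False), ("sleeping", "oven", False),
    ("sleeping", "heater", True), ("sleeping", "heater", True),
]
matrix = build_demand_matrix(history)
print(matrix[("cooking", "oven")])  # → 1.0
```

A scheduler could then pre-start appliances whose on-probability given the predicted next behavior exceeds some threshold.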

  10. Science, humanism, judgement, ethics: person-centered medicine as an emergent model of modern clinical practice.

    Science.gov (United States)

    Miles, Andrew

    2013-01-01

    The Medical University of Plovdiv (MUP) has as its motto 'Committed to humanity'. But what does humanity in modern medicine mean? Is it possible to practise a form of medicine that is without humanity? In the current article, it is argued that modern medicine is increasingly being practised in a de-personalised fashion, where the patient is understood not as a unique human individual, a person, but rather as a subject or an object, more in the manner of a complex biological machine. Medicine, it is contended, has become distracted from its duty to care, comfort and console as well as to ameliorate, attenuate and cure, and the rapid development of medicine's scientific knowledge is, paradoxically, principally causative. Signal occurrences in the 'patient as a person' movement are reviewed, together with the emergence of the evidence-based medicine (EBM) and patient-centered care (PCC) movements. The characteristics of a model of medicine evolving in response to medicine's current deficiencies, person-centered healthcare (PCH), are noted and described. In seeking to apply science with humanism, via clinical judgement, within an ethical framework, it is contended that PCH will prove far more responsive to the needs of the individual patient and his/her personal circumstances than current models of practice, so that neither a reductive anatomico-pathological, disease-centric model of illness (EBM) nor an aggressive patient-directed, consumerist form of care (PCC) is allowed continued dominance within modern healthcare systems. In conclusion, it is argued that PCH will enable affordable advances in biomedicine and technology to be delivered to patients within a humanistic framework of clinical practice that recognises the patient as a person and takes full account of his/her stories, values, preferences, goals, aspirations, fears, worries, hopes and cultural context, and which responds to his/her psychological, emotional, spiritual and social necessities.

  11. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBPs), soft controls (SCs), and integrated information systems are being adopted in main control rooms (MCRs) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have drawn several important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. First, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially CBPs and SCs. In the case of the computer-based procedure, as opposed to the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controls, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Second, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs.

  12. Computational Modeling of Human Multiple-Task Performance

    National Research Council Canada - National Science Library

    Kieras, David E; Meyer, David

    2005-01-01

    This is the final report for a project that was a continuation of an earlier, long-term project on the development and validation of the EPIC cognitive architecture for modeling human cognition and performance...

  13. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health

    Science.gov (United States)

    Tromberg, Bruce J.; Anderson, R. Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W.; Parrish, John A.; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  14. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01


  15. Individual Difference Effects in Human-Computer Interaction

    Science.gov (United States)

    1991-10-01

    service staff. The subjects who participated in the experiment constituted the organization's decision network. The subjects were presented the same... at the center of an information-collection network. From the center of this network, the person can access and communicate with a variety of... with reverse video option) in slot 3; a software-controlled switch (i.e., the Videx "Softswitch") for switching between 40- and 80-column display

  16. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo; Oliveira, Paulo Marcio Campos de; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila, E-mail: pridili@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Silva, Teogenes Augusto da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-01-15

    Objective: to evaluate the level of ambient radiation in a PET/CT center. Materials and methods: previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed at several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: at none of the selected measurement points did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year), as recommended by the Brazilian regulations. Conclusion: the present study demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. (author)
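The compliance check described above amounts to extrapolating a 32-day TLD reading to an annual dose and comparing it against the cited limits. A minimal sketch of that arithmetic (the limits are the ones quoted in the abstract; the TLD reading itself is hypothetical):

```python
# Annual limits quoted in the abstract (Brazilian regulations):
CONTROLLED_LIMIT_MSV_Y = 5.0   # mSv/year, controlled area
FREE_AREA_LIMIT_MSV_Y = 0.5    # mSv/year, free area

def annualize(dose_msv: float, days: int) -> float:
    """Linearly extrapolate a dose accumulated over `days` to a full year."""
    return dose_msv * 365.0 / days

# Hypothetical TLD reading accumulated over the 32-day campaign:
reading = 0.03  # mSv (invented, not a value from the study)
annual = annualize(reading, 32)
print(f"{annual:.3f} mSv/year, free-area compliant: {annual < FREE_AREA_LIMIT_MSV_Y}")
```

Linear extrapolation assumes a steady workload over the year; a center with seasonal patient throughput would need a workload-weighted correction.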

  17. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  18. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins.

    Science.gov (United States)

    Casanova, J; Wang, Z-Y; Plenio, M B

    2016-09-23

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  19. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Science.gov (United States)

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed at several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results At none of the selected measurement points did the values exceed the radiation dose threshold for a controlled area (5 mSv/year) or a free area (0.5 mSv/year), as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  20. Can human experts predict solubility better than computers?

    Science.gov (United States)

    Boobier, Samuel; Osbourn, Anne; Mitchell, John B O

    2017-12-13

    In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R² of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R² of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
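The median-based consensus predictor the authors describe is straightforward to reproduce. A minimal sketch with invented log S values (these are not the study's data; the pattern merely illustrates why the per-compound median tends to beat typical individual predictors):

```python
import math
from statistics import median

def rmse(pred, true):
    """Root-mean-square error between predictions and reference values."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

def consensus(predictions):
    """Combine several predictors by taking the per-compound median."""
    return [median(col) for col in zip(*predictions)]

# Three hypothetical predictors of log S for four compounds:
true_logS = [-2.0, -3.5, -1.0, -4.2]
preds = [
    [-2.2, -3.0, -1.5, -4.0],
    [-1.8, -3.6, -0.8, -4.8],
    [-2.5, -3.4, -1.1, -3.9],
]
cons = consensus(preds)  # [-2.2, -3.4, -1.1, -4.0]
```

Because the median discards each compound's outlying predictions, its RMSE here is lower than that of every individual predictor, mirroring the "wisdom of crowds" effect the paper reports.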

  1. Experimental evaluation of multimodal human computer interface for tactical audio applications

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.; Jovanov, E.; Oy, S.

    2002-01-01

    Mission critical and information overwhelming applications require careful design of the human computer interface. Typical applications include night vision or low visibility mission navigation, guidance through a hostile territory, and flight navigation and orientation. Additional channels of

  2. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  3. Eyewear Computing – Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

    OpenAIRE

    Bulling, Andreas; Cakmakci, Ozan; Kunze, Kai; Rehg, James M.

    2016-01-01

    The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays. The seminar welcomed 30 academic and industry researchers from Europe, the US, and Asia with a diverse background, including wearable and ubiquitous computing, computer vision, developmental psychology, optics, and human-computer interaction. In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to...

  4. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  5. McMaster University: College and University Computing Environment.

    Science.gov (United States)

    CAUSE/EFFECT, 1988

    1988-01-01

    The computing and information services (CIS) organization includes administrative computing, academic computing, and networking and has three divisions: computing services, development services, and information services. Other computing activities include Health Sciences, Humanities Computing Center, and Department of Computer Science and Systems.…

  6. A canonical perturbation method for computing the guiding-center motion in magnetized axisymmetric plasma columns

    International Nuclear Information System (INIS)

    Gratreau, P.

    1987-01-01

    The motion of charged particles in a magnetized plasma column, such as that of a magnetic mirror trap or a tokamak, is determined in the framework of canonical perturbation theory through a method of variation of constants that preserves energy conservation and symmetry invariance. The choice of a frame of coordinates close to that of the magnetic coordinates allows a relatively precise determination of the guiding-center motion with a low-order approximation in the adiabatic parameter. A Hamiltonian formulation of the motion equations is obtained.
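For reference, the lowest-order guiding-center Hamiltonian that such canonical perturbation methods expand around has a standard textbook form (this expression is generic, not taken from the paper):

```latex
H(\mathbf{R}, p_\parallel, \mu) \;=\; \frac{p_\parallel^{2}}{2m} \;+\; \mu\, B(\mathbf{R}) \;+\; q\,\Phi(\mathbf{R}),
\qquad \mu = \frac{m v_\perp^{2}}{2B},
```

where R is the guiding-center position, p∥ the momentum along the field, μ the adiabatically invariant magnetic moment, and Φ the electrostatic potential; the perturbation expansion proceeds in the adiabatic parameter, i.e. the ratio of the gyroradius to the field-scale length.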

  7. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  8. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  9. Human-centered automation and AI - Ideas, insights, and issues from the Intelligent Cockpit Aids research effort

    Science.gov (United States)

    Abbott, Kathy H.; Schutte, Paul C.

    1989-01-01

    A development status evaluation is presented for the NASA-Langley Intelligent Cockpit Aids research program, which encompasses AI, human/machine interfaces, and conventional automation. Attention is being given to decision-aiding concepts for human-centered automation, with emphasis on inflight subsystem fault management, inflight mission replanning, and communications management. The cockpit envisioned is for advanced commercial transport aircraft.

  10. Recent Advances in Computational Mechanics of the Human Knee Joint

    Science.gov (United States)

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  11. Computational simulation of chromosome breaks in human liver

    International Nuclear Information System (INIS)

    Yang Jianshe; Li Wenjian; Jin Xiaodong

    2006-01-01

    An easy method was established for computing chromosome breaks in cells exposed to heavy charged particles. The chromosome break value for cells irradiated with ¹²C⁶⁺ ions was theoretically calculated and tested against experimental data on chromosome breaks obtained using a premature chromosome condensation technique. The theoretical chromosome break value agreed well with the experimental data. The higher relative biological effectiveness of the heavy ions was closely correlated with their physical characteristics. In addition, the chromosome break value can be predicted offline. (authors)

  12. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  13. Distinguishing humans from computers in the game of go: A complex network approach

    Science.gov (United States)

    Coquidé, C.; Georgeot, B.; Giraud, O.

    2017-08-01

    We compare complex networks built from the game of go and obtained from databases of human-played games with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.
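One simple statistic in the spirit of this approach is the variability of moves played out of a given position: a deterministic engine always answers the same position the same way, while human corpora (and stochastic engines) branch. A toy sketch of such an estimator over a move-transition network (move labels invented; this is not the authors' actual estimator):

```python
import math
from collections import Counter

def transition_entropy(games):
    """Mean Shannon entropy (bits) of the next-move distribution at each
    observed position, over games given as sequences of move labels."""
    edges, outs = Counter(), Counter()
    for game in games:
        for a, b in zip(game, game[1:]):  # consecutive-move transitions
            edges[(a, b)] += 1
            outs[a] += 1
    total = 0.0
    for node, n in outs.items():
        probs = [count / n for (a, _), count in edges.items() if a == node]
        total -= sum(p * math.log2(p) for p in probs)
    return total / len(outs)

# A deterministic engine repeats itself; a human corpus branches:
bot_games = [["d4", "q16", "d16"], ["d4", "q16", "d16"]]
human_games = [["d4", "q16", "d16"], ["d4", "q4", "c16"]]
# transition_entropy(bot_games) == 0.0; the human corpus is strictly higher.
```

With enough games, comparing such distributional statistics between corpora is exactly the kind of Turing-like discrimination test the abstract describes, though the paper's networks are built over board patterns rather than raw move labels.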

  14. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resulting computer program was successfully utilized as a vehicle to simulate earlier findings on how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to the environment.

  15. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    International Nuclear Information System (INIS)

    Bhattacharjee, Amitava

    2016-01-01

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  16. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are names and addresses of program authors and contributors in order that users may have continued support of their programs. The BCTIC library list is attached

  17. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  18. Computer modelling of HT gas metabolism in humans

    International Nuclear Information System (INIS)

    Peterman, B.F.

    1982-01-01

    A mathematical model was developed to simulate the metabolism of HT gas in humans. The rate constants of the model were estimated by fitting the calculated curves to the experimental data by Pinson and Langham in 1957. The calculations suggest that the oxidation of HT gas (which probably occurs as a result of the enzymatic action of hydrogenase present in bacteria of human gut) occurs at a relatively low rate with a half-time of 10-12 hours. The inclusion of the dose due to the production of the HT oxidation product (HTO) in the soft tissues lowers the value of derived air concentration by about 50%. Furthermore the relationship between the concentration of HTO in urine and the dose to the lung from HT in the air in lungs is linear after short HT exposures, and hence HTO concentrations in urine can be used to estimate the upper limits on the lung dose from HT exposures. (author)
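    The first-order kinetics described above (oxidation of HT to HTO with a half-time of 10-12 hours) can be sketched as a minimal two-compartment model. This is an illustrative sketch, not the author's actual model; the function name, the single rate constant, and the Euler integration scheme are all assumptions made for the example.

```python
import math

def simulate_ht_oxidation(t_half_h=11.0, hours=48.0, dt=0.01):
    """Euler integration of a minimal first-order compartment model:
    inhaled HT is oxidized to HTO with a half-time of ~10-12 hours."""
    k = math.log(2) / t_half_h        # first-order rate constant (1/h)
    ht, hto = 1.0, 0.0                # normalized initial HT burden
    steps = int(round(hours / dt))
    for _ in range(steps):
        d = k * ht * dt               # amount oxidized during this step
        ht -= d
        hto += d
    return ht, hto
```

    After one half-time the HT compartment should hold about half its initial burden, with the remainder transferred to HTO; a full model would add further compartments for urinary excretion and soft-tissue dose.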

  19. Measuring Human Performance within Computer Security Incident Response Teams

    Energy Technology Data Exchange (ETDEWEB)

    McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

  20. Computer simulation of mucosal waves on vibrating human vocal folds

    Czech Academy of Sciences Publication Activity Database

    Vampola, T.; Horáček, Jaromír; Klepáček, I.

    2016-01-01

    Vol. 36, No. 3 (2016), pp. 451-465 ISSN 0208-5216 R&D Projects: GA ČR GA16-01246S; GA ČR(CZ) GAP101/12/1306 Institutional support: RVO:61388998 Keywords: biomechanics of human voice * 3D FE model of human larynx * finite element method * proper orthogonal decomposition analysis Subject RIV: BI - Acoustics Impact factor: 1.031, year: 2016 http://ac.els-cdn.com/S0208521616300298/1-s2.0-S0208521616300298-main.pdf

  1. Transnational HCI: Humans, Computers and Interactions in Global Contexts

    DEFF Research Database (Denmark)

    Vertesi, Janet; Lindtner, Silvia; Shklovski, Irina

    2011-01-01

    … but as evolving in relation to global processes, boundary crossings, frictions and hybrid practices. In doing so, we expand upon existing research in HCI to consider the effects, implications for individuals and communities, and design opportunities in times of increased transnational interactions. We hope … to broaden the conversation around the impact of technology in global processes by bringing together scholars from HCI and from related humanities, media arts and social sciences disciplines.

  2. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies such as adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine the effect of operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
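    The discrete-event approach described above can be illustrated with a toy single-operator queue: alarms arrive at given times, the operator handles them one at a time, and completion times reveal workload backlog. This sketch is not the INL tool; the function name, FIFO discipline, and fixed service time are simplifying assumptions for illustration.

```python
import heapq

def simulate_alarm_handling(arrival_times, service_time):
    """Toy discrete-event model: one operator serves alarms in time order.
    Returns the completion time of each alarm."""
    events = list(arrival_times)
    heapq.heapify(events)            # process alarms in arrival order
    clock = 0.0                      # operator's current time
    completions = []
    while events:
        t = heapq.heappop(events)
        start = max(clock, t)        # wait if the operator is still busy
        clock = start + service_time # handle the alarm
        completions.append(clock)
    return completions
```

    With arrivals at t = 0, 1 and 10 and a 3-unit handling time, the second alarm queues behind the first (finishing at 6), while the third is handled immediately on arrival; a real model would draw service times from distributions and track operator workload measures.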

  3. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  4. The Socioemotional Effects of a Computer-Simulated Animal on Children's Empathy and Humane Attitudes

    Science.gov (United States)

    Tsai, Yueh-Feng Lily; Kaufman, David M.

    2009-01-01

    This study investigated the potential of using a computer-simulated animal in a handheld virtual pet videogame to improve children's empathy and humane attitudes. Also investigated was whether sex differences existed in children's development of empathy and humane attitudes resulting from play, as well as their feelings for a virtual pet. The…

  5. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. The text reading task, in the form of English text, was presented on the computer screen to the participating subjects and their performance, measured in terms of number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters to maximize operators' efficiency in performing the readability task. Two levels of each parameter have been considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed to investigate the operators' performance/efficiency. Results showed that with Times Roman font, black text on white background, a 40 degree viewing angle and a 60 cm viewing distance, the subjects were quite comfortable and efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% towards the laid-down objective, followed by font type at 18.17%, viewing distance at 7.04% and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
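    The two Taguchi quantities used above can be sketched in a few lines: the larger-the-better S/N ratio applied to words-per-minute readings, and the percentage contribution of each factor from its ANOVA sum of squares. The function names and the example numbers are illustrative, not taken from the study.

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better S/N ratio in dB:
    -10 * log10( mean(1 / y_i^2) ), maximized by large, consistent y."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

def percent_contribution(ss_by_factor):
    """ANOVA percentage contribution: each factor's share of the
    total sum of squares, as reported (e.g. color 76.18%, font 18.17%)."""
    total = sum(ss_by_factor.values())
    return {f: 100.0 * ss for f, ss in
            ((f, ss / total) for f, ss in ss_by_factor.items())}
```

    The level of each factor with the highest mean S/N ratio across the orthogonal-array runs is selected as optimal, which is how the black-on-white, Times Roman combination would emerge.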

  6. Computational modeling of human oral bioavailability: what will be next?

    Science.gov (United States)

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches, together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability, has opened new avenues for developing promising tools for oral bioavailability prediction.

  7. Computations on the massively parallel processor at the Goddard Space Flight Center

    Science.gov (United States)

    Strong, James P.

    1991-01-01

    Described are four significant algorithms implemented on the massively parallel processor (MPP) at the Goddard Space Flight Center. Two are in the area of image analysis. Of the other two, one is a mathematical simulation experiment and the other deals with the efficient transfer of data between distantly separated processors in the MPP array. The first algorithm presented is the automatic determination of elevations from stereo pairs. The second algorithm solves mathematical logistic equations capable of producing both ordered and chaotic (or random) solutions. This work can potentially lead to the simulation of artificial life processes. The third algorithm is the automatic segmentation of images into reasonable regions based on some similarity criterion, while the fourth is an implementation of a bitonic sort of data which significantly overcomes the nearest neighbor interconnection constraints on the MPP for transferring data between distant processors.
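    The logistic equations mentioned in the second algorithm can be illustrated with the scalar logistic map, which produces ordered solutions (convergence to a fixed point) for small growth parameters and chaotic ones near r = 4. This is a generic illustration of the equation family, not the MPP implementation.

```python
def logistic_map(r, x0=0.2, n=500):
    """Iterate x -> r * x * (1 - x) for n steps and return the final value.
    For r = 2.5 the orbit settles on the fixed point 1 - 1/r = 0.6;
    for r near 4 it wanders chaotically over (0, 1)."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x
```

    On a massively parallel machine each processor can iterate the map with a slightly different r or x0, which is what makes the equation a convenient testbed for ordered-versus-chaotic behavior at scale.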

  8. Computer simulations of low energy displacement cascades in a face centered cubic lattice

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Bourquin, R.D.

    1976-09-01

    Computer simulations of atomic motion in a copper lattice following the production of primary knock-on atoms (PKAs) with energies from 25 to 200 eV are discussed. In this study, a mixed Moliere-Englert pair potential is used to model the copper lattice. The computer code COMENT, which employs the dynamical method, is used to analyze the motion of up to 6000 atoms per time step during cascade evolution. The atoms are specified as initially at rest on the sites of an ideal lattice. A matrix of 12 PKA directions and 6 PKA energies is investigated. Displacement thresholds in the [110] and [100] directions are calculated to be approximately 17 and 20 eV, respectively. A table showing the stability of isolated Frenkel pairs with different vacancy and interstitial orientations and separations is presented. The numbers of Frenkel pairs and atomic replacements are tabulated as a function of PKA direction for each energy. For PKA energies of 25, 50, 75, 100, 150, and 200 eV, the average numbers of Frenkel pairs per PKA are 0.4, 0.6, 1.0, 1.2, 1.4, and 2.2, and the average numbers of replacements per PKA are 2.4, 4.0, 3.3, 4.9, 9.3, and 15.8, respectively.

  9. Computation of Electromagnetic Fields Scattered From Dielectric Objects of Uncertain Shapes Using MLMC Center for Uncertainty

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    Simulators capable of computing scattered fields from objects of uncertain shapes are highly useful in electromagnetics and photonics, where device designs are typically subject to fabrication tolerances. Knowledge of statistical variations in scattered fields is useful in ensuring error-free functioning of devices. Oftentimes such simulators use a Monte Carlo (MC) scheme to sample the random domain, where the variables parameterize the uncertainties in the geometry. At each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver is executed to compute the scattered fields. However, to obtain accurate statistics of the scattered fields, the number of MC samples has to be large. This significantly increases the total execution time. In this work, to address this challenge, the Multilevel MC (MLMC) scheme is used together with a (deterministic) surface integral equation solver. The MLMC achieves a higher efficiency by “balancing” the statistical errors due to sampling of the random domain and the numerical errors due to discretization of the geometry at each of these samples. Error balancing results in a smaller number of samples requiring coarser discretizations. Consequently, total execution time is significantly shortened.
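    The MLMC idea described above rests on the telescoping identity E[P_L] = E[P_0] + Σ_l E[P_l − P_{l−1}], where coarse levels absorb most of the statistical work and only a few samples are needed on the expensive fine discretizations. The sketch below shows the estimator in that generic form; the `payoff` callback (a level-l solver evaluated on a shared random input) and the sample counts are illustrative assumptions, not the surface-integral-equation solver of the abstract.

```python
import random

def mlmc_estimate(payoff, levels, n_per_level, seed=0):
    """Sketch of a multilevel Monte Carlo estimator.
    payoff(l, omega): approximation of the quantity of interest at
    discretization level l for random input omega. The same omega is
    used for P_l and P_{l-1} so the correction terms have low variance."""
    rng = random.Random(seed)
    est = 0.0
    for l, n in zip(range(levels + 1), n_per_level):
        acc = 0.0
        for _ in range(n):
            omega = rng.random()               # shared random geometry input
            if l == 0:
                acc += payoff(0, omega)        # coarse-level estimate
            else:
                acc += payoff(l, omega) - payoff(l - 1, omega)  # correction
        est += acc / n
    return est
```

    The "balancing" mentioned in the abstract corresponds to choosing `n_per_level` so that the sampling error of each correction term matches the discretization error it removes, which is what shortens the total execution time.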

  10. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  11. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research between medical/psychological evidence on human abilities and informatics needs to update current models in computer science to support alternative methods for computation and communication. In [10] we have already proposed a hypothesis introducing the concept of the human information model (HIM) as a cooperative system. Here we continue the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle relative to present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) model as a basic network self-developmental paradigm. The main inspiration of our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence and Sheldrake's theory of "Nature as Alive" [22].

  12. A human-centered framework for innovation in conservation incentive programs.

    Science.gov (United States)

    Sorice, Michael G; Donlan, C Josh

    2015-12-01

    The promise of environmental conservation incentive programs that provide direct payments in exchange for conservation outcomes is that they enhance the value of engaging in stewardship behaviors. An insidious but important concern is that a narrow focus on optimizing payment levels can ultimately suppress program participation and subvert participants' internal motivation to engage in long-term conservation behaviors. Increasing participation and engendering stewardship can be achieved by recognizing that participation is not simply a function of the payment; it is a function of the overall structure and administration of the program. Key to creating innovative and more sustainable programs is fitting them within the existing needs and values of target participants. By focusing on empathy for participants, co-designing program approaches, and learning from the rapid prototyping of program concepts, a human-centered approach to conservation incentive program design enhances the propensity for discovery of novel and innovative solutions to pressing conservation issues.

  13. Human trafficking for organ removal in India: a victim-centered, evidence-based report.

    Science.gov (United States)

    Budiani-Saberi, Debra A; Raja, Kallakurichi Rajendiran; Findley, Katie C; Kerketta, Ponsian; Anand, Vijay

    2014-02-27

    Enhancements in the national transplant law to prohibit commercial transplants in India have curbed the trade. Yet, the human rights abuse of human trafficking for organ removal (HTOR) continues in various transplant centers throughout India. Beginning in September 2010 until May 2012, in-depth interviews were conducted with 103 victims of HTOR in India in which victims described their experiences of a commercial kidney removal in compelling detail. Victims were located in Tamil Nadu, and reference is made to the broader study that included 50 additional victims in small towns and villages in West Bengal and Karnataka. Fourteen cases (14%) in Tamil Nadu and an additional 20 cases (40%) from West Bengal and Karnataka occurred between 2009 to May 2012. The cases in Tamil Nadu ranged in age from 19 to 55 years, with an average age of 33 years in Erode and 36 years in Chennai. Fifty-seven percent of the victims in Erode are female, and 87% of the victims in Chennai are female. Twelve percent of the individuals were widowed or abandoned, 79% were married, and 91% were parents with an average of two kids. Of those interviewed, 28% had no formal education, 19% had some primary schooling, 22% had some secondary schooling, and no individuals reported schooling above high school. All victims interviewed lived in abject poverty with monthly income levels well below the national average. The majority of victims reported long lasting health, economic, social, and psychological consequences. No matter the reason expressed for an organ sale, all victims reported that they would not have agreed to the organ removal if their economic circumstances were not so dire. One hundred percent of the victims interviewed expressed that they need assistance to cope with these consequences. Human trafficking for an organ removal continues in private transplant centers throughout India, service to foreign patients is ongoing, and victims' consequences are long lasting. A rights-based response…

  14. Building communication strategy on health prevention through the human-centered design

    Directory of Open Access Journals (Sweden)

    Karine de Mello Freire

    2016-03-01

    A latent need has been identified for developing efficient communication strategies for the prevention of diseases, and for design as a potential agent to create communication artifacts that are able to promote self-care. In order to analyze a design process that develops this kind of artifact, action research was carried out in the IAPI Health Center in Porto Alegre. The action's goal was to design a strategy to promote self-care to prevent cervical cancer. The process was conducted from the human-centered design approach (HCD), which seeks to create solutions desirable for people and feasible for organizations through three main phases: (a) Hear, in which inspirations originate from stories collected from people; (b) Create, which aims to translate this knowledge into prototypes; and (c) Deliver, where the prototypes are tested and developed with users. Communication strategies were supported by design studies of visual-verbal rhetoric. As a result, this design approach has shown itself adequate for creating communication strategies targeted at self-care behaviors, aiming to empower users to change their behavior.

  15. User-Centered Design for Developing Interventions to Improve Clinician Recommendation of Human Papillomavirus Vaccination.

    Science.gov (United States)

    Henninger, Michelle L; McMullen, Carmit K; Firemark, Alison J; Naleway, Allison L; Henrikson, Nora B; Turcotte, Joseph A

    2017-01-01

    Human papillomavirus (HPV) is the most common sexually transmitted infection in the US and is associated with multiple types of cancer. Although effective HPV vaccines have been available since 2006, coverage rates in the US remain much lower than with other adolescent vaccinations. Prior research has shown that a strong recommendation from a clinician is a critical determinant in HPV vaccine uptake and coverage. However, few published studies to date have specifically addressed the issue of helping clinicians communicate more effectively with their patients about the HPV vaccine. To develop one or more novel interventions for helping clinicians make strong and effective recommendations for HPV vaccination. Using principles of user-centered design, we conducted qualitative interviews, interviews with persons from analogous industries, and a data synthesis workshop with multiple stakeholders. Five potential intervention strategies targeted at health care clinicians, youth, and their parents were developed. The two most popular choices to pursue were a values-based communication strategy and a puberty education workbook. User-centered design is a useful strategy for developing potential interventions to improve the rate and success of clinicians recommending the HPV vaccine. Further research is needed to test the effectiveness and acceptability of these interventions in clinical settings.

  16. Improving Primary Care with Human-Centered Design and Partnership-Based Leadership

    Directory of Open Access Journals (Sweden)

    May-Lynn Andresen

    2017-06-01

    Objective: The purpose of this quality improvement project was to empower and activate first-line staff (FLS) to improve the six-month depression remission rate in a primary care clinic. Background: Lack of workforce engagement has been identified as an emerging national problem in health care, and health care leaders have urged practice redesign to foster the Triple Aim of improved population health, improved care experience, and reduced cost of care (Berwick et al., 2008). Depression is difficult to manage and often exacerbates chronic illnesses and shortens lifespans; yet despite known effective treatments, six-month remission rates are low and care practices are often inadequate. Engaging in empowering leadership behaviors has demonstrated improvement in motivation, work outcomes, and empowerment in various industry settings across the world. Core approaches include: enhancing staff self-determination, encouraging participation in decision-making, and ensuring that staff have the knowledge and tools to achieve their performance goals, in addition to leadership communications that increase confidence in staff's potential to perform at high levels and their recognition that their efforts have an impact on improving organizational effectiveness. Methods: In this outpatient setting, care was siloed, staff were disengaged, and a hierarchical paradigm was evident. Human-centered design principles were employed to intensively explore stakeholders' experiences and to deeply engage end users in improving depression remission rates by creating, participating, and partnering in solutions. Leadership was educated in and deployed empowering leadership behaviors, which were synergistic with design thinking and fostered empowerment. Results: Pre- and post-surveys demonstrated statistically significant improvement in empowerment. The six-month depression remission rate increased 167%, from 7.3% (N=261) to 19.4% (N=247). Conclusion: The convergence of…

  17. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

    OpenAIRE

    Lopes, Arminda Guerra

    2016-01-01

    Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, …

  18. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick convergence search method is proposed. • An affinity error probability mapping function is proposed to test human reliability. Abstract: This is the second in a series of papers describing optimal design methods, based on human reliability, for a digital human-computer interface of a nuclear power plant (NPP). The purpose of this series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators, and thus human errors are easily triggered. To solve this problem, the authors propose an optimization process, a quick convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick convergence search method for fuzzy segments divided by component quantity performs better than the general algorithm.

  19. Computational Modelling of the Human Islet Amyloid Polypeptide

    DEFF Research Database (Denmark)

    Skeby, Katrine Kirkeby

    2014-01-01

    … to interpret results correctly. Computational studies and molecular dynamics (MD) simulations in particular have become important tools in the effort to understand biological mechanisms. The strength of these methods is the high resolution in time and space, and the ability to specifically design the system. … Using MD simulations we have investigated the binding of 13 different imaging agents to a fibril segment. Using clustering analysis and binding energy calculations we have identified a common binding mode for the 13 agents in the surface grooves of the fibril, which are present on all amyloid fibrils. … This information combined with specific knowledge about the AD amyloid fibril is the building block for the design of highly specific amyloid imaging agents. We have also used MD simulations to study the interaction between hIAPP and a phospholipid membrane. At neutral pH, we find that the attraction is mainly…

  20. Computing Stability Effects of Mutations in Human Superoxide Dismutase 1

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2014-01-01

    Protein stability is affected in several diseases and is of substantial interest in efforts to correlate genotypes to phenotypes. Superoxide dismutase 1 (SOD1) is a suitable test case for such correlations due to its abundance, stability, available crystal structures and thermochemical data......, and physiological importance. In this work, stability changes of SOD1 mutations were computed with five methods, CUPSAT, I-Mutant2.0, I-Mutant3.0, PoPMuSiC, and SDM, with emphasis on structural sensitivity as a potential issue in structure-based protein calculation. The large correlation between experimental...... literature data of SOD1 dimers and monomers (r = 0.82) suggests that mutations in separate protein monomers are mostly additive. PoPMuSiC was most accurate (typical MAE ∼ 1 kcal/mol, r ∼ 0.5). The relative performance of the methods was not very structure-dependent, and the more accurate methods also...

  1. A computational model of blast loading on the human eye.

    Science.gov (United States)

    Bhardwaj, Rajneesh; Ziegler, Kimberly; Seo, Jung Hee; Ramesh, K T; Nguyen, Thao D

    2014-01-01

    Ocular injuries from blast have increased in recent wars, but the injury mechanism associated with the primary blast wave is unknown. We employ a three-dimensional fluid-structure interaction computational model to understand the stresses and deformations incurred by the globe due to blast overpressure. Our numerical results demonstrate that the blast wave reflections off the facial features around the eye increase the pressure loading on and around the eye. The blast wave produces asymmetric loading on the eye, which causes globe distortion. The deformation response of the globe under blast loading was evaluated, and regions of high stresses and strains inside the globe were identified. Our numerical results show that the blast loading results in globe distortion and large deviatoric stresses in the sclera. These large deviatoric stresses may be an indicator of the risk of interfacial failure between the tissues of the sclera and the orbit.

  2. An approach to human-centered design of nuclear medical equipment: the thyroid uptake system

    International Nuclear Information System (INIS)

    Santos, Isaac J.A. Luquetti; Silva, Carlos Borges da; Santana, Marcos; Carvalho, Paulo Victor R.; Oliveira, Mauro Vitor de; Mol, Antonio Carlos Mol; Grecco, Claudio Henrique; Augusto, Silas Cordeiro

    2005-01-01

    Technology plays an important role in modern medical centers, making health care increasingly complex and reliant on complex technical equipment. This technical complexity is particularly noticeable in nuclear medicine and can increase the risk of human error. Human error has many causes, such as performance shaping factors, organizational factors, and user interface design. Poorly designed human-system interfaces of nuclear medical equipment can increase the risk of human error. If all nuclear medical equipment had been designed with good user interfaces, incidents and accidents could be reduced, as could the time required to learn how to use the equipment. Although some manufacturers of nuclear medical equipment have already integrated human factors principles into their products, there is still a need to steer the development of nuclear medical technology toward a more human-centered approach. The aim of this paper is to propose a methodology that contributes to the design, development, and evaluation of nuclear medical equipment and its human-system interface from a human-centered perspective. This methodology combines the ergonomic approach, based on analysis of operator activity, with human factors standards and guidelines, questionnaires, and user-based testing. We describe a case study in which this methodology is being applied to the evaluation of the thyroid uptake system, yielding essential information and data that will be used in the development of a new system. (author)

  3. Human anatomy nomenclature rules for the computer age.

    Science.gov (United States)

    Neumann, Paul E; Baud, Robert; Sprumont, Pierre

    2017-04-01

    Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, as short and simple as possible, and machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.

  4. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

    CERN Document Server

    Tan, Desney S

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

  5. Brain-computer interface signal processing at the Wadsworth Center: mu and sensorimotor beta rhythms.

    Science.gov (United States)

    McFarland, Dennis J; Krusienski, Dean J; Wolpaw, Jonathan R

    2006-01-01

    The Wadsworth brain-computer interface (BCI), based on mu and beta sensorimotor rhythms, uses one- and two-dimensional cursor movement tasks and relies on user training. This is a real-time closed-loop system. Signal processing consists of channel selection, spatial filtering, and spectral analysis. Feature translation uses a regression approach and normalization. Adaptation occurs at several points in this process on the basis of different criteria and methods. It can use either feedforward (e.g., estimating the signal mean for normalization) or feedback control (e.g., estimating feature weights for the prediction equation). We view this process as the interaction between a dynamic user and a dynamic system that coadapt over time. Understanding the dynamics of this interaction and optimizing its performance represent a major challenge for BCI research.
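    The processing chain described here (spatial filtering, spectral analysis, then regression-based feature translation with normalization) can be sketched in outline. The filter choice, band limits, channel, and weights below are illustrative assumptions for a toy signal, not the Wadsworth system's actual parameters, and the real system adapts its normalization statistics and regression weights online.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy EEG: 8 channels x 256 samples at 128 Hz (one analysis window).
fs = 128
eeg = rng.standard_normal((8, 256))

# 1. Spatial filtering: common average reference, one simple choice
#    (Laplacian filters are another common option for sensorimotor BCIs).
car = eeg - eeg.mean(axis=0, keepdims=True)

# 2. Spectral analysis: power in the mu band (8-12 Hz) for one channel.
freqs = np.fft.rfftfreq(car.shape[1], d=1.0 / fs)
spectrum = np.abs(np.fft.rfft(car[2])) ** 2
mu_power = spectrum[(freqs >= 8) & (freqs <= 12)].sum()

# 3. Feature translation: regression with normalization -- cursor
#    velocity is a weighted, mean-subtracted feature.  In the closed-loop
#    system both the running mean and the weight would be re-estimated
#    as user and system co-adapt.
w, running_mean = 0.05, 10.0
velocity = w * (mu_power - running_mean)
```

    Subtracting a running estimate of the feature mean is one concrete example of the feedforward adaptation the abstract mentions; re-fitting `w` from recent trials corresponds to the feedback case.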

  6. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data), and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not to request the code package

  7. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data), and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not to request the code package. (RWR)

  8. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that graphical displays assume greater importance as a means of presenting it. Information from dynamical processes can be displayed conveniently with animated graphics. This guide presents the basic techniques for generating black-and-white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before submitting graphics files to the system for processing into film, so that problems can be spotted without waiting for the film to be delivered. Source listings of some useful software are given in appendices, along with descriptions of how to use it. 3 figures, 5 tables

  9. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Science.gov (United States)

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
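    The reported error metric, mean unsigned error (MAE) over Ru-ligand bond lengths between PM6 and DFT geometries, is straightforward to compute. The bond lengths below are invented placeholders for illustration, not data from the paper.

```python
# Mean unsigned error between semiempirical (PM6) and reference (DFT)
# Ru-ligand bond lengths, in angstroms.  Values are made-up examples.
dft = [2.05, 2.10, 1.98, 2.07]
pm6 = [2.09, 2.06, 2.01, 2.12]

mae = sum(abs(a - b) for a, b in zip(pm6, dft)) / len(dft)
print(f"MAE = {mae:.3f} A")  # for these placeholder values, 0.040 A
```

    The paper's basis-set optimization corresponds to reducing exactly this quantity (from 0.043 to 0.017 Å on its 20-molecule test set) by refitting the Ru parameters alone.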

  10. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2003 to March 31, 2004

    International Nuclear Information System (INIS)

    2005-08-01

    The major research and development activities of the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, have focused on the ITBL (IT-Based Laboratory) project, computational materials science, and quantum bioinformatics. This report provides an overview of research and development activities in CCSE in the fiscal year 2003 (April 1, 2003 - March 31, 2004). (author)

  11. Associating Human-Centered Concepts with Social Networks Using Fuzzy Sets

    Science.gov (United States)

    Yager, Ronald R.

    The rapidly growing global interconnectivity, brought about to a large extent by the Internet, has dramatically increased the importance and diversity of social networks. Modern social networks cut across a spectrum from benign recreation-focused websites such as Facebook, to occupationally oriented websites such as LinkedIn, to criminally focused groups such as drug cartels, to devastation- and terror-focused groups such as Al-Qaeda. Many organizations are interested in analyzing and extracting information related to these social networks, among them governmental police and security agencies as well as marketing and sales organizations. To aid these organizations, there is a need for technologies to model social networks and intelligently extract information from these models. While established technologies exist for modeling relational networks [1-7], few technologies exist to extract information from them in a way compatible with human perception and understanding. Databases are an example of a technology in which we have tools both for representing information and for querying and extracting it. Our goal is in some sense analogous: we want to use the relational network model to represent information, in this case about relationships and interconnections, and then be able to query the social network using intelligent human-centered concepts. To extend our capability to interact with social relational networks, we need to associate human concepts and ideas with these networks. Since human beings predominantly reason and understand in linguistic terms, we need to build bridges between human conceptualization and the formal mathematical representation of the social network. Consider, for example, a concept such as "leader". An analyst may be able to express, in linguistic terms and using a network-relevant vocabulary, the properties of a leader. Our task is to translate this linguistic description into a mathematical formalism.
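    The core idea, expressing a linguistic concept such as "leader" as a fuzzy set over network properties, can be sketched in a few lines. The membership function below (normalized degree) is an illustrative choice made here for the example, not the formalism from the chapter; the toy network and names are likewise invented.

```python
# Adjacency list of a small, invented social network.
network = {
    "ann": ["bob", "carl", "dee", "ed"],
    "bob": ["ann", "carl"],
    "carl": ["ann", "bob"],
    "dee": ["ann"],
    "ed": ["ann"],
}

def leader_membership(node):
    """Fuzzy membership grade in the concept 'leader', taken here to be
    the node's degree normalized by the maximum possible degree, so the
    grade lies in [0, 1]."""
    return len(network[node]) / (len(network) - 1)

scores = {n: leader_membership(n) for n in network}
# 'ann' is connected to everyone, so her membership grade is 1.0.
```

    Querying the network for "leaders" then amounts to ranking or thresholding these membership grades, and richer linguistic descriptions can be built by combining several such fuzzy sets.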

  12. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

  13. Environmental Research Division annual report: Center for Human Radiobiology, July 1983-June 1984. Part 2

    International Nuclear Information System (INIS)

    1985-04-01

    Epidemiological studies of the late effects of internal radium in man, and mechanistic investigations of those effects, have continued. The current status of the study is summarized. An experimental technique for preparing thin sections of bone and the application of that technique in studying the comparative distribution of radium and plutonium are described. Radiological dental changes due to radium in man and dog are compared. Survival of human fibroblasts irradiated with alpha particles in vitro was found to be higher when the average LET was higher. In the study of the late effects of thorium in man, the relative activities of the daughter products in the lung have been determined spectrometrically in vivo. The exhalation of thoron in these persons has been investigated in relation to lung burden of thorium and to personal factors such as smoking, age, and weight. The administration of two isotopes to large mammals has been used to demonstrate that the metabolism of plutonium is independent of route of entry and to determine the gastrointestinal absorption of plutonium. The effect of thermoluminescence on a scintillation radon counting system has been investigated quantitatively. Data on the exposure of 88 persons to radium were added to the data base, bringing the total to 2400 radium cases under study by the Center for Human Radiobiology. Separate abstracts were prepared for individual papers

  14. A systems engineering perspective on the human-centered design of health information systems.

    Science.gov (United States)

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering has, over the past five decades, used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. Given that most health information systems have safety implications, we argue that the design and development of such systems would benefit from implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists into the interdisciplinary development of health information systems.

  15. Situated dialog in speech-based human-computer interaction

    CERN Document Server

    Raux, Antoine; Lane, Ian; Misu, Teruhisa

    2016-01-01

    This book provides a survey of the state-of-the-art in the practical implementation of Spoken Dialog Systems for applications in everyday settings. It includes contributions on key topics in situated dialog interaction from a number of leading researchers and offers a broad spectrum of perspectives on research and development in the area. In particular, it presents applications in robotics, knowledge access and communication and covers the following topics: dialog for interacting with robots; language understanding and generation; dialog architectures and modeling; core technologies; and the analysis of human discourse and interaction. The contributions are adapted and expanded contributions from the 2014 International Workshop on Spoken Dialog Systems (IWSDS 2014), where researchers and developers from industry and academia alike met to discuss and compare their implementation experiences, analyses and empirical findings.

  16. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
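    The active Hill three-element muscle model mentioned here combines an activation-scaled contractile element with passive elastic elements. A minimal sketch of the force computation follows; the Gaussian force-length curve, quadratic passive curve, and all constants are common textbook choices assumed for illustration, not the tongue-model parameters from the paper.

```python
import math

F_MAX = 1.0   # peak isometric force (normalized)
L_OPT = 1.0   # optimal contractile-element length (normalized)

def active_force(l_ce, activation):
    """Contractile element: neural activation in [0, 1] scales a
    Gaussian force-length relation centered on the optimal length."""
    fl = math.exp(-((l_ce - L_OPT) / 0.45) ** 2)
    return activation * F_MAX * fl

def passive_force(l_ce):
    """Parallel elastic element: engages only beyond optimal length."""
    return 0.0 if l_ce <= L_OPT else 2.0 * (l_ce - L_OPT) ** 2

def muscle_force(l_ce, activation):
    """Total fiber force: active and passive contributions in parallel."""
    return active_force(l_ce, activation) + passive_force(l_ce)

# At optimal length with full activation, force equals the peak value.
assert muscle_force(1.0, 1.0) == F_MAX
```

    In the paper's setting, the `activation` input for each muscle group is what the genetic-algorithm-based neural control model determines.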

  17. Computed Tomography Features of Benign and Malignant Calcified Thyroid Nodules: A Single-Center Study.

    Science.gov (United States)

    Kim, Donghyun; Kim, Dong Wook; Heo, Young Jin; Baek, Jin Wook; Lee, Yoo Jin; Park, Young Mi; Baek, Hye Jin; Jung, Soo Jin

    No previous studies have investigated thyroid calcification on computed tomography (CT) quantitatively by using Hounsfield unit (HU) values. This study aimed to analyze quantitative HU values of thyroid calcification on preoperative neck CT and to assess the characteristics of benign and malignant calcified thyroid nodules (CTNs). Two hundred twenty patients who underwent neck CT before thyroid surgery from January 2015 to June 2016 were included. On soft-tissue window CT images, CTNs with calcified components of 3 mm or larger in minimum diameter were included in this study. The HU values and types of CTNs were determined and analyzed. Of 61 CTNs in 49 patients, there were 42 malignant nodules and 19 benign nodules. The mean largest diameter of the calcified component was 5.3 (2.5) mm (range, 3.1-17.1 mm). A statistically significant difference was observed in the HU values of calcified portions between benign and malignant CTNs, whereas there was no significant difference in patient age or sex or in the size, location, or type of each CTN. Of the 8 CTNs with pure calcification, 3 exhibited a honeycomb pattern on bone window CT images, and these 3 CTNs were all diagnosed as papillary thyroid carcinoma on histopathological examination. Hounsfield unit values of CTNs may be helpful for differentiating malignancy from benignity.
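    The quantitative comparison at the heart of the study, mean Hounsfield unit (HU) values of the calcified component in benign versus malignant nodules, can be sketched as follows. The HU lists are invented for illustration; the paper reports only that the group difference was statistically significant, not the values or their direction.

```python
from statistics import mean

# Invented HU measurements of calcified components (placeholders only).
benign_hu = [820, 910, 760, 880]
malignant_hu = [540, 610, 480, 590]

benign_mean = mean(benign_hu)
malignant_mean = mean(malignant_hu)

# The study's comparison: do the two groups differ on mean HU?
print(f"benign mean HU = {benign_mean}, malignant mean HU = {malignant_mean}")
```

    A significance test (e.g. Mann-Whitney U on the two samples) would be the next step; with only four placeholder values per group, none is attempted here.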

  18. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to a NASA Task Load Index (TLX) subjective workload assessment reveal a benefit of tactile feedback in GECO glove use for data entry. This first-ever investigation of the use of a pressurized EVA glove as a human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of the suit and/or other EVA systems.

  19. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

    Directory of Open Access Journals (Sweden)

    Antonios Liapis

    2016-07-01

    This article discusses the impact of artificially intelligent computers on the processes of design, play, and educational activities. A computational process which has the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus which triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrain the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimum. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods, and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.

  20. Current (1984) status of the study of 226Ra and 228Ra in humans at the Center for Human Radiobiology

    International Nuclear Information System (INIS)

    Rundo, J.; Keane, A.T.; Lucas, H.F.; Schlenker, R.A.; Stebbings, J.H.; Stehney, A.F.

    1984-01-01

    The Center for Human Radiobiology has identified 5784 persons by name and type of exposure to 226 Ra and 228 Ra. Included are 4863 dial painters (mostly women) and non-laboratory employees of the radium dial industry, 410 laboratory workers, 399 persons who received radium for supposed therapeutic effects, and 112 in other categories. Body contents of radium have been measured in 1916 of the dial workers and about one-half of the subjects in the other groups. Bone sarcomas, carcinomas of the paranasal sinuses and mastoids, and deterioration of skeletal tissue are still the only effects unequivocally attributable to internal radium. Excess leukemias have not been observed and other malignancies, if in excess, appear more likely to be related to external gamma radiation or radon than to internal radium. Positive correlations with radium burdens have been found for the incidence of benign exostoses among subjects exposed to radium before age 18 and for shortened latency of ocular cataracts. 26 references, 3 figures, 5 tables

  1. Current (1984) status of the study of 226Ra and 228Ra in humans at the Center for Human Radiobiology

    International Nuclear Information System (INIS)

    Rundo, J.; Keane, A.T.; Lucas, H.F.; Schlenker, R.A.; Stebbings, J.H.; Stehney, A.F.

    1985-01-01

    The Center for Human Radiobiology has identified 5784 persons by name and type of exposure to 226 Ra and 228 Ra. Included are 4863 dial painters (mostly women) and non-laboratory employees of the radium dial industry, 410 laboratory workers, 399 persons who received radium for supposed therapeutic effects, and 112 in other categories. Body contents of radium have been measured in 1916 of the dial workers and about one-half of the subjects in the other groups. Bone sarcomas, carcinomas of the paranasal sinuses and mastoids, and deterioration of skeletal tissue are still the only effects unequivocally attributable to internal radium. Excess leukemias have not been observed and other malignancies, if in excess, appear more likely to be related to external gamma radiation or radon than to internal radium. Positive correlations with radium burdens have been found for the incidence of benign exostoses among subjects exposed to radium before age 18 and for shortened latency of ocular cataracts. 27 references, 3 figures, 5 tables

  2. EVALUATION OF PROPTOSIS BY USING COMPUTED TOMOGRAPHY IN A TERTIARY CARE CENTER, BURLA, SAMBALPUR, ODISHA

    Directory of Open Access Journals (Sweden)

    Vikas Agrawal

    2017-07-01

    BACKGROUND Proptosis is defined as the abnormal anterior protrusion of the globe beyond the orbital margins.1 It is an important clinical manifestation of various orbital as well as systemic disorders, with aetiologies ranging from infection to malignant tumours, among which space-occupying lesions within the orbit are the most important. MATERIALS AND METHODS A total of 32 patients referred from various departments, mainly ophthalmology and medicine, with history and clinical features suggestive of proptosis were evaluated in our department; after proper history taking and clinical examination, a Computed Tomography (CT) scan was done. RESULTS The age of the patients ranged from 1-55 years. The chief complaints associated with proptosis were, in decreasing order, pain/headache, restricted eye movement, diminished vision, and diplopia. Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). Trauma, vascular lesions, and congenital conditions were infrequent causes of proptosis. In children, the common causes of proptosis were retinoblastoma (35.71%) and orbital cellulitis (28.57%), and in adults the common causes were thyroid ophthalmopathy (22.22%), trauma (16.66%), and pseudo-tumour (16.66%). CONCLUSION Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). CT scanning should be the chief investigation in the evaluation of lesions causing proptosis; it is the most useful modality for detecting, characterising, and determining the extent of the disease process. The overall accuracy of CT in the diagnosis of proptosis was 96.87%.

  3. Operating Room-to-ICU Patient Handovers: A Multidisciplinary Human-Centered Design Approach.

    Science.gov (United States)

    Segall, Noa; Bonifacio, Alberto S; Barbeito, Atilio; Schroeder, Rebecca A; Perfect, Sharon R; Wright, Melanie C; Emery, James D; Atkins, B Zane; Taekman, Jeffrey M; Mark, Jonathan B

    2016-09-01

    Patient handovers (handoffs) following surgery have often been characterized by poor teamwork, unclear procedures, unstructured processes, and distractions. A study was conducted to apply a human-centered design (HCD) approach to the redesign of operating room (OR)-to-ICU patient handovers in a broad surgical ICU (SICU) population. This approach entailed (1) the study of existing practices, (2) the redesign of the handover on the basis of input from handover participants and evidence in the medical literature, and (3) the study of the effects of this change on processes and communication. The Durham [North Carolina] Veterans Affairs Medical Center SICU is an 11-bed mixed surgical specialty unit. To understand the existing process for receiving postoperative patients in the SICU, ethnographic methods (a series of observations, surveys, interviews, and focus groups) were used. The handover process was redesigned to better address providers' work flow, information needs, and expectations, as well as concerns identified in the literature. Technical and communication flaws were uncovered, and the handover was redesigned to address them. For the 49 preintervention and 49 postintervention handovers, the information transfer score and number of interruptions were not significantly different. However, staff workload and team behavior scores improved significantly, while the handover duration was not prolonged by the new process. Handover participants were also significantly more satisfied with the new handover method. An HCD approach led to improvements in the patient handover process from the OR to the ICU in a mixed adult surgical population. Although the specific handover process would be unlikely to be optimal if replicated exactly in another clinical setting, the HCD foundation behind the redesign process is widely applicable.

  4. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic, from August 29 to August 31, 2011. The conference was the third in the series, following IHCI 2009 and IHCI 2010, held in January at IIIT Allahabad, India. Human computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion of the topics presented. It can be useful for researchers working on various aspects of human computer interaction. Topics covered in this book include user interface and interaction, the theoretical background and applications of HCI, and also data mining and knowledge discovery in support of HCI applications.

  5. Treatment of human-computer interface in a decision support system

    International Nuclear Information System (INIS)

    Heger, A.S.; Duran, F.A.; Cox, R.G.

    1992-01-01

One of the most challenging applications facing the computing community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. Applying this discipline to environmental restoration and waste management is further complicated by the nature of environmental data: the information required to manage the environmental impacts of human activity is fundamentally complex. This paper discusses the efforts at Sandia National Laboratories to develop an adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System is also discussed.

  6. Investigation and evaluation into the usability of human-computer interfaces using a typical CAD system

    Energy Technology Data Exchange (ETDEWEB)

    Rickett, J D

    1987-01-01

This research program covers three topics relating to the human-computer interface: voice recognition, tools and techniques for evaluation, and user and interface modeling. An investigation into the implementation of voice-recognition technologies examines how voice recognizers may be evaluated in commercial software; a prototype system was developed in collaboration with FEMVIEW Ltd. (marketing a CAD package). A theoretical approach to evaluation leads to the hypothesis that human-computer interaction is affected by personality, which influences the preferred type of dialogue, the preferred method of providing help, etc. A user model based on personality traits, or habitual-behavior patterns (HBP), is presented. Finally, a practical framework is provided for the evaluation of human-computer interfaces. It suggests that evaluation is an integral part of design and that the iterative use of evaluation techniques throughout the conceptualization, design, implementation, and post-implementation stages will ensure systems that satisfy the needs of users and fulfill the goal of usability.

  7. [Results of the first human papilloma virus center in Hungary (2007-2011)].

    Science.gov (United States)

    Galamb, Adám; Pajor, Attila; Langmár, Zoltán; Sobel, Gábor

    2011-11-06

Human papilloma virus (HPV) is the most common sexually transmitted infection of the 21st century. It has been established that infection with specific HPV types is a contributing factor to cervical cancer: approximately 99.7% of cervical cancers are associated with high-risk HPV types. HPV testing plays an important role in prevention by decreasing the prevalence and mortality of cervical cancer. There are 16 HPV centers operating in Hungary, in which patients undergo HPV screening, cervical exams, and treatment based on standardized guidelines. The first HPV center was founded in 2007 in Budapest, at the 2nd Department of Obstetrics and Gynecology, Semmelweis University. This study aimed to define the presence and prevalence of HPV DNA in cervical swab samples obtained from patients in our center. The authors also assessed the age-specific prevalence, the HPV type distribution, and the associated cervical abnormalities, comparing the results with international data. Overall, 1155 women underwent HPV testing and genotyping using the polymerase chain reaction. Of these, 55.5% of patients tested positive for HPV DNA, of whom 38.5% were positive for high-risk HPV DNA. Prevalence was highest among females aged 15 to 25 years (62.9%). The most common type found was the high-risk type 16 (19.5% of patients with positive HPV tests). High-risk HPV with a concurrent cervical cytological abnormality was present in 32% of cases. More than two-thirds of women with cytological atypia (70.6%) were infected with two or more high-risk HPV types. HPV 16 was detected in 32% of patients with cytological abnormalities. The results suggest that the prevalence of HPV in this study population exceeds international figures. They also draw attention to the peak prevalence of high-risk types in the youngest age group and to the higher risk of cervical abnormality when two or more HPV types are present. The dominance of types 16 and 18 was predictable.

  8. Distribution of absorbed dose in human eye simulated by SRNA-2KG computer code

    International Nuclear Information System (INIS)

    Ilic, R.; Pesic, M.; Pavlovic, R.; Mostacci, D.

    2003-01-01

The rapidly increasing performance of personal computers and the development of codes for proton transport based on Monte Carlo methods will soon allow computer-planned proton therapy to be introduced as a normal activity in regular hospital procedures. A description of the SRNA code used for such applications and results of calculated distributions of proton-absorbed dose in the human eye are given in this paper. (author)
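The kind of Monte Carlo depth-dose calculation described above can be illustrated with a toy sketch. This is not the SRNA-2KG code: the 1/E stopping-power model, the straggling width, and all numeric parameters below are invented assumptions, chosen only so that a Bragg-peak-like maximum appears within a few centimeters of a tissue-like medium.

```python
import numpy as np

def toy_depth_dose(n_protons=2000, e0=12.0, n_bins=100, depth_max=3.5, seed=1):
    """Toy 1D Monte Carlo: histogram of deposited proton energy (MeV)
    per depth bin (cm). The assumed stopping power ~ k/E rises as a
    proton slows down, which is what produces the Bragg peak."""
    rng = np.random.default_rng(seed)
    dose = np.zeros(n_bins)
    step = depth_max / n_bins          # fixed step length (cm)
    k = 25.0                           # assumed stopping-power constant
    for _ in range(n_protons):
        e, z = e0, 0.0
        while e > 0.0 and z < depth_max:
            mean_loss = step * k / max(e, 1.0)
            # Gaussian energy-loss straggling, clipped to [0, remaining energy]
            loss = min(e, max(0.0, rng.normal(mean_loss, 0.3 * mean_loss)))
            dose[min(int(z / step), n_bins - 1)] += loss
            e -= loss
            z += step
    return dose

dose = toy_depth_dose()
```

With these assumed parameters the analytic range is e0²/(2k) ≈ 2.9 cm, so the dose maximum falls in the last third of the 3.5 cm domain, mimicking the sharp distal falloff that makes proton therapy planning attractive.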

  9. Computer-Based Training in Eating and Nutrition Facilitates Person-Centered Hospital Care: A Group Concept Mapping Study.

    Science.gov (United States)

    Westergren, Albert; Edfors, Ellinor; Norberg, Erika; Stubbendorff, Anna; Hedin, Gita; Wetterstrand, Martin; Rosas, Scott R; Hagell, Peter

    2018-04-01

Studies have shown that computer-based training in eating and nutrition for hospital nursing staff increased the likelihood that patients at risk of undernutrition would receive nutritional interventions. This article seeks to provide understanding, from the perspective of nursing staff, of conceptually important areas of computer-based nutritional training and their relative importance to nutritional care following completion of the training. Group concept mapping, an integrated qualitative and quantitative methodology, was used to conceptualize important factors relating to the training experiences through four focus groups (n = 43), statement sorting (n = 38), and importance rating (n = 32), followed by multidimensional scaling and cluster analysis. Sorting of 38 statements yielded four clusters. These clusters (number of statements) were as follows: personal competence and development (10), practice-close care development (10), patient safety (9), and awareness about the nutrition care process (9). The first and second clusters represented "the learning organization," and the third and fourth represented "quality improvement." These findings provide a conceptual basis for understanding the importance of training in eating and nutrition that contributes to a learning organization and quality improvement, and that can be linked to, and facilitates, person-centered nutritional care and patient safety.
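The analysis pipeline named above (statement sorting, then multidimensional scaling, then cluster analysis) can be sketched in a few lines. This is a generic, hedged illustration, not the study's actual analysis: the six "statements" and three sorters are invented placeholders, and the MDS shown is the classical Torgerson variant.

```python
import numpy as np

def cooccurrence(sorts, n_items):
    """Count how often each pair of statements was sorted into the same pile.
    sorts: one partition per participant, each a list of piles (item indices)."""
    c = np.zeros((n_items, n_items))
    for partition in sorts:
        for pile in partition:
            for i in pile:
                for j in pile:
                    c[i, j] += 1
    return c

def classical_mds(dist, dim=2):
    """Torgerson double-centering MDS: embed a distance matrix in `dim` dims."""
    n = dist.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (dist ** 2) @ j          # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    top = np.argsort(vals)[::-1][:dim]      # largest eigenvalues first
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

# Placeholder data: 6 statements sorted into piles by 3 participants.
sorts = [
    [[0, 1, 2], [3, 4, 5]],
    [[0, 1], [2, 3], [4, 5]],
    [[0, 1, 2], [3, 4], [5]],
]
co = cooccurrence(sorts, 6)
dist = len(sorts) - co        # sorted together less often -> farther apart
coords = classical_mds(dist)  # 2-D point map of the statements
```

In a full group concept mapping analysis, a hierarchical clustering of `coords` would follow, with the cluster count (four in the study) chosen by the analysts.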

  10. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  11. Donor-recipient human leukocyte antigen matching practices in vascularized composite tissue allotransplantation: a survey of major transplantation centers.

    Science.gov (United States)

    Ashvetiya, Tamara; Mundinger, Gerhard S; Kukuruga, Debra; Bojovic, Branko; Christy, Michael R; Dorafshar, Amir H; Rodriguez, Eduardo D

    2014-07-01

    Vascularized composite tissue allotransplant recipients are often highly sensitized to human leukocyte antigens because of multiple prior blood transfusions and other reconstructive operations. The use of peripheral blood obtained from dead donors for crossmatching may be insufficient because of life support measures taken for the donor before donation. No study has been published investigating human leukocyte antigen matching practices in this field. A survey addressing human leukocyte antigen crossmatching methods was generated and sent to 22 vascularized composite tissue allotransplantation centers with active protocols worldwide. Results were compiled by center and compared using two-tailed t tests. Twenty of 22 centers (91 percent) responded to the survey. Peripheral blood was the most commonly reported donor sample for vascularized composite tissue allotransplant crossmatching [78 percent of centers (n=14)], with only 22 percent (n=4) using lymph nodes. However, 56 percent of the 18 centers (n=10) that had performed vascularized composite tissue allotransplantation reported that they harvested lymph nodes for crossmatching. Of responding individuals, 62.5 percent (10 of 16 individuals) felt that lymph nodes were the best donor sample for crossmatching. A slight majority of vascularized composite tissue allotransplant centers that have performed clinical transplants have used lymph nodes for human leukocyte antigen matching, and centers appear to be divided on the utility of lymph node harvest. The use of lymph nodes may offer a number of potential benefits. This study highlights the need for institutional review board-approved crossmatching protocols specific to vascularized composite tissue allotransplantation, and the need for global databases for sharing of vascularized composite tissue allotransplantation experiences.

  12. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

  13. Annual report of Nuclear Human Resource Development Center. April 1, 2015 - March 31, 2016

    International Nuclear Information System (INIS)

    2017-07-01

This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year (FY) 2015. In FY 2015, in addition to the regular training programs at NuHRDeC, we were actively engaged in organizing special training courses in response to external training needs, cooperating with universities, and offering international training courses for Asian countries. In accordance with the annual plan for national training, we conducted training courses for radioisotope and radiation engineers, nuclear energy engineers, and national qualification examinations, as well as courses for officials of the Nuclear Regulation Authority and, as outreach activities, for prefectural and municipal officials in Fukushima, in order to meet training needs from external organizations. We continued to enhance cooperative activities with universities, such as the acceptance of postdoctoral researchers and cooperation under the cooperative graduate school system, including the acceptance of students from the Nuclear Professional School of the University of Tokyo. Furthermore, using the remote education system, a joint course was successfully held with seven universities, and an intensive summer course and a practical exercise at the Nuclear Fuel Cycle Engineering Laboratories were also conducted as part of the collaboration network with universities. The Instructor Training Program (ITP) was continually offered to the ITP participating countries (Bangladesh, China, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Saudi Arabia, Sri Lanka, Thailand, Turkey and Viet Nam) in FY 2015 under contract with the Ministry of Education, Culture, Sports, Science and Technology. As part of the ITP, the Instructor Training Course and the Nuclear Technology Seminar, such as the "Reactor Engineering Course" and the "Basic Radiation Knowledge for School Education Seminar", were organized at NuHRDeC. Eight and eleven countries

  14. Dynamic Human-Centered Suit Design: A Computational and Experimental Method

    Data.gov (United States)

    National Aeronautics and Space Administration — Introduction: Manned space flight necessitates an ability to provide life support to crewmembers during multiple mission stages, in the form of space suits. With...

  15. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Reading)

  16. Supporting Clinical Cognition: A Human-Centered Approach to a Novel ICU Information Visualization Dashboard.

    Science.gov (United States)

    Faiola, Anthony; Srinivas, Preethi; Duke, Jon

    2015-01-01

Advances in intensive care unit (ICU) bedside displays/interfaces and electronic medical record (EMR) technology have not adequately addressed the visual clarity of patient data/information as a way to further reduce cognitive load during clinical decision-making. We responded to these challenges with a human-centered approach to designing and testing a decision-support tool: MIVA 2.0 (Medical Information Visualization Assistant, v.2). Envisioned as an EMR visualization dashboard to support rapid analysis of real-time clinical data trends, our primary goal originated from a clinical requirement to reduce cognitive overload. In the study, a convenience sample of 12 participants was recruited, and quantitative and qualitative measures were used to compare MIVA 2.0 with ICU paper medical charts using time-on-task, post-test questionnaires, and interviews. Findings demonstrated a significant difference in speed and accuracy with the use of MIVA 2.0. Qualitative outcomes concurred, with participants acknowledging the potential of MIVA 2.0 for reducing cognitive load and enabling quicker and more accurate decision-making.

  17. Implementation of a human resources shared services center: multinational company strategy in a merger context

    Directory of Open Access Journals (Sweden)

    João Paulo Bittencourt

    2016-09-01

The aim of this research was to analyze the process of implementation and management of a Shared Services Center (SSC) for Human Resources in a multinational company in the context of mergers and acquisitions. The company analyzed, here called Alpha, is one of the largest food companies in the country and was born of a merger between Beta and Delta in 2008. The SSC may constitute a tool for the strategic management of HR, allowing the area's role to be repositioned so as to be more strategic at the corporate level and more profitable at the operating level. The research was based on a descriptive and exploratory study with a qualitative approach. Among the results is the finding that shared services were strategic in supporting, standardizing, and ensuring the expansion of the company. The challenges found were associated with the development of a culture of service, the relationship with users, and the definition of the scope of HR activities. Remaining management issues include the adequacy of wage differences between employees, the limitation of career paths, the need to attract and retain talent, and international expansion.

  18. Alternative Ultrasound Gel for a Sustainable Ultrasound Program: Application of Human Centered Design.

    Directory of Open Access Journals (Sweden)

    Margaret Salmon

This paper describes the design of a low-cost ultrasound gel made from local products, applying aspects of the Human Centered Design methodology. A multidisciplinary team worked with clinicians who use ultrasound where commercial gel is cost-prohibitive and scarce. The team followed the format outlined in the IDEO toolkit. Research began by defining the challenge: "How to create a locally available alternative ultrasound gel for a low-resourced environment?" The end-users were identified as clinicians who use ultrasound in the Democratic Republic of the Congo and Ethiopia. An expert group was identified and queried for possible alternatives to commercial gel. Responses included shampoo, oils, water, and cornstarch. Cornstarch, while a reasonable solution, was either not available or too expensive. We then sought deeper knowledge of locally sourced materials from local experts and market vendors in order to develop a similar product. Suggested solutions gleaned from these interviews were collected and used to create ultrasound gel, accounting for cost, image quality, and manufacturing capability. Initial prototypes used cassava root flour from the Great Lakes Region (DRC, Rwanda, Uganda, Tanzania) and West Africa, and bula from Ethiopia. Prototypes were tested in the field and the resulting images evaluated by our user group. A final prototype was then selected. Cassava or bula at 32 parts water, 8 parts flour, and 4 parts salt, heated, mixed, then cooled, was the product design of choice.

  19. Alternative Ultrasound Gel for a Sustainable Ultrasound Program: Application of Human Centered Design.

    Science.gov (United States)

    Salmon, Margaret; Salmon, Christian; Bissinger, Alexa; Muller, Mundenga Mutendi; Gebreyesus, Alegnta; Geremew, Haimanot; Wendel, Sarah K; Wendell, Sarah; Azaza, Aklilu; Salumu, Maurice; Benfield, Nerys

    2015-01-01

This paper describes the design of a low-cost ultrasound gel made from local products, applying aspects of the Human Centered Design methodology. A multidisciplinary team worked with clinicians who use ultrasound where commercial gel is cost-prohibitive and scarce. The team followed the format outlined in the IDEO toolkit. Research began by defining the challenge: "How to create a locally available alternative ultrasound gel for a low-resourced environment?" The end-users were identified as clinicians who use ultrasound in the Democratic Republic of the Congo and Ethiopia. An expert group was identified and queried for possible alternatives to commercial gel. Responses included shampoo, oils, water, and cornstarch. Cornstarch, while a reasonable solution, was either not available or too expensive. We then sought deeper knowledge of locally sourced materials from local experts and market vendors in order to develop a similar product. Suggested solutions gleaned from these interviews were collected and used to create ultrasound gel, accounting for cost, image quality, and manufacturing capability. Initial prototypes used cassava root flour from the Great Lakes Region (DRC, Rwanda, Uganda, Tanzania) and West Africa, and bula from Ethiopia. Prototypes were tested in the field and the resulting images evaluated by our user group. A final prototype was then selected. Cassava or bula at 32 parts water, 8 parts flour, and 4 parts salt, heated, mixed, then cooled, was the product design of choice.
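The final 32:8:4 (water:flour:salt) recipe scales linearly to any batch size. A minimal sketch, assuming "parts" are parts by volume (the abstract does not specify the unit):

```python
def gel_batch(total_ml, parts=(32, 8, 4)):
    """Split a total batch volume into (water, flour, salt) quantities
    according to the reported 32:8:4 recipe."""
    whole = sum(parts)                      # 44 parts in total
    return tuple(total_ml * p / whole for p in parts)

water, flour, salt = gel_batch(1100)        # 1100 ml -> 25 ml per part
# water = 800.0, flour = 200.0, salt = 100.0
```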

  20. Creativity and Innovation in Health Care: Tapping Into Organizational Enablers Through Human-Centered Design.

    Science.gov (United States)

    Zuber, Christi Dining; Moody, Louise

    There is an increasing drive in health care for creativity and innovation to tackle key health challenges, improve quality and access, and reduce harm and costs. Human-centered design (HCD) is a potential approach to achieving organizational innovation. However, research suggests the nursing workforce feels unsupported to take the risks needed for innovation, and leaders may not understand the conditions required to fully support them. The aim of this study was to identify enabling conditions that support frontline nurses in their attempts to behave as champions of innovation and change. An HCD workshop was undertaken with 125 nurses employed in clinical practice at Kaiser Permanente. The workshop included empathy mapping and semistructured questions that probed participant experiences with innovation and change. The data were collated and thematic analysis undertaken through a Grounded Theory approach. The data were analyzed to identify key enabling conditions. Seven enablers emerged: personal need for a solution; challenges that have meaningful purpose; clarity of goal and control of resources; active experimentation; experiences indicating progress; positive encouragement and confidence; and provision of psychological safety. These enablers were then translated into pragmatic guidelines for leaders on how the tools of HCD may be leveraged for innovation and change in health care.

  1. Technical, organizational and human-centered requirements for the purpose of accident management

    International Nuclear Information System (INIS)

    Berning, A.; Fassmann, W.; Preischl, W.

    1998-01-01

A catalog of ergonomic recommendations for organizational measures and the design of paper-based work aids for accident management situations in nuclear power plants was developed. Attention was given to providing recommendations that meet practical needs and are sufficiently flexible to allow plant-specific adaptation. A weight was assigned to each recommendation indicating its importance. The development of the recommendations was based on the state of the art in research and practical experience. Results from walk-through/talk-through experiments, training and exercises, discussions with on-site experts, and investigations of emergency manuals from German and foreign nuclear power plants were taken into account. The catalog is founded on a broad knowledge base covering the important aspects. It is intended for the qualitative evaluation and design of organizational measures and procedures, and shall assure their high quality. The project further provides an important contribution to the standardization of organizational and human-centered requirements concerning accident management procedures, and can thus contribute to the development of general regulations regarding the ergonomic design of accident management measures. (orig.)

  2. Human factors review of electric power dispatch control centers. Volume 4. Operator information needs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, R.J.; Najaf-Zadeh, K.; Darlington, H.T.; McNair, H.D.; Seidenstein, S.; Williams, A.R.

    1982-10-01

Human factors is a systems-oriented interdisciplinary specialty concerned with the design of systems, equipment, facilities, and the operational environment. An important aspect leading to the design requirements is the determination of the information requirements for electric power dispatch control centers. There are significant differences between the system operator's actions during normal and degraded states of power system operation and during power system restoration. This project evaluated the information the operator requires for normal power system and control system operations and investigated the changes in information required by the operator as the power system and/or the control system degrades from a normal operating state. The Phase II study, published in two volumes, defines power system states and control system conditions to which operator information content can be related. This volume presents detailed data concerning operator information needs, identifying the needs for and the uses of power system information by a system operator in conditions ranging from normal through degraded operation. It identifies the requisite information as consistent with current industry practice so as to aid control system designers. Training requirements are also included for planning entry-level and follow-on training for operators.

  3. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

    Science.gov (United States)

    2016-09-06

Alun Preece (Cardiff University; email: PreeceAD@cardiff.ac.uk) and coauthors, with affiliations including Emerging Technology Services, IBM United Kingdom Ltd, Hursley Park, Winchester, UK, and the US Army Research Laboratory.

  4. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

    2003-01-01

    A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around...... of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments....

  5. Annual report of Nuclear Human Resource Development Center. April 1, 2014 - March 31, 2015

    International Nuclear Information System (INIS)

    2017-06-01

This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year (FY) 2014. In FY 2014, we flexibly designed special training courses in response to external training needs while organizing the annually scheduled regular training programs. We also actively addressed challenging issues in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. Besides the regular courses, we organized special training courses based on external needs, e.g. for the Nuclear Regulation Authority or for the people of Naraha town in Fukushima Prefecture. JAEA continued its cooperative activities with universities. With respect to cooperation with the graduate school of The University of Tokyo, we accepted nuclear-major students and cooperatively conducted lectures and practical exercises for one year. Within the collaboration network with universities, a joint course was successfully held with six universities using the remote education system; an intensive summer course and a practical exercise at the Nuclear Fuel Cycle Engineering Laboratories were also conducted. Furthermore, JAEA re-signed the "Japan Nuclear Education Network" agreement with seven universities in February 2015 to allow the new participation of Nagoya University from FY 2015. Concerning international training, we continued to implement the Instructor Training Program (ITP) with annual sponsorship from the Ministry of Education, Culture, Sports, Science and Technology. In FY 2014, eight countries (Bangladesh, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Thailand and Vietnam) joined the instructor training courses, such as the "Reactor Engineering Course". Furthermore, we organized nuclear technology seminar courses, e.g. "Basic Radiation Knowledge for School Education". In respect of

  6. Developing Human-Computer Interface Models and Representation Techniques (Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  7. Radiological and Environmental Research Division, Center for Human Radiobiology. Annual report, July 1980-June 1981. [Lead abstract

    Energy Technology Data Exchange (ETDEWEB)

    1982-03-01

    Separate abstracts were prepared for the 22 papers of this annual report of the Center for Human Radiobiology. Abstracts were not written for 2 appendices which contain data on the exposure and radium-induced malignancies of 2259 persons whose radium content has been determined at least once. (KRM)

  8. Learning to Design Backwards: Examining a Means to Introduce Human-Centered Design Processes to Teachers and Students

    Science.gov (United States)

    Gibson, Michael R.

    2016-01-01

    "Designing backwards" is presented here as a means to utilize human-centered processes in diverse educational settings to help teachers and students learn to formulate and operate design processes to achieve three sequential and interrelated goals. The first entails teaching them to effectively and empathetically identify, frame and…

  9. A human centered GeoVisualization framework to facilitate visual exploration of telehealth data: a case study.

    Science.gov (United States)

    Joshi, Ashish; de Araujo Novaes, Magdala; Machiavelli, Josiane; Iyengar, Sriram; Vogler, Robert; Johnson, Craig; Zhang, Jiajie; Hsu, Chiehwen E

    2012-01-01

    Public health data is typically organized by geospatial units. Routine geographic monitoring of health data enables an understanding of the spatial patterns of events in terms of causes and controls. GeoVisualization (GeoVis) allows users to see information hidden both visually and explicitly on a map. Despite the applicability of GeoVis in public health, it is still underused for visualizing public health data. The objective of this study is to examine telehealth users' perceptions of utilizing GeoVis, as a proof of concept, to facilitate visual exploration of telehealth data in Brazil using principles of the human-centered approach and cognitive fit theory. A mixed-methods approach combining qualitative and quantitative assessments was utilized in this cross-sectional study conducted at the Telehealth Center of the Federal University of Pernambuco (NUTES-UFPE), Recife, Brazil. A convenience sample of 20 participants currently involved in NUTES was drawn during September-October 2011. Data were gathered using previously tested questionnaire surveys and in-person interviews. Socio-demographic information such as age, gender, prior education, and familiarity with the use of computers and GeoVis was gathered, along with participants' prior spatial analysis skills, level of motivation, and use of GeoVis in telehealth. All interviews, conducted in both English and Portuguese, were audio recorded, and the audio content was transcribed into English by a certified translator. Univariate analysis was performed; means and standard deviations are reported for continuous variables and frequency distributions for categorical variables. For the open-ended questions, we utilized a grounded-theory approach to identify themes and their relationships as they emerged from the data. Quantitative data were analyzed using SAS V9.1 and qualitative data using NVivo 9. The average age of participants was 28 years (SD=7), a

  10. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

    Science.gov (United States)

    Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

    2000-10-01

    In the 1980s, the visual display terminal (VDT) was introduced in workplaces in many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. More recently, the flat-panel display, or notebook personal computer (PC), has become the most remarkable feature of modern workplaces with VDTs, and even of homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems arising from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs, in terms of machine layout, workstation design, and lighting conditions, among others, differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee produced a set of guidelines for notebook PC use following exploratory discussions of its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction, under the auspices of the International Ergonomics Association, worked toward international issuance of the guidelines. This paper presents the result of this collaborative effort.

  11. The retention of health human resources in primary healthcare centers in Lebanon: a national survey.

    Science.gov (United States)

    Alameddine, Mohamad; Saleh, Shadi; El-Jardali, Fadi; Dimassi, Hani; Mourad, Yara

    2012-11-22

    Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector, considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit as well as the factors significantly associated with staff retention at PHC centers in Lebanon. A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/professional background, organizational/institutional characteristics, likelihood to quit, and level of professional burnout (using the Maslach Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Two out of five respondents indicated likelihood to quit their jobs within the next 1-3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR, targeted retention initiatives should focus on the young new recruits and allied health professionals. Particular attention should

  12. The retention of health human resources in primary healthcare centers in Lebanon: a national survey

    Directory of Open Access Journals (Sweden)

    Alameddine Mohamad

    2012-11-01

    Full Text Available Abstract Background Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector, considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit as well as the factors significantly associated with staff retention at PHC centers in Lebanon. Methods A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/professional background, organizational/institutional characteristics, likelihood to quit, and level of professional burnout (using the Maslach Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Results Two out of five respondents indicated likelihood to quit their jobs within the next 1–3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. Conclusions The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR, targeted retention initiatives should focus on the young new recruits

  13. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    Science.gov (United States)

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

  14. A hybrid approach to the computational aeroacoustics of human voice production

    Czech Academy of Sciences Publication Activity Database

    Šidlof, Petr; Zörner, S.; Huppe, A.

    2015-01-01

    Vol. 14, No. 3 (2015), pp. 473-488. ISSN 1617-7959. R&D Projects: GA ČR (CZ) GAP101/11/0207. Institutional support: RVO:61388998. Keywords: computational aeroacoustics; parallel CFD; human voice; vocal folds; ventricular folds. Subject RIV: BI - Acoustics. Impact factor: 3.032, year: 2015

  15. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient
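
    The publish/subscribe (P/S) architecture that the HCI^2 Workbench builds on decouples input, processing, and output modules: each module publishes typed messages to a shared bus and subscribes only to the topics it consumes. A minimal sketch of the pattern follows; the class, topic, and message names are illustrative assumptions, not the HCI^2 API.

    ```python
    # Minimal publish/subscribe sketch: modules communicate only via topics
    # on a message bus, never by referencing each other directly.
    from collections import defaultdict

    class MessageBus:
        """Routes messages from publishers to subscribers by topic."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:
                callback(message)

    # Example: a (hypothetical) gaze tracker publishes events; a fusion
    # module consumes them without knowing who produced them.
    bus = MessageBus()
    received = []
    bus.subscribe("gaze", received.append)
    bus.publish("gaze", {"x": 120, "y": 45})
    ```

    In a real MHCI system the bus would span processes and carry timestamped sensor messages; the point of the sketch is only the decoupling that makes modules independently replaceable.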

  16. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI^2 Framework, is built upon a publish/subscribe (P/S) architecture. It implements a

  17. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

    Science.gov (United States)

    The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

  18. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The interaction between social constructs and technology is then discussed. Following this, the…

  19. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    Science.gov (United States)

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  20. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    Science.gov (United States)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  1. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  2. Human brain as the model of a new computer system. II

    Energy Technology Data Exchange (ETDEWEB)

    Holtz, K; Langheld, E

    1981-12-09

    For Pt. I see ibid., vol. 29, no. 22, p. 13 (1981). The authors describe the self-generating system of connections of a self-teaching, program-free associative computer. These self-generating systems of connections are regarded as simulation models of the human brain and are compared with the brain structure. The system hardware comprises a microprocessor, PROM, memory, a VDU, and a keyboard unit.

  3. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of plant operation to operators through cartoon-like, colored graphs in the form of faces whose different facial expressions show the plant safety status. (orig.)
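
    The face-based display described above maps plant safety variables onto facial features so operators can read global status at a glance, in the spirit of Chernoff faces. A minimal sketch of such a mapping, with the thresholds, feature names, and the single safety index all invented for illustration (the paper's actual display parameters are not specified here):

    ```python
    # Hypothetical mapping from a plant safety index (0 = critical,
    # 1 = normal) to facial-expression parameters.
    def face_parameters(safety_index):
        if safety_index >= 0.8:
            mood = "smile"        # nominal operation
        elif safety_index >= 0.5:
            mood = "neutral"      # degraded but tolerable
        else:
            mood = "frown"        # operator attention required
        return {
            "mouth": mood,
            # Wider eyes signal increasing urgency.
            "eye_openness": round(1.0 - safety_index, 2),
        }

    print(face_parameters(0.9))
    print(face_parameters(0.3))
    ```

    The design rationale is that a face is processed preattentively, so a degrading trend is noticed faster than a drifting number on a panel.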

  4. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  5. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
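
    The normatively optimal policy in a task of this shape is computed by backward induction: the value of each state is the best achievable probability of ending above the starvation threshold, evaluated over all remaining probabilistic branches. The sketch below illustrates that structure for a five-trial block with one safe and one risky option; the payoffs, probabilities, and threshold are invented for illustration and are not the paper's task parameters.

    ```python
    # Backward-induction sketch: reward depends only on whether final
    # energy reaches a threshold after five sequential choices.
    import functools

    SAFE_GAIN = 1                    # deterministic option: always +1 energy
    RISKY = [(0.5, 3), (0.5, 0)]     # gamble: +3 or +0, each with p = 0.5
    THRESHOLD = 6                    # final energy needed to avoid "starvation"

    @functools.lru_cache(maxsize=None)
    def survival_prob(trials_left, energy):
        """P(final energy >= THRESHOLD) under the optimal policy."""
        if trials_left == 0:
            return 1.0 if energy >= THRESHOLD else 0.0
        safe = survival_prob(trials_left - 1, energy + SAFE_GAIN)
        risky = sum(p * survival_prob(trials_left - 1, energy + gain)
                    for p, gain in RISKY)
        return max(safe, risky)      # optimal policy takes the better branch

    # Five safe choices yield only 5 energy, so the optimal policy must
    # gamble at least once; backward induction decides exactly when.
    print(survival_prob(5, 0))
    ```

    An easy-to-compute heuristic, by contrast, would score each option only by its immediate expected gain; the discrepancy between the two valuations is the quantity the study relates to reaction times and dorsal MPFC activity.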

  6. Computer-based personality judgments are more accurate than those made by humans

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  7. Computer-based personality judgments are more accurate than those made by humans.

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.
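
    The accuracy comparison in the study above rests on correlating each judge's trait ratings with a criterion (here, self-report questionnaire scores): the computer model's Pearson r (0.56) exceeded that of the participants' friends (0.49). A toy illustration of that comparison, with all ratings invented (they are not the study's data):

    ```python
    # Compare two judges by Pearson correlation with a criterion.
    from statistics import mean

    def pearson_r(xs, ys):
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        varx = sum((x - mx) ** 2 for x in xs)
        vary = sum((y - my) ** 2 for y in ys)
        return cov / (varx * vary) ** 0.5

    self_report = [3.2, 4.1, 2.5, 3.8, 4.6, 2.9]   # criterion trait scores
    computer    = [3.0, 4.3, 2.6, 3.5, 4.4, 3.1]   # model predictions
    friend      = [3.9, 3.6, 2.2, 4.2, 4.0, 3.3]   # human judge ratings

    print(pearson_r(self_report, computer) > pearson_r(self_report, friend))
    ```

    In the study this comparison is run per trait across tens of thousands of participants; the sketch only shows the shape of the accuracy criterion.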

  8. SCELib2: the new revision of SCELib, the parallel computational library of molecular properties in the single center approach

    Science.gov (United States)

    Sanna, N.; Morelli, G.

    2004-09-01

    In this paper we present the new version of the SCELib program (CPC catalogue identifier ADMG), a full numerical implementation of the Single Center Expansion (SCE) method. The physics involved is that of producing the SCE description of molecular electronic densities, of molecular electrostatic potentials, and of molecular perturbed potentials due to a negative or positive point charge. This new revision of the program has been optimized to run in serial as well as in parallel execution mode, to support a larger set of molecular symmetries, and to permit the restart of long-lasting calculations. To measure the performance of this new release, a comparative study has been carried out on the most powerful computing architectures in serial and parallel runs. The results of the calculations reported in this paper refer to real cases of medium to large molecular systems, and they are reported in full detail to best benchmark the parallel architectures the new SCELib code will run on. Program summary: Title of program: SCELib2. Catalogue identifier: ADGU. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADGU. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Reference to previous versions: Comput. Phys. Commun. 128 (2) (2000) 139 (CPC catalogue identifier: ADMG). Does the new version supersede the original program?: Yes. Computers for which the program is designed and others on which it has been tested: HP ES45 and rx2600, SUN ES4500, IBM SP, and any single-CPU workstation based on Alpha, SPARC, POWER, Itanium2 and X86 processors. Installations: CASPUR, local. Operating systems under which the program has been tested: HP Tru64 V5.X, SunOS V5.8, IBM AIX V5.X, Linux RedHat V8.0. Programming language used: C. Memory required to execute with typical data: 10 Mwords, up to 2000 Mwords depending on the molecular system and runtime parameters. No. of bits in a word: 64. No. of processors used: 1 to 32. Has the code been vectorized or parallelized?: Yes

  9. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts to find multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts to conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of its simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.
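
    The predefined links between contextual and causal factors described above amount to a directed graph that the analyst traverses to reach multiple levels of error causes. A minimal sketch of that traversal; the factor names and links are invented for illustration and are not CAS-HEAR's actual taxonomy.

    ```python
    # Hypothetical cause-link graph: edges point from an observed factor
    # to its candidate deeper causes.
    CAUSE_LINKS = {
        "signal passed at danger": ["attention lapse", "signal visibility"],
        "attention lapse": ["fatigue", "workload"],
        "signal visibility": ["maintenance procedures"],
        "fatigue": ["shift scheduling"],
    }

    def trace_causes(factor, depth=0, seen=None):
        """Depth-first walk over the link graph, yielding (depth, cause) pairs."""
        seen = seen or set()
        for cause in CAUSE_LINKS.get(factor, []):
            if cause not in seen:          # guard against cyclic links
                seen.add(cause)
                yield depth + 1, cause
                yield from trace_causes(cause, depth + 1, seen)

    for level, cause in trace_causes("signal passed at danger"):
        print("  " * level + cause)
    ```

    Precomputing the links lets the tool prompt the analyst with plausible deeper causes at each level instead of relying on unaided recall, which is where much of the time saving comes from.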

  10. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    Science.gov (United States)

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure have rarely been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology, using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.
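
    The core construction behind such a network is a projection of a bipartite chemical-disease graph onto diseases: two diseases become linked when they share at least one associated chemical, with the link weight growing with the number of shared chemicals. A minimal sketch, using a few chemical-disease pairs mentioned in the abstract plus invented ones to complete the example (the weights and most links are illustrative, not the EDN's data):

    ```python
    # Project chemical -> disease links into weighted disease-disease links.
    from itertools import combinations
    from collections import Counter

    chemical_to_diseases = {
        "bisphenol A": {"behavioral disorders", "type 2 diabetes"},
        "chlordane": {"olfactory alteration", "type 2 diabetes"},
        "arsenic": {"type 2 diabetes", "skin lesions"},
        "cadmium": {"type 2 diabetes", "skin lesions"},
    }

    disease_links = Counter()
    for diseases in chemical_to_diseases.values():
        for a, b in combinations(sorted(diseases), 2):
            disease_links[(a, b)] += 1   # weight = number of shared chemicals

    for pair, weight in disease_links.most_common():
        print(pair, weight)
    ```

    The actual EDN additionally scores each edge by the level of evidence of the underlying toxicant-disease reports; a raw shared-chemical count is the simplest version of that weighting.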

  11. Annual report of Nuclear Human Resource Development Center. April 1, 2011 - March 31, 2012

    International Nuclear Information System (INIS)

    2013-11-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year 2011. In this fiscal year, we flexibly designed and conducted training courses in response to outside needs while running the annually scheduled training programs, and also actively addressed challenges in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. The number of trainees who completed the domestic training courses in 2011 increased to 387, 14 percent more than in the previous year. In response to the Tokyo Electric Power Company (TEPCO) Fukushima No. 1 nuclear power plant accident, we also newly designed and organized special training courses on radiation surveys for the subcontracting companies working with TEPCO, and training courses on decontamination work for construction companies in Fukushima prefecture. The total number of attendees in these special courses was 3,800. JAEA continued its cooperative activities with universities. Regarding cooperation with the graduate school of the University of Tokyo, we accepted 17 students and cooperatively conducted practical exercises for nuclear majors. We also continued cooperation on practical exercises for students of universities participating in the Nuclear HRD Program. Within the collaboration network with universities, a joint course was held with six universities using the remote education system. In addition, an intensive course at Okayama University and a practical exercise at the Nuclear Fuel Cycle Engineering Laboratories of JAEA were conducted. Regarding international training, NuHRDeC continued to implement the Instructor Training Program (ITP) with annual sponsorship from MEXT. In fiscal year 2011, seven countries (i.e. Bangladesh

  12. Annual report of Nuclear Human Resource Development Center. April 1, 2013 - March 31, 2014

    International Nuclear Information System (INIS)

    2015-07-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in FY2013. In FY2013, we flexibly designed special training courses in response to outside training needs while organizing the annually scheduled regular training programs. We also actively addressed challenging issues in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. More than 300 trainees participated in the domestic regular training courses in 2013. Besides these regular courses, we organized special training courses based on outside needs, e.g. training courses on radiation surveys and decontamination work in Fukushima prefecture for the subcontracting companies of the Tokyo Electric Power Company (TEPCO) working in response to the TEPCO Fukushima Daiichi nuclear power station accident. JAEA continued its cooperative activities with universities. Regarding cooperation with the graduate school of the University of Tokyo, we accepted nuclear-major students and cooperatively conducted lectures and practical exercises for one year. Within the collaboration network with universities, a joint course was successfully held with six universities using the remote education system. In addition, intensive courses at Okayama University and the University of Fukui, and a practical exercise at the Nuclear Fuel Cycle Engineering Laboratories of JAEA, were conducted. Regarding international training, we continued to implement the Instructor Training Program (ITP) with annual sponsorship from the Ministry of Education, Culture, Sports, Science and Technology. In fiscal year 2013, eight countries (i.e. Bangladesh, Indonesia, Kazakhstan, Malaysia, Mongolia, Philippines, Thailand, Vietnam) joined the instructor training courses. Furthermore, we organized nuclear

  13. Annual report of Nuclear Human Resource Development Center. April 1, 2010 - March 31, 2011

    International Nuclear Information System (INIS)

    2012-03-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year 2010. In this fiscal year, NuHRDeC flexibly designed and conducted training courses upon request while running the annually scheduled training programs, and actively addressed challenges in human resource development, such as enhancing collaboration with academia and expanding the number of countries participating in international training. The number of trainees who completed the domestic training courses in 2010 increased slightly to 340, 6 percent more than in the previous year. The number of those who completed the staff technical training courses was 879 in 2010, 12 percent more than in the previous year. As a result, the total number of trainees during this period was about 10 percent higher than in the previous year. To meet needs from outside JAEA, four temporary courses were held at the request of the Nuclear and Industrial Safety Agency (NISA) of the Ministry of Economy, Trade and Industry (METI). JAEA continued its cooperative activities with universities: cooperation with the graduate school of the University of Tokyo continued, and the cooperative graduate school program was enlarged to cover a total of 19 graduate schools, one undergraduate faculty, and one technical college, including one graduate school newly joined in 2010. JAEA also continued cooperative activities under the Nuclear HRD Program initiated by MEXT and METI in 2007. The joint course continued networking with six universities through the remote education system, the Japan Nuclear Education Network (JNEN), and special lectures and summer and winter practice sessions were also conducted. Regarding international training, NuHRDeC continued to implement the Instructor Training Program (ITP) with annual sponsorship from MEXT. In fiscal year 2010, four countries (Bangladesh

  14. Annual report of Nuclear Human Resource Development Center. April 1, 2012 - March 31, 2013

    International Nuclear Information System (INIS)

    2014-03-01

    This annual report summarizes the activities of the Nuclear Human Resource Development Center (NuHRDeC) of the Japan Atomic Energy Agency (JAEA) in fiscal year 2012. In this fiscal year, we flexibly designed training courses in response to outside needs while organizing the annually scheduled training programs, and also actively addressed challenging issues in human resource development, such as enhancing collaboration with academia and organizing international training for Asian countries. The number of trainees who completed the domestic training courses in 2012 increased to 525, 30 percent more than in the previous year. In response to the Tokyo Electric Power Company (TEPCO) Fukushima No. 1 nuclear power plant accident, we also organized special training courses on radiation surveys for the subcontracting companies working with TEPCO, and training courses on decontamination work for construction companies in Fukushima prefecture. The total number of attendees in these special courses was more than 4,000. JAEA continued its cooperative activities with universities. Regarding cooperation with the graduate school of the University of Tokyo, we accepted 14 students and cooperatively conducted practical exercises for nuclear majors. We also continued cooperation on practical exercises for students of universities participating in the Nuclear HRD Program. Within the collaboration network with universities, a joint course was held with six universities using the remote education system. In addition, intensive courses at Okayama University and Fukui University, and a practical exercise at the Nuclear Fuel Cycle Engineering Laboratories of JAEA, were conducted. Regarding international training, NuHRDeC continued to implement the Instructor Training Program (ITP) with annual sponsorship from MEXT. In fiscal year 2012, eight countries (i

  15. New human-centered linear and nonlinear motion cueing algorithms for control of simulator motion systems

    Science.gov (United States)

    Telban, Robert J.

    While the performance of flight simulator motion system hardware has advanced substantially, development of the motion cueing algorithm, the software that transforms simulated aircraft dynamics into realizable motion commands, has not kept pace. To address this, new human-centered motion cueing algorithms were developed. A revised "optimal algorithm" uses time-invariant filters developed by optimal control and incorporates human vestibular system models. The "nonlinear algorithm" is a novel approach that is also formulated by optimal control but can be updated in real time; it incorporates a new integrated visual-vestibular perception model that includes both visual and vestibular sensation and the interaction between the stimuli. Its time-varying control law requires the matrix Riccati equation to be solved in real time by a neurocomputing approach. In preliminary pilot testing, the optimal algorithm with a new otolith model produced improved motion cues. The nonlinear algorithm's vertical mode produced a motion cue with a time-varying washout, sustaining small cues for longer durations and washing out large cues more quickly than the optimal algorithm. The integrated perception model improved the responses to longitudinal and lateral cues, and the false cues observed with the NASA adaptive algorithm were absent. Because the resulting sensation was unsatisfactory, an augmented turbulence cue was added to the vertical mode of both the optimal and nonlinear algorithms. The relative effectiveness of the algorithms in simulating aircraft maneuvers was assessed in an eleven-subject piloted performance test conducted on the NASA Langley Visual Motion Simulator (VMS). Two methods, the quasi-objective NASA Task Load Index (TLX) and power spectral density analysis of pilot control, were used to assess pilot workload. TLX analysis reveals, in most cases, less workload and less variation among pilots with the nonlinear algorithm.
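The abstract above describes cueing algorithms that turn simulated aircraft accelerations into motion-base commands whose sustained components are "washed out." As a minimal illustration of that core idea (not the paper's optimal or nonlinear algorithms), the sketch below applies a classical first-order high-pass washout filter to a step in acceleration; the update rate and time constant are illustrative assumptions, not values from the study.

```python
# Minimal sketch of a classical first-order washout (high-pass) filter,
# the building block that human-centered cueing algorithms refine.
# dt (update interval) and tau (washout time constant) are illustrative.

def washout_filter(accel, dt=0.01, tau=1.0):
    """High-pass filter a sequence of accelerations (m/s^2) so that
    sustained components wash out and only onset cues are commanded."""
    alpha = tau / (tau + dt)  # discrete-time high-pass coefficient
    out = []
    prev_in = prev_out = 0.0
    for a in accel:
        y = alpha * (prev_out + a - prev_in)  # standard high-pass recursion
        out.append(y)
        prev_in, prev_out = a, y
    return out

# A sustained 2 m/s^2 step input: the commanded cue starts near the full
# acceleration and decays toward zero as the sustained part washes out.
cue = washout_filter([2.0] * 500)
```

Sustained cues decaying to zero is exactly what keeps the motion base within its travel limits; the paper's contribution is shaping that decay using vestibular perception models rather than fixed filters like this one.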

  16. THE NATIONAL CENTER FOR RADIOECOLOGY: A NETWORK OF EXCELLENCE FOR ENVIRONMENTAL AND HUMAN RADIATION RISK REDUCTION

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T.

    2013-01-09

    Radioecology in the United States can be traced back to the early 1950s, when small research programs were established to address the fate and effects of radionuclides released into the environment from activities at nuclear facilities. These programs focused primarily on local environmental effects, but global radioactive fallout from nuclear weapons testing and the potential for larger scale local releases of radioisotopes resulted in major concerns about the threat, not only to humans, but to other species and to the ecosystems that support all life. These concerns were shared by other countries, and it was quickly recognized that a multi-disciplinary approach would be required to address and understand the implications of anthropogenic radioactivity in the environment. The management, clean-up, and long-term monitoring of legacy wastes at Department of Energy (DOE), Department of Defense (DOD), and Nuclear Regulatory Commission (NRC)-regulated facilities will remain a concern as long as nuclear operations continue. Research conducted through radioecology programs provides the credible scientific data needed for decision-making purposes. Radioecology programs in the United States are currently: fragmented, with little coordination to identify national strategies and direct programs; suffering from a steadily decreasing funding base; soon to be hampered by the closure of key infrastructure; hampered by an aging and retiring workforce and the resulting loss of technical expertise; and in need of trained young scientists to ensure continuation of the science, as no formal graduate education program in radioecology remains in the U.S. With these concerns in mind, the Savannah River National Laboratory (SRNL) took the lead in establishing the National Center for Radioecology (NCoRE) as a network of excellence of the remaining radioecology expertise in the United States.
As part of the NCoRE mission, scientists at SRNL are working with six key partner universities to re-establish a graduate education program in radioecology.

  17. Cognitive engineering in the design of human-computer interaction and expert systems

    International Nuclear Information System (INIS)

    Salvendy, G.

    1987-01-01

    The 68 papers in this book cover the following areas: Theories of Interface Design; Methodologies of Interface Design; Applications of Interface Design; Software Design; Human Factors in Speech Technology and Telecommunications; Design of Graphic Dialogues; Knowledge Acquisition for Knowledge-Based Systems; and Design, Evaluation and Use of Expert Systems. Together they demonstrate the dual role of cognitive engineering. On the one hand, cognitive engineering is used to design computing systems that are compatible with human cognition and can be used effectively and easily by all individuals. On the other hand, it is used to transfer human cognition into the computer in order to build expert systems. Two papers are of interest to INIS.

  18. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for doing human factors research with nonhuman animals. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one may use performance by nonhuman animals as a surrogate for, or model of, performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a statistically significant but practically inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of such findings to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that, ultimately, the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  19. The role of beliefs in lexical alignment: evidence from dialogs with humans and computers.

    Science.gov (United States)

    Branigan, Holly P; Pickering, Martin J; Pearson, Jamie; McLean, Janet F; Brown, Ash

    2011-10-01

    Five experiments examined the extent to which speakers' alignment (i.e., convergence) on words in dialog is mediated by beliefs about their interlocutor. To do this, we told participants that they were interacting with another person or a computer in a task in which they alternated between selecting pictures that matched their 'partner's' descriptions and naming pictures themselves (though in reality all responses were scripted). In both text- and speech-based dialog, participants tended to repeat their partner's choice of referring expression. However, they showed a stronger tendency to align with 'computer' than with 'human' partners, and with computers that were presented as less capable than with computers that were presented as more capable. The tendency to align therefore appears to be mediated by beliefs, with the relevant beliefs relating to an interlocutor's perceived communicative capacity.
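The dependent measure in a study like this is the proportion of trials on which the participant reuses the scripted partner's referring expression, compared across believed partner types. A minimal sketch of that tally is below; the trial tuples and term names are invented for illustration and chosen so the 'computer' condition shows more alignment, mirroring the reported direction of the effect.

```python
# Hypothetical tally of lexical alignment rates by believed partner type.
# Each trial records (believed_partner, scripted_term, participant_term);
# all data here are invented for illustration.
from collections import defaultdict

trials = [
    ("computer", "bunny", "bunny"),
    ("computer", "couch", "couch"),
    ("computer", "bunny", "bunny"),
    ("human",    "bunny", "rabbit"),
    ("human",    "couch", "sofa"),
    ("human",    "bunny", "bunny"),
]

def alignment_rate(trials):
    """Return, per partner type, the fraction of trials where the
    participant reused the scripted referring expression."""
    counts = defaultdict(lambda: [0, 0])  # partner -> [aligned, total]
    for partner, scripted, produced in trials:
        counts[partner][1] += 1
        if produced == scripted:
            counts[partner][0] += 1
    return {p: aligned / total for p, (aligned, total) in counts.items()}

rates = alignment_rate(trials)
```

With the invented data above, the 'computer' condition yields a higher alignment rate than the 'human' condition, which is the comparison the experiments test statistically.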

  20. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. 
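The abstract describes global workspace theory's central mechanism: coalitions of processes compete on activation, the winner is broadcast globally, and action selection responds to the broadcast. The toy sketch below illustrates that cycle only; it is not the LIDA codebase, and all names and activation values are invented.

```python
# Toy sketch of one global-workspace-style cognitive cycle (after Baars):
# coalitions compete on activation, the winner is broadcast, and action
# selection reads the broadcast. Names and numbers are illustrative.

def cognitive_cycle(coalitions, actions):
    """coalitions: dict of coalition name -> activation level.
    actions: dict of action name -> coalition that triggers it.
    Returns (broadcast coalition, selected action or None)."""
    # Competition for consciousness: highest-activation coalition wins.
    broadcast = max(coalitions, key=coalitions.get)
    # Global broadcast: every module (here, just action selection) sees it.
    matching = [a for a, trigger in actions.items() if trigger == broadcast]
    return broadcast, matching[0] if matching else None

coalitions = {"obstacle-ahead": 0.9, "low-battery": 0.4}
actions = {"swerve": "obstacle-ahead", "recharge": "low-battery"}
broadcast, action = cognitive_cycle(coalitions, actions)
```

In LIDA the cycle is far richer (perceptual memory, attention codelets, learning), but the bottom-up competition feeding a top-down broadcast shown here is the mechanism the abstract says also carries moral decision making.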
We will describe how the LIDA model helps integrate emotions into the human decision-making process.