WorldWideScience

Sample records for human centered computing

  1. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  2. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work, which is characterized by an extreme degree of mobility, many interruptions, ad hoc … objects. We also present an exploratory prototype design and first implementation, along with some initial results from evaluations in a healthcare environment…

  3. Human-Centered Software Engineering: Software Engineering Architectures, Patterns, and Models for Human Computer Interaction

    Science.gov (United States)

    Seffah, Ahmed; Vanderdonckt, Jean; Desmarais, Michel C.

    The Computer-Human Interaction and Software Engineering (CHISE) series of edited volumes originated from a number of workshops and discussions over the latest research and developments in the field of Human Computer Interaction (HCI) and Software Engineering (SE) integration, convergence and cross-pollination. A first volume in this series (CHISE Volume I - Human-Centered Software Engineering: Integrating Usability in the Development Lifecycle) aims at bridging the gap between the field of SE and HCI, and addresses specifically the concerns of integrating usability and user-centered systems design methods and tools into the software development lifecycle and practices. This has been done by defining techniques, tools and practices that can fit into the entire software engineering lifecycle as well as by defining ways of addressing the knowledge and skills needed, and the attitudes and basic values that a user-centered development methodology requires. The first volume has been edited as Vol. 8 in the Springer HCI Series (Seffah, Gulliksen and Desmarais, 2005).

  4. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of DSN, and monitoring all multi-mission spacecraft tracking activities in real-time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements for the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  5. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    2016-06-13

    Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. These studies have been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make the right decisions. There has been significant research growth in the computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements for assessing system outputs and the extent to which human judgments can be utilized to make accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment; the objectives of this position paper are to bring our research ideas to the workshop and to share and discuss them.

  6. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  7. User participation in the development of the human/computer interface for control centers

    Science.gov (United States)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the Earth Observing System (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  8. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  9. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  10. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology, by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these…

  11. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional Laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to…

  12. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...
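
    A common way to make such distributed human processing reliable is to hand the same task to several people and aggregate their answers. As an illustration only (this scheme is not spelled out in the abstract, and all names here are hypothetical), a minimal majority-vote aggregator in Python:

    ```python
    from collections import Counter

    def aggregate_labels(answers, min_agreement=0.5):
        """Majority-vote aggregation of redundant human answers for one task.
        Returns the winning label, or None when agreement is too weak to trust."""
        if not answers:
            return None
        label, votes = Counter(answers).most_common(1)[0]
        return label if votes / len(answers) > min_agreement else None

    # Three players tag the same image, ESP-Game style.
    print(aggregate_labels(["dog", "dog", "puppy"]))  # -> dog
    print(aggregate_labels(["dog", "cat"]))           # -> None (no majority)
    ```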

  13. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at the ACM SIGCHI 2013 Conference on Human Factors in Computing Systems (also known as CHI), held April 27-May 2, 2013 at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and HCI communities closer together.

  14. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color-graphics capabilities. Describes the documentation, presentation, and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  15. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  17. User-Centered Computer Aided Language Learning

    Science.gov (United States)

    Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.

    2006-01-01

    In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…

  18. Human Performance Research Center

    Data.gov (United States)

    Federal Laboratory Consortium — Biochemistry: Improvements in energy metabolism, muscular strength and endurance capacity have a basis in biochemical and molecular adaptations within the human body…

  19. Human-Centered Aspects

    NARCIS (Netherlands)

    Kulyk, O.; Kosara, R.; Urquiza, J.; Wassink, I.; Kerren, A.; Ebert, A.; Meyer, J.

    2007-01-01

    Humans have remarkable perceptual capabilities. These capabilities are heavily underestimated in current visualizations. Often, this is due to the lack of an in-depth user study to set the requirements for optimal visualizations. The designer does not understand what kind of information should be visualized…

  20. Ubiquitous Human Computing

    OpenAIRE

    Zittrain, Jonathan L.

    2008-01-01

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a thumb tack and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This short essay explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  1. Computer Networks and African Studies Centers.

    Science.gov (United States)

    Kuntz, Patricia S.

    The use of electronic communication in the 12 Title VI African Studies Centers is discussed, and the networks available for their use are reviewed. It is argued that the African Studies Centers should be on the cutting edge of contemporary electronic communication and that computer networks should be a fundamental aspect of their programs. An…

  2. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

    Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world…

  3. Human-Centered Design Capability

    Science.gov (United States)

    Fitts, David J.; Howard, Robert

    2009-01-01

    For NASA, human-centered design (HCD) seeks opportunities to mitigate the challenges of living and working in space in order to enhance human productivity and well-being. Direct design participation during the development stage is difficult; however, during project formulation, an HCD approach can lead to better, more cost-effective products. HCD can also help a program enter the development stage with a clear vision for product acquisition. HCD tools for clarifying design intent are listed. To infuse HCD into the spaceflight lifecycle, the Space and Life Sciences Directorate developed the Habitability Design Center. The Center has collaborated successfully with program and project design teams and with JSC's Engineering Directorate. This presentation discusses HCD capabilities and depicts the Center's design examples and capabilities.

  4. Computational Center for Studies of Plasma Microturbulence

    Energy Technology Data Exchange (ETDEWEB)

    William Dorland

    2006-10-11

    The Maryland Computational Center for Studies of Microturbulence (CCSM) was one component of a larger, multi-institutional Plasma Microturbulence Project, funded through what eventually became DOE's Scientific Discovery Through Advanced Computing Program. The primary focus of research in CCSM was to develop, deploy, maintain, and utilize kinetic simulation techniques, especially the gyrokinetic code called GS2.

  5. Plug Pulled on Chemistry Computer Center.

    Science.gov (United States)

    Robinson, Arthur L.

    1980-01-01

    Discusses the controversy surrounding the initial decision to establish, and the current decision to phase out, the National Resource for Computation in Chemistry (NRCC), a computational chemistry center jointly sponsored by the National Science Foundation and the Department of Energy. (CS)

  6. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the broad field of accelerator science based on high-energy accelerators. Within this organization, the Applied Research Laboratory is composed of four centers that support research activities common to the organization and carry out related research and development (R&D), integrating the four previous centers and their related sections in Tanashi. This support encompasses not only general assistance but also the preparation and R&D of systems required for promoting the research and its future plans. Computer technology is essential to the development of this research and can be shared across the organization's many research programs. To meet these expectations, the new Computing Research Center is required to promote its duties by working in cooperation with researchers, ranging from R&D on data analysis for various experiments to computational physics driven by powerful computing capacity such as supercomputers. The first chapter reports on the work and present state of the Data Processing Center of KEK, the second chapter on the computer room of INS, and the report closes with future problems for the Computing Research Center. (G.K.)

  7. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

    Computational human body models are widely used for automotive crash-safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently, crash simulations are mainly performed using models based on crash dummies. However, crash dummies differ…

  9. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  10. Hibbing Community College's Community Computer Center.

    Science.gov (United States)

    Regional Technology Strategies, Inc., Carrboro, NC.

    This paper reports on the development of the Community Computer Center (CCC) at Hibbing Community College (HCC) in Minnesota. HCC is located in the largest iron mining area in the United States. Closures of steel-producing plants are affecting the Hibbing area. Outmigration, particularly of younger workers and their families, has been…

  11. Human-Centered Information Fusion

    CERN Document Server

    Hall, David L

    2010-01-01

    Information fusion refers to the merging of information from disparate sources with differing conceptual, contextual and typographical representations. Rather than focusing on traditional data fusion applications, which have been mainly concerned with physical military targets, this unique resource explores new human-centered trends, such as locations, identity, and interactions of individuals and groups (social networks). Moreover, the book discusses two new major sources of information: human observations and web-based information. This cutting-edge volume presents a new view of multi-sensor data…

  12. Telemetry Computer System at Wallops Flight Center

    Science.gov (United States)

    Bell, H.; Strock, J.

    1980-01-01

    This paper describes the Telemetry Computer System in operation at NASA's Wallops Flight Center for real-time or off-line processing, storage, and display of telemetry data from rockets and aircraft. The system accepts one or two PCM data streams and one FM multiplex, converting each type of data into computer format and merging time-of-day information. A data compressor merges the active streams, and removes redundant data if desired. Dual minicomputers process data for display, while storing information on computer tape for further processing. Real-time displays are located at the station, at the rocket launch control center, and in the aircraft control tower. The system is set up and run by standard telemetry software under control of engineers and technicians. Expansion capability is built into the system to take care of possible future requirements.

  13. Handbook of human computation

    CERN Document Server

    Michelucci, Pietro

    2013-01-01

    This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Readers…

  14. Computer Bits: The Ideal Computer System for Your Center.

    Science.gov (United States)

    Brown, Dennis; Neugebauer, Roger

    1986-01-01

    Reviews five computer systems that can address the needs of a child care center: (1) Sperry PC IT with Bernoulli Box, (2) Compaq DeskPro 286, (3) Macintosh Plus, (4) Epson Equity II, and (5) Leading Edge Model "D." (HOD)

  15. Energy Consumption in Cloud Computing Data Centers

    Directory of Open Access Journals (Sweden)

    Uchechukwu Awada

    2014-06-01

    The implementation of cloud computing has established computing as a utility and enables pervasive applications from scientific, consumer and business domains. However, this implementation faces tremendous energy consumption, carbon dioxide emission and associated cost concerns. With energy consumption becoming a key issue for the operation and maintenance of cloud data centers, cloud computing providers are becoming profoundly concerned. In this paper, we present formulations and solutions for Green Cloud Environments (GCE) to minimize their environmental impact and energy consumption under new models by considering static and dynamic portions of cloud components. Our proposed methodology captures cloud computing data centers and presents a generic model for them. To implement this objective, an in-depth knowledge of energy consumption patterns in cloud environments is necessary. We investigate energy consumption patterns and show that by applying suitable optimization policies directed through our energy consumption models, it is possible to save 20% of energy consumption in cloud data centers. Our research results can be integrated into cloud computing systems to monitor energy consumption and support static and dynamic system-level optimization.
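
    The abstract splits consumption into static and dynamic portions of cloud components. The paper's actual formulations are not reproduced here; the following is a minimal sketch, assuming a generic linear server-power model with hypothetical p_idle/p_peak parameters, of how that decomposition exposes savings from load consolidation:

    ```python
    def server_power(utilization, p_idle=100.0, p_peak=250.0):
        """Power draw in watts: a static idle portion plus a dynamic
        portion proportional to CPU utilization in [0, 1]."""
        return p_idle + (p_peak - p_idle) * utilization

    def total_energy(utilizations, hours=1.0):
        """Energy in watt-hours for a set of powered-on servers."""
        return sum(server_power(u) for u in utilizations) * hours

    # Spreading load thinly wastes static power; consolidating onto
    # fewer servers (and switching the rest off) recovers it.
    spread = total_energy([0.2] * 10)   # 10 servers at 20% load
    packed = total_energy([1.0, 1.0])   # 2 servers at 100% load
    print(f"savings: {(spread - packed) / spread:.0%}")
    ```

    The exact savings depend on the idle-to-peak power ratio and the optimization policy; the paper itself reports savings of about 20% under its models.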

  16. Los Alamos Center for Computer Security formal computer security model

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.S.; Hunteman, W.J.; Markin, J.T.

    1989-01-01

    This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The need to test and verify DOE computer security policy implementation first motivated this effort. The actual analytical model was a result of the integration of current research in computer security and previous modeling and research experiences. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present formal mathematical models for computer security. The fundamental objective of computer security is to prevent the unauthorized and unaccountable access to a system. The inherent vulnerabilities of computer systems result in various threats from unauthorized access. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The model is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell and LaPadula abstract sets of objects and subjects. 6 refs.
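
    The abstract characterizes the model only as "a series of functionally dependent probability equations" without giving them. Purely as an illustration of that modeling style, and not the actual CCS formulation, one might express the probability of unauthorized access as the complement of every independent threat class failing:

    ```python
    def p_unauthorized_access(threat_probs):
        """P(at least one threat succeeds) for independent threat classes.
        Illustrative only; not the Los Alamos CCS equations."""
        p_all_fail = 1.0
        for p in threat_probs:
            p_all_fail *= 1.0 - p
        return 1.0 - p_all_fail

    # Hypothetical per-period success probabilities: insider misuse,
    # network intrusion, physical access.
    print(p_unauthorized_access([0.01, 0.05, 0.02]))  # ~0.078
    ```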

  17. The Computational Physics Program of the national MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs.

  18. Trip report: Marshall Space Center computed tomography

    Science.gov (United States)

    Harbour, J. R.; Andrews, M. K.

    BIR Inc. is a small company out of the Chicago area which sells equipment for producing images by tomography. They have built a relatively large instrument, called ACTIS, for NASA at the Marshall Space Center in Huntsville, Alabama, and still have access to this instrument. BIR has a grant from the Department of Energy (DOE) to determine the utility of computed tomography (CT) for characterization of nuclear and hazardous waste within the DOE complex. As part of this effort, the potential of this technique for obtaining images of canistered waste forms has been investigated. Funding for data acquisition was provided through this grant.

  19. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    Science.gov (United States)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  20. User-centered support to localized activities in ubiquitous computing environments

    OpenAIRE

    Pinto, Helder; José, Rui

    2004-01-01

    The design of pervasive and ubiquitous computing systems must be centered on users’ activity in order to bring computing systems closer to people. The adoption of an activity-centered approach to the design of pervasive and ubiquitous computing systems should consider: a) how humans naturally accomplish an activity; and b) how computing artifacts from both the local and personal domains should contribute to the accomplishment of an activity. This work particularly focuses on localized activities…

  1. Visualizing Humans by Computer.

    Science.gov (United States)

    Magnenat-Thalmann, Nadia

    1992-01-01

    Presents an overview of the problems and techniques involved in visualizing humans in a three-dimensional scene. Topics discussed include human shape modeling, including shape creation and deformation; human motion control, including facial animation and interaction with synthetic actors; and human rendering and clothing, including textures and…

  2. The problem of organization of a coastal coordinating computer center

    Science.gov (United States)

    Dyubkin, I. A.; Lodkin, I. I.

    1974-01-01

    The fundamental principles of the operation of a coastal coordinating and computing center under conditions of automation are presented. Special attention is devoted to the work of Coastal Computer Center of the Arctic and Antarctic Scientific Research Institute. This center generalizes from data collected in expeditions and also from observations made at polar stations.

  3. Human Centered Hardware Modeling and Collaboration

    Science.gov (United States)

    Stambolian, Damon; Lawrence, Brad; Stelges, Katrine; Henderson, Gena

    2013-01-01

    In order to collaborate on engineering designs among NASA Centers and customers, including hardware and human activities from multiple remote locations, live human-centered modeling and collaboration across several sites has been successfully facilitated by Kennedy Space Center. The focus of this paper includes innovative approaches to engineering design analyses and training, along with research being conducted to apply new technologies for tracking, immersing, and evaluating humans as well as rocket, vehicle, component, or facility hardware, utilizing high-resolution cameras, motion tracking, ergonomic analysis, biomedical monitoring, work instruction integration, head-mounted displays, and other innovative human-system integration modeling, simulation, and collaboration applications.

  4. Human-centered design of a distributed knowledge management system.

    Science.gov (United States)

    Rinkus, Susan; Walji, Muhammad; Johnson-Throop, Kathy A; Malin, Jane T; Turley, James P; Smith, Jack W; Zhang, Jiajie

    2005-02-01

    Many healthcare technology projects fail due to the lack of consideration of human issues, such as workflow, organizational change, and usability, during the design and implementation stages of a project's development process. Even when human issues are considered, the consideration is typically on designing better user interfaces. We argue that human-centered computing goes beyond a better user interface: it should include considerations of users, functions and tasks that are fundamental to human-centered computing. From this perspective, we integrated a previously developed human-centered methodology with a Project Design Lifecycle, and we applied this integration in the design of a complex distributed knowledge management system for the Biomedical Engineer (BME) domain in the Mission Control Center at NASA Johnson Space Center. We analyzed this complex system, identified its problems, generated systems requirements, and provided specifications of a replacement prototype for effective organizational memory and knowledge management. We demonstrated the value provided by our human-centered approach and described the unique properties, structures, and processes discovered using this methodology and how they contributed in the design of the prototype.

  5. Study of Root Canal Anatomy in Human Permanent Teeth in a Subpopulation of Brazil's Center Region Using Cone-Beam Computed Tomography - Part 1.

    Science.gov (United States)

    Estrela, Carlos; Bueno, Mike R; Couto, Gabriela S; Rabelo, Luiz Eduardo G; Alencar, Ana Helena G; Silva, Ricardo Gariba; Pécora, Jesus Djalma; Sousa-Neto, Manoel Damião

    2015-10-01

    The aim of this study was to evaluate the frequency of roots, root canals and apical foramina in human permanent teeth using cone-beam computed tomography (CBCT). CBCT images of 1,400 teeth from a previously evaluated database were used to determine the frequency of the number of roots, root canals and apical foramina. All teeth were evaluated by previewing the sagittal, axial, and coronal planes. Navigation in axial slices of 0.1 mm/0.1 mm followed the coronal-to-apical direction, as well as the apical-to-coronal direction. Two examiners assessed all CBCT images. Statistical analysis included frequency distribution and cross-tabulation. The highest frequency of four root canals and four apical foramina was found in maxillary first molars (76% and 33%, respectively), followed by maxillary second molars (41% and 25%, respectively). The frequency of four root canals in mandibular first molars was 51%. Mandibular first premolars had two root canals and two apical foramina in 29% and 20% of the cases, respectively. Mandibular central and lateral incisors and canines presented two root canals in 35%, 42% and 22% of the cases, respectively. The navigation strategy in CBCT images favors better identification of the frequency and position of roots, root canals and apical foramina in human permanent teeth.
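
    The statistical step described here is a frequency distribution with cross-tabulation over per-tooth observations. A minimal sketch of that computation, using a few hypothetical records rather than the study's CBCT database:

    ```python
    import pandas as pd

    # Hypothetical per-tooth CBCT observations.
    records = pd.DataFrame({
        "tooth":    ["max 1st molar", "max 1st molar", "mand 1st premolar"],
        "canals":   [4, 3, 2],
        "foramina": [4, 3, 2],
    })

    # Percentage frequency of canal counts within each tooth type,
    # in the spirit of the study's cross-tabulated tables.
    freq = pd.crosstab(records["tooth"], records["canals"],
                       normalize="index") * 100
    print(freq.round(1))
    ```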

  6. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each summer, in coordination with the Computer Science Research Institute (CSRI) and Cyber Engineering Research Institute (CERI).

  7. Computer Labs and Media Centers: A Natural Fit.

    Science.gov (United States)

    Anderson, Mary Alice

    2000-01-01

    Discusses how to integrate a computer lab with a school library media center. Topics include planning; management issues; printing guidelines; scheduling; instruction and integration; staff development; supervision while classes are in the computer lab; and technical issues. (LRW)

  8. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  9. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation, made possible by the portability of mobile…

  10. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  12. School Library Media Centers: The Human Environment.

    Science.gov (United States)

    Doll, Carol A.

    1992-01-01

    Review of the literature on aspects of human behavior relevant to library media center design discusses personal space, territoriality, privacy, variety, and color. Suggestions for media center design in the areas of color, carpeting, seating, private spaces, variety in spaces, ownership, and control are offered; and research needs are identified.…

  13. Human Centered Design and Development for NASA's MerBoard

    Science.gov (United States)

    Trimble, Jay

    2003-01-01

    This viewgraph presentation provides an overview of the design and development process for NASA's MerBoard. These devices are large interactive display screens which can be shown on the user's computer, allowing scientists in many locations to interpret and evaluate mission data in real time. These tools are scheduled to be used during the 2003 Mars Exploration Rover (MER) expeditions. Topics covered include: mission overview, MER human-centered computing, FIDO 2001 observations, and MerBoard prototypes.

  14. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoyable…

  15. Computer addiction. When monitor becomes control center.

    Science.gov (United States)

    Christensen, M H; Orzack, M H; Babington, L M; Patsoaughter, C A

    2001-03-01

    Computer addiction is a newly recognized problem. While controversy exists about whether computer addiction should be considered a primary psychiatric disorder, clinicians are treating increasing numbers of clients experiencing problems caused by excessive computer use. Case studies are provided that include typical histories and symptoms. Behavioral cognitive therapy is discussed as a treatment approach. The stages of change theory is recommended as a strategy to help clients plan and implement change.

  16. Making IBM's Computer, Watson, Human

    Science.gov (United States)

    Rachlin, Howard

    2012-01-01

    This essay uses the recent victory of an IBM computer (Watson) in the TV game, Jeopardy, to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered and rejected. The viewpoint of the essay is that of teleological behaviorism. Mental states are defined as temporally extended patterns of overt behavior. From this viewpoint (although Watson does not currently have them), essential human attributes such as consciousness, the ability to love, to feel pain, to sense, to perceive, and to imagine may all be possessed by a computer. Most crucially, a computer may possess self-control and may act altruistically. However, the computer's appearance, its ability to make specific movements, its possession of particular internal structures (e.g., whether those structures are organic or inorganic), and the presence of any nonmaterial “self,” are all incidental to its humanity. PMID:22942530

  17. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  18. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  19. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  20. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  1. Merging a Library and a Computing Center.

    Science.gov (United States)

    Plane, Robert A.

    1982-01-01

    Details are provided of the development of a new library at Clarkson College. It was seen from the start that the new center needed to be viewed as the hub of a campus-wide system to provide integrated informational support for instructional, research, and administrative activities of the entire college. (MP)

  2. Resource Centered Computing delivering high parallel performance

    OpenAIRE

    2014-01-01

    Modern parallel programming requires a combination of different paradigms, expertise and tuning that correspond to the different levels in today's hierarchical architectures. To cope with the inherent difficulty, ORWL (ordered read-write locks) presents a new paradigm and toolbox centered around local or remote resources, such as data, processors or accelerators. ORWL programmers describe their computation in terms of access to these resources during critical sections. Exclu…

  3. Human-computer interface design

    Energy Technology Data Exchange (ETDEWEB)

    Bowser, S.E.

    1995-04-01

    Modern military forces assume that computer-based information is reliable, timely, available, usable, and shared. The importance of computer-based information is based on the assumption that "shared situation awareness, coupled with the ability to conduct continuous operations, will allow information age armies to observe, decide, and act faster, more correctly and more precisely than their enemies." (Sullivan and Dubik 1994). Human-Computer Interface (HCI) design standardization is critical to the realization of the previously stated assumptions. Given that a key factor of a high-performance, high-reliability system is an easy-to-use, effective design of the interface between the hardware, software, and the user, it follows logically that the interface between the computer and the military user is critical to the success of the information-age military. The proliferation of computer technology has resulted in the development of an extensive variety of computer-based systems and the implementation of varying HCI styles on these systems. To accommodate the continued growth in computer-based systems, minimize HCI diversity, and improve system performance and reliability, the U.S. Department of Defense (DoD) is continuing to adopt interface standards for developing computer-based systems.

  4. The Quantum Human Computer (QHC) Hypothesis

    Science.gov (United States)

    Salmani-Nodoushan, Mohammad Ali

    2008-01-01

    This article attempts to suggest the existence of a human computer called Quantum Human Computer (QHC) on the basis of an analogy between human beings and computers. To date, there are two types of computers: Binary and Quantum. The former operates on the basis of binary logic where an object is said to exist in either of the two states of 1 and…

  5. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Document Server

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. Also, you will learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve…

  6. Human ear recognition by computer

    CERN Document Server

    Bhanu, Bir; Chen, Hui

    2010-01-01

    Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden by hair, a scarf or jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with age and facial expressions. "Human Ear Recognition by Computer" is the first book on…

  7. Toward human-centered algorithm design

    Directory of Open Access Journals (Sweden)

    Eric PS Baumer

    2017-07-01

    As algorithms pervade numerous facets of daily life, they are incorporated into systems for increasingly diverse purposes. These systems’ results are often interpreted differently by the designers who created them than by the lay persons who interact with them. This paper offers a proposal for human-centered algorithm design, which incorporates human and social interpretations into the design process for algorithmically based systems. It articulates three specific strategies for doing so: theoretical, participatory, and speculative. Drawing on the author’s work designing and deploying multiple related systems, the paper provides a detailed example of using a theoretical approach. It also discusses findings pertinent to participatory and speculative design approaches. The paper addresses both strengths and challenges for each strategy in helping to center the process of designing algorithmically based systems around humans.

  8. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  9. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient ... software. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics ...

  10. Overview of Center for Domain-Specific Computing

    Institute of Scientific and Technical Information of China (English)

    Jason Cong

    2011-01-01

    In this short article, we would like to introduce the Center for Domain-Specific Computing (CDSC), established in 2009 and primarily funded by the US National Science Foundation with an award from the 2009 Expeditions in Computing Program. In this project we look beyond parallelization and focus on customization as the next disruptive technology to bring orders-of-magnitude power-performance efficiency improvement for applications in a specific domain.

  11. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2010-01-01

    GoeGrid is a grid resource center located in Goettingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center will be presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster will be detailed. The benefits are an efficient use of computer and manpower resources. Further interdisciplinary projects include commonly organized courses for students of all fields to support education on grid computing.

  12. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  13. Wings: A New Paradigm in Human-Centered Design

    Science.gov (United States)

    Schutte, Paul C.

    1997-01-01

    Many aircraft accident/incident investigations cite crew error as a causal factor (Boeing Commercial Airplane Group 1996). Human factors experts suggest that crew error has many underlying causes and should be the start of an accident investigation and not the end. One of those causes, the flight deck design, is correctable. If a flight deck design does not accommodate the human's unique abilities and deficits, crew error may simply be the manifestation of this mismatch. Pilots repeatedly report that they are "behind the aircraft", i.e., they do not know what the automated aircraft is doing or how the aircraft is doing it until after the fact. Billings (1991) promotes the concept of "human-centered automation", calling on designers to allocate appropriate control and information to the human. However, there is much ambiguity regarding what it means to be human-centered. What often are labeled as "human-centered designs" are actually designs where a human factors expert has been involved in the design process or designs where tests have shown that humans can operate them. While such designs may be excellent, they do not represent designs that are systematically produced according to some set of prescribed methods and procedures. This paper describes a design concept, called Wings, that offers a clearer definition for human-centered design. This new design concept is radically different from current design processes in that the design begins with the human and uses the human body as a metaphor for designing the aircraft. This is not because the human is the most important part of the aircraft (certainly the aircraft would be useless without lift and thrust), but because he is the least understood, the least programmable, and one of the more critical elements. The Wings design concept has three properties: a reversal in the design process, from aerodynamics-, structures-, and propulsion-centered to truly human-centered; a design metaphor that guides function

  14. The Utility of Computer Tracking Tools for User-Centered Design.

    Science.gov (United States)

    Gay, Geri; Mazur, Joan

    1993-01-01

    Describes tracking tools used by designers and users to evaluate the efficacy of hypermedia systems. Highlights include human-computer interaction research; tracking tools and user-centered design; and three examples from the Interactive Multimedia Group at Cornell University that illustrate uses of various tracking tools. (27 references) (LRW)

  15. Funding Public Computing Centers: Balancing Broadband Availability and Expected Demand

    Science.gov (United States)

    Jayakar, Krishna; Park, Eun-A

    2012-01-01

    The National Broadband Plan (NBP) recently announced by the Federal Communication Commission visualizes a significantly enhanced commitment to public computing centers (PCCs) as an element of the Commission's plans for promoting broadband availability. In parallel, the National Telecommunications and Information Administration (NTIA) has…

  16. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  17. Computational Techniques of Electromagnetic Dosimetry for Humans

    Science.gov (United States)

    Hirata, Akimasa; Fujiwara, Osamu

    There has been increasing public concern about the adverse health effects of human exposure to electromagnetic fields. This paper reviews the rationale of international safety guidelines for human protection against electromagnetic fields. It then presents computational techniques for conducting dosimetry in anatomically based human body models. Computational examples and remaining problems are also described briefly.

  18. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of biological sciences undergraduate and graduate program that emphasized biological concepts and considered biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy - to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  19. The Social Computer: Combining Machine and Human Computation

    OpenAIRE

    Giunchiglia, Fausto; Robertson, Dave

    2010-01-01

    The social computer is a future computational system that harnesses the innate problem solving, action and information gathering powers of humans and the environments in which they live in order to tackle large scale social problems that are beyond our current capabilities. The hardware of a social computer is supplied by people’s brains and bodies, the environment where they live, including artifacts, e.g., buildings and roads, sensors into the environment, networks and computers; while the ...

  20. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Goettingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2011-01-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and manpower resources.

  1. Cooperation in human-computer communication

    OpenAIRE

    Kronenberg, Susanne

    2000-01-01

    The goal of this thesis is to simulate cooperation in human-computer communication to model the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in that coherence is maintained between contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

  2. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research, applied development, and the remote education of specialists, postgraduates, and students.

  3. Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2013-06-30

    As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
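
    Since DCeP is at bottom a ratio of weighted useful work to the energy consumed producing it, the computation itself is direct once both quantities are instrumented. A minimal sketch in Python (the task classes, weights, and readings below are hypothetical, not values from the experiment):

      def dcep(completed_tasks, energy_kwh):
          # DCeP = useful work produced / energy consumed producing it.
          # completed_tasks: iterable of (count, weight) pairs, where the weight
          # encodes how much useful work one task of that class counts for.
          useful_work = sum(count * weight for count, weight in completed_tasks)
          return useful_work / energy_kwh

      # Example: two hypothetical workload classes in one measurement window.
      tasks = [(120, 1.0),    # 120 simulation runs, weight 1.0 each
               (300, 0.25)]   # 300 post-processing jobs, weight 0.25 each
      print(dcep(tasks, energy_kwh=850.0))   # work units delivered per kWh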

  4. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  5. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has been recently installed. The system will be of great help to the exciting physics analyses for coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is DELL PowerEdge 1955 which contains two Intel dual-core Xeon (WoodCrest) CPUs running at 3GHz, and a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured to be RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with redundant configuration in a cost-effective way. The next major upgrade will take place in thre...

  6. Computer Vision Method in Human Motion Detection

    Institute of Scientific and Technical Information of China (English)

    FU Li; FANG Shuai; XU Xin-he

    2007-01-01

    Human motion detection based on computer vision is a frontier research topic that is attracting increasing attention in the field of computer vision research. The wavelet transform is used to sharpen the ambiguous edges in human motion images. The effect of shadows on the image processing is also removed, and edge extraction can be successfully realized. This is an effective method for the research of human motion analysis systems.

  7. An academic medical center's response to widespread computer failure.

    Science.gov (United States)

    Genes, Nicholas; Chary, Michael; Chason, Kevin W

    2013-01-01

    As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.

  8. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide the infrastructure CAF relies on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  9. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  10. Object categorization: computer and human vision perspectives

    National Research Council Canada - National Science Library

    Dickinson, Sven J

    2009-01-01

    ... The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience ...

  11. Handling emotions in human-computer dialogues

    CERN Document Server

    Pittermann, Johannes; Minker, Wolfgang

    2010-01-01

    This book presents novel methods to perform robust speech-based emotion recognition at low complexity. It describes a flexible dialogue model to conveniently integrate emotions and other dialogue-influencing parameters in human-computer interaction.

  12. Microscopic computation in human brain evolution.

    Science.gov (United States)

    Wallace, R

    1995-04-01

    When human psychological performance is viewed in terms of cognitive modules, our species displays remarkable differences in computational power. Algorithmically simple computations are generally difficult to perform, whereas optimal routing or "Traveling Salesman" Problems (TSP) of far greater complexity are solved on an everyday basis. It is argued that even "simple" instances of TSP are not purely Euclidean problems in human computations, but involve emotional, autonomic, and cognitive constraints. They therefore require a level of parallel processing not possible in a macroscopic system to complete the algorithm within a brief period of time. A microscopic neurobiological model emphasizing the computational power of excited atoms within the neuronal membrane is presented as an alternative to classical connectionist approaches. The evolution of the system is viewed in terms of specific natural selection pressures driving satisfying computations toward global optimization. The relationship of microscopic computation to the nature of consciousness is examined, and possible mathematical models as a basis for simulation studies are briefly discussed.
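
    To make the scale of that complexity concrete, a small illustrative sketch (ours, not the paper's): the number of distinct tours in a symmetric TSP grows factorially with the number of stops, which is why completing such a computation within a brief period implies massive parallelism:

      import math

      def distinct_tours(n):
          # For a symmetric TSP over n stops, fix the starting point and
          # divide out the two travel directions: (n - 1)! / 2 tours.
          return math.factorial(n - 1) // 2

      for n in (5, 10, 15):
          print(n, distinct_tours(n))   # 12, 181440, 43589145600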

  13. Human-centered incubator: beyond a design concept

    NARCIS (Netherlands)

    Goossens, R.H.M.; Willemsen, H.

    2013-01-01

    We read with interest the paper by Ferris and Shepley [1] on a human-centered design project with university students on neonatal incubators. It is interesting to see that, in the design solutions and concepts presented by Ferris and Shepley [1], human-centered design played an important role. In 2005,

  14. Students' Ways of Experiencing Human-Centered Design

    Science.gov (United States)

    Zoltowski, Carla B.

    2010-01-01

    This study investigated the qualitatively different ways which students experienced human-centered design. The findings of this research are important in developing effective design learning experiences and have potential impact across design education. This study provides the basis for being able to assess learning of human-centered design which…

  15. Human-Computer Interactions and Decision Behavior

    Science.gov (United States)

    1984-01-01

    software interfaces. The major components of the research program included the Dialogue Management System (DMS) operating environment, the role of ... specification; and new methods for modeling, designing, and developing human-computer interfaces based on syntactic and semantic specification. The DMS ... achieving communication is language. Accordingly, the transaction model employs a linguistic model consisting of parts that relate computer responses

  16. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    Keywords: Resistance to Change; Stress; Adaptation to Computers. This thesis is a study of ... resistance to change, overcoming resistance to change, and specific recommendations to overcome resistance ... the greater his bewilderment, and the greater his bewilderment, the greater his resistance will be [Ref. 7: p. 539]. Overcoming man's resistance to change

  17. Exposure science and the U.S. EPA National Center for Computational Toxicology.

    Science.gov (United States)

    Cohen Hubal, Elaine A; Richard, Ann M; Shah, Imran; Gallagher, Jane; Kavlock, Robert; Blancato, Jerry; Edwards, Stephen W

    2010-05-01

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The integration of modern computing with molecular biology and chemistry will allow scientists to better prioritize data, inform decision makers on chemical risk assessments, and understand a chemical's progression from the environment to the target tissue within an organism and ultimately to the key steps that trigger an adverse health effect. In this paper, several of the major research activities being sponsored by the Environmental Protection Agency's National Center for Computational Toxicology are highlighted. Potential links between research in computational toxicology and human exposure science are identified. As with traditional approaches for toxicity testing and hazard assessment, exposure science is required to inform the design and interpretation of high-throughput assays. In addition, common themes inherent throughout National Center for Computational Toxicology research activities are highlighted for emphasis as exposure science advances into the 21st century.

  18. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  19. Deep architectures for Human Computer Interaction

    NARCIS (Netherlands)

    Noulas, A.K.; Kröse, B.J.A.

    2008-01-01

    In this work we present the application of Conditional Restricted Boltzmann Machines in Human Computer Interaction. These provide a well-suited framework for modeling the complex temporal patterns produced by humans in the audio and video modalities. They can be trained in a semisupervised fashion and

  1. Exploring human inactivity in computer power consumption

    Science.gov (United States)

    Candrawati, Ria; Hashim, Nor Laily Binti

    2016-08-01

    Managing computer power consumption has become an important challenge for the computing community, consistent with a trend in which computer systems are ever more central to modern life while demand for computing power and functionality grows continuously. Unfortunately, previous approaches are still inadequately designed to handle the power consumption problem, because a system's workload is made unpredictable by unpredictable human behavior. This stems from a lack of knowledge inside the software system, and software self-adaptation is one approach to dealing with this source of uncertainty. Human inactivity is handled by adapting to the behavioral changes of the users. This paper observes human inactivity during computer usage and finds that computer power usage can be reduced if idle periods can be intelligently sensed from user activities. The study introduces a Control, Learn and Knowledge model that adapts the Monitor, Analyze, Plan, Execute control loop, integrated with a Q-learning algorithm, to learn human inactivity periods and thereby minimize computer power consumption. An experiment to evaluate this model was conducted using three case studies with the same activities. The results show that with the proposed model, 5 out of 12 activities exhibited decreasing power consumption compared to the others.
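
    As a rough illustration of the learning step described above, here is a hedged sketch of the idea (a MAPE-style loop whose policy is learned with one-step Q-learning over observed idle time); the state buckets, action names, rewards, and hyperparameters are our assumptions, not the authors' implementation:

      import random
      from collections import defaultdict

      ACTIONS = ["stay_awake", "power_down"]     # assumed action set
      q = defaultdict(float)                     # Q[(state, action)] -> value
      alpha, gamma, epsilon = 0.1, 0.9, 0.1      # learning rate, discount, exploration

      def idle_state(idle_minutes):
          # Analyze: discretize observed inactivity into coarse buckets.
          return min(int(idle_minutes // 5), 6)

      def choose_action(state):
          # Plan: epsilon-greedy choice between keeping the machine awake
          # and powering it down.
          if random.random() < epsilon:
              return random.choice(ACTIONS)
          return max(ACTIONS, key=lambda a: q[(state, a)])

      def learn(state, action, reward, next_state):
          # Knowledge: one-step Q-learning update; the reward would penalize
          # energy burned while idle and sleeping just before the user returns.
          best_next = max(q[(next_state, a)] for a in ACTIONS)
          q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])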

  2. Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    John Mellor-Crummey

    2008-02-29

    Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  3. Human Computer Interface Design Criteria. Volume 1. User Interface Requirements

    Science.gov (United States)

    2010-03-19

    ... entitled Human Computer Interface (HCI) Design Criteria Volume 1: User Interface Requirements, which contains the following major changes from ... Space and Missile Systems Center, Air Force Space Command, 483 N. Aviation Blvd., El Segundo, CA 90245 ... This standard has been approved for use on all Space and ... and efficient model of how the system works and can generalize this knowledge to other systems. According to Mayhew in Principles and Guidelines in ...

  4. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  5. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  6. Los Alamos CCS (Center for Computer Security) formal computer security model

    Energy Technology Data Exchange (ETDEWEB)

    Dreicer, J.S.; Hunteman, W.J. (Los Alamos National Lab., NM (USA))

    1989-01-01

    This paper provides a brief presentation of the formal computer security model currently being developed at the Los Alamos Department of Energy (DOE) Center for Computer Security (CCS). The initial motivation for this effort was the need to provide a method by which DOE computer security policy implementation could be tested and verified. The actual analytical model was a result of the integration of current research in computer security and previous modeling and research experiences. The model is being developed to define a generic view of the computer and network security domains, to provide a theoretical basis for the design of a security model, and to address the limitations of present models. Formal mathematical models for computer security have been designed and developed in conjunction with attempts to build secure computer systems since the early 1970s. The foundation of the Los Alamos DOE CCS model is a series of functionally dependent probability equations, relations, and expressions. The mathematical basis appears to be justified and is undergoing continued discrimination and evolution. We expect to apply the model to the discipline of the Bell-LaPadula abstract sets of objects and subjects. 5 refs.

  7. Participatory Design of Human-Centered Cyberinfrastructure (Invited)

    Science.gov (United States)

    Pennington, D. D.; Gates, A. Q.

    2010-12-01

    Cyberinfrastructure, by definition, is about people sharing resources to achieve outcomes that cannot be reached independently. CI depends not just on creating discoverable resources, or tools that allow those resources to be processed, integrated, and visualized, but on human activation of flows of information across those resources. CI must be centered on human activities. Yet for those CI projects that are directed towards observational science, there are few models for organizing collaborative research in ways that align individual research interests into a collective vision of CI-enabled science. Given that the emerging technologies are themselves expected to change the way science is conducted, it is not simply a matter of conducting requirements analysis on how scientists currently work, or of building consensus among the scientists on what is needed. Developing effective CI depends on generating a new, creative vision of problem solving within a community, based on computational concepts that are, in some cases, still very abstract and theoretical. The computer science theory may (or may not) be well formalized, but the potential for impact on any particular domain is typically ill-defined. In this presentation we will describe approaches being developed and tested at the CyberShARE Center of Excellence at the University of Texas at El Paso for ill-structured problem solving within cross-disciplinary teams of scientists and computer scientists working on data-intensive environmental science and geoscience. These approaches deal with the challenges associated with sharing and integrating knowledge across disciplines; the challenges of developing effective teamwork skills in a culture that favors independent effort; and the challenges of evolving shared, focused research goals from ill-structured, vague starting points - all issues that must be confronted by every interdisciplinary CI project. We will introduce visual and semantic-based tools that can enable the

  8. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation, ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows readers to compare performance levels and the usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook for assessing the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  9. Human-centered social media analytics

    CERN Document Server

    Fu, Yun

    2014-01-01

    Provides a survey of next-generation social computational methodologies, from fundamentals to state-of-the-art techniques Includes perspectives from an international and interdisciplinary selection of pre-eminent authorities Presents balanced coverage of both detailed theoretical analysis and real-world applications

  10. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  11. Human/computer control of undersea teleoperators

    Science.gov (United States)

    Sheridan, T. B.; Verplank, W. L.; Brooks, T. L.

    1978-01-01

    The potential of supervisory controlled teleoperators for accomplishment of manipulation and sensory tasks in deep ocean environments is discussed. Teleoperators and supervisory control are defined, the current problems of human divers are reviewed, and some assertions are made about why supervisory control has potential use to replace and extend human diver capabilities. The relative roles of man and computer and the variables involved in man-computer interaction are next discussed. Finally, a detailed description of a supervisory controlled teleoperator system, SUPERMAN, is presented.

  12. Alternative treatment technology information center computer database system

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, D. [Environmental Protection Agency, Edison, NJ (United States)

    1995-10-01

    The Alternative Treatment Technology Information Center (ATTIC) computer database system was developed pursuant to the 1986 Superfund law amendments. It provides up-to-date information on innovative treatment technologies to clean up hazardous waste sites. ATTIC v2.0 provides access to several independent databases as well as a mechanism for retrieving full-text documents of key literature. It can be accessed with a personal computer and modem 24 hours a day, and there are no user fees. ATTIC provides "one-stop shopping" for information on alternative treatment options by accessing several databases: (1) treatment technology database; this contains abstracts from the literature on all types of treatment technologies, including biological, chemical, physical, and thermal methods. The best literature as viewed by experts is highlighted. (2) treatability study database; this provides performance information on technologies to remove contaminants from wastewaters and soils. It is derived from treatability studies. This database is available through ATTIC or separately as a disk that can be mailed to you. (3) underground storage tank database; this presents information on underground storage tank corrective actions, surface spills, emergency response, and remedial actions. (4) oil/chemical spill database; this provides abstracts on treatment and disposal of spilled oil and chemicals. In addition to these separate databases, ATTIC allows immediate access to other disk-based systems such as the Vendor Information System for Innovative Treatment Technologies (VISITT) and the Bioremediation in the Field Search System (BFSS). The user may download these programs to their own PC via a high-speed modem. Also via modem, users are able to download entire documents through the ATTIC system. Currently, about fifty publications are available, including Superfund Innovative Technology Evaluation (SITE) program documents.

  13. Computational Electronic Structure of Antiferromagnetic Centers in Metalloproteins.

    Science.gov (United States)

    Rodriguez, Jorge H.

    2003-03-01

    Nature uses the properties of transition metal ions to carry out a variety of functions associated with vital life processes such as respiration and the transport of oxygen. Oxo-bridged diiron centers are intriguing structural motifs which are present in dioxygen-transporting proteins and display antiferromagnetic ordering. We have performed a comprehensive study of the electronic structure and magnetic properties of structurally characterized models for diiron-oxo proteins. Results from Kohn-Sham density functional theory show that the models are antiferromagnetically coupled, in agreement with experiment. The physical origin of the spin coupling has been elucidated, as the main superexchange pathways responsible for magnetic ordering have been identified. In addition, the exchange constants that parameterize the Heisenberg Hamiltonian, H = J S_1 · S_2, have been predicted in excellent agreement with experiment. Our results are important for establishing correlations between electronic structure and biomolecular function and show that computational electronic structure can be used as a powerful tool for the investigation of biomolecular magnetism.
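
    For context, a standard result of spin algebra (not taken from the abstract): on the total-spin basis, the quoted Hamiltonian yields the energy ladder

      E(S) = \frac{J}{2}\bigl[\,S(S+1) - S_1(S_1+1) - S_2(S_2+1)\,\bigr],
      \qquad S = |S_1 - S_2|,\ \ldots,\ S_1 + S_2,

    so with the H = J S_1 · S_2 sign convention, antiferromagnetic coupling (J > 0) places the S = 0 singlet lowest. For a pair of high-spin Fe(III) ions (S_1 = S_2 = 5/2), as in diiron-oxo centers, S runs from 0 to 5.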

  14. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  15. Developing a computational model of human hand kinetics using AVS

    Energy Technology Data Exchange (ETDEWEB)

    Abramowitz, Mark S. [State Univ. of New York, Binghamton, NY (United States)

    1996-05-01

    As part of an ongoing effort to develop a finite element model of the human hand at the Institute for Scientific Computing Research (ISCR), this project extended existing computational tools for analyzing and visualizing hand kinetics. These tools employ a commercial, scientific visualization package called AVS. FORTRAN and C code, originally written by David Giurintano of the Gillis W. Long Hansen's Disease Center, was ported to a different computing platform, debugged, and documented. Usability features were added and the code was made more modular and readable. When the code is used to visualize bone movement and tendon paths for the thumb, graphical output is consistent with expected results. However, numerical values for forces and moments at the thumb joints do not yet appear to be accurate enough to be included in ISCR's finite element model. Future work includes debugging the parts of the code that calculate forces and moments and verifying the correctness of these values.

  16. Soft Computing in Humanities and Social Sciences

    CERN Document Server

    González, Veronica

    2012-01-01

    The field of Soft Computing in Humanities and Social Sciences is at a turning point. The strong distinction between “science” and “humanities” has been criticized from many fronts and, at the same time, an increasing cooperation between the so-called “hard sciences” and “soft sciences” is taking place in a wide range of scientific projects dealing with very complex and interdisciplinary topics. In the last fifteen years the area of Soft Computing has also experienced a gradual rapprochement to disciplines in the Humanities and Social Sciences, and also in the field of Medicine, Biology and even the Arts, a phenomenon that did not occur much in the previous years.   The collection of this book presents a generous sampling of the new and burgeoning field of Soft Computing in Humanities and Social Sciences, bringing together a wide array of authors and subject matters from different disciplines. Some of the contributors of the book belong to the scientific and technical areas of Soft Computing w...

  17. Designing a knowledge management system for distributed activities: a human centered approach.

    Science.gov (United States)

    Rinkus, Susan; Johnson-Throop, Kathy A; Zhang, Jiajie

    2003-01-01

    In this study we use the principles of distributed cognition and the methodology of human-centered distributed information design to analyze a complex distributed human-computer system, identify its problems, and generate design requirements and implementation specifications for a replacement prototype for effective organizational memory and knowledge management. We argue that a distributed human-computer information system has unique properties, structures, and processes that are best described in the language of distributed cognition. Distributed cognition provides researchers a richer theoretical understanding of human-computer interactions and enables researchers to capture the phenomenon that emerges in social interactions as well as the interactions between people and structures in their environment.

  18. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  19. 1st AAU Workshop on Human-Centered Robotics

    DEFF Research Database (Denmark)

    The 2012 AAU Workshop on Human-Centered Robotics took place on 15 Nov. 2012 at Aalborg University, Aalborg. The workshop provides a platform for robotics researchers, including professors, PhD, and Master students, to exchange their ideas and latest results. The objective is to foster closer interaction among researchers from multiple relevant disciplines in human-centered robotics and, consequently, to promote collaborations across departments of all faculties towards making our center a center of excellence in robotics. The workshop was a great success, with 13 presentations, attracting more than 45 participants from AAU, SDU, DTI, and industrial companies as well. The proceedings contain 7 full papers selected from the full papers submitted afterwards on the basis of workshop abstracts. The papers represent major research developments in robotics at AAU, including medical robots

  1. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  2. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three (and higher) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is "communication", which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  3. B3: Fuzzy-Based Data Center Load Optimization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    M. Jaiganesh

    2013-01-01

    Full Text Available Cloud computing has started a new era in which clients obtain a variety of information pools through various internet connections from any connected device. It provides a pay-per-use model through which clients consume services. The data center is a sophisticated, high-definition server that runs applications virtually in cloud computing. It moves applications, services, and data to a large data center. The data center provides higher service levels, covering a maximum number of users. Measuring the utilization of the data center is therefore a definite task in finding overall load efficiency. Hence, we propose a novel method to find the efficiency of the data center in cloud computing. The goal is to optimize data center utilization in terms of three big factors: Bandwidth, Memory, and Central Processing Unit (CPU) cycles. We constructed a fuzzy expert system model to obtain maximum Data Center Load Efficiency (DCLE) in cloud computing environments. The advantage of the proposed system lies in DCLE computing: while computing, it allows regular evaluation of services for any number of clients. This approach indicates that current clouds need an order-of-magnitude improvement in data center management to be used in next generation computing.
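
    A minimal sketch of the kind of fuzzy inference the abstract describes, over the three named factors (the membership breakpoints, the "moderate utilization is efficient" rule, and the min t-norm are illustrative assumptions, not the authors' rule base):

      def tri(x, a, b, c):
          # Triangular membership function peaking at b.
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def dcle(bandwidth, memory, cpu):
          # Degree to which each utilization (0..1) sits in the band treated
          # here as efficient (neither starved nor saturated); the rule
          # "efficient = bandwidth AND memory AND cpu" uses the min t-norm.
          return min(tri(u, 0.2, 0.55, 0.9) for u in (bandwidth, memory, cpu))

      print(dcle(0.6, 0.5, 0.7))   # crisp efficiency score in [0, 1]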

  4. Brain-Computer Interfaces Revolutionizing Human-Computer Interaction

    CERN Document Server

    Graimann, Bernhard; Allison, Brendan

    2010-01-01

    A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

  5. Human-centered text mining: a new software system

    NARCIS (Netherlands)

    Poelmans, J.; Elzinga, P.; Neznanov, A.A.; Dedene, G.; Viaene, S.; Kuznetsov, S.

    2012-01-01

    In this paper we introduce a novel human-centered data mining software system which was designed to gain intelligence from unstructured textual data. The architecture takes its roots in several case studies carried out in collaboration with the Amsterdam-Amstelland Police, GasthuisZusters Antwerpen…

  6. Human-Centered Design Bill of Rights for Educators.

    Science.gov (United States)

    Sugar, William A.

    This paper presents a potential solution to encourage technology adoption and integration within schools by proposing a human-centered technology "bill of rights" for educators. The intention of this bill of rights is to influence educators' beliefs towards technology and to enable educators to confront with confidence the seemingly…

  7. Wooden Spaceships: Human-Centered Vehicle Design for Space

    Science.gov (United States)

    Twyford, Evan

    2009-01-01

    Presentation will focus on creative human centered design solutions in relation to manned space vehicle design and development in the NASA culture. We will talk about design process, iterative prototyping, mockup building and user testing and evaluation. We will take an inside look at how new space vehicle concepts are developed and designed for real life exploration scenarios.

  8. Human-Centered Design for the Personal Satellite Assistant

    Science.gov (United States)

    Bradshaw, Jeffrey M.; Sierhuis, Maarten; Gawdiak, Yuri; Thomas, Hans; Greaves, Mark; Clancey, William J.; Swanson, Keith (Technical Monitor)

    2000-01-01

    The Personal Satellite Assistant (PSA) is a softball-sized flying robot designed to operate autonomously onboard manned spacecraft in pressurized micro-gravity environments. We describe how the Brahms multi-agent modeling and simulation environment in conjunction with a KAoS agent teamwork approach can be used to support human-centered design for the PSA.

  9. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  10. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure communication…

  11. Computer Simulation of the Beating Human Heart

    Science.gov (United States)

    Peskin, Charles S.; McQueen, David M.

    2001-06-01

    The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.
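
    The core mechanism of the IB method, spreading Lagrangian fiber forces onto the Eulerian fluid grid through a smoothed delta function, can be sketched in one dimension as below. The grid size, spring stiffness, and rest positions are arbitrary illustrative values; only the 4-point delta kernel is a standard IB ingredient.

```python
# Minimal 1-D sketch of the Immersed Boundary idea described above: elastic
# forces computed at Lagrangian fiber points are spread onto the Eulerian
# fluid grid through a smoothed delta function. Grid size, stiffness, and
# rest positions are toy values assumed for brevity.
import numpy as np

N, h = 64, 1.0 / 64                     # Eulerian grid
X = np.array([0.30, 0.50, 0.70])        # Lagrangian fiber points
stiffness = 10.0

def phi(r):
    """Peskin's 4-point discrete delta function (one dimension)."""
    r = np.abs(r)
    out = np.zeros_like(r)
    m1 = r < 1
    out[m1] = (3 - 2 * r[m1] + np.sqrt(1 + 4 * r[m1] - 4 * r[m1] ** 2)) / 8
    m2 = (r >= 1) & (r < 2)
    out[m2] = (5 - 2 * r[m2] - np.sqrt(-7 + 12 * r[m2] - 4 * r[m2] ** 2)) / 8
    return out

# Elastic force on each fiber point (toy model: springs to rest positions).
F = -stiffness * (X - np.array([0.35, 0.45, 0.65]))

# Spread fiber forces onto the grid: f(x_i) = sum_k F_k * phi((x_i - X_k)/h)/h
x = (np.arange(N) + 0.5) * h
f_grid = sum(Fk * phi((x - Xk) / h) / h for Fk, Xk in zip(F, X))
print(f"net force conserved: {f_grid.sum() * h:.3f} vs {F.sum():.3f}")
```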

  12. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic…

  13. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  14. Computer Aided Design in Digital Human Modeling for Human Computer Interaction in Ergonomic Assessment: A Review

    Directory of Open Access Journals (Sweden)

    Suman Mukhopadhyay, Sanjib Kumar Das and Tania Chakraborty

    2012-12-01

    Research in Human-Computer Interaction (HCI) has been enormously successful in the area of computer-aided ergonomics, or human-centric design. A perfect fit for people has always been a target of product design. Designers traditionally used anthropometric dimensions for 3D product design, which created many fitting problems when dealing with the complexities of human body shapes. Computer-aided design (CAD), also known as computer-aided design and drafting (CADD), is the computer technology used for design processing and design documentation. CAD has now been used extensively in many applications, such as the automotive, shipbuilding and aerospace industries, architectural and industrial design, prosthetics, computer animation for special effects in movies, advertising, and technical manuals. Digital human modeling (DHM) has rapidly emerged as a technology that creates, manipulates and controls human representations and human-machine system scenes on computers for interactive ergonomic design problem solving. DHM promises to profoundly change how products or systems are designed, how ergonomics analysis is performed, how disorders and impairments are assessed, and how therapies and surgeries are conducted. The imperative and emerging need for DHM is consistent with the fact that the past decade has witnessed significant growth both in software systems offering DHM capabilities and in corporate adoption of the technology. The authors dwell at length on how research in DHM has brought about enhanced HCI in the context of computer-aided ergonomics or human-centric design, and discuss future trends in this context.

  15. Computers vs. Humans in Galaxy Classification

    Science.gov (United States)

    Kohler, Susanna

    2016-04-01

    In this age of large astronomical surveys, one major scientific bottleneck is the analysis of enormous data sets. Traditionally, this task requires human input, but could computers eventually take over? A pair of scientists explore this question by testing whether computers can classify galaxies as well as humans. [Figure caption: galaxies that Galaxy Zoo humans classified as spirals with 95% agreement but the computer algorithm classified as ellipticals with 70% certainty; most, but not all, are cases where the computer got it wrong. Adapted from Kuminski et al. 2016.] Limits of Citizen Science: Galaxy Zoo is an internet-based citizen-science project that uses non-astronomer volunteers to classify galaxy images. This is an innovative way to provide more manpower, but it's still only practical for limited catalog sizes. How do we handle the data from upcoming surveys like the Large Synoptic Survey Telescope (LSST), which will produce billions of galaxy images when it comes online? In a recent study, Evan Kuminski and Lior Shamir, two computer scientists at Lawrence Technological University in Michigan, used a machine-learning algorithm known as Wndchrm to classify a dataset of Sloan Digital Sky Survey (SDSS) galaxies into ellipticals and spirals. The authors' goal is to determine whether their algorithm can classify galaxies as accurately as the human volunteers of Galaxy Zoo. Automatic Classification: After training their classifier on a small set of spiral and elliptical galaxies, Kuminski and Shamir set it loose on a catalog of ~3 million SDSS galaxies. The classifier first computes a set of 2,885 numerical descriptors (like textures, edges, and shapes) for each galaxy image, and then uses these descriptors to categorize the galaxy as spiral or elliptical. [Figure caption: rate of agreement of the computer classification with human classification, for the Galaxy Zoo "superclean" subset, across different ranges of computed classification certainty.]
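
    A minimal sketch of this descriptor-then-classify pipeline might look as follows. The toy feature extractor and the random-forest classifier are stand-ins (Wndchrm's 2,885 descriptors and its own classification scheme are not reproduced here), and the training data are placeholders.

```python
# Sketch of the pipeline described above: extract a numeric feature vector
# per galaxy image, train on a small labeled set, then categorize the rest
# with a certainty score. Features and classifier are crude stand-ins for
# Wndchrm's descriptors (textures, edges, shapes), assumed for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def describe(image: np.ndarray) -> np.ndarray:
    """Toy descriptor vector: intensity stats plus simple edge statistics."""
    gy, gx = np.gradient(image.astype(float))
    edges = np.hypot(gx, gy)
    return np.array([image.mean(), image.std(),
                     edges.mean(), edges.std(),
                     (edges > edges.mean()).mean()])

rng = np.random.default_rng(0)
# Placeholder data: random "images" labeled 0 = elliptical, 1 = spiral.
train_imgs = [rng.random((64, 64)) for _ in range(100)]
train_labels = rng.integers(0, 2, 100)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(np.stack([describe(im) for im in train_imgs]), train_labels)

# Certainty thresholding, as in the study: keep only confident predictions.
proba = clf.predict_proba(describe(rng.random((64, 64))).reshape(1, -1))[0]
label, certainty = int(proba.argmax()), proba.max()
print("spiral" if label else "elliptical", f"certainty={certainty:.2f}")
```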

  16. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services, as well as activities of NERSC staff.

  17. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, M.; Akkerboom, J.D.; Litjens, R.; Berg, J.L. van den

    2012-01-01

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes…

  18. Centering in-the-large: Computing referential discourse segments

    CERN Document Server

    Hahn, U; Hahn, Udo; Strube, Michael

    1997-01-01

    We specify an algorithm that builds up a hierarchy of referential discourse segments from local centering data. The spatial extension and nesting of these discourse segments constrain the reachability of potential antecedents of an anaphoric expression beyond the local level of adjacent center pairs. Thus, the centering model is scaled up to the level of the global referential structure of discourse. An empirical evaluation of the algorithm is supplied.

  19. Human-Computer Interaction: The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are helping to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents; Agents and Accessibility; Agents and Interactions; Agent-centric Paradigms and Approaches; and Collective Agents), the book covers a wealth of novel, original and fully updated material, offering: coherent, in-depth, and timely coverage of the agency perspective in HCI; an authoritative treatment of the subject matter by carefully selected authors; balanced and broad coverage of the subject area, including human, organizational, social, and technological concerns; and hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of researchers…

  1. Computer vision research at Marshall Space Flight Center

    Science.gov (United States)

    Vinz, Frank L.

    1990-01-01

    Orbital docking, inspection, and servicing are operations which have the potential for capability enhancement as well as cost reduction for space operations through the application of computer vision technology. Research at MSFC has been a natural outgrowth of orbital docking simulations for remote manually controlled vehicles such as the Teleoperator Retrieval System and the Orbital Maneuvering Vehicle (OMV). The baseline design of the OMV dictates teleoperator control from a ground station. This necessitates a high data-rate communication network and results in several seconds of time delay. Operational costs and vehicle control difficulties could be alleviated by an autonomous or semi-autonomous control system onboard the OMV, based on a computer vision system with the capability to recognize video images in real time. A concept under development at MSFC with these attributes is based on syntactic pattern recognition. It uses tree graphs for rapid recognition of binary images of known orbiting target vehicles. This technique and others being investigated at MSFC will be evaluated in realistic conditions by the use of MSFC orbital docking simulators. Computer vision is also being applied at MSFC as part of the supporting development for Work Package One of Space Station Freedom.

  2. For operation of the Computer Software Management and Information Center (COSMIC)

    Science.gov (United States)

    Carmon, J. L.

    1983-01-01

    Progress report on current status of computer software management and information center (COSMIC) includes the following areas: inventory, evaluation and publication, marketing, customer service, maintenance and support, and budget summary.

  3. Leveraging human-centered design in chronic disease prevention.

    Science.gov (United States)

    Matheson, Gordon O; Pacione, Chris; Shultz, Rebecca K; Klügl, Martin

    2015-04-01

    Bridging the knowing-doing gap in the prevention of chronic disease requires deep appreciation and understanding of the complexities inherent in behavioral change. Strategies that have relied exclusively on the implementation of evidence-based data have not yielded the desired progress. The tools of human-centered design, used in conjunction with evidence-based data, hold much promise in providing an optimal approach for advancing disease prevention efforts. Directing the focus toward wide-scale education and application of human-centered design techniques among healthcare professionals will rapidly multiply their effective ability to bring the kind of substantial results in disease prevention that have eluded the healthcare industry for decades. This, in turn, would increase the likelihood of prevention by design.

  4. Cognitive approach to human-centered systems design

    Science.gov (United States)

    Taylor, Robert M.

    1996-04-01

    User requirements and system cognitive quality are considered in relation to the integration of new technology, in particular for aiding cognitive functions. Intuitive interfaces and display design matching user mental models and memory schema are identified as human-centered design strategies. Situational awareness is considered in terms of schema theory and perceptual control. A new method for measuring cognitive compatibility is described, and linked to the SRK taxonomy of human performance, in order to provide a framework for analyzing and specifying user cognitive requirements.

  5. High Volume Throughput Computing: Identifying and Characterizing Throughput Oriented Workloads in Data Centers

    CERN Document Server

    Zhan, Jianfeng; Sun, Ninghui; Wang, Lei; Jia, Zhen; Luo, Chunjie

    2012-01-01

    For the first time, this paper systematically identifies three categories of throughput-oriented workloads in data centers: services, data processing applications, and interactive real-time applications, whose targets are to increase the volume of throughput in terms of processed requests, processed data, or the supported maximum number of simultaneous subscribers, respectively. We coin a new term, high volume throughput computing (HVC), to describe these workloads and the data center systems designed for them. We characterize and compare HVC with other computing paradigms, e.g., high throughput computing, warehouse-scale computing, and cloud computing, in terms of levels, workloads, metrics, coupling degree, data scales, and number of jobs or service instances. We also preliminarily report our ongoing work on the metrics and benchmarks for HVC systems, which is the foundation of designing innovative data center systems for HVC workloads.

  6. FcgammaRIIb expression on human germinal center B lymphocytes.

    Science.gov (United States)

    Macardle, Peter J; Mardell, Carolyn; Bailey, Sheree; Wheatland, Loretta; Ho, Alice; Jessup, Claire; Roberton, Donal M; Zola, Heddy

    2002-12-01

    IgG antibody can specifically suppress the antibody response to antigen. This has been explained by the hypothesis that signaling through the B cell antigen receptor is negatively modulated by the co-ligation of immunoglobulin with the receptor for IgG, FcgammaRIIb. We hypothesized that inhibitory signaling through FcgammaRIIb would be counter-productive in germinal center cells undergoing selection by affinity maturation, since these cells are thought to receive a survival/proliferative signal by interacting with antigen displayed on follicular dendritic cells. We have identified and characterized a population of B lymphocytes with low/negative FcgammaRIIb expression that are present in human tonsil. Phenotypically these cells correspond to germinal center B cells and comprise both centroblast and centrocyte populations. In examining expression at the molecular level we determined that these B cells do not express detectable mRNA for FcgammaRIIb. We examined several culture conditions to induce expression of FcgammaRIIb on germinal center cells but could not determine conditions that altered expression. We then examined the functional consequence of cross-linking membrane immunoglobulin and the receptor for IgG on human B lymphocytes. Our results cast some doubt on the value of anti-IgG as a model for antigen-antibody complexes in studying human B cell regulation.

  7. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) is adopted in almost all health care settings, including primary health centers in East Java Province, Indonesia. Among the software packages available are SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most primary health centers have not implemented them successfully. This…

  8. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  9. Human computer interaction using hand gestures

    CERN Document Server

    Premaratne, Prashan

    2014-01-01

    Human computer interaction (HCI) plays a vital role in bridging the 'digital divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards, mice, and remotes alienate old and new generations alike from control interfaces. Hand gesture recognition systems bring hope of connecting people with machines in a natural way. This will lead to consumers being able to use their hands naturally to communicate with any electronic equipment in their 'lounge'. This monograph covers state-of-the-art hand gesture recognition approaches and how they have evolved since their inception. The author also details his research in this area over the past eight years and considers how the future of HCI might unfold. This monograph will serve as a valuable guide for researchers venturing into the world of HCI.

  10. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The themes of HumanCom and EMC focus on the various aspects of human-centric computing for advances in computer science and its applications, and on embedded and multimedia computing, providing an opportunity for academic and industry professionals to discuss the latest issues and progress in these areas. EMC (Advances in Embedded and Multimedia Computing) addresses the various aspects of embedded systems, smart grids, and cloud and multimedia computing. This book therefore includes the various theories and practical applications of human-centric computing and of embedded and multimedia computing.

  11. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  12. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  13. Viewpoint: professionalism and humanism beyond the academic health center.

    Science.gov (United States)

    Swick, Herbert M

    2007-11-01

    Medical professionalism and humanism have long been integral to the practice of medicine, and they will continue to shape practice in the 21st century. In recent years, many advances have been made in understanding the nature of medical professionalism and in efforts to teach and assess professional values and behaviors. As more and more teaching of both medical students and residents occurs in settings outside of academic medical centers, it is critically important that community physicians demonstrate behaviors that exemplify professionalism and humanism. As teachers, they must be committed to being role models for what physicians should be. Activities that are designed to promote and advance professionalism, then, must take place not only in academic settings but also in clinical practice sites that are beyond the academic health center. The author argues that professionalism and humanism share common values and that each can enrich the other. Because the cauldron of practice threatens to erode traditional values of professionalism, not only for individual physicians but also for the medical profession, practicing physicians must incorporate into practice settings activities that are explicitly designed to exemplify those values, not only with students and patients, but also within their communities. The author cites a number of examples of ways in which professionalism and humanism can be fostered by individual physicians as well as by professional organizations.

  14. Computational Models to Synthesize Human Walking

    Institute of Scientific and Technical Information of China (English)

    Lei Ren; David Howard; Laurence Kenney

    2006-01-01

    The synthesis of human walking is of great interest in biomechanics and biomimetic engineering due to its predictive capabilities and potential applications in clinical biomechanics, rehabilitation engineering and biomimetic robotics. In this paper, the various methods that have been used to synthesize human walking are reviewed from an engineering viewpoint. This involves a wide spectrum of approaches, from simple passive walking theories to large-scale computational models integrating the nervous, muscular and skeletal systems. These methods are roughly categorized under four headings: models inspired by the concept of a CPG (Central Pattern Generator), methods based on the principles of control engineering, predictive gait simulation using optimisation, and models inspired by passive walking theory. The shortcomings and advantages of these methods are examined, and future directions are discussed in the context of providing insights into the neural control objectives driving gait and improving the stability of the predicted gaits. Future advancements are likely to be motivated by improved understanding of neural control strategies and the subtle complexities of the musculoskeletal system during human locomotion. It is only a matter of time before predictive gait models become a practical and valuable tool in clinical diagnosis, rehabilitation engineering and robotics.

  15. High Performance Computing in Science and Engineering '16: Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  16. Non-intrusive human fatigue monitoring in command centers

    Science.gov (United States)

    Alsamman, A.; Ratecki, T.

    2011-04-01

    An inexpensive, non-intrusive, vision-based, active fatigue monitoring system is presented. The system employs a single consumer webcam that is modified to operate in the near-IR range. An active IR LED system is developed to facilitate quick localization of the eye pupils. Imaging software tracks the eye features by analyzing intensity areas and their changes in the vicinity of the localization. To quantify the level of fatigue, the algorithm measures the opening of the eyelid, known as PERCLOS. The software runs on the workstation and is designed to draw limited computational power so as not to interfere with the user's task. To overcome the low frame rate and improve real-time monitoring, a two-phase detection and tracking algorithm is implemented. The results presented show that the system successfully monitors the level of fatigue at a low frame rate of 8 fps. The system is well suited to monitoring users in command centers, flight control centers, airport traffic dispatch, and military operation and command centers, but the work can be extended to wearable devices and other environments.
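
    The PERCLOS measure itself reduces to a sliding-window proportion, as the sketch below shows. The 8 fps rate matches the abstract; the window length and closure threshold are common conventions assumed for illustration, not values taken from the paper.

```python
# Minimal sketch of the PERCLOS fatigue metric: the percentage of time the
# eyelids are closed beyond a threshold over a sliding window. Eye-opening
# values would come from the IR pupil tracker; here they are simulated, and
# the closure threshold is an assumed convention, not the paper's value.
from collections import deque

class PerclosMonitor:
    def __init__(self, fps=8, window_s=60, closure_threshold=0.2):
        self.samples = deque(maxlen=fps * window_s)   # sliding window
        self.closure_threshold = closure_threshold    # opening fraction

    def update(self, eye_opening: float) -> float:
        """eye_opening in [0, 1]; returns PERCLOS over the current window."""
        self.samples.append(eye_opening < self.closure_threshold)
        return sum(self.samples) / len(self.samples)

monitor = PerclosMonitor(fps=8)
for opening in [0.9, 0.8, 0.15, 0.1, 0.85, 0.05, 0.9, 0.8]:  # one second
    perclos = monitor.update(opening)
print(f"PERCLOS = {perclos:.2f}")  # sustained high values may signal fatigue
```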

  17. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  18. Geometric Modeling and Reasoning of Human-Centered Freeform Products

    CERN Document Server

    Wang, Charlie C L

    2013-01-01

    The recent trend in user-customized product design requires the shape of products to be automatically adjusted according to the human body's shape, so that people will feel more comfortable when wearing these products. Geometric approaches can be used to design the freeform shape of products worn by people, which can greatly improve the efficiency of design processes in various industries involving customized products (e.g., garment design, toy design, jewel design, shoe design, and the design of medical devices). These products are usually composed of very complex geometric shapes (represented by free-form surfaces), and are driven not by a parameter table but by a digital human model with free-form shapes, or by parts of human bodies (e.g., wrist, foot, and head models). Geometric Modeling and Reasoning of Human-Centered Freeform Products introduces the algorithms of human body reconstruction, freeform product modeling, constraining and reconstructing freeform products, and shape optimization for improving...

  19. Removing the center from computing: biology's new mode of digital knowledge production.

    Science.gov (United States)

    November, Joseph

    2011-06-01

    This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's centralized character posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.

  1. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

  2. Computational insight into nitration of human myoglobin.

    Science.gov (United States)

    Lin, Ying-Wu; Shu, Xiao-Gang; Du, Ke-Jie; Nie, Chang-Ming; Wen, Ge-Bo

    2014-10-01

    Protein nitration is an important post-translational modification regulating protein structure and function, especially for heme proteins. Myoglobin (Mb) is an ideal protein model for investigating the structure and function relationship of heme proteins. With limited structural information available for nitrated heme proteins from experiments, we herein performed a molecular dynamics study of human Mb with successive nitration of Tyr103, Tyr146, Trp7 and Trp14. We made a detailed comparison of protein motions, intramolecular contacts and internal cavities of nitrated Mbs with that of native Mb. It showed that although nitration of both Tyr103 and Tyr146 slightly alters the local conformation of heme active site, further nitration of both Trp7 and Trp14 shifts helix A apart from the rest of protein, which results in altered internal cavities and forms a water channel, representing an initial stage of Mb unfolding. The computational study provides an insight into the nitration of heme proteins at an atomic level, which is valuable for understanding the structure and function relationship of heme proteins in non-native states by nitration.

  3. Lightness computation by the human visual system

    Science.gov (United States)

    Rudd, Michael E.

    2017-05-01

    A model of achromatic color computation by the human visual system is presented, which is shown to account in an exact quantitative way for a large body of appearance matching data collected with simple visual displays. The model equations are closely related to those of the original Retinex model of Land and McCann. However, the present model differs in important ways from Land and McCann's theory in that it invokes additional biological and perceptual mechanisms, including contrast gain control, different inherent neural gains for incremental and decremental luminance steps, and two types of top-down influence on the perceptual weights applied to local luminance steps in the display: edge classification and attentional windowing of spatial integration. Arguments are presented to support the claim that these various visual processes must be instantiated by a particular underlying neural architecture. By pointing to correspondences between the architecture of the model and findings from visual neurophysiology, this paper suggests that edge classification involves a top-down gating of neural edge responses in early visual cortex (cortical areas V1 and/or V2) while spatial integration windowing occurs in cortical area V4 or beyond.
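
    The edge-integration computation at the heart of such models can be sketched as a weighted sum of log-luminance steps along a path from an anchor region, as below. The gain values standing in for the incremental/decremental asymmetry and for edge classification are illustrative, not the model's fitted parameters.

```python
# Sketch of the edge-integration idea underlying the model described above
# (and Land & McCann's Retinex): lightness at a target patch is computed by
# summing weighted log-luminance steps along a path from an anchor region.
# The gains below are assumed placeholder values, not fitted parameters.
import math

# Luminances (cd/m^2) of regions crossed on the way to the target patch.
path = [100.0, 40.0, 55.0, 20.0]

GAIN_DECREMENT = 1.0   # assumed neural gain for dark-going edges
GAIN_INCREMENT = 0.7   # assumed smaller gain for light-going edges

log_lightness = math.log10(path[0])     # anchor: highest luminance = "white"
for prev, cur in zip(path, path[1:]):
    step = math.log10(cur / prev)       # signed log-luminance edge step
    gain = GAIN_DECREMENT if step < 0 else GAIN_INCREMENT
    log_lightness += gain * step        # weighted spatial integration

print(f"target log-lightness relative to anchor: {log_lightness:.3f}")
```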

  4. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume code, coupled with SolidWorks, a modeling package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis consists mainly of a fluid-dynamics calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models (e.g., T-branches and angled vessels) were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.

  5. Life Sciences Division and Center for Human Genome Studies

    Energy Technology Data Exchange (ETDEWEB)

    Spitzmiller, D.; Bradbury, M.; Cram, S. (comps.)

    1992-05-01

    This report summarizes the research and development activities of Los Alamos National Laboratory's Life Sciences Division and the biological aspects of the Center for Human Genome Studies for the calendar year 1991. Selected research highlights include: yeast artificial chromosome libraries from flow-sorted human chromosomes 16 and 21; distances between the antigen binding sites of three murine antibody subclasses measured using neutron and x-ray scattering; NFCR 10th anniversary highlights; kinase-mediated differences found in the cell cycle regulation of normal and transformed cells; and detecting mutations that cause Gaucher's disease by denaturing gradient gel electrophoresis. Project descriptions include: genomic structure and regulation, molecular structure, cytometry, cell growth and differentiation, radiation biology and carcinogenesis, and pulmonary biology.

  6. Building reactive copper centers in human carbonic anhydrase II.

    Science.gov (United States)

    Song, He; Weitz, Andrew C; Hendrich, Michael P; Lewis, Edwin A; Emerson, Joseph P

    2013-08-01

    Reengineering metalloproteins to generate new biologically relevant metal centers is an effective way to test our understanding of the structural and mechanistic features that steer chemical transformations in biological systems. Here, we report thermodynamic data characterizing the formation of two type-2 copper sites in carbonic anhydrase, and experimental evidence showing that one of these new copper centers has characteristics similar to a variety of well-characterized copper centers in synthetic models and enzymatic systems. Human carbonic anhydrase II is known to bind two Cu(2+) ions; these binding events were explored using modern isothermal titration calorimetry techniques that have become a proven method to accurately measure metal-binding thermodynamic parameters. The two Cu(2+)-binding events have different affinities (K_a approximately 5 × 10^12 and 1 × 10^10), and both are enthalpically driven processes. Reconstituting these Cu(2+) sites under a range of conditions has allowed us to assign the Cu(2+)-binding event to the three-histidine native metal-binding site. Our initial efforts to characterize these Cu(2+) sites have yielded data showing distinctive (and noncoupled) EPR signals associated with each copper-binding site, and showing that this reconstituted enzyme can activate hydrogen peroxide to catalyze the oxidation of 2-aminophenol.

  7. On the Rhetorical Contract in Human-Computer Interaction.

    Science.gov (United States)

    Wenger, Michael J.

    1991-01-01

    An exploration of the rhetorical contract--i.e., the expectations for appropriate interaction--as it develops in human-computer interaction revealed that direct manipulation interfaces were more likely to establish social expectations. Study results suggest that the social nature of human-computer interactions can be examined with reference to the…

  8. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    The development of selective agonists of the δ-opioid receptor, as well as models of the interaction of ligands with this receptor, are subjects of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (e_rel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER, and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with e_rel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus suggests a reliable model of DOR. The newly generated model can be used for further in silico experiments and will enable faster and more accurate design of selective and effective ligands for the δ-opioid receptor.

  9. Science Letters: Human-centered modeling for style-based adaptive games

    Institute of Scientific and Technical Information of China (English)

    Chee-onn WONG; Jon-gin KIM; Eun-jung HAN; Kee-chui JUNG

    2009-01-01

    This letter proposes a categorization matrix to analyze the playing style of a computer game player in the shooting game genre. Our aim is to use human-centered modeling as a strategy for adaptive games, based on an entertainment measure that evaluates the playing experience. We utilized a self-organizing map (SOM) to cluster players' styles with data obtained while playing the game. We further argue that style-based adaptation contributes to higher enjoyment, and this is reflected in our experiment using a supervised multilayer perceptron (MLP) network.
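
    A minimal version of the style-clustering step might use a small SOM over per-player feature vectors, as sketched below. The three-feature encoding, map size, and training schedule are assumptions for illustration, not the letter's actual setup.

```python
# Toy sketch of SOM-based style clustering: a small self-organizing map
# trained on per-player feature vectors (e.g., accuracy, aggression, pace).
# Feature encoding, map size, and schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
players = rng.random((200, 3))          # placeholder playing-style vectors

nodes = rng.random((4, 4, 3))           # 4x4 SOM codebook
EPOCHS = 50
for epoch in range(EPOCHS):
    lr = 0.5 * (1 - epoch / EPOCHS)     # decaying learning rate
    sigma = 2.0 * (1 - epoch / EPOCHS) + 0.5
    for v in players:
        d = np.linalg.norm(nodes - v, axis=2)
        bi, bj = np.unravel_index(d.argmin(), d.shape)  # best matching unit
        ii, jj = np.indices((4, 4))
        g = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        nodes += lr * g[..., None] * (v - nodes)        # pull neighborhood

# A player's style is the map cell of the best-matching unit.
style = np.unravel_index(
    np.linalg.norm(nodes - players[0], axis=2).argmin(), (4, 4))
print("player 0 mapped to SOM cell", style)
```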

  10. The inhuman computer/the too-human psychotherapist.

    Science.gov (United States)

    Nadelson, T

    1987-10-01

    There has been an understandable rejection by psychotherapists of any natural language processing (computer/human interaction by means of ordinary language exchange) which is intended to embrace aspects of psychotherapy. For at least twenty years, therapists have experimented with computer programs for specific and general purposes, with reported success. This paper describes some aspects of artificial intelligence used in computer-mediated or computer-assisted therapy, and the utility of such efforts in a general reevaluation of human-to-human psychotherapy.

  11. Brain-Computer Interfaces and Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Anton

    2010-01-01

    Advances in cognitive neuroscience and brain imaging technologies have started to provide us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that can monitor some of the physical processes that occur within the brain.

  12. Simulating Human Cognitive Using Computational Verb Theory

    Institute of Scientific and Technical Information of China (English)

    YANGTao

    2004-01-01

    Modeling and simulation of a life system is closely connected to the modeling of cognition, especially for advanced life systems. The primary difference between an advanced life system and a digital computer is that the advanced life system consists of a body with a mind, while a digital computer is only a mind in a formal sense. To model an advanced life system, one needs to ground symbols into a body in which a digital computer is embedded. In this paper, a computational verb theory is proposed as a new paradigm for grounding symbols into the outputs of sensors. On one hand, a computational verb can preserve the physical "meanings" of the dynamics of sensor data, such that a symbolic system can be used to manipulate physical meanings instead of abstract tokens in the digital computer. On the other hand, the physical meanings of an abstract symbol/token, which is usually the output of a reasoning process in the digital computer, can be restored and fed back to the actuators. Therefore, the computational verb theory bridges the gap between symbols and physical reality from the dynamic cognition perspective.

  13. Human-Computer Interaction (HCI) in Educational Environments: Implications of Understanding Computers as Media.

    Science.gov (United States)

    Berg, Gary A.

    2000-01-01

    Reviews literature in the field of human-computer interaction (HCI) as it applies to educational environments. Topics include the origin of HCI; human factors; usability; computer interface design; goals, operations, methods, and selection (GOMS) models; command language versus direct manipulation; hypertext; visual perception; interface…

  14. Human-Computer Etiquette: Cultural Expectations and the Design Implications They Place on Computers and Technology

    CERN Document Server

    Hayes, Caroline C

    2010-01-01

    Written by experts from various fields, this edited collection explores a wide range of issues pertaining to how computers evoke human social expectations. The book illustrates how socially acceptable conventions can strongly impact the effectiveness of human-computer interactions and how to consider such norms in the design of human-computer interfaces. Providing a complete introduction to the design of social responses to computers, the text emphasizes the value of social norms in the development of usable and enjoyable technology. It also describes the role of socially correct behavior…

  15. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis proposes a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans control safety-critical functions. A new systems accident model is developed, based upon modern systems theory and human cognitive processes, to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  16. Life Sciences Division and Center for Human Genome Studies 1994

    Energy Technology Data Exchange (ETDEWEB)

    Cram, L.S.; Stafford, C. (comps.)

    1995-09-01

    This report summarizes the research and development activities of the Los Alamos National Laboratory's Life Sciences Division and the biological aspects of the Center for Human Genome Studies for the calendar year 1994. The technical portion of the report is divided into two parts: (1) selected research highlights and (2) research projects and accomplishments. The research highlights provide a more detailed description of a select set of projects. A technical description of all projects is presented in sufficient detail so that the informed reader will be able to assess the scope and significance of each project. Summaries useful to the casual reader desiring general information have been prepared by the group leaders and appear in each group overview. Investigators on the staff of the Life Sciences Division will be pleased to provide further information.

  17. Dynamic Principles of Center of Mass in Human Walking

    CERN Document Server

    Fan, Yifang; Fan, Yubo; Li, Zhiyu; Lv, Changsheng

    2010-01-01

    We present results of an analytic and numerical calculation that studies the relationship between the time of initial foot contact and the ground reaction force of human gait, and explores the dynamic principles of the center of mass. Assuming the ground reaction force of both feet to be the same in the same phase of a stride cycle, we establish the relationships between the time of initial foot contact and the ground reaction force, acceleration, velocity, displacement and average kinetic energy of the center of mass. We employ the dispersion to analyze the effect that the time of initial foot contact imposes upon these physical quantities. Our study reveals that when the time of one foot's initial contact falls right in the middle of the other foot's stride cycle, these physical quantities reach extrema. An action function has been identified as the dispersion of the physical quantities, and optimized analysis is used to prove the least-action principle in gait. In addition to being very significant to the research…
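
    The chain from ground reaction force to the center-of-mass quantities listed above follows directly from Newton's second law and numerical integration, as the sketch below shows. The force profile and body mass are placeholders rather than measured gait data.

```python
# Sketch of the quantities related above: given the total vertical ground
# reaction force (GRF) of both feet over a stride, Newton's second law gives
# the center-of-mass acceleration, and numerical integration gives velocity
# and displacement. The GRF profile is a crude stand-in for measured data.
import numpy as np

m, g = 70.0, 9.81             # body mass (kg), gravity (m/s^2)
t = np.linspace(0, 1.0, 500)  # one stride cycle (s)
# Toy oscillating vertical GRF, normalized so it averages body weight.
grf = m * g * (1 + 0.25 * np.sin(2 * np.pi * 2 * t))

dt = t[1] - t[0]
acc = grf / m - g             # a(t) = F(t)/m - g
vel = np.cumsum(acc) * dt     # v(t) = integral of a
disp = np.cumsum(vel) * dt    # x(t) = integral of v

# Dispersion (variance) of a quantity, used in the paper to compare the
# effect of initial-foot-contact timing on the center of mass:
print(f"velocity dispersion: {np.var(vel):.4e} (m/s)^2")
```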

  18. Applying Human-Centered Design Methods to Scientific Communication Products

    Science.gov (United States)

    Burkett, E. R.; Jayanty, N. K.; DeGroot, R. M.

    2016-12-01

    Knowing your users is a critical part of developing anything to be used or experienced by a human being. User interviews, journey maps, and personas are all techniques commonly employed in human-centered design practices because they have proven effective for informing the design of products and services that meet the needs of users. Many non-designers are unaware of the usefulness of personas and journey maps. Scientists who are interested in developing more effective products and communication can adopt and employ user-centered design approaches to better reach intended audiences. Journey mapping is a qualitative data-collection method that captures the story of a user's experience over time as related to the situation or product that requires development or improvement. Journey maps help define user expectations, where they are coming from, what they want to achieve, what questions they have, their challenges, and the gaps and opportunities that can be addressed by designing for them. A persona is a tool used to describe the goals and behavioral patterns of a subset of potential users or customers. The persona is a qualitative data model that takes the form of a character profile, built upon data about the behaviors and needs of multiple users. Gathering data directly from users avoids the risk of basing models on assumptions, which are often limited by misconceptions or gaps in understanding. Journey maps and user interviews together provide the data necessary to build the composite character that is the persona. Because a persona models the behaviors and needs of the target audience, it can then be used to make informed product design decisions. We share the methods and advantages of developing and using personas and journey maps to create more effective science communication products.

  19. Design experience of a base-isolation system applied to a computer center building

    Energy Technology Data Exchange (ETDEWEB)

    Hasebe, Akiyoshi; Kojima, Hideo; Tamura, Kazuo (Tohoku Electric Power Co., Sendai (Japan))

    1991-06-01

    Design experience with the base-isolated new computer center of the Tohoku Electric Power Co. is described. After completion, this building will be the largest isolated building in Japan, with a total floor space of approximately 10,000 m². High-damping laminated rubber bearings are used as the base-isolation devices.

  1. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  2. SAM: The "Search and Match" Computer Program of the Escherichia coli Genetic Stock Center

    Science.gov (United States)

    Bachmann, B. J.; And Others

    1973-01-01

    Describes a computer program used at a genetic stock center to locate particular strains of bacteria. The program can match up to 30 strain descriptions requested by a researcher with the records on file. The program can be applied in many other fields.

  3. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    Science.gov (United States)

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.

  5. Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces.

    Science.gov (United States)

    Oguz, S O; Kucukyilmaz, A; Sezgin, Tevfik Metin; Basdogan, C

    2012-01-01

    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this is to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audio-visual cues in conveying negotiation-related behaviors. The game is centered around a real-time, continuous, two-party negotiation scenario based on the existing game-theory and negotiation literature. During the game, humans are confronted with a computer opponent that can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors associated with human negotiation can be incorporated into human-computer interaction, and that the addition of haptic cues provides a statistically significant increase in the human-recognition accuracy of machine-displayed behaviors. Beyond conveying these negotiation-related behaviors, we also report on game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game-theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
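
    To make the tit-for-tat point concrete, here is a minimal sketch, not the authors' actual game: two parties repeatedly choose to concede or compete, a tit-for-tat agent mirrors the opponent's previous move, and the combined utility is tallied. The payoff table is illustrative only.

      # Minimal tit-for-tat sketch; the payoff numbers are illustrative.
      PAYOFFS = {  # (my move, their move) -> (my utility, their utility)
          ("concede", "concede"): (3, 3),
          ("concede", "compete"): (0, 5),
          ("compete", "concede"): (5, 0),
          ("compete", "compete"): (1, 1),
      }

      def tit_for_tat(opponent_history):
          # cooperate first, then mirror the opponent's last move
          return "concede" if not opponent_history else opponent_history[-1]

      def combined_utility(opponent_moves):
          total, history = 0, []
          for their_move in opponent_moves:
              my_move = tit_for_tat(history)
              mine, theirs = PAYOFFS[(my_move, their_move)]
              total += mine + theirs
              history.append(their_move)
          return total

      print(combined_utility(["concede", "compete", "concede", "concede"]))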

  6. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

    This paper presents two methods for accurately computing the centers of periodic regions. One method suits the general M-sets with integer index number, the other the general M-sets with negative integer index number. Both methods improve the precision of the computation by transforming the polynomial equations that determine the periodic regions' centers. We primarily discuss the general M-sets with negative integer index, and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. By applying Newton's method to the transformed polynomial equation that determines the periodic regions' centers, we obtain the centers' coordinates with at least 48 significant digits after the decimal point in both the real and imaginary parts. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k = 3, 4, 5, 6) for the index numbers α = −25, −24, …, −1, all of which have high numerical accuracy.
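
    The paper's transformed equations for the general, negative-index M-sets are not reproduced in this record. For the classical case (index 2), however, the centers of period-k regions are roots of the iterated polynomial p_k(c), with p_1(c) = c and p_(k+1)(c) = p_k(c)^2 + c, and the same high-precision Newton iteration applies, as this sketch with mpmath shows:

      from mpmath import mp, mpc

      mp.dps = 60  # work with 60 decimal digits of precision

      def newton_center(k, c0, steps=200):
          """Newton's method on p_k(c) = 0; roots are period-k region centers."""
          c = mpc(c0)
          for _ in range(steps):
              p, dp = mpc(0), mpc(0)
              for _ in range(k):       # evaluate p_k and p_k' by the recurrence
                  p, dp = p * p + c, 2 * p * dp + 1
              if abs(p) < mp.mpf(10) ** (-mp.dps + 5):
                  break
              c -= p / dp              # Newton update
          return c

      # e.g. the real period-3 center near c = -1.75 of the classical M-set
      print(newton_center(3, -1.8))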

  7. Data center network architecture in cloud computing:review, taxonomy, and open research issues

    Institute of Scientific and Technical Information of China (English)

    Han QI; Muhammad SHIRAZ; Jie-yao LIU; Abdullah GANI; Zulkanain ABDUL RAHMAN; Torki AALTAMEEM

    2014-01-01

    The data center network (DCN), which is an important component of data centers, consists of a large number of hosted servers and switches connected with high-speed communication links. A DCN enables the centralized deployment of resources and on-demand access to the information and services of data centers by users. In recent years, the scale of the DCN has constantly increased with the widespread use of cloud-based services and the unprecedented amount of data delivery in and between data centers, whereas the traditional DCN architecture lacks the aggregate bandwidth, scalability, and cost effectiveness needed to cope with the increasing demands of tenants in accessing the services of cloud data centers. Therefore, the design of a novel DCN architecture with the features of scalability, low cost, robustness, and energy conservation is required. This paper reviews the recent research findings and technologies of DCN architectures to identify the issues in the existing DCN architectures for cloud computing. We develop a taxonomy for the classification of the current DCN architectures, and also qualitatively analyze the traditional and contemporary DCN architectures. Moreover, the DCN architectures are compared on the basis of significant characteristics, such as bandwidth, fault tolerance, scalability, overhead, and deployment cost. Finally, we put forward open research issues in the deployment of scalable, low-cost, robust, and energy-efficient DCN architectures for data centers in computational clouds.

  8. Rationale awareness for quality assurance in iterative human computation processes

    CERN Document Server

    Xiao, Lu

    2012-01-01

    Human computation refers to the outsourcing of computation tasks to human workers. It offers a new direction for solving a variety of problems and calls for innovative ways of managing human computation processes. The majority of human computation tasks take a parallel approach, whereas the potential of an iterative approach, i.e., having workers iteratively build on each other's work, has not been sufficiently explored. This study investigates whether and how human workers' awareness of previous workers' rationales affects the performance of the iterative approach in a brainstorming task and a rating task. Rather than viewing this work as a conclusive piece, the author believes that this research endeavor is just the beginning of a new research focus that examines and supports meta-cognitive processes in crowdsourcing activities.
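
    Schematically, and purely as an assumption-level sketch rather than the study's actual tasks, the two workflows differ as below: a parallel task aggregates independent answers, while an iterative task passes each worker the previous draft along with its rationale.

      from statistics import mode

      def parallel(task, workers):
          # each worker answers independently; answers are aggregated afterwards
          return mode(worker(task, None)[0] for worker in workers)

      def iterative(task, workers):
          # each worker sees, and may revise, the previous draft and its rationale
          draft = rationale = None
          for worker in workers:
              draft, rationale = worker(task, (draft, rationale))
          return draft

      # toy workers returning (answer, rationale); invented for illustration
      w = lambda ans: (lambda task, context: (ans, f"I chose {ans}"))
      print(parallel("pick a label", [w("A"), w("B"), w("A")]))  # -> "A"
      print(iterative("pick a label", [w("A"), w("B")]))         # -> "B"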

  9. Naturalistic Cognition: A Research Paradigm for Human-Centered Design

    Directory of Open Access Journals (Sweden)

    Peter Storkerson

    2010-01-01

    Naturalistic thinking and knowing, the tacit, experiential, and intuitive reasoning of everyday interaction, have long been regarded as inferior to formal reason and labeled primitive, fallible, subjective, superstitious, and in some cases ineffable. But naturalistic thinking is more rational and definable than it appears. It is also relevant to design. Inquiry into the mechanisms of naturalistic thinking and knowledge can bring its resources into focus and enable designers to create better, human-centered designs for use in real-world settings. This article makes a case for the explicit, formal study of implicit, naturalistic thinking within the fields of design. It develops a framework for defining and studying naturalistic thinking and knowledge, for integrating them into design research and practice, and for developing a more integrated, consistent theory of knowledge in design. It will (a) outline historical definitions of knowledge, attitudes toward formal and naturalistic thinking, and the difficulties presented by the co-presence of formal and naturalistic thinking in design, (b) define and contrast formal and naturalistic thinking as two distinct human cognitive systems, (c) demonstrate the importance of naturalistic cognition in formal thinking and real-world judgment, (d) demonstrate methods for researching naturalistic thinking that can be of use in design, and (e) briefly discuss the impact on design theory of admitting naturalistic thinking as valid, systematic, and knowable.

  10. Improving flight condition situational awareness through Human Centered Design.

    Science.gov (United States)

    Craig, Carol

    2012-01-01

    In aviation, there is currently a lack of accurate and timely situational information, specifically weather data, which is essential when dealing with the unpredictable complexities that can arise while flying. For example, weather conditions that require immediate evasive action by the flight crew, such as isolated heavy rain, micro bursts, and atmospheric turbulence, require that the flight crew receive near real-time and precise information about the type, position, and intensity of those conditions. Human factors issues arise in considering how to display the various sources of weather information to the users of that information and how to integrate this display into the existing environment. In designing weather information display systems, it is necessary to meet the demands of different users, which requires an examination of the way in which the users process and use weather information. Using Human Centered Design methodologies and concepts will result in a safer, more efficient and more intuitive solution. Specific goals of this approach include 1) Enabling better fuel planning; 2) Allowing better divert strategies; 3) Ensuring pilots, navigators, dispatchers and mission planners are referencing weather from the same sources; 4) Improving aircrew awareness of aviation hazards such as turbulence, icing, hail and convective activity; 5) Addressing inconsistent availability of hazard forecasts outside the United States Air Defense Identification Zone (ADIZ); and 6) Promoting goal driven approaches versus event driven (prediction).

  11. A Survey of Digital Humanities Centers in the United States. CLIR Publication No. 143

    Science.gov (United States)

    Zorich, Diane M.

    2008-01-01

    In preparation for the 2008 Scholarly Communications Institute (SCI 6) focused on humanities research centers, the Council on Library and Information Resources (CLIR) commissioned a survey of digital humanities centers (DHCs). The immediate goals of the survey were to identify the extent of these centers and to explore their financing,…

  12. Pedagogical Strategies for Human and Computer Tutoring.

    Science.gov (United States)

    Reiser, Brian J.

    The pedagogical strategies of human tutors in problem solving domains are described and the possibility of incorporating these techniques into computerized tutors is examined. GIL (Graphical Instruction in LISP), an intelligent tutoring system for LISP programming, is compared to human tutors teaching the same material in order to identify how the…

  13. Shared resource control between human and computer

    Science.gov (United States)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system actively monitoring human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when those actions change the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals, and to maintain up-to-date knowledge of the state of the world. The system can thus inform the operator when a human action would undo a goal already achieved by the system, warn when an action would render a system goal unachievable, and efficiently replan the establishment of goals after human intervention.

  14. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    This paper focuses on current trends in the brain research domain and on the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science, and Internet of Things (IoT) devices. The proposed model of the human brain assumes a basic similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving, and eliminating adversaries. The brain pursues these goals and more; a being's movement, actions, and speech are sustained by the five vital senses and equilibrium. The strategy of the chess game helps us understand the human brain better and replicate it more easily in the proposed ‘Software and Hardware’ SAH Model.

  16. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  17. A Glance into the Future of Human Computer Interactions

    CERN Document Server

    Farooq, Umer; Nazir, Sohail

    2011-01-01

    Computers have a direct impact on our lives nowadays. Human interaction with the computer has changed with the passage of time: as technology improved, human-computer interaction became better. Today we are served by operating systems that hide all the complexity of the hardware, so we carry out our computation in a very convenient way, irrespective of the processes occurring at the hardware level. Although human-computer interaction has improved, it is not done yet. In the future, the computer's role in our lives will be far greater; indeed, our lives will revolve around artificial intelligence. In that future, time will be the scarcest resource, and wasting it on a keyboard entry or a mouse input will be unbearable, so what will be needed is a computer interaction environment that reduces complexity while also minimizing the time wasted in human-computer interaction. Accordingly, in our future the amount of computation will also increase; it would n...

  19. Can the human brain do quantum computing?

    Science.gov (United States)

    Rocha, A F; Massad, E; Coutinho, F A B

    2004-01-01

    The electrical properties of the membrane have been the key issue in the understanding of cerebral physiology for almost two centuries. But molecular neurobiology has now discovered that biochemical transactions play an important role in neuronal computations. Quantum computing (QC) is becoming a reality, both from the theoretical point of view and in practical applications. Quantum mechanics is the most accurate description at the atomic level, and it lies behind all chemistry, which provides the basis for biology ... maybe the magic of entanglement is also crucial for life. The purpose of the present paper is to discuss the dendritic spine as a quantum computing device, taking into account what is known about the physiology of the glutamate receptors and the cascade of biochemical transactions triggered by the glutamate binding to these receptors.

  20. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger “European Supercomputer” in Germany, where the hardware costs alone will be hundreds of millions of Euros – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructur...

  1. The GridKa Tier-1 Computing Center within the ALICE Grid Framework

    Science.gov (United States)

    Park, WooJin J.; Jung, Christopher; Heiss, Andreas; Petzold, Andreas; Schwarz, Kilian

    2014-06-01

    The GridKa computing center, hosted by Steinbuch Centre for Computing at the Karlsruhe Institute for Technology (KIT) in Germany, is serving as the largest Tier-1 center used by the ALICE collaboration at the LHC. In 2013, GridKa provides 30k HEPSPEC06, 2.7 PB of disk space, and 5.25 PB of tape storage to ALICE. The 10Gbit/s network connections from GridKa to CERN, several Tier-1 centers and the general purpose network are used by ALICE intensively. In 2012 a total amount of ~1 PB was transferred to and from GridKa. As Grid framework, AliEn (ALICE Environment) is being used to access the resources, and various monitoring tools including the MonALISA (MONitoring Agent using a Large Integrated Services Architecture) are always running to alert in case of any problem. GridKa on-call engineers provide 24/7 support to guarantee minimal loss of availability of computing and storage resources in case of hardware or software problems. We introduce the GridKa Tier-1 center from the viewpoint of ALICE services.

  2. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    ""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  3. STUDY ON HUMAN-COMPUTER SYSTEM FOR STABLE VIRTUAL DISASSEMBLY

    Institute of Scientific and Technical Information of China (English)

    Guan Qiang; Zhang Shensheng; Liu Jihong; Cao Pengbing; Zhong Yifang

    2003-01-01

    Cooperative work between human beings and computers based on virtual reality (VR) is investigated in order to plan disassembly sequences more efficiently. A three-layer model of human-computer cooperative virtual disassembly is built, and the corresponding human-computer system for stable virtual disassembly is developed. In this system, an immersive and interactive virtual disassembly environment has been created to provide planners with a more visual working scene. For cooperative disassembly, an intelligent module for stability analysis of disassembly operations is embedded into the human-computer system to help planners carry out disassembly tasks. The supporting matrix for stability analysis of disassembly operations is defined, and the method of stability analysis is detailed. Based on this approach, the stability of any disassembly operation can be analyzed to guide the manual virtual disassembly. Finally, a disassembly case in the virtual environment is given to demonstrate the validity of the above ideas.
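
    The paper's supporting matrix is not reproduced in this record; as a hedged sketch of the general idea, the snippet below uses a boolean matrix S, where S[i][j] being true is taken to mean that part i supports part j, and treats removing a part as stable only if no remaining part rests on it. The matrix semantics and the example assembly are assumptions for illustration.

      # Hedged sketch of a stability check via a supporting matrix.
      S = [
          [False, True,  True ],  # part 0 supports parts 1 and 2
          [False, False, True ],  # part 1 supports part 2
          [False, False, False],  # part 2 supports nothing
      ]

      def stable_to_remove(part, remaining):
          # removing `part` is stable if no other remaining part rests on it
          return not any(S[part][j] for j in remaining if j != part)

      remaining = {0, 1, 2}
      for part in (2, 1, 0):          # candidate top-down disassembly sequence
          assert stable_to_remove(part, remaining)
          remaining.discard(part)
      print("sequence 2, 1, 0 is stable")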

  4. Cognition beyond the brain computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  5. A Taxonomy and Survey of Energy-Efficient Data Centers and Cloud Computing Systems

    CERN Document Server

    Beloglazov, Anton; Lee, Young Choon; Zomaya, Albert

    2010-01-01

    Traditionally, the development of computing systems has been focused on performance improvements driven by the demand of applications from consumer, scientific and business domains. However, the ever increasing energy consumption of computing systems has started to limit further performance growth due to overwhelming electricity bills and carbon dioxide footprints. Therefore, the goal of the computer system design has been shifted to power and energy efficiency. To identify open challenges in the area and facilitate future advancements it is essential to synthesize and classify the research on power and energy-efficient design conducted to date. In this work we discuss causes and problems of high power / energy consumption, and present a taxonomy of energy-efficient design of computing systems covering the hardware, operating system, virtualization and data center levels. We survey various key works in the area and map them to our taxonomy to guide future design and development efforts. This chapter is conclu...
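
    A commonly used abstraction in this line of work, and an assumption here since the chapter's exact model is not quoted, is a linear server power model, P(u) = P_idle + (P_max - P_idle) * u for CPU utilization u. It makes the energy case for consolidation easy to see, because idle power dominates:

      # Linear power model sketch; the wattage figures are illustrative.
      P_IDLE, P_MAX = 175.0, 250.0  # watts

      def power(utilization):
          """Instantaneous power draw at CPU utilization in [0, 1]."""
          return P_IDLE + (P_MAX - P_IDLE) * utilization

      def energy_kwh(utilization, hours):
          return power(utilization) * hours / 1000.0

      # consolidating two 30%-utilized servers onto one 60%-utilized server
      spread = 2 * energy_kwh(0.3, 24)  # two lightly loaded machines
      packed = energy_kwh(0.6, 24)      # one machine; the other is switched off
      print(f"daily energy: spread={spread:.1f} kWh, packed={packed:.1f} kWh")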

  6. 75 FR 76995 - National Toxicology Program (NTP); Center for the Evaluation of Risks to Human Reproduction...

    Science.gov (United States)

    2010-12-10

    DEPARTMENT OF HEALTH AND HUMAN SERVICES, National Toxicology Program (NTP); Center for the Evaluation of Risks to Human Reproduction (CERHR); NTP Workshop: Role of Environmental Chemicals in the Development of Diabetes and... The NTP established the Center for the Evaluation of Risks to Human Reproduction (CERHR) in 1998 (63 FR 68782) to...

  7. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research work horse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  8. Computational modeling and analysis of the hydrodynamics of human swimming

    Science.gov (United States)

    von Loebbecke, Alfred

    Computational modeling and simulations are used to investigate the hydrodynamics of competitive human swimming. The simulations employ an immersed boundary (IB) solver that allows us to simulate viscous, incompressible, unsteady flow past complex, moving/deforming three-dimensional bodies on stationary Cartesian grids. This study focuses on the hydrodynamics of the "dolphin kick". Three female and two male Olympic level swimmers are used to develop kinematically accurate models of this stroke for the simulations. A simulation of a dolphin undergoing its natural swimming motion is also presented for comparison. CFD enables the calculation of flow variables throughout the domain and over the swimmer's body surface during the entire kick cycle. The feet are responsible for all thrust generation in the dolphin kick. Moreover, it is found that the down-kick (ventral position) produces more thrust than the up-kick. A quantity of interest to the swimming community is the drag of a swimmer in motion (active drag). Accurate estimates of this quantity have been difficult to obtain in experiments but are easily calculated with CFD simulations. Propulsive efficiencies of the human swimmers are found to be in the range of 11% to 30%. The dolphin simulation case has a much higher efficiency of 55%. Investigation of vortex structures in the wake indicate that the down-kick can produce a vortex ring with a jet of accelerated fluid flowing through its center. This vortex ring and the accompanying jet are the primary thrust generating mechanisms in the human dolphin kick. In an attempt to understand the propulsive mechanisms of surface strokes, we have also conducted a computational analysis of two different styles of arm-pulls in the backstroke and the front crawl. These simulations involve only the arm and no air-water interface is included. Two of the four strokes are specifically designed to take advantage of lift-based propulsion by undergoing lateral motions of the hand

  9. Computer games as a new ontological reality of human existence

    Directory of Open Access Journals (Sweden)

    Maksim Shymeiko

    2015-05-01

    The article considers the ontological dimension of the phenomenon of computer games and their role in how modern man perceives the world and himself. It describes the characteristic ontological features of the computer game as a virtual world of an intangible character, and it reveals the positive and negative roles of computer games in the formation of the meaning of human life.

  10. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  11. Mitochondria in the Center of Human Eosinophil Apoptosis and Survival

    Directory of Open Access Journals (Sweden)

    Pinja Ilmarinen

    2014-03-01

    Eosinophils are abundantly present in most phenotypes of asthma, and they contribute to the maintenance and exacerbations of the disease. Regulators of eosinophil longevity play critical roles in determining whether eosinophils accumulate in the airways of asthmatics. Several cytokines enhance eosinophil survival, promoting eosinophilic airway inflammation, while glucocorticoids, the most important anti-inflammatory drugs used to treat asthma, promote the intrinsic pathway of eosinophil apoptosis and by this mechanism contribute to the resolution of eosinophilic airway inflammation. Mitochondria seem to play central roles in both the intrinsic, mitochondrion-centered and the extrinsic, receptor-mediated pathways of apoptosis in eosinophils. Mitochondria may also be important for survival signalling. In addition to glucocorticoids, another important agent that regulates human eosinophil longevity via the mitochondrial route is nitric oxide, which is present in increased amounts in the airways of asthmatics. Nitric oxide seems to be able to trigger both survival and apoptosis in eosinophils. This review discusses the current evidence on the mechanisms of induced eosinophil apoptosis and survival, focusing on the role of mitochondria and clinically relevant stimulants such as glucocorticoids and nitric oxide.

  12. From STEM to STEAM: Toward a Human-Centered Education

    Science.gov (United States)

    Boy, Guy A.

    2013-01-01

    The 20th century was based on local linear engineering of complicated systems. We made cars, airplanes and chemical plants, for example. The 21st century has opened a new basis for holistic non-linear design of complex systems, such as the Internet, air traffic management and nanotechnologies. Complexity, interconnectivity, interaction and communication are major attributes of our evolving society. But, more interestingly, we have started to understand that chaos theories may be more important than reductionism, to better understand and thrive on our planet. Systems need to be investigated and tested as wholes, which requires a cross-disciplinary approach and new conceptual principles and tools. Consequently, schools cannot continue to teach isolated disciplines based on simple reductionism. Science, Technology, Engineering, and Mathematics (STEM) should be integrated together with the Arts to promote creativity together with rationalization, and move to STEAM (with an "A" for Arts). This new concept emphasizes the possibility of longer-term socio-technical futures instead of short-term financial predictions that currently lead to uncontrolled economies. Human-centered design (HCD) can contribute to improving STEAM education technologies, systems and practices. HCD not only provides tools and techniques to build useful and usable things, but also an integrated approach to learning by doing, expressing and critiquing, exploring possible futures, and understanding complex systems.

  13. High-fidelity quantum memory using nitrogen-vacancy center ensemble for hybrid quantum computation

    CERN Document Server

    Yang, W L; Hu, Y; Feng, M; Du, J F

    2011-01-01

    We study a hybrid quantum computing system using a nitrogen-vacancy center ensemble (NVE) as quantum memory, a current-biased Josephson junction (CBJJ) superconducting qubit fabricated in a transmission line resonator (TLR) as the quantum computing processor, and the microwave photons in the TLR as the quantum data bus. The storage process is treated rigorously by considering all relevant decoherence mechanisms. Such a hybrid quantum device can also be used to create multi-qubit W states of NVEs through a common CBJJ. The experimental feasibility and challenges are assessed using currently available technology.

  15. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    An important research topic in artificial intelligence is automatic sensing and inferencing of contextual information, which is used to build computer models of the user’s activity. One approach to build such activity-aware systems is the notion of activity-based computing (ABC). ABC is a computing paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity context spanning heterogeneous devices, multiple applications, services, and information sources. In this article, we present ABC as an approach to contextualize information, and present our research into designing activity-centric computing technologies.

  16. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    The aim of this study is to present an approach to the introduction to pipeline and parallel computing, using a model of the multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in it. At the same time, the topic is among the most motivating tasks due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. The framework enables an educational platform for a constructivist learning process, allowing learners to experiment with the provided programming models, to acquire competences in modern scientific research and computational thinking, and to capture the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C was chosen for developing the programming models, with the message passing interface (MPI) and OpenMP as the parallelization tools for the implementation.
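
    The course's models are written in C with MPI and OpenMP, which this record does not reproduce. As a language-agnostic sketch of the pipeline idea itself, where each stage consumes from one queue and feeds the next like the phases of a multiphase queueing system, here is a small analogue using threads and queues:

      # Pipeline sketch: two stages process a stream concurrently; a sentinel
      # (None) shuts the stages down. An analogue, not the course's C/MPI code.
      import threading, queue

      def stage(func, inbox, outbox):
          while True:
              item = inbox.get()
              if item is None:        # sentinel: propagate and stop
                  outbox.put(None)
                  break
              outbox.put(func(item))

      q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
      threading.Thread(target=stage, args=(lambda x: x + 1, q1, q2)).start()
      threading.Thread(target=stage, args=(lambda x: x * x, q2, q3)).start()

      for x in range(5):
          q1.put(x)
      q1.put(None)

      while (result := q3.get()) is not None:
          print(result)               # (x + 1) squared, computed stage by stage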

  17. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  18. Use of Computers in Human Factors Engineering

    Science.gov (United States)

    1974-11-01

    Descriptors: senses (physiology), thermoplastic resins, visual acuity. (U) The research concerns determination of the information presentation requirements of human data... the geometry of the work station, is currently being developed. It is called COMBIMAN, an acronym for Computerized Biomechanical Man-Model.

  19. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing

  1. It is time to talk about people: a human-centered healthcare system.

    Science.gov (United States)

    Searl, Meghan M; Borgi, Lea; Chemali, Zeina

    2010-11-26

    Examining vulnerabilities within our current healthcare system, we propose borrowing two tools from the fields of engineering and design: a) Reason's system approach and b) user-centered design. Both approaches are human-centered in that they consider common patterns of human behavior when analyzing systems to identify problems and generate solutions. This paper examines these two human-centered approaches in the context of healthcare. We argue that maintaining a human-centered orientation in clinical care, research, training, and governance is critical to the evolution of an effective and sustainable healthcare system.

  2. Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  3. The Changing Face of Human-Computer Interaction in the Age of Ubiquitous Computing

    Science.gov (United States)

    Rogers, Yvonne

    HCI is reinventing itself. No longer only about being user-centered, it has set its sights on pastures new, embracing a much broader and far-reaching set of interests. From emotional, eco-friendly, embodied experiences to context, constructivism and culture, HCI research is changing apace: from what it looks at, the lenses it uses and what it has to offer. Part of this is as a reaction to what is happening in the world; ubiquitous technologies are proliferating and transforming how we live our lives. We are becoming more connected and more dependent on technology. The home, the crèche, outdoors, public places and even the human body are now being experimented with as potential places to embed computational devices, even to the extent of invading previously private and taboo aspects of our lives. In this paper, I examine the diversity of lifestyle and technological transformations in our midst and outline some 'difficult' questions these raise together with alternative directions for HCI research and practice.

  4. 08292 Abstracts Collection -- The Study of Visual Aesthetics in Human-Computer Interaction

    OpenAIRE

    Hassenzahl, Marc; Lindgaard, Gitte; Platz, Axel; Tractinsky, Noam

    2008-01-01

    From 13.07. to 16.07.2008, the Dagstuhl Seminar 08292 ``The Study of Visual Aesthetics in Human-Computer Interaction'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first secti...

  5. [Affective computing--a mysterious tool to explore human emotions].

    Science.gov (United States)

    Li, Xin; Li, Honghong; Dou, Yi; Hou, Yongjie; Li, Changwu

    2013-12-01

    Perception, affect, and consciousness are basic psychological functions of the human being, and affect is the subjective reflection of different kinds of objects; together these three basic functions constitute the foundation of human thinking. Affective computing is an effective tool for revealing human affect in order to understand the world. Our research on affective computing focuses on the relations among different affects, their generation, and the factors that influence them. In this paper, the affective mechanism, the basic theory of affective computing, is studied; the method of acquiring and recognizing affective information is discussed; and the applications of affective computing are summarized, in order to attract more researchers into this working area.

  6. Proactive human-computer collaboration for information discovery

    Science.gov (United States)

    DiBona, Phil; Shilliday, Andrew; Barry, Kevin

    2016-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and the computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.

  7. Unmanned Surface Vehicle Human-Computer Interface for Amphibious Operations

    Science.gov (United States)

    2013-08-01

    FIGURES: Figure 1. MOCU Baseline HCI Using Both Aerial Photo and Digital Nautical Chart (DNC) Maps to Control and Monitor Land, Sea, and Air Vehicles. Acronyms (excerpt): ...Action; DNC, Digital Nautical Chart; FNC, Future Naval Capability; HCI, Human-Computer Interface; HRI, Human-Robot Interface; HSI, Human-Systems Integration. 3.2 Baseline MOCU HCI: The Baseline MOCU interface is a tiled

  8. Studying Collective Human Decision Making and Creativity with Evolutionary Computation.

    Science.gov (United States)

    Sayama, Hiroki; Dionne, Shelley D

    2015-01-01

    We report a summary of our interdisciplinary research project "Evolutionary Perspective on Collective Decision Making" that was conducted through close collaboration between computational, organizational, and social scientists at Binghamton University. We redefined collective human decision making and creativity as evolution of ecologies of ideas, where populations of ideas evolve via continual applications of evolutionary operators such as reproduction, recombination, mutation, selection, and migration of ideas, each conducted by participating humans. Based on this evolutionary perspective, we generated hypotheses about collective human decision making, using agent-based computer simulations. The hypotheses were then tested through several experiments with real human subjects. Throughout this project, we utilized evolutionary computation (EC) in non-traditional ways-(1) as a theoretical framework for reinterpreting the dynamics of idea generation and selection, (2) as a computational simulation model of collective human decision-making processes, and (3) as a research tool for collecting high-resolution experimental data on actual collaborative design and decision making from human subjects. We believe our work demonstrates untapped potential of EC for interdisciplinary research involving human and social dynamics.
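
    As an assumption-level sketch of the "ecology of ideas" framing, not the project's actual simulation, the snippet below evolves a population of ideas, encoded here as single numeric design parameters, under selection, recombination, and mutation, with each operator standing in for a human action:

      # Toy "ecology of ideas"; the fitness function and encoding are invented.
      import random

      def fitness(idea):
          return -abs(idea - 42.0)     # the best idea is 42 in this toy setup

      def evolve(pop, generations=50):
          for _ in range(generations):
              pop.sort(key=fitness, reverse=True)
              parents = pop[: len(pop) // 2]   # selection: keep the better half
              children = []
              while len(children) < len(pop) - len(parents):
                  a, b = random.sample(parents, 2)
                  child = (a + b) / 2          # recombination: blend two ideas
                  child += random.gauss(0, 1)  # mutation: small random revision
                  children.append(child)
              pop = parents + children
          return max(pop, key=fitness)

      random.seed(1)
      print(evolve([random.uniform(0, 100) for _ in range(20)]))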

  9. Virtual test: A student-centered software to measure student's critical thinking on human disease

    Science.gov (United States)

    Rusyati, Lilit; Firman, Harry

    2016-02-01

    The study "Virtual Test: A Student-Centered Software to Measure Student's Critical Thinking on Human Disease" is descriptive research. The background is importance of computer-based test that use element and sub element of critical thinking. Aim of this study is development of multiple choices to measure critical thinking that made by student-centered software. Instruments to collect data are (1) construct validity sheet by expert judge (lecturer and medical doctor) and professional judge (science teacher); and (2) test legibility sheet by science teacher and junior high school student. Participants consisted of science teacher, lecturer, and medical doctor as validator; and the students as respondent. Result of this study are describe about characteristic of virtual test that use to measure student's critical thinking on human disease, analyze result of legibility test by students and science teachers, analyze result of expert judgment by science teachers and medical doctor, and analyze result of trial test of virtual test at junior high school. Generally, result analysis shown characteristic of multiple choices to measure critical thinking was made by eight elements and 26 sub elements that developed by Inch et al.; complete by relevant information; and have validity and reliability more than "enough". Furthermore, specific characteristic of multiple choices to measure critical thinking are information in form science comic, table, figure, article, and video; correct structure of language; add source of citation; and question can guide student to critical thinking logically.

  11. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL.

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2011-05-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven's Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors.
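
    A minimal sketch of the regression discontinuity logic, on synthetic data with an invented cutoff and effect size (the paper's actual voucher eligibility rule is not reproduced): outcomes are fit separately on each side of the cutoff, and the jump between the two fits at the cutoff estimates the program's effect.

      # Regression discontinuity sketch on synthetic data; cutoff, bandwidth,
      # and the built-in effect of 2.0 are invented for illustration.
      import random

      random.seed(0)
      CUTOFF, BANDWIDTH = 50.0, 5.0

      def outcome(score):
          treated = score < CUTOFF             # below the cutoff -> voucher
          return 0.1 * score + (2.0 if treated else 0.0) + random.gauss(0, 0.5)

      data = [(s, outcome(s)) for s in (random.uniform(30, 70) for _ in range(5000))]

      def fit_line(pts):
          # ordinary least squares for y = a + b * s
          n = len(pts)
          sx = sum(s for s, _ in pts); sy = sum(y for _, y in pts)
          sxx = sum(s * s for s, _ in pts); sxy = sum(s * y for s, y in pts)
          b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
          return (sy - b * sx) / n, b

      left = [(s, y) for s, y in data if CUTOFF - BANDWIDTH <= s < CUTOFF]
      right = [(s, y) for s, y in data if CUTOFF <= s <= CUTOFF + BANDWIDTH]
      (a_l, b_l), (a_r, b_r) = fit_line(left), fit_line(right)
      print(f"estimated effect: {(a_l + b_l * CUTOFF) - (a_r + b_r * CUTOFF):.2f}")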

  12. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  13. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers add value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses (21% response rate). Residents/fellows were more likely than medical students to use tablet computers several times daily (32% vs. 20%, p = 0.035). The most common reported uses were accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references and e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point-of-care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on
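
    The comparisons above are differences in proportions between two groups. As an illustrative sketch only, a two-proportion z-test over counts back-calculated from the reported percentages looks like this; the paper's actual statistical test and exact counts may differ.

      # Two-proportion z-test sketch; counts are approximations reconstructed
      # from the reported percentages, so the p-value is illustrative only.
      from math import sqrt, erf

      def two_proportion_z(x1, n1, x2, n2):
          p1, p2 = x1 / n1, x2 / n2
          pooled = (x1 + x2) / (n1 + n2)
          se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
          z = (p1 - p2) / se
          p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
          return z, p_two_sided

      # ~32% of 66 residents vs ~20% of 76 students report daily tablet use
      print(two_proportion_z(21, 66, 15, 76))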

  14. Speech Dialogue with Facial Displays Multimodal Human-Computer Conversation

    CERN Document Server

    Nagao, K; Nagao, Katashi; Takeuchi, Akikazu

    1994-01-01

    Human face-to-face conversation is an ideal model for human-computer dialogue. One of the major features of face-to-face communication is its multiplicity of communication channels that act on multiple modalities. To realize a natural multimodal dialogue, it is necessary to study how humans perceive information and determine the information to which humans are sensitive. A face is an independent communication channel that conveys emotional and conversational signals, encoded as facial expressions. We have developed an experimental system that integrates speech dialogue and facial animation, to investigate the effect of introducing communicative facial expressions as a new modality in human-computer conversation. Our experiments have shown that facial expressions are helpful, especially upon first contact with the system. We have also discovered that featuring facial expressions at an early stage improves subsequent interaction.

  15. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of excellence in genetics and molecular biology research, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of the online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability can be obtained by contacting the UK HGMP-RC directly.

  16. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study is aimed at describing the experience of an oncology center with computed tomography-guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsies of pulmonary lesions performed between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding specimen adequacy and specific diagnoses was collected and analyzed. Results: Among the 97 lung biopsies, 94 (96.9%) supplied specimens adequate for histological analysis, with 71 (73.2%) cases diagnosed as malignant lesions and 23 (23.7%) as benign lesions. Specimens were inadequate for analysis in three cases. A specific diagnosis was obtained in 83 (85.6%) cases, with high rates for both malignant lesions (63 cases; 88.7%) and benign lesions (20 cases; 86.7%). As regards complications, a total of 12 cases were observed, as follows: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax and 2 (2.1%) cases of hemoptysis. Conclusion: Computed tomography-guided core-needle biopsy of lung lesions demonstrated high rates of specimen adequacy and diagnostic specificity, and low complication rates, in the present study. (author)

  17. Removal of ring artifacts in computed tomographic imaging using iterative center weighted median filter.

    Science.gov (United States)

    Sadi, Fazle; Lee, Soo Yeol; Hasan, Md Kamrul

    2010-01-01

    A new iterative center-weighted median filter (ICWMF) for ring artifact reduction in micro-computed tomographic (micro-CT) images is proposed in this paper. The center weight of the median filter is computed from the characteristics of the ring artifact in the mean curve of the projection data. The filter operates on the deviation of the mean curve, iteratively smoothing the ring-generating peaks and troughs while preserving details due to the image itself. A convergence criterion for the iterative algorithm is determined from the distribution of the local deviation computed from the mean-curve deviation. The estimate of the mean curve obtained using the ICWMF is used to correct the ring-corrupted projection data, from which reconstruction gives a ring-artifact-suppressed micro-CT image. Test results on both synthetic and real images demonstrate that ring artifacts are suppressed more effectively by our method than by other ring removal techniques reported in the literature.
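
    The following is a minimal sketch of the general idea, under the assumption that rings appear as narrow peaks and troughs in the per-detector mean of the sinogram; it is not the authors' exact algorithm (their center weight and convergence criterion are adaptive rather than fixed):

        import numpy as np

        def cw_median(x, window=5, center_weight=3):
            # Center-weighted median: the center sample is replicated
            # center_weight times before the median is taken.
            half = window // 2
            padded = np.pad(x, half, mode="edge")
            out = np.empty_like(x, dtype=float)
            for i in range(len(x)):
                w = list(padded[i:i + window]) + [x[i]] * (center_weight - 1)
                out[i] = np.median(w)
            return out

        def remove_rings(sinogram, iterations=10):
            mean_curve = sinogram.mean(axis=0)       # per-detector mean
            smooth = mean_curve.copy()
            for _ in range(iterations):              # iterative CWM smoothing
                smooth = cw_median(smooth)
            ring_profile = mean_curve - smooth       # ring-generating peaks/troughs
            return sinogram - ring_profile[None, :]  # correct every projection row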

  18. Linguistics in the digital humanities: (computational) corpus linguistics

    Directory of Open Access Journals (Sweden)

    Kim Ebensgaard Jensen

    2014-12-01

    Full Text Available Corpus linguistics has been closely intertwined with digital technology since the introduction of university computer mainframes in the 1960s. Making use of both digitized data in the form of the language corpus and computational methods of analysis involving concordancers and statistics software, corpus linguistics arguably has a place in the digital humanities. Still, it remains obscure and figures only sporadically in the literature on the digital humanities. This article provides an overview of the main principles of corpus linguistics and the role of computer technology in relation to data and method and also offers a bird's-eye view of the history of corpus linguistics with a focus on its intimate relationship with digital technology and how digital technology has impacted the very core of corpus linguistics and shaped the identity of the corpus linguist. Ultimately, the article is oriented towards an acknowledgment of corpus linguistics' alignment with the digital humanities.

  19. Human Computer Interaction Approach in Developing Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Mohd H.N.M. Nasir

    2008-01-01

    Full Text Available Problem statement: Many published studies have found that more than 50% of Customer Relationship Management (CRM) system implementations fail because of poor system usability and unfulfilled user expectations. This study presents the issues that contributed to these failures and proposes a prototype CRM system, developed using Human Computer Interaction approaches, to resolve the identified issues. Approach: To capture the users' requirements, a single in-depth case study of a multinational company was chosen, in which the background, current conditions and environmental interactions were observed, recorded and analyzed for patterns in relation to internal and external influences. Blended data-gathering techniques (interviews, naturalistic observation and study of user documentation) were employed, and a prototype CRM system was then developed incorporating a User-Centered Design (UCD) approach, Hierarchical Task Analysis (HTA), metaphor, and identification of users' behaviors and characteristics. The implementation of these techniques was then measured in terms of usability. Results: Based on the usability testing conducted, most users agreed that the system is comfortable to work with, taking the quality attributes of learnability, memorizability, utility, sortability, font, visualization, user metaphor, ease of information viewing and color as measurement parameters. Conclusions/Recommendations: By combining all these techniques, a comfort level that leads to user satisfaction and a higher degree of usability can be achieved in the proposed CRM system. Companies should therefore take usability quality attributes into consideration before developing or procuring a CRM system, to ensure its successful implementation.

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. Optimization of Italian CMS Computing Centers via MIUR funded Research Projects

    CERN Document Server

    Boccali, Tommaso

    2014-01-01

    The Italian Ministry of Research (MIUR) has funded research projects in recent years aimed at optimizing the analysis activities in the Italian CMS computing centers. A new grant started in 2013, and activities are already ongoing in 9 INFN sites, all hosting local CMS groups. The main focus is the creation of an Italian storage federation (via Xrootd initially, and later HTTP), which gives all Italian CMS physicists privileged access to CMS data and simulations. Another task focuses on optimizing the last step of a CMS analysis, via interactive access to resources; this will result in a number of small- to medium-sized analysis centers, where access to multicore machines, PROOF facilities and high-throughput local queues will be granted at the national level. An important part of this last activity involves experimenting with on-demand instantiation of analysis machines via clouds, using the experience and the resources INFN is building on the subject.

  2. Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces

    OpenAIRE

    Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay

    2012-01-01

    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this is to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human nego...

  3. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  4. Human computer interaction issues in Clinical Trials Management Systems.

    Science.gov (United States)

    Starren, Justin B; Payne, Philip R O; Kaufman, David R

    2006-01-01

    Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, Human Computer Interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential human-computer interaction problems, in particular, increased workflow complexity associated with a web application delivery model and potential usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system.

  5. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  6. A computational model of the human hand 93-ERI-053

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  7. Interactive Evolutionary Computation for Analyzing Human Awareness Mechanisms

    Directory of Open Access Journals (Sweden)

    Hideyuki Takagi

    2012-01-01

    Full Text Available We discuss the importance of establishing awareness science and present the idea of using interactive evolutionary computation (IEC) as a tool for analyzing awareness mechanisms and building awareness models. First, we describe the importance of human factors in computational intelligence and note that IEC is one of the approaches to so-called humanized computational intelligence. Second, we show examples in which IEC is used as an analysis tool for human science. Since analyzing the human awareness mechanism is this kind of analysis of human characteristics and capabilities, IEC may be usable for this purpose. Based on this expectation, we present one idea for analyzing the awareness mechanism: make an equivalent model of an IEC user using a learning model, and find latent variables that connect the inputs and outputs of the user model and help to understand or explain the input-output relationship. Although there are several possible definitions of awareness, this idea is based on the definition that awareness is finding unknown variables that help our understanding. If we establish a method for finding the latent variables automatically, we can realize an awareness model in a computer.

  8. Human-centered design of human-computer-human dialogs in aerospace systems

    Science.gov (United States)

    Mitchell, Christine M.

    1994-01-01

    The second six months of this grant saw further development of GT-CATS, the Georgia Tech Crew Activity Tracking System, and progress on research exploring tutoring concepts for tutors for mode management. The latter included data analysis and a preliminary paper summarizing the development and evaluation of the VNAV Tutor. A follow-on to the VNAV Tutor is planned. Research in this direction will examine the use of OFMspert and GT-CATS to create an 'intelligent' tutor for mode management, a more extensive domain of application than only vertical navigation, and alternative pedagogy, such as substituting focused 'cases' of reported mode management situations rather than lessons defined by full LOFT scenarios.

  9. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    Energy Technology Data Exchange (ETDEWEB)

    De Conti, C. [Campus Experimental de Itapeva, UNESP, 18409-010 Itapeva, SP (Brazil); Barbero, C. [Facultad de Ciencias Exactas, UNLP, 1900 La Plata, Argentina and Instituto de Física La Plata, CONICET, 1900 La Plata (Argentina); Galeão, A. P. [Instituto de Física Teórica, UNESP, 01140-070 São Paulo, SP (Brazil); Krmpotić, F. [Instituto de Física La Plata, CONICET, 1900 La Plata (Argentina); Facultad de Ciencias Astronómicas y Geofísicas, UNLP, 1900 La Plata (Argentina); Instituto de Física Teórica, UNESP, 01140-070 São Paulo, SP (Brazil)

    2014-11-11

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of the Λ hypernuclei ⁵ΛHe, ¹²ΛC and ¹³ΛC using a formalism based on the independent-particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  10. Mass Storage System Upgrades at the NASA Center for Computational Sciences

    Science.gov (United States)

    Tarshish, Adina; Salmon, Ellen; Macie, Medora; Saletta, Marty

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) provides supercomputing and mass storage services to over 1200 Earth and space scientists. During the past two years, the mass storage system at the NCCS went through a great deal of changes both major and minor. Tape drives, silo control software, and the mass storage software itself were upgraded, and the mass storage platform was upgraded twice. Some of these upgrades were aimed at achieving year-2000 compliance, while others were simply upgrades to newer and better technologies. In this paper we will describe these upgrades.

  11. Unifying Human Centered Design and Systems Engineering for Human Systems Integration

    Science.gov (United States)

    Boy, Guy A.; McGovern Narkevicius, Jennifer

    2013-01-01

    Despite the holistic approach of systems engineering (SE), systems still fail, sometimes spectacularly. Requirements, solutions and the world constantly evolve and are very difficult to keep current. SE requires more flexibility, and new approaches to SE have to be developed that include creativity as an integral part and allocate the functions of people and technology appropriately within our highly interconnected, complex organizations. Instead of disregarding complexity because it is too difficult to handle, we should take advantage of it, discovering the behavioral attractors and emerging properties that it generates. Human-centered design (HCD) provides the creativity factor that SE lacks. It promotes modeling and simulation from the early stages of design and throughout the life cycle of a product. Unifying HCD and SE will shape appropriate human-systems integration (HSI) and produce successful systems.

  12. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  13. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and their CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  14. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids.

  15. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    Science.gov (United States)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  16. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  17. Formal modelling techniques in human-computer interaction

    NARCIS (Netherlands)

    Haan, de G.; Veer, van der G.C.; Vliet, van J.C.

    1991-01-01

    This paper is a theoretical contribution, elaborating the concept of models as used in Cognitive Ergonomics. A number of formal modelling techniques in human-computer interaction will be reviewed and discussed. The analysis focusses on different related concepts of formal modelling techniques in human-computer interaction.

  18. Computed tomography of the human developing anterior skull base

    NARCIS (Netherlands)

    J. van Loosen (J.); A.I.J. Klooswijk (A. I J); D. van Velzen (D.); C.D.A. Verwoerd (Carel)

    1990-01-01

    Abstract: The ossification of the anterior skull base, especially the lamina cribrosa, has been studied by computed tomography and histopathology. Sixteen human fetuses (referred to our laboratory for pathological examination after spontaneous abortion between 18 and 32 weeks of gestation)

  19. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    The CHI Papers and Notes program is continuing to grow along with many of our sister conferences. We are pleased that CHI is still the leading venue for research in human-computer interaction. CHI 2013 continued the use of subcommittees to manage the review process. Authors selected the subcommit...

  1. Studying Collective Human Decision Making and Creativity with Evolutionary Computation

    OpenAIRE

    Sayama, Hiroki; Dionne, Shelley D.

    2014-01-01

    We report a summary of our interdisciplinary research project "Evolutionary Perspective on Collective Decision Making" that was conducted through close collaboration between computational, organizational and social scientists at Binghamton University. We redefined collective human decision making and creativity as evolution of ecologies of ideas, where populations of ideas evolve via continual applications of evolutionary operators such as reproduction, recombination, mutation, selection, and...

  2. A Decentralized Virtual Machine Migration Approach of Data Centers for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaoying Wang

    2013-01-01

    Full Text Available As cloud computing offers services to many users worldwide, pervasive applications from customers are hosted by large-scale data centers. On such platforms, virtualization technology is employed to multiplex the underlying physical resources. Since the incoming loads of different applications vary significantly, it is critical to manage the placement and resource allocation of the virtual machines (VMs) in order to guarantee the quality of services. In this paper, we propose a decentralized virtual machine migration approach inside data centers for cloud computing environments. The system models and power models are defined and described first. Then, we present the key steps of the decentralized mechanism, including the establishment of load vectors, load information collection, VM selection, and destination determination. A two-threshold decentralized migration algorithm is implemented to further save energy consumption while maintaining the quality of services. Examining the effect of our approach through performance evaluation experiments, we analyze and discuss the thresholds and other factors. The results illustrate that the proposed approach can efficiently balance the loads across different physical nodes and also leads to less power consumption of the entire system as a whole.
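
    A minimal sketch of a two-threshold migration policy in the spirit described above; the function, thresholds and data layout are illustrative assumptions, not the paper's actual load vectors or protocol:

        def plan_migrations(hosts, low=0.3, high=0.8):
            # hosts maps a host name to a list of VM loads (fractions of
            # capacity). Overloaded hosts (> high) shed VMs; underloaded
            # hosts (< low) are drained so they can be powered down.
            moves = []
            def target_for(src, vm):
                # least-loaded other host that stays under the high threshold
                ok = [h for h in hosts if h != src and sum(hosts[h]) + vm <= high]
                return min(ok, key=lambda h: sum(hosts[h]), default=None)
            for src in list(hosts):
                load = sum(hosts[src])
                if low <= load <= high or load == 0:
                    continue                      # already balanced or powered off
                for vm in sorted(hosts[src], reverse=True):
                    if low <= sum(hosts[src]) <= high:
                        break                     # back inside the load band
                    dst = target_for(src, vm)
                    if dst is None:
                        break                     # nowhere acceptable to place it
                    hosts[src].remove(vm)
                    hosts[dst].append(vm)
                    moves.append((vm, src, dst))
            return moves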

  3. Energy Efficient Security Preserving VM Live Migration In Data Centers For Cloud Computing

    Directory of Open Access Journals (Sweden)

    Korir Sammy

    2012-03-01

    Full Text Available Virtualization is an innovation that has been widely utilized in modern data centers for cloud computing to realize energy-efficient operation of servers. Virtual machine (VM) migration brings multiple benefits, such as resource distribution and energy-aware consolidation. Server consolidation achieves energy efficiency by enabling multiple instances of operating systems to run simultaneously on a single machine, and with virtualization it is possible to consolidate servers through VM live migration. However, migration of virtual machines brings extra energy consumption and serious security concerns that derail full adoption of this technology. In this paper, we propose secure, energy-aware provisioning of cloud computing resources on consolidated and virtualized platforms. Energy efficiency is achieved through a just-right dynamic Round-Robin provisioning mechanism and the ability to power down subsystems of a host system that are not required by the VMs mapped to it. We further propose solutions to the security challenges faced during VM live migration. We validate our approach with a rigorous performance evaluation study using the CloudSim toolkit. The experimental results show that our approach reduces energy consumption in data centers without compromising security.

  4. Oversized or Undersized? Defining the Right-sized Computer Center for Electronic Funds Transfer Processing

    Directory of Open Access Journals (Sweden)

    ANDRADE, A.

    2013-06-01

    Full Text Available Electronic Funds Transfer represents an upward trend that fosters proximity between consumers and suppliers. Each transaction is sent to a Computer Center, in charge of decoding, processing and returning the results as fast as possible. In particular, the present article covers the day-to-day operation of the GetNet Company, focusing on one of its subsystems. In the article, we model the incoming transaction volume and the corresponding processing to answer the following questions: (i) How idle is the company's current transaction-system configuration, and what utilization rates are involved? (ii) Given annual growth of 20% in transaction volume, which modifications should be made to the current Computer Center to meet transaction demand until 2020? The tests were based on transaction execution logs from one day, corresponding to the highest volume of 2011. As expected, the results show that the 10 machines composing the GetNet system are overestimated for the current situation, whose operational load could be supported with only 4 machines. In addition, under the predicted growth, the current configuration could be sustained until the middle of 2017 without loss of transactions.
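
    A back-of-the-envelope check of that horizon, assuming the reported figures (4 of the 10 machines suffice at the 2011 peak, 20% compound annual growth); the article's mid-2017 estimate additionally depends on the exact baseline utilization:

        import math

        # capacity is exhausted when 4 * 1.2**t >= 10, i.e. t = log(10/4) / log(1.2)
        t = math.log(10 / 4) / math.log(1.2)
        print(f"about {t:.1f} years after the 2011 peak (~{2011 + t:.0f})")  # ~5 years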

  5. Homo ludens in the loop playful human computation systems

    CERN Document Server

    Krause, Markus

    2014-01-01

    The human mind is incredible. It solves problems with ease that will elude machines even for the next decades. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how machines and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

  6. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the unknown fears of embracing cloud computing, which stretch across dimensions such as leaders' fear of change and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  7. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model and the desire to avoid vendor lock-in. We performed a technology-tracking exercise and, among many possible solutions, chose to evaluate Grid Engine as an alternative, because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: many production software components (accounting and monitoring above all) rely on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too hard to withstand. We will provide guidelines to help understand how difficult this operation can be and how long the change may take.

  8. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned to four groups (n = 10), according to the biomechanical preparation system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey's tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
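
    For reference, canal transportation and centering ability are conventionally computed from dentin thickness measured mesially (a) and distally (b), before (subscript 1) and after (subscript 2) instrumentation; the sketch below gives the commonly used Gambill-style definitions, which this study may have adapted:

        % Gambill-style definitions; the study's exact variant may differ.
        % a1, b1: mesial/distal dentin thickness before instrumentation;
        % a2, b2: the same measurements after instrumentation.
        \[ \text{transportation} = (a_1 - a_2) - (b_1 - b_2) \]
        \[ \text{centering ratio} = \frac{a_1 - a_2}{b_1 - b_2}
           \quad\text{or}\quad \frac{b_1 - b_2}{a_1 - a_2} \]
        % taking whichever ratio does not exceed 1, so that 1 means perfect centering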

  9. 75 FR 2545 - National Toxicology Program (NTP); Center for the Evaluation of Risks to Human Reproduction...

    Science.gov (United States)

    2010-01-15

    ... HUMAN SERVICES National Institutes of Health National Toxicology Program (NTP); Center for the Evaluation of Risks to Human Reproduction (CERHR); Availability of the Final Expert Panel Report on Soy... whether exposure to soy infant formula is a hazard to human development. The expert panel also...

  10. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  11. Human-Centered Content-Based Image Retrieval

    NARCIS (Netherlands)

    van den Broek, Egon

    2005-01-01

    Retrieval of images that lack a (suitable) annotations cannot be achieved through (traditional) Information Retrieval (IR) techniques. Access through such collections can be achieved through the application of computer vision techniques on the IR problem, which is baptized Content-Based Image

  12. Human-Centered Content-Based Image Retrieval

    NARCIS (Netherlands)

    Broek, van den Egon L.

    2005-01-01

    Retrieval of images that lack a (suitable) annotations cannot be achieved through (traditional) Information Retrieval (IR) techniques. Access through such collections can be achieved through the application of computer vision techniques on the IR problem, which is baptized Content-Based Image Retrie

  13. Teaching Human-Centered Security Using Nontraditional Techniques

    Science.gov (United States)

    Renaud, Karen; Cutts, Quintin

    2013-01-01

    Computing science students amass years of programming experience and a wealth of factual knowledge in their undergraduate courses. Based on our combined years of experience, however, one of our students' abiding shortcomings is that they think there is only "one correct answer" to issues in most courses: an "idealistic"…

  14. A Theory of Human Needs Should Be Human-Centered, Not Animal-Centered: Commentary on Kenrick et al. (2010).

    Science.gov (United States)

    Kesebir, Selin; Graham, Jesse; Oishi, Shigehiro

    2010-05-01

    Kenrick et al. (2010, this issue) make an important contribution by presenting a theory of human needs within an evolutionary framework. In our opinion, however, this framework bypasses the human uniqueness that Maslow intended to capture in his theory. We comment on the unique power of culture in shaping human motivation at the phylogenetic, ontogenetic, and proximate levels. We note that culture-gene coevolution may be a more promising lead to a theory of human motivation than a mammal-centric evolutionary perspective.

  15. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance for FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

  16. Neuromolecular computing: a new approach to human brain evolution.

    Science.gov (United States)

    Wallace, R; Price, H

    1999-09-01

    Evolutionary approaches in human cognitive neurobiology traditionally emphasize macroscopic structures. It may soon be possible to supplement these studies with models of human information-processing at the molecular level. Thin-film, simulation, fluorescence microscopy, and high-resolution X-ray crystallographic studies provide evidence for transiently organized neural membrane molecular systems with possible computational properties. This review article examines evidence for hydrophobic-mismatch molecular interactions within phospholipid microdomains of a neural membrane bilayer. It is proposed that these interactions are a massively parallel algorithm which can rapidly compute near-optimal solutions to complex cognitive and physiological problems. Coupling of microdomain activity to permeant ion movements at ligand-gated and voltage-gated channels permits the conversion of molecular computations into neuron frequency codes. Evidence for microdomain transport of proteins to specific locations within the bilayer suggests that neuromolecular computation may be under some genetic control and thus modifiable by natural selection. A possible experimental approach for examining evolutionary changes in neuromolecular computation is briefly discussed.

  17. A Novel Approach for Submission of Tasks to a Data Center in a Virtualized Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    B. Santhosh Kumar

    2016-08-01

    Full Text Available The submission of tasks to a data center plays a crucial role in achieving services such as scheduling and processing in a cloud computing environment. The energy consumption of a data center must be considered in task processing, as it results in high operational expenditure and a negative environmental impact. Unfortunately, none of the current research works focus on the energy factor while submitting tasks to a cloud. In this paper a framework is proposed to select the data center with minimum energy consumption. The service provider registers all data centers in a registry. The energy consumed by task processing using virtualization, together with the energy of IT equipment such as routers and switches, is calculated. The data center selection framework then selects the data center with minimum energy consumption for task processing. The experimental results indicate that the proposed approach consumes less energy than the existing algorithms for selection of data centers.
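
    A minimal sketch of the selection step, assuming each registered data center can report an energy estimate composed of a per-instruction VM processing cost plus a fixed network-equipment overhead (all names and numbers below are hypothetical):

        from dataclasses import dataclass

        @dataclass
        class DataCenter:
            name: str
            joules_per_mi: float   # VM processing energy per million instructions
            network_joules: float  # routers/switches overhead per task

            def estimate_energy(self, task_mi: float) -> float:
                return task_mi * self.joules_per_mi + self.network_joules

        def select_data_center(registry, task_mi):
            # pick the registered data center with the lowest estimated energy
            return min(registry, key=lambda dc: dc.estimate_energy(task_mi))

        registry = [DataCenter("dc-east", 0.9, 50.0), DataCenter("dc-west", 1.1, 20.0)]
        print(select_data_center(registry, 100).name)  # dc-west (130 J vs. 140 J)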

  18. Hand Gesture and Neural Network Based Human Computer Interface

    Directory of Open Access Journals (Sweden)

    Aekta Patel

    2014-06-01

    Full Text Available Computers are used by everyone, whether at work or at home. Our aim is to make computers understand human language and to develop user-friendly human-computer interfaces (HCI). Human gestures are perceived by vision. This research is about determining human gestures to create an HCI. Coding these gestures into machine language demands a complex programming algorithm. In this project, we first detect, recognize and pre-process the hand gestures using a general method of recognition. We then extract the recognized image's properties and use them for mouse movement, clicking and VLC Media Player control. Finally, we implement the same functions using a neural-network technique and compare it with the general recognition method. From this we conclude that the neural-network technique is better than the general method of recognition. We show results based on the neural-network technique and a comparison between the neural-network and general methods.
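
    A minimal sketch of the final recognition stage, assuming a binary hand mask and a few toy shape features; scikit-learn's MLPClassifier stands in for whichever network topology was actually trained:

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def features(mask):
            # toy shape features from a binary hand mask: fill ratio,
            # bounding-box aspect ratio, and normalized centroid position
            ys, xs = np.nonzero(mask)
            h, w = mask.shape
            return [len(xs) / (h * w),
                    (xs.max() - xs.min() + 1) / (ys.max() - ys.min() + 1),
                    xs.mean() / w,
                    ys.mean() / h]

        # X: feature vectors of labeled training gestures; y: class labels
        # such as "A"-"Z", "0"-"9", or mouse/keyboard control commands.
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
        # clf.fit(X, y)
        # clf.predict([features(new_mask)])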

  19. Quality Improvement Project to Improve Patient Satisfaction With Pain Management: Using Human-Centered Design.

    Science.gov (United States)

    Trail-Mahan, Tracy; Heisler, Scott; Katica, Mary

    2016-01-01

    In this quality improvement project, our health system developed a comprehensive, patient-centered approach to improving inpatient pain management and assessed its impact on patient satisfaction across 21 medical centers. Using human-centered design principles, a bundle of 6 individual and team nursing practices was developed. Patient satisfaction with pain management, as measured by the Hospital Consumer Assessment of Healthcare Providers and Systems pain composite score, increased from the 25th to just under the 75th national percentile.

  20. Humanoid robotics and human-centered initiatives at IRI

    OpenAIRE

    Alenyà, Guillem; Hernàndez, Sergi; Andrade-Cetto, J.; Sanfeliu, Alberto; Torras, Carme

    2009-01-01

    This work was supported by projects: 'Perception, action & cognition through learning of object-action complexes.' (4915), 'Ubiquitous networking robotics in urban settings' (E-00938), 'CONSOLIDER-INGENIO 2010 Multimodal interaction in pattern recognition and computer vision' (V-00069), 'Robotica ubicua para entornos urbanos' (J-01225), 'Grup de recerca consolidat - VIS' (2005SGR-00937), 'Percepción y acción ante incertidumbre' (4803), 'Grup de recerca consolidat - ROBÒTICA' (8007), 'The huma...

  1. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment

    Science.gov (United States)

    Harrigan, Robert L.; Yvernault, Benjamin C.; Boyd, Brian D.; Damon, Stephen M.; Gibney, Kyla David; Conrad, Benjamin N.; Phillips, Nicholas S.; Rogers, Baxter P.; Gao, Yurui; Landman, Bennett A.

    2015-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. PMID:25988229
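
    Since DAX ultimately fans work out as batch jobs, the sketch below shows a generic PBS submission of the kind involved; this is not the DAX API, and the script contents and session ID are hypothetical:

        import subprocess
        import textwrap

        script = textwrap.dedent("""\
            #!/bin/bash
            #PBS -N xnat_pipeline
            #PBS -l nodes=1:ppn=2,walltime=02:00:00
            python run_pipeline.py --session "$SESSION_ID"
            """)
        # qsub reads the job script from stdin; -v passes the variable through
        subprocess.run(["qsub", "-v", "SESSION_ID=SESS001"],
                       input=script, text=True, check=True)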

  2. Human-Centered Object-Based Image Retrieval

    NARCIS (Netherlands)

    Broek, E.L. van den; Rikxoort, E.M. van; Schouten, T.E.

    2005-01-01

    A new object-based image retrieval (OBIR) scheme is introduced. The images are analyzed using the recently developed, human-based 11 colors quantization scheme and the color correlogram. Their output served as input for the image segmentation algorithm: agglomerative merging, which is extended to co

  3. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  4. Human -Computer Interface using Gestures based on Neural Network

    Directory of Open Access Journals (Sweden)

    Aarti Malik

    2014-10-01

    Full Text Available - Gestures are powerful tools for non-verbal communication. Human-computer interface (HCI) is a growing field which reduces the complexity of interaction between human and machine, in which gestures are used for conveying information or controlling the machine. In the present paper, static hand gestures are utilized for this purpose. The paper presents a novel technique for recognizing hand gestures, i.e. A-Z alphabets, 0-9 numbers and 6 additional control signals (for keyboard and mouse control), by extracting various features of the hand, creating a feature-vector table and training a neural network. The proposed work has a recognition rate of 99%.

  5. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in some cities of the European Mediterranean. The work analyzes the main problems deriving from training treated as a business, which can derail the continuous growth of HCI, the new technologies and the tourism industry. Through a semiotic and epistemological study, we detect the current mistakes in the context of the interrelations of the formal and factual sciences, as well as the human factors that influence the professionals devoted to the development of interactive systems for safeguarding and boosting cultural heritage.

  6. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.
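
    A minimal preview-tracking loop illustrating the task setting HOPE emulates; this plain proportional-plus-preview controller is an assumption for illustration, not HOPE's psychological model:

        def track(course, preview=5, k_err=0.4, k_prev=0.2):
            # yield stick positions chasing a 1-D track with limited preview
            pos = 0.0
            for i, target in enumerate(course):
                ahead = course[min(i + preview, len(course) - 1)]
                pos += k_err * (target - pos) + k_prev * (ahead - target)
                yield pos

        course = [0, 0, 1, 2, 3, 3, 2, 1, 0, 0]
        print([round(p, 2) for p in track(course)])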

  7. Visual Interpretation Of Hand Gestures For Human Computer Interaction

    Directory of Open Access Journals (Sweden)

    M.S.Sahane

    2014-01-01

    Full Text Available The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). In particular, visual interpretation of hand gestures can help achieve the ease and naturalness desired for HCI. This discussion is organized on the basis of the methods used for modeling, analyzing, and recognizing gestures. We propose pointing-gesture-based large-display interaction using a depth camera: a user interacts with applications on a large display by using pointing gestures with the bare hand, and the calibration between the large display and the depth camera can be performed automatically using an RGB-D camera. We also discuss implemented gestural systems as well as other potential applications of vision-based gesture recognition, and we consider directions of future research in gesture recognition, including its integration with other natural modes of human-computer interaction.
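
    The geometric core of such pointing interaction can be sketched as a ray-plane intersection, assuming the depth camera yields 3-D hand and fingertip positions and the display plane is known from calibration (all names below are illustrative):

        import numpy as np

        def point_on_display(hand, fingertip, plane_point, plane_normal):
            # intersect the hand-to-fingertip ray with the display plane;
            # returns None when the user points away from (or along) the display
            direction = fingertip - hand
            denom = float(np.dot(plane_normal, direction))
            if abs(denom) < 1e-9:
                return None
            t = float(np.dot(plane_normal, plane_point - fingertip)) / denom
            return fingertip + t * direction if t > 0 else None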

  8. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  9. The Human-Computer Domain Relation in UX Models

    DEFF Research Database (Denmark)

    Clemmensen, Torkil

    This paper argues that the conceptualizations of the human, the computer and the domain of use in competing lines of UX research have problematic similarities and superficial differences. The paper qualitatively analyses concepts and models in five research papers that together represent two influential lines of UX research: aesthetics and temporal UX, and two use situations: using a website and starting to use a smartphone. The results suggest that the two lines of UX research share a focus on users' evaluative judgments of technology, both focus on product qualities rather than activity domains, give little detail about users, and treat human-computer interaction as perception. The conclusion gives similarities and differences between the approaches to UX. The implications for theory building are indicated.

  10. A computer simulation study of oxygen defect centers in BaFBr and BaFCl

    Energy Technology Data Exchange (ETDEWEB)

    Islam, M.S.; Baetzold, R.C. (Eastman Kodak Company, Rochester, NY (United States). Corporate Research Labs.)

    1992-01-01

    Atomistic simulation techniques are used to examine several oxygen trapped-hole centers resulting from X-irradiation of BaFBr and BaFCl crystals. The calculations employ recently derived interatomic potentials for the oxide ion-host anion interactions. Particular attention is focussed on the sites occupied by the oxide impurity and the energetics of ionization. Our results show the defect model involving O⁻ substitutional at a Br⁻/Cl⁻ site to be a favorable trapped-hole center, in accord with the assignment proposed from electron paramagnetic resonance measurements. The defect simulations find a large energy barrier to oxide interstitial formation from the oxide precursor at a substitutional site, which suggests that conversion from substitutional O⁻ to interstitial O⁻ is highly unlikely, as has been demonstrated in EPR experiments. Ion displacements following lattice relaxation about the defect are also examined. The position of O⁻ substituted for Br⁻ in BaFBr is computed to be displaced from the regular lattice site by 0.53 Å, along the c axis towards the Ba²⁺ ion plane, in agreement with models derived later from ENDOR experiments, while O⁻ substituted for F⁻ remains on the lattice site, in agreement with experiment. (author).

  11. Human-computer interaction: psychology as a science of design.

    Science.gov (United States)

    Carroll, J M

    1997-01-01

    Human-computer interaction (HCI) study is the region of intersection between psychology and the social sciences, on the one hand, and computer science and technology, on the other. HCI researchers analyze and design specific user interface technologies (e.g. pointing devices). They study and improve the processes of technology development (e.g. task analysis, design rationale). They develop and evaluate new applications of technology (e.g. word processors, digital libraries). Throughout the past two decades, HCI has progressively integrated its scientific concerns with the engineering goal of improving the usability of computer systems and applications, which has resulted in a body of technical knowledge and methodology. HCI continues to provide a challenging test domain for applying and developing psychological and social theory in the context of technology development and use.

  12. Human-Centered Command and Control of Future Autonomous Systems

    Science.gov (United States)

    2013-06-01

    displays revealed by our recent Naïve Realism research in metacognition and visual displays (Smallman & Cook, 2011). The role of the human factors...values as comparisons for real-time values to help monitor. These work-arounds are strikingly similar to strategies used by nuclear plant operators when...monitoring (Mumaw, Roth, Vicente, & Burns, 2000). The development and use of these strategies is indicative of the shortfalls of systems in both

  13. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  14. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting and state-of-the-art collection of papers on recent progress in Human-Computer System Interaction (H-CSI). It contributes a profound description of the actual status of the H-CSI field and also provides a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the particular papers and their distribution across the parts make this volume fascinating reading, giving the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all deep issues that ...

  15. Computational Hemodynamic Simulation of Human Circulatory System under Altered Gravity

    Science.gov (United States)

    Kim, Chang Sung; Kiris, Cetin; Kwak, Dochan

    2003-01-01

    A computational hemodynamics approach is presented to simulate the blood flow through the human circulatory system under altered gravity conditions. Numerical techniques relevant to hemodynamics are introduced: non-Newtonian modeling for flow characteristics governed by red blood cells, distensible wall motion due to the heart pulse, and capillary bed modeling for outflow boundary conditions. Gravitational body force terms are added to the Navier-Stokes equations to study the effects of gravity on internal flows. Six types of gravity benchmark problems are presented to provide a fundamental understanding of gravitational effects on the human circulatory system. For code validation, computed results are compared with steady and unsteady experimental data for non-Newtonian flows in a carotid bifurcation model and a curved circular tube, respectively. This computational approach is then applied to the blood circulation in the human brain as a target problem. A three-dimensional, idealized Circle of Willis configuration is developed with minor arteries truncated based on anatomical data. Demonstrated is not only the mechanism of the collateral circulation but also the effects of gravity on the distensible wall motion and resultant flow patterns.
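
    For reference, adding a gravitational body force to the incompressible momentum equation takes the standard form below; the paper's specific non-Newtonian stress model is not reproduced in this record, so the stress tensor is left generic:

        \rho \left( \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\,\mathbf{u} \right)
          = -\nabla p + \nabla \cdot \boldsymbol{\tau} + \rho \mathbf{g}

    Here u is the velocity, p the pressure, tau the (shear-rate-dependent) viscous stress tensor, and the body-force term rho g is what is varied in direction and magnitude across the altered-gravity benchmark conditions.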

  16. Human Systems Engineering for Launch processing at Kennedy Space Center (KSC)

    Science.gov (United States)

    Henderson, Gena; Stambolian, Damon B.; Stelges, Katrine

    2012-01-01

    Launch processing at Kennedy Space Center (KSC) is primarily accomplished by human users of expensive and specialized equipment. In order to reduce the likelihood of human error, personal injuries, damage to hardware, and loss of mission, the design process for the hardware needs to include the human's relationship with the hardware. Just as a design has electrical, mechanical, and fluid aspects, the human aspect is equally important. The focus of this presentation is to illustrate how KSC accomplishes the inclusion of the human aspect in design using human-centered hardware modeling and engineering. The presentation also explains current and future plans for research and development aimed at improving our human factors analysis tools and processes.

  17. Human-Centered Planning for Effective Task Autonomy

    Science.gov (United States)

    2012-05-01

    complete the activities, including a soccer ball, tennis balls, rackets, step stools, and golf clubs. They were required to carry a Nokia 770 Internet...ball around the room once Steps Step up and down off a stool 10 times Tennis Bounce a tennis ball on a racket 10 times Golf Putt golf balls on a mini...system reliability affects pilot decision making. In Human Factors and Ergonomics Society 42nd Annual Meeting. 6.1, 8.5 Barret, L., and Barrett, D. 2001

  18. Center of mass velocity-based predictions in balance recovery following pelvis perturbations during human walking

    NARCIS (Netherlands)

    Vlutters, Mark; van Asseldonk, Edwin H.F.; van der Kooij, Herman

    2016-01-01

    In many simple walking models foot placement dictates the center of pressure location and ground reaction force components, whereas humans can modulate these aspects after foot contact. Because of these differences, it is unclear to what extent predictions made by such models are valid for human walking.

  19. Building "Bob": A Project Exploring the Human Body at Western Illinois University Preschool Center

    Science.gov (United States)

    Brouette, Scott

    2008-01-01

    When the children at Western Illinois University Preschool Center embarked on a study of human bodies, they decided to build a life-size model of a body, organ by organ from the inside out, to represent some of the things they were learning. This article describes the building of "Bob," the human body model, highlighting the children's…

  20. Criteria of Human-computer Interface Design for Computer Assisted Surgery Systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian-guo; LIN Yan-ping; WANG Cheng-tao; LIU Zhi-hong; YANG Qing-ming

    2008-01-01

    In recent years, computer assisted surgery (CAS) systems have become more and more common in clinical practice, but few specific design criteria have been proposed for the human-computer interface (HCI) in CAS systems. This paper attempts to give universal criteria for HCI design in CAS systems through a demonstration application: total knee replacement (TKR) with a nonimage-based navigation system. A typical computer assisted process can be divided into four phases: the preoperative planning phase, the intraoperative registration phase, the intraoperative navigation phase and finally the postoperative assessment phase. The interface design for each of the four phases is described in the demonstration application. The criteria summarized in this paper can help software developers achieve reliable and effective interfaces for new CAS systems more easily.

  1. Issues in human/computer control of dexterous remote hands

    Science.gov (United States)

    Salisbury, K.

    1987-01-01

    Much research on dexterous robot hands has been aimed at the design and control problems associated with their autonomous operation, while relatively little research has addressed the problem of direct human control. It is likely that these two modes can be combined in a complementary manner yielding more capability than either alone could provide. While many of the issues in mixed computer/human control of dexterous hands parallel those found in supervisory control of traditional remote manipulators, the unique geometry and capabilities of dexterous hands pose many new problems. Among these are the control of redundant degrees of freedom, grasp stabilization and specification of non-anthropomorphic behavior. An overview is given of progress made at the MIT AI Laboratory in control of the Salisbury 3 finger hand, including experiments in grasp planning and manipulation via controlled slip. It is also suggested how we might introduce human control into the process at a variety of functional levels.

  2. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    Finesse is required while performing with many traditional musical instruments, as they are extremely responsive to human inputs. The violin is specifically examined here, as it excels at translating a performer's gestures into sound in manners that evoke a wide range of affective qualities...... of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human-Computer Interaction technologies for the violin are developed in order to allow musicians to explore new methods of creating music. Through this process, the aim is to provide musicians with control systems that let them transcend the interface itself, and focus on musically compelling performances....

  3. Virtual reality: A human centered tool for improving Manufacturing

    CERN Document Server

    Bennis, Fouad; Dépincé, Philippe

    2007-01-01

    Manufacturing is using Virtual Reality tools to enhance the product life cycle. Their definitions are still in flux and it is necessary to define their connections. We therefore first introduce some definitions in more detail, finding that, while Virtual Manufacturing concepts originated in machining operations and have evolved in that manufacturing area, there are many applications in other fields such as casting, forging, sheet metalworking and robotics (mechanisms). From recent projects in Europe and the USA, we notice that human perception and the simulation of mannequins are more and more needed in both fields. In this context, we have identified applications such as ergonomic studies, assembly and maintenance simulation, design and training, where virtual reality tools can be applied. We thus identify a family of applications where virtual reality tools give engineers the main role in the optimization process. We illustrate our paper with several examples where virtual r...

  4. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    CERN Document Server

    Buyya, Rajkumar; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and a large environmental carbon footprint. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational costs. This paper presents a vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between various data center infrastructures (i.e., the hardware, power units, cooling and software), and holistically work to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of ...
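
    As a generic illustration of this class of algorithms (not the paper's specific provisioning scheme), consolidation can be sketched as bin packing: place virtual-machine loads on as few hosts as possible so the rest can be powered down. The host capacity and loads below are invented numbers.

        def consolidate(vm_loads, host_capacity):
            """Best-fit-decreasing placement of VM loads onto identical hosts.

            Packing load onto few hosts lets the remaining hosts enter a
            low-power state, the basic lever of energy-aware provisioning.
            Returns a list of hosts, each a list of the VM loads it carries.
            """
            hosts = []
            for load in sorted(vm_loads, reverse=True):
                # Among hosts that can still fit this VM, pick the tightest fit.
                fitting = [h for h in hosts if sum(h) + load <= host_capacity]
                if fitting:
                    min(fitting, key=lambda h: host_capacity - sum(h) - load).append(load)
                else:
                    hosts.append([load])   # "power on" a new host
            return hosts

        placement = consolidate([0.3, 0.6, 0.2, 0.5, 0.4], host_capacity=1.0)
        print(len(placement), "hosts active:", placement)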

  5. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    Full Text Available This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task and (4) one more dynamic imagery task (auditory imagery, spatial navigation, imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely disabled users.

  6. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Science.gov (United States)

    Friedrich, Elisabeth V C; Neuper, Christa; Scherer, Reinhold

    2013-01-01

    This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task and (4) one more dynamic imagery task (auditory imagery, spatial navigation, imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely disabled users.
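
    To illustrate the classification pipeline the abstract names (common spatial patterns followed by Fisher's linear discriminant analysis), here is a minimal two-class sketch on random stand-in EEG; the study itself used four classes with individually selected tasks and frequency bands, which this sketch does not reproduce.

        import numpy as np
        from scipy.linalg import eigh
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def csp_filters(epochs_a, epochs_b, n_pairs=3):
            """Two-class CSP; epochs_* have shape (trials, channels, samples)."""
            cov = lambda e: np.mean([x @ x.T / np.trace(x @ x.T) for x in e], axis=0)
            # Generalized eigenproblem: variance of class A relative to A + B.
            vals, vecs = eigh(cov(epochs_a), cov(epochs_a) + cov(epochs_b))
            order = np.argsort(vals)
            picks = np.r_[order[:n_pairs], order[-n_pairs:]]   # both extremes
            return vecs[:, picks].T

        def log_var_features(epochs, filters):
            projected = np.einsum('fc,tcs->tfs', filters, epochs)
            return np.log(projected.var(axis=2))

        # Random illustrative data: 40 trials, 16 channels, 250 samples per class
        rng = np.random.default_rng(0)
        a, b = rng.standard_normal((2, 40, 16, 250))
        b[:, 3] *= 2.0                      # inject a class difference on channel 3
        w = csp_filters(a, b)
        X = np.vstack([log_var_features(a, w), log_var_features(b, w)])
        y = np.r_[np.zeros(40), np.ones(40)]
        print(LinearDiscriminantAnalysis().fit(X, y).score(X, y))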

  7. Computed tomography of human joints and radioactive waste drums

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, E; Bernardi, R; Hollerbach, K; Logan, C; Martz, H; Roberson, G P

    1999-06-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have been finding increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) The computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A&PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  8. Gesture controlled human-computer interface for the disabled.

    Science.gov (United States)

    Szczepaniak, Oskar M; Sawicki, Dariusz J

    2017-02-28

    The possibility of using a computer by a disabled person is one of the difficult problems of human-computer interaction (HCI), while professional activity (employment) is one of the most important factors affecting the quality of life, especially for disabled people. The aim of the project was to propose a new HCI system that would allow people who have lost the ability to operate a standard computer to resume employment. The basic requirement was to replace all functions of a standard mouse without the need to perform precise hand movements or use fingers. Microsoft's Kinect motion controller was selected as the device to recognize hand movements. Several tests were made in order to create an optimal working environment with the new device. A new communication system consisting of the Kinect device and the appropriate software was built. The proposed system was tested by means of standard subjective evaluations and objective metrics according to the standard ISO 9241-411:2012. The overall rating of the new HCI system shows the acceptance of the solution. The objective tests show that although the new system is a bit slower, it may effectively replace the computer mouse. The new HCI system fulfilled its task for a specific disabled person. This resulted in the ability to return to work. Additionally, the project confirmed the possibility of effective but nonstandard use of the Kinect device. Med Pr 2017;68(1):1-21.

  9. Examining human rights and mental health among women in drug abuse treatment centers in Afghanistan.

    Science.gov (United States)

    Abadi, Melissa Harris; Shamblen, Stephen R; Johnson, Knowlton; Thompson, Kirsten; Young, Linda; Courser, Matthew; Vanderhoff, Jude; Browne, Thom

    2012-01-01

    Denial of human rights, gender disparities, and living in a war zone can be associated with severe depression and poor social functioning, especially for female drug abusers. This study of Afghan women in drug abuse treatment (DAT) centers assesses (a) the extent to which these women have experienced human rights violations and mental health problems prior to entering the DAT centers, and (b) whether there are specific risk factors for human rights violations among this population. A total of 176 in-person interviews were conducted with female patients admitted to three drug abuse treatment centers in Afghanistan in 2010. Nearly all women (91%) reported limitations with social functioning. Further, 41% of the women indicated they had suicide ideation and 27% of the women had attempted suicide at least once 30 days prior to entering the DAT centers due to feelings of sadness or hopelessness. Half of the women (50%) experienced at least one human rights violation in the past year prior to entering the DAT centers. Risk factors for human rights violations among this population include marital status, ethnicity, literacy, employment status, entering treatment based on one's own desire, limited social functioning, and suicide attempts. Conclusions stemming from the results are discussed.

  10. What do we mean by Human-Centered Design of Life-Critical Systems?

    Science.gov (United States)

    Boy, Guy A

    2012-01-01

    Human-centered design is not a new approach to design. Aerospace is a good example of a life-critical systems domain where participatory design was fully integrated, involving experimental test pilots and design engineers as well as many other actors of the aerospace engineering community. This paper provides six topics that are currently part of the requirements of the Ph.D. Program in Human-Centered Design of the Florida Institute of Technology (FIT). This Human-Centered Design program offers principles, methods and tools that support human-centered sustainable products such as mission or process control environments, cockpits and hospital operating rooms. It supports education and training of design thinkers who are natural leaders and understand complex relationships among technology, organizations and people. We all need to understand what we want to do with technology, how we should organize ourselves for a better life, and finally who we are and have become. Human-centered design is being developed for all these reasons and issues.

  11. Patient-Specific Computational Modeling of Human Phonation

    Science.gov (United States)

    Xue, Qian; Zheng, Xudong; University of Maine Team

    2013-11-01

    Phonation is a common biological process resulting from the complex nonlinear coupling between glottal aerodynamics and vocal fold vibrations. In the past, simplified symmetric straight geometric models were commonly employed for experimental and computational studies. The shape of the larynx lumen and the vocal folds is in fact highly three-dimensional, and the complex realistic geometry produces profound impacts on both glottal flow and vocal fold vibrations. To elucidate the effect of geometric complexity on voice production and improve the fundamental understanding of human phonation, a full flow-structure interaction simulation is carried out on a patient-specific larynx model. To the best of our knowledge, this is the first patient-specific flow-structure interaction study of human phonation. The simulation results compare well with established human data. The effects of realistic geometry on glottal flow and vocal fold dynamics are investigated. It is found that both glottal flow and vocal fold dynamics differ substantially from those of the previous simplified models. This study also paves the way toward the development of computer models for voice disease diagnosis and surgical planning. The project described was supported by Grant Number R01DC007125 from the National Institute on Deafness and Other Communication Disorders (NIDCD).

  12. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations because, although the function of enhancers is established, their mechanism of action is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  13. Shape perception in human and computer vision an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

  14. Computer simulations of human interferon gamma mutated forms

    Science.gov (United States)

    Lilkova, E.; Litov, L.; Petkov, P.; Petkov, P.; Markov, S.; Ilieva, N.

    2010-01-01

    In the general framework of the computer-aided drug design, the method of molecular-dynamics simulations is applied for investigation of the human interferon-gamma (hIFN-γ) binding to its two known ligands (its extracellular receptor and the heparin-derived oligosaccharides). A study of 100 mutated hIFN-γ forms is presented, the mutations encompassing residues 86-88. The structural changes are investigated by comparing the lengths of the α-helices, in which these residues are included, in the native hIFN-γ molecule and in the mutated forms. The most intriguing cases are examined in detail.

  15. Study on Human-Computer Interaction in Immersive Virtual Environment

    Institute of Scientific and Technical Information of China (English)

    段红; 黄柯棣

    2002-01-01

    Human-computer interaction is one of the most important issues in research of Virtual Environments. This paper introduces interaction software developed for a virtual operating environment for space experiments. Core components of the interaction software are: an object-oriented database for behavior management of virtual objects, a software agent called virtual eye for viewpoint control, and a software agent called virtual hand for object manipulation. Based on the above components, some instance programs for object manipulation have been developed. The user can observe the virtual environment through head-mounted display system, control viewpoint by head tracker and/or keyboard, and select and manipulate virtual objects by 3D mouse.
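
    The abstract's component list suggests a simple agent structure; the skeleton below is purely illustrative (class and attribute names are invented here, not the paper's software), showing how a viewpoint agent and a manipulation agent might share a scene database.

        class VirtualEye:
            """Viewpoint-control agent: holds and updates the camera pose."""
            def __init__(self, position=(0.0, 1.7, 5.0)):
                self.position = list(position)

            def look_from(self, new_position):
                self.position = list(new_position)
                return self.position

        class VirtualHand:
            """Object-manipulation agent: selects and moves scene objects."""
            def __init__(self):
                self.held = None

            def select(self, scene, name):
                self.held = scene.get(name)     # grab object state by name
                return self.held is not None

            def move(self, delta):
                if self.held is not None:
                    self.held['pos'] = [p + d for p, d in zip(self.held['pos'], delta)]

        # Minimal scene "database": object name -> state
        scene = {'wrench': {'pos': [0.0, 0.0, 0.0]}}
        hand = VirtualHand()
        hand.select(scene, 'wrench')
        hand.move([0.1, 0.0, 0.2])
        print(scene['wrench']['pos'])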

  16. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997......)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell transduction stage, a squaring expansion, an adaptation stage, a 150-Hz lowpass modulation filter, a bandpass...
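
    As one concrete fragment of such a processing chain, the 150-Hz lowpass modulation stage can be sketched as follows; the envelope extraction and filter order here are assumptions for illustration, not the model's exact implementation.

        import numpy as np
        from scipy.signal import butter, hilbert, lfilter

        def modulation_lowpass(signal, fs, cutoff=150.0, order=1):
            """Envelope extraction followed by a lowpass on its modulations.

            A crude stand-in for one auditory-model stage: take the Hilbert
            envelope, then keep modulation content below `cutoff` Hz.
            """
            envelope = np.abs(hilbert(signal))
            b, a = butter(order, cutoff / (fs / 2), btype='low')
            return lfilter(b, a, envelope)

        # Illustrative input: a 1-kHz tone fully modulated at 40 Hz, 16-kHz rate
        fs = 16000
        t = np.arange(0, 0.5, 1 / fs)
        tone = np.sin(2 * np.pi * 1000 * t) * (1 + np.sin(2 * np.pi * 40 * t))
        env = modulation_lowpass(tone, fs)
        print(env.shape, float(env.max()))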

  17. Examining human rights and mental health among women in drug abuse treatment centers in Afghanistan

    Directory of Open Access Journals (Sweden)

    Abadi MH

    2012-04-01

    Full Text Available Melissa Harris Abadi¹, Stephen R Shamblen¹, Knowlton Johnson¹, Kirsten Thompson¹, Linda Young¹, Matthew Courser¹, Jude Vanderhoff¹, Thom Browne²; ¹Pacific Institute for Research and Evaluation – Louisville Center, Louisville, KY, USA; ²United States Department of State, Bureau of International Narcotics and Law Enforcement, Washington, DC, USA. Abstract: Denial of human rights, gender disparities, and living in a war zone can be associated with severe depression and poor social functioning, especially for female drug abusers. This study of Afghan women in drug abuse treatment (DAT) centers assesses (a) the extent to which these women have experienced human rights violations and mental health problems prior to entering the DAT centers, and (b) whether there are specific risk factors for human rights violations among this population. A total of 176 in-person interviews were conducted with female patients admitted to three drug abuse treatment centers in Afghanistan in 2010. Nearly all women (91%) reported limitations with social functioning. Further, 41% of the women indicated they had suicide ideation and 27% of the women had attempted suicide at least once 30 days prior to entering the DAT centers due to feelings of sadness or hopelessness. Half of the women (50%) experienced at least one human rights violation in the past year prior to entering the DAT centers. Risk factors for human rights violations among this population include marital status, ethnicity, literacy, employment status, entering treatment based on one's own desire, limited social functioning, and suicide attempts. Conclusions stemming from the results are discussed. Keywords: Afghanistan, women, human rights, mental health, drug abuse treatment

  18. Atoms of recognition in human and computer vision.

    Science.gov (United States)

    Ullman, Shimon; Assif, Liav; Fetaya, Ethan; Harari, Daniel

    2016-03-01

    Discovering the visual features and representations used by the brain to recognize objects is a central problem in the study of vision. Recently, neural network models of visual object recognition, including biological and deep network models, have shown remarkable progress and have begun to rival human performance in some challenging tasks. These models are trained on image examples and learn to extract features and representations and to use them for categorization. It remains unclear, however, whether the representations and learning processes discovered by current models are similar to those used by the human visual system. Here we show, by introducing and using minimal recognizable images, that the human visual system uses features and processes that are not used by current models and that are critical for recognition. We found by psychophysical studies that at the level of minimal recognizable images a minute change in the image can have a drastic effect on recognition, thus identifying features that are critical for the task. Simulations then showed that current models cannot explain this sensitivity to precise feature configurations and, more generally, do not learn to recognize minimal images at a human level. The role of the features shown here is revealed uniquely at the minimal level, where the contribution of each feature is essential. A full understanding of the learning and use of such features will extend our understanding of visual recognition and its cortical mechanisms and will enhance the capacity of computational models to learn from visual experience and to deal with recognition and detailed image interpretation.

  19. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2013-11-21

    ... ADMINISTRATION Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid... accordance with the provisions of the Privacy Act, as amended, this notice announces a renewal of an existing... Act of 1988 (Pub. L 100-503), amended the Privacy Act (5 U.S.C. 552a) by describing the...

  20. Adhesion of Human B Cells to Germinal Centers in Vitro Involves VLA-4 and INCAM-110

    Science.gov (United States)

    Freedman, Arnold S.; Munro, J. Michael; Rice, G. Edgar; Bevilacqua, Michael P.; Morimoto, Chikao; McIntyre, Bradley W.; Rhynhart, Kurt; Pober, Jordan S.; Nadler, Lee M.

    1990-08-01

    Human B lymphocytes localize and differentiate within the microenvironment of lymphoid germinal centers. A frozen section binding assay was developed for the identification of those molecules involved in the adhesive interactions between B cells and lymphoid follicles. Activated human B cells and B cell lines were found to selectively adhere to germinal centers. The VLA-4 molecule on the lymphocyte and the adhesion molecule INCAM-110, expressed on follicular dendritic cells, supported this interaction. This cellular interaction model can be used for the study of how B cells differentiate.

  1. Experimental verification of a computational technique for determining ground reactions in human bipedal stance.

    Science.gov (United States)

    Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2007-01-01

    We have developed a three-dimensional (3D) biomechanical model of human standing that enables us to study the mechanisms of posture and balance simultaneously in various directions in space. Since the two feet are on the ground, the system defines a kinematically closed chain, which has redundancy problems that cannot be resolved using the laws of mechanics alone. We have developed a computational (optimization) technique that avoids the problems with the closed-chain formulation, thus giving users of such models the ability to make predictions of joint moments and, potentially, muscle activations using more sophisticated musculoskeletal models. This paper describes the experimental verification of the computational technique that is used to estimate the ground reaction vector acting on an unconstrained foot while the other foot is attached to the ground, thus allowing human bipedal standing to be analyzed as an open-chain system. The computational approach was verified in terms of its ability to predict lower extremity joint moments derived from inverse dynamic simulations performed on data acquired from four able-bodied volunteers standing in various postures on force platforms. Sensitivity analyses performed with model simulations indicated which ground reaction force (GRF) and center of pressure (COP) components were most critical for providing better estimates of the joint moments. Overall, the joint moments predicted by the optimization approach are strongly correlated with the joint moments computed using the experimentally measured GRF and COP (0.78 correlation about the unity slope, experimental = computational results) for the postures of the four subjects examined. These results indicate that this model-based technique can be relied upon to predict reasonable and consistent estimates of the joint moments using the predicted GRF and COP for most standing postures.
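
    The paper's full three-dimensional formulation is not reproduced in this record; as a toy illustration of resolving such redundancy by optimization, the sketch below splits a known total vertical load and moment between two feet by least squares under equilibrium constraints (all numbers invented). In this planar toy the two constraints fully determine the split; in 3-D, with more unknown reaction components than equilibrium equations, the objective is what resolves the remaining redundancy.

        import numpy as np
        from scipy.optimize import minimize

        # Toy static case: total vertical load W acting at whole-body COP x_cop
        # must be shared by feet located at x_l and x_r (units: N and m).
        W, x_cop, x_l, x_r = 700.0, 0.05, -0.10, 0.15

        objective = lambda f: f[0] ** 2 + f[1] ** 2          # minimum-effort split
        constraints = (
            {'type': 'eq', 'fun': lambda f: f[0] + f[1] - W},                      # force balance
            {'type': 'eq', 'fun': lambda f: f[0] * x_l + f[1] * x_r - W * x_cop},  # moment balance
        )
        result = minimize(objective, x0=[W / 2, W / 2], constraints=constraints,
                          bounds=[(0, None), (0, None)])     # feet can only push
        print("left/right vertical forces:", result.x)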

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  3. History of the USDA Human Nutrition Research Center on Aging at Tufts University

    Science.gov (United States)

    The Jean Mayer United States Department of Agriculture Human Nutrition Research Center on Aging at Tufts University, while quite a mouthful, is aptly named, since it has contributed substantially to the legacy of Jean Mayer, to the scientific stature of the USDA and, in Atwater’s tradition, to the d...

  4. Dragons, Ladybugs, and Softballs: Girls' STEM Engagement with Human-Centered Robotics

    Science.gov (United States)

    Gomoll, Andrea; Hmelo-Silver, Cindy E.; Šabanovic, Selma; Francisco, Matthew

    2016-01-01

    Early experiences in science, technology, engineering, and math (STEM) are important for getting youth interested in STEM fields, particularly for girls. Here, we explore how an after-school robotics club can provide informal STEM experiences that inspire students to engage with STEM in the future. Human-centered robotics, with its emphasis on the…

  6. Mode 2 in action. Working across sectors to create a Center for Humanities and Technology

    NARCIS (Netherlands)

    Wyatt, S.M.E.

    2015-01-01

    This article examines recent developments in Amsterdam to establish a Center for Humanities and Technology (CHAT). The project is a collaboration between public research institutions and a private partner. To date, a White Paper has been produced that sets out a shared research agenda addressing bot

  7. Adaptive work-centered and human-aware support agents for augmented cognition in tactical environments

    NARCIS (Netherlands)

    Neef, R.M.; Maanen, P.P. van; Petiet, P.; Spoelstra, M.

    2009-01-01

    We introduce a support system concept that offers both work-centered and human-aware support for operators in tactical command and control environments. The support system augments the cognitive capabilities of the operator by offering instant, personalized task and work support. The operator obtain

  8. Exposure Science and the US EPA National Center for Computational Toxicology

    Science.gov (United States)

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  9. Computational modeling of hypertensive growth in the human carotid artery

    Science.gov (United States)

    Sáez, Pablo; Peña, Estefania; Martínez, Miguel Angel; Kuhl, Ellen

    2014-06-01

    Arterial hypertension is a chronic medical condition associated with an elevated blood pressure. Chronic arterial hypertension initiates a series of events, which are known to collectively initiate arterial wall thickening. However, the correlation between macrostructural mechanical loading, microstructural cellular changes, and macrostructural adaptation remains unclear. Here, we present a microstructurally motivated computational model for chronic arterial hypertension through smooth muscle cell growth. To model growth, we adopt a classical concept based on the multiplicative decomposition of the deformation gradient into an elastic part and a growth part. Motivated by clinical observations, we assume that the driving force for growth is the stretch sensed by the smooth muscle cells. We embed our model into a finite element framework, where growth is stored locally as an internal variable. First, to demonstrate the features of our model, we investigate the effects of hypertensive growth in a real human carotid artery. Our results agree nicely with experimental data reported in the literature both qualitatively and quantitatively.
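
    The kinematic ansatz stated in the abstract is the multiplicative split of the deformation gradient; a stretch-driven evolution law of the generic kind used in this literature (the paper's exact rate form is not given in this record) can be written as:

        F = F^{e} F^{g}, \qquad J = \det F = J^{e} J^{g},
        \qquad \dot{\vartheta} = k(\vartheta)\,\langle \lambda^{e} - \lambda^{\mathrm{crit}} \rangle

    where F^{e} and F^{g} are the elastic and growth parts, vartheta is a scalar growth variable stored locally as an internal variable, lambda^{e} is the elastic stretch sensed by the smooth muscle cells, lambda^{crit} is its homeostatic threshold, and the Macaulay brackets switch growth on only above that threshold.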

  10. Human-computer interface glove using flexible piezoelectric sensors

    Science.gov (United States)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
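
    A polyvinylidene fluoride film responds to the rate of deformation, so one plausible route from raw sensor output to a joint angle (the note's exact signal chain is an assumption here) is to scale the voltage and integrate it, as sketched below.

        import numpy as np

        def angle_from_piezo(voltage, fs, sensitivity=1.0):
            """Integrate a piezoelectric bending signal into a joint angle.

            `voltage` is the amplified sensor output (V), `fs` the sample
            rate (Hz), and `sensitivity` the assumed volts per (deg/s) of
            flexion; a piezo film senses bending rate, so the angle is the
            running integral of voltage / sensitivity.
            """
            rate = voltage / sensitivity            # deg/s
            angle = np.cumsum(rate) / fs            # rectangular integration
            return angle - np.median(angle)         # crude offset/drift removal

        # Illustrative check: one 2-s flexion-extension cycle at 100 Hz
        fs = 100
        t = np.arange(0, 2, 1 / fs)
        true_angle = 45 * np.sin(np.pi * t)                     # deg
        v = np.gradient(true_angle, 1 / fs)                     # ideal piezo output
        print(angle_from_piezo(v, fs)[50], true_angle[50])      # both near 45 deg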

  11. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G.

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  12. Combining Natural Human-Computer Interaction and Wireless Communication

    Directory of Open Access Journals (Sweden)

    Ştefan Gheorghe PENTIUC

    2011-01-01

    Full Text Available In this paper we present how human-computer interaction can be improved by using wireless communication between devices. Devices that offer a natural user interaction, like the Microsoft Surface Table and tablet PCs, can work together to enhance the experience of an application. Users can use physical objects for a more natural way of handling the virtual world on one hand, and interact with other users wirelessly connected on the other. Physical objects, that interact with the surface table, have a tag attached to them, allowing us to identify them, and take the required action. The TCP/IP protocol was used to handle the wireless communication over the wireless network. A server and a client application were developed for the used devices. To get a wide range of targeted mobile devices, different frameworks for developing cross platform applications were analyzed.

  13. NASA Human Health and Performance Center: Open innovation successes and collaborative projects

    Science.gov (United States)

    Richard, Elizabeth E.; Davis, Jeffrey R.

    2014-11-01

    In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, setting the course for development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the successful execution of the strategy, driving organizational change through open innovation efforts and collaborative projects, including efforts of the NASA Human Health and Performance Center (NHHPC).

  14. Wearable joystick for gloves-on human/computer interaction

    Science.gov (United States)

    Bae, Jaewook; Voyles, Richard M.

    2006-05-01

    In this paper, we present preliminary work on a novel wearable joystick for gloves-on human/computer interaction in hazardous environments. Interacting with traditional input devices can be clumsy and inconvenient for the operator in hazardous environments due to the bulkiness of multiple system components and troublesome wires. During a collapsed structure search, for example, protective clothing, uneven footing, and "snag" points in the environment can render traditional input devices impractical. Wearable computing has been studied by various researchers to increase the portability of devices and to improve the proprioceptive sense of the wearer's intentions. Specifically, glove-like input devices to recognize hand gestures have been developed for general-purpose applications. But, regardless of their performance, prior gloves have been fragile and cumbersome to use in rough environments. In this paper, we present a new wearable joystick to remove the wires from a simple, two-degree of freedom glove interface. Thus, we develop a wearable joystick that is low cost, durable and robust, and wire-free at the glove. In order to evaluate the wearable joystick, we take into consideration two metrics during operator tests of a commercial robot: task completion time and path tortuosity. We employ fractal analysis to measure path tortuosity. Preliminary user test results are presented that compare the performance of both a wearable joystick and a traditional joystick.
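
    The paper does not specify its fractal estimator, so the sketch below uses box counting, one common choice for quantifying path tortuosity: straighter operator paths give a dimension near 1, meandering ones approach 2. The example paths are synthetic.

        import numpy as np

        def box_counting_dimension(points, scales=(2, 4, 8, 16, 32)):
            """Estimate the box-counting dimension of a 2-D path.

            `points` is an (n, 2) array of positions normalized to [0, 1);
            the dimension is the slope of log(occupied boxes) against
            log(grid resolution).
            """
            counts = [len(set(map(tuple, np.floor(points * s).astype(int))))
                      for s in scales]
            slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
            return slope

        # Illustrative paths: a straight diagonal vs. a normalized random walk
        t = np.linspace(0, 1, 2000, endpoint=False)
        straight = np.c_[t, t]
        rng = np.random.default_rng(1)
        walk = np.cumsum(rng.standard_normal((2000, 2)), axis=0)
        walk = (walk - walk.min(axis=0)) / (walk.max(axis=0) - walk.min(axis=0) + 1e-9)
        print(box_counting_dimension(straight), box_counting_dimension(walk))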

  15. The Astromaterials X-Ray Computed Tomography Laboratory at Johnson Space Center

    Science.gov (United States)

    Zeigler, R. A.; Coleff, D. M.; McCubbin, F. M.

    2017-01-01

    The Astromaterials Acquisition and Curation Office at NASA's Johnson Space Center (hereafter JSC curation) is the past, present, and future home of all of NASA's astromaterials sample collections. JSC curation currently houses all or part of nine different sample collections: (1) Apollo samples (1969), (2) Lunar samples (1972), (3) Antarctic meteorites (1976), (4) Cosmic Dust particles (1981), (5) Microparticle Impact Collection (1985), (6) Genesis solar wind atoms (2004); (7) Stardust comet Wild-2 particles (2006), (8) Stardust interstellar particles (2006), and (9) Hayabusa asteroid Itokawa particles (2010). Each sample collection is housed in a dedicated clean room, or suite of clean rooms, that is tailored to the requirements of that sample collection. Our primary goals are to maintain the long-term integrity of the samples and ensure that the samples are distributed for scientific study in a fair, timely, and responsible manner, thus maximizing the return on each sample. Part of the curation process is planning for the future, and we also perform fundamental research in advanced curation initiatives. Advanced Curation is tasked with developing procedures, technology, and data sets necessary for curating new types of sample collections, or getting new results from existing sample collections [2]. We are (and have been) planning for future curation, including cold curation, extended curation of ices and volatiles, curation of samples with special chemical considerations such as perchlorate-rich samples, and curation of organically- and biologically-sensitive samples. As part of these advanced curation efforts we are augmenting our analytical facilities as well. A micro X-Ray computed tomography (micro-XCT) laboratory dedicated to the study of astromaterials will be coming online this spring within the JSC Curation office, and we plan to add additional facilities that will enable nondestructive (or minimally-destructive) analyses of astromaterials in the near

  16. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor- and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  17. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  18. A computational model for dynamic analysis of the human gait.

    Science.gov (United States)

    Vimieiro, Claysson; Andrada, Emanuel; Witte, Hartmut; Pinotti, Marcos

    2015-01-01

    Biomechanical models are important tools in the study of human motion. This work proposes a computational model to analyse the dynamics of lower limb motion using a kinematic chain to represent the body segments and rotational joints linked by viscoelastic elements. The model uses anthropometric parameters, ground reaction forces and joint Cardan angles from subjects to analyse lower limb motion during gait. The model allows evaluating these data in each body plane. Six healthy subjects walked on a treadmill to record the kinematic and kinetic data. In addition, anthropometric parameters were recorded to construct the model. The viscoelastic parameter values were fitted for the model joints (hip, knee and ankle). The proposed model demonstrated that manipulating the viscoelastic parameters between the body segments could fit the amplitudes and frequencies of motion. The data collected in this work have viscoelastic parameter values that follow a normal distribution, indicating that these values are directly related to the gait pattern. To validate the model, we used the values of the joint angles to perform a comparison between the model results and previously published data. The model results show the same pattern and range of values found in the literature for human gait motion.
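
    Since the paper's fitted parameter values are not given here, the following is only a minimal sketch of the viscoelastic joint element the model describes: a torsional spring-damper producing a joint moment from the joint angle and angular velocity, with made-up stiffness and damping values.

        # Minimal sketch of a torsional viscoelastic joint element:
        #   M = -k * (theta - theta_rest) - c * omega
        # k (stiffness) and c (damping) are illustrative values, not the
        # subject-specific parameters fitted in the study.
        def joint_moment(theta, omega, k=150.0, c=2.5, theta_rest=0.0):
            """Return the passive joint moment (N*m) for angle theta (rad)
            and angular velocity omega (rad/s)."""
            return -k * (theta - theta_rest) - c * omega

        # Example: small knee flexion away from rest while extending
        print(joint_moment(theta=0.2, omega=-1.0))   # -> -27.5 N*m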

  19. A multisegment computer simulation of normal human gait.

    Science.gov (United States)

    Gilchrist, L A; Winter, D A

    1997-12-01

    The goal of this project was to develop a computer simulation of normal human walking that would use as driving moments resultant joint moments from a gait analysis. The system description, initial conditions and driving moments were taken from an inverse dynamics analysis of a normal walking trial. A nine-segment three-dimensional (3-D) model, including a two-part foot, was used. Torsional, linear springs and dampers were used at the hip joints to keep the trunk vertical and at the knee and ankle joints to prevent nonphysiological motion. Dampers at other joints were required to ensure a smooth and realistic motion. The simulated human successfully completed one step (550 ms), including both single and double support phases. The model proved to be sensitive to changes in the spring stiffness values of the trunk controllers. Similar sensitivity was found with the springs used to prevent hyperextension of the knee at heel contact and of the metatarsal-phalangeal joint at push-off. In general, there was much less sensitivity to the damping coefficients. This simulation improves on previous efforts because it incorporates some features necessary in simulations designed to answer clinical science questions. Other control algorithms are required, however, to ensure that the model can be realistically adapted to different subjects.
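
    As a hedged sketch of the trunk-controller idea (a torsional spring-damper that restores the trunk toward vertical), the snippet below integrates a single rotational segment with forward Euler; the inertia, stiffness, and damping are assumed values, not those of the nine-segment model.

        # Sketch: torsional spring-damper keeping a trunk segment vertical,
        # integrated with forward Euler. Values (inertia, k, c) are assumptions
        # for illustration, not the coefficients used in the simulation.
        I, k, c = 10.0, 400.0, 40.0         # inertia (kg*m^2), stiffness, damping
        theta, omega, dt = 0.3, 0.0, 0.001  # initial lean of 0.3 rad
        for step in range(2000):            # simulate 2 s
            moment = -k * theta - c * omega # restoring spring-damper moment
            omega += (moment / I) * dt
            theta += omega * dt
        print(f"trunk angle after 2 s: {theta:.4f} rad")  # decays toward 0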

  20. Human memory B cells originate from three distinct germinal center-dependent and -independent maturation pathways.

    Science.gov (United States)

    Berkowska, Magdalena A; Driessen, Gertjan J A; Bikos, Vasilis; Grosserichter-Wagener, Christina; Stamatopoulos, Kostas; Cerutti, Andrea; He, Bing; Biermann, Katharina; Lange, Johan F; van der Burg, Mirjam; van Dongen, Jacques J M; van Zelm, Menno C

    2011-08-25

    Multiple distinct memory B-cell subsets have been identified in humans, but it remains unclear how their phenotypic diversity corresponds to the type of responses from which they originate. Especially, the contribution of germinal center-independent responses in humans remains controversial. We defined 6 memory B-cell subsets based on their antigen-experienced phenotype and differential expression of CD27 and IgH isotypes. Molecular characterization of their replication history, Ig somatic hypermutation, and class-switch profiles demonstrated their origin from 3 different pathways. CD27⁻IgG⁺ and CD27⁺IgM⁺ B cells are derived from primary germinal center reactions, and CD27⁺IgA⁺ and CD27⁺IgG⁺ B cells are from consecutive germinal center responses (pathway 1). In contrast, natural effector and CD27⁻IgA⁺ memory B cells have limited proliferation and are also present in CD40L-deficient patients, reflecting a germinal center-independent origin. Natural effector cells at least in part originate from systemic responses in the splenic marginal zone (pathway 2). CD27⁻IgA⁺ cells share low replication history and dominant Igλ and IgA2 use with gut lamina propria IgA+ B cells, suggesting their common origin from local germinal center-independent responses (pathway 3). Our findings shed light on human germinal center-dependent and -independent B-cell memory formation and provide new opportunities to study these processes in immunologic diseases.

  1. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  2. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  3. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  4. The Human-Computer Interface and Information Literacy: Some Basics and Beyond.

    Science.gov (United States)

    Church, Gary M.

    1999-01-01

    Discusses human/computer interaction research, human/computer interface, and their relationships to information literacy. Highlights include communication models; cognitive perspectives; task analysis; theory of action; problem solving; instructional design considerations; and a suggestion that human/information interface may be a more appropriate…

  5. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which…
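
    A toy sketch of the sequence-conservation idea behind SIFT-like prediction follows: score each alignment column by its diversity and flag substitutions at strongly conserved positions as likely deleterious. The alignment, entropy measure, and threshold are illustrative assumptions, not SIFT's actual scoring.

        # Toy sketch of conservation-based SNP-effect prediction (SIFT-like idea):
        # positions that are nearly invariant across homologs are treated as
        # intolerant to substitution. Alignment and threshold are illustrative.
        from collections import Counter
        import math

        alignment = [            # one string per homologous protein sequence
            "MKTAYIAKQR",
            "MKTAYIAKQR",
            "MKSAYIGKQR",
            "MKTAYIAKHR",
        ]

        def column_entropy(col):
            """Shannon entropy of an alignment column (0 = fully conserved)."""
            counts = Counter(col)
            n = len(col)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        def predict(position, new_residue, threshold=0.5):
            col = [seq[position] for seq in alignment]
            if new_residue in col:
                return "tolerated (seen in homologs)"
            if column_entropy(col) < threshold:
                return "likely deleterious (conserved position)"
            return "possibly tolerated (variable position)"

        print(predict(0, "L"))   # position 0 is invariant 'M' -> likely deleterious
        print(predict(2, "A"))   # position 2 varies (T/S)    -> possibly tolerated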

  6. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    Science.gov (United States)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing which ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer graphics fantasy-reality provided by the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  7. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    Science.gov (United States)

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%). Discussion. Medical students and residents use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on resident physicians. Further study is needed to better understand how tablet computers and other mobile devices may assist in medical education and patient care.
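
    As a hedged illustration of how comparisons like "32% vs. 20%, p = 0.035" are typically computed, the sketch below runs a two-proportion z-test; the counts are rough back-calculations from the reported percentages and response numbers, so the resulting p-value is illustrative only and need not match the paper's test.

        # Two-proportion z-test, the usual test behind comparisons such as
        # "32% vs. 20%". Counts are approximations from the reported percentages
        # (66 residents, 76 students), so the p-value is illustrative only.
        from statsmodels.stats.proportion import proportions_ztest

        count = [21, 15]    # ~32% of 66 residents, ~20% of 76 students
        nobs = [66, 76]
        stat, pvalue = proportions_ztest(count, nobs)
        print(f"z = {stat:.2f}, p = {pvalue:.3f}")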

  8. Adaptive TrimTree: Green Data Center Networks through Resource Consolidation, Selective Connectedness and Energy Proportional Computing

    Directory of Open Access Journals (Sweden)

    Saima Zafar

    2016-10-01

    Full Text Available A data center is a facility with a group of networked servers used by an organization for storage, management and dissemination of its data. The increase in data center energy consumption over the past several years is staggering; therefore, efforts are being initiated to achieve energy efficiency of various components of data centers. One of the main reasons for data centers' high energy inefficiency is that most organizations run their data centers at full capacity 24/7. This results in a number of servers and switches being underutilized or even unutilized, yet working and consuming electricity around the clock. In this paper, we present Adaptive TrimTree; a mechanism that employs a combination of resource consolidation, selective connectedness and energy proportional computing for optimizing energy consumption in a Data Center Network (DCN). Adaptive TrimTree adopts a simple traffic-and-topology-based heuristic to find a minimum power network subset called the 'active network subset' that satisfies the existing network traffic conditions while switching off the residual unused network components. A 'passive network subset' is also identified for redundancy, consisting of links and switches that may be required in the future; this subset is toggled to a sleep state. An energy proportional computing technique is applied to the active network subset for adapting link data rates to workload, thus maximizing energy optimization. We have compared our proposed mechanism with fat-tree topology and ElasticTree, a scheme based on resource consolidation. Our simulation results show that our mechanism saves 50%–70% more energy as compared to fat-tree and 19.6% as compared to ElasticTree, with minimal impact on packet loss percentage and delay. Additionally, our mechanism copes better with traffic anomalies and surges due to the passive network provision.
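
    The published heuristic is not reproduced here; as a loose sketch of the idea (keep links that carry traffic active, keep a redundant link asleep as the passive subset, power off the rest), consider the toy example below, in which the topology, loads, and selection rule are all illustrative assumptions.

        # Hedged sketch of a TrimTree-style consolidation heuristic: links that
        # carry traffic stay active, one unused link is kept asleep as a spare,
        # and everything else is powered off. Topology and traffic values are
        # illustrative assumptions, not the published algorithm or data.
        links = {                 # link -> current traffic (Gbps)
            ("s1", "s2"): 4.0,
            ("s1", "s3"): 0.0,
            ("s2", "s3"): 0.0,
            ("s2", "s4"): 7.5,
            ("s3", "s4"): 0.0,
        }

        active = {l for l, load in links.items() if load > 0}
        idle = [l for l in links if l not in active]
        passive = set(idle[:1])              # keep one idle link as sleeping spare
        off = set(idle[1:])                  # power off the residual subset

        print("active :", sorted(active))
        print("passive:", sorted(passive))   # sleep state, woken on traffic surges
        print("off    :", sorted(off))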

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  10. A Semiotic Analysis of Young Children's Symbol Making in a Classroom Computer Center.

    Science.gov (United States)

    Labbo, Linda D.

    1996-01-01

    Investigates kindergarten children's production and use of symbol making on the computer in the classroom. Uses the metaphor of "screenland" to describe children's stances toward their work--children viewed the computer as a land to be entered for various purposes. Suggests that as these children emerged as users of symbols they also learned how…

  11. Performance Analysis of Heterogeneous Data Centers in Cloud Computing Using a Complex Queuing Model

    Directory of Open Access Journals (Sweden)

    Wei-Hua Bai

    2015-01-01

    Full Text Available Performance evaluation of modern cloud data centers has attracted considerable research attention among both cloud providers and cloud customers. In this paper, we investigate the heterogeneity of modern data centers and the service process used in these heterogeneous data centers. Using queuing theory, we construct a complex queuing model composed of two concatenated queuing systems and present this as an analytical model for evaluating the performance of heterogeneous data centers. Based on this complex queuing model, we analyze the mean response time, the mean waiting time, and other important performance indicators. We also conduct simulation experiments to confirm the validity of the complex queuing model. We further conduct numerical experiments to demonstrate that the traffic intensity (or utilization) of each execution server, as well as the configuration of server clusters, in a heterogeneous data center will impact the performance of the system. Our results indicate that our analytical model is effective in accurately estimating the performance of the heterogeneous data center.
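
    The paper's two-stage concatenated model is not reproduced here; as a minimal illustration of the quantities it analyzes (waiting and response times as functions of traffic intensity and cluster configuration), the sketch below evaluates a single M/M/c cluster with the Erlang C formula, using assumed arrival and service rates.

        # Minimal illustration of queueing quantities (mean waiting/response time)
        # for one M/M/c server cluster via the Erlang C formula. This is not the
        # paper's two-stage model; lambda, mu, and c are assumed values.
        from math import factorial

        def erlang_c(c, lam, mu):
            """Probability an arriving job must wait (Erlang C)."""
            a = lam / mu                      # offered load
            rho = a / c                       # per-server utilization (< 1)
            summed = sum(a**k / factorial(k) for k in range(c))
            top = a**c / (factorial(c) * (1 - rho))
            return top / (summed + top)

        lam, mu, c = 40.0, 5.0, 10            # arrivals/s, service rate/s, servers
        pw = erlang_c(c, lam, mu)
        wq = pw / (c * mu - lam)              # mean waiting time in queue
        print(f"P(wait) = {pw:.3f}, Wq = {wq:.4f} s, W = {wq + 1/mu:.4f} s")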

  12. Effective Use of Human Computer Interaction in Digital Academic Supportive Devices

    OpenAIRE

    Thuseethan, S.; Kuhanesan, S.

    2015-01-01

    In this research, the literature on human-computer interaction is reviewed, and the technology aspect of human-computer interaction related to digital academic supportive devices is also analyzed. Based on these concerns, recommendations for designing good human-computer digital academic supportive devices are analyzed and proposed. Due to improvements in both hardware and software, digital devices have shown continuous advances in efficiency and processing capacity. However, many of th...

  13. A review of the design and development processes of simulation for training in healthcare - A technology-centered versus a human-centered perspective.

    Science.gov (United States)

    Persson, Johanna

    2017-01-01

    This article reviews the literature on simulation systems for training in healthcare with regard to the prevalence of human-centered approaches in the design and development of these systems, motivated by this field's tradition of working technology-centered. The results show that the focus on human needs and context of use is limited. It is argued that reducing the focus on technical advancements in favor of the needs of the users and the healthcare community, underpinned by human factors and ergonomics theory, is preferable. Given the low number of identified articles describing or discussing human-centered approaches, it is furthermore concluded that the publication culture promotes technical descriptions and summative evaluations rather than descriptions of, and reflections on, the design and development processes. Shifting the focus from a technology-centered approach to a human-centered one can aid in the process of creating simulation systems for training in healthcare that are: 1) relevant to the learning objectives, 2) adapted to the needs of users, context and task, and 3) not selected based on technical or fidelity criteria.

  14. NASA Human Health and Performance Center: Open Innovation Successes and Collaborative Projects

    Science.gov (United States)

    Davis, Jeffrey R.; Richard, Elizabeth E.

    2014-01-01

    In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, which resulted in the development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the open innovation successes and collaborative projects developed over this timeframe, including the efforts of the NASA Human Health and Performance Center (NHHPC), which was established to advance human health and performance innovations for spaceflight and societal benefit via collaboration in new markets.

  15. An older-worker employment model: Japan's Silver Human Resource Centers.

    Science.gov (United States)

    Bass, S A; Oka, M

    1995-10-01

    Over the past 20 years, a unique model of publicly assisted industries has developed in Japan, which contracts for services provided by retirees. Jobs for retirees are part-time and temporary in nature and, for the most part, are designed to assist in expanding community-based services. The program, known as the Silver Human Resource Centers, has expanded nationwide and reflects a novel approach to the productive engagement of retirees in society that may be replicable in other industrialized nations.

  16. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    Science.gov (United States)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    … manage the use of the subsurface of the Earth. EPOS started its Implementation Phase in October 2015 and is now actively working to integrate multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS) - European-wide organizations and e-infrastructures providing community-specific data and data products - and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products and services will be integrated into the Integrated Core Services (ICS) system, which will ensure their interoperability and access to these services by the scientific community as well as other users within society. The EPOS Competence Center (EPOS CC) goal is to tackle two of the main challenges that the ICS are going to face in the near future, by taking advantage of the technical solutions provided by EGI. To this end, we will present the two pilot use cases the EGI-EPOS CC is developing: 1) The AAI pilot, dealing with the provision of transparent and homogeneous access to the ICS infrastructure for users owning different kinds of credentials (e.g. eduGAIN, OpenID Connect, X.509 certificates etc.); here the focus is on the mechanisms which allow credential delegation. 2) The computational pilot, improving the back-end services of an existing application in the field of computational seismology, developed in the context of the EC-funded project VERCE. The application allows the processing and comparison of data resulting from the simulation of seismic wave propagation following a real earthquake with real measurements recorded by seismographs. While the simulation data are produced directly by the users and stored in a Data Management System, the observations need to be pre-staged from institutional data services, which are maintained by the community itself. This use case aims at exploiting the EGI FedCloud e-infrastructure for Data…

  17. Human resources management in fitness centers and their relationship with the organizational performance

    Directory of Open Access Journals (Sweden)

    Jerónimo García Fernández

    2014-12-01

    Full Text Available Purpose: Human capital is essential in organizations providing sports services. However, there are few studies that examine which practices are carried out and whether they help sports organizations achieve better results. The aim of this paper is therefore to analyze human resource management practices in private fitness centers and their relationship with organizational performance. Design/methodology/approach: A questionnaire was administered to 101 managers of private fitness centers in Spain, performing exploratory and confirmatory factor analysis, and linear regressions between the variables. Findings: In fitness organizations, the findings show that training, reward, communication and selection practices are positively correlated with organizational performance. Research limitations/implications: The convenience sampling in a single country limits the extrapolation of the results to the wider market. Originality/value: First, it contributes to a literature in which there are no studies analyzing human resource management in sport organizations from the point of view of top leaders. Second, it allows fitness center managers to adopt practices that improve organizational performance.

  18. Epidemic transmission of human immunodeficiency virus in renal dialysis centers in Egypt.

    Science.gov (United States)

    El Sayed, N M; Gomatos, P J; Beck-Sagué, C M; Dietrich, U; von Briesen, H; Osmanov, S; Esparza, J; Arthur, R R; Wahdan, M H; Jarvis, W R

    2000-01-01

    In 1993, an epidemic of human immunodeficiency virus (HIV) infection occurred among 39 patients at 2 renal dialysis centers in Egypt. The centers, private center A (PCA) and university center A (UCA), were visited, HIV-infected patients were interviewed, seroconversion rates at UCA were calculated, and the relatedness of HIV strains was determined by sequence analysis; 34 (62%) of 55 patients from UCA and 5 (42%) of 12 patients from PCA were HIV-infected. The HIV seroconversion risk at UCA varied significantly with the day and shift of the dialysis session. Practices that resulted in sharing of syringes among patients were observed at both centers. The analyzed V3 loop sequences of the HIV strains of 12 outbreak patients were >96% related to each other. V3 loop sequences from each of 8 HIV-infected Egyptians unrelated to the 1993 epidemic were only 76%-89% related to those from outbreak strains. Dialysis patients may be at risk for HIV infection if infection control guidelines are not followed.

  19. Brain representation of object-centered space in monkeys and humans.

    Science.gov (United States)

    Olson, Carl R

    2003-01-01

    Visuospatial cognition requires taking into account where things are relative to each other and not just relative to the viewer. Consequently it would make sense for the brain to form an explicit representation of object-centered and not just of ego-centered space. Evidence bearing on the presence and nature of neural maps of object-centered space has come from two sources: single-neuron recording in behaving monkeys and assessment of the visual abilities of human patients with hemispatial neglect. Studies of the supplementary eye field of the monkey have revealed that it contains neurons with object-centered spatial selectivity. These neurons fire when the monkey has selected, as target for an eye movement or attention, a particular location defined relative to a reference object. Studies of neglect have revealed that in some patients the condition is expressed with respect to an object-centered and object-aligned reference frame. These patients neglect one side of an object, as defined relative to its intrinsic midline, regardless of its location and orientation relative to the viewer. The two sets of observations are complementary in the sense that the loss of neurons, such as observed in the monkey, could explain the spatial distribution of neglect in these patients.

  20. Computational lipidology: predicting lipoprotein density profiles in human blood plasma.

    Directory of Open Access Journals (Sweden)

    Katrin Hübner

    2008-05-01

    Full Text Available Monitoring cholesterol levels is strongly recommended to identify patients at risk for myocardial infarction. However, clinical markers beyond "bad" and "good" cholesterol are needed to precisely predict individual lipid disorders. Our work contributes to this aim by bringing together experiment and theory. We developed a novel computer-based model of the human plasma lipoprotein metabolism in order to simulate the blood lipid levels in high resolution. Instead of focusing on a few conventionally used predefined lipoprotein density classes (LDL, HDL), we consider the entire protein and lipid composition spectrum of individual lipoprotein complexes. Subsequently, their distribution over density (which equals the lipoprotein profile) is calculated. As our main results, we (i) successfully reproduced clinically measured lipoprotein profiles of healthy subjects; (ii) assigned lipoproteins to narrow density classes, named high-resolution density sub-fractions (hrDS), revealing heterogeneous lipoprotein distributions within the major lipoprotein classes; and (iii) present model-based predictions of changes in the lipoprotein distribution elicited by disorders in underlying molecular processes. In its present state, the model offers a platform for many future applications aimed at understanding the reasons for inter-individual variability, identifying new sub-fractions of potential clinical relevance and a patient-oriented diagnosis of the potential molecular causes for individual dyslipidemia.

  1. Brain computer interface to enhance episodic memory in human participants

    Directory of Open Access Journals (Sweden)

    John F Burke

    2015-01-01

    Full Text Available Recent research has revealed that neural oscillations in the theta (4-8 Hz) and alpha (9-14 Hz) bands are predictive of future success in memory encoding. Because these signals occur before the presentation of an upcoming stimulus, they are considered stimulus-independent in that they correlate with enhanced memory encoding independent of the item being encoded. Thus, such stimulus-independent activity has important implications for the neural mechanisms underlying episodic memory as well as the development of cognitive neural prosthetics. Here, we developed a brain computer interface (BCI) to test the ability of such pre-stimulus activity to modulate subsequent memory encoding. We recorded intracranial electroencephalography (iEEG) in neurosurgical patients as they performed a free recall memory task, and detected iEEG theta and alpha oscillations that correlated with optimal memory encoding. We then used these detected oscillatory changes to trigger the presentation of items in the free recall task. We found that item presentation contingent upon the presence of prestimulus theta and alpha oscillations modulated memory performance in more sessions than expected by chance. Our results suggest that an electrophysiological signal may be causally linked to a specific behavioral condition, and contingent stimulus presentation has the potential to modulate human memory encoding.
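
    A hedged sketch of the triggering idea described (present the next item when prestimulus theta power crosses a threshold) is given below; the sampling rate, filter design, window, and threshold are illustrative assumptions, not the study's detector.

        # Hedged sketch of BCI-triggered item presentation: present the next
        # study item when theta (4-8 Hz) power crosses a threshold. Sampling
        # rate, filter, and threshold are illustrative assumptions.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 250                                # assumed sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(1)
        eeg = rng.normal(size=t.size) + 0.8 * np.sin(2 * np.pi * 6 * t)  # toy signal

        b, a = butter(4, [4 / (fs / 2), 8 / (fs / 2)], btype="band")
        theta = filtfilt(b, a, eeg)

        win = fs // 2                           # 0.5 s sliding window
        power = np.convolve(theta**2, np.ones(win) / win, mode="same")
        threshold = np.percentile(power, 75)    # assumed trigger threshold

        trigger_idx = np.flatnonzero(power > threshold)
        if trigger_idx.size:
            print(f"present item at t = {t[trigger_idx[0]]:.2f} s")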

  2. AHPCRC (Army High Performance Computing Rsearch Center) Bulletin. Volume 1, Issue 4

    Science.gov (United States)

    2011-01-01

    ... as glycerol or dimethyl sulfoxide (DMSO) to prevent agglomeration, then freeze-drying them (lyophilization). The platelets are thawed and ... the reform and improvement of mathematics and science education by appropriate incorporation of computational and communication technologies. Our ...

  3. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  4. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    Science.gov (United States)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most-capable large aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. In the current design, the facility and its instruments will generate data volumes of 3 PB per year, and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata to its acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets, and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.
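
    As a quick back-of-envelope reading of the stated 3 PB per year, the snippet below converts it to a sustained ingest rate, assuming (purely for illustration) uniform data-taking around the year.

        # Back-of-envelope sustained ingest rate for the stated 3 PB/year,
        # assuming uniform data-taking (an assumption, not a DKIST figure).
        PB = 1e15
        seconds_per_year = 365 * 24 * 3600
        rate = 3 * PB / seconds_per_year
        print(f"~{rate / 1e6:.0f} MB/s sustained")   # ~95 MB/s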

  5. Measurements and predictions of the air distribution systems in high compute density (Internet) data centers

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jinkyun [HIMEC (Hanil Mechanical Electrical Consultants) Ltd., Seoul 150-103 (Korea); Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea); Lim, Taesub; Kim, Byungseon Sean [Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea)

    2009-10-15

    When equipment power density increases, a critical goal of a data center cooling system is to separate the equipment exhaust air from the equipment intake air in order to prevent the IT server from overheating. Cooling systems for data centers are primarily differentiated according to the way they distribute air. The six combinations of flooded and locally ducted air distribution make up the vast majority of all installations, except fully ducted air distribution methods. Once the air distribution system (ADS) is selected, there are other elements that must be integrated into the system design. In this research, the design parameters and IT environmental aspects of the cooling system were studied with a high heat density data center. CFD simulation analysis was carried out in order to compare the heat removal efficiencies of various air distribution systems. The IT environment of an actual operating data center is measured to validate a model for predicting the effect of different air distribution systems. A method for planning and design of the appropriate air distribution system is described. IT professionals versed in precision air distribution mechanisms, components, and configurations can work more effectively with mechanical engineers to ensure the specification and design of optimized cooling solutions. (author)

  6. 64-row multi-detector computed tomography coronary image from a center with early experience: first illustration of learning curve

    Institute of Scientific and Technical Information of China (English)

    Sze Piaw CHIN; Tiong Kiam ONG; Wei Ling CHAN; Chee Khoon LIEW; M.Tobias Seyfarth; Fong Yean Yip ALAN; Houng Bang LIEW; Kui Hian SIM

    2006-01-01

    Background and objectives: The recent joint ACCF/AHA clinical competence statement on cardiac imaging with multi-detector computed tomography recommended a minimum of 6 months training and 300 contrast examinations, of which the candidate must be directly involved in at least 100 studies. Whether this is adequate to become proficient in interpretation of coronary computed tomography angiography (CTA) … significant coronary stenosis in a center with 1 year's experience using a 64-row scanner. Methods: A total of 778 patients underwent contrast-enhanced CTA between January and December 2005. Of these patients, 301 also underwent contrast-enhanced conventional coronary angiography (CCA). These patients were divided into 4 groups according to the time the examination was performed. Group Q1: first quarter of the year (n=20), Group Q2: second quarter (n=128), Group Q3: third quarter (n=134), and Group Q4: fourth quarter. Results: The sensitivity, specificity, positive, and negative predictive values were Q1 - 64%, 89%, 49% and 94%, respectively; Q2 - 79%, 96%, 74% and 97%, respectively; Q3 - 78%, 96%, 74% and 97%, respectively; and Q4 - 100% for all. Conclusions: In a center with formal training and a high caseload, our accuracy in CTA analysis reached a plateau after 6 months of experience. Test-bolus protocols produce better image quality and can improve accuracy. New centers embarking on CTA will need to overcome an initial 6-month learning curve depending upon the caseload, during which time they should consider correlation with CCA.
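
    The per-quarter accuracy figures come from 2x2 tables of CTA readings against conventional angiography; the sketch below shows the arithmetic, with counts invented to roughly reproduce the reported first-quarter values.

        # Sensitivity, specificity, PPV and NPV from a 2x2 table of CTA vs.
        # conventional coronary angiography. The counts below are assumed for
        # illustration (chosen to roughly match Q1's reported 64/89/49/94%);
        # the paper reports only the derived percentages.
        tp, fp, fn, tn = 18, 19, 10, 154

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        print(f"Se {sensitivity:.0%}, Sp {specificity:.0%}, "
              f"PPV {ppv:.0%}, NPV {npv:.0%}")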

  7. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    Science.gov (United States)

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, to estimate the prevalence, and to describe risk factors associated with Computer Vision Syndrome among operators of two call centers in São Paulo (n = 476). The methods include a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms answered as always, often or sometimes. The multiple logistic regression model was created using the stepwise forward likelihood method, retaining variables with significance levels below 5% (p < 0.05). The prevalence of Computer Vision Syndrome was 54.6%. Associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). Organizational and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.
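
    A hedged sketch of how odds ratios such as these are obtained from a multiple logistic regression follows; the data are synthetic and the variable names illustrative, not the study's dataset.

        # Hedged sketch: odds ratios from a logistic regression, as used for the
        # reported associations (e.g., OR 2.6 for female sex). Data are synthetic;
        # variable names are illustrative, not the study's dataset.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 476
        female = rng.integers(0, 2, n)
        high_demand = rng.integers(0, 2, n)
        logit_p = -0.5 + 0.95 * female + 0.1 * high_demand
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = sm.add_constant(np.column_stack([female, high_demand]))
        fit = sm.Logit(y, X).fit(disp=0)
        print(np.exp(fit.params[1:]))   # odds ratios for female, high_demand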

  8. Assessing students' genre knowledge in an engineering writing center: An analysis of sophomore lab reports in electrical and computer engineering

    Science.gov (United States)

    Walker, Kristin Wilds Davidson

    As discipline-specific writing centers continue to increase in number, writing center consultants must determine ways to help their clients acquire discipline-specific and course-specific literacy. One way to achieve this goal is through genre analysis. This study focuses on the genre of EECE 201 (Tools and Techniques for Electrical and Computer Engineers) lab reports and strategies writing center consultants can implement to teach students communication skills necessary for discipline-specific literacy. Beginning with a discussion of the Electrical and Computer Engineering (ECE) Writing Center's history, the methodological foundations of this study, and an historical overview of genre theory from classical times to the present, this study surveys the history and debates surrounding teaching genres to students. The role of assessment in analyzing and teaching genre is discussed as well, with application specifically to the sophomore-level EECE 201 course within ECE at the University of South Carolina. The study itself consists of analyzing four students' lab reports written for the EECE 201 course. Using a list of eleven characteristics developed with experienced communicators within this discipline, I analyzed each report (there are 14 in total), determining to what extent the characteristics appeared in the reports. At the end of each student's analysis, a table summarizes the information gathered from the reports, and overall conclusions are drawn for each student. The end of the study chapter presents generic writing trends exhibited by the students during the semester, such as inability to show evidence of inductive/deductive reasoning and difficulties with conceptualizing audience and applying formatting skills. The study concludes by recommending strategies that ECE Writing Center consultants can implement to help the sophomore students acquire discipline-specific knowledge. Going beyond the ECE Writing Center's context, however, the study also suggests…

  9. A computer system to analyze showers in nuclear emulsions: Center Director's discretionary fund report

    Science.gov (United States)

    Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.

    1987-01-01

    A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions though a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.
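
    A minimal sketch of the track-prediction step (extrapolating a straight track from two measured points to the next emulsion layer) is shown below; the coordinates and layer spacing are invented for the example.

        # Hedged sketch of straight-line track extrapolation between emulsion
        # layers: given a track's (x, y) at two depths z, predict it at the next
        # layer. Coordinates and layer spacing are illustrative assumptions.
        def extrapolate(p1, p2, z_next):
            """p1, p2: (x, y, z) measurements; returns predicted (x, y, z_next)."""
            (x1, y1, z1), (x2, y2, z2) = p1, p2
            t = (z_next - z1) / (z2 - z1)
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1), z_next)

        print(extrapolate((1.0, 2.0, 0.0), (1.2, 2.5, 0.5), 1.0))  # ~(1.4, 3.0, 1.0)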

  10. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  11. A Real-Time Model-Based Human Motion Tracking and Analysis for Human-Computer Interface Systems

    Directory of Open Access Journals (Sweden)

    Chung-Lin Huang

    2004-09-01

    Full Text Available This paper introduces a real-time model-based human motion tracking and analysis method for human computer interface (HCI). This method tracks and analyzes the human motion from two orthogonal views without using any markers. The motion parameters are estimated by pattern matching between the extracted human silhouette and the human model. First, the human silhouette is extracted and then the body definition parameters (BDPs) can be obtained. Second, the body animation parameters (BAPs) are estimated by a hierarchical tritree overlapping searching algorithm. To verify the performance of our method, we demonstrate different human posture sequences and use a hidden Markov model (HMM) for posture recognition testing.
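
    As a hedged sketch of the first step described, silhouette extraction, the snippet below uses simple background subtraction and thresholding on synthetic frames; the paper's actual marker-free pipeline is more involved.

        # Hedged sketch of silhouette extraction by background subtraction, the
        # first step of the described pipeline. Frames are synthetic arrays; the
        # paper's actual segmentation is more involved.
        import numpy as np

        rng = np.random.default_rng(3)
        background = rng.normal(100, 2, size=(120, 160))   # empty-scene model
        frame = background.copy()
        frame[40:100, 60:100] += 60                        # "person" region

        silhouette = np.abs(frame - background) > 20       # threshold difference
        print("silhouette pixels:", int(silhouette.sum()))  # -> 2400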

  12. Eliciting Children's Recall of Events: How Do Computers Compare with Humans?

    Science.gov (United States)

    Powell, Martine B.; Wilson, J. Clare; Thomson, Donald M.

    2002-01-01

    Describes a study that investigated the usefulness of an interactive computer program in eliciting children's reports about an event. Compared results of interviews by computer with interviews with humans with children aged five through eight that showed little benefit in computers over face-to-face interviews. (Author/LRW)

  13. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  14. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo; Oliveira, Paulo Marcio Campos de; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila, E-mail: pridili@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Silva, Teogenes Augusto da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-01-15

    Objective: to evaluate the level of ambient radiation in a PET/CT center. Materials and methods: previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: in none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion: in the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. (author)

  15. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT imaging center

    Directory of Open Access Journals (Sweden)

    Priscila do Carmo Santana

    2015-02-01

    Full Text Available Objective: To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods: Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: In none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion: In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed.

  16. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Science.gov (United States)

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results In none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  17. ALICE Grid Computing at the GridKa Tier-1 Center

    Science.gov (United States)

    Jung, C.; Petzold, A.; Pfeiler, C.-E.; Schwarz, K.

    2012-12-01

    The GridKa center at the Karlsruhe Institute of Technology is the largest ALICE Tier-1 center. It hosts 40,000 HEP-SPEC06, approximately 2.75 PB of disk space, and 5.25 PB of tape space for 'A Large Ion Collider Experiment' (ALICE) at the CERN Large Hadron Collider (LHC). These resources are accessed via the AliEn (ALICE Environment) middleware. The storage is divided into two instances, both using the storage middleware xrootd. We will focus on the set-up of these resources and on the topic of monitoring. The latter serves a vast number of purposes, ranging from efficiency statistics for process and procedure optimization to alerts for on-call duty engineers.

  18. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins

    Science.gov (United States)

    Casanova, J.; Wang, Z.-Y.; Plenio, M. B.

    2016-09-01

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  19. Factors Affecting the Computer Usage of Physics Teachers Working at Private Training Centers

    Science.gov (United States)

    Guzel, Hatice

    2011-01-01

    The rapid development of computer and instructional technologies eases our lives in many ways. Private teaching institutions have become one of the most important entities in the educational system of Turkey. The topics taught at private teaching institutions will determine the university as well as the departments that the students are going to…

  20. Person-Centered Emotional Support and Gender Attributions in Computer-Mediated Communication

    Science.gov (United States)

    Spottswood, Erin L.; Walther, Joseph B.; Holmstrom, Amanda J.; Ellison, Nicole B.

    2013-01-01

    Without physical appearance, identification in computer-mediated communication is relatively ambiguous and may depend on verbal cues such as usernames, content, and/or style. This is important when gender-linked differences exist in the effects of messages, as in emotional support. This study examined gender attribution for online support…

  1. Modern UPS technology for uninterrupted operation of computer centers; Moderne USV-Techniken halten Rechenzentren am Laufen

    Energy Technology Data Exchange (ETDEWEB)

    Graefen, R.

    2002-03-01

    Computer centers always have the same problems: a low budget necessitates low-cost products which, on the other hand, should have high availability and a capacity for enhancement as the infrastructure grows. Uninterruptible power supply (UPS) systems offer solutions with new concepts and novel equipment. [German original, translated] Data center operators all have the same problems: a tight budget forces the purchase of low-cost products that must nevertheless be highly available and able to grow with the infrastructure. Manufacturers of uninterruptible power supplies (UPS) meet these requirements with new concepts and devices. (orig.)

  2. User-Centered Design Gymkhana

    OpenAIRE

    Garreta Domingo, Muriel; Almirall Hill, Magí; Mor Pera, Enric

    2007-01-01

    The User-centered design (UCD) Gymkhana is a tool for human-computer interaction practitioners to demonstrate through a game the key user-centered design methods and how they interrelate in the design process. The target audiences are other organizational departments unfamiliar with UCD but whose work is related to the definition, creation, and update of a product or service.

  3. The growth of the UniTree mass storage system at the NASA Center for Computational Sciences

    Science.gov (United States)

    Tarshish, Adina; Salmon, Ellen

    1993-01-01

    In October 1992, the NASA Center for Computational Sciences made its Convex-based UniTree system generally available to users. The ensuing months saw the growth of near-online data from nil to nearly three terabytes, a doubling of the number of CPU's on the facility's Cray YMP (the primary data source for UniTree), and the necessity for an aggressive regimen for repacking sparse tapes and hierarchical 'vaulting' of old files to freestanding tape. Connectivity was enhanced as well with the addition of UltraNet HiPPI. This paper describes the increasing demands placed on the storage system's performance and throughput that resulted from the significant augmentation of compute-server processor power and network speed.

  4. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    Science.gov (United States)

    Steinhaus, Peter; Strand, Marcus; Dillmann, Rüdiger

    2007-12-01

    Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.

  5. Autonomous Robot Navigation in Human-Centered Environments Based on 3D Data Fusion

    Directory of Open Access Journals (Sweden)

    Rüdiger Dillmann

    2007-01-01

    Full Text Available Efficient navigation of mobile platforms in dynamic human-centered environments is still an open research topic. We have already proposed an architecture (MEPHISTO) for a navigation system that is able to fulfill the main requirements of efficient navigation: fast and reliable sensor processing, extensive global world modeling, and distributed path planning. Our architecture uses a distributed system of sensor processing, world modeling, and path planning units. In this article, we present implemented methods in the context of data fusion algorithms for 3D world modeling and real-time path planning. We also show results of the prototypic application of the system at the museum ZKM (center for art and media) in Karlsruhe.

  6. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often defined as the art in movement. Its educational application, known as educational computer animation, is considered…

  7. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 3

    Science.gov (United States)

    2011-01-01

    [Figure caption: time-lapse computer simulation of fabric deformation and rupture; Zylon fabric test samples are ruptured.] Strains lie in the region between 2% and 10% prior to rupturing. Zylon fabric, which was used in the experiments with which the simulations were compared, ruptures at 3% strain. Zylon yarn has 350 fibrils per strand, and the simulated yarn response is obtained by summing the responses for all the fibrils

  8. Dragons, Ladybugs, and Softballs: Girls' STEM Engagement with Human-Centered Robotics

    Science.gov (United States)

    Gomoll, Andrea; Hmelo-Silver, Cindy E.; Šabanović, Selma; Francisco, Matthew

    2016-12-01

    Early experiences in science, technology, engineering, and math (STEM) are important for getting youth interested in STEM fields, particularly for girls. Here, we explore how an after-school robotics club can provide informal STEM experiences that inspire students to engage with STEM in the future. Human-centered robotics, with its emphasis on the social aspects of science and technology, may be especially important for bringing girls into the STEM pipeline. Using a problem-based approach, we designed two robotics challenges. We focus here on the more extended second challenge, in which participants were asked to imagine and build a telepresence robot that would allow others to explore their space from a distance. This research follows four girls as they engage with human-centered telepresence robotics design. We constructed case studies of these target participants to explore their different forms of engagement and phases of interest development—considering facets of behavioral, social, cognitive, and conceptual-to-consequential engagement as well as stages of interest ranging from triggered interest to well-developed individual interest. The results demonstrated that opportunities to personalize their robots and feedback from peers and facilitators were important motivators. We found both explicit and vicarious engagement and varied interest phases in our group of four focus participants. This first iteration of our project demonstrated that human-centered robotics is a promising approach to getting girls interested and engaged in STEM practices. As we design future iterations of our robotics club environment, we must consider how to harness multiple forms of leadership and engagement without marginalizing students with different working preferences.

  9. Dragons, Ladybugs, and Softballs: Girls' STEM Engagement with Human-Centered Robotics

    Science.gov (United States)

    Gomoll, Andrea; Hmelo-Silver, Cindy E.; Šabanović, Selma; Francisco, Matthew

    2016-08-01

    Early experiences in science, technology, engineering, and math (STEM) are important for getting youth interested in STEM fields, particularly for girls. Here, we explore how an after-school robotics club can provide informal STEM experiences that inspire students to engage with STEM in the future. Human-centered robotics, with its emphasis on the social aspects of science and technology, may be especially important for bringing girls into the STEM pipeline. Using a problem-based approach, we designed two robotics challenges. We focus here on the more extended second challenge, in which participants were asked to imagine and build a telepresence robot that would allow others to explore their space from a distance. This research follows four girls as they engage with human-centered telepresence robotics design. We constructed case studies of these target participants to explore their different forms of engagement and phases of interest development—considering facets of behavioral, social, cognitive, and conceptual-to-consequential engagement as well as stages of interest ranging from triggered interest to well-developed individual interest. The results demonstrated that opportunities to personalize their robots and feedback from peers and facilitators were important motivators. We found both explicit and vicarious engagement and varied interest phases in our group of four focus participants. This first iteration of our project demonstrated that human-centered robotics is a promising approach to getting girls interested and engaged in STEM practices. As we design future iterations of our robotics club environment, we must consider how to harness multiple forms of leadership and engagement without marginalizing students with different working preferences.

  10. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  11. Human-centered design of a cyber-physical system for advanced response to Ebola (CARE).

    Science.gov (United States)

    Dimitrov, Velin; Jagtap, Vinayak; Skorinko, Jeanine; Chernova, Sonia; Gennert, Michael; Padir, Taşkin

    2015-01-01

    We describe the process towards the design of a safe, reliable, and intuitive emergency treatment unit to facilitate a higher degree of safety and situational awareness for medical staff, leading to an increased level of patient care during an epidemic outbreak in an unprepared, underdeveloped, or disaster-stricken area. We start with a human-centered design process to understand the design challenge of working with Ebola treatment units in Western Africa during the latest Ebola outbreak, and present preliminary work on cyber-physical technologies that could potentially help during the next outbreak.

  12. Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center.

    Science.gov (United States)

    Casper, J; Murphy, R R

    2003-01-01

    The World Trade Center (WTC) rescue response provided an unfortunate opportunity to study the human-robot interactions (HRI) during a real unstaged rescue for the first time. A post-hoc analysis was performed on the data collected during the response, which resulted in 17 findings on the impact of the environment and conditions on the HRI: the skills displayed and needed by robots and humans, the details of the Urban Search and Rescue (USAR) task, the social informatics in the USAR domain, and what information is communicated at what time. The results of this work impact the field of robotics by providing a case study for HRI in USAR drawn from an unstaged USAR effort. Eleven recommendations are made based on the findings that impact the robotics, computer science, engineering, psychology, and rescue fields. These recommendations call for group organization and user confidence studies, more research into perceptual and assistive interfaces, and formal models of the state of the robot, state of the world, and information as to what has been observed.

  13. Establishing and evaluating bar-code technology in blood sampling system: a model based on human-centered design method.

    Science.gov (United States)

    Chou, Shin-Shang; Yan, Hsiu-Fang; Huang, Hsiu-Ya; Tseng, Kuan-Jui; Kuo, Shu-Chen

    2012-01-01

    This study used a human-centered design method to develop bar-code technology for the blood sampling process. Information gathered through multilevel analysis was used to construct the bar-code technology so as to verify patient identification, simplify the work process, and prevent medical errors. A Technology Acceptance Model questionnaire was developed to assess the effectiveness of the system, and data on patient identification and sample errors were collected daily. The average score of the 8-item users' perceived ease of use was 25.21 (3.72), of the 9-item users' perceived usefulness 28.53 (5.00), and of the 14-item task-technology fit 52.24 (7.09). The rates of patient identification errors and of samples with cancelled orders dropped to zero; however, a new error emerged after the new system was deployed, concerning the position of barcode stickers on the sample tubes. Overall, more than half of the nurses (62.5%) were willing to use the new system.

  14. Teaching scientific principles through a computer-based, design-centered learning environment

    Science.gov (United States)

    Wolfe, Michael Brian

    Research on science instruction indicates that the traditional science classroom is not always effective in improving students' scientific understanding. Physics courses, in particular, do not promote the ability to apply scientific principles for many reasons, based on their focus on procedural problem-solving and lab exercises. In this dissertation, I propose the Designing-to-Learn Architecture (DTLA), a design-centered goal-based scenario (GBS) architecture, theoretically grounded in the literature on design-centered learning environments, goal-based scenarios, intelligent tutoring systems and simulations. The DTLA offers an alternative approach to addressing the issues encountered in the traditional science classroom. The architecture consists of an artifact with associated design goals; components with component options; a simulation; a reference database; and guided tutorials. I describe the design of Goin' Up?, the prototype DTL application, serving as the basis for evaluating the effectiveness of the DTLA. I present results of interview and testing protocols from the formative evaluation of Goin' Up?, suggesting that learning outcomes, though not statistically significant, could be improved through DTLA enhancements informed by usage patterns in software sessions. I conclude with an analysis of the results and suggestions for improvements to the DTLA, including additional components to address reflection, provide support for novice designers, and offer tutorial guidance on the analysis of the artifact.

  15. Appearance-based human gesture recognition using multimodal features for human computer interaction

    Science.gov (United States)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a most important role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous work has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused in a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.
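
    As a rough illustration of the decision-level fusion strategy described in this record, the Python sketch below fuses per-class confidences from a face classifier and a hand-motion classifier by weighted summation. The 12 classes match the record's ASL gesture set, but the weights and scores are hypothetical, and the record's condensation-based classifier is not reproduced here.

      import numpy as np

      def fuse_decisions(face_scores, hand_scores, w_face=0.4, w_hand=0.6):
          # Normalize each modality's confidence vector so the weights,
          # not the raw score ranges, control the fusion.
          face = np.asarray(face_scores, dtype=float)
          hand = np.asarray(hand_scores, dtype=float)
          fused = w_face * face / face.sum() + w_hand * hand / hand.sum()
          return int(np.argmax(fused))  # index of the winning gesture class

      # Hypothetical confidences for the 12 gesture classes:
      rng = np.random.default_rng(1)
      print(fuse_decisions(rng.random(12), rng.random(12)))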

  16. Life Sciences Division and Center for Human Genome Studies. Annual report, 1991

    Energy Technology Data Exchange (ETDEWEB)

    Spitzmiller, D.; Bradbury, M.; Cram, S. [comps.]

    1992-05-01

    This report summarizes the research and development activities of Los Alamos National Laboratory's Life Sciences Division and the biological aspects of the Center for Human Genome Studies for the calendar year 1991. Selected research highlights include: yeast artificial chromosome libraries from flow-sorted human chromosomes 16 and 21; distances between the antigen binding sites of three murine antibody subclasses measured using neutron and x-ray scattering; NFCR 10th anniversary highlights; kinase-mediated differences found in the cell cycle regulation of normal and transformed cells; and detecting mutations that cause Gaucher's disease by denaturing gradient gel electrophoresis. Project descriptions include: genomic structure and regulation, molecular structure, cytometry, cell growth and differentiation, radiation biology and carcinogenesis, and pulmonary biology.

  17. A Human-Centered Smart Home System with Wearable-Sensor Behavior Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Jianting; Liu, Ting; Shen, Chao; Wu, Hongyu; Liu, Wenyi; Su, Man; Chen, Siyun; Jia, Zhanpei

    2016-11-17

    Smart homes have recently attracted much research interest owing to their potential for improving the quality of human life. Obtaining the user's demand is the most important and challenging task in optimal appliance scheduling for a smart home, since demand is highly related to the user's unpredictable behavior. In this paper, a human-centered smart home system is proposed to identify user behavior, predict user demand, and schedule the household appliances. First, sensor data from the user's wearable devices are monitored to profile the user's full-day behavior. Then, an appliance-demand matrix, extracted from the history of appliance load data and user behavior, is constructed to predict the user's demand on the home environment. Two simulations are designed to demonstrate user behavior identification, appliance-demand matrix construction, and generation of an optimal appliance scheduling strategy.
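
    The appliance-demand matrix itself is not specified in the record; a minimal sketch of one plausible construction is a co-occurrence table of logged behaviors and appliance activations, normalized row-wise into conditional probabilities. The behavior and appliance labels below are invented for illustration.

      import numpy as np

      # Hypothetical label sets; the paper's actual categories are not listed.
      BEHAVIORS = ["sleeping", "cooking", "watching_tv", "exercising"]
      APPLIANCES = ["air_conditioner", "oven", "television", "water_heater"]

      def build_demand_matrix(history):
          """history: (behavior, appliance) pairs observed together in the
          behavior and appliance-load logs. Entry [i, j] estimates
          P(appliance j is demanded | behavior i)."""
          counts = np.zeros((len(BEHAVIORS), len(APPLIANCES)))
          for behavior, appliance in history:
              counts[BEHAVIORS.index(behavior), APPLIANCES.index(appliance)] += 1
          row_sums = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, row_sums,
                           out=np.zeros_like(counts), where=row_sums > 0)

      demand = build_demand_matrix([("cooking", "oven"), ("cooking", "oven"),
                                    ("cooking", "television"),
                                    ("sleeping", "air_conditioner")])
      print(demand)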

  18. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  19. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  20. Science, humanism, judgement, ethics: person-centered medicine as an emergent model of modern clinical practice.

    Science.gov (United States)

    Miles, Andrew

    2013-01-01

    The Medical University of Plovdiv (MUP) has as its motto 'Committed to humanity'. But what does humanity in modern medicine mean? Is it possible to practise a form of medicine that is without humanity? In the current article, it is argued that modern medicine is increasingly being practised in a de-personalised fashion, where the patient is understood not as a unique human individual, a person, but rather as a subject or an object and more in the manner of a complex biological machine. Medicine has, it is contended, become distracted from its duty to care, comfort and console as well as to ameliorate, attenuate and cure; the rapid development of medicine's scientific knowledge is, paradoxically, principally causative. Signal occurrences in the 'patient as a person' movement are reviewed, together with the emergence of the evidence-based medicine (EBM) and patient-centered care (PCC) movements. The characteristics of a model of medicine evolving in response to medicine's current deficiencies--person-centered healthcare (PCH)--are noted and described. In seeking to apply science with humanism, via clinical judgement, within an ethical framework, it is contended that PCH will prove to be far more responsive to the needs of the individual patient and his/her personal circumstances than current models of practice, so that neither a reductive anatomico-pathological, disease-centric model of illness (EBM), nor an aggressive patient-directed, consumerist form of care (PCC) is allowed continued dominance within modern healthcare systems. In conclusion, it is argued that PCH will enable affordable advances in biomedicine and technology to be delivered to patients within a humanistic framework of clinical practice that recognises the patient as a person and which takes full account of his/her stories, values, preferences, goals, aspirations, fears, worries, hopes, cultural context and which responds to his/her psychological, emotional, spiritual and social necessities.

  1. Computer Data Analysis for Meteorology - Project-Centered Skill Development for the Early Undergraduate Career

    Science.gov (United States)

    Ellis, T. D.

    2014-12-01

    Too often in geoscience education are the computer skills necessary for success in the workforce put off until the last years of undergraduate education. This is especially true in meteorology, a form of geophysical fluid dynamics many people encounter on a daily basis. Meteorologists often need specialized computer skills, including the use of scripting languages to automate the handling of large bundles of data, manipulating four-dimensional arrays (three spatial dimensions and one time dimension), visualizing such datasets simply and effectively for publication, and performing statistical analysis of those datasets. Such topics are often addressed only at the senior undergraduate level or in graduate school. At SUNY Oneonta, we are piloting a course that teaches these skills to third-semester students, with the intent of building confidence in these skills throughout students' careers and of building a tool-box of skills that can be used in upper-division courses and undergraduate research. This poster will present the methods used in building this course, the kinds of activities designed, the desired student learning outcomes, our assessment of those outcomes, and new initiatives undertaken since the completion of the NSF-funded portion of the project in 2012.
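
    As a small taste of the array-handling skills the course targets, the sketch below manipulates a hypothetical four-dimensional meteorological field (time, level, latitude, longitude) with numpy; the shape and the random data are placeholders, not course materials.

      import numpy as np

      # Hypothetical field: 24 hourly steps, 10 levels, 1-degree global grid.
      temperature = np.random.randn(24, 10, 181, 360)  # (time, level, lat, lon)

      daily_mean = temperature.mean(axis=0)       # average over the day
      column_mean = temperature.mean(axis=1)      # average over levels
      point_series = temperature[:, 0, 90, 180]   # lowest-level series at one point

      print(daily_mean.shape, column_mean.shape, point_series.shape)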

  2. Computation of Electromagnetic Fields Scattered From Dielectric Objects of Uncertain Shapes Using MLMC

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    Simulators capable of computing scattered fields from objects of uncertain shapes are highly useful in electromagnetics and photonics, where device designs are typically subject to fabrication tolerances. Knowledge of statistical variations in scattered fields is useful in ensuring error-free functioning of devices. Oftentimes such simulators use a Monte Carlo (MC) scheme to sample the random domain, where the variables parameterize the uncertainties in the geometry. At each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver is executed to compute the scattered fields. However, to obtain accurate statistics of the scattered fields, the number of MC samples has to be large. This significantly increases the total execution time. In this work, to address this challenge, the Multilevel MC (MLMC) scheme is used together with a (deterministic) surface integral equation solver. The MLMC achieves a higher efficiency by “balancing” the statistical errors due to sampling of the random domain and the numerical errors due to discretization of the geometry at each of these samples. Error balancing results in a smaller number of samples requiring coarser discretizations. Consequently, total execution time is significantly shortened.
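
    A minimal sketch of the telescoping MLMC estimator follows, with a toy stand-in for the surface integral equation solver; the level-dependent bias, the sample counts, and the lack of coupled fine/coarse samples are all simplifications of a real MLMC implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_level(level, n):
          # Toy solver at mesh level `level`: the discretization bias
          # 0.5 ** (level + 1) shrinks as the geometry mesh is refined.
          return (1.0 + 0.5 ** (level + 1)) + 0.1 * rng.standard_normal(n)

      def mlmc_estimate(n_samples):
          """E[P_L] ~ E[P_0] + sum over l of E[P_l - P_(l-1)]. In a real run,
          P_l and P_(l-1) must share the same random geometry realizations;
          this toy omits that coupling."""
          estimate = sample_level(0, n_samples[0]).mean()
          for level in range(1, len(n_samples)):
              n = n_samples[level]
              estimate += (sample_level(level, n) - sample_level(level - 1, n)).mean()
          return estimate

      print(mlmc_estimate([4096, 1024, 256]))  # many coarse samples, few fine ones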

  3. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; de Oliveira, Acary Souza Bulle; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; de Abreu, Luiz Carlos

    2017-01-01

    Introduction: Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to the social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, they might help initiate contact with modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. Methods and Analysis: This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done at 3 moments: moment 0 (T0) – at baseline; moment 1 (T1) – after 8 typical computer classes; and moment 2 (T2) – after 8 computer classes which include 15 minutes for practicing games in a VR environment. A characterization questionnaire, the short version of the Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated with loneliness perception. Ethics and Dissemination: The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences.

  4. Operational characteristics optimization of human-computer system

    OpenAIRE

    Zulquernain Mallick; Irfan Anjum Badruddin magami; Khaleed Hussain Tandur

    2010-01-01

    Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. A text reading task, in the form of English text, was presented on the computer screen to the participating subjects, and their performance, measured as the number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is u...
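
    For optimization of a larger-the-better response such as NWRPM, the Taguchi method scores each parameter setting by a signal-to-noise ratio; the sketch below implements the standard larger-the-better formula with made-up readings, since the record's data are not given.

      import numpy as np

      def sn_larger_is_better(values):
          # Taguchi S/N for a larger-the-better response:
          # S/N = -10 * log10(mean(1 / y_i^2))
          y = np.asarray(values, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y ** 2))

      # Hypothetical NWRPM readings for one font/colour/angle/distance setting:
      print(sn_larger_is_better([182, 175, 190]))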

  5. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health

    Science.gov (United States)

    Tromberg, Bruce J.; Anderson, R. Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W.; Parrish, John A.; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  6. Simulation of Human Episodic Memory by Using a Computational Model of the Hippocampus

    Directory of Open Access Journals (Sweden)

    Naoyuki Sato

    2010-01-01

    Full Text Available The episodic memory, the memory of personal events and history, is essential for understanding the mechanism of human intelligence. Neuroscience evidence has shown that the hippocampus, a part of the limbic system, plays an important role in the encoding and retrieval of episodic memory. This paper reviews computational models of the hippocampus and introduces our own computational model of human episodic memory based on neural synchronization. Results from computer simulations demonstrate that our model provides an advantage for instantaneous memory formation and selective retrieval enabling memory search. Moreover, this model was found to have the ability to predict human memory recall by integrating human eye movement data during encoding. The combined approach of computational models and experiments is effective for theorizing about human episodic memory.

  7. Human hip joint center analysis for biomechanical design of a hip joint exoskeleton

    Institute of Scientific and Technical Information of China (English)

    Wei YANG; Can-jun YANG; Ting XU

    2016-01-01

    We propose a new method for the customized design of hip exoskeletons based on the optimization of the human-machine physical interface to improve user comfort. The approach is based on mechanisms designed to follow the natural trajectories of the human hip as the flexion angle varies during motion. The motions of the hip joint center with variation of the flexion angle were measured and the resulting trajectory was modeled. An exoskeleton mechanism capable of following the hip center's movement was designed to cover the full motion ranges of flexion and abduction angles, and was adopted in a lower extremity assistive exoskeleton. The resulting design can reduce human-machine interaction forces by 24.1% and 76.0% during hip flexion and abduction, respectively, leading to a more ergonomic and comfortable-to-wear exoskeleton system. The human-exoskeleton model was analyzed to further validate the decrease of the hip joint internal force during hip joint flexion or abduction by applying the resulting design.

  8. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  9. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Energy Technology Data Exchange (ETDEWEB)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables.

  10. Comparison of human face matching behavior and computational image similarity measure

    Institute of Scientific and Technical Information of China (English)

    CHEN WenFeng; LIU ChangHong; LANDER Karen; FU XiaoLan

    2009-01-01

    Computational similarity measures have been evaluated in a variety of ways, but few of the validated computational measures are based on a high-level, cognitive criterion of objective similarity. In this paper, we evaluate two popular objective similarity measures by comparing them with face matching performance in human observers. The results suggest that these measures are still limited in predicting human behavior, especially rejection behavior, but an objective measure taking advantage of global and local face characteristics may improve the prediction. It is also suggested that humans may set different criteria for "hit" and "rejection", and this may provide implications for biologically-inspired computational systems.
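
    The record does not name its two similarity measures, but the idea of separate "hit" and "rejection" criteria can be sketched with any pairwise score; below, cosine similarity over feature vectors is used with two hypothetical thresholds.

      import numpy as np

      def cosine_similarity(a, b):
          a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
          return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

      def match_decision(probe, gallery, t_hit=0.90, t_reject=0.75):
          # Asymmetric criteria: a high bar for "hit", a lower one for
          # "rejection"; the thresholds here are illustrative only.
          s = cosine_similarity(probe, gallery)
          if s >= t_hit:
              return "hit"
          if s <= t_reject:
              return "rejection"
          return "undecided"

      rng = np.random.default_rng(2)
      print(match_decision(rng.random(128), rng.random(128)))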

  11. Computer vision for real-time orbital operations. Center directors discretionary fund

    Science.gov (United States)

    Vinz, F. L.; Brewster, L. L.; Thomas, L. D.

    1984-01-01

    Machine vision research is examined as it relates to the NASA Space Station program and its associated Orbital Maneuvering Vehicle (OMV). Initial operation of the OMV for orbital assembly, docking, and servicing is manually controlled from the ground by means of an on-board TV camera. These orbital operations may be accomplished autonomously by machine vision techniques which use the TV camera as a sensing device. Classical machine vision techniques are described. An alternate method which employs a syntactic pattern recognition scheme is developed and described. It has the potential for substantial reduction of computing and data storage requirements in comparison to Two-Dimensional Fast Fourier Transform (2D FFT) image analysis. The method embodies powerful heuristic pattern recognition capability by identifying image shapes such as elongation, symmetry, number of appendages, and the relative length of appendages.

  12. Digital image analysis of ossification centers in the axial dens and body in the human fetus.

    Science.gov (United States)

    Baumgart, Mariusz; Wiśniewski, Marcin; Grzonkowska, Magdalena; Małkowski, Bogdan; Badura, Mateusz; Dąbrowska, Maria; Szpinda, Michał

    2016-12-01

    The detailed understanding of the anatomy and timing of ossification centers is indispensable both for determining fetal stage and maturity and for detecting congenital disorders. This study was performed to quantitatively examine the odontoid and body ossification centers in the axis with respect to their linear, planar and volumetric parameters. Using the methods of CT, digital image analysis and statistics, the size of the odontoid and body ossification centers in the axis was studied in 55 spontaneously aborted human fetuses aged 17-30 weeks. With no sex difference, the best-fit growth dynamics for the odontoid and body ossification centers of the axis were, respectively, as follows: for transverse diameter y = -10.752 + 4.276 × ln(age) ± 0.335 and y = -10.578 + 4.265 × ln(age) ± 0.338, for sagittal diameter y = -4.329 + 2.010 × ln(age) ± 0.182 and y = -3.934 + 1.930 × ln(age) ± 0.182, for cross-sectional area y = -7.102 + 0.520 × age ± 0.724 and y = -7.002 + 0.521 × age ± 0.726, and for volume y = -37.021 + 14.014 × ln(age) ± 1.091 and y = -37.425 + 14.197 × ln(age) ± 1.109. With no sex differences, the odontoid and body ossification centers of the axis grow logarithmically in transverse and sagittal diameters and in volume, while proportionately in cross-sectional area. Our age-specific reference data for the odontoid and body ossification centers of the axis may be relevant for determining fetal stage and maturity and for detecting segmentation anomalies of the axis in utero by three-dimensional sonography.
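
    The reported growth models are directly usable as reference curves; for instance, the sketch below evaluates the odontoid transverse-diameter model y = -10.752 + 4.276 × ln(age) ± 0.335, assuming age in gestational weeks and diameters in millimetres (units the record does not restate).

      import math

      def odontoid_transverse_diameter(age_weeks):
          # Mean and residual SD from the reported model, valid for the
          # studied range of 17-30 weeks.
          mean = -10.752 + 4.276 * math.log(age_weeks)
          return mean, 0.335

      mean, sd = odontoid_transverse_diameter(24)
      print(f"{mean:.2f} ± {sd:.2f} mm at 24 weeks")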

  13. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Science.gov (United States)

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.

  14. Aiding human reliance decision making using computational models of trust

    NARCIS (Netherlands)

    Maanen, P.P. van; Klos, T.; Dongen, C.J. van

    2007-01-01

    This paper involves a human-agent system in which there is an operator charged with a pattern recognition task, using an automated decision aid. The objective is to make this human-agent system operate as effectively as possible. Effectiveness is gained by an increase of appropriate reliance on the

  15. SOCaaS: Security Operations Center as a Service for Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Fahad F. Alruwaili

    2014-06-01

    Full Text Available The management of information security operations is a complex task, especially in a cloud environment. The cloud service layers and multi-tenancy architecture create a complex environment in which to develop and manage an information security incident management and compliance program. This paper presents a novel security operations center (SOC) framework as a service for cloud service providers and customers. The goal is to protect cloud services against new and existing attacks as well as to comply with security policies and regulatory requirements. The SOCaaS design is based on multi-governance and defense-in-depth models and fits within multi-tenant cloud services. A SOCaaS provider is a trusted entity that collects event and system logs from cloud systems to ensure proactive incident management and compliance with regulations. The proposed approach provides better managed services for customers wanting to outsource their information security operations to attain reliable, transparent, and efficient security and privacy.

  16. Adapting the human-computer interface for reading literacy and computer skill to facilitate collection of information directly from patients.

    Science.gov (United States)

    Lobach, David F; Arbanas, Jennifer M; Mishra, Dharani D; Campbell, Marci; Wildemuth, Barbara M

    2004-01-01

    Clinical information collected directly from patients is critical to the practice of medicine. Past efforts to collect this information using computers have had limited utility because these efforts required users to be facile with the computerized information collecting system. In this paper we describe the design, development, and function of a computer system that uses recent technology to overcome the limitations of previous computer-based data collection tools by adapting the human-computer interface to the native language, reading literacy, and computer skills of the user. Specifically, our system uses a numerical representation of question content, multimedia, and touch screen technology to adapt the computer interface to the native language, reading literacy, and computer literacy of the user. In addition, the system supports health literacy needs throughout the data collection session and provides contextually relevant disease-specific education to users based on their responses to the questions. The system has been successfully used in an academically affiliated family medicine clinic and in an indigent adult medicine clinic.

  17. Analysis on Construction and Operation of Cloud Computing Data Center

    Institute of Scientific and Technical Information of China (English)

    曹鲁

    2012-01-01

    Through a detailed analysis of the construction costs, market and business development, and overall management of today's cloud computing data centers, a construction and operation model for cloud computing data centers is established. Combined with the current state of cloud computing development at home and abroad, suggestions and recommendations are given to enterprises, governments and telecom operators on the construction and operation of cloud computing data centers.

  18. Human-centered risk management for medical devices - new methods and tools.

    Science.gov (United States)

    Janß, Armin; Plogmann, Simon; Radermacher, Klaus

    2016-04-01

    Studies of adverse events with technical devices in the medical context have shown that in most cases poorly usable interfaces are the cause of use deficiencies and therefore a potential harm to the patient and third parties. This is partially due to the lack of suitable methods for interlinking usability engineering and human-centered risk management. Especially regarding the early identification of human-induced errors and the systematic control of these failures, medical device manufacturers and in particular the developers have to be supported in order to guarantee reliable design and error-tolerant human-machine interfaces (HMI). In this context, we developed the HiFEM methodology and a corresponding software tool (mAIXuse) for model-based human risk analysis. Based on a two-fold approach, HiFEM provides a task-type-sensitive modeling structure with integrated temporal relations in order to represent and analyze the use process in a detailed way. The approach can be used from early developmental stages up to the validation process. Results of a comparative study of the HiFEM method and a classical process failure mode and effects analysis (FMEA) show that the new modeling and analysis technique clearly outperforms the FMEA. In addition, we implemented a new method for systematic human risk control (mAIXcontrol). Accessing information from the method's knowledge base enables the operator to detect the most suitable countermeasures for a respective risk. Forty-one approved generic countermeasure principles have been indexed in a matrix as combinations of root causes and failures. The methodology has been tested in comparison to a conventional approach as well. Evaluation of the matrix and the reassessment of the risk priority numbers by a blind expert demonstrate a substantial benefit of the new mAIXcontrol method.
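
    The risk priority numbers reassessed in the study follow, in classical FMEA practice, from three ordinal ratings; a minimal sketch of that convention is below (the 1-10 scales are the textbook FMEA convention, not details given for HiFEM or mAIXcontrol).

      def risk_priority_number(severity, occurrence, detection):
          # Classical FMEA: RPN = S * O * D, each factor rated on a 1-10 scale.
          for rating in (severity, occurrence, detection):
              if not 1 <= rating <= 10:
                  raise ValueError("ratings are expected on a 1-10 scale")
          return severity * occurrence * detection

      print(risk_priority_number(8, 3, 4))  # -> 96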

  19. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

    [Abstract garbled in source; recoverable fragments mention intelligent support systems, users familiar with problem domains but inexperienced with computers, and creating better HCI software.]

  20. [Attempt at computer modeling of evolution of human society].

    Science.gov (United States)

    Levchenko, V F; Menshutkin, V V

    2009-01-01

    A model of the evolution of human society and the biosphere, based on V. I. Vernadskii's concept of the noosphere and L. N. Gumilev's concept of ethnogenesis, is developed and studied. The mathematical apparatus of the model is a composition of finite stochastic automata. Using this model, the possibility of a global ecological crisis is demonstrated in the case that the current tendencies in the interaction of the biosphere and human civilization are preserved.
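
    The record gives no details of the automata themselves; a minimal sketch of a single finite stochastic automaton (states, input symbols, and row-stochastic transition matrices, all invented here) is shown below, leaving out the composition of several such automata.

      import numpy as np

      rng = np.random.default_rng(0)

      STATES = ["growth", "crisis"]
      TRANSITIONS = {  # one row-stochastic matrix per input symbol
          "exploit": np.array([[0.7, 0.3],
                               [0.2, 0.8]]),
          "conserve": np.array([[0.9, 0.1],
                                [0.5, 0.5]]),
      }

      def step(state_idx, symbol):
          # Move to the next state with the probabilities of the current row.
          return rng.choice(len(STATES), p=TRANSITIONS[symbol][state_idx])

      state = 0
      for symbol in ["exploit", "exploit", "conserve"]:
          state = step(state, symbol)
      print(STATES[state])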

  1. Investigating Students’ Achievements in Computing Science Using Human Metric

    Directory of Open Access Journals (Sweden)

    Ezekiel U. Okike

    2014-05-01

    Full Text Available This study investigates the role of personality traits, motivation for career choice and study habits in students' academic achievements in the computing sciences. A quantitative research method was employed. Data were collected from 60 computing science students using the Myers-Briggs Type Indicator (MBTI) with additional questionnaires. A model of the form y_j = β_0 + β_1 x_1j + β_2 x_2j + β_3 x_3j + β_4 x_4j + … + β_n x_nj was used, where y_j represents the dependent variable and x_1j, …, x_nj the independent variables. Data analysis was performed using the Statistical Package for the Social Sciences (SPSS). Linear regression was done in order to fit the model and to judge its significance or non-significance at the 0.05 level of significance. The results of the regression model were also used to determine the impact of the independent variables on students' performance. Results from this study suggest that the strongest motivator for a choice of career in the computing sciences is the desire to become a computing professional. Students' achievements, especially in the computing sciences, depend not only on students' temperamental ability or personality traits, motivation for the choice of course of study and reading habits, but also on the use of Internet-based sources more than on going to the university library to read the book materials available in all areas.
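
    A model of this form is fitted by ordinary least squares; the sketch below does so with numpy on synthetic data standing in for the 60 students' scores (the predictors and coefficients are invented, not the study's).

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical predictors per student: personality, motivation,
      # study habits, Internet-source use.
      X = rng.random((60, 4))
      y = 50 + X @ np.array([5.0, 8.0, 3.0, 6.0]) + rng.standard_normal(60)

      # Fit y_j = b0 + b1*x_1j + ... + b4*x_4j by ordinary least squares.
      A = np.column_stack([np.ones(len(X)), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      print("intercept:", round(coef[0], 2), "slopes:", np.round(coef[1:], 2))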

  2. Multidetector computed tomography features of pancreatic metastases from leiomyosarcoma: Experience at a tertiary cancer center

    Institute of Scientific and Technical Information of China (English)

    Chong Hyun Suh; Abhishek Keraliya; Atul B Shinagare; Kyung Won Kim; Nikhil H Ramaiya; Sree Harsha Tirumani

    2016-01-01

    AIM: To describe the multidetector computed tomography features of pancreatic metastasis from leiomyosarcoma (LMS). METHODS: Between January 1995 and December 2012, 13 consecutive patients (11 women, 2 men; mean age of 57 years; range, 38-78 years) with pancreatic metastases from LMS were included in our study. Imaging features including location, number, largest dimension, tumor attenuation and enhancement characteristics, presence of necrosis, pancreatic ductal dilatation, common bile duct (CBD) dilatation, presence of pancreatitis, and atrophy were documented. RESULTS: The most common site of origin of the pancreatic metastases from LMS was the uterus (38.5%), followed by the retroperitoneum (30.8%) and the extremities (23.1%). None of the patients in our study had the pancreas as the first site of metastasis. All patients developed pancreatic metastases at a median interval of 24 mo. Pancreatic metastases from LMS were solitary in 8/13 patients and multiple in 5/13 patients, had no predilection for any part of the pancreas, were hypovascular on the arterial phase in 10/13 patients, and were associated with pancreatic duct dilatation in 3/13 patients. None had CBD dilatation. None of the pancreatic metastases in the LMS cohort caused pancreatitis or atrophy. The median duration of follow-up was 19 mo for the LMS cohort, during which two patients underwent resection of metastasis (median survival 45 mo) while the remaining underwent systemic therapy (median survival 13 mo). CONCLUSION: Pancreatic metastases from LMS are often solitary and hypovascular masses and are less commonly associated with pancreatic ductal dilatation, CBD dilatation, pancreatitis or pancreatic atrophy. Surgical resection of solitary LMS pancreatic metastasis can be considered due to the long survival of these patients.

  3. Single-center study comparing computed tomography colonography with conventional colonoscopy

    Institute of Scientific and Technical Information of China (English)

    Ian C Roberts-Thomson; Graeme R Tucker; Peter J Hewett; Peter Cheung; Ruben A Sebben; EE Win Khoo; Julie D Marker; Wayne K Clapton

    2008-01-01

    AIM: To compare the results of computed tomography (CT) colonography with conventional colonoscopy in symptomatic patients referred for colonoscopy. METHODS: The study included 227 adult outpatients, mean age 60 years, with appropriate indications for colonoscopy. CT colonography and colonoscopy were performed on the same day in a metropolitan teaching hospital. Colonoscopists were initially blinded to the results of CT colonography but there was segmental unblinding during the procedure. The primary outcome measures were the sensitivity and specificity of CT colonography for the identification of polyps seen at colonoscopy (i.e. analysis by polyp). Secondary outcome measures included an analysis by patient, extracolonic findings at CT colonography, adverse events with both procedures, and patient acceptance and preference. RESULTS: Twenty-five patients (11%) were excluded from the analysis because of incomplete colonoscopy or poor bowel preparation that affected either CT colonography, colonoscopy or both procedures. Polyps and masses (usually cancers) were detected at colonoscopy and CT colonography in 35% and 42% of patients, respectively. Of nine patients with a final diagnosis of cancer, eight (89%) were identified by CT colonography as masses (5) or polyps (3). For polyps analyzed according to polyp, the overall sensitivity of CT colonography was 50% (95% CI, 39%-61%) but this increased to 71% (95% CI, 52%-85%) for polyps ≥ 6 mm in size. Similarly, specificity for all polyps was 48% (95% CI, 39%-58%), increasing to 67% (95% CI, 56%-76%) for polyps ≥ 6 mm. Adverse events were uncommon but included one colonic perforation at colonoscopy. Patient acceptance was high for both procedures but preference favoured CT colonography. CONCLUSION: Although CT colonography was more sensitive in this study than in some previous studies, the procedure is not yet sensitive enough for widespread application in symptomatic patients.

  4. EVALUATION OF PROPTOSIS BY USING COMPUTED TOMOGRAPHY IN A TERTIARY CARE CENTER, BURLA, SAMBALPUR, ODISHA

    Directory of Open Access Journals (Sweden)

    Vikas Agrawal

    2017-07-01

    Full Text Available BACKGROUND Proptosis is defined as the abnormal anterior protrusion of the globe beyond the orbital margins. It is an important clinical manifestation of various orbital as well as systemic disorders, with aetiologies ranging from infection to malignant tumours, among which space-occupying lesions within the orbits are the most important. MATERIALS AND METHODS A total of 32 patients referred from various departments, mainly ophthalmology and medicine, with history and clinical features suggestive of proptosis were evaluated in our department; after proper history taking and clinical examination, computed tomography (CT) scanning was done. RESULTS The age of the patients ranged from 1-55 years. Associated chief complaints in cases of proptosis were, in decreasing order, pain/headache, restricted eye movement, diminished vision and diplopia. Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). Trauma, vascular lesions and congenital conditions were infrequent causes of proptosis. In children, the common causes of proptosis were retinoblastoma (35.71%) and orbital cellulitis (28.57%), and in adults the common causes were thyroid ophthalmopathy (22.22%), trauma (16.66%) and pseudo-tumour (16.66%). CONCLUSION Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). CT scanning should be the chief investigation in the evaluation of lesions causing proptosis; it is the most useful in detecting, characterising and determining the extent of the disease process. The overall accuracy of CT scanning in the diagnosis of proptosis is 96.87%.

  5. Discussion on the Function and Design of Super Computer Center

    Institute of Scientific and Technical Information of China (English)

    焦建欣

    2013-01-01

    A super computer center is a particular type of data center. Taking the National Supercomputing Center in Shenzhen as an example, this paper discusses the function and related design of such supercomputing centers.

  6. Secure Human-Computer Identification against Peeping Attacks (SecHCI): A Survey

    OpenAIRE

    Li, SJ; Shum, HY

    2003-01-01

    This paper focuses on human-computer identification systems against peeping attacks, in which adversaries can observe (and even control) interactions between humans (provers) and computers (verifiers). Real cases of peeping attacks were reported by Ross J. Anderson ten years earlier. Fixed passwords are insecure against peeping attacks since adversaries can simply replay the observed passwords. Some identification techniques can be used to defeat peeping attacks, but auxiliary devices must be used ...

  7. Design of Food Management Information System Based on Human-computer Interaction

    Directory of Open Access Journals (Sweden)

    Xingkai Cui

    2015-07-01

    Full Text Available The food safety problem is directly related to public health. This study takes the necessity of establishing a food management information system as its point of departure; through an interpretation of the overview of human-computer interaction technology, as well as the conceptual framework of human-computer interaction, it discusses the construction of a food management information system, with the aim of advancing China's food safety management process so as to safeguard public health.

  8. The human-computer interaction design of self-operated mobile telemedicine devices

    OpenAIRE

    Zheng, Shaoqing

    2015-01-01

    Human-computer interaction (HCI) is an important issue in the area of medicine, for example, the operation of surgical simulators, virtual rehabilitation systems, telemedicine treatments, and so on. In this thesis, the human-computer interaction of a self-operated mobile telemedicine device is designed. The mobile telemedicine device (i.e. intelligent Medication Box or iMedBox) is used for remotely monitoring patient health and activity information such as ECG (electrocardiogram) signals, hom...

  9. Developing Educational Computer Animation Based on Human Personality Types

    Directory of Open Access Journals (Sweden)

    Sajid Musa

    2015-03-01

    Full Text Available Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered one of the most elegant ways of preparing materials for teaching, and its importance in assisting learners to process, understand and remember information efficiently has grown vastly since the advent of powerful graphics-oriented computers. Based on theories and findings from psychology, colour science, computer animation, geometric modelling and technical aesthetics, this study intends to establish an inter-disciplinary area of research towards greater educational effectiveness. With today's high educational demands as well as the lack of time provided for certain courses, classical educational methods have shown deficiencies in keeping up with the drastic changes observed in the digital era. Generally speaking, without taking into account significant factors such as gender, age, level of interest and memory level, educational animations may turn out to be insufficient for learners or fail to meet their needs. However, we have noticed that the applications of animation for education have been given only inadequate attention, and students' personality types of temperament (sanguine, choleric, melancholic, phlegmatic, etc.) have never been taken into account. We suggest there is an interesting relationship here, and propose essential factors in creating educational animations based on students' personality types. In particular, we study how information in computer animation may be presented in a more preferable way based on font types and their families, colours and colour schemes, emphasis of text, and shapes of characters designed by planar quadratic Bernstein-Bézier curves

  10. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  11. The design of an intelligent human-computer interface for the test, control and monitor system

    Science.gov (United States)

    Shoaff, William D.

    1988-01-01

    The graphical intelligence and assistance capabilities of a human-computer interface for the Test, Control, and Monitor System at Kennedy Space Center are explored. The report focuses on how a particular commercial off-the-shelf graphical software package, Data Views, can be used to produce tools that build widgets such as menus, text panels, graphs, icons, windows, and ultimately complete interfaces for monitoring data from an application; controlling an application by providing input data to it; and testing an application by both monitoring and controlling it. A complete set of tools for building interfaces is described in a manual for the TCMS toolkit. Simple tools create primitive widgets such as lines, rectangles and text strings. Intermediate level tools create pictographs from primitive widgets, and connect processes to either text strings or pictographs. Other tools create input objects; Data Views supports output objects directly, thus output objects are not considered. Finally, a set of utilities for executing, monitoring use, editing, and displaying the content of interfaces is included in the toolkit.

  12. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today’s technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users’ cognitive, emotional, and behavioral responses. An experiment was conducted in which participants carried out a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users’ perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users’ self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  13. Computer models of the human immunoglobulins: shape and segmental flexibility.

    Science.gov (United States)

    Pumphrey, R

    1986-06-01

    At present there is interest in the design and deployment of engineered biosensor molecules. Antibodies are the most versatile of the naturally occurring biosensors, and it is important to understand their mechanical properties and the ways in which they can interact with their natural ligands. Two-dimensional representations are clearly inadequate, and three-dimensional representations are too complicated to manipulate except as numerical abstractions in computers. Recent improvements in computer graphics allow these coordinate matrices to be seen and more easily comprehended, and interactive programs permit the modification and reassembly of molecular fragments. The models which result have distinct advantages both over those of lower resolution and over those showing every atom, which are limited to the few fragments (2-5) or mutant molecules for which the X-ray crystallographic coordinates are known. In this review Richard Pumphrey describes the shape and flexibility of immunoglobulin molecules in relation to their three-dimensional structure. Copyright © 1986. Published by Elsevier B.V.

  14. Parallel computing-based sclera recognition for human identification

    Science.gov (United States)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

    Compared to iris recognition, sclera recognition using a line descriptor can achieve comparable recognition accuracy in visible wavelengths. However, this method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce the sclera recognition time. We define a new descriptor in which the information of the KD-tree structure and the sclera edge is added. The registration and matching task is divided into subtasks of various sizes according to their computational complexities. Affine transform parameters are generated by searching the KD tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can dramatically improve the sclera matching speed by hundreds of times without decreasing accuracy.
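
    The KD-tree search described above can be illustrated on the CPU with a standard spatial index. This is a sketch only, assuming simple 2D edge-point descriptors; it reproduces neither the paper's line descriptor nor its GPU kernels (scipy's cKDTree stands in for the tree search):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    # Hypothetical 2D descriptor points for a template and a probe sclera pattern.
    rng = np.random.default_rng(0)
    template = rng.random((500, 2))
    probe = template + rng.normal(scale=0.01, size=(500, 2))  # noisy re-observation

    tree = cKDTree(template)            # index the template once
    dist, idx = tree.query(probe, k=1)  # nearest template point per probe point
    print(f"mean matching distance: {dist.mean():.4f}")  # lower = better match
    ```

    The per-probe queries are independent of one another, which is the kind of work that parallelizes naturally across GPU threads.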

  15. Social effects of an anthropomorphic help agent: humans versus computers.

    Science.gov (United States)

    David, Prabu; Lu, Tingting; Kline, Susan; Cai, Li

    2007-06-01

    The purpose of this study was to examine perceptions of fairness of a computer-administered quiz as a function of the anthropomorphic features of the help agent offered within the quiz environment. The addition of simple anthropomorphic cues to a computer help agent reduced the perceived friendliness of the agent, perceived intelligence of the agent, and the perceived fairness of the quiz. These differences were observed only for male anthropomorphic cues, but not for female anthropomorphic cues. The results were not explained by the social attraction of the anthropomorphic agents used in the quiz or by gender identification with the agents. Priming of visual cues provides the best account of the data. Practical implications of the study are discussed.

  16. Human cardiac systems electrophysiology and arrhythmogenesis: iteration of experiment and computation.

    Science.gov (United States)

    Holzem, Katherine M; Madden, Eli J; Efimov, Igor R

    2014-11-01

    Human cardiac electrophysiology (EP) is a unique system for computational modelling at multiple scales. Due to the complexity of the cardiac excitation sequence, coordinated activity must occur from the single channel to the entire myocardial syncytium. Thus, sophisticated computational algorithms have been developed to investigate cardiac EP at the level of ion channels, cardiomyocytes, multicellular tissues, and the whole heart. Although understanding of each functional level will ultimately be important to thoroughly understand mechanisms of physiology and disease, cardiac arrhythmias are expressly the product of cardiac tissue, which contains enough cardiomyocytes to sustain a reentrant loop of activation. In addition, several properties of cardiac cellular EP that are critical for arrhythmogenesis are significantly altered by cell-to-cell coupling. However, relevant human cardiac EP data, upon which to develop or validate models at all scales, have been lacking. Thus, over several years, we have developed a paradigm for multiscale human heart physiology investigation and have recovered and studied over 300 human hearts. We have generated a rich experimental dataset, from which we better understand mechanisms of arrhythmia in humans and can improve models of human cardiac EP. In addition, in collaboration with computational physiologists, we are developing a database for the deposition of human heart experimental data, including thorough experimental documentation. We anticipate that accessibility to this human heart dataset will further human EP computational investigations, as well as encourage greater data transparency within the field of cardiac EP.

  17. A human-centered framework for innovation in conservation incentive programs.

    Science.gov (United States)

    Sorice, Michael G; Donlan, C Josh

    2015-12-01

    The promise of environmental conservation incentive programs that provide direct payments in exchange for conservation outcomes is that they enhance the value of engaging in stewardship behaviors. An insidious but important concern is that a narrow focus on optimizing payment levels can ultimately suppress program participation and subvert participants' internal motivation to engage in long-term conservation behaviors. Increasing participation and engendering stewardship can be achieved by recognizing that participation is not simply a function of the payment; it is a function of the overall structure and administration of the program. Key to creating innovative and more sustainable programs is fitting them within the existing needs and values of target participants. By focusing on empathy for participants, co-designing program approaches, and learning from the rapid prototyping of program concepts, a human-centered approach to conservation incentive program design enhances the propensity for discovery of novel and innovative solutions to pressing conservation issues.

  18. Building communication strategy on health prevention through the human-centered design

    Directory of Open Access Journals (Sweden)

    Karine de Mello Freire

    2016-03-01

    Full Text Available A latent need has been identified for developing efficient communication strategies for the prevention of diseases, and for design as a potential agent to create communication artifacts that are able to promote self-care. In order to analyze a design process that develops this kind of artifact, action research was carried out at the IAPI Health Center in Porto Alegre. The action's goal was to design a strategy to promote self-care to prevent cervical cancer. The process was conducted using the human-centered design (HCD) approach, which seeks to create solutions desirable for people and feasible for organizations through three main phases: (a) Hear, in which inspirations originate from stories collected from people; (b) Create, which aims to translate this knowledge into prototypes; and (c) Deliver, where the prototypes are tested and developed with users. Communication strategies were supported by design studies of visual-verbal rhetoric. As a result, this design approach proved adequate for creating communication strategies targeted at self-care behaviors, aiming to empower users to change their behavior.

  19. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  20. The Human Dimension of Computer-Mediated Communications: Implications for International Educational Computer Conferences.

    Science.gov (United States)

    Scott, Douglass J.

    This article presents a conceptual framework for the research and practice of educational computer conferences that shifts the focus from the on-line messages being exchanged to the participants' engagement with the conference. This framework, known as the "Iceberg Metaphor" or the "Michigan Model of educational…

  1. Preface (to: Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction)

    NARCIS (Netherlands)

    Tan, Desney; Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    The advances in cognitive neuroscience and brain imaging technologies provide us with the increasing ability to interface directly with activity in the brain. Researchers have begun to use these technologies to build brain-computer interfaces. Originally, these interfaces were meant to allow

  2. Data Bases and Other Computer Tools in the Humanities.

    Science.gov (United States)

    Collegiate Microcomputer, 1990

    1990-01-01

    Describes 38 database projects sponsored by the National Endowment for the Humanities (NEH). Information on hardware, software, and access and dissemination is given for projects in the areas of art and architectural history; folklore; history; medicinal plants; interdisciplinary topics; language and linguistics; literature; and music and music…

  3. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  4. Computational biology in human aging : an omics data integration approach

    NARCIS (Netherlands)

    Akker, Erik Ben van den

    2015-01-01

    Throughout this thesis, human aging and its relation to health are studied in the context of two parallel though complementary lines of research: biomarkers and genetics. The search for informative biomarkers of aging focuses on easy accessible and quantifiable substances of the body that can be u

  5. Associating Human-Centered Concepts with Social Networks Using Fuzzy Sets

    Science.gov (United States)

    Yager, Ronald R.

    The rapidly growing global interconnectivity, brought about to a large extent by the Internet, has dramatically increased the importance and diversity of social networks. Modern social networks cut across a spectrum from benign recreation-focused websites such as Facebook, to occupationally oriented websites such as LinkedIn, to criminally focused groups such as drug cartels, to devastation- and terror-focused groups such as Al-Qaeda. Many organizations are interested in analyzing and extracting information related to these social networks. Among these are governmental police and security agencies as well as marketing and sales organizations. To aid these organizations there is a need for technologies to model social networks and intelligently extract information from these models. While established technologies exist for the modeling of relational networks [1-7], few technologies exist to extract information from them in a form compatible with human perception and understanding. Databases are an example of a technology in which we have tools for representing our information as well as tools for querying and extracting the information contained. Our goal is in some sense analogous. We want to use the relational network model to represent information, in this case about relationships and interconnections, and then be able to query the social network using intelligent human-centered concepts. To extend our capabilities to interact with social relational networks we need to associate human concepts and ideas with these networks. Since human beings predominantly reason and understand in linguistic terms, we need to build bridges between human conceptualization and the formal mathematical representation of the social network. Consider for example a concept such as "leader". An analyst may be able to express, in linguistic terms, using a network-relevant vocabulary, the properties of a leader. Our task is to translate this linguistic description into a mathematical formalism
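
    As a minimal illustration of bridging a linguistic concept to the network formalism, a fuzzy membership function can grade how strongly a node satisfies "leader". The sketch below ties membership to a node's degree with hypothetical thresholds; Yager's record gives no specific numbers, so these are assumptions:

    ```python
    def leader_membership(degree, low=10, high=25):
        """Piecewise-linear fuzzy membership of the linguistic concept 'leader',
        graded by a node's degree (number of ties) in the social network."""
        if degree <= low:
            return 0.0
        if degree >= high:
            return 1.0
        return (degree - low) / (high - low)

    for d in (5, 15, 30):
        print(d, leader_membership(d))  # 0.0, 0.333..., 1.0
    ```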

  6. Recent Advances in Computational Mechanics of the Human Knee Joint

    Directory of Open Access Journals (Sweden)

    M. Kazemi

    2013-01-01

    Full Text Available Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  7. Recent advances in computational mechanics of the human knee joint.

    Science.gov (United States)

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  8. Individual Difference Effects in Human-Computer Interaction

    Science.gov (United States)

    1991-10-01

    evaluated in terms of the amount of sales revenue after deducting production costs. The time variable was measured in terms of the amount of time a subject...subject acted as an inventory/production manager of a hypothetical firm which was simulated by a computer program. The subject's task was to obtain the..."search list" will be examined. Thus, the user will probably match "apple pie" but not "apple cider" or "apple butter" because these items would not

  9. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  11. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex micr

  12. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research between medical/psychological evidence on human abilities and the need in informatics to update current models in computer science to support alternative methods for computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we continue with the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle with respect to present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) model as a basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence and Sheldrake's theory of "Nature as Alive" [22].

  13. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Full Text Available Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. A text reading task, in the form of English text, was presented on the computer screen to the participating subjects and their performance, measured in terms of the number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters to maximize operators' efficiency in performing the readability task. Two levels of each parameter have been considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed to investigate the operators' performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40-degree viewing angle and a 60-cm viewing distance, the subjects were most comfortable and efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% towards the stated objective, followed by font type at 18.17%, viewing distance at 7.04% and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
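
    For readers unfamiliar with the Taguchi analysis mentioned above: for a maximize-the-response task such as NWRPM, the larger-is-better signal-to-noise ratio is S/N = -10 log10((1/n) Σ 1/y_i²), and the parameter level with the higher S/N is preferred. A small sketch with invented NWRPM readings (the record does not reproduce the paper's raw data):

    ```python
    import math

    def sn_larger_is_better(values):
        """Taguchi larger-is-better S/N ratio: -10 * log10(mean of 1/y^2)."""
        return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

    # Invented NWRPM readings for two levels of the text/background colour factor.
    black_on_white = [210, 205, 215]
    white_on_black = [150, 160, 155]
    print(sn_larger_is_better(black_on_white))  # ~46.4 dB -> preferred level
    print(sn_larger_is_better(white_on_black))  # ~43.8 dB
    ```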

  14. Measuring Human Performance within Computer Security Incident Response Teams

    Energy Technology Data Exchange (ETDEWEB)

    McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

  15. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
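
    The INL model itself is not reproduced in this record; as a rough sketch of the kind of discrete-event simulation described, the following uses the generic SimPy library with invented alarm arrival and handling rates:

    ```python
    import random
    import simpy  # generic discrete-event simulation library, not the INL tool

    def alarm_source(env, queue, rate):
        """Raise alarms at random (Poisson) intervals."""
        while True:
            yield env.timeout(random.expovariate(rate))
            yield queue.put(env.now)

    def operator(env, queue, mean_handle_time):
        """Single operator serving the alarm queue in arrival order."""
        while True:
            raised_at = yield queue.get()
            yield env.timeout(random.expovariate(1.0 / mean_handle_time))
            print(f"t={env.now:6.1f}s handled alarm raised at t={raised_at:6.1f}s")

    env = simpy.Environment()
    queue = simpy.Store(env)
    env.process(alarm_source(env, queue, rate=0.1))          # ~1 alarm per 10 s
    env.process(operator(env, queue, mean_handle_time=6.0))  # ~6 s per alarm
    env.run(until=120)  # simulate two minutes and inspect the queueing delays
    ```

    Sweeping the arrival rate upward exposes when the single operator's workload saturates, which is the sort of question a generic alarm handling model is used to answer.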

  16. Behind Human Error: Cognitive Systems, Computers and Hindsight

    Science.gov (United States)

    1994-12-01

    squeeze became on the powers of the operator.... And as Norbert Wiener noted some years later (1964, p. 63): The gadget-minded people often have the...for one exception see Woods and Elias, 1988). This failure to develop representations that reveal change and highlight events in the monitored...Woods, D. D., and Elias, G. (1988). Significance messages: An integral display concept. In Proceedings of the 32nd Annual Meeting of the Human

  17. Collection of Information Directly from Patients through an Adaptive Human-computer Interface

    Science.gov (United States)

    Lobach, David F.; Arbanas, Jennifer M.; Mishra, Dharani D.; Wildemuth, Barbara; Campbell, Marci

    2002-01-01

    Clinical information collected directly from patients is critical to the practice of medicine. Past efforts to collect this information using computers have had limited utility because these efforts required users to be facile with the information collecting system. This poster describes the development and function of a computer system that uses technology to overcome the limitations of previous computer-based data collection tools by adapting the human-computer interface to fit the skills of the user. The system has been successfully used at two diverse clinical sites.

  18. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

    CERN Document Server

    Tan, Desney S

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

  19. Computational fluid dynamics assessment: Volume 1, Computer simulations of the METC (Morgantown Energy Technology Center) entrained-flow gasifier: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Celik, I.; Chattree, M.

    1988-07-01

    An assessment of the theoretical and numerical aspects of the computer code, PCGC-2, is made; and the results of the application of this code to the Morgantown Energy Technology Center (METC) advanced gasification facility entrained-flow reactor, "the gasifier," are presented. PCGC-2 is a code suitable for simulating pulverized coal combustion or gasification under axisymmetric (two-dimensional) flow conditions. The governing equations for the gas and particulate phase have been reviewed. The numerical procedure and the related programming difficulties have been elucidated. A single-particle model similar to the one used in PCGC-2 has been developed, programmed, and applied to some simple situations in order to gain insight into the physics of coal particle heat-up, devolatilization, and char oxidation processes. PCGC-2 was applied to the METC entrained-flow gasifier to study numerically the flash pyrolysis of coal, and gasification of coal with steam or carbon dioxide. The results from the simulations are compared with measurements. The gas and particle residence times, particle temperature, and mass component history were also calculated and the results were analyzed. The results provide useful information for understanding the fundamentals of coal gasification and for the assessment of experimental results obtained using the reactor considered. 69 refs., 35 figs., 23 tabs.

  1. Digging into data using new collaborative infrastructures supporting humanities-based computer science research

    OpenAIRE

    2011-01-01

    This paper explores infrastructure supporting humanities-computer science research in large-scale image data by asking: Why is collaboration a requirement for work within digital humanities projects? What is required for fruitful interdisciplinary collaboration? What are the technical and intellectual approaches to constructing such an infrastructure? What are the challenges associated with digital humanities collaborative work? We reveal that digital humanities collaboration requ...

  2. Computational analysis of expression of human embryonic stem cell-associated signatures in tumors

    OpenAIRE

    Wang, Xiaosheng

    2011-01-01

    Background The cancer stem cell model has been proposed based on the linkage between human embryonic stem cells and human cancer cells. However, the evidence supporting the cancer stem cell model remains to be collected. In this study, we extensively examined the expression of human embryonic stem cell-associated signatures, including core genes, transcription factors, pathways and microRNAs, in various cancers using a computational biology approach. Results We used the class comparison analy...

  3. Computational analysis of expression of human embryonic stem cell-associated signatures in tumors

    OpenAIRE

    Wang Xiaosheng

    2011-01-01

    Abstract Background The cancer stem cell model has been proposed based on the linkage between human embryonic stem cells and human cancer cells. However, the evidence supporting the cancer stem cell model remains to be collected. In this study, we extensively examined the expression of human embryonic stem cell-associated signatures, including core genes, transcription factors, pathways and microRNAs, in various cancers using a computational biology approach. Results We used the class compari...

  4. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

  5. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Anton

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science f

  6. Evolution of Neural Computations: Mantis Shrimp and Human Color Decoding

    Directory of Open Access Journals (Sweden)

    Qasim Zaidi

    2014-10-01

    Full Text Available Mantis shrimp and primates both possess good color vision, but the neural implementation in the two species is very different, a reflection of the largely unrelated evolutionary lineages of these creatures. Mantis shrimp have scanning compound eyes with 12 classes of photoreceptors, and have evolved a system to decode color information at the front-end of the sensory stream. Primates have image-focusing eyes with three classes of cones, and decode color further along the visual-processing hierarchy. Despite these differences, we report a fascinating parallel between the computational strategies at the color-decoding stage in the brains of stomatopods and primates. Both species appear to use narrowly tuned cells that support interval decoding for color identification.

  7. Evolution of neural computations: Mantis shrimp and human color decoding.

    Science.gov (United States)

    Zaidi, Qasim; Marshall, Justin; Thoen, Hanne; Conway, Bevil R

    2014-01-01

    Mantis shrimp and primates both possess good color vision, but the neural implementation in the two species is very different, a reflection of the largely unrelated evolutionary lineages of these creatures. Mantis shrimp have scanning compound eyes with 12 classes of photoreceptors, and have evolved a system to decode color information at the front-end of the sensory stream. Primates have image-focusing eyes with three classes of cones, and decode color further along the visual-processing hierarchy. Despite these differences, we report a fascinating parallel between the computational strategies at the color-decoding stage in the brains of stomatopods and primates. Both species appear to use narrowly tuned cells that support interval decoding for color identification.
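
    A toy version of the interval-decoding idea described above: with many narrowly tuned receptor classes, a color can be identified simply by which band responds most strongly, with no downstream population comparison. The tuning centers and width below are illustrative assumptions, not measured values:

    ```python
    import numpy as np

    centers = np.linspace(420, 680, 12)  # 12 narrow receptor classes (nm), assumed

    def band_responses(stimulus_nm, width=15.0):
        """Gaussian tuning curve response of each narrow receptor class."""
        return np.exp(-0.5 * ((stimulus_nm - centers) / width) ** 2)

    # Interval decoding: the identified color is simply the best-responding band.
    stimulus = 545.0
    print(centers[np.argmax(band_responses(stimulus))])  # nearest band, ~538 nm
    ```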

  8. A Study of Electromyogram Based on Human-Computer Interface

    Institute of Scientific and Technical Information of China (English)

    Jun-Ru Ren; Tie-Jun Liu; Yu Huang; De-Zhong Yao

    2009-01-01

    In this paper, a new control system based on forearm electromyogram (EMG) signals is proposed for computer peripheral control and artificial prosthesis control. This control system intends to realize the commands of six pre-defined hand poses: up, down, left, right, yes, and no. In order to research the possibility of using a unified amplifier for both electroencephalogram (EEG) and EMG, the surface forearm EMG data are acquired by a 4-channel EEG measurement system. A Bayesian classifier is used to classify the power spectral density (PSD) of the signal. The experimental results verify that this control system can supply a high command recognition rate (average 48%) even when the EMG data are collected with an EEG system using only single-electrode measurement.
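
    The processing chain described (PSD features fed to a Bayesian classifier) can be sketched with standard tools. The sampling rate, trial layout and data below are invented, and a Gaussian naive Bayes classifier stands in for the paper's Bayesian classifier:

    ```python
    import numpy as np
    from scipy.signal import welch
    from sklearn.naive_bayes import GaussianNB

    FS = 256  # assumed sampling rate (Hz); not specified in this record

    def psd_features(trials):
        """Welch power spectral density features, one row per EMG trial."""
        return np.array([welch(t, fs=FS, nperseg=128)[1] for t in trials])

    # Invented data: 60 one-second EMG trials, 10 per hand-pose class.
    rng = np.random.default_rng(1)
    trials = rng.normal(size=(60, FS))
    labels = np.repeat(np.arange(6), 10)  # up, down, left, right, yes, no

    clf = GaussianNB().fit(psd_features(trials), labels)
    print(clf.score(psd_features(trials), labels))  # training accuracy on toy data
    ```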

  9. New Human-Computer Interface Concepts for Mission Operations

    Science.gov (United States)

    Fox, Jeffrey A.; Hoxie, Mary Sue; Gillen, Dave; Parkinson, Christopher; Breed, Julie; Nickens, Stephanie; Baitinger, Mick

    2000-01-01

    The current climate of budget cuts has forced the space mission operations community to reconsider how it does business. Gone are the days of building one-of-a-kind control centers with teams of controllers working in shifts 24 hours per day, 7 days per week. Increasingly, automation is used to significantly reduce staffing needs. In some cases, missions are moving towards lights-out operations where the ground system is run semi-autonomously. On-call operators are brought in only to resolve anomalies. Some operations concepts also call for smaller operations teams to manage an entire family of spacecraft. In the not too distant future, a skeleton crew of full-time general-knowledge operators will oversee the operations of large constellations of small spacecraft, while geographically distributed specialists will be assigned to emergency response teams based on their expertise. As the operations paradigms change, so too must the tools that support the mission operations team's tasks. Tools need to be built not only to automate routine tasks, but also to communicate varying types of information to the part-time, generalist, or on-call operators and specialists more effectively. Thus, the proper design of a system's user-system interface (USI) becomes even more important than before. Also, because the users will be accessing these systems from various locations (e.g., control center, home, on the road) via different devices with varying display capabilities (e.g., workstations, home PCs, PDAs, pagers) over connections with various bandwidths (e.g., dial-up 56k, wireless 9.6k), the same software must have different USIs to support the different types of users, their equipment, and their environments. In other words, the software must now adapt to the needs of the users! This paper will focus on the needs and the challenges of designing USIs for mission operations. After providing a general discussion of these challenges, the paper will focus on the current efforts of

  10. Human Computation in Visualization: Using Purpose Driven Games for Robust Evaluation of Visualization Algorithms.

    Science.gov (United States)

    Ahmed, N; Zheng, Ziyi; Mueller, K

    2012-12-01

    Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate evaluation platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial to common sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems and also outline a framework by which humans can be easily seduced to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example for how the evaluation of visualization algorithms can be mapped into a fun and addicting activity, allowing this task to be accomplished in an extensive yet cost effective way. Finally, we sketch out a framework that transcends from the pure evaluation of existing visualization methods to the design of a new one.

  11. Impact of Cognitive Architectures on Human-Computer Interaction

    Science.gov (United States)

    2014-09-01

    simulation. In this work they were preparing for the Synthetic Theatre of War-1997 exercise where between 10,000 and 50,000 automated agents would...work with up to 1,000 humans.27 The results of this exercise are documented by Laird et al.28 5. Conclusions and Future Work To assess whether cognitive...RW, MacKenzie IS. Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal of

  12. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit of tactile feedback in GECO glove use for data entry. This first-ever investigation of the employment of a pressurized EVA glove for human-computer interfacing opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  13. Metaphors for the Nature of Human-Computer Interaction in an Empowering Environment: Interaction Style Influences the Manner of Human Accomplishment.

    Science.gov (United States)

    Weller, Herman G.; Hartson, H. Rex

    1992-01-01

    Describes human-computer interface needs for empowering environments in computer usage in which the machine handles the routine mechanics of problem solving while the user concentrates on its higher order meanings. A closed-loop model of interaction is described, interface as illusion is discussed, and metaphors for human-computer interaction are…

  14. Compliant bipedal model with the center of pressure excursion associated with oscillatory behavior of the center of mass reproduces the human gait dynamics.

    Science.gov (United States)

    Jung, Chang Keun; Park, Sukyung

    2014-01-03

    Although the compliant bipedal model could reproduce the qualitative ground reaction force (GRF) of human walking, the model with a fixed pivot overestimated stance leg rotation and the ratio of horizontal to vertical GRF. Human walking data show a continuous forward progression of the center of pressure (CoP) during the stance phase and the suspension of the CoP near the forefoot before the onset of the step transition. To better describe human gait dynamics with a minimal expense of model complexity, we propose a compliant bipedal model with an accelerated pivot, which associates the CoP excursion with the oscillatory behavior of the center of mass (CoM) using the existing simulation parameters and leg stiffness. Owing to the pivot acceleration, defined to emulate the human CoP profile, the arrival of the CoP at the limit of the stance foot over the single stance duration initiates the step-to-step transition. The proposed model shows an improved match with walking data. As the forward motion of the CoM during single stance is partly accounted for by forward pivot translation, the previously overestimated rotation of the stance leg is reduced and the corresponding horizontal GRF becomes closer to human data. The walking solutions of the model range over higher speeds (~1.7 m/s) than those of the fixed-pivot compliant bipedal model (~1.5 m/s) and exhibit other gait parameters, such as touchdown angle, step length and step frequency, comparable to the experimental observations. The good match between the model and the experimental GRF data implies that a continuous pivot acceleration associated with CoM oscillatory behavior could serve as a useful framework for bipedal models.
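
    A minimal sketch of the spring-mass stance dynamics that such compliant bipedal models build on, written here with a generic prescribed pivot trajectory p(t); the paper's specific CoP-acceleration law tied to the CoM oscillation is not reproduced, so the symbols below are assumptions:

    ```latex
    % SLIP-type stance dynamics with a moving pivot (centre of pressure).
    % m: body mass, k: leg stiffness, L0: leg rest length, g: gravity,
    % p(t): prescribed pivot (CoP) trajectory -- accelerated in the paper's model.
    \begin{align}
      \mathbf{r}(t) &= \mathbf{r}_{\mathrm{CoM}}(t) - \mathbf{p}(t), \\
      m\,\ddot{\mathbf{r}}_{\mathrm{CoM}} &=
        k\bigl(L_0 - \lVert\mathbf{r}\rVert\bigr)\,
        \frac{\mathbf{r}}{\lVert\mathbf{r}\rVert} - m g\,\hat{\mathbf{e}}_y .
    \end{align}
    ```

    With a fixed pivot, p(t) is constant and the model reduces to the standard compliant walker; the paper's contribution is the prescribed forward motion of p(t).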

  15. Cross-sectional study of the neural ossification centers of vertebrae C1-S5 in the human fetus.

    Science.gov (United States)

    Szpinda, Michał; Baumgart, Mariusz; Szpinda, Anna; Woźniak, Alina; Mila-Kierzenkowska, Celestyna

    2013-10-01

    An understanding of the normal evolution of the spine is of great relevance in the prenatal detection of spinal abnormalities. This study was carried out to estimate the length, width, cross-sectional area and volume of the neural ossification centers of vertebrae C1-S5 in the human fetus. Using the methods of CT (Biograph mCT), digital-image analysis (Osirix 3.9) and statistics (the one-way ANOVA test for paired data, the Kolmogorov-Smirnov test, Levene's test, Student's t test, the one-way ANOVA test for unpaired data with post hoc RIR Tukey comparisons) the size for the neural ossification centers throughout the spine in 55 spontaneously aborted human fetuses (27 males, 28 females) at ages of 17-30 weeks was studied. The neural ossification centers were visualized in the whole pre-sacral spine, in 74.5 % for S1, in 61.8 % for S2, in 52.7 % for S3, and in 12.7 % for S4. Neither male-female nor right-left significant differences in the size of neural ossification centers were found. The neural ossification centers were the longest within the cervical spine. The maximum values referred to the axis on the right, and to C5 vertebra on the left. There was a gradual decrease in length for the neural ossification centers of T1-S4 vertebrae. The neural ossification centers were the widest within the proximal thoracic spine and narrowed bi-directionally. The growth dynamics for CSA of neural ossification centers were found to parallel that of volume. The largest CSAs and volumes of neural ossification centers were found in the C3 vertebra, and decreased in the distal direction. The neural ossification centers show neither male-female nor right-left differences. The neural ossification centers are characterized by the maximum length for C2-C6 vertebrae, the maximum width for the proximal thoracic spine, and both the maximum cross-sectional area and volume for C3 vertebra. There is a sharp decrease in size of the neural ossification centers along the sacral spine. A

  16. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

  17. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

    Directory of Open Access Journals (Sweden)

    Antonios Liapis

    2016-07-01

    This article discusses the impact of artificially intelligent computers on the processes of design, play and educational activities. A computational process with the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus that triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrain the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimum. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.
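
    The computational-creativity half of this argument — human initiative shrinking the algorithm's search space — can be illustrated with a toy constrained search. The sketch below is generic and hypothetical; it is not one of the four tools the article reviews, and the design variables and scoring function are invented.

```python
# Toy mixed-initiative search: the human pins some design variables, and
# the algorithm exhaustively explores only the remaining subspace.
import itertools

def co_creative_search(domains, human_choices, score):
    """Search the variables the human left open; keep the best candidate."""
    free = {k: v for k, v in domains.items() if k not in human_choices}
    best, best_score = None, float("-inf")
    for combo in itertools.product(*free.values()):
        candidate = {**human_choices, **dict(zip(free.keys(), combo))}
        if score(candidate) > best_score:
            best, best_score = candidate, score(candidate)
    return best

domains = {"rooms": range(2, 9), "corridors": range(1, 5), "symmetry": (0, 1)}
human = {"symmetry": 1}  # the designer insists on a symmetric level
print(co_creative_search(domains, human, lambda c: c["rooms"] - c["corridors"]))
```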

  18. Computation of particle detachment from floors due to human walking

    Science.gov (United States)

    Elhadidi, Basman; Khalifa, Ezzat

    2005-11-01

    A computational model for the detachment of fine particles due to the unsteady flow under a foot is developed. As the foot approaches the floor, fluid volume is displaced laterally as a wall jet from the perimeter of the contact area at high velocity and acceleration. Unsteady aerodynamic forces on particles attached to the floor are considered. Results show that the jet velocity is ~40 m/s for a foot idealized as a 15 cm circular disk approaching the floor at 1 m/s with a final gap of 0.8 mm. This velocity is sufficient to detach small particles (~1 μm). The flow accelerates at ~400 m/s², which affects the detachment of larger particles (~100 μm). As the disk is brought to rest, the unsteady jet expands outwards, advecting a vortex ring closely attached to it. At the disk edge, a counter-rotating vortex is generated by the sudden deceleration of the disk. Both vortices can play a role in the entrainment of the suspended particles in the flowfield. Numerical studies also show that the maximum jet velocity is ~20 m/s for a simplified foot immediately after heel contact in the stance phase of the gait.
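
    The reported ~40 m/s jet speed is consistent with a one-line mass-conservation estimate: the volume swept out per second by the descending disk must escape radially through the perimeter gap. The check below uses only the numbers quoted in the abstract.

```python
# Continuity check: pi * R^2 * V (displaced flux) = 2 * pi * R * h * u
# (flux through the perimeter gap), so u = R * V / (2 * h).

R = 0.15 / 2   # disk radius, m (15 cm disk)
V = 1.0        # descent speed, m/s
h = 0.8e-3     # final floor gap, m

u_jet = R * V / (2 * h)
print(f"perimeter jet speed ~ {u_jet:.0f} m/s")  # ~47 m/s, same order as ~40 m/s
```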

  19. Computer-assisted learning in human and dental medicine.

    Science.gov (United States)

    Höhne, S; Schumann, R R

    2004-04-01

    This article describes the development and application of new didactic methods for use in computer-assisted teaching and learning systems for training doctors and dentists. Taking the Meducase project as an example, didactic models and their technological implementation are explained, together with the limitations of imparting knowledge with the "new media". In addition, legal concepts for a progressive, pragmatic, and innovative distribution of knowledge to undergraduate students are presented. Finally, the potential of and visions for the wide use of electronic learning in German and European universities are discussed. Self-directed learning (SDL) is a key component of both undergraduate education and lifelong learning for medical practitioners, and e-learning can already be used to promote SDL at the undergraduate level. The Meducase project uses self-directed, constructive, case- and problem-oriented learning within a learning platform for medical and dental students. In the long run, e-learning programs can only succeed in education if value-added factors are consistently analyzed and implemented and if media-didactic concepts matched to electronic learning are developed and used. The use of innovative forms of licensing - open-source licenses for software and similar licenses for content - facilitates continuous, free access to these programs for all students and teachers. These legal concepts offer the possibility of innovative knowledge distribution, quality assurance and standardization across specializations, university departments, and possibly even national borders.

  20. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
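
    The Hill three-element formulation named above has a compact mathematical core: an activation-scaled contractile element shaped by force-length and force-velocity curves, plus a parallel elastic element. The sketch below uses generic curve shapes and constants for illustration; the paper's actual parameters and functional forms are not reproduced here.

```python
# Simplified Hill-type muscle force (normalised; illustrative constants).
import math

def hill_force(activation, l_norm, v_norm, f_max=1.0):
    """activation in [0, 1]; l_norm = length / optimal length;
    v_norm = shortening velocity / max shortening velocity."""
    # Contractile element: active force-length curve (Gaussian about optimum)
    fl = math.exp(-((l_norm - 1.0) ** 2) / 0.45)
    # Hyperbolic force-velocity drop-off for shortening
    fv = (1.0 - v_norm) / (1.0 + v_norm / 0.25)
    # Parallel elastic element engages only beyond the optimal length
    fpe = math.exp(5.0 * (l_norm - 1.0)) - 1.0 if l_norm > 1.0 else 0.0
    return f_max * (activation * fl * fv + fpe)

print(hill_force(activation=0.8, l_norm=1.05, v_norm=0.1))  # ~0.8 (normalised)
```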

  1. Interactive 3D computer model of the human corneolimbal region

    DEFF Research Database (Denmark)

    Molvaer, Rikke Kongshaug; Andreasen, Arne; Heegaard, Steffen;

    2013-01-01

    PURPOSE: This study aims to clarify the existence of and to map the localization of different proposed stem cell niches in the corneal limbal region. MATERIALS AND METHODS: One human eye was cut into 2200 consecutive sections. Every other section was stained with haematoxylin and eosin, digitized ... in the limbal region: limbal epithelial crypts (LECs), limbal crypts (LCs) and focal stromal projections (FSPs). In all, eight LECs, 25 LCs and 105 FSPs were identified in the limbal region. The LECs, LCs and FSPs were predominantly located in the superior limbal region, with seven LECs, 19 LCs and 93 FSPs in the superior limbal region and one LEC, six LCs and 12 FSPs in the inferior limbal region. Only few LECs, LCs and FSPs were localized nasally and temporally. CONCLUSION: Interactive 3D models are a powerful tool that may help to shed more light on the existence and spatial localization of the different stem ...
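
    The reconstruction behind such a model amounts to stacking the aligned, digitized sections into a voxel volume whose z-spacing equals the cutting interval. A minimal sketch under stated assumptions (the file pattern and section spacing are placeholders, and alignment is assumed to be already done):

```python
# Stack aligned serial-section images into a 3-D volume (sketch only).
import numpy as np
from PIL import Image

SECTION_SPACING_UM = 8.0  # assumed z-spacing: every other section digitized

def stack_sections(paths):
    """Load equally sized, pre-aligned section images and stack along z."""
    slices = [np.asarray(Image.open(p).convert("L")) for p in paths]
    return np.stack(slices, axis=0)  # shape: (n_sections, height, width)

paths = [f"section_{i:04d}.png" for i in range(1100)]  # 2200 cut, every other digitized
# volume = stack_sections(paths)
# A structure on section i then lies at depth i * SECTION_SPACING_UM micrometres.
```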

  2. Situated dialog in speech-based human-computer interaction

    CERN Document Server

    Raux, Antoine; Lane, Ian; Misu, Teruhisa

    2016-01-01

    This book provides a survey of the state-of-the-art in the practical implementation of Spoken Dialog Systems for applications in everyday settings. It includes contributions on key topics in situated dialog interaction from a number of leading researchers and offers a broad spectrum of perspectives on research and development in the area. In particular, it presents applications in robotics, knowledge access and communication and covers the following topics: dialog for interacting with robots; language understanding and generation; dialog architectures and modeling; core technologies; and the analysis of human discourse and interaction. The contributions are adapted and expanded contributions from the 2014 International Workshop on Spoken Dialog Systems (IWSDS 2014), where researchers and developers from industry and academia alike met to discuss and compare their implementation experiences, analyses and empirical findings.

  3. When a Talking-Face Computer Agent Is Half-Human and Half-Humanoid: Human Identity and Consistency Preference

    Science.gov (United States)

    Gong, Li; Nass, Clifford

    2007-01-01

    Computer-generated anthropomorphic characters are a growing type of communicator that is deployed in digital communication environments. An essential theoretical question is how people identify humanlike but clearly artificial, hence humanoid, entities in comparison to natural human ones. This identity categorization inquiry was approached under…

  4. Computational model of sustained acceleration effects on human cognitive performance.

    Science.gov (United States)

    McKinley, Richard A; Gallimore, Jennie J

    2013-08-01

    Extreme acceleration maneuvers encountered in modern agile fighter aircraft can wreak havoc on human physiology, thereby significantly influencing cognitive task performance. As oxygen content declines under acceleration stress, the activity of higher-order cortical tissue is reduced to ensure that sufficient metabolic resources are available for critical life-sustaining autonomic functions. Consequently, cognitive abilities reliant on these affected areas suffer significant performance degradations. The goal was to develop and validate a model capable of predicting human cognitive performance under acceleration stress. Development began with creation of a proportional-control cardiovascular model that produced predictions of several hemodynamic parameters, including eye-level blood pressure and regional cerebral oxygen saturation (rSO2). An algorithm was derived to relate changes in rSO2 within specific brain structures to performance on cognitive tasks that require engagement of different brain areas. Data from the "precision timing" experiment were then used to validate the model predicting cognitive performance as a function of the +Gz profile. The following results are reported as value ranges. Results showed high agreement between the measured and predicted values for the rSO2 model (correlation coefficient: 0.7483-0.8687; linear best-fit slope: 0.5760-0.9484; mean percent error: 0.75-3.33) and the cognitive performance models (motion inference task: correlation coefficient 0.7103-0.9451, linear best-fit slope 0.7416-0.9144, mean percent error 6.35-38.21; precision timing task: correlation coefficient 0.6856-0.9726, linear best-fit slope 0.5795-1.027, mean percent error 6.30-17.28). The evidence suggests that the model is capable of accurately predicting cognitive performance on simple tasks under high acceleration stress.
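
    One quantity such a cardiovascular model predicts, eye-level blood pressure, is governed by simple hydrostatics: the arterial column between heart and eye weighs more under +Gz. The illustration below uses textbook constants and an assumed heart-to-eye distance, not the paper's model parameters.

```python
# Hydrostatic estimate of eye-level arterial pressure under +Gz.
RHO_BLOOD = 1060.0    # kg/m^3, blood density
G = 9.81              # m/s^2
HEART_TO_EYE = 0.30   # m, assumed vertical heart-to-eye distance
PA_PER_MMHG = 133.322

def eye_level_pressure(map_mmhg, gz):
    """Heart-level mean arterial pressure minus the hydrostatic column,
    which scales linearly with the +Gz load."""
    drop = RHO_BLOOD * G * gz * HEART_TO_EYE / PA_PER_MMHG
    return map_mmhg - drop

for gz in (1, 4, 7):
    print(gz, round(eye_level_pressure(100.0, gz), 1))
# Uncompensated eye-level pressure nears zero around +4 to +5 Gz, which is
# why vision and cognition degrade under sustained acceleration.
```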

  5. Computational analysis of splicing errors and mutations in human transcripts

    Directory of Open Access Journals (Sweden)

    Gelfand Mikhail S

    2008-01-01

    Background: Most retained introns found in human cDNAs generated by high-throughput sequencing projects seem to result from underspliced transcripts, and thus they capture intermediate steps of pre-mRNA splicing. On the other hand, mutations in splice sites cause skipping of the respective exon or activation of pre-existing cryptic sites. Both types of events reflect properties of the splicing mechanism. Results: The retained introns were significantly shorter than constitutive ones, and skipped exons were shorter than exons with cryptic sites. Both donor and acceptor splice sites of retained introns were weaker than splice sites of constitutive introns. The authentic acceptor sites affected by mutations were significantly weaker in exons with activated cryptic sites than in skipped exons. The distance from a mutated splice site to the nearest equivalent site was significantly shorter in cases of activated cryptic sites than in exon-skipping events. The prevalence of retained introns within genes monotonically increased in the 5'-to-3' direction (more retained introns close to the 3'-end), consistent with the model of co-transcriptional splicing. The density of exonic splicing enhancers was higher, and the density of exonic splicing silencers lower, in retained introns compared to constitutive ones and in exons with cryptic sites compared to skipped exons. Conclusion: The analysis of retained introns in human cDNA, of exons skipped due to mutations in splice sites, and of exons with cryptic sites produced results consistent with the intron-definition mechanism of splicing of short introns, with co-transcriptional splicing, and with the dependence of splicing efficiency on splice-site strength and the density of candidate exonic splicing enhancers and silencers. These results are consistent with other, recently published analyses.
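
    Splice-site "strength" of the kind compared above is conventionally scored against a position weight matrix (PWM): the closer a site is to the consensus, the higher its log-odds score. The sketch below is a generic illustration with a fabricated frequency table, not the authors' scoring scheme.

```python
# Toy PWM log-odds score for a 4-position donor-site motif.
import math

PWM = [  # fabricated base frequencies; real work estimates these from
    {"A": 0.05, "C": 0.05, "G": 0.85, "T": 0.05},  # aligned constitutive sites
    {"A": 0.05, "C": 0.05, "G": 0.05, "T": 0.85},
    {"A": 0.60, "C": 0.10, "G": 0.15, "T": 0.15},
    {"A": 0.70, "C": 0.10, "G": 0.10, "T": 0.10},
]
BACKGROUND = 0.25  # uniform genomic background

def pwm_score(site):
    """Higher log-odds = stronger site (closer to consensus)."""
    return sum(math.log2(PWM[i][b] / BACKGROUND) for i, b in enumerate(site))

print(round(pwm_score("GTAA"), 2))  # near-consensus, strong site
print(round(pwm_score("GTCG"), 2))  # weaker site, lower score
```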

  6. Comparison of Head Center Position and Screw Fixation Options Between a Jumbo Cup and an Offset Center of Rotation Cup in Revision Total Hip Arthroplasty: A Computer Simulation Study.

    Science.gov (United States)

    Faizan, Ahmad; Black, Brandon J; Fay, Brian D; Heffernan, Christopher D; Ries, Michael D

    2016-01-01

    Jumbo acetabular cups are commonly used in revision total hip arthroplasty (THA), with a straightforward reaming technique similar to that of primary THA. However, jumbo cups may also be associated with hip center elevation, limited screw fixation options, and anterior soft tissue impingement. A partially truncated hemispherical shell was designed with an offset center of rotation, a thick superior rim, and beveled anterior and superior rims as an alternative to a conventional jumbo cup. A three-dimensional computer simulation was used to assess head center position and safe screw trajectories. Results of this in vitro study indicate that the modified hemispherical implant geometry can reduce head center elevation while permitting favorable screw fixation trajectories into the pelvis in comparison to a conventional jumbo cup.
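
    A toy geometric argument shows why jumbo cups tend to elevate the hip center: if a larger hemisphere must keep its inferior margin at the same anatomic landmark (the teardrop), its center rises by the difference in radii, roughly 0.5 mm per 1 mm of added diameter. The sketch below encodes only this simplification, not the paper's three-dimensional simulation.

```python
# Head-center elevation under a fixed-inferior-margin assumption (toy model).
def head_center_elevation(native_cup_mm, jumbo_cup_mm):
    """Elevation (mm) = difference in cup radii."""
    return (jumbo_cup_mm - native_cup_mm) / 2.0

print(head_center_elevation(54, 66))  # 66 mm jumbo cup: ~6 mm elevation
# An offset-center-of-rotation design counters this by displacing the head
# center inferiorly relative to the shell's geometric center.
```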

  7. Physical properties of the human head: mass, center of gravity and moment of inertia.

    Science.gov (United States)

    Yoganandan, Narayan; Pintar, Frank A; Zhang, Jiangyue; Baisden, Jamie L

    2009-06-19

    This paper presents a synthesis of biomedical investigations of the human head, with specific reference to certain aspects of its physical properties and the development of anthropometry data, leading to the advancement of dummies used in crashworthiness research. As a significant majority of the studies have been summarized as reports, an effort has been made to review the literature chronologically with the above objectives. The first part is devoted to early studies wherein the mass, center of gravity (CG), and moment of inertia (MOI) properties are obtained from human cadaver experiments. Unembalmed and preserved whole-body and isolated head and head-neck experiments are discussed. Acknowledging that the current version of the Hybrid III dummy has been the most widely used anthropomorphic test device in motor vehicle crashworthiness research for frontal impact applications for over 30 years, the bases for the mass and MOI-related data used in the dummy are discussed. Because the dummy was developed and federalized in the United States, a description of the methods used to arrive at these properties forms part of the manuscript. Studies subsequent to the development of this dummy, including those from the US military, are also discussed. As the head and neck are coupled in any impact, and increasing improvements in technology such as advanced airbags, pre-tensioners and load limiters in manual seatbelts affect the kinetics of the head-neck complex, the manuscript underscores the need to pursue studies to precisely determine all the physical properties of the head. Because the most critical parameters (locations of the CG and occipital condyles (OC), mass, and MOI) have not been determined on a specimen-by-specimen basis in any single study, it is important to gather these data in future experiments. These critical data will be of value for improving occupant safety, designing advanced restraint systems, developing second-generation dummies, and assessing the injury mitigating
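
    The CG and occipital-condyle (OC) locations matter jointly because a moment of inertia measured about one point transfers to the other via the parallel-axis theorem, I_oc = I_cg + m d². The values below are round, assumed numbers at roughly adult-head scale, not data from the studies reviewed.

```python
# Parallel-axis transfer of head moment of inertia from CG to OC.
def parallel_axis(i_cg, mass, distance):
    """MOI about an axis a given distance from the parallel CG axis."""
    return i_cg + mass * distance ** 2

m_head = 4.5        # kg, assumed head mass
i_cg_pitch = 0.020  # kg*m^2, assumed pitch MOI about the CG
d_cg_to_oc = 0.05   # m, assumed CG-to-OC distance

print(parallel_axis(i_cg_pitch, m_head, d_cg_to_oc))  # ~0.031 kg*m^2 about OC
```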

  8. Interaction of molecular hydrogen with open transition metal centers for enhanced binding in metal-organic frameworks: a computational study.

    Science.gov (United States)

    Lochan, Rohini C; Khaliullin, Rustam Z; Head-Gordon, Martin

    2008-05-19

    Molecular hydrogen is known to form stable, "nonclassical" sigma complexes with transition metal centers that are stabilized by donor-acceptor interactions and electrostatics. In this computational study, we establish that strong H2 sorption sites can be obtained in metal-organic frameworks by incorporating open transition metal sites on the organic linkers. Using density functional theory and energy decomposition analysis, we investigate the nature and characteristics of the H2 interaction with models of exposed open metal binding sites: half-sandwich, piano-stool-shaped complexes of the form (Arene)ML(3−n)(H2)n [M = Cr, Mo, V(−), Mn(+); Arene = C6H5X (X = H, F, Cl, OCH3, NH2, CH3, CF3) or C6H3Y2X (Y = COOH; X = CF3, Cl); L = CO; n = 1-3]. The metal-H2 bond dissociation energy of the studied complexes is calculated to be between 48 and 84 kJ/mol, depending on the arene substituents, changes to the metal core, and the charge-balancing ligands. Thus, design of the binding site controls the H2 binding affinity and could potentially be used to tune the magnitude of the H2 interaction energy to achieve reversible sorption characteristics at ambient conditions. Energy decomposition analysis illuminates both the possibilities and the present challenges associated with rational materials design.

  9. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as with the OpenMP standards organization, to make sure OpenMP evolved in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, and runtime library support.

  10. Rugoscopy: Human identification by computer-assisted photographic superimposition technique

    Directory of Open Access Journals (Sweden)

    Rezwana Begum Mohammed

    2013-01-01

    Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, the palatal rugae may be considered as an alternative source of material. Aim: The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification and also to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Materials and Methods: Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of the casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and then photographed with a 10.1-megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the models without highlighted palatal rugae were superimposed on transparent images of the casts with highlighted palatal rugae, in order to identify the pairs. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, in order to make a perfect superimposition of images. Results: The overlapping of the digital photographs of highlighted palatal rugae over the normal set of models without highlighted palatal rugae resulted in 100% positive
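
    The superimposition step itself is straightforward image arithmetic: scale one photograph by the magnification implied by the incisor landmarks, then blend it semi-transparently over the other. A minimal Pillow sketch follows; the file names and scale factor are placeholders, not the study's actual software.

```python
# Scale-then-blend superimposition of two palatal photographs (sketch).
from PIL import Image

def superimpose(base_path, overlay_path, scale, alpha=0.5):
    base = Image.open(base_path).convert("RGBA")
    overlay = Image.open(overlay_path).convert("RGBA")
    # Bring the overlay to the base image's magnification, then crop/pad
    # to the base frame so the two images can be blended.
    new_size = (int(overlay.width * scale), int(overlay.height * scale))
    overlay = overlay.resize(new_size).crop((0, 0, base.width, base.height))
    # A 50% blend lets matching rugae patterns line up visually.
    return Image.blend(base, overlay, alpha)

result = superimpose("cast_plain.jpg", "cast_highlighted.jpg", scale=1.04)
result.save("superimposed.png")
```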

  11. Alternative Ultrasound Gel for a Sustainable Ultrasound Program: Application of Human Centered Design.

    Science.gov (United States)

    Salmon, Margaret; Salmon, Christian; Bissinger, Alexa; Muller, Mundenga Mutendi; Gebreyesus, Alegnta; Geremew, Haimanot; Wendel, Sarah K; Wendell, Sarah; Azaza, Aklilu; Salumu, Maurice; Benfield, Nerys

    2015-01-01

    This paper describes the design of a low-cost ultrasound gel from local products, applying aspects of the Human Centered Design methodology. A multidisciplinary team worked with clinicians who use ultrasound where commercial gel is cost-prohibitive and scarce. The team followed the format outlined in the IDEO Tool Kit. Research began by defining the challenge: "How to create a locally available alternative ultrasound gel for a low-resourced environment?" The "end-users" were identified as clinicians who use ultrasound in the Democratic Republic of the Congo and Ethiopia. An expert group was identified and queried for possible alternatives to commercial gel. Responses included shampoo, oils, water and cornstarch. Cornstarch, while a reasonable solution, was either not available or too expensive. We then sought deeper knowledge of locally sourced materials from local experts and market vendors to develop a similar product. Suggested solutions gleaned from these interviews were collected and used to create ultrasound gel, accounting for cost, image quality and manufacturing capability. Initial prototypes used cassava root flour from the Great Lakes Region (DRC, Rwanda, Uganda, Tanzania) and West Africa, and bula from Ethiopia. Prototypes were tested in the field and the resulting images evaluated by our user group. A final prototype was then selected: cassava or bula flour at 32 parts water, 8 parts flour and 4 parts salt, heated, mixed and then cooled, was the product design of choice.
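
    Given the reported 32:8:4 water:flour:salt ratio, scaling a batch is simple arithmetic. The sketch below assumes the parts are measured by volume, which the abstract does not specify.

```python
# Scale the 32:8:4 gel recipe to a target batch volume (assumed volume parts).
PARTS = {"water": 32, "flour (cassava or bula)": 8, "salt": 4}

def batch(target_ml):
    total = sum(PARTS.values())
    return {name: round(target_ml * p / total) for name, p in PARTS.items()}

print(batch(500))  # ~364 ml water, ~91 ml flour, ~45 ml salt for a 500 ml batch
```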

  12. Interleukin-24 inhibits the plasma cell differentiation program in human germinal center B cells.

    Science.gov (United States)

    Maarof, Ghyath; Bouchet-Delbos, Laurence; Gary-Gouy, Hélène; Durand-Gasselin, Ingrid; Krzysiek, Roman; Dalloul, Ali

    2010-03-04

    Complex molecular mechanisms control B-cell fate to become a memory or a plasma cell. Interleukin-24 (IL-24) is a class II family cytokine of poorly understood immune function that regulates the cell cycle. We previously observed that IL-24 is strongly expressed in leukemic memory-type B cells. Here we show that IL-24 is also expressed in human follicular B cells; it is more abundant in CD27(+) memory B cells and CD5-expressing B cells, whereas it is low to undetectable in centroblasts and plasma cells. Addition of IL-24 to B cells cultured in conditions shown to promote plasma cell differentiation strongly inhibited plasma cell generation and immunoglobulin G (IgG) production. By contrast, IL-24 siRNA increased terminal differentiation of B cells into plasma cells. IL-24 is optimally induced by BCR triggering and CD40 engagement; IL-24 increased CD40-induced B-cell proliferation and modulated the transcription of key factors involved in plasma cell differentiation. It also inhibited activation-induced tyrosine phosphorylation of signal transducer and activator of transcription-3 (STAT-3), and inhibited the transcription of IL-10. Taken together, our results indicate that IL-24 is a novel cytokine involved in T-dependent antigen (Ag)-driven B-cell differentiation and suggest a physiologic role favoring germinal center B-cell maturation into memory B cells at the expense of plasma cells.

  13. Detachment of human immunodeficiency virus type 1 from germinal centers by blocking complement receptor type 2.

    Science.gov (United States)

    Kacani, L; Prodinger, W M; Sprinzl, G M; Schwendinger, M G; Spruth, M; Stoiber, H; Döpper, S; Steinhuber, S; Steindl, F; Dierich, M P

    2000-09-01

    After the transition from the acute to the chronic phase of human immunodeficiency virus (HIV) infection, complement mediates long-term storage of virions in germinal centers (GC) of lymphoid tissue. The contribution of particular complement receptors (CRs) to virus trapping in GC was studied on tonsillar specimens from HIV-infected individuals. CR2 (CD21) was identified as the main binding site for HIV in GC. Monoclonal antibodies (MAb) blocking the CR2-C3d interaction were shown to detach 62 to 77% of HIV type 1 from tonsillar cells of an individual in the presymptomatic stage. Although they did so at a lower efficiency, these antibodies were able to remove HIV from tonsillar cells of patients under highly active antiretroviral therapy, suggesting that the C3d-CR2 interaction remains a primary entrapment mechanism in treated patients as well. In contrast, removal of HIV was not observed with MAb blocking CR1 or CR3. Thus, targeting CR2 may facilitate new approaches toward a reduction of residual virus in GC.

  14. Implementation of a Social Accountability Audit Model Based on Human-Centered Design in Public Sector Organizations [Implementasi Model Audit Pertanggungjawaban Sosial Berbasis Human-Centered Design pada Organisasi Sektor Publik]

    Directory of Open Access Journals (Sweden)

    Priyo Suprobo

    2014-01-01

    Previous research produced a social accountability audit model based on Human-Centered Design, hereinafter referred to as HCD. The present study aims to find alternative audit approaches that are simpler, more effective, and better suited to connecting people with public sector organizations; it therefore tests the model by implementing it in public sector organizations. The research approach is to test the implementation of the audit model under real conditions in the field, with feedback on the implementation test gathered qualitatively. Sampling was done by purposive sampling, and the public sector organizations studied were CV. Aidrat & General Store of Pondok Pesantren Sunan Drajat Lamongan, the hospital RSAB Soerya Sidoarjo, and the University of Widya Kartika Surabaya. These organizations were selected to cover the main types of public sector organizations (business units under religious organizations, health organizations, and educational institutions) as well as on the basis of their willingness to cooperate and the area represented. In the preliminary survey, the audit results for the social responsibility criteria and the formal legal institutional aspects show that the University of Widya Kartika and RSAB Soerya perform well, while CV. Aidrat has an acceptable performance. In terms of internal control, all of these organizations have an acceptable performance, while in terms of social responsibility programs, CV. Aidrat and RSAB Soerya perform well.

  15. Alternative Ultrasound Gel for a Sustainable Ultrasound Program: Application of Human Centered Design.

    Directory of Open Access Journals (Sweden)

    Margaret Salmon

    This paper describes the design of a low-cost ultrasound gel from local products, applying aspects of the Human Centered Design methodology. A multidisciplinary team worked with clinicians who use ultrasound where commercial gel is cost-prohibitive and scarce. The team followed the format outlined in the IDEO Tool Kit. Research began by defining the challenge: "How to create a locally available alternative ultrasound gel for a low-resourced environment?" The "end-users" were identified as clinicians who use ultrasound in the Democratic Republic of the Congo and Ethiopia. An expert group was identified and queried for possible alternatives to commercial gel. Responses included shampoo, oils, water and cornstarch. Cornstarch, while a reasonable solution, was either not available or too expensive. We then sought deeper knowledge of locally sourced materials from local experts and market vendors to develop a similar product. Suggested solutions gleaned from these interviews were collected and used to create ultrasound gel, accounting for cost, image quality and manufacturing capability. Initial prototypes used cassava root flour from the Great Lakes Region (DRC, Rwanda, Uganda, Tanzania) and West Africa, and bula from Ethiopia. Prototypes were tested in the field and the resulting images evaluated by our user group. A final prototype was then selected: cassava or bula flour at 32 parts water, 8 parts flour and 4 parts salt, heated, mixed and then cooled, was the product design of choice.

  16. Massive parallel IGHV gene sequencing reveals a germinal center pathway in origins of human multiple myeloma.

    Science.gov (United States)

    Cowan, Graeme; Weston-Bell, Nicola J; Bryant, Dean; Seckinger, Anja; Hose, Dirk; Zojer, Niklas; Sahota, Surinder S

    2015-05-30

    Human multiple myeloma (MM) is characterized by accumulation of malignant terminally differentiated plasma cells (PCs) in the bone marrow (BM), raising the question when during maturation neoplastic transformation begins. Immunoglobulin IGHV genes carry imprints of clonal tumor history, delineating somatic hypermutation (SHM) events that generally occur in the germinal center (GC). Here, we examine MM-derived IGHV genes using massive parallel deep sequencing, comparing them with profiles in normal BM PCs. In 4/4 presentation IgG MM, monoclonal tumor-derived IGHV sequences revealed significant evidence for intraclonal variation (ICV) in mutation patterns. IGHV sequences of 2/2 normal PC IgG populations revealed dominant oligoclonal expansions, each expansion also displaying mutational ICV. Clonal expansions in MM and in normal BM PCs reveal common IGHV features. In such MM, the data fit a model of tumor origins in which neoplastic transformation is initiated in a GC B-cell committed to terminal differentiation but still targeted by on-going SHM. Strikingly, the data parallel IGHV clonal sequences in some monoclonal gammopathy of undetermined significance (MGUS) known to display on-going SHM imprints. Since MGUS generally precedes MM, these data suggest origins of MGUS and MM with IGHV gene mutational ICV from the same GC B-cell, arising via a distinctive pathway.
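
    Intraclonal variation of the kind reported here can be quantified by comparing reads of the same clonal rearrangement position by position; a clone still under somatic hypermutation shows non-zero divergence between its reads. The sketch below uses fabricated 10-base stand-ins for IGHV reads and a mean pairwise Hamming distance; it is not the authors' pipeline.

```python
# Mean pairwise Hamming distance among reads of one clone (toy ICV measure).
from itertools import combinations

reads = ["ACGTACGTAC", "ACGTACGTCC", "ACGAACGTAC", "ACGTACGTAC"]

def mean_pairwise_distance(seqs):
    """0 would mean a mutationally frozen clone; >0 suggests ongoing SHM."""
    dists = [sum(a != b for a, b in zip(s1, s2))
             for s1, s2 in combinations(seqs, 2)]
    return sum(dists) / len(dists)

print(mean_pairwise_distance(reads))  # 1.0 for these fabricated reads
```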

  17. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic, from August 29 to August 31, 2011. This conference was the third in the series, following IHCI 2009 and IHCI 2010, held in January at IIIT Allahabad, India. Human-computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion of the presented topics. It can be useful for researchers working on various aspects of human-computer interaction. Topics covered in this book include user interface and interaction, the theoretical background and applications of HCI, and also data mining and knowledge discovery as a support for HCI applications.

  18. Radiological and Environmental Research Division, Center for Human Radiobiology. Annual report, July 1980-June 1981. [Lead abstract]

    Energy Technology Data Exchange (ETDEWEB)

    1982-03-01

    Separate abstracts were prepared for the 22 papers of this annual report of the Center for Human Radiobiology. Abstracts were not written for 2 appendices which contain data on the exposure and radium-induced malignancies of 2259 persons whose radium content has been determined at least once. (KRM)

  19. Real Time Multiple Hand Gesture Recognition System for Human Computer Interaction

    Directory of Open Access Journals (Sweden)

    Siddharth S. Rautaray

    2012-05-01

    With the increasing use of computing devices in day-to-day life, the need for user-friendly interfaces has led to the evolution of different types of interfaces for human-computer interaction. Real-time vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. Direct use of the hands as an input device is an attractive method that can communicate much more information by itself than mice, joysticks, etc., enabling recognition systems that can be used in a variety of human-computer interaction applications. The gesture recognition system consists of three main modules: hand segmentation, hand tracking, and gesture recognition from hand features. The designed system is further integrated with different applications, such as an image browser and a virtual game, to demonstrate possibilities for human-computer interaction. Computer-vision-based systems have the potential to provide more natural, non-contact solutions. The present research work focuses on designing and developing a practical framework for real-time hand gesture recognition.
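
    The first two modules named above, hand segmentation and tracking, reduce to a few OpenCV calls in their simplest form: threshold skin-coloured pixels in HSV space and follow the centroid of the largest contour. The sketch below is a generic illustration; the HSV band is a common rule-of-thumb range, not the paper's calibration.

```python
# Minimal skin-colour hand segmentation + centroid tracking (sketch).
import cv2
import numpy as np

def hand_centroid(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone band; real systems calibrate this per user/lighting.
    mask = cv2.inRange(hsv, np.array([0, 30, 60], np.uint8),
                       np.array([20, 150, 255], np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)  # assume hand = largest blob
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print(hand_centroid(frame))  # (x, y) of the hand in this frame, or None
cap.release()
```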

  20. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature