WorldWideScience

Sample records for computing capability user

  1. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

Vigil, Benny Manuel [Los Alamos National Laboratory]; Ballance, Robert [SNL]; Haskell, Karen [SNL]

    2012-08-09

Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of the ACE requirements that are met, and of those that are not, is included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  2. Scientific user facilities at Oak Ridge National Laboratory: New research capabilities and opportunities

    Science.gov (United States)

    Roberto, James

    2011-10-01

Over the past decade, Oak Ridge National Laboratory (ORNL) has transformed its research infrastructure, particularly in the areas of neutron scattering, nanoscale science and technology, and high-performance computing. New facilities, including the Spallation Neutron Source, Center for Nanophase Materials Sciences, and Leadership Computing Facility, have been constructed that provide world-leading capabilities in neutron science, condensed matter and materials physics, and computational physics. In addition, many existing physics-related facilities have been upgraded with new capabilities, including new instruments and a high-intensity cold neutron source at the High Flux Isotope Reactor. These facilities are operated for the scientific community and are available to qualified users based on competitive peer-reviewed proposals. User facilities at ORNL currently welcome more than 2,500 researchers each year, mostly from universities. These facilities, many of which are unique in the world, will be reviewed, including current and planned research capabilities, availability and operational performance, access procedures, and recent research results. Particular attention will be given to new neutron scattering capabilities, nanoscale science, and petascale simulation and modeling. In addition, user facilities provide a portal into ORNL that can enhance the development of research collaborations. The spectrum of partnership opportunities with ORNL will be described, including collaborations, joint faculty, and graduate research and education.

  3. User Instructions for the Systems Assessment Capability, Rev. 0, Computer Codes Volume 2: Impact Modules

    International Nuclear Information System (INIS)

    Eslinger, Paul W.; Arimescu, Carmen; Kanyid, Beverly A.; Miley, Terri B.

    2001-01-01

One activity of the Department of Energy's Groundwater/Vadose Zone Integration Project is an assessment of cumulative impacts from Hanford Site wastes on the subsurface environment and the Columbia River. Through the application of a system assessment capability (SAC), decisions for each cleanup and disposal action will be able to take into account the composite effect of other cleanup and disposal actions. The SAC has developed a suite of computer programs to simulate the migration of contaminants (analytes) present on the Hanford Site and to assess the potential impacts of the analytes, including dose to humans, socio-cultural impacts, economic impacts, and ecological impacts. The general approach to handling uncertainty in the SAC computer codes is a Monte Carlo approach. Conceptually, one generates a value for every stochastic parameter in the code (the entire sequence of modules from inventory through transport and impacts) and then executes the simulation, obtaining an output value, or result. This document provides user instructions for the SAC codes that generate human, ecological, economic, and cultural impacts.
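The Monte Carlo approach described above (sample a value for every stochastic parameter, execute the full simulation chain, and collect one output per realization) can be sketched as follows. This is an illustrative sketch only: the parameter names and the toy `run_simulation` surrogate are invented for the example and are not actual SAC inputs or modules.

```python
import random

def run_simulation(params):
    # Toy surrogate for the chained inventory -> transport -> impact modules;
    # the "result" here is just a function of the sampled parameter values.
    return sum(params.values())

def monte_carlo(stochastic_params, n_realizations=1000, seed=42):
    """Sample every stochastic parameter, run the full simulation,
    and collect one output value per realization."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_realizations):
        sample = {name: rng.uniform(lo, hi)
                  for name, (lo, hi) in stochastic_params.items()}
        results.append(run_simulation(sample))
    return results

# Hypothetical parameter ranges (not actual SAC inputs).
params = {"release_rate": (0.1, 0.5), "retardation": (1.0, 3.0)}
outputs = monte_carlo(params, n_realizations=500)
```

The collected `outputs` then form an empirical distribution from which summary statistics or exceedance probabilities can be estimated.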

  4. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates on capability- and object-based system concepts, including capability-based systems and the object-based approach. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor, the CAL-TSS System, the MIT PDP-1 Timesharing System, and the Chicago Magic Number Machine are discussed. The book then describes the Plessey System 250

  5. User's guide for the REBUS-3 fuel cycle analysis capability

    International Nuclear Information System (INIS)

    Toppel, B.J.

    1983-03-01

REBUS-3 is a system of programs designed for the fuel-cycle analysis of fast reactors. This new capability is an extension and refinement of the REBUS-2 code system and complies with the standard code practices and interface dataset specifications of the Committee on Computer Code Coordination (CCCC). The new code is hence divorced from the earlier ARC System. In addition, the coding has been designed to enhance code exportability. Major new capabilities not available in the REBUS-2 code system include a search on burn cycle time to achieve a specified value for the multiplication constant at the end of the burn step; a general non-repetitive fuel-management capability including temporary out-of-core fuel storage, loading of fresh fuel, and subsequent retrieval and reloading of fuel; significantly expanded user input checking; expanded output edits; provision of prestored burnup chains to simplify user input; an option of fixed- or free-field BCD input formats; and a choice of finite difference, nodal, or spatial flux-synthesis neutronics in one, two, or three dimensions.
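The search on burn cycle time to achieve a specified end-of-step multiplication constant is, in essence, a one-dimensional root-finding problem. A minimal sketch using bisection, assuming k decreases monotonically with burn time as the fuel depletes; the linear `k_model` is a toy stand-in invented for illustration, not REBUS-3's neutronics:

```python
def bisect_burn_time(k_end_of_step, k_target, t_lo, t_hi, tol=1e-6, max_iter=100):
    """Find the burn time t at which k_end_of_step(t) equals k_target,
    assuming k is monotone in t and the target is bracketed by [t_lo, t_hi]."""
    f_lo = k_end_of_step(t_lo) - k_target
    for _ in range(max_iter):
        t_mid = 0.5 * (t_lo + t_hi)
        f_mid = k_end_of_step(t_mid) - k_target
        if abs(f_mid) < tol:
            return t_mid
        if f_lo * f_mid > 0:
            t_lo, f_lo = t_mid, f_mid  # root lies in the upper half
        else:
            t_hi = t_mid               # root lies in the lower half
    return 0.5 * (t_lo + t_hi)

# Toy depletion model: k drops linearly with burn time (illustrative only).
k_model = lambda t: 1.10 - 0.0002 * t
t = bisect_burn_time(k_model, k_target=1.0, t_lo=0.0, t_hi=1000.0)
```

In an actual depletion code, each evaluation of `k_end_of_step` would involve a full burn-step calculation, so a robust bracketing method with few evaluations is preferred.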

  6. Guidelines for the integration of audio cues into computer user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Sumikawa, D.A.

    1985-06-01

Throughout the history of computers, vision has been the main channel through which information is conveyed to the computer user. As the complexities of man-machine interactions increase, more and more information must be transferred from the computer to the user and then successfully interpreted by the user. A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby bringing the sense of 'hearing' into the computer experience. This allows our visual and auditory capabilities to work naturally together, leading to more effective and efficient interpretation of all information received by the user from the computer. This thesis presents an initial set of guidelines to assist interface developers in designing an effective sight-and-sound user interface. This study is a synthesis of various aspects of sound, human communication, computer-user interfaces, and psychoacoustics. We introduce the notion of an earcon. Earcons are audio cues used in the computer-user interface to provide information and feedback to the user about some computer object, operation, or interaction. A possible construction technique for earcons, the use of earcons in the interface, how earcons are learned and remembered, and the effects of earcons on their users are investigated. This study takes the point of view that earcons are a language and a human/computer communication issue, and they are therefore analyzed according to the three dimensions of linguistics: syntactics, semantics, and pragmatics.

  7. An assessment of the real-time application capabilities of the SIFT computer system

    Science.gov (United States)

    Butler, R. W.

    1982-01-01

    The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed static stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.

  8. User's guide for the REBUS-3 fuel cycle analysis capability

    Energy Technology Data Exchange (ETDEWEB)

    Toppel, B.J.

    1983-03-01

REBUS-3 is a system of programs designed for the fuel-cycle analysis of fast reactors. This new capability is an extension and refinement of the REBUS-2 code system and complies with the standard code practices and interface dataset specifications of the Committee on Computer Code Coordination (CCCC). The new code is hence divorced from the earlier ARC System. In addition, the coding has been designed to enhance code exportability. Major new capabilities not available in the REBUS-2 code system include a search on burn cycle time to achieve a specified value for the multiplication constant at the end of the burn step; a general non-repetitive fuel-management capability including temporary out-of-core fuel storage, loading of fresh fuel, and subsequent retrieval and reloading of fuel; significantly expanded user input checking; expanded output edits; provision of prestored burnup chains to simplify user input; an option of fixed- or free-field BCD input formats; and a choice of finite difference, nodal, or spatial flux-synthesis neutronics in one, two, or three dimensions.

  9. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

In today's technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users' cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions that had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users' perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users' self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.
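Correlating a sensor-derived measure with self-report scores, as done in the study above, reduces to computing a correlation coefficient per measure. A minimal pure-Python sketch of the Pearson correlation; the listed per-participant values are hypothetical illustrations, not the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-participant values (invented for illustration):
fnirs_workload = [0.2, 0.5, 0.7, 0.9, 0.4]
self_report_suspicion = [1.0, 2.0, 3.5, 4.0, 1.5]
r = pearson_r(fnirs_workload, self_report_suspicion)
```

A value of `r` near 1 would indicate that the sensor-derived workload tracks the self-reported suspicion scores closely across participants.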

  10. User Instructions for the Systems Assessment Capability, Rev. 0, Computer Codes Volume 1: Inventory, Release, and Transport Modules

    International Nuclear Information System (INIS)

    Eslinger, Paul W.; Engel, David W.; Gerhardstein, Lawrence H.; Lopresti, Charles A.; Nichols, William E.; Strenge, Dennis L.

    2001-12-01

    One activity of the Department of Energy's Groundwater/Vadose Zone Integration Project is an assessment of cumulative impacts from Hanford Site wastes on the subsurface environment and the Columbia River. Through the application of a system assessment capability (SAC), decisions for each cleanup and disposal action will be able to take into account the composite effect of other cleanup and disposal actions. The SAC has developed a suite of computer programs to simulate the migration of contaminants (analytes) present on the Hanford Site and to assess the potential impacts of the analytes, including dose to humans, socio-cultural impacts, economic impacts, and ecological impacts. The general approach to handling uncertainty in the SAC computer codes is a Monte Carlo approach. Conceptually, one generates a value for every stochastic parameter in the code (the entire sequence of modules from inventory through transport and impacts) and then executes the simulation, obtaining an output value, or result. This document provides user instructions for the SAC codes that handle inventory tracking, release of contaminants to the environment, and transport of contaminants through the unsaturated zone, saturated zone, and the Columbia River

  11. Facility Interface Capability Assessment (FICA) user manual

    International Nuclear Information System (INIS)

    Pope, R.B.; MacDonald, R.R.; Massaglia, J.L.; Williamson, D.A.; Viebrock, J.M.; Mote, N.

    1995-09-01

The US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing the Civilian Radioactive Waste Management System (CRWMS) to accept spent nuclear fuel from commercial facilities. The objective of the Facility Interface Capability Assessment (FICA) project was to assess the capability of each commercial spent nuclear fuel (SNF) storage facility, at which SNF is stored, to handle various SNF shipping casks. The purpose of this report is to describe the FICA computer software and to provide the FICA user with a guide on how to use the FICA system. The FICA computer software consists of two executable programs: the FICA Reactor Report program and the FICA Summary Report program (written in the CA-Clipper version 5.2 development system). The complete FICA software system is contained on either a 3.5 in. (double density) or a 5.25 in. (high density) diskette and consists of the two FICA programs and all the database files (generated using dBASE III). The FICA programs are provided as 'stand-alone' systems, and neither the CA-Clipper compiler nor dBASE III is required to run the FICA programs. The steps for installing the FICA software system and executing the FICA programs are described in this report. Instructions are given on how to install the FICA software system onto the hard drive of the PC and how to execute the FICA programs from the FICA subdirectory on the hard drive. Both FICA programs are menu-driven, with the up-arrow and down-arrow keys used to move the cursor to the desired selection.

  12. PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC

    Science.gov (United States)

    Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.

    1997-01-01

PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.

  13. BPACK -- A computer model package for boiler reburning/co-firing performance evaluations. User's manual, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Wu, K.T.; Li, B.; Payne, R.

    1992-06-01

This manual presents and describes a package of computer models uniquely developed for boiler thermal performance and emissions evaluations by the Energy and Environmental Research Corporation. The model package permits boiler heat transfer, fuel combustion, and pollutant emissions predictions related to a number of practical boiler operations such as fuel switching, fuel co-firing, and reburning NO{sub x} reductions. The models are adaptable to most boiler/combustor designs and can handle burner fuels in solid, liquid, gaseous, and slurried forms. The models are also capable of performing predictions for combustion applications involving gaseous-fuel reburning, and co-firing of solid/gas, liquid/gas, gas/gas, and slurry/gas fuels. The model package is named BPACK (Boiler Package) and consists of six computer codes, of which three are main computational codes and the other three are input codes. The three main codes are: (a) a two-dimensional furnace heat-transfer and combustion code; (b) a detailed chemical-kinetics code; and (c) a boiler convective passage code. This user's manual presents the computer model package in two volumes. Volume 1 describes in detail a number of topics of general user interest, including the physical and chemical basis of the models, a complete description of the model applicability, options, input/output, and the default inputs. Volume 2 contains a detailed record of worked examples to assist users in applying the models and to illustrate the versatility of the codes.

  14. User's manual for the NEFTRAN II computer code

    International Nuclear Information System (INIS)

    Olague, N.E.; Campbell, J.E.; Leigh, C.D.; Longsine, D.E.

    1991-02-01

This document describes the NEFTRAN II (NEtwork Flow and TRANsport in Time-Dependent Velocity Fields) computer code and is intended to provide the reader with sufficient information to use the code. NEFTRAN II was developed as part of a performance assessment methodology for storage of high-level nuclear waste in unsaturated, welded tuff. NEFTRAN II is a successor to the NEFTRAN and NWFT/DVM computer codes and contains several new capabilities. These capabilities include: (1) the ability to input pore velocities directly to the transport model and bypass the network fluid flow model, (2) the ability to transport radionuclides in time-dependent velocity fields, (3) the ability to account for the effect of time-dependent saturation changes on the retardation factor, and (4) the ability to account for time-dependent flow rates through the source regime. In addition to these changes, the input to NEFTRAN II has been modified to be more convenient for the user. This document is divided into four main sections consisting of (1) a description of all the models contained in the code, (2) a description of the program and subprograms in the code, (3) a data input guide, and (4) verification and sample problems. Although NEFTRAN II is a fourth-generation code, this document is a complete description of the code, and reference to past user's manuals should not be necessary. 19 refs., 33 figs., 25 tabs.

  15. DDP-516 Computer Graphics System Capabilities

    Science.gov (United States)

    1972-06-01

This report describes the capabilities of the DDP-516 Computer Graphics System. One objective of this report is to acquaint DOT management and project planners with the system's current capabilities, applications, hardware, and software. The Appendix i...

  16. Graphical User Interface Programming in Introductory Computer Science.

    Science.gov (United States)

    Skolnick, Michael M.; Spooner, David L.

    Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…

  17. Prevalence of neck pain in computer users

    International Nuclear Information System (INIS)

    Sabeen, F.; Bashir, M.S.; Hussain, S.I.

    2013-01-01

Prolonged use of computers during daily work activities and recreation is often cited as a cause of neck pain. Neck pain and computer use are clearly connected due to extended periods of sitting in a fixed position with no breaks to stretch the neck muscles. Prolonged computer use with the neck bent forward causes the anterior neck muscles to gradually become shorter and tighter, while the muscles at the back of the neck grow longer and weaker. These changes lead to the development of neck pain. Objectives: To find the incidence of neck pain in computer users; the association between neck pain and prolonged sitting in a poor posture; the association between breaks during prolonged work and neck pain; and the association between the type of chair used during prolonged sitting and the occurrence of neck pain. Methodology: For this observational study, data were collected through questionnaires from office workers (computer users) and students. Results: Out of 50 persons, 72% of computer users had neck pain. A strong association was found between neck pain and prolonged computer use (p = 0.001). Those who took breaks during their work had less neck pain. No significant association was found between the type of chair in use and neck pain. Neck pain and the type of system in use also had no significant association. Conclusion: The duration of computer use and the frequency of breaks are associated with neck pain at work. Severe neck pain was found in people who use a computer for more than 5 hours a day. (author)

  18. Sierra/SolidMechanics 4.48 User's Guide: Addendum for Shock Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose; Le, San; Littlewood, David John; Merewether, Mark Thomas; Mosby, Matthew David; Pierson, Kendall H.; Porter, Vicki L.; Shelton, Timothy; Thomas, Jesse David; Tupek, Michael R.; Veilleux, Michael; Xavier, Patrick G.

    2018-03-01

This is an addendum to the Sierra/SolidMechanics 4.48 User's Guide that documents additional capabilities available only in alternate versions of the Sierra/SolidMechanics (Sierra/SM) code. These alternate versions are enhanced to provide capabilities that are regulated under the U.S. Department of State's International Traffic in Arms Regulations (ITAR) export-control rules. The ITAR-regulated codes are only distributed to entities that comply with the ITAR export-control requirements. The ITAR enhancements to Sierra/SM include material models with an energy-dependent pressure response (appropriate for very large deformations and strain rates) and capabilities for blast modeling. Since this is an addendum to the standard Sierra/SM user's guide, please refer to that document first for general descriptions of code capability and use.

  19. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This dilemma has a solution: technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  20. QUEST Hanford Site Computer Users - What do they do?

    Energy Technology Data Exchange (ETDEWEB)

    WITHERSPOON, T.T.

    2000-03-02

The Fluor Hanford Chief Information Office requested that a computer-user survey be conducted to determine users' dependence on the computer and its importance to their ability to accomplish their work. Daily use trends and future needs of Hanford Site personal computer (PC) users were also to be defined. A primary objective was to use the data to determine how budgets should be focused toward providing those services that are truly needed by the users.

  1. Distributed user interfaces for clinical ubiquitous computing applications.

    Science.gov (United States)

    Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik

    2005-08-01

Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel, to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices, such as digital pens, an active desk, and walk-up displays, that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model, which allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events, and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs--and similar network-based user interfaces--will be a prerequisite for future mobile user interfaces and essential for developing clinical multi-device environments.

  2. Evaluation of Musculoskeletal Disorders among computer Users in Isfahan

    Directory of Open Access Journals (Sweden)

    Ayoub Ghanbary

    2015-08-01

Along with the widespread use of computers, work-related musculoskeletal disorders (MSDs) have become the most prevalent ergonomic problem among computer users. By evaluating musculoskeletal disorders among computer users, interventions to reduce them can be carried out. The aim of the present study was to assess musculoskeletal disorders among computer users at Isfahan University using the Rapid Office Strain Assessment (ROSA) method and the Nordic questionnaire. This cross-sectional study was conducted on 96 computer users at Isfahan University. The data were analyzed using correlation, linear regression, descriptive statistics, and ANOVA tests in SPSS 20. The data collection tools were the Nordic questionnaire and the ROSA checklist. The results of the Nordic questionnaire showed that musculoskeletal disorders in computer users were most prevalent in the shoulder (62.1%), neck (54.9%), and back (53.1%). Based on the ROSA risk levels, 19 individuals were in the low-risk area, 50 in the notification area, and 27 in the hazard area requiring ergonomic intervention. The prevalence of musculoskeletal disorders was higher in women than in men. The ANOVA test also showed a direct and significant correlation between age and work experience and the final ROSA score (p < 0.001). The results showed that the prevalence of MSDs among computer users at Isfahan universities is quite high, and ergonomic interventions should be carried out, such as redesigning computer workstations, educating users about ergonomic principles for computer work, reducing working hours at the computer, and keeping the elbows close to the body at an angle between 90 and 120 degrees.

  3. Multiple-User, Multitasking, Virtual-Memory Computer System

    Science.gov (United States)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  4. Designing User-Computer Dialogues: Basic Principles and Guidelines.

    Science.gov (United States)

    Harrell, Thomas H.

    This discussion of the design of computerized psychological assessment or testing instruments stresses the importance of the well-designed computer-user interface. The principles underlying the three main functional elements of computer-user dialogue--data entry, data display, and sequential control--are discussed, and basic guidelines derived…

  5. Reach and get capability in a computing environment

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2012-06-05

    A reach and get technique includes invoking a reach command from a reach location within a computing environment. A user can then navigate to an object within the computing environment and invoke a get command on the object. In response to invoking the get command, the computing environment is automatically navigated back to the reach location and the object copied into the reach location.
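The reach-and-get interaction described above can be modeled as a small state machine: a reach marks the current location, navigation moves elsewhere, and a get copies the chosen object back into the marked location and returns there automatically. This is a speculative sketch of the described behavior; the `Workspace` class and the location and object names are invented for illustration.

```python
class Workspace:
    """Minimal model of the reach-and-get technique."""

    def __init__(self, locations):
        self.locations = {name: list(objs) for name, objs in locations.items()}
        self.current = None         # where the user is now
        self.reach_location = None  # where a 'get' should deliver its copy

    def navigate(self, name):
        self.current = name

    def reach(self):
        # Invoke the reach command: remember where the copy should land.
        self.reach_location = self.current

    def get(self, obj):
        # Invoke the get command on an object at the current location:
        # copy it into the reach location and navigate back automatically.
        if obj not in self.locations[self.current]:
            raise ValueError(f"{obj!r} not found at {self.current!r}")
        self.locations[self.reach_location].append(obj)
        self.current = self.reach_location

ws = Workspace({"report": [], "library": ["figure1"]})
ws.navigate("report"); ws.reach()
ws.navigate("library"); ws.get("figure1")
```

After the `get`, the environment has returned to the reach location and the object has been copied there, matching the two-step interaction the record describes.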

  6. End of paper registration forms for new computer users

    CERN Multimedia

    2007-01-01

    As of 3rd December 2007 it will be possible for new users to sign the Computer Centre User Registration Form electronically. As before, new users will still need to go to their computing group administrator, who will make the electronic request for account creation using CRA and give the new user his or her initial password. The difference is that the requested accounts will be created and usable almost immediately. Users will then have 3 days within which they must go to the web page http://cern.ch/cernaccount and click on ‘New User’. They will be required to follow a short computer security awareness training course, read the CERN Computing Rules and then confirm that they accept the rules. If this is not completed within 3 days all their computer accounts will be blocked and they will have to contact the Helpdesk to unblock their accounts and get a second chance to complete the registration. During the introductory phase the existing paper forms will also be accepted ...

  7. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Science.gov (United States)

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  10. Recommended documentation for computer users at ANL

    Energy Technology Data Exchange (ETDEWEB)

    Heiberger, A.A.

    1992-04-01

    Recommended Documentation for Computer Users at ANL is for all users of the services available from the Argonne National Laboratory (ANL) Computing and Telecommunications Division (CTD). This document will guide you in selecting available documentation that will best fill your particular needs. Chapter 1 explains how to use this document to select documents and how to obtain them from the CTD Document Distribution Counter. Chapter 2 contains a table that categorizes available publications. Chapter 3 gives descriptions of the online DOCUMENT command for CMS, VAX, and the Sun workstation. DOCUMENT allows you to scan for and order documentation that interests you. Chapter 4 lists publications by subject. Categories I and IX cover publications of a general nature and publications on telecommunications and networks, respectively. Categories II, III, IV, V, VI, VII, VIII, and X cover publications on specific computer systems. Category XI covers publications on advanced scientific computing at Argonne. Chapter 5 contains abstracts for each publication, all arranged alphabetically. Chapter 6 describes additional publications containing bibliographies and master indexes that the user may find useful. The appendix identifies available computer systems, applications, languages, and libraries.

  11. User's manual for SPLPLOT-2: a computer code for data plotting and editing in conversational mode

    International Nuclear Information System (INIS)

    Muramatsu, Ken; Matsumoto, Kiyoshi; Kohsaka, Atsuo; Maniwa, Masaki.

    1985-07-01

    The computer code SPLPLOT-2 for plotting and data editing has been developed as part of the code package SPLPACK-1. The SPLPLOT-2 code supports both conversational and batch processing. This report is the user's manual for SPLPLOT-2. The following improvements have been made in SPLPLOT-2: (1) it supports both conversational and batch processing; (2) a function for converting the input SPL (Standard PLotter) files to internal work files has been implemented, reducing the number of time-consuming accesses to the input SPL files; (3) user-supplied subroutines can be assigned for data editing from the SPL files; (4) in addition to two-dimensional graphs, streamline graphs, contour-line graphs, and bird's-eye view graphs can be drawn. (author)

  12. Perspectives on distributed computing : thirty people, four user types, and the distributed computing user experience.

    Energy Technology Data Exchange (ETDEWEB)

    Childers, L.; Liming, L.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago

    2008-10-15

    This report summarizes the methodology and results of a user perspectives study conducted by the Community Driven Improvement of Globus Software (CDIGS) project. The purpose of the study was to document the work-related goals and challenges facing today's scientific technology users, to record their perspectives on Globus software and the distributed-computing ecosystem, and to provide recommendations to the Globus community based on the observations. Globus is a set of open source software components intended to provide a framework for collaborative computational science activities. Rather than attempting to characterize all users or potential users of Globus software, our strategy has been to speak in detail with a small group of individuals in the scientific community whose work appears to be the kind that could benefit from Globus software, learn as much as possible about their work goals and the challenges they face, and describe what we found. The result is a set of statements about specific individuals' experiences. We do not claim that these are representative of a potential user community, but we do claim to have found commonalities and differences among the interviewees that may be reflected in the user community as a whole. We present these as a series of hypotheses that can be tested by subsequent studies, and we offer recommendations to Globus developers based on the assumption that these hypotheses are representative. Specifically, we conducted interviews with thirty technology users in the scientific community. We included both people who have used Globus software and those who have not. We made a point of including individuals who represent a variety of roles in scientific projects, for example, scientists, software developers, engineers, and infrastructure providers. The following material is included in this report: (1) A summary of the reported work-related goals, significant issues, and points of satisfaction with the use of Globus software

  13. Spectrometer user interface to computer systems

    International Nuclear Information System (INIS)

    Salmon, L.; Davies, M.; Fry, F.A.; Venn, J.B.

    1979-01-01

    A computer system for use in radiation spectrometry should be designed around the needs and comprehension of the user and his operating environment. To this end, the functions of the system should be built in a modular and independent fashion such that they can be joined to the back end of an appropriate user interface. The point that this interface should be designed rather than just allowed to evolve is illustrated by reference to four related computer systems of differing complexity and function. The physical user interfaces in all cases are keyboard terminals, and the virtues and otherwise of these devices are discussed and compared with others. The language interface needs to satisfy a number of requirements, often conflicting. Among these, simplicity and speed of operation compete with flexibility and scope. Both experienced and novice users need to be considered, and any individual's needs may vary from naive to complex. To be efficient and resilient, the implementation must use an operating system, but the user needs to be protected from its complex and unfamiliar syntax. At the same time the interface must allow the user access to all services appropriate to his needs. The user must also receive a sense of privacy in a multi-user system. The interface itself must be stable and exhibit continuity between implementations. Some of these conflicting needs have been overcome by the SABRE interface with languages operating at several levels. The foundation is a simple semi-mnemonic command language that activates individual and independent functions. The commands can be used with positional parameters or in an interactive dialogue, the precise nature of which depends upon the operating environment and the user's experience. A command procedure or macro language allows combinations of commands with conditional branching and arithmetic features. Thus complex but repetitive operations are easily performed.

  14. Mac OS X Snow Leopard for Power Users Advanced Capabilities and Techniques

    CERN Document Server

    Granneman, Scott

    2010-01-01

    Mac OS X Snow Leopard for Power Users: Advanced Capabilities and Techniques is for Mac OS X users who want to go beyond the obvious, the standard, and the easy. If you want to dig deeper into Mac OS X and maximize your skills and productivity using the world's slickest and most elegant operating system, then this is the book for you. Written by Scott Granneman, an experienced teacher, developer, and consultant, Mac OS X for Power Users helps you push Mac OS X to the max, unveiling advanced techniques and options that you may not have known even existed. Create custom workflows and apps with Automa

  15. The Computer as Rorschach: Implications for Management and User Acceptance

    OpenAIRE

    Kaplan, Bonnie

    1983-01-01

    Different views of the computer held by different participants in a medical computing project make it difficult to gain wide acceptance of an application. Researchers', programmers', and clinicians' views illustrate how users project their views onto the computer. Effects of these different views on user acceptance and implications for the management of computer projects are presented.

  16. Specialized, multi-user computer facility for the high-speed, interactive processing of experimental data

    International Nuclear Information System (INIS)

    Maples, C.C.

    1979-05-01

    A proposal has been made at LBL to develop a specialized computer facility specifically designed to deal with the problems associated with the reduction and analysis of experimental data. Such a facility would provide a highly interactive, graphics-oriented, multi-user environment capable of handling relatively large data bases for each user. By conceptually separating the general problem of data analysis into two parts, cyclic batch calculations and real-time interaction, a multilevel, parallel processing framework may be used to achieve high-speed data processing. In principle such a system should be able to process a mag tape equivalent of data through typical transformations and correlations in under 30 s. The throughput for such a facility, for five users simultaneously reducing data, is estimated to be 2 to 3 times greater than is possible, for example, on a CDC7600. 3 figures

  18. CASL Dakota Capabilities Summary

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simmons, Chris [Univ. of Texas, Austin, TX (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  19. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design but part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  20. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similar fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  1. Computer and video game addiction-a comparison between game users and non-game users.

    Science.gov (United States)

    Weinstein, Aviv Malkiel

    2010-09-01

    Computer game addiction is excessive or compulsive use of computer and video games that may interfere with daily life. It is not clear whether video game playing meets diagnostic criteria for Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV). First objective is to review the literature on computer and video game addiction over the topics of diagnosis, phenomenology, epidemiology, and treatment. Second objective is to describe a brain imaging study measuring dopamine release during computer game playing. Article search of 15 published articles between 2000 and 2009 in Medline and PubMed on computer and video game addiction. Nine abstinent "ecstasy" users and 8 control subjects were scanned at baseline and after performing on a motorbike riding computer game while imaging dopamine release in vivo with [123I] IBZM and single photon emission computed tomography (SPECT). Psycho-physiological mechanisms underlying computer game addiction are mainly stress coping mechanisms, emotional reactions, sensitization, and reward. Computer game playing may lead to long-term changes in the reward circuitry that resemble the effects of substance dependence. The brain imaging study showed that healthy control subjects had reduced dopamine D2 receptor occupancy of 10.5% in the caudate after playing a motorbike riding computer game compared with baseline levels of binding consistent with increased release and binding to its receptors. Ex-chronic "ecstasy" users showed no change in levels of dopamine D2 receptor occupancy after playing this game. This evidence supports the notion that psycho-stimulant users have decreased sensitivity to natural reward. Computer game addicts or gamblers may show reduced dopamine response to stimuli associated with their addiction presumably due to sensitization.

  2. VASCOMP 2. The V/STOL aircraft sizing and performance computer program. Volume 6: User's manual, revision 3

    Science.gov (United States)

    Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.

    1980-01-01

    This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflects the present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters, to enable investigation of a wide variety of aircraft. User oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.

  3. Security for small computer systems a practical guide for users

    CERN Document Server

    Saddington, Tricia

    1988-01-01

    Security for Small Computer Systems: A Practical Guide for Users is a guidebook for security concerns for small computers. The book provides security advice for the end-users of small computers in different aspects of computing security. Chapter 1 discusses the security and threats, and Chapter 2 covers the physical aspect of computer security. The text also talks about the protection of data, and then deals with the defenses against fraud. Survival planning and risk assessment are also encompassed. The last chapter tackles security management from an organizational perspective. The bo

  4. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations which can be easily traced through user-computer interactions data and concentrate more on user-world interactions events, typical of co-located and

  5. ARDS User Manual

    Science.gov (United States)

    Fleming, David P.

    2001-01-01

    Personal computers (PCs) are now used extensively for engineering analysis. Their capability exceeds that of mainframe computers of only a few years ago. Programs originally written for mainframes have been ported to PCs to make their use easier. One of these programs is ARDS (Analysis of Rotor Dynamic Systems), which was developed at Arizona State University (ASU) by Nelson et al. to quickly and accurately analyze rotor steady-state and transient response using the method of component mode synthesis. The original ARDS program was ported to the PC in 1995. Several extensions were made at ASU to increase the capability of mainframe ARDS. These extensions have also been incorporated into the PC version of ARDS. Each mainframe extension had its own user manual, generally covering only that extension. Thus exploiting the full capability of ARDS required a large set of user manuals. Moreover, necessary changes and enhancements for PC ARDS were undocumented. The present document is intended to remedy those problems by combining all pertinent information needed for the use of PC ARDS into one volume.

  6. An assessment of future computer system needs for large-scale computation

    Science.gov (United States)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  7. RAMONA-4B: a computer code with three-dimensional neutron kinetics for BWR and SBWR system transients - user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.; Mallen, A.N.; Neymotin, L.Y.

    1998-03-01

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) system transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, phase-flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for the steady-state and transient calculations. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for a sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs.

  8. Managing the Risks Associated with End-User Computing.

    Science.gov (United States)

    Alavi, Maryam; Weiss, Ira R.

    1986-01-01

    Identifies organizational risks of end-user computing (EUC) associated with different stages of the end-user applications life cycle (analysis, design, implementation). Generic controls are identified that address each of the risks enumerated in a manner that allows EUC management to select those most appropriate to their EUC environment. (5…

  9. Pattern of Ocular Diseases among Computer users in Enugu, Nigeria

    African Journals Online (AJOL)

    Seven subjects (1.3%) had monocular blindness with VA < 3/60, and 37 subjects (3.3%) had low vision with VA < 6/18-3/60. Conclusion: Most of the subjects were young people. Ocular disorders were encountered in computer users. The ocular health status of computer users can be improved through periodic ocular examination and ...

  10. Evaluating biomechanics of user-selected sitting and standing computer workstation.

    Science.gov (United States)

    Lin, Michael Y; Barbir, Ana; Dennerlein, Jack T

    2017-11-01

    A standing computer workstation has now become a popular modern workplace intervention to reduce sedentary behavior at work. However, users' interactions with a standing computer workstation, and how these differ from a sitting workstation, need to be understood to assist in developing recommendations for use and setup. The study compared the differences in upper extremity posture and muscle activity between user-selected sitting and standing workstation setups. Twenty participants (10 females, 10 males) volunteered for the study. 3-D posture, surface electromyography, and user-reported discomfort were measured while completing simulated tasks with each participant's self-selected workstation setups. The sitting computer workstation was associated with more non-neutral shoulder postures and greater shoulder muscle activity, while the standing computer workstation induced a greater wrist adduction angle and greater extensor carpi radialis muscle activity. The sitting computer workstation was also associated with greater shoulder abduction postural variation (90th-10th percentile), while the standing computer workstation was associated with greater variation in shoulder rotation and wrist extension. Users reported similar overall discomfort levels within the first 10 min of work but had more than twice as much discomfort while standing than sitting after 45 min, with most discomfort reported in the low back for standing and the shoulder for sitting. These different measures provide understanding of users' different interactions with sitting and standing workstations, and alternating between the two configurations in short bouts may be a way of changing the loading pattern on the upper extremity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Differences in prevalence of self-reported musculoskeletal symptoms among computer and non-computer users in a Nigerian population: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Ayanniyi O

    2010-08-01

    Background: Literature abounds on the prevalent nature of self-reported musculoskeletal symptoms (SRMS) among computer users, but studies that actually compare this with non-computer users are meagre, reducing the strength of the evidence. This study compared the prevalence of SRMS between computer and non-computer users and assessed the risk factors associated with SRMS. Methods: A total of 472 participants, comprising equal numbers of age- and sex-matched computer and non-computer users, were assessed for the presence of SRMS. Information concerning musculoskeletal symptoms and discomfort in the neck, shoulders, upper back, elbows, wrists/hands, low back, hips/thighs, knees, and ankles/feet was obtained using the standardized Nordic questionnaire. Results: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users both over the past 7 days (χ2 = 39.11, p = 0.001) and during the past 12 months (χ2 = 53.56, p = 0.001). The odds of reporting musculoskeletal symptoms were lowest for participants above the age of 40 years (OR = 0.42, 95% CI = 0.31-0.64 over the past 7 days; OR = 0.61, 95% CI = 0.47-0.77 during the past 12 months) and were also reduced in female participants. Increasing daily hours and accumulated years of computer use and tasks of data processing and designs/graphics significantly (p ... Conclusion: The prevalence of SRMS was significantly higher in the computer users than in the non-computer users, and younger age, being male, working longer hours daily, increasing years of computer use, data entry tasks, and computer designs/graphics were the significant risk factors for reporting musculoskeletal symptoms among the computer users. Computer use may explain the increase in prevalence of SRMS among the computer users.
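The 2x2 chi-square statistic and odds ratio of the kind reported in this record are straightforward to compute by hand. The sketch below uses hypothetical counts, not the study's data; the function names are illustrative.

```python
# Minimal sketch of a 2x2 prevalence comparison: Pearson chi-square
# (no continuity correction) and odds ratio. Counts are hypothetical.
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def odds_ratio(a, b, c, d):
    """Odds of symptoms in group 1 relative to group 2: (a/b) / (c/d)."""
    return (a * d) / (b * c)

# Hypothetical counts: 150 of 236 computer users symptomatic,
# 90 of 236 non-users symptomatic (total n = 472 as in the study design).
a, b = 150, 86    # computer users: symptomatic, asymptomatic
c, d = 90, 146    # non-users:      symptomatic, asymptomatic
print(round(chi_square_2x2(a, b, c, d), 2))  # → 30.52
print(round(odds_ratio(a, b, c, d), 2))      # → 2.83
```

With these made-up counts the statistic (30.52) far exceeds the 3.84 critical value for one degree of freedom at p = 0.05, the same pattern of significance the abstract reports.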

  13. Personal computer wallpaper user segmentation based on Sasang typology.

    Science.gov (United States)

    Lee, Joung-Youn

    2015-03-01

    As human-computer interaction (HCI) is becoming a significant part of all human life, the user's emotional satisfaction is an important factor to consider. These changes have been pointed out by several researchers, who claim that a user's personality may become the most important factor in the design. The objective of this study is to examine Sasang typology as a user segmentation method in the area of HCI design. To test HCI usage patterns in terms of the user's personality and temperament, this study focuses on personal computer (PC) or laptop wallpaper settings. One hundred and four Facebook friends completed a QSCC II survey assessing Sasang typology type and sent a captured image of their personal PC or laptop wallpaper. To classify the computer usage pattern, folder organization and wallpaper setting were investigated. The research showed that the So-Yang type organized folders and icons in an orderly manner, whereas the So-Eum type did not organize folders and icons at all. With regard to wallpaper settings, the So-Yang type used the default wallpaper provided by the PC, but the So-Eum type used landscape images. Because the So-Yang type was reported to be emotionally stable and extroverted, they tended to be highly concerned with online privacy compared with the So-Eum type. The So-Eum type's frequent use of landscape images as the background demonstrates So-Eum's low emotional stability, anxiety, and desire to find an analogue of nature on the computer screen. Also, So-Yang's wallpapers display family or peripheral figures, owing to the sociability that the extroverted So-Yang types possess. By proposing Sasang typology as a factor influencing HCI usage patterns, this study can be used to predict a user's HCI experience, or to suggest a native design methodology that can actively cope with the user's psychological environment.

  14. Computational aerodynamics requirements: The future role of the computer and the needs of the aerospace industry

    Science.gov (United States)

    Rubbert, P. E.

    1978-01-01

    The commercial airplane builder's viewpoint on the important issues involved in the development of improved computational aerodynamics tools, such as powerful computers optimized for fluid flow problems, is presented. The primary user of computational aerodynamics in a commercial aircraft company is the design engineer, who is concerned with solving practical engineering problems. From his viewpoint, the development of program interfaces and pre- and post-processing capability for new computational methods is just as important as the algorithms and machine architecture. As more and more details of the entire flow field are computed, the visibility of the output data becomes a major problem, which is then compounded when a design capability is added. The user must be able to see, understand, and interpret the calculated results. Enormous costs are expended because of the need to work with programs having only primitive user interfaces.

  15. Specifying computer-based counseling systems in health care: a new approach to user-interface and interaction design.

    Science.gov (United States)

    Herzberg, Dominikus; Marsden, Nicola; Kübler, Peter; Leonhardt, Corinna; Thomanek, Sabine; Jung, Hartmut; Becker, Annette

    2009-04-01

    Computer-based counseling systems in health care play an important role in the toolset available for medical doctors to inform, motivate and challenge their patients according to a well-defined therapeutic goal. The design, development and implementation of such systems require close collaboration between users, i.e. patients, and developers. While this is true of any software development process, it can be particularly challenging in the health counseling field, where there are multiple specialties and extremely heterogeneous user groups. In order to facilitate a structured design approach for counseling systems in health care, we developed (a) an iterative three-staged specification process, which enables early involvement of potential users in the development process, and (b) a specification language, which enables an author to consistently describe and define user interfaces and interaction designs in a stepwise manner. Due to the formal nature of our specifications, our implementation has some unique features, like early execution of prototypes, automated system generation and verification capabilities.

  16. LTCP 2D Graphical User Interface. Application Description and User's Guide

    Science.gov (United States)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross-reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  17. Understanding and Mastering Dynamics in Computing Grids Processing Moldable Tasks with User-Level Overlay

    CERN Document Server

    Moscicki, Jakub Tomasz

    Scientific communities are using a growing number of distributed systems, from local batch systems, community-specific services and supercomputers to general-purpose, global grid infrastructures. Increasing the research capabilities for science is the raison d'être of such infrastructures, which provide access to diversified computational, storage and data resources at large scales. Grids are rather chaotic, highly heterogeneous, decentralized systems where unpredictable workloads, component failures and variability of execution environments are commonplace. Understanding and mastering the heterogeneity and dynamics of such distributed systems is prohibitive for end users if they are not supported by appropriate methods and tools. The time cost to learn and use the interfaces and idiosyncrasies of different distributed environments is another challenge. Obtaining more reliable application execution times and boosting parallel speedup are important to increase the research capabilities of scientific communities.

  18. Design of an online health-promoting community: negotiating user community needs with public health goals and service capabilities.

    Science.gov (United States)

    Ekberg, Joakim; Timpka, Toomas; Angbratt, Marianne; Frank, Linda; Norén, Anna-Maria; Hedin, Lena; Andersen, Emelie; Gursky, Elin A; Gäre, Boel Andersson

    2013-07-04

    An online health-promoting community (OHPC) has the potential to promote health and advance new means of dialogue between public health representatives and the general public. The aim of this study was to examine what aspects of an OHPC are critical for satisfying the needs of the user community as well as public health goals and service capabilities. Community-based participatory research methods were used for data collection and analysis, and participatory design principles were used to develop a case-study OHPC for adolescents. Qualitative data from adolescents on health appraisals and perspectives on health information were collected in a Swedish health service region and classified into categories of user health information exchange needs. A composite design rationale for the OHPC was completed by linking the identified user needs, user-derived requirements, and technical and organizational systems solutions. Conflicts between end-user requirements and organizational goals and resources were identified. The most prominent health information needs were associated with food, exercise, and well-being. The assessment of the design rationale document and prototype in light of the regional public health goals and service capabilities showed that compromises were needed to resolve conflicts involving the management of organizational resources and responsibilities. The users wanted to discuss health issues with health experts, who had little time to set aside for the OHPC, and it was unclear who should set the norms for the online discussions. OHPCs can be designed to satisfy both the needs of user communities and public health goals and service capabilities. Compromises are needed to resolve conflicts between users' needs to discuss health issues with domain experts and the management of resources and responsibilities in public health organizations.

  19. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    Science.gov (United States)

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  20. Automated Generation of User Guidance by Combining Computation and Deduction

    Directory of Open Access Journals (Sweden)

    Walther Neuper

    2012-02-01

    Full Text Available Herewith, a fairly old concept is published for the first time and named "Lucas Interpretation". This has been implemented in a prototype, which has proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMA) based on Computer Theorem Proving (CTP). Automated Theorem Proving (ATP), i.e. deduction, is the most reliable technology used to check user input. However, ATP is inherently weak at automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP checks user input as incorrect and the learner gets stuck, the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation following a program written in a novel CTP-based programming language, i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific language interpreter, which works like a debugger and hands over control to the learner at breakpoints, i.e. at the tactics generating the steps of the calculation. The interpreter also builds up logical contexts providing ATP with the data required for checking user input, thus combining computation and deduction. The paper describes the concepts underlying Lucas Interpretation so that open questions can adequately be addressed, and prerequisites for further work are provided.
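The debugger-like interpreter described above can be caricatured in a few lines: a program is a list of tactics, each computing the next step; at each breakpoint the learner's proposed step is checked, and the computed step is offered as a hint on mismatch. The tactics, state encoding, and checker here are invented for illustration, not the CTP-based language of the paper:

```python
def run(program, state, learner_steps, check):
    """Execute tactics stepwise; at each breakpoint compare the learner's
    proposed step with the computed one and collect hints on mismatch."""
    hints = []
    for tactic, proposed in zip(program, learner_steps):
        expected = tactic(state)           # computation provides the next step
        if not check(expected, proposed):  # deduction-style check of user input
            hints.append(expected)         # guidance: suggest the computed step
        state = expected
    return state, hints

# Solve 2x + 3 = 7 with two tactics: subtract 3, then divide by 2.
# The learner gets the first step right and the second wrong.
program = [lambda s: ("2x", s[1] - 3), lambda s: ("x", s[1] / 2)]
final, hints = run(program, ("2x+3", 7), [("2x", 4), ("x", 3)], lambda e, p: e == p)
print(final, hints)  # ('x', 2.0) [('x', 2.0)]
```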

  1. Effectively identifying user profiles in network and host metrics

    Science.gov (United States)

    Murphy, John P.; Berk, Vincent H.; Gregorio-de Souza, Ian

    2010-04-01

    This work presents a collection of methods used to effectively identify users of computer systems based on their particular usage of software and the network. Not only are we able to identify individual computer users by their behavioral patterns, we are also able to detect significant deviations in their typical computer usage over time, or compared with a group of their peers. For instance, most people have a small and relatively unique selection of regularly visited websites, certain email services, daily work hours, and typical preferred applications for mandated tasks. We argue that these habitual patterns are sufficiently specific to identify fully anonymized network users. We demonstrate that with only a modest data collection capability, profiles of individual computer users can be constructed so as to uniquely identify a profiled user from among their peers. As time progresses and habits or circumstances change, the methods presented update each profile so that changes in user behavior can be reliably detected over both abrupt and gradual time frames, without losing the ability to identify the profiled user. The primary benefit of our methodology is the efficient detection of deviant behaviors, such as subverted user accounts or organizational policy violations. Thanks to their relative robustness, these techniques can be used in scenarios with very diverse data collection capabilities and data privacy requirements. In addition to behavioral change detection, the generated profiles can also be compared against pre-defined examples of known adversarial patterns.
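The matching of habitual patterns to a profiled user can be sketched very simply: build a frequency profile of observed events and compare profiles by cosine similarity. The event names and similarity choice below are illustrative; the actual system uses richer network and host metrics:

```python
# Toy behavioral profiling: frequency profiles of observed events,
# compared with cosine similarity to attribute an anonymous session.
import math
from collections import Counter

def profile(events):
    """Normalized frequency profile of observed events."""
    counts = Counter(events)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def cosine(p, q):
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(p.get(k, 0.0) * q.get(k, 0.0) for k in set(p) | set(q))
    norm = math.sqrt(sum(v * v for v in p.values())) * \
           math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

alice = profile(["mail", "wiki", "mail", "news", "mail"])
bob = profile(["games", "video", "games", "chat"])
observed = profile(["mail", "mail", "wiki", "news"])  # anonymous session

# The session matches Alice's habits far better than Bob's.
print(cosine(observed, alice) > cosine(observed, bob))  # True
```

A drop in similarity against a user's own historical profile would flag the kind of behavioral deviation (e.g. a subverted account) the abstract describes.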

  2. The Radiological Safety Analysis Computer Program (RSAC-5) user's manual

    International Nuclear Information System (INIS)

    Wenzel, D.R.

    1994-02-01

    The Radiological Safety Analysis Computer Program (RSAC-5) calculates the consequences of the release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory from either reactor operating history or nuclear criticalities. RSAC-5 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated through the inhalation, immersion, ground surface, and ingestion pathways. RSAC+, a menu-driven companion program to RSAC-5, assists users in creating and running RSAC-5 input files. This user's manual contains the mathematical models and operating instructions for RSAC-5 and RSAC+. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-5 and RSAC+. These programs are designed for users who are familiar with radiological dose assessment methods
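The decay-and-ingrowth calculation mentioned above is classically handled with the Bateman equations. A toy sketch for a two-member parent-daughter chain, with illustrative half-lives and inventory (this is not RSAC-5's code or data):

```python
# Bateman solution for a two-member decay chain (pure parent at t = 0).
# Nuclide parameters are made up for illustration.
import math

def lam(t_half):
    """Decay constant from half-life (use consistent time units)."""
    return math.log(2) / t_half

def decay_chain(n1_0, lam1, lam2, t):
    """Atoms of parent and daughter at time t (assumes lam1 != lam2)."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# e.g. parent half-life 8 days, daughter 0.5 days, evaluated after 2 days:
n1, n2 = decay_chain(1e20, lam(8.0), lam(0.5), 2.0)
```

Multiplying each nuclide's atom count by its decay constant gives activity, which codes of this kind then feed into the inhalation, immersion, ground-surface, and ingestion dose pathways.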

  3. CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner

    Science.gov (United States)

    Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold

    1991-01-01

    Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...

  4. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.

    2007-02-19

    On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the developer's team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  5. Cloud-based Architecture Capabilities Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Vang, Leng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    In the collaborative scientific research arena, it is important to have an environment where analysts have access to a shared set of documents and software tools and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews development of a Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  6. Efficient and anonymous two-factor user authentication in wireless sensor networks: achieving user anonymity with lightweight sensor computation.

    Science.gov (United States)

    Nam, Junghyun; Choo, Kim-Kwang Raymond; Han, Sangchul; Kim, Moonseong; Paik, Juryon; Won, Dongho

    2015-01-01

    A smart-card-based user authentication scheme for wireless sensor networks (hereafter referred to as a SCA-WSN scheme) is designed to ensure that only users who possess both a smart card and the corresponding password are allowed to gain access to sensor data and their transmissions. Despite many research efforts in recent years, it remains a challenging task to design an efficient SCA-WSN scheme that achieves user anonymity. The majority of published SCA-WSN schemes use only lightweight cryptographic techniques (rather than public-key cryptographic techniques) for the sake of efficiency, and have been demonstrated to suffer from the inability to provide user anonymity. Some schemes employ elliptic curve cryptography for better security but require sensors with strict resource constraints to perform computationally expensive scalar-point multiplications; despite the increased computational requirements, these schemes do not provide user anonymity. In this paper, we present a new SCA-WSN scheme that not only achieves user anonymity but also is efficient in terms of the computation loads for sensors. Our scheme employs elliptic curve cryptography but restricts its use only to anonymous user-to-gateway authentication, thereby allowing sensors to perform only lightweight cryptographic operations. Our scheme also enjoys provable security in a formal model extended from the widely accepted Bellare-Pointcheval-Rogaway (2000) model to capture the user anonymity property and various SCA-WSN specific attacks (e.g., stolen smart card attacks, node capture attacks, privileged insider attacks, and stolen verifier attacks).
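The lightweight, symmetric portion of such a two-factor scheme can be sketched with hash and HMAC operations alone. This is an illustrative toy of the "smart card + password" idea, not the paper's actual protocol (which additionally uses elliptic curve cryptography for anonymous user-to-gateway authentication):

```python
# Toy two-factor login: the gateway stores a verifier derived from BOTH
# the smart-card secret and the password, and checks a per-nonce proof.
# Names and message flow are invented for illustration.
import hashlib
import hmac
import os
import secrets

def h(*parts):
    return hashlib.sha256(b"|".join(parts)).digest()

# Registration: secret written to the smart card; verifier held by gateway.
card_secret = secrets.token_bytes(16)
password = b"correct horse"
verifier = h(card_secret, password)

def login(card_secret, password, nonce):
    """User side: prove possession of both factors for this nonce."""
    return hmac.new(h(card_secret, password), nonce, hashlib.sha256).digest()

def verify(verifier, nonce, proof):
    """Gateway side: recompute the proof and compare in constant time."""
    expected = hmac.new(verifier, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)

nonce = os.urandom(16)
print(verify(verifier, nonce, login(card_secret, password, nonce)))   # True
print(verify(verifier, nonce, login(card_secret, b"wrong", nonce)))   # False
```

Note that a scheme built only from such symmetric operations is exactly the kind the abstract says fails to provide user anonymity: the user's identity (or a stable verifier handle) must be sent so the gateway can look up the verifier, which is why the proposed scheme reserves ECC for the anonymous user-to-gateway step.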

  7. Efficient and anonymous two-factor user authentication in wireless sensor networks: achieving user anonymity with lightweight sensor computation.

    Directory of Open Access Journals (Sweden)

    Junghyun Nam

    Full Text Available A smart-card-based user authentication scheme for wireless sensor networks (hereafter referred to as a SCA-WSN scheme) is designed to ensure that only users who possess both a smart card and the corresponding password are allowed to gain access to sensor data and their transmissions. Despite many research efforts in recent years, it remains a challenging task to design an efficient SCA-WSN scheme that achieves user anonymity. The majority of published SCA-WSN schemes use only lightweight cryptographic techniques (rather than public-key cryptographic techniques) for the sake of efficiency, and have been demonstrated to suffer from the inability to provide user anonymity. Some schemes employ elliptic curve cryptography for better security but require sensors with strict resource constraints to perform computationally expensive scalar-point multiplications; despite the increased computational requirements, these schemes do not provide user anonymity. In this paper, we present a new SCA-WSN scheme that not only achieves user anonymity but also is efficient in terms of the computation loads for sensors. Our scheme employs elliptic curve cryptography but restricts its use only to anonymous user-to-gateway authentication, thereby allowing sensors to perform only lightweight cryptographic operations. Our scheme also enjoys provable security in a formal model extended from the widely accepted Bellare-Pointcheval-Rogaway (2000) model to capture the user anonymity property and various SCA-WSN specific attacks (e.g., stolen smart card attacks, node capture attacks, privileged insider attacks, and stolen verifier attacks).

  8. The effect of ergonomic training and intervention on reducing occupational stress among computer users

    Directory of Open Access Journals (Sweden)

    T. Yektaee

    2014-05-01

    Results: According to the covariance analysis, ergonomic training and interventions led to a reduction in the occupational stress of computer users. Conclusion: Training computer users and informing them of ergonomic principles, as well as providing interventions such as correcting posture, reducing work time, and using an armrest and footrest, would have significant implications for reducing occupational stress among computer users.

  9. Directory of computer users in nuclear medicine

    International Nuclear Information System (INIS)

    Henne, R.L.; Erickson, J.J.; McClain, W.J.; Kirch, D.L.

    1977-01-01

    The directory is composed of two major divisions, a Users' section and a Vendors' section. The Users' section consists of detailed installation descriptions and indexes to these descriptions. A typical description contains the name, address, type, and size of the institution as well as names of persons to contact. Following the hardware descriptions are listed the type of studies for which the computers are utilized, including the languages used, the method of output and an estimate of how often the study is performed. The Vendors' section contains short descriptions of current commercially available nuclear medicine systems as supplied by the vendors themselves. In order to reduce the amount of obsolete data and to include new institutions in future updates of the directory, a user questionnaire is included

  10. Dosimetry and health effects self-teaching curriculum: illustrative problems to supplement the user's manual for the Dosimetry and Health Effects Computer Code

    International Nuclear Information System (INIS)

    Runkle, G.E.; Finley, N.C.

    1983-03-01

    This document contains a series of sample problems for the Dosimetry and Health Effects Computer Code to be used in conjunction with the user's manual (Runkle and Cranwell, 1982) for the code. This code was developed at Sandia National Laboratories for the Risk Methodology for Geologic Disposal of Radioactive Waste program (NRC FIN A-1192). The purpose of this document is to familiarize the user with the code, its capabilities, and its limitations. When the user has finished reading this document, he or she should be able to prepare data input for the Dosimetry and Health Effects code and have some insights into interpretation of the model output

  11. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    Science.gov (United States)

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  12. Documentation and user's guide for DOSTOMAN: a pathways computer model of radionuclide movement

    International Nuclear Information System (INIS)

    Root, R.W. Jr.

    1980-01-01

    This report documents the mathematical development and the computer implementation of the Savannah River Laboratory computer code used to simulate radionuclide movement in the environment. The user's guide provides all the necessary information for the prospective user to input the required data, execute the computer program, and display the results.

  13. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures, or presenting research results.

  14. Advanced Simulation Capability for Environmental Management: Development and Demonstrations - 12532

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, Mark D.; Freedman, Vicky; Gorton, Ian [Pacific Northwest National Laboratory, MSIN K9-33, P.O. Box 999, Richland, WA 99352 (United States); Hubbard, Susan S. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, MS 50B-4230, Berkeley, CA 94720 (United States); Moulton, J. David; Dixon, Paul [Los Alamos National Laboratory, MS B284, P.O. Box 1663, Los Alamos, NM 87544 (United States)

    2012-07-01

    The U.S. Department of Energy Office of Environmental Management (EM), Technology Innovation and Development is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of capabilities, which are organized into Platform and Integrated Tool-sets and a High-Performance Computing Multi-process Simulator. The Platform capabilities target a level of functionality to allow end-to-end model development, starting with definition of the conceptual model and management of data for model input. The High-Performance Computing capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The new capabilities are demonstrated through working groups, including one focused on the Hanford Site Deep Vadose Zone. The ASCEM program focused on planning during the first year and executing a prototype tool-set for an early demonstration of individual components. Subsequently, ASCEM has focused on developing and demonstrating an integrated set of capabilities, making progress toward a version of the capabilities that can be used to engage end users. Demonstration of capabilities continues to be implemented through working groups. Three different working groups, one focused on EM problems in the deep vadose zone, another investigating attenuation mechanisms for metals and radionuclides, and a third focusing on waste tank performance assessment, continue to make progress. 

  15. User Participation and Participatory Design: Topics in Computing Education.

    Science.gov (United States)

    Kautz, Karlheinz

    1996-01-01

    Discusses user participation and participatory design in the context of formal education for computing professionals. Topics include the current curriculum debate; mathematical- and engineering-based education; traditional system-development training; and an example of a course program that includes computers and society, and prototyping. (53…

  16. Directory of computer users in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Henne, R.L.; Erickson, J.J.; McClain, W.J.; Kirch, D.L.

    1977-01-01

    The directory is composed of two major divisions, a Users' section and a Vendors' section. The Users' section consists of detailed installation descriptions and indexes to these descriptions. A typical description contains the name, address, type, and size of the institution as well as names of persons to contact. Following the hardware descriptions are listed the type of studies for which the computers are utilized, including the languages used, the method of output and an estimate of how often the study is performed. The Vendors' section contains short descriptions of current commercially available nuclear medicine systems as supplied by the vendors themselves. In order to reduce the amount of obsolete data and to include new institutions in future updates of the directory, a user questionnaire is included. (HLW)

  17. Graphical Visualization of Human Exploration Capabilities

    Science.gov (United States)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description
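The extract-sort-plot flow described above can be illustrated with a small sketch. This is not the actual SCOREboard program; the record fields, tier names, and sample data below are all hypothetical, chosen only to show a tiered sort followed by user-driven selection.

```python
# Hypothetical sketch of a SCOREboard-style pipeline: extract capability
# records from a repository, sort them through a tiered reduction, and
# select rows per user input. Field names and data are invented.

def tiered_sort(records, tiers=("system", "subsystem", "capability")):
    """Sort records by a hierarchy of tier keys, coarsest first."""
    return sorted(records, key=lambda r: tuple(r[t] for t in tiers))

def select(records, **criteria):
    """Filter records on user-specified field values."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

repo = [
    {"system": "EDL", "subsystem": "aeroshell", "capability": "TRL-4", "need_year": 2024},
    {"system": "ECLSS", "subsystem": "water", "capability": "TRL-6", "need_year": 2022},
    {"system": "EDL", "subsystem": "guidance", "capability": "TRL-5", "need_year": 2026},
]

roadmap = tiered_sort(select(repo, system="EDL"))
for row in roadmap:
    print(row["subsystem"], row["need_year"])
# → aeroshell 2024
# → guidance 2026
```

A plotting step would then render `roadmap` as a timeline; the data reduction itself is just sort-then-filter.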

  18. Integration of End-User Cloud Storage for CMS Analysis

    CERN Document Server

    Riahi, Hassen; Álvarez Ayllón, Alejandro; Balcas, Justas; Ciangottini, Diego; Hernández, José M; Keeble, Oliver; Magini, Nicolò; Manzi, Andrea; Mascetti, Luca; Mascheroni, Marco; Tanasijczuk, Andres Jorge; Vaandering, Eric Wayne

    2018-01-01

End-user Cloud storage is increasing rapidly in popularity in research communities thanks to the collaboration capabilities it offers, namely synchronisation and sharing. CERN IT has implemented a model of such storage named CERNBox, integrated with the CERN AuthN and AuthZ services. To exploit the use of end-user Cloud storage for distributed data analysis activity, the CMS experiment has started the integration of CERNBox as a Grid resource. This will allow CMS users to make use of their own storage in the Cloud for their analysis activities, as well as to benefit from synchronisation and sharing capabilities to achieve results faster and more effectively. It will provide an integration model of Cloud storage in the Grid, implemented and commissioned over the world's largest computing Grid infrastructure, the Worldwide LHC Computing Grid (WLCG). In this paper, we present the integration strategy and infrastructure changes needed in order to transparently integrate end-user Cloud storage with...

  19. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning, …) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  20. Advanced display object selection methods for enhancing user-computer productivity

    Science.gov (United States)

    Osga, Glenn A.

    1993-01-01

The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by: (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, or touchscreen; and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as those of notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit of applying this method to graphic user-interfaces is substantial, with the potential for increasing productivity across thousands of users and applications.

  1. PLEXFIN a computer model for the economic assessment of nuclear power plant life extension. User's manual

    International Nuclear Information System (INIS)

    2007-01-01

The IAEA developed PLEXFIN, a computer model intended to assist decision makers in assessing the economic viability of nuclear power plant life/licence extension. This user's manual was produced to facilitate the application of the PLEXFIN computer model. It is widely accepted in the industry that the operational life of a nuclear power plant is limited not by a pre-determined number of years, sometimes established on non-technical grounds, but by the capability of the plant to comply with nuclear safety and technical requirements in a cost-effective manner. The decision to extend the licence/life of a nuclear power plant involves a number of political, technical and economic issues. Economic viability is a cornerstone of the decision-making process. In a liberalized electricity market, the economic case for a nuclear power plant life/licence extension requires a more complex evaluation. This user's manual was elaborated in the framework of the IAEA's programmes on Continuous process improvement of NPP operating performance and on Models for analysis and capacity building for sustainable energy development, with the support of four consultants meetings.

  2. A Roadmap for NEAMS Capability Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, David E [ORNL

    2011-11-01

The vision of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program is to bring truly predictive modeling and simulation (M&S) capabilities to the nuclear engineering community in order to enable a new approach to the design and analysis of nuclear energy systems. From its inception, the NEAMS program has envisioned a broad user base for its software and scientific products, including researchers within the DOE complex, nuclear industry technology developers and vendors, and operators. However, activities to date have focused almost exclusively on interactions with NEAMS sponsors, who are also near-term users of NEAMS technologies. The task of the NEAMS Capability Transfer (CT) program element for FY2011 is to develop a comprehensive plan to support the program's needs for user outreach and technology transfer. In order to obtain community input to this plan, a 'NEAMS Capability Transfer Roadmapping Workshop' was held 4-5 April 2011 in Chattanooga, TN, and is summarized in this report. The 30 workshop participants represented the NEAMS program, the DOE and industrial user communities, and several outside programs. The workshop included a series of presentations providing an overview of the NEAMS program, presentations on the user outreach and technology transfer experiences of (1) the Advanced Simulation and Computing (ASC) program, (2) the Standardized Computer Analysis for Licensing Evaluation (SCALE) project, and (3) the Consortium for Advanced Simulation of Light Water Reactors (CASL), and discussion sessions. Based on the workshop and other discussions throughout the year, we make a number of recommendations on key areas for the NEAMS program to develop its user outreach and technology transfer activities: (1) Engage not only DOE but also industrial users sooner and more often; (2) Engage with the Nuclear Regulatory Commission to facilitate their understanding and acceptance of the NEAMS approach to predictive M&S; (3

  3. GADRAS-DRF 18.5 User's Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Horne, Steven M.; Thoreson, Gregory G; Theisen, Lisa A.; Mitchell, Dean J.; Harding, Lee; Amai, Wendy A.

    2014-12-01

    The Gamma Detector Response and Analysis Software - Detector Response Function (GADRAS-DRF) application computes the response of gamma-ray and neutron detectors to incoming radiation. This manual provides step-by-step procedures to acquaint new users with the use of the application. The capabilities include characterization of detector response parameters, plotting and viewing measured and computed spectra, analyzing spectra to identify isotopes, and estimating source energy distributions from measured spectra. GADRAS-DRF can compute and provide detector responses quickly and accurately, giving users the ability to obtain usable results in a timely manner (a matter of seconds or minutes).

  4. GADRAS-DRF 18.6 User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Horne, Steve M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Threat Science; Thoreson, Greg G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Threat Science; Theisen, Lisa A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Threat Science; Mitchell, Dean J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Threat Science; Harding, Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Nuclear Threat Science; Amai, Wendy A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Robotic Security Systems

    2016-05-01

    The Gamma Detector Response and Analysis Software–Detector Response Function (GADRAS-DRF) application computes the response of gamma-ray and neutron detectors to incoming radiation. This manual provides step-by-step procedures to acquaint new users with the use of the application. The capabilities include characterization of detector response parameters, plotting and viewing measured and computed spectra, analyzing spectra to identify isotopes, and estimating source energy distributions from measured spectra. GADRAS-DRF can compute and provide detector responses quickly and accurately, giving users the ability to obtain usable results in a timely manner (a matter of seconds or minutes).

  5. Investigating User Interfaces of Non-Iranian Digital Libraries based on Social Bookmarking Capabilities and Characteristics to Use by Iranian Digital Libraries

    Directory of Open Access Journals (Sweden)

    Zahra Naseri

    2016-08-01

The current study aims to investigate the status of user interfaces of non-Iranian digital libraries based on social bookmarking capabilities and characteristics, for use by Iranian digital libraries. This research studies the characteristics and capabilities of the user interfaces of the world's top digital libraries with respect to social bookmarking by library users. This capability facilitates producing, identifying, organizing, and sharing contents using tags. A survey method with a descriptive-analytical approach was used. The population comprised non-Iranian digital library interfaces; the interfaces of the top ten digital libraries were selected as the sample. A researcher-made checklist was prepared based on a literature review and an investigation of four distinguished websites (Library Thing, Delicious, Amazon, and Google Books). Face validity was evaluated using 10 experts' viewpoints, and reliability was calculated as 0.87. The findings of this study are important for two reasons: first, they provide a comprehensive and unambiguous vision for recognizing user interfaces' basic capabilities and characteristics based on social bookmarking; second, they can provide a base for designing digital libraries in Iran. The results showed that the majority of digital libraries around the world had not used web 2.0 characteristics such as producing, identifying, organizing, and sharing contents, except two digital libraries (Google Books and Ibiblio).

  6. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

A wide variety of computer-based artificial intelligence (AI) and decision support systems currently exist to aid in the assessment of toxicity for environmental chemicals. T...

  7. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    Science.gov (United States)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

The U.S. Department of Energy (DOE) is investing in the development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, including toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is the automated job launching and monitoring capability, which allows a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to users who may not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  8. A Model-Driven Approach to Graphical User Interface Runtime Adaptation

    OpenAIRE

    Criado, Javier; Vicente Chicote, Cristina; Iribarne, Luis; Padilla, Nicolás

    2010-01-01

    Graphical user interfaces play a key role in human-computer interaction, as they link the system with its end-users, allowing information exchange and improving communication. Nowadays, users increasingly demand applications with adaptive interfaces that dynamically evolve in response to their specific needs. Thus, providing graphical user interfaces with runtime adaptation capabilities is becoming more and more an important issue. To address this problem, this paper proposes a componen...

  9. Examining the Security Awareness, Information Privacy, and the Security Behaviors of Home Computer Users

    Science.gov (United States)

    Edwards, Keith

    2015-01-01

Attacks on computer systems continue to be a problem. The majority of the attacks target home computer users. To help mitigate the attacks, some companies provide security awareness training to their employees. However, not all people work for a company that provides security awareness training and typically, home computer users do not have the…

  10. COBRA-SFS [Spent Fuel Storage]: A thermal-hydraulic analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    Rector, D.R.; Cuta, J.M.; Lombardo, N.J.; Michener, T.E.; Wheeler, C.L.

    1986-11-01

COBRA-SFS (Spent Fuel Storage) is a general thermal-hydraulic analysis computer code used to predict temperatures and velocities in a wide variety of systems. The code was refined and specialized for spent fuel storage system analyses for the US Department of Energy's Commercial Spent Fuel Management Program. The finite-volume equations governing mass, momentum, and energy conservation are written for an incompressible, single-phase fluid. The flow equations model a wide range of conditions including natural circulation. The energy equations include the effects of solid and fluid conduction, natural convection, and thermal radiation. The COBRA-SFS code is structured to perform both steady-state and transient calculations; however, the transient capability has not yet been validated. This volume contains the input instructions for COBRA-SFS and an auxiliary radiation exchange factor code, RADX-1. It is intended to aid the user in becoming familiar with the capabilities and modeling conventions of the code.

  11. Users' guide for a personal-computer-based nuclear power plant fire data base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    1986-08-01

The Nuclear Power Plant Fire Data Base has been developed for use with an IBM XT (or with a compatible system). Nuclear power plant fire data are located in many diverse references, making them both costly and time-consuming to obtain. The purpose of this Fire Data Base is to collect nuclear power plant fire data and make them easily accessible. This users' guide discusses in depth the specific features and capabilities of the various options found in the data base. Capabilities include the ability to search several database fields simultaneously to meet user-defined conditions, display basic plant information, and determine the operating experience (in years) for several nuclear power plant locations. Step-by-step examples are included for each option to allow the user to learn how to access the data.

  12. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    Science.gov (United States)

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in the offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater than, equal to, or less than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P computer use. Trial registration no: ACTRN12617000326392.

  13. The Impact of User Interface on Young Children's Computational Thinking

    Science.gov (United States)

    Pugnali, Alex; Sullivan, Amanda; Bers, Marina Umaschi

    2017-01-01

    Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have on children's mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is a growing pressure to begin teaching…

  14. System and method for controlling power consumption in a computer system based on user satisfaction

    Science.gov (United States)

    Yang, Lei; Dick, Robert P; Chen, Xi; Memik, Gokhan; Dinda, Peter A; Shy, Alex; Ozisikyilmaz, Berkin; Mallik, Arindam; Choudhary, Alok

    2014-04-22

    Systems and methods for controlling power consumption in a computer system. For each of a plurality of interactive applications, the method changes a frequency at which a processor of the computer system runs, receives an indication of user satisfaction, determines a relationship between the changed frequency and the user satisfaction of the interactive application, and stores the determined relationship information. The determined relationship can distinguish between different users and different interactive applications. A frequency may be selected from the discrete frequencies at which the processor of the computer system runs based on the determined relationship information for a particular user and a particular interactive application running on the processor of the computer system. The processor may be adapted to run at the selected frequency.
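The selection logic described in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the patented implementation: the class name, the satisfaction scale, and the target threshold below are all invented, and a real governor would learn the relationship online rather than from a lookup table.

```python
# Hypothetical sketch: record user satisfaction at each tried CPU frequency,
# per (user, application) pair, then choose the lowest frequency whose
# recorded satisfaction meets a target. All names and values are invented.

class SatisfactionGovernor:
    def __init__(self, frequencies_mhz, target=0.8):
        self.frequencies = sorted(frequencies_mhz)
        self.target = target
        self.history = {}  # (user, app, freq) -> satisfaction in [0, 1]

    def record(self, user, app, freq, satisfaction):
        """Store the satisfaction a user reported for an app at a frequency."""
        self.history[(user, app, freq)] = satisfaction

    def choose(self, user, app):
        """Lowest (most power-frugal) frequency meeting the target, else max."""
        for f in self.frequencies:
            if self.history.get((user, app, f), 0.0) >= self.target:
                return f
        return self.frequencies[-1]

gov = SatisfactionGovernor([800, 1600, 2400])
gov.record("alice", "browser", 800, 0.5)   # too slow for this user/app
gov.record("alice", "browser", 1600, 0.9)  # satisfying
print(gov.choose("alice", "browser"))  # → 1600
```

Note how the per-(user, app) key captures the abstract's point that the learned relationship distinguishes between different users and different interactive applications.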

  15. Promoting the freedom of thought of mental health service users: Nussbaum's capabilities approach meets values-based practice.

    Science.gov (United States)

    Stenlund, Mari

    2017-08-09

This article clarifies how the freedom of thought as a human right can be understood and promoted as a right of mental health service users, especially people with psychotic disorders, by using Martha Nussbaum's capabilities approach and Fulford's and Fulford et al's values-based practice. According to Nussbaum, freedom of thought seems to primarily protect the capability to think, believe and feel. This capability can be promoted in the context of mental health services by values-based practice. The article points out that both Nussbaum's approach and values-based practice recognise that people's values differ. The idea of involving different actors and service users in mental healthcare is also common to both Nussbaum's approach and values-based practice. However, there are also differences, in that values-based practice relies on a 'good process' in decision-making, whereas the capabilities approach is oriented towards a 'right outcome'. Since process and outcome are linked with each other, these two approaches do not necessarily conflict despite this difference. The article suggests that absolute rights are possible within the two approaches. It also recognises that the capabilities approach, values-based practice and the human rights approach lean on liberal values and thus can be combined, at least in liberal societies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. A survey of common habits of computer users as indicators of ...

    African Journals Online (AJOL)

Other unhealthy practices found among computer users included eating (52.1%), drinking (56%), and coughing, sneezing and scratching of the head (48.2%). Since microorganisms can be transferred through contact, droplet or airborne routes, it follows that these habits exhibited by users may act as sources of bacteria on keyboards ...

  17. Prevalence of Asthenopia among computer users in Enugu ...

    African Journals Online (AJOL)

The majority of the subjects (96%) had good vision (VA of 6/6-6/18). Conclusion: The presence of ametropia is related to the occurrence of asthenopia. Correction of existing ametropia would contribute to the visual comfort of computer (VDT) users. Pre-employment and regular ocular examinations should be made accessible to those who ...

  18. Environments for online maritime simulators with cloud computing capabilities

    Science.gov (United States)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications, using remote ship scenarios and the automation of ship operations.

  19. Determining the frequency of dry eye in computer users and comparing with control group

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Davari

    2017-08-01

AIM: To determine the frequency of dry eye in computer users and to compare them with a control group. METHODS: This case-control study was conducted in 2015 in the city of Birjand. The sample size was estimated at 304 subjects (152 subjects in each group, the computer user group and the control group). A non-randomized method of sampling was used in both groups. The Schirmer test was used to evaluate dry eye. Subjects then completed a questionnaire developed based on the study objectives and a review of the literature. After collection, the data were entered into SPSS software and analyzed using the Chi-square test or Fisher's test at an alpha level of 0.05. RESULTS: In total, 304 subjects (152 in each group) were included in the study. The frequency of dry eye in the control group was 3.3% (5 subjects) and 61.8% in the computer user group (94 subjects); a significant difference was observed between the two groups in this regard (Pn=12, and it was 34.2% in computer users group(n=52, which significant difference was observed between two groups in this regard(PP=0.8. The mean working hours with a computer per day in patients with dry eye was 6.65±3.52h, while it was 1.62±2.54h in the healthy group (T=13.25, PCONCLUSION: This study showed a significant relationship between computer use and dry eye and ocular symptoms. Thus, officials need to pay particular attention to employees' working hours with computers, and should develop appropriate plans to divide working hours with computers among computer users. However, due to various confounding factors, it is recommended that these factors be controlled in future studies.
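The group comparison reported above (dry eye in 94 of 152 computer users vs. 5 of 152 controls, tested with chi-square) can be reproduced from the published counts. The sketch below is our own illustrative re-analysis, not the study's code; it computes the plain Pearson chi-square statistic for the 2x2 table without continuity correction.

```python
# Illustrative re-analysis of the reported counts with a 2x2 chi-square test.
# Not the study's code; no continuity correction applied.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Rows: computer users, controls; columns: dry eye, no dry eye.
chi2 = chi_square_2x2(94, 152 - 94, 5, 152 - 5)
print(round(chi2, 1))  # → 118.6
```

A statistic this large on 1 degree of freedom corresponds to a vanishingly small p-value, consistent with the significant difference the study reports.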

  20. ADPAC v1.0: User's Manual

    Science.gov (United States)

    Hall, Edward J.; Heidegger, Nathan J.; Delaney, Robert A.

    1999-01-01

The overall objective of this study was to evaluate the effects of turbulence models in a 3-D numerical analysis on the wake prediction capability. The current version of the computer code resulting from this study is referred to as ADPAC v7 (Advanced Ducted Propfan Analysis Codes - Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code used and modified under Task 15 of NASA Contract NAS3-27394. The ADPAC program is based on a flexible multiple-block and discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Turbulence models now available in the ADPAC code are: a simple mixing-length model, the algebraic Baldwin-Lomax model with user defined coefficients, the one-equation Spalart-Allmaras model, and a two-equation k-R model. The consolidated ADPAC code is capable of executing in either a serial or parallel computing mode from a single source code.

  1. An Approach for Indoor Path Computation among Obstacles that Considers User Dimension

    Directory of Open Access Journals (Sweden)

    Liu Liu

    2015-12-01

People often transport objects within indoor environments and need enough space for the motion. In such cases, the accessibility of indoor spaces relies on the dimensions of both a person and her/his operated objects. This paper proposes a new approach to avoid obstacles and compute indoor paths with respect to the user dimension. The approach excludes inaccessible spaces for a user in five steps: (1) compute the minimum distance between obstacles and find the inaccessible gaps; (2) group obstacles according to the inaccessible gaps; (3) identify groups of obstacles that influence the path between two locations; (4) compute boundaries for the selected groups; and (5) build a network in the accessible area around the obstacles in the room. Compared to the Minkowski sum method for outlining inaccessible spaces, the proposed approach generates simpler polygons for groups of obstacles that do not contain inner rings. The creation of a navigation network becomes easier based on these simple polygons. By using this approach, we can create user- and task-specific networks in advance. Alternatively, the accessible path can be generated on the fly before the user enters a room.
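The first two steps of the approach (find inaccessible gaps, then group obstacles) can be sketched under simplifying assumptions: obstacles as circles and the user as a required corridor width. The paper works with polygons; circles keep the sketch short, and the union-find grouping below is our own illustrative choice, not necessarily the authors'.

```python
# Sketch of steps (1)-(2): obstacles whose mutual gap is narrower than the
# user's width are merged into one group via union-find. Circles are an
# illustrative simplification of the paper's polygonal obstacles.
import math

def group_obstacles(circles, user_width):
    """circles: list of (x, y, radius). Returns a list of index groups."""
    parent = list(range(len(circles)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, (xi, yi, ri) in enumerate(circles):
        for j in range(i + 1, len(circles)):
            xj, yj, rj = circles[j]
            gap = math.hypot(xi - xj, yi - yj) - ri - rj  # minimum distance
            if gap < user_width:  # inaccessible gap: merge the two obstacles
                parent[find(i)] = find(j)

    groups = {}
    for i in range(len(circles)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Two obstacles with a 0.5-wide gap and one far away; a 0.8-wide user
# cannot pass between the first two, so they form one group.
print(group_obstacles([(0, 0, 1), (2.5, 0, 1), (10, 0, 1)], 0.8))
# → [[0, 1], [2]]
```

Step (4) would then compute a boundary around each merged group, and step (5) builds the navigation network in the remaining accessible area.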

  2. Electronic Commerce user manual

    Energy Technology Data Exchange (ETDEWEB)

    1992-04-10

This User Manual supports the Electronic Commerce Standard System. The Electronic Commerce Standard System is being developed for the Department of Defense by the Technology Information Systems Program at the Lawrence Livermore National Laboratory, operated by the University of California for the Department of Energy. The Electronic Commerce Standard System, or EC as it is known, provides the capability for organizations to conduct business electronically instead of through paper transactions. Electronic Commerce and Computer Aided Acquisition and Logistics Support are two major projects under the DoD's Corporate Information Management program, whose objective is to make DoD business transactions faster and less costly by using computer networks instead of paper forms and postage. EC runs on computers that use the UNIX operating system and provides a standard set of applications and tools that are bound together by a common command and menu system. These applications and tools may vary according to the requirements of the customer or location and may be customized to meet the specific needs of an organization. Local applications can be integrated into the menu system under the Special Databases & Applications option on the EC main menu. These local applications will be documented in the appendices of this manual. This integration capability provides users with a common environment of standard and customized applications.

  4. Training leads to increased auditory brain-computer interface performance of end-users with motor impairments.

    Science.gov (United States)

    Halder, S; Käthner, I; Kübler, A

    2016-02-01

    Auditory brain-computer interfaces are an assistive technology that can restore communication for motor-impaired end-users. Such non-visual brain-computer interface paradigms are of particular importance for end-users who may lose or have lost gaze control. We attempted to show that motor-impaired end-users can learn to control an auditory speller on the basis of event-related potentials. Five end-users with motor impairments, two of whom had additional visual impairments, participated in five sessions. We applied a newly developed auditory brain-computer interface paradigm with natural sounds and directional cues. Three of five end-users learned to select symbols using this method. Averaged over all five end-users, the information transfer rate increased by more than 1800% from the first session (0.17 bits/min) to the last session (3.08 bits/min). The two best end-users achieved information transfer rates of 5.78 bits/min and accuracies of 92%. Our results show that an auditory BCI with a combination of natural sounds and directional cues can be controlled by end-users with motor impairment. Training improves the performance of end-users to the level of healthy controls. To our knowledge, this is the first time end-users with motor impairments controlled an auditory brain-computer interface speller with such high accuracy and information transfer rates. Further, our results demonstrate that operating a BCI with event-related potentials benefits from training, and specifically that end-users may require more than one session to develop their full potential. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
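
    The information transfer rates quoted in this record are conventionally computed with Wolpaw's ITR formula; a minimal sketch follows. The 25-symbol matrix and one-selection-per-minute rate below are illustrative assumptions, not parameters taken from the study.

```python
import math

def bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Wolpaw's formula: information (bits) carried by one selection."""
    if accuracy >= 1.0:
        return math.log2(n_classes)
    if accuracy <= 1.0 / n_classes:
        return 0.0  # clamp at-chance (or worse) performance to zero bits
    p = accuracy
    return (math.log2(n_classes)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_classes - 1)))

def itr_bits_per_min(n_classes: int, accuracy: float,
                     selections_per_min: float) -> float:
    """Scale per-selection information by the selection rate."""
    return bits_per_selection(n_classes, accuracy) * selections_per_min

# Hypothetical example: 25 selectable symbols at 92% accuracy,
# one selection per minute.
print(round(itr_bits_per_min(25, 0.92, 1.0), 2))  # prints 3.87
```

    The actual bits/min reported in such studies also depends on the speller layout and stimulus timing, which this sketch does not model.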

  5. Dry eye syndrome among computer users

    Science.gov (United States)

    Gajta, Aurora; Turkoanje, Daniela; Malaescu, Iosif; Marin, Catalin-Nicolae; Koos, Marie-Jeanne; Jelicic, Biljana; Milutinovic, Vuk

    2015-12-01

    Dry eye syndrome is characterized by eye irritation due to changes in the tear film. Symptoms include itching, foreign body sensations, mucous discharge and transitory vision blurring. Less common symptoms include photophobia and eye tiredness. The aim of the work was to determine the quality of the tear film and the potential risk of ocular dryness in persons who spend more than 8 hours using computers, and possible correlations between severity of symptoms (dry eye symptom anamnesis) and clinical signs assessed by: Schirmer test I, TBUT (tear break-up time), TFT (tear ferning test). The results show that subjects using computers have significantly shorter TBUT (less than 5 s for 56% of subjects and less than 10 s for 37% of subjects), with TFT type II/III in 50% of subjects and type III in 31% of subjects, compared to computer non-users (TFT type I or II was present in 85.71% of subjects). Visual display terminal use of more than 8 hours daily has been identified as a significant risk factor for dry eye. All persons who spend substantial time using computers are advised to use artificial tear drops in order to minimize the symptoms of dry eye syndrome and prevent serious complications.

  6. AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual

    International Nuclear Information System (INIS)

    1977-10-01

    AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100 external node--200 branch capability; conversational, free-format input language; built-in junction, FET, MOS, and switch models; sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation effects analysis; special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETS, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table
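
    The "automatically generates and solves the circuit equations" step can be illustrated with the simplest DC case: nodal analysis, where a conductance matrix G and source vector i are assembled and G·v = i is solved for the node voltages. The two-node resistor network below is a hypothetical example, and the dense Gaussian elimination is only a stand-in for AITRAC's sparse-matrix algorithm.

```python
def solve(a, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    a = [row[:] for row in a]   # work on copies
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(a[r][c] * x[c] for c in range(r + 1, n))) / a[r][r]
    return x

# Hypothetical network: a 1 mA current source drives node 1;
# R1 = 1 kOhm links nodes 1-2, R3 = 2 kOhm (node 1) and R2 = 2 kOhm (node 2) go to ground.
g1, g2, g3 = 1 / 1000, 1 / 2000, 1 / 2000
G = [[g1 + g3, -g1],
     [-g1, g1 + g2]]
I = [1e-3, 0.0]
v1, v2 = solve(G, I)
print(round(v1, 3), round(v2, 3))  # node voltages: 1.2 0.8
```

    Nonlinear devices (diodes, transistors) are handled in such codes by re-linearizing and re-solving this system at each Newton iteration and time step.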

  7. Internet Shop Users: Computer Practices and Its Relationship to E-Learning Readiness

    OpenAIRE

    Jasper Vincent Q. Alontaga

    2018-01-01

    Access to computer technology is essential in developing 21st century skills. One venue that serves to bridge the gap in terms of access is internet shops (also known as cybercafés or internet cafés). As such, it is important to examine the type of activities internet shop users engage in and how these develop and relate to their e-learning readiness. This study examined the profile, computer practices and e-learning readiness of seventy-one (71) internet shop users. A researcher-made internet sh...

  8. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users have become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs) such as limited battery lifetime, limited processing capabilities, and limited storage capacity by offloading the execution and workload to richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as offloading method and level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.
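
    The core offload-or-not decision in such frameworks is often introduced with a back-of-the-envelope energy comparison: run locally, or pay the radio cost to ship the input and idle while the cloud computes. The sketch below uses that simple model; every parameter value is an illustrative assumption, not a number from the paper.

```python
# Toy energy model for the computation-offloading decision.

def local_energy_j(cycles, mobile_speed_hz, active_power_w):
    """Energy (J) to run the task on the mobile device itself."""
    return active_power_w * cycles / mobile_speed_hz

def offload_energy_j(data_bits, bandwidth_bps, tx_power_w,
                     cycles, cloud_speed_hz, idle_power_w):
    """Energy (J) to transmit the input, then idle while the cloud computes."""
    return (tx_power_w * data_bits / bandwidth_bps
            + idle_power_w * cycles / cloud_speed_hz)

# Hypothetical task: 2 Gcycles of work on 1e6 bits (125 KB) of input.
e_local = local_energy_j(2e9, mobile_speed_hz=1e9, active_power_w=0.9)
e_off = offload_energy_j(1e6, bandwidth_bps=5e6, tx_power_w=1.3,
                         cycles=2e9, cloud_speed_hz=1e10, idle_power_w=0.3)
print(e_local, e_off)                               # 1.8 J locally vs 0.32 J offloaded
print("offload" if e_off < e_local else "run locally")
```

    The same comparison flips for data-heavy, compute-light tasks or slow links, which is why real frameworks make the decision per task and per channel state, exactly the time-varying-channel issue the surveyed papers address.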

  9. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is an open-source and powerful numerical software, but leaves much to be desired in the field of user friendliness. In this thesis the basic operation of OpenFOAM will be introduced and the thesis will culminate in a graphical user interface written in PyQt. The graphical user interface will make the use of OpenFOAM simpler, and hopefully make this powerful tool more available for the gene...

  10. Directory of computer users in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, J.J.; Gurney, J.; McClain, W.J. (eds.)

    1979-09-01

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included. (PCS)

  11. Directory of computer users in nuclear medicine

    International Nuclear Information System (INIS)

    Erickson, J.J.; Gurney, J.; McClain, W.J.

    1979-09-01

    The Directory of Computer Users in Nuclear Medicine consists primarily of detailed descriptions and indexes to these descriptions. A typical Installation Description contains the name, address, type, and size of the institution and the names of persons within the institution who can be contacted for further information. If the department has access to a central computer facility for data analysis or timesharing, the type of equipment available and the method of access to that central computer is included. The dedicated data processing equipment used by the department in its nuclear medicine studies is described, including the peripherals, languages used, modes of data collection, and other pertinent information. Following the hardware descriptions are listed the types of studies for which the data processing equipment is used, including the language(s) used, the method of output, and an estimate of the frequency of the particular study. An Installation Index and an Organ Studies Index are also included

  12. RELAP5 nuclear plant analyzer capabilities

    International Nuclear Information System (INIS)

    Wagner, R.J.; Ransom, V.H.

    1982-01-01

    An interactive execution capability has been developed for the RELAP5 code which permits it to be used as a Nuclear Plant Analyzer. This capability has been demonstrated using a simplified primary and secondary loop model of a PWR. A variety of loss-of-feedwater accidents have been simulated using this model. The computer execution time on a CDC Cyber 176 is one half of the transient simulation time, so the results can be displayed in real time. The results of the demonstration problems are displayed in digital form on a color schematic of the plant model using a Tektronix 4027 CRT terminal. The interactive feature allows the user to enter commands in much the same manner as a reactor operator

  13. Wireless-Uplinks-Based Energy-Efficient Scheduling in Mobile Cloud Computing

    OpenAIRE

    Xing Liu; Chaowei Yuan; Zhen Yang; Enda Peng

    2015-01-01

    Mobile cloud computing (MCC) combines cloud computing and mobile internet to improve the computational capabilities of resource-constrained mobile devices (MDs). In MCC, mobile users can not only improve the computational capability of MDs but also reduce energy consumption by offloading mobile applications to the cloud. However, MCC faces the problem of energy efficiency because of time-varying channels when the offloading is being executed. In this paper, we address the issue of ener...

  14. MPSalsa a finite element computer program for reacting flow problems. Part 2 - user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Salinger, A.; Devine, K.; Hennigan, G.; Moffat, H. [and others]

    1996-09-01

    This manual describes the use of MPSalsa, an unstructured finite element (FE) code for solving chemically reacting flow problems on massively parallel computers. MPSalsa has been written to enable the rigorous modeling of the complex geometry and physics found in engineering systems that exhibit coupled fluid flow, heat transfer, mass transfer, and detailed reactions. In addition, considerable effort has been made to ensure that the code makes efficient use of the computational resources of massively parallel (MP), distributed memory architectures in a way that is nearly transparent to the user. The result is the ability to simultaneously model both three-dimensional geometries and flow as well as detailed reaction chemistry in a timely manner on MP computers, an ability we believe to be unique. MPSalsa has been designed to allow the experienced researcher considerable flexibility in modeling a system. Any combination of the momentum equations, energy balance, and an arbitrary number of species mass balances can be solved. The physical and transport properties can be specified as constants, as functions, or taken from the Chemkin library and associated database. Any of the standard set of boundary conditions and source terms can be adapted by writing user functions, for which templates and examples exist.

  15. DISTRIBUTED COMPUTING SUPPORT CONTRACT USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all varieties of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire, which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired, although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link ...

  16. DISTRIBUTED COMPUTING SUPPORT SERVICE USER SURVEY

    CERN Multimedia

    2001-01-01

    IT Division operates a Distributed Computing Support Service, which offers support to owners and users of all varieties of desktops throughout CERN as well as more dedicated services for certain groups, divisions and experiments. It also provides the staff who operate the central and satellite Computing Helpdesks, it supports printers throughout the site and it provides the installation activities of the IT Division PC Service. We have published a questionnaire, which seeks to gather your feedback on how the services are seen, how they are progressing and how they can be improved. Please take a few minutes to fill in this questionnaire. Replies will be treated in confidence if desired, although you may also request an opportunity to be contacted by CERN's service management directly. Please tell us if you met problems but also if you had a successful conclusion to your request for assistance. You will find the questionnaire at the web site http://wwwinfo/support/survey/desktop-contract There will also be a link...

  17. The Impact of User Interface on Young Children’s Computational Thinking

    Directory of Open Access Journals (Sweden)

    Amanda Sullivan

    2017-07-01

    Full Text Available Aim/Purpose: Over the past few years, new approaches to introducing young children to computational thinking have grown in popularity. This paper examines the role that user interfaces have on children’s mastery of computational thinking concepts and positive interpersonal behaviors. Background: There is growing pressure to begin teaching computational thinking at a young age. This study explores the affordances of two very different programming interfaces for teaching computational thinking: a graphical coding application on the iPad (ScratchJr) and a tangible programmable robotics kit (KIBO). Methodology: This study used a mixed-method approach to explore the learning experiences that young children have with tangible and graphical coding interfaces. A sample of children ages four to seven (N = 28) participated. Findings: Results suggest that the type of user interface does have an impact on children’s learning, but is only one of many factors that affect positive academic and socio-emotional experiences. Tangible and graphical interfaces each have qualities that foster different types of learning.

  18. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    International Nuclear Information System (INIS)

    Johnstad, H.

    1991-01-01

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards

  19. Study on User Authority Management for Safe Data Protection in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Su-Hyun Kim

    2015-03-01

    Full Text Available In cloud computing environments, user data are encrypted using numerous distributed servers before storing such data. Global Internet service companies, such as Google and Yahoo, recognized the importance of Internet service platforms and conducted self-research and development to create and utilize large cluster-based cloud computing platform technology based on low-priced commercial nodes. As diverse data services become possible in distributed computing environments, high-capacity distributed management is emerging as a major issue. Meanwhile, because of the diverse forms of using high-capacity data, security vulnerability and privacy invasion by malicious attackers or internal users can occur. As such, when various sensitive data are stored in cloud servers and used from there, the problem of data leakage might occur because of external attackers or the poor management of internal users. Data can be managed through encryption to prevent such problems. However, existing simple encryption methods involve problems associated with the management of access to data stored in cloud environments. Therefore, in the present paper, a technique for data access management by user authority, based on Attribute-Based Encryption (ABE) and secret distribution techniques, is proposed.
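
    The "secret distribution" ingredient mentioned above is typically a threshold scheme such as Shamir's secret sharing, in which a secret becomes the constant term of a random polynomial over a prime field and any t evaluation points reconstruct it. The sketch below is a toy illustration of that general technique (field size and parameters are arbitrary choices), not the paper's actual construction.

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is done mod P

def split(secret, n_shares, threshold):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation of the polynomial
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, n_shares=5, threshold=3)
print(reconstruct(shares[:3]) == 123456789)  # any 3 of the 5 shares suffice
```

    Fewer than `threshold` shares reveal nothing about the secret, which is what makes the scheme attractive for storing key material across multiple cloud servers.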

  20. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  1. Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide

    Science.gov (United States)

    Bartrand, Timothy A.; Willis, Edward A.

    1993-01-01

    This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.

  2. Present capabilities and new developments in antenna modeling with the numerical electromagnetics code NEC

    Energy Technology Data Exchange (ETDEWEB)

    Burke, G.J.

    1988-04-08

    Computer modeling of antennas, since its start in the late 1960s, has become a powerful and widely used tool for antenna design. Computer codes have been developed based on the Method-of-Moments, Geometrical Theory of Diffraction, or integration of Maxwell's equations. Of such tools, the Numerical Electromagnetics Code-Method of Moments (NEC) has become one of the most widely used codes for modeling resonant-sized antennas. There are several reasons for this, including the systematic updating and extension of its capabilities, extensive user-oriented documentation and accessibility of its developers for user assistance. The result is that there are estimated to be several hundred users of various versions of NEC world wide. 23 refs., 10 figs.

  3. [Musculoskeletal disorders among university student computer users].

    Science.gov (United States)

    Lorusso, A; Bruno, S; L'Abbate, N

    2009-01-01

    Musculoskeletal disorders are a common problem among computer users. Many epidemiological studies have shown that ergonomic factors and aspects of work organization play an important role in the development of these disorders. We carried out a cross-sectional survey to estimate the prevalence of musculoskeletal symptoms among university students using personal computers and to investigate the features of occupational exposure and the prevalence of symptoms throughout the study course. Another objective was to assess the students' level of knowledge of computer ergonomics and the relevant health risks. A questionnaire was distributed to 183 students attending the lectures for the second and fourth year courses of the Faculty of Architecture. Data concerning personal characteristics, ergonomic and organizational aspects of computer use, and the presence of musculoskeletal symptoms in the neck and upper limbs were collected. Exposure to risk factors such as daily duration of computer use, time spent at the computer without breaks, duration of mouse use and poor workstation ergonomics was significantly higher among students of the fourth year course. Neck pain was the most commonly reported symptom (69%), followed by hand/wrist (53%), shoulder (49%) and arm (8%) pain. The prevalence of symptoms in the neck and hand/wrist area was significantly higher in the students of the fourth year course. In our survey we found a high prevalence of musculoskeletal symptoms among university students using computers for long periods on a daily basis. Exposure to computer-related ergonomic and organizational risk factors, and the prevalence of musculoskeletal symptoms, both seem to increase significantly throughout the study course. Furthermore, we found that the level of perception of computer-related health risks among the students was low. Our findings suggest the need for preventive intervention consisting of education in computer ergonomics.

  4. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  5. Retrieving the aerosol complex refractive index using PyMieScatt: A Mie computational package with visualization capabilities

    Science.gov (United States)

    Sumlin, Benjamin J.; Heinson, William R.; Chakrabarty, Rajan K.

    2018-01-01

    The complex refractive index m = n + ik of a particle is an intrinsic property which cannot be directly measured; it must be inferred from its extrinsic properties such as the scattering and absorption cross-sections. Bohren and Huffman called this approach "describing the dragon from its tracks", since the inversion of Lorenz-Mie theory equations is intractable without the use of computers. This article describes PyMieScatt, an open-source module for Python that contains functionality for solving the inverse problem for complex m using extensive optical and physical properties as input, and calculating regions where valid solutions may exist within the error bounds of laboratory measurements. Additionally, the module has comprehensive capabilities for studying homogeneous and coated single spheres, as well as ensembles of homogeneous spheres with user-defined size distributions, making it a complete tool for studying the optical behavior of spherical particles.
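
    The inverse problem PyMieScatt addresses can be illustrated in miniature: search over candidate m = n + ik until a forward model reproduces the measured optical "tracks". To stay self-contained, the sketch below substitutes the small-particle (Rayleigh) approximation for full Lorenz-Mie theory and uses a brute-force grid search, so it is a conceptual stand-in for the approach, not PyMieScatt's actual algorithm; the particle values are hypothetical.

```python
def rayleigh_efficiencies(m, x):
    """Rayleigh-limit scattering/absorption efficiencies for size parameter x << 1."""
    a = (m * m - 1) / (m * m + 2)            # Lorenz-Lorentz polarizability factor
    q_sca = (8.0 / 3.0) * x**4 * abs(a) ** 2
    q_abs = 4.0 * x * a.imag
    return q_sca, q_abs

def invert(q_sca_meas, q_abs_meas, x):
    """Grid search for the complex m matching both measured efficiencies."""
    best, best_err = None, float("inf")
    for ni in range(100, 201):               # n in [1.00, 2.00], step 0.01
        for ki in range(0, 101):             # k in [0.00, 1.00], step 0.01
            m = complex(ni / 100, ki / 100)
            qs, qa = rayleigh_efficiencies(m, x)
            err = (qs - q_sca_meas) ** 2 + (qa - q_abs_meas) ** 2
            if err < best_err:
                best, best_err = m, err
    return best

truth = complex(1.55, 0.10)                  # hypothetical weakly absorbing aerosol
qs, qa = rayleigh_efficiencies(truth, x=0.3) # synthetic "measured" tracks
print(invert(qs, qa, x=0.3))                 # recovers (1.55+0.1j)
```

    Real inversions must also handle measurement error bounds and the possibility of multiple or no solutions, which is precisely the solution-region machinery the module provides.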

  6. Viscous wing theory development. Volume 2: GRUMWING computer program user's manual

    Science.gov (United States)

    Chow, R. R.; Ogilvie, P. L.

    1986-01-01

    This report is a user's manual which describes the operation of the computer program, GRUMWING. The program computes the viscous transonic flow over three-dimensional wings using a boundary layer type viscid-inviscid interaction approach. The inviscid solution is obtained by an approximate factorization (AFZ) method for the full potential equation. The boundary layer solution is based on integral entrainment methods.

  7. StarTrax --- The Next Generation User Interface

    Science.gov (United States)

    Richmond, Alan; White, Nick

    StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e. bulletins, catalogs, proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System), later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method of accessing StarTrax. Use it if you prefer the point-and-click metaphor of modern GUI technology, to the classical command-line interfaces (CLI). Notable strengths include: easy to use; excellent portability; very robust server support; feedback button on every dialog; painstakingly crafted User Guide. It is designed to support a large number of input devices including terminals, workstations and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on: OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), or Macintosh, or MS-Windows (DOS), or character systems.

  8. Computer systems experiences of users with and without disabilities an evaluation guide for professionals

    CERN Document Server

    Borsci, Simone; Federici, Stefano; Mele, Maria Laura

    2013-01-01

    This book provides the necessary tools for the evaluation of the interaction between the user who is disabled and the computer system that was designed to assist that person. The book creates an evaluation process that is able to assess the user's satisfaction with a developed system. Presenting a new theoretical perspective in the human computer interaction evaluation of disabled persons, it takes into account all of the individuals involved in the evaluation process.

  9. Real time recording system of radioisotopes by local area network (LAN) computer system and user input processing

    International Nuclear Information System (INIS)

    Shinohara, Kunio; Ito, Atsushi; Kawaguchi, Hajime; Yanase, Makoto; Uno, Kiyoshi.

    1991-01-01

    A computer-assisted real time recording system was developed for the management of radioisotopes. The system is composed of two personal computers forming a LAN, an identification-card (ID-card) reader, and an electrically operated door lock. One computer is operated by radiation safety staff and stores the records of radioisotopes. The users of radioisotopes are registered in this computer. Another computer is installed in front of the storage room for radioisotopes. This computer is readied for operation by a registered ID-card and accepts data input by the user. After the completion of data input, the door to the storage room is unlocked. The present system provides the following merits: Radiation safety staff can easily keep up with the present state of radioisotopes in the storage room and save much labor. Radioactivity is always corrected. The upper limit of radioactivities in use per day is automatically checked and users are regulated when they input the amounts to be used. Users can obtain storage records of radioisotopes at any time. In addition, the system is applicable to facilities which have more than two storage rooms. (author)

  10. Users guide for NRC145-2 accident assessment computer code

    International Nuclear Information System (INIS)

    Pendergast, M.M.

    1982-08-01

    An accident assessment computer code has been developed for use at the Savannah River Plant. This computer code is based upon NRC Regulatory Guide 1.145, which provides guidance for accident assessments for power reactors. The code contains many options so that the user may utilize the code for many different assessments. For example, the code can be used for non-nuclear assessments, such as sulfur dioxide releases, which may be required by the EPA. A description of the code is contained in DP-1646. This document is a compilation of step-by-step instructions on how to use the code on the SRP IBM 3308 computer. This document consists of a number of tables which contain copies of computer listings. Some of the computer listings are copies of input; other listings give examples of computer output
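
    Assessments of this kind center on the relative concentration chi/Q at a downwind receptor. A minimal ground-level, centerline Gaussian-plume sketch follows; the sigma fits are rough Briggs-style rural class-D assumptions, and RG 1.145's actual procedure adds building-wake and plume-meander corrections that this toy version omits.

```python
import math

def briggs_rural_d(x_m):
    """Approximate rural sigma_y, sigma_z (m) for Pasquill class D (neutral)."""
    sig_y = 0.08 * x_m / math.sqrt(1 + 0.0001 * x_m)
    sig_z = 0.06 * x_m / math.sqrt(1 + 0.0015 * x_m)
    return sig_y, sig_z

def chi_over_q(x_m, u_ms):
    """Ground-level, centerline chi/Q (s/m^3) for a ground release, no wake credit."""
    sy, sz = briggs_rural_d(x_m)
    return 1.0 / (math.pi * u_ms * sy * sz)

# chi/Q falls off with downwind distance; e.g. at a 1 m/s wind:
for x in (100.0, 500.0, 1000.0):
    print(f"{x:6.0f} m  {chi_over_q(x, 1.0):.2e} s/m^3")
```

    Multiplying chi/Q by the release rate Q (e.g. Bq/s) gives the receptor concentration, which is the quantity the assessment code ultimately reports for each pathway.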

  11. Evaluation of the Factors which Contribute to the Ocular Complaints in Computer Users.

    Science.gov (United States)

    Agarwal, Smita; Goel, Dishanter; Sharma, Anshu

    2013-02-01

    Use of information technology hardware has taken professional success rates to new heights and saves time, but on the other hand its harmful effects have introduced an array of health-related complaints. Increased use of computers has led to an increase in the number of patients with ocular complaints, which are being grouped together as computer vision syndrome (CVS). In view of that, this study was undertaken to find out the ocular complaints, and the factors contributing to the occurrence of such problems, in computer users at Teerthanker Mahaveer University, Moradabad, U.P., India. This was a community-based cross-sectional study of subjects who work on computers for varying periods of time. Two hundred computer operators working in different institutes, offices, and banks of Teerthanker Mahaveer University were selected randomly. Of these, 16 were non-responders, 18 did not come for assessment, and 16 were excluded due to complaints prior to computer use, making the non-response rate 25%. The remaining subjects (n = 150) were asked to fill in a pre-tested questionnaire after obtaining their verbal consent. Depending on the average hours of usage in a day, they were categorized into three categories, the highest being more than 6 hrs of usage. All responders were asked to come to the ophthalmic OPD for further interview and assessment. Simple proportions and the Chi-square test were used for analysis. Among the 150 subjects studied, the major ocular complaint reported was eyestrain (53%). Eye strain (53.8%), itching (47.6%), and burning (66.7%) occurred in subjects using computers for more than 6 hours. Distance from the computer screen with respect to the eyes, use of an antiglare screen, taking frequent breaks, use of an LCD monitor and adjustment of brightness of

  12. Musculoskeletal Problems Associated with University Students Computer Users: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Rakhadani PB

    2017-07-01

    While several studies have examined the prevalence and correlates of musculoskeletal problems among university students, scanty information exists in the South African context. The objective of this study was to determine the prevalence, causes, and consequences of musculoskeletal problems among University of Venda student computer users. This cross-sectional study involved 694 university students at the University of Venda. A self-designed questionnaire was used to collect information on sociodemographic characteristics, problems associated with computer use, and causes of musculoskeletal problems associated with computer use. The majority (84.6%) of the participants use the computer for the internet, followed by word processing (20.3%) and games (18.7%). The students reported pain when using the computer in the neck (52.3%), shoulder (47.0%), fingers (45.0%), lower back (43.1%), general body (42.9%), elbow (36.2%), wrist (33.7%), hip and foot (29.1%), and knee (26.2%). Reported causes of musculoskeletal pain associated with computer usage were sitting position, low chairs, a lot of time spent on the computer, uncomfortable laboratory chairs, and stressfulness. Eye problems (51.9%), muscle cramp (44.0%), headache (45.3%), blurred vision (38.0%), feeling of illness (39.9%), and missed lectures (29.1%) were consequences of musculoskeletal problems linked to computer use. The majority of students reported having mild pain (43.7%), followed by moderate (24.2%) and severe (8.4%) pain. Years of computer use were significantly associated with neck, shoulder, and wrist pain. Using the computer for the internet was significantly associated with neck pain (OR=0.60; 95% CI 0.40-0.93); games with neck (OR=0.60; 95% CI 0.40-0.85) and hip/foot pain (OR=0.60; 95% CI 0.40-0.92); programming with elbow (OR=1.78; 95% CI 1.10-2.94) and wrist pain (OR=2.25; 95% CI 1.36-3.73); while word processing was significantly associated with lower back pain (OR=1.45; 95% CI 1.03-2.04). Undergraduate study had a significant association with elbow pain (OR=2

  13. Evolution of user analysis on the grid in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00218990; The ATLAS collaboration; Dewhurst, Alastair

    2017-01-01

    More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and system capability to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have been proven to provide a solid distributed analysis system for ATLAS users. Typical user workflows on the grid, and their associated metrics, are discussed. Measurements of user job performance and typical requirements are also shown.

  14. MPC Related Computational Capabilities of ARMv7A Processors

    DEFF Research Database (Denmark)

    Frison, Gianluca; Jørgensen, John Bagterp

    2015-01-01

    In recent years, the mass market of mobile devices has pushed the demand for increasingly fast but cheap processors. ARM, the world leader in this sector, has developed the Cortex-A series of processors with focus on computationally intensive applications. If properly programmed, these processors...... are powerful enough to solve the complex optimization problems arising in MPC in real-time, while keeping the traditional low-cost and low-power consumption. This makes these processors ideal candidates for use in embedded MPC. In this paper, we investigate the floating-point capabilities of Cortex A7, A9...... and A15 and show how to exploit the unique features of each processor to obtain the best performance, in the context of a novel implementation method for the linear-algebra routines used in MPC solvers. This method adapts high-performance computing techniques to the needs of embedded MPC. In particular...

  15. Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis

    Science.gov (United States)

    Kopp, H.; Trettau, R.; Zolotar, B.

    1984-01-01

    The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.

  16. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided.

  17. Basic user guide for the radwaste treatment plant computer system

    International Nuclear Information System (INIS)

    Keel, A.

    1990-07-01

    This guide has been produced as an aid to using the Radwaste Treatment Plant computer system. It is designed to help new users to use the database menu system. Some of the forms can be used in ways different from those explained and more complex queries can be performed. (UK)

  18. Relative User Ratings of MMPI-2 Computer-Based Test Interpretations

    Science.gov (United States)

    Williams, John E.; Weed, Nathan C.

    2004-01-01

    There are eight commercially available computer-based test interpretations (CBTIs) for the Minnesota Multiphasic Personality Inventory-2 (MMPI-2), of which few have been empirically evaluated. Prospective users of these programs have little scientific data to guide choice of a program. This study compared ratings of these eight CBTIs. Test users…

  19. Computational physics and applied mathematics capability review June 8-10, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Stephen R [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the Laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections), as follows. Theme 1: Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the Laboratory. Theme 2: Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution

  20. User's Manual for SPECTROM-41: a Finite-Element Heat Transfer Program

    International Nuclear Information System (INIS)

    Svalstad, D.K.

    1983-06-01

    This User's Manual addresses SPECTROM-41: A Finite Element Heat Transfer Computer Program. The user is introduced to the program's capabilities and operation, with the required user input outlined in detail. Example problems are included to illustrate the use of the various program features, and analytical solutions are presented for four of the examples to provide a measure of program accuracy. Past and ongoing comparative benchmark analyses are highlighted to give the user an indication of how SPECTROM-41 predictions compare with those of other available heat transfer programs.

  1. Mining Emerging Patterns for Recognizing Activities of Multiple Users in Pervasive Computing

    DEFF Research Database (Denmark)

    Gu, Tao; Wu, Zhanqing; Wang, Liang

    2009-01-01

    Understanding and recognizing human activities from sensor readings is an important task in pervasive computing. Existing work on activity recognition mainly focuses on recognizing activities for a single user in a smart home environment. However, in real life, there are often multiple inhabitants...... activity models, and propose an Emerging Pattern based Multi-user Activity Recognizer (epMAR) to recognize both single-user and multiuser activities. We conduct our empirical studies by collecting real-world activity traces done by two volunteers over a period of two weeks in a smart home environment...... sensor readings in a home environment, and propose a novel pattern mining approach to recognize both single-user and multi-user activities in a unified solution. We exploit Emerging Pattern – a type of knowledge pattern that describes significant changes between classes of data – for constructing our...
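An Emerging Pattern is conventionally characterized by its growth rate, the ratio of its support between two classes of data; a minimal sketch under that textbook definition (the toy sensor events below are ours, not from the study):

```python
def support(pattern, dataset):
    """Fraction of instances (sets of sensor events) containing the pattern."""
    pattern = set(pattern)
    return sum(pattern <= set(inst) for inst in dataset) / len(dataset)

def growth_rate(pattern, class_a, class_b):
    """Growth rate of a pattern from class A to class B; inf if absent in A
    but present in B (a so-called jumping emerging pattern)."""
    sa, sb = support(pattern, class_a), support(pattern, class_b)
    if sa == 0:
        return float("inf") if sb > 0 else 0.0
    return sb / sa

# Toy sensor traces for two activities
cooking = [{"stove_on", "fridge_open"}, {"stove_on", "tap_on"}]
sleeping = [{"bed_pressure"}, {"bed_pressure", "lamp_off"}]
print(growth_rate({"stove_on"}, sleeping, cooking))  # inf
```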

  2. Computer users' ergonomics and quality of life - evidence from a developing country.

    Science.gov (United States)

    Ahmed, Ishfaq; Shaukat, Muhammad Zeeshan

    2018-06-01

    This study is aimed at investigating the quality of workplace ergonomics at various Pakistani organizations and the quality of life of computer users working in these organizations. Two hundred and thirty-five computer users (only those employees who have to do most of their job tasks on a computer or laptop, and at their office) responded by filling out a questionnaire covering workplace ergonomics and quality of life. The findings revealed that the ergonomics at those organizations were poor and unfavourable. The quality of life (both physical and mental health) of respondents was poor for employees who had an unfavourable ergonomic environment. The findings thus highlight an important issue prevalent in Pakistani work settings.

  3. GOCE User Toolbox and Tutorial

    Science.gov (United States)

    Knudsen, Per; Benveniste, Jerome

    2017-04-01

    The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products.
GUT supports applications in geodesy, oceanography, and solid Earth physics. The GUT Tutorial provides information
and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced
computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations,
and Mac. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT
Install Guide. A set of a priori data and models is made available as well. Without any doubt the development
of the GOCE user toolbox has played a major role in paving the way to successful use of the GOCE data for
oceanography. GUT version 2.2 was released in April 2014; besides some bug fixes, it added the capability to compute the Simple Bouguer Anomaly (solid Earth). During this fall a new GUT version 3 has been released. GUTv3 was further developed through a collaborative effort in which the scientific communities participated, aiming
at an implementation of the remaining functionalities to facilitate a wider span of research in the fields of geodesy,
oceanography, and solid Earth studies.
Accordingly, GUT version 3 has:
 - An attractive and easy-to-use Graphical User Interface (GUI) for the toolbox,
 - Further software functionalities, such as support for the use of gradients,
anisotropic diffusive filtering, and computation of Bouguer and isostatic gravity anomalies,
 - An associated GUT VCM tool for analyzing the GOCE variance-covariance matrices.
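As a textbook-level illustration of the Simple Bouguer Anomaly capability mentioned above (GUT's actual algorithms are specified in its Algorithm Description; this sketch uses only the standard infinite-slab approximation):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def bouguer_slab_mgal(height_m, density_kg_m3=2670.0):
    """Infinite-slab (Bouguer plate) correction 2*pi*G*rho*h,
    converted from m/s^2 to mGal (1 mGal = 1e-5 m/s^2)."""
    return 2 * math.pi * G * density_kg_m3 * height_m / 1e-5

def simple_bouguer_anomaly(free_air_mgal, height_m, density_kg_m3=2670.0):
    """Simple Bouguer anomaly: free-air anomaly minus the plate correction."""
    return free_air_mgal - bouguer_slab_mgal(height_m, density_kg_m3)

# A station 100 m above the reference surface at crustal density 2670 kg/m^3
print(round(bouguer_slab_mgal(100.0), 1))  # 11.2 mGal
```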

  4. Support for User Interfaces for Distributed Systems

    Science.gov (United States)

    Eychaner, Glenn; Niessner, Albert

    2005-01-01

    An extensible Java(TradeMark) software framework supports the construction and operation of graphical user interfaces (GUIs) for distributed computing systems typified by ground control systems that send commands to, and receive telemetric data from, spacecraft. Heretofore, such GUIs have been custom built for each new system at considerable expense. In contrast, the present framework affords generic capabilities that can be shared by different distributed systems. Dynamic class loading, reflection, and other run-time capabilities of the Java language and JavaBeans component architecture enable the creation of a GUI for each new distributed computing system with a minimum of custom effort. By use of this framework, GUI components in control panels and menus can send commands to a particular distributed system with a minimum of system-specific code. The framework receives, decodes, processes, and displays telemetry data; custom telemetry data handling can be added for a particular system. The framework supports saving and later restoration of users' configurations of control panels and telemetry displays with a minimum of effort in writing system-specific code. GUIs constructed within this framework can be deployed in any operating system with a Java run-time environment, without recompilation or code changes.

  5. On-line computing in a classified environment

    International Nuclear Information System (INIS)

    O'Callaghan, P.B.

    1982-01-01

    Westinghouse Hanford Company (WHC) recently developed a Department of Energy (DOE) approved real-time, on-line computer system to control nuclear material. The system simultaneously processes both classified and unclassified information. Implementation of this system required the application of many security techniques. The system has a secure but user-friendly interface. Many software applications protect the integrity of the database from malevolent or accidental errors. Programming practices ensure the integrity of the computer system software. The audit trail and the report generation capability record user actions and the status of the nuclear material inventory.

  6. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users Manual

    International Nuclear Information System (INIS)

    Schrader, Bradley J.

    2009-01-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for the inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios and radiological sabotage events, and to evaluate safety basis accident consequences. This user's manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
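As an illustration of the decay modeling such codes perform, the standard exponential decay law is sketched below (RSAC-7 additionally handles ingrowth of daughter products via decay-chain equations, which this sketch omits):

```python
import math

def decay(activity_bq, half_life_s, elapsed_s):
    """Activity remaining after elapsed_s of radioactive decay:
    A(t) = A0 * exp(-lambda * t), with lambda = ln(2) / half-life."""
    decay_constant = math.log(2) / half_life_s
    return activity_bq * math.exp(-decay_constant * elapsed_s)

# After exactly one half-life, activity halves
print(round(decay(1000.0, 8.02 * 86400, 8.02 * 86400), 3))  # 500.0
```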

  7. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, the basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes the capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, retaining information for ≥30 frames, and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time
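The motion patterns mentioned above are conventionally summarized by kinematic parameters such as VCL, VSL, and LIN; a minimal sketch of how they derive from a tracked head trajectory (illustrative only, not any particular commercial system's implementation):

```python
import math

def casa_velocities(track, frame_dt):
    """Standard CASA motion parameters from a sperm head track:
    a list of (x, y) positions in micrometres, one per video frame."""
    path = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    straight = math.dist(track[0], track[-1])
    duration = (len(track) - 1) * frame_dt
    vcl = path / duration            # curvilinear velocity, um/s
    vsl = straight / duration        # straight-line velocity, um/s
    lin = vsl / vcl if vcl else 0.0  # linearity, VSL/VCL
    return vcl, vsl, lin

# Zig-zag track sampled at 60 frames per second
track = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]
vcl, vsl, lin = casa_velocities(track, 1 / 60)
```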

  8. Localized Ambient Solidity Separation Algorithm Based Computer User Segmentation

    Science.gov (United States)

    Sun, Xiao; Zhang, Tongda; Chai, Yueting; Liu, Yi

    2015-01-01

    Most popular clustering methods typically make some strong assumptions about the dataset. For example, k-means implicitly assumes that all clusters come from spherical Gaussian distributions which have different means but the same covariance. However, when dealing with datasets that have diverse distribution shapes or high dimensionality, these assumptions might not be valid anymore. In order to overcome this weakness, we proposed a new clustering algorithm named localized ambient solidity separation (LASS) algorithm, using a new isolation criterion called centroid distance. Compared with other density-based isolation criteria, our proposed centroid distance isolation criterion addresses the problem caused by high dimensionality and varying density. The experiment on a designed two-dimensional benchmark dataset shows that our proposed LASS algorithm not only inherits the advantage of the original dissimilarity increments clustering method to separate naturally isolated clusters but also can identify clusters which are adjacent, overlapping, and under background noise. Finally, we compared our LASS algorithm with the dissimilarity increments clustering method on a massive computer user dataset with over two million records that contains demographic and behavioral information. The results show that the LASS algorithm works extremely well on this computer user dataset and can gain more knowledge from it. PMID:26221133
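The nearest-centroid assignment underlying the k-means assumption criticized above can be sketched as follows (a simplification of ours, not the LASS algorithm itself):

```python
import math

def assign_to_centroids(points, centroids):
    """k-means-style hard assignment: each point goes to its nearest centroid.
    This implicitly assumes roughly spherical, equal-variance clusters,
    which is the assumption the abstract's centroid distance criterion relaxes."""
    return [min(range(len(centroids)),
                key=lambda k: math.dist(p, centroids[k]))
            for p in points]

# Two well-separated blobs: nearest-centroid assignment recovers them
points = [(0, 0), (0.5, 0.2), (10, 10), (10.3, 9.8)]
labels = assign_to_centroids(points, [(0.25, 0.1), (10.15, 9.9)])
print(labels)  # [0, 0, 1, 1]
```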

  9. Test and control computer user's guide for a digital beam former test system

    Science.gov (United States)

    Alexovich, Robert E.; Mallasch, Paul G.

    1992-01-01

    A Digital Beam Former Test System was developed to determine the effects of noise, interferers, and distortions, and of digital implementations of beam forming, as applied to the Tracking and Data Relay Satellite 2 (TDRS 2) architectures. The investigation of digital beam forming with application to TDRS 2 architectures, as described in TDRS 2 advanced concept design studies, was conducted by the NASA/Lewis Research Center for NASA/Goddard Space Flight Center. A Test and Control Computer (TCC) was used as the main controlling element of the Digital Beam Former Test System. The Test and Control Computer User's Guide for a Digital Beam Former Test System provides an organized description of the Digital Beam Former Test System commands. It is written for users who wish to conduct tests of the digital beam forming test processor using the TCC. The document describes the function, use, and syntax of the TCC commands available to the user while summarizing and demonstrating the use of the commands within DOS batch files.

  10. Octopus: LLL's computing utility

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  11. The diagnostic value and accuracy of conjunctival impression cytology, dry eye symptomatology, and routine tear function tests in computer users.

    Science.gov (United States)

    Bhargava, Rahul; Kumar, Prachi; Kaur, Avinash; Kumar, Manjushri; Mishra, Anurag

    2014-07-01

    To compare the diagnostic value and accuracy of the dry eye scoring system (DESS), conjunctival impression cytology (CIC), tear film breakup time (TBUT), and Schirmer's test in computer users. A case-control study was done at two referral eye centers. Eyes of 344 computer users were compared to 371 eyes of age- and sex-matched controls. The dry eye questionnaire (DESS) was administered to both groups, and they further underwent measurement of TBUT, Schirmer's test, and CIC. Correlation analysis was performed between DESS, CIC, TBUT, and Schirmer's test scores; a Pearson's coefficient of the linear expression (R(2)) of 0.5 or more was considered statistically significant. The mean age in cases (26.05 ± 4.06 years) was comparable to controls (25.67 ± 3.65 years) (P = 0.465). The mean symptom score was significantly higher in computer users than in controls, and abnormal TBUT, Schirmer's, and CIC scores were significantly more frequent in computer users, as compared to 8%, 6.7%, and 7.3% of symptomatic controls, respectively. On correlation analysis, there was a significant (inverse) association of dry eye symptoms (DESS) with TBUT and CIC scores (R(2) > 0.5), in contrast to Schirmer's scores. The duration of computer usage had a significant effect on dry eye symptom severity, TBUT, and CIC scores, as compared to Schirmer's test. DESS should be used in combination with TBUT and CIC for dry eye evaluation in computer users.
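The 0.5 threshold above refers to the squared Pearson coefficient, which can be computed as follows (variable names and sample values are illustrative, not study data):

```python
import math

def pearson_r2(xs, ys):
    """Squared Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return (cov / (sx * sy)) ** 2

# A perfectly inverse relation (higher symptom score, lower TBUT) gives R^2 = 1
symptom_scores = [1, 2, 3, 4, 5]
tbut_seconds = [15, 12, 9, 6, 3]
print(round(pearson_r2(symptom_scores, tbut_seconds), 3))  # 1.0
```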

  12. Predictive Capability Maturity Model for computational modeling and simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
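A sketch of how an assessment over the six elements might be recorded (the report presents maturity as a table of attributes per element; the minimum-level summary below is our own illustrative convention, not part of the PCMM):

```python
# The six PCMM elements, each rated on four maturity levels (0-3)
PCMM_ELEMENTS = (
    "representation and geometric fidelity",
    "physics and material model fidelity",
    "code verification",
    "solution verification",
    "model validation",
    "uncertainty quantification and sensitivity analysis",
)

def pcmm_summary(ratings):
    """Validate an assessment and return (minimum level, sorted per-element table)."""
    assert set(ratings) == set(PCMM_ELEMENTS), "must rate all six elements"
    assert all(0 <= v <= 3 for v in ratings.values()), "levels run 0-3"
    return min(ratings.values()), sorted(ratings.items())

# Example: five elements at level 2, model validation lagging at level 1
ratings = dict.fromkeys(PCMM_ELEMENTS, 2)
ratings["model validation"] = 1
level, table = pcmm_summary(ratings)
print(level)  # 1
```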

  13. Work related perceived stress and muscle activity during standardized computer work among female computer users

    DEFF Research Database (Denmark)

    Larsman, P; Thorn, S; Søgaard, K

    2009-01-01

    The current study investigated the associations between work-related perceived stress and surface electromyographic (sEMG) parameters (muscle activity and muscle rest) during standardized simulated computer work (typing, editing, precision, and Stroop tasks). It was part of the European case......-control study, NEW (Neuromuscular assessment in the Elderly Worker). The present cross-sectional study was based on a questionnaire survey and sEMG measurements among Danish and Swedish female computer users aged 45 or older (n=49). The results show associations between work-related perceived stress...... and trapezius muscle activity and rest during standardized simulated computer work, and provide partial empirical support for the hypothesized pathway of stress induced muscle activity in the association between an adverse psychosocial work environment and musculoskeletal symptoms in the neck and shoulder....

  14. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes; with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges, best summarized as the necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the materials science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of user cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  15. Personal computer versus personal computer/mobile device combination users' preclinical laboratory e-learning activity.

    Science.gov (United States)

    Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro

    2017-11-01

    The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course, and their login data were tracked. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and the devices they had used to access the site. Combination user access was significantly more frequent than PC-only access during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient use of time between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.

  16. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    Science.gov (United States)

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  17. LANCE Usage and User Analysis: Creating a Better System that Meets User Needs

    Science.gov (United States)

    Davies, D.; Boller, R. A.; Schmaltz, J. E.; Murphy, K. J.; Ilavajhala, S.; Ullah, A.; Joshi, T.

    2012-12-01

    This paper describes known uses of NASA's Land Atmosphere Near-real time Capability for EOS (LANCE) data and imagery and summarizes findings from informal interviews with LANCE users, undertaken to better understand their needs. LANCE, the NRT component of EOSDIS, provides products from MODIS, AIRS, OMI and MLS within 3 hours of satellite observation. LANCE has in excess of 50,000 unique anonymous users per month using data and imagery for wildfire management, air quality measurements, shipping, agricultural forecasting, as well as monitoring volcanic plumes, dust storms, smoke plumes and floods. Users can be categorized as end users or as brokers who may repackage the imagery and pass it on to their own end users. Interactions with a sample of end users found the following: users like MODIS Rapid Response imagery but do not want to be confined to pre-defined subsets; they want a broader selection of imagery, and those with higher bandwidth want the capability to pull imagery into their own web-mapping or GIS applications. Users with lower bandwidth want the capability to define their own areas-of-interest for simple download of an image file. Users also expressed a desire to see historic as well as near-real time data, so they can compare the current situation to the recent past. Users want download capabilities to enable information to be shared quickly and easily. These and other findings are being fed back to EOSDIS developers who are creating tools and services to better meet user needs. The findings from users have been valuable in ensuring that developers are on track. The most recent offerings, available at http://earthdata.nasa.gov/lance, are Worldview - a web-based client which provides the capability to interactively browse full-resolution, global, near real-time satellite imagery from 50+ data products from LANCE, and the Global Imagery Browse Services (GIBS) which enables both users and brokers to pull the latest imagery into their own web mapping

  18. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Science.gov (United States)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  19. Visibility Aspects Importance of User Interface Reception in Cloud Computing Applications with Increased Automation

    OpenAIRE

    Haxhixhemajli, Denis

    2012-01-01

    Visibility aspects of User Interfaces are important; they deal with the crucial phase of human-computer interaction. They allow users to perform and at the same time hide the complexity of the system. Acceptance of new systems depends on how visibility aspects of the User Interfaces are presented. Human eyes make the first contact with the appearance of any system by so it generates the very beginning of the human – application interaction. In this study it is enforced that visibility aspects...

  20. User's Guide for TOUGH2-MP - A Massively Parallel Version of the TOUGH2 Code

    International Nuclear Information System (INIS)

    Earth Sciences Division; Zhang, Keni; Zhang, Keni; Wu, Yu-Shu; Pruess, Karsten

    2008-01-01

    TOUGH2-MP is a massively parallel (MP) version of the TOUGH2 code, designed for computationally efficient parallel simulation of isothermal and nonisothermal flows of multicomponent, multiphase fluids in one-, two-, and three-dimensional porous and fractured media. In recent years, computational requirements have become increasingly intensive in large or highly nonlinear problems for applications in areas such as radioactive waste disposal, CO2 geological sequestration, environmental assessment and remediation, reservoir engineering, and groundwater hydrology. The primary objective of developing the parallel-simulation capability is to significantly improve the computational performance of the TOUGH2 family of codes. The particular goal for the parallel simulator is to achieve orders-of-magnitude improvement in computational time for models with ever-increasing complexity. TOUGH2-MP is designed to perform parallel simulation on multi-CPU computational platforms. An earlier version of TOUGH2-MP (V1.0) was based on TOUGH2 Version 1.4 with the EOS3, EOS9, and T2R3D modules, software previously qualified for applications in the Yucca Mountain project, and was designed for execution on CRAY T3E and IBM SP supercomputers. The current version of TOUGH2-MP (V2.0) includes all fluid property modules of the standard version TOUGH2 V2.0. It provides computationally efficient capabilities using supercomputers, Linux clusters, or multi-core PCs, and also offers many user-friendly features. The parallel simulator inherits all process capabilities from V2.0 together with additional capabilities for handling fractured media from V1.4. This report provides a quick-start guide on how to set up and run the TOUGH2-MP program for users with a basic knowledge of running the (standard) version of the TOUGH2 code. The report also gives a brief technical description of the code, including a discussion of parallel methodology, code structure, and the mathematical and numerical methods used.

  1. Productivity associated with visual status of computer users.

    Science.gov (United States)

    Daum, Kent M; Clore, Katherine A; Simms, Suzanne S; Vesely, Jon W; Wilczek, Dawn D; Spittle, Brian M; Good, Greg W

    2004-01-01

    The aim of this project is to examine the potential connection between the astigmatic refractive corrections of subjects using computers and their productivity and comfort. We hypothesize that improving the visual status of subjects using computers results in greater productivity, as well as improved visual comfort. Inclusion criteria required subjects 19 to 30 years of age with complete vision examinations before being enrolled. Using a double-masked, placebo-controlled, randomized design, subjects completed three experimental tasks calculated to assess the effects of refractive error on productivity (time to completion and the number of errors) at a computer. The tasks resembled those commonly undertaken by computer users and involved visual search tasks of: (1) counties and populations; (2) nonsense word search; and (3) a modified text-editing task. Estimates of productivity for time to completion varied from a minimum of 2.5% upwards to 28.7% with 2 D cylinder miscorrection. Assuming a conservative estimate of an overall 2.5% increase in productivity with appropriate astigmatic refractive correction, our data suggest a favorable cost-benefit ratio of at least 2.3 for the visual correction of an employee (total cost 268 dollars) with a salary of 25,000 dollars per year. We conclude that astigmatic refractive error affected both productivity and visual comfort under the conditions of this experiment. These data also suggest a favorable cost-benefit ratio for employers who provide computer-specific eyewear to their employees.

  2. A user's guide to the POPFOOD computer code for evaluating ingestion collective doses

    International Nuclear Information System (INIS)

    Nair, S.; Palamountain, J.

    1980-09-01

    A complete description is given of the wide range of user options available for running the POPFOOD computer code, which was developed for the calculation of annual ingestion collective doses from routine atmospheric discharges of radioactivity in the UK. The various options have been depicted pictorially to allow the prospective user to obtain a rapid appreciation of their scope. Facilities for temporarily modifying the library and input data are also described. In addition, input and output data for a sample test case, covering a broad range of the various available options, are provided to facilitate programme testing. POPFOOD is written in Fortran IV (level H). The programme is compiled under release 20.6, OPT=2 on the IBM 370/165 computer. (author)

  3. Security and health protection while working with a computer. Survey into the knowledge of users about legal and other requirements.

    OpenAIRE

    Šmejkalová, Petra

    2005-01-01

    This bachelor thesis is aimed at the knowledge of general computer users with regard to work security and health protection. It summarizes the relevant legislation and recommendations of ergonomic specialists. The practical part analyses the results of a survey which examined computer workplaces and user habits when working with a computer.

  4. Decision Making Under Uncertainty - Bridging the Gap Between End User Needs and Science Capability

    Science.gov (United States)

    Verdon-Kidd, D. C.; Kiem, A.; Austin, E. K.

    2012-12-01

    broker' would be to package, translate (both from end user to scientist and scientist to end user) and transform climate information. Importantly, communication of uncertainty needs to be improved so that end users are aware of all the caveats and what can realistically be expected from climate science now and in the near future. Overall this study confirmed that there is indeed a 'gap' between end users' needs and science capability, particularly with respect to uncertainty, communication and packaging of climate information. This 'gap' has been a barrier to successful climate change adaptation in the past. While it is unrealistic to think we could ever close the 'gap' completely, based on the recommendations provided in this paper, it may be possible to bridge the 'gap' (or at least improve people's awareness of it). Furthermore, the insights gained and recommendations provided by this study, while based on an Australian context, are likely to be applicable to many other regions of the world grappling with similar issues.

  5. Comparison of tests of accommodation for computer users.

    Science.gov (United States)

    Kolker, David; Hutchinson, Robert; Nilsen, Erik

    2002-04-01

    With the increased use of computers in the workplace and at home, optometrists are finding more patients presenting with symptoms of Computer Vision Syndrome. Among these symptomatic individuals, research supports that accommodative disorders are the most common vision finding. A prepresbyopic group (N = 30) and a presbyopic group (N = 30) were selected from a private practice. Assignment to a group was determined by age, accommodative amplitude, and near visual acuity with their distance prescription. Each subject was given a thorough vision and ocular health examination, then administered several nearpoint tests of accommodation at a computer working distance. All the tests produced similar results in the presbyopic group. For the prepresbyopic group, the tests yielded very different results. To effectively treat symptomatic VDT users, optometrists must assess the accommodative system along with the binocular and refractive status. For presbyopic patients, all nearpoint tests studied will yield virtually the same result. However, the method of testing accommodation, as well as the test stimulus presented, will yield significantly different responses for prepresbyopic patients. Previous research indicates that a majority of patients prefer the higher plus prescription yielded by the Gaussian image test.

  6. A user's guide to LUGSAN II. A computer program to calculate and archive lug and sway brace loads for aircraft-carried stores

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, W.N. [Sandia National Labs., Albuquerque, NM (United States). Mechanical and Thermal Environments Dept.

    1998-03-01

    LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh Hypercard database to function both as an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (Hypercard 2.2 or later) and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user's manual, a brief overview of the LUGSAN II computer code is included with suggested resources for programmers.

  7. Implementing interactive computing in an object-oriented environment

    Directory of Open Access Journals (Sweden)

    Frederic Udina

    2000-04-01

    Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.

  8. Metals Processing Laboratory Users (MPLUS) Facility Annual Report: October 1, 2000 through September 30, 2001

    Energy Technology Data Exchange (ETDEWEB)

    Angelini, P

    2004-04-27

    The Metals Processing Laboratory Users Facility (MPLUS) is a Department of Energy (DOE), Energy Efficiency and Renewable Energy, Industrial Technologies Program user facility designated to assist researchers in key industries, universities, and federal laboratories in improving energy efficiency, improving environmental performance, and increasing competitiveness. The goal of MPLUS is to provide access to the specialized technical expertise and equipment needed to solve metals processing issues that limit the development and implementation of emerging metals processing technologies. The scope of work can also extend to other types of materials. MPLUS has four primary User Centers: (1) Processing--casting, powder metallurgy, deformation processing (extrusion, forging, rolling), melting, thermomechanical processing, and high density infrared processing; (2) Joining--welding, monitoring and control, solidification, brazing, and bonding; (3) Characterization--corrosion, mechanical properties, fracture mechanics, microstructure, nondestructive examination, computer-controlled dilatometry, and emissivity; (4) Materials/Process Modeling--mathematical design and analyses, high performance computing, process modeling, solidification/deformation, microstructure evolution, thermodynamics and kinetics, and materials data bases. A fully integrated approach provides researchers with unique opportunities to address technologically related issues to solve metals processing problems and probe new technologies. Access is also available to 16 additional Oak Ridge National Laboratory (ORNL) user facilities ranging from state-of-the-art materials characterization capabilities and high performance computing to manufacturing technologies. MPLUS can be accessed through a standardized User-submitted Proposal and a User Agreement. Nonproprietary (open) or proprietary proposals can be submitted. For open research and development, access to capabilities is provided free of charge while

  9. SensorWeb 3G: Extending On-Orbit Sensor Capabilities to Enable Near Realtime User Configurability

    Science.gov (United States)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Tran, Daniel; Davies, Ashley; Sullivan, Don; Ames, Troy

    2010-01-01

    This research effort prototypes an implementation of a standard interface, the Web Coverage Processing Service (WCPS), an Open Geospatial Consortium (OGC) standard, to enable users to define, test, upload and execute algorithms for on-orbit sensor systems. The user is able to customize the on-orbit data products that result from the raw data streaming from an instrument. This extends the SensorWeb 2.0 concept, developed under a previous Advanced Information System Technology (AIST) effort, in which web services wrap sensors and a standardized Extensible Markup Language (XML) based scripting workflow language orchestrates processing steps across multiple domains. SensorWeb 3G extends the concept by giving the user control over the flight software modules associated with the on-orbit sensor, and thus provides a degree of flexibility which does not presently exist. The successful demonstrations to date will be presented, including a realistic HyspIRI decadal mission testbed. Furthermore, benchmarks that were run will also be presented, along with future demonstration and benchmark tests planned. Finally, we conclude with implications for the future and how this concept dovetails with efforts to develop "cloud computing" methods and standards.

  10. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of complex-number linear systems' solutions or Hermitian matrices' eigenvalues. This library contains routines for both sequential computers and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, matrices' condition numbers, error bounds of solutions, and so on. The error estimation subroutines for Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background of error analysis in linear algebra and the usage of the subroutines. (author)

  11. User's guide to ESME v. 7.1

    International Nuclear Information System (INIS)

    Stahl, S.; MacLachlan, J.

    1990-01-01

    ESME is a computer program to calculate the evolution of a distribution of particles in energy and azimuth as it is acted upon by the radiofrequency system of a proton synchrotron. It provides for the modeling of multiple rf systems, feedback control, space charge, and many of the effects of longitudinal coupling impedance. The capabilities of the program are described, and the requirements for input data are specified in sufficient detail to permit significant calculations by an uninitiated user. The program is currently at version 7.1 and has been extensively modified since the previous user documentation. Fundamental enhancements make version 6 data unusable, but nearly all facilities of the earlier version have been retained and the input data are similar. Also described is a VAX-based code management convention which has been established with a view to maintaining functional equivalence in versions used on different computers. 13 refs

  12. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-minicomputers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for minicomputers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years, increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment.

  13. User involvement in the design of human-computer interactions: some similarities and differences between design approaches

    NARCIS (Netherlands)

    Bekker, M.M.; Long, J.B.

    1998-01-01

    This paper presents a general review of user involvement in the design of human-computer interactions, as advocated by a selection of different approaches to design. The selection comprises User-Centred Design, Participatory Design, Socio-Technical Design, Soft Systems Methodology, and Joint

  14. Muscle fatigue in relation to forearm pain and tenderness among professional computer users

    DEFF Research Database (Denmark)

    Thomsen, GF; Johnson, PW; Svendsen, Susanne Wulff

    2007-01-01

    ABSTRACT: BACKGROUND: To examine the hypothesis that forearm pain with palpation tenderness in computer users is associated with increased extensor muscle fatigue. METHODS: Eighteen persons with pain and moderate to severe palpation tenderness in the extensor muscle group of the right forearm...... response was not explained by differences in the MVC or body mass index. CONCLUSION: Computer users with forearm pain and moderate to severe palpation tenderness had diminished forearm extensor muscle fatigue response. Additional studies are necessary to determine whether this result reflects an adaptive...... and twenty gender and age matched referents without such complaints were enrolled from the Danish NUDATA study of neck and upper extremity disorders among technical assistants and machine technicians. Fatigue of the right forearm extensor muscles was assessed by muscle twitch forces in response to low...

  15. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    Directory of Open Access Journals (Sweden)

    Marijan Beg

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications on effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.

  16. Individual and work-related risk factors for musculoskeletal pain: a cross-sectional study among Estonian computer users.

    Science.gov (United States)

    Oha, Kristel; Animägi, Liina; Pääsuke, Mati; Coggon, David; Merisalu, Eda

    2014-05-28

    Occupational use of computers has increased rapidly over recent decades, and has been linked with various musculoskeletal disorders, which are now the most commonly diagnosed occupational diseases in Estonia. The aim of this study was to assess the prevalence of musculoskeletal pain (MSP) by anatomical region during the past 12 months and to investigate its association with personal characteristics and work-related risk factors among Estonian office workers using computers. In a cross-sectional survey, questionnaires were sent to 415 computer users; data were collected by self-administered questionnaire from 202 computer users at two universities in Estonia. The questionnaire asked about MSP at different anatomical sites, and potential individual and work-related risk factors. Associations with risk factors were assessed by logistic regression. Most respondents (77%) reported MSP in at least one anatomical region during the past 12 months. Most prevalent was pain in the neck (51%), followed by low back pain (42%), wrist/hand pain (35%) and shoulder pain (30%). Older age, right-handedness, not currently smoking, emotional exhaustion, belief that musculoskeletal problems are commonly caused by work, and low job security were the statistically significant risk factors for MSP at different anatomical sites. A high prevalence of MSP in the neck, low back, wrist/arm and shoulder was observed among Estonian computer users. Psychosocial risk factors were broadly consistent with those reported from elsewhere. While computer users should be aware of ergonomic techniques that can make their work easier and more comfortable, presenting computer use as a serious health hazard may modify health beliefs in a way that is unhelpful.

  17. Some computer graphical user interfaces in radiation therapy.

    Science.gov (United States)

    Chow, James C L

    2016-03-28

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practice and research are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC), used in superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC), used in electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT), used in generating fluence maps for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP, used in the calculation of 3D dose distribution using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC), used in photon beam radiation therapy. One common feature of these GUIs is that the user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which do not have to be understood or noted by the user. The user only needs to input the required information, with help from graphical elements, in order to produce the desired results. SUPCALC is a superficial radiation treatment time calculator that uses the GUI technique to provide a convenient way for the radiation therapist to calculate the treatment time and keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing hand calculations according to pre-determined dosimetric tables, the clinical user needs only to input the required drawing of the electron field in computer graphical file format, the prescription dose, and beam parameters to EMUC to calculate the required monitor units for the electron beam treatment. EMUC is based on a semi-experimental theory of sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator that generates a fluence map produced by a medical linear accelerator. This machine file controls

  18. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair-related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enabling other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real time. The recorded data are then periodically transmitted to the cloud for storage and analysis. The analyzed results are made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's computing and data storage capabilities via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  19. Accelerating Science with the NERSC Burst Buffer Early User Program

    Energy Technology Data Exchange (ETDEWEB)

    Bhimji, Wahid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bard, Debbie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Romanus, Melissa [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rutgers Univ., New Brunswick, NJ (United States); Paul, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ovsyannikov, Andrey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Friesen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bryson, Matt [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Correa, Joaquin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lockwood, Glenn K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tsulaia, Vakho [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Byna, Suren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Farrell, Steve [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gursoy, Doga [Argonne National Lab. (ANL), Argonne, IL (United States). Advanced Photon Source (APS); Daley, Chris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Beckner, Vince [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Trebotich, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Tull, Craig [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wright, Nicholas J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Antypas, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, none [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    NVRAM-based Burst Buffers are an important part of the emerging HPC storage landscape. The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory recently installed one of the first Burst Buffer systems as part of its new Cori supercomputer, collaborating with Cray on the development of the DataWarp software. NERSC has a diverse user base comprised of over 6500 users in 700 different projects spanning a wide variety of scientific computing applications. The use-cases of the Burst Buffer at NERSC are therefore also considerable and diverse. We describe here performance measurements and lessons learned from the Burst Buffer Early User Program at NERSC, which selected a number of research projects to gain early access to the Burst Buffer and exercise its capability to enable new scientific advancements. To the best of our knowledge this is the first time a Burst Buffer has been stressed at scale by diverse, real user workloads and therefore these lessons will be of considerable benefit to shaping the developing use of Burst Buffers at HPC centers.

  20. USER'S GUIDE TO THE PERSONAL COMPUTER VERSION OF THE BIOGENIC EMISSIONS INVENTORY SYSTEM (PC-BEIS2)

    Science.gov (United States)

    The document is a user's guide for an updated Personal Computer version of the Biogenic Emissions Inventory System (PC-BEIS2), allowing users to estimate hourly emissions of biogenic volatile organic compounds (BVOCs) and soil nitrogen oxide emissions for any county in the contig...

  1. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment, PPExe, has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependences among tasks (programs) visually as a data-flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe such as the Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System) according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)
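The ordering step that TME performs on the data-flow diagram is, at bottom, a topological sort of the task graph. A minimal sketch using Python's stdlib `graphlib` (the task names are invented; the real TME determines the order through its GUI and hands execution to the PPExe Meta-scheduler):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks whose output it consumes.
deps = {
    "preprocess": set(),
    "simulate": {"preprocess"},
    "analyze": {"simulate"},
    "visualize": {"analyze"},
}

# static_order() yields a valid execution order for the DAG.
order = list(TopologicalSorter(deps).static_order())
print(order)  # -> ['preprocess', 'simulate', 'analyze', 'visualize']
```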

  2. Metals Processing Laboratory Users (MPLUS) Facility Annual Report FY 2002 (October 1, 2001-September 30, 2002)

    Energy Technology Data Exchange (ETDEWEB)

    Angelini, P

    2004-04-27

    The Metals Processing Laboratory Users Facility (MPLUS) is a Department of Energy (DOE), Energy Efficiency and Renewable Energy, Industrial Technologies Program, user facility designated to assist researchers in key industries, universities, and federal laboratories in improving energy efficiency, improving environmental aspects, and increasing competitiveness. The goal of MPLUS is to provide access to the specialized technical expertise and equipment needed to solve metals processing issues that limit the development and implementation of emerging metals processing technologies. The scope of work can also extend to other types of materials. MPLUS has four primary user centers: (1) Processing--casting, powder metallurgy, deformation processing (including extrusion, forging, rolling), melting, thermomechanical processing, and high-density infrared processing; (2) Joining--welding, monitoring and control, solidification, brazing, and bonding; (3) Characterization--corrosion, mechanical properties, fracture mechanics, microstructure, nondestructive examination, computer-controlled dilatometry, and emissivity; and (4) Materials/Process Modeling--mathematical design and analyses, high-performance computing, process modeling, solidification/deformation, microstructure evolution, thermodynamic and kinetic modeling, and materials databases. A fully integrated approach provides researchers with unique opportunities to address technologically related issues to solve metals processing problems and probe new technologies. Access is also available to 16 additional Oak Ridge National Laboratory (ORNL) user facilities ranging from state-of-the-art materials characterization capabilities and high-performance computing to manufacturing technologies. MPLUS can be accessed through a standardized user-submitted proposal and a user agreement. Nonproprietary (open) or proprietary proposals can be submitted. For open research and development, access to capabilities is provided free of charge.

  3. Ergonomic assessment of musculoskeletal disorders risk among the computer users by Rapid Upper Limb Assessment method

    Directory of Open Access Journals (Sweden)

    Ehsanollah Habibi

    2016-01-01

    Conclusion: The results of this study showed that the frequency of musculoskeletal problems in the neck, back, elbow, and wrist was generally high among our subjects, and that ergonomic interventions such as computer workstation redesign, educating users about ergonomic principles of computer work, and reducing working hours at the computer must be carried out.

  4. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    Directory of Open Access Journals (Sweden)

    Almeida Jonas S

    2006-03-01

    Full Text Available Abstract Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web

  5. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    Science.gov (United States)

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. 
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over
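The distinguishing mechanism of mGrid, shipping the user-defined code together with its packed run-time variables to the remote side, can be illustrated with a language-agnostic sketch. mGrid itself does this for Matlab toolboxes via PHP scripts and Apache; the Python stand-in below and every name in it are hypothetical.

```python
import pickle

def pack_task(source, variables):
    """Client side: bundle user code with its run-time variables,
    mirroring mGrid's automatic packing before remote dispatch."""
    return pickle.dumps({"source": source, "vars": variables})

def run_task(blob):
    """'Remote' side: unpack and execute in a fresh namespace, then
    return the named result. (Only ever run trusted code this way.)"""
    task = pickle.loads(blob)
    namespace = dict(task["vars"])
    exec(task["source"], namespace)
    return namespace["result"]

blob = pack_task("result = x * y", {"x": 6, "y": 7})
print(run_task(blob))  # -> 42
```

Packing code and variables together is what removes the need for manual library distribution on the participating machines.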

  6. Computer Agent's Role in Modeling an Online Math Help User

    OpenAIRE

    Dragana Martinovic

    2007-01-01

    This paper investigates prospects for deploying an open learner model on online mathematics help sites. It proposes enhancing regular human-to-human interaction with the involvement of a computer agent suitable for tracking users, checking their input, and making useful suggestions. Such a design would provide the most support for the interlocutors while keeping the nature of the existing environment intact. Special considerations are given to peer-to-peer and expert-to-student mathematics on...

  7. RAMONA-4B a computer code with three-dimensional neutron kinetics for BWR and SBWR system transient - user's manual

    International Nuclear Information System (INIS)

    Rohatgi, U.S.; Cheng, H.S.; Khan, H.J.; Mallen, A.N.; Neymotin, L.Y.

    1998-03-01

    This document is the User's Manual for the Boiling Water Reactor (BWR) and Simplified Boiling Water Reactor (SBWR) systems transient code RAMONA-4B. The code uses a three-dimensional neutron-kinetics model coupled with a multichannel, nonequilibrium, drift-flux, two-phase flow model of the thermal hydraulics of the reactor vessel. The code is designed to analyze a wide spectrum of BWR core and system transients. Chapter 1 gives an overview of the code's capabilities and limitations; Chapter 2 describes the code's structure, lists major subroutines, and discusses the computer requirements. Chapter 3 covers the code, auxiliary codes, and instructions for running RAMONA-4B on Sun SPARC and IBM Workstations. Chapter 4 contains component descriptions and detailed card-by-card input instructions. Chapter 5 provides samples of the tabulated output for the steady-state and transient calculations and discusses the plotting procedures for both. Three appendices contain important user and programmer information: lists of plot variables (Appendix A), listings of the input deck for the sample problem (Appendix B), and a description of the plotting program PAD (Appendix C). 24 refs., 18 figs., 11 tabs

  8. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Barker, Ashley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bernholdt, David E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Bland, Arthur S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Gary, Jeff D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Hack, James J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; McNally, Stephen T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Rogers, James H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Straatsma, T. P. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Sukumar, Sreenivas Rangan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Thach, Kevin G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Tichenor, Suzy [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility; Wells, Jack C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  9. What Do IT-People Know About the (Nordic) History of Computers and User Interfaces?

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2009-01-01

    This paper reports a preliminary, empirical exploration of what IT-people know about the history of computers and user interfaces.  The principal motivation for the study is that the younger generations such as students in IT seem to know very little about these topics.  The study employed...... to become the designation or even the icon for the computer.  In other words, one of the key focal points in the area of human-computer interaction: to make the computer as such invisible seems to have been successful...

  10. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    Science.gov (United States)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback is increasingly being utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, the sensory receptors within the skin that sense tactile stimuli and electric currents are described, along with several factors that influence how electric signals are transmitted to the brain via the human skin.

  11. Applications of supercomputing and the utility industry: Calculation of power transfer capabilities

    International Nuclear Information System (INIS)

    Jensen, D.D.; Behling, S.R.; Betancourt, R.

    1990-01-01

    Numerical models and iterative simulation using supercomputers can furnish cost-effective answers to utility industry problems that are all but intractable using conventional computing equipment. An example of the use of supercomputers by the utility industry is the determination of power transfer capability limits for power transmission systems. This work has the goal of markedly reducing the run time of transient stability codes used to determine power distributions following major system disturbances. To date, run times of several hours on a conventional computer have been reduced to several minutes on state-of-the-art supercomputers, with further improvements anticipated to reduce run times to less than a minute. In spite of the potential advantages of supercomputers, few utilities have sufficient need for a dedicated in-house supercomputing capability. This problem is resolved using a supercomputer center serving a geographically distributed user base coupled via high speed communication networks

  12. Requirements for SSC central computing staffing (conceptual)

    International Nuclear Information System (INIS)

    Pfister, J.

    1985-01-01

    Given a computation center with ~10,000 MIPS supporting ~1,000 users, what are the staffing requirements? The attempt in this paper is to list the functions and staff size required in a central computing or centrally supported computing complex. The organization assumes that although considerable computing power would exist (mostly for online use) in the four interaction regions (IRs), there are functions/capabilities better performed outside the IRs and, in this model, at a ''central computing facility.'' What follows is one staffing approach, not necessarily optimal, with certain assumptions about the numbers of computer systems, media, networks, and system controls; that is, one would get the best technology available. Thus, it is speculation about what the technology may bring and what it takes to operate it. From an end-user support standpoint it is less clear, given the geography of an SSC, where the consulting support should be located and what it should look like

  13. MCNP capabilities for nuclear well logging calculations

    International Nuclear Information System (INIS)

    Forster, R.A.; Little, R.C.; Briesmeister, J.F.; Hendricks, J.S.

    1990-01-01

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. This paper discusses how the general-purpose continuous-energy Monte Carlo code MCNP (Monte Carlo neutron photon), part of the LARTCS, provides a computational predictive capability for many applications of interest to the nuclear well logging community. The generalized three-dimensional geometry of MCNP is well suited for borehole-tool models. SABRINA, another component of the LARTCS, is a graphics code that can be used to interactively create a complex MCNP geometry. Users can define many source and tally characteristics with standard MCNP features. The time-dependent capability of the code is essential when modeling pulsed sources. Problems with neutrons, photons, and electrons as either single particle or coupled particles can be calculated with MCNP. The physics of neutron and photon transport and interactions is modeled in detail using the latest available cross-section data

  14. User Interaction Modeling and Profile Extraction in Interactive Systems: A Groupware Application Case Study †

    Science.gov (United States)

    Tîrnăucă, Cristina; Duque, Rafael; Montaña, José L.

    2017-01-01

    A relevant goal in human–computer interaction is to produce applications that are easy to use and well-adjusted to their users’ needs. To address this problem it is important to know how users interact with the system. This work constitutes a methodological contribution capable of identifying the context of use in which users perform interactions with a groupware application (synchronous or asynchronous) and provides, using machine learning techniques, generative models of how users behave. Additionally, these models are transformed into a text that describes in natural language the main characteristics of the interaction of the users with the system. PMID:28726762

  15. 11 CFR 9003.6 - Production of computer information.

    Science.gov (United States)

    2010-01-01

    ... disbursements; (2) Receipts by and disbursements from a legal and accounting compliance fund under 11 CFR 9003.3... legal and accounting services, including the allocation of payroll and overhead expenditures; (4... explaining the computer system's software capabilities, such as user guides, technical manuals, formats...

  16. Military clouds: utilization of cloud computing systems at the battlefield

    Science.gov (United States)

    Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai

    2012-05-01

    Cloud computing is known as a novel information technology (IT) concept, which involves facilitated and rapid access to networks, servers, data saving media, applications and services via Internet with minimum hardware requirements. Use of information systems and technologies at the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means to decision makers and users in order to gain information superiority. These developments in information technologies lead to a new term, which is known as network centric capability. Similar to network centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds at the battlefield is predicted. Integrating cloud computing logic to network centric applications will increase the flexibility, cost-effectiveness, efficiency and accessibility of network-centric capabilities. In this paper, cloud computing and network centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties which might be introduced to network centric capability by cloud computing systems are researched. The role of military clouds in future warfare is proposed in this paper. It was concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential of improving network centric capabilities, increasing situational awareness at the battlefield and facilitating the settlement of information superiority.

  17. Prevalence of ocular symptoms and signs among professional computer users in Isfahan, Iran

    Directory of Open Access Journals (Sweden)

    Alireza Dehghani

    2008-12-01

    Full Text Available

    • BACKGROUND: This study was undertaken to detect the prevalence of ocular symptoms and signs in professional video display users (VDUs) and non-users in Isfahan.
    • METHODS: This is a cross-sectional descriptive case-control study. The VDUs group was selected from among employees working with computers and the control group from among employees not working with computers. Fifty-seven VDUs (34 male & 23 female) with a mean age of 30.7 ± 6.8 and 56 employees in the control group (25 male & 31 female, mean age 27.6 ± 7.2) were evaluated. Complete ocular examinations were done for both groups.
    • RESULTS: Among VDUs, 45 cases (79%) had burning eyes and tearing, 38 cases (66%) had dry eye, 37 cases (65%) had asthenopia, and 47 cases (82.5%) had musculoskeletal pain; the corresponding values for the control group were 24 (42.8%), 18 (32.2%), 22 (39.3%), and 15 (26.8%), and the differences were statistically significant (p = 0.037, p = 0.023, p = 0.044, p = 0.013). Schirmer's test was positive in 22 VDUs (38.5%) vs. 6 (10.7%) of the control group (p = 0.012). Heterophoria was present in 19 VDUs (33.3%) vs. 3 controls (5.4%) (p = 0.032).
    • CONCLUSION: Eye burning and tearing, dry eye, asthenopia and musculoskeletal problems were obviously more common in VDUs. Considering the extensive use of computers at home and work, a plan is required to detect dangers and provide appropriate solutions.
    • KEY WORDS: Video Display Terminal, Video Display Users, Computer Vision Syndrome, Dry Eye, Schirmer test, Asthenopia.

  18. Individualized computer-aided education in mammography based on user modeling: concept and preliminary experiments.

    Science.gov (United States)

    Mazurowski, Maciej A; Baker, Jay A; Barnhart, Huiman X; Tourassi, Georgia D

    2010-03-01

    The authors propose the framework for an individualized adaptive computer-aided educational system in mammography that is based on user modeling. The underlying hypothesis is that user models can be developed to capture the individual error making patterns of radiologists-in-training. In this pilot study, the authors test the above hypothesis for the task of breast cancer diagnosis in mammograms. The concept of a user model was formalized as the function that relates image features to the likelihood/extent of the diagnostic error made by a radiologist-in-training and therefore to the level of difficulty that a case will pose to the radiologist-in-training (or "user"). Then, machine learning algorithms were implemented to build such user models. Specifically, the authors explored k-nearest neighbor, artificial neural networks, and multiple regression for the task of building the model using observer data collected from ten Radiology residents at Duke University Medical Center for the problem of breast mass diagnosis in mammograms. For each resident, a user-specific model was constructed that predicts the user's expected level of difficulty for each presented case based on two BI-RADS image features. In the experiments, leave-one-out data handling scheme was applied to assign each case to a low-predicted-difficulty or a high-predicted-difficulty group for each resident based on each of the three user models. To evaluate whether the user model is useful in predicting difficulty, the authors performed statistical tests using the generalized estimating equations approach to determine whether the mean actual error is the same or not between the low-predicted-difficulty group and the high-predicted-difficulty group. When the results for all observers were pulled together, the actual errors made by residents were statistically significantly higher for cases in the high-predicted-difficulty group than for cases in the low-predicted-difficulty group for all modeling
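Of the three modeling techniques explored, k-nearest neighbor is the simplest to sketch for this task: predict the expected difficulty of a new case for a given trainee as the average error that trainee made on the most similar past cases. The two numeric features below stand in for the BI-RADS descriptors, and all data are invented.

```python
def knn_predict_difficulty(history, features, k=3):
    """history: list of ((f1, f2), observed_error) pairs for one
    trainee; returns the mean error over the k nearest past cases
    (squared Euclidean distance in feature space)."""
    ranked = sorted(
        history,
        key=lambda rec: (rec[0][0] - features[0]) ** 2
                      + (rec[0][1] - features[1]) ** 2,
    )
    nearest = ranked[:k]
    return sum(err for _, err in nearest) / len(nearest)

# Invented per-resident history of (features, diagnostic error).
history = [((1.0, 2.0), 0.10), ((1.1, 2.1), 0.20), ((0.9, 1.8), 0.15),
           ((5.0, 5.0), 0.90), ((5.2, 4.8), 0.80)]

# A case near the first cluster should be predicted "easy".
print(round(knn_predict_difficulty(history, (1.0, 2.0), k=3), 2))  # -> 0.15
```

Thresholding this predicted score is one way to split presented cases into the low- and high-predicted-difficulty groups used in the evaluation.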

  19. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  20. Effect of yoga on self-rated visual discomfort in computer users.

    Science.gov (United States)

    Telles, Shirley; Naveen, K V; Dash, Manoj; Deginal, Rajendra; Manjunath, N K

    2006-12-03

    'Dry eye' appears to be the main contributor to the symptoms of computer vision syndrome. Regular breaks and the use of artificial tears or certain eye drops are some of the options to reduce visual discomfort. A combination of yoga practices have been shown to reduce visual strain in persons with progressive myopia. The present randomized controlled trial was planned to evaluate the effect of a combination of yoga practices on self-rated symptoms of visual discomfort in professional computer users in Bangalore. Two hundred and ninety one professional computer users were randomly assigned to two groups, yoga (YG, n = 146) and wait list control (WL, n = 145). Both groups were assessed at baseline and after sixty days for self-rated visual discomfort using a standard questionnaire. During these 60 days the YG group practiced an hour of yoga daily for five days in a week and the WL group did their usual recreational activities also for an hour daily for the same duration. At 60 days there were 62 in the YG group and 55 in the WL group. While the scores for visual discomfort of both groups were comparable at baseline, after 60 days there was a significantly decreased score in the YG group, whereas the WL group showed significantly increased scores. The results suggest that the yoga practice appeared to reduce visual discomfort, while the group who had no yoga intervention (WL) showed an increase in discomfort at the end of sixty days.

  1. Fundamental Evaluation of Adaptation and Human Capabilities in a Condition Using a System to Give a User an Artificial Oculomotor Function to Control Directions of Both Eyes Independently

    Directory of Open Access Journals (Sweden)

    Fumio Mizuno

    2011-10-01

    Full Text Available To investigate flexible adaptation of the visual system, we developed a system that provides a user with an artificial oculomotor function to control the directions of both eyes. The system, named "Virtual Chameleon", consists of two independently controlled CCD cameras and a head-mounted display. The user can control the tracking direction of each camera with sensors attached to both hands, so that the user can obtain an independent, arbitrary field of view for each eye. We performed fundamental experiments to evaluate adaptation to the use of Virtual Chameleon and its effects on the user's capabilities. Eleven healthy volunteers with normal and corrected-to-normal vision participated in the experiments. The experimental task was to find the positions of targets placed on both sides of a subject, under a condition using Virtual Chameleon and a condition without it. We obtained accuracy rates and the time intervals needed to find target positions as experimental results. The experiments showed that all volunteers became able to actively control independent visual axes and correctly understood the two different views when using Virtual Chameleon, even though the two independent fields of view caused binocular rivalry, which reduced human capabilities compared to the condition without Virtual Chameleon.

  2. User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms; Myers, Brad A

    2008-01-01

    User Interfaces have been around as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, some papers on the history of Human-Computer Interaction and User Interfaces have appeared, primarily focusing on the graphical interface e...

  3. DOSFAC2 user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.L.; Chanin, D.

    1997-12-01

    This document describes the DOSFAC2 code, which is used for generating dose-to-source conversion factors for the MACCS2 code. DOSFAC2 is a revised and updated version of the DOSFAC code that was distributed with version 1.5.11 of the MACCS code. Included are (1) an overview and background of DOSFAC2, (2) a summary of two new functional capabilities, and (3) a user's guide. 20 refs., 5 tabs.

  4. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I.; Bradley, D.; Livny, M.

    2009-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.
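
    The scheme described above can be sketched, in outline only, as a whitelist of read-only commands executed on demand in the running job's sandbox; the function name, whitelist, and timeout below are invented for illustration and are not the Condor mechanism itself (which rides on the batch system's own capabilities):

```python
import shlex
import subprocess

# Whitelist of read-only inspection commands (illustrative, per the abstract).
ALLOWED = {"ls", "cat", "ps", "top", "lsof", "netstat"}

def run_monitor_command(cmdline, cwd="."):
    """Run one whitelisted command in the job's working directory, return stdout."""
    argv = shlex.split(cmdline)
    if not argv or argv[0] not in ALLOWED:
        raise ValueError(f"command not permitted: {cmdline!r}")
    result = subprocess.run(argv, cwd=cwd, capture_output=True,
                            text=True, timeout=30)
    return result.stdout
```

    A call such as `run_monitor_command("ls -l")` returns a listing of the job's sandbox, while an interactive editor like `vi` is rejected, mirroring the pseudo-interactive restriction the abstract describes.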

  5. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I; Bradley, D; Livny, M

    2010-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  6. Pseudo-interactive monitoring in distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Sfiligoi, I.; /Fermilab; Bradley, D.; Livny, M.; /Wisconsin U., Madison

    2009-05-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  7. The computer code Eurdyn - 1 M. (Release 1) Part 2: User's Manual

    International Nuclear Information System (INIS)

    Donea, J.; Giuliani, S.

    1979-01-01

    This report is the user's manual for the computer code Eurdyn-1 M developed at the J.R.C. Ispra for use in containment and fuel subassembly analyses for fast reactor safety studies. The input data are defined and a test problem is presented to illustrate both the input and the output of results.

  8. User's manual for a measurement simulation code

    International Nuclear Information System (INIS)

    Kern, E.A.

    1982-07-01

    The MEASIM code has been developed primarily for modeling process measurements in materials processing facilities associated with the nuclear fuel cycle. In addition, the code computes materials balances and the summation of materials balances along with associated variances. The code has been used primarily in performance assessment of materials accounting systems. This report provides the necessary information for a potential user to employ the code in these applications. A number of examples that demonstrate most of the capabilities of the code are provided.
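
    The balance-and-variance computation described above can be sketched as follows; this is a minimal illustration assuming independent measurement errors (so variances simply add), with invented data, not MEASIM's actual models:

```python
def materials_balance(begin, additions, end, removals):
    """Each argument is (measured amount, measurement variance).

    MB = beginning inventory + additions - ending inventory - removals;
    for independent measurement errors, Var(MB) is the sum of all the
    individual measurement variances.
    """
    mb = (begin[0] + sum(m for m, _ in additions)
          - end[0] - sum(m for m, _ in removals))
    var = (begin[1] + end[1]
           + sum(v for _, v in additions) + sum(v for _, v in removals))
    return mb, var

# Hypothetical accounting-period data in kg (values invented for illustration):
mb, var = materials_balance((100.0, 0.25), [(50.0, 0.10)],
                            (148.0, 0.20), [(1.5, 0.01)])
# mb = 0.5 kg unaccounted for, var ≈ 0.56 kg^2
```

    Comparing the balance against a multiple of its standard deviation is the usual basis for the kind of accounting-system performance assessment the abstract mentions.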

  9. User interface user's guide for HYPGEN

    Science.gov (United States)

    Chiu, Ing-Tsau

    1992-01-01

    The user interface (UI) of HYPGEN is developed using the Panel Library to shorten the learning curve for new users and provide easier ways to run HYPGEN for casual as well as advanced users. Menus, buttons, sliders, and type-in fields are used extensively in the UI to allow users to point and click with a mouse to choose various available options or to change values of parameters. On-line help is provided to give users information on using the UI without consulting the manual. Default values are set for most parameters, and boundary conditions are determined by the UI to further reduce the effort needed to run HYPGEN; however, users are free to make any changes and save them in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by the UI and saved in a PLOT3D journal file. For large grids, which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while the UI stays at the workstation.

  10. Trustworthy reconfigurable systems enhancing the security capabilities of reconfigurable hardware architectures

    CERN Document Server

    Feller, Thomas

    2014-01-01

    Thomas Feller sheds some light on trust anchor architectures for trustworthy reconfigurable systems. He presents novel concepts enhancing the security capabilities of reconfigurable hardware. Almost invisible to the user, many computer systems are embedded into everyday artifacts, such as cars, ATMs, and pacemakers. The significant growth of this market segment within the recent years enforced a rethinking with respect to the security properties and the trustworthiness of these systems. The trustworthiness of a system in general equates to the integrity of its system components. Hardware-b

  11. Interactive computer-enhanced remote viewing system with data fusion capabilities

    International Nuclear Information System (INIS)

    Walter, T.J.

    1997-01-01

    Robotic missions will increasingly involve sending autonomous and semiautonomous vehicles into unstructured work environments. Mission success will often depend on the ability to accurately map scenes, to combine information from a variety of sensor types, to convey the three-dimensional (3-D) characteristics of these spaces to operators, and to construct geometric models for task planning and collision avoidance. To meet these needs, an interactive computer-enhanced remote viewing system (ICERVS) has been developed with general-purpose capabilities for data visualization and geometric modeling. ICERVS has been augmented with software that enables fusing data from multiple mapping sensors and poses to reduce the error effects in individual data sets and improve the mapping accuracy of a work space.

  12. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor- and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  13. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT- CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.

    2013-02-26

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  14. Development and Performance of the Modularized, High-performance Computing and Hybrid-architecture Capable GEOS-Chem Chemical Transport Model

    Science.gov (United States)

    Long, M. S.; Yantosca, R.; Nielsen, J.; Linford, J. C.; Keller, C. A.; Payer Sulprizio, M.; Jacob, D. J.

    2014-12-01

    The GEOS-Chem global chemical transport model (CTM), used by a large atmospheric chemistry research community, has been reengineered to serve as a platform for a range of computational atmospheric chemistry science foci and applications. Development included modularization for coupling to general circulation and Earth system models (ESMs) and the adoption of co-processor capable atmospheric chemistry solvers. This was done using an Earth System Modeling Framework (ESMF) interface that operates independently of GEOS-Chem scientific code to permit seamless transition from the GEOS-Chem stand-alone serial CTM to deployment as a coupled ESM module. In this manner, the continual stream of updates contributed by the CTM user community is automatically available for broader applications, which remain state-of-science and directly referenceable to the latest version of the standard GEOS-Chem CTM. These developments are now available as part of the standard version of the GEOS-Chem CTM. The system has been implemented as an atmospheric chemistry module within the NASA GEOS-5 ESM. The coupled GEOS-5/GEOS-Chem system was tested for weak and strong scalability and performance with a tropospheric oxidant-aerosol simulation. Results confirm that the GEOS-Chem chemical operator scales efficiently for any number of processes. Although inclusion of atmospheric chemistry in ESMs is computationally expensive, the excellent scalability of the chemical operator means that the relative cost goes down with increasing number of processes, making fine-scale resolution simulations possible.

  15. Development of a Tool for Measuring and Analyzing Computer User Satisfaction

    OpenAIRE

    James E. Bailey; Sammy W. Pearson

    1983-01-01

    This paper reports on a technique for measuring and analyzing computer user satisfaction. Starting with the literature and using the critical incident interview technique, 39 factors affecting satisfaction were identified. Adapting the semantic differential scaling technique, a questionnaire for measuring satisfaction was then created. Finally, the instrument was pilot tested to prove its validity and reliability. The results of this effort and suggested uses of the questionnaire are reported...

  16. EMERGING SCOPE OF MEDICAL LABORATORIES SYSTEMS USING CLOUD COMPUTING FROM END-USER PERSPECTIVE

    OpenAIRE

    RAFİ, Zeeshan; DAĞ, Hasan; AYDIN, Mehmet N.

    2016-01-01

    In today’s world the rapid and reliable information extraction has become everybody’s need. Cloud computing is one of the emerging technology solutions to answer this query. This technology is providing many opportunities to the users in different terms to produce rapid and cost effective solution. This study helps in understanding the scope of the cloud computing as a solution in the field of medical laboratory systems. A study has been conducted to determine the need of the services require...

  17. Common Graphics Library (CGL). Volume 2: Low-level user's guide

    Science.gov (United States)

    Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.

    1989-01-01

    The intent is to instruct the users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. This library is written in ANSI FORTRAN 77 and currently uses a CORE-based underlying graphics package; it is therefore machine-independent, providing support for centralized and/or distributed computer systems.

  18. User-centric incentive design for participatory mobile phone sensing

    Science.gov (United States)

    Gao, Wei; Lu, Haoyang

    2014-05-01

    Mobile phone sensing is a critical underpinning of pervasive mobile computing, and is one of the key factors for improving people's quality of life in modern society via collective utilization of the on-board sensing capabilities of people's smartphones. The increasing demands for sensing services and ambient awareness in mobile environments highlight the necessity of active participation of individual mobile users in sensing tasks. User incentives for such participation have been continuously offered from an application-centric perspective, i.e., as payments from the sensing server, to compensate users' sensing costs. These payments, however, are manipulated to maximize the benefits of the sensing server, ignoring the runtime flexibility and benefits of participating users. This paper presents a novel framework of user-centric incentive design, and develops a universal sensing platform which translates heterogeneous sensing tasks into a generic sensing plan specifying the task-independent requirements of sensing performance. We use this sensing plan as input to reduce three categories of sensing costs, which together cover the possible sources hindering users' participation in sensing.

  19. A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb

    Science.gov (United States)

    Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don

    2011-01-01

    A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.

  20. SHEAT for PC. A computer code for probabilistic seismic hazard analysis for personal computer, user's manual

    International Nuclear Information System (INIS)

    Yamada, Hiroyuki; Tsutsumi, Hideaki; Ebisawa, Katsumi; Suzuki, Masahide

    2002-03-01

    The SHEAT code developed at the Japan Atomic Energy Research Institute is for probabilistic seismic hazard analysis, which is one of the tasks needed for the seismic Probabilistic Safety Assessment (PSA) of a nuclear power plant. At first, SHEAT was developed as the large-sized computer version. In addition, a personal computer version was provided to improve the operation efficiency and generality of this code in 2001. It is possible to perform the earthquake hazard analysis, display, and print functions with the Graphical User Interface. With the SHEAT for PC code, seismic hazard, which is defined as an annual exceedance frequency of occurrence of earthquake ground motions at various levels of intensity at a given site, is calculated by the following two steps as is done with the large-sized computer. One is the modeling of earthquake generation around a site. Future earthquake generation (locations, magnitudes and frequencies of postulated earthquakes) is modeled based on the historical earthquake records, active fault data and expert judgment. The other is the calculation of probabilistic seismic hazard at the site. An earthquake ground motion is calculated for each postulated earthquake using an attenuation model taking into account its standard deviation. Then the seismic hazard at the site is calculated by summing the frequencies of ground motions by all the earthquakes. This document is the user's manual of the SHEAT for PC code. It includes: (1) an outline of the code, covering the overall concept, logical process, code structure, data files used, and special characteristics of the code; (2) functions of the subprograms and the analytical models in them; (3) guidance on input and output data; (4) a sample run result; and (5) an operational manual. (author)
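
    The two-step hazard calculation summarized above can be sketched as follows; the source parameters and the lognormal attenuation scatter are illustrative assumptions, not SHEAT's actual models:

```python
import math

def p_exceed(median, beta, a):
    """P(ground motion > a) given the attenuation median and lognormal std beta."""
    z = (math.log(a) - math.log(median)) / beta
    return 0.5 * math.erfc(z / math.sqrt(2))

# Step 1 (modeled earthquake generation): one (annual rate, median PGA in g
# at the site, lognormal standard deviation) triple per postulated earthquake.
# Values are invented for illustration.
sources = [(0.01, 0.30, 0.5), (0.001, 0.60, 0.5)]

def hazard(a):
    """Step 2: annual exceedance frequency of ground-motion level a at the site,
    summed over all postulated earthquakes."""
    return sum(rate * p_exceed(med, beta, a) for rate, med, beta in sources)
```

    Evaluating `hazard` over a range of motion levels traces out the seismic hazard curve; the frequency falls as the level rises and is bounded above by the total event rate.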

  1. Authentication of Smartphone Users Based on Activity Recognition and Mobile Sensing

    Science.gov (United States)

    Ehatisham-ul-Haq, Muhammad; Azam, Muhammad Awais; Loo, Jonathan; Shuang, Kai; Islam, Syed; Naeem, Usman; Amin, Yasar

    2017-01-01

    Smartphones are context-aware devices that provide a compelling platform for ubiquitous computing and assist users in accomplishing many of their routine tasks anytime and anywhere, such as sending and receiving emails. The nature of tasks conducted with these devices has evolved with the exponential increase in the sensing and computing capabilities of a smartphone. Due to the ease of use and convenience, many users tend to store their private data, such as personal identifiers and bank account details, on their smartphone. However, this sensitive data can be vulnerable if the device gets stolen or lost. A traditional approach for protecting this type of data on mobile devices is to authenticate users with mechanisms such as PINs, passwords, and fingerprint recognition. However, these techniques are vulnerable to user compliance and a plethora of attacks, such as smudge attacks. The work in this paper addresses these challenges by proposing a novel authentication framework, which is based on recognizing the behavioral traits of smartphone users using the embedded sensors of smartphone, such as Accelerometer, Gyroscope and Magnetometer. The proposed framework also provides a platform for carrying out multi-class smart user authentication, which provides different levels of access to a wide range of smartphone users. This work has been validated with a series of experiments, which demonstrate the effectiveness of the proposed framework. PMID:28878177

  2. Authentication of Smartphone Users Based on Activity Recognition and Mobile Sensing.

    Science.gov (United States)

    Ehatisham-Ul-Haq, Muhammad; Azam, Muhammad Awais; Loo, Jonathan; Shuang, Kai; Islam, Syed; Naeem, Usman; Amin, Yasar

    2017-09-06

    Smartphones are context-aware devices that provide a compelling platform for ubiquitous computing and assist users in accomplishing many of their routine tasks anytime and anywhere, such as sending and receiving emails. The nature of tasks conducted with these devices has evolved with the exponential increase in the sensing and computing capabilities of a smartphone. Due to the ease of use and convenience, many users tend to store their private data, such as personal identifiers and bank account details, on their smartphone. However, this sensitive data can be vulnerable if the device gets stolen or lost. A traditional approach for protecting this type of data on mobile devices is to authenticate users with mechanisms such as PINs, passwords, and fingerprint recognition. However, these techniques are vulnerable to user compliance and a plethora of attacks, such as smudge attacks. The work in this paper addresses these challenges by proposing a novel authentication framework, which is based on recognizing the behavioral traits of smartphone users using the embedded sensors of smartphone, such as Accelerometer, Gyroscope and Magnetometer. The proposed framework also provides a platform for carrying out multi-class smart user authentication, which provides different levels of access to a wide range of smartphone users. This work has been validated with a series of experiments, which demonstrate the effectiveness of the proposed framework.
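
    As a rough sketch of the idea, assuming invented features, profile values, and threshold (a nearest-profile stand-in rather than the paper's actual classifier), behavioral authentication from accelerometer windows might look like:

```python
import math
import statistics

def features(window):
    """Mean and std of accelerometer magnitude over one sensor window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return (statistics.fmean(mags), statistics.pstdev(mags))

def authenticate(profile, window, threshold=0.5):
    """Accept if the window's features lie close to the enrolled profile."""
    return math.dist(profile, features(window)) <= threshold

# Hypothetical enrolled profile and a fresh accelerometer window (m/s^2):
owner_profile = (9.9, 0.3)
walk = [(0.1, 0.2, 9.8)] * 20
```

    The multi-class variant in the paper would extend this to several enrolled profiles, granting different access levels depending on which profile the window matches.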

  3. Effect of yoga on self-rated visual discomfort in computer users

    Directory of Open Access Journals (Sweden)

    Deginal Rajendra

    2006-12-01

    Full Text Available Abstract Background 'Dry eye' appears to be the main contributor to the symptoms of computer vision syndrome. Regular breaks and the use of artificial tears or certain eye drops are some of the options to reduce visual discomfort. A combination of yoga practices have been shown to reduce visual strain in persons with progressive myopia. The present randomized controlled trial was planned to evaluate the effect of a combination of yoga practices on self-rated symptoms of visual discomfort in professional computer users in Bangalore. Methods Two hundred and ninety one professional computer users were randomly assigned to two groups, yoga (YG, n = 146) and wait list control (WL, n = 145). Both groups were assessed at baseline and after sixty days for self-rated visual discomfort using a standard questionnaire. During these 60 days the YG group practiced an hour of yoga daily for five days in a week and the WL group did their usual recreational activities also for an hour daily for the same duration. At 60 days there were 62 in the YG group and 55 in the WL group. Results While the scores for visual discomfort of both groups were comparable at baseline, after 60 days there was a significantly decreased score in the YG group, whereas the WL group showed significantly increased scores. Conclusion The results suggest that the yoga practice appeared to reduce visual discomfort, while the group who had no yoga intervention (WL) showed an increase in discomfort at the end of sixty days.

  4. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    International Nuclear Information System (INIS)

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems

  5. PROACT user's guide: how to use the pallet recovery opportunity analysis computer tool

    Science.gov (United States)

    E. Bradley Hager; A.L. Hammett; Philip A. Araman

    2003-01-01

    Pallet recovery projects are environmentally responsible and offer promising business opportunities. The Pallet Recovery Opportunity Analysis Computer Tool (PROACT) assesses the operational and financial feasibility of potential pallet recovery projects. The use of project specific information supplied by the user increases the accuracy and the validity of the...

  6. DYNAPO 4 - a fluid system and frames analysis computer program

    International Nuclear Information System (INIS)

    Lefter, J.D.; Ahdout, H.

    1982-01-01

    DYNAPO 4 is a user-oriented, specialized computer program capable of analyzing three-dimensional linear elastic piping systems or frames under static loads, dynamic loads represented by acceleration response spectra, and transient dynamic loads represented by harmonic, second-order polynomial, and time-history forcing functions. DYNAPO 4 has plotting capability: it plots the input configuration of the piping system or structure and also its deformed shape after the load is applied. DYNAPO 4 performs analyses for ASME Section III Class 1, 2, and 3 piping, and provides the user with stress reports per ASME and ANSI Code requirements. 3 refs

  7. A Cloud-User Protocol Based on Ciphertext Watermarking Technology

    Directory of Open Access Journals (Sweden)

    Keyang Liu

    2017-01-01

    With the growth of cloud computing technology, more and more Cloud Service Providers (CSPs) provide cloud computing services to users and ask for users' permission to use their data to improve quality of service (QoS). Since these data are stored as plain text, they raise users' concerns about the risk of privacy leakage. Existing watermark-embedding and encryption technology, however, is not suited to protecting the Right to Be Forgotten. Hence, we propose a new Cloud-User protocol as a solution to the plain-text outsourcing problem. We allow only users and CSPs to embed the ciphertext watermark, which is generated and embedded by a Trusted Third Party (TTP), into the ciphertext data for transfer. The receiver then decrypts it and obtains the watermarked data in plain text. In the arbitration stage, feature extraction and the identity of the user are used to identify the data. A fixed-Hamming-distance code helps maximize the system's capacity for watermarks. The extracted watermark can locate an unauthorized distributor and protect the rights of an honest CSP. Experimental results demonstrate the security and validity of our protocol.
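
    The arbitration step described above, tracing a leaked copy back to a distributor by its embedded watermark, can be sketched as a nearest-code lookup under a Hamming-distance bound. This is an illustrative sketch only, not the paper's protocol: the registry, bit strings, and distance bound below are invented.

```python
# Illustrative sketch (not the authors' implementation): identify a data
# recipient by finding the registered watermark code with the smallest
# Hamming distance to the watermark extracted from a leaked copy.

def hamming(a: str, b: str) -> int:
    """Number of differing bit positions between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def identify_distributor(extracted: str, registry: dict, max_dist: int = 2):
    """Return the user whose code is nearest to the extracted watermark,
    or None if every code exceeds the allowed distance (noise/tamper bound)."""
    best_user, best_d = None, max_dist + 1
    for user, code in registry.items():
        d = hamming(extracted, code)
        if d < best_d:
            best_user, best_d = user, d
    return best_user

registry = {"alice": "1011010010", "bob": "0100101101", "csp": "1110001110"}
# A noisy extraction: bob's code with one flipped bit still maps back to bob.
print(identify_distributor("0100101111", registry))  # -> bob
```

    Keeping the registered codes far apart in Hamming distance (the "fixed Hamming distance code" idea) is what makes this lookup unambiguous even when a few watermark bits are corrupted.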

  8. Utilization Possibilities of Area Definition in User Space for User-Centric Pervasive-Adaptive Systems

    Science.gov (United States)

    Krejcar, Ondrej

    The ability of a mobile device to determine its location in an indoor environment supports the creation of a new range of mobile information system applications. The goal of my project is to complement the data networking capabilities of RF wireless LANs with accurate user location and tracking capabilities for prebuffering the data a user needs. I created a location-based enhancement for locating and tracking users of an indoor information system. The user's position is used for prebuffering data and pushing information from a server to the user's mobile client. All server data are saved as artifacts together with their indoor position information. The area definition for selecting artifacts is described for the current and predicted user positions, along with options for ranking artifacts. Future trends are also discussed.
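
    The area-definition idea, selecting which artifacts to prebuffer based on a (current or predicted) user position, can be sketched as a simple radius query over stored artifact coordinates. All names and coordinates below are hypothetical; this is not the system's actual code.

```python
# A minimal sketch of the area-definition idea: given a predicted user
# position, prebuffer only the artifacts whose stored indoor coordinates
# fall inside a circular area around that position.
import math

def artifacts_in_area(artifacts, center, radius):
    """Return artifact ids whose (x, y) position lies within `radius` of
    `center`, nearest first, so the closest data can be pushed to the
    client first."""
    cx, cy = center
    hits = [(math.dist((x, y), (cx, cy)), aid) for aid, (x, y) in artifacts.items()]
    return [aid for d, aid in sorted(hits) if d <= radius]

artifacts = {"map_f2": (1.0, 2.0), "schedule": (8.0, 9.0), "door_info": (2.0, 2.5)}
print(artifacts_in_area(artifacts, center=(1.5, 2.0), radius=3.0))
# -> ['map_f2', 'door_info']
```

    Ranking by distance gives one plausible "valuation" of artifacts within the area: the nearest artifact is prebuffered first because the user is most likely to need it soonest.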

  9. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  10. Computer applications in nuclear medicine

    International Nuclear Information System (INIS)

    Lancaster, J.L.; Lasher, J.C.; Blumhardt, R.

    1987-01-01

    Digital computers were introduced to nuclear medicine research as an imaging modality in the mid-1960s. Widespread use of imaging computers (scintigraphic computers) was not seen in nuclear medicine clinics until the mid-1970s. For the user, the ability to acquire scintigraphic images into the computer for quantitative purposes, with accurate selection of regions of interest (ROIs), promised almost endless computational capabilities. Investigators quickly developed many new methods for quantitating the distribution patterns of radiopharmaceuticals within the body both spatially and temporally. The computer was used to acquire data on practically every organ that could be imaged by means of gamma cameras or rectilinear scanners. Methods of image processing borrowed from other disciplines were applied to scintigraphic computer images in an attempt to improve image quality. Image processing in nuclear medicine has evolved into a relatively extensive set of tasks that can be called on by the user to provide additional clinical information rather than to improve image quality. Digital computers are utilized in nuclear medicine departments for nonimaging applications as well. Patient scheduling, archiving, radiopharmaceutical inventory, radioimmunoassay (RIA), and health physics are just a few of the areas in which the digital computer has proven helpful. The computer is useful in any area in which a large quantity of data needs to be accurately managed, especially over a long period of time.

  11. Sandia QIS Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75 M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.

  12. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, Roger R.; Flach, Greg [Savannah River National Laboratory, Savannah River Site, Bldg 773-43A, Aiken, SC 29808 (United States); Freshley, Mark D.; Freedman, Vicky; Gorton, Ian [Pacific Northwest National Laboratory, MSIN K9-33, P.O. Box 999, Richland, WA 99352 (United States); Dixon, Paul; Moulton, J. David [Los Alamos National Laboratory, MS B284, P.O. Box 1663, Los Alamos, NM 87544 (United States); Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, MS 50B-4230, Berkeley, CA 94720 (United States); Marble, Justin [Department of Energy, 19901 Germantown Road, Germantown, MD 20874-1290 (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  13. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    International Nuclear Information System (INIS)

    Seitz, Roger R.; Flach, Greg; Freshley, Mark D.; Freedman, Vicky; Gorton, Ian; Dixon, Paul; Moulton, J. David; Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan; Marble, Justin

    2013-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  14. User's guide to the 'DISPOSALS' model

    International Nuclear Information System (INIS)

    Groom, M.S.; James, A.R.; Laundy, R.S.

    1984-03-01

    This report provides a User's Guide to the 'DISPOSALS' computer model and includes instructions on how to set up and run a specific problem together with details of the scope, theoretical basis, data requirements and capabilities of the model. The function of the 'DISPOSALS' model is to make assignments of nuclear waste material in an optimum manner to a number of disposal sites each subject to a number of constraints such as limits on the volume and activity. The user is able to vary the number of disposal sites, the range and limits of the constraints to be applied to each disposal site and the objective function for optimisation. The model is based on the Linear Programming technique and uses CAP Scientific's LAMPS and MAGIC packages. Currently the model has been implemented on CAP Scientific's VAX 11/750 minicomputer. (author)
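
    The assignment problem the 'DISPOSALS' model solves, allocating waste packages to disposal sites under per-site volume and activity limits at minimum cost, has the following structure. This sketch is purely illustrative and unrelated to the LAMPS/MAGIC Linear Programming implementation: it brute-forces a tiny invented instance just to show the constraints and objective.

```python
# Illustrative only: the report's model uses Linear Programming (LAMPS/MAGIC);
# this tiny brute-force sketch just shows the structure of the problem --
# assign each waste package to a disposal site, respecting per-site volume
# and activity limits, minimising total cost. Names and figures are invented.
from itertools import product

packages = [  # (volume, activity)
    (10, 5.0), (6, 2.0), (8, 7.0)
]
sites = {  # site: (volume limit, activity limit, cost per unit volume)
    "A": (16, 9.0, 2.0),
    "B": (20, 8.0, 3.0),
}

def best_assignment():
    best = None
    for assign in product(sites, repeat=len(packages)):
        vol = {s: 0 for s in sites}
        act = {s: 0.0 for s in sites}
        for (v, a), s in zip(packages, assign):
            vol[s] += v
            act[s] += a
        if any(vol[s] > sites[s][0] or act[s] > sites[s][1] for s in sites):
            continue  # violates a site volume or activity constraint
        cost = sum(v * sites[s][2] for (v, _), s in zip(packages, assign))
        if best is None or cost < best[0]:
            best = (cost, assign)
    return best

print(best_assignment())  # -> (56.0, ('A', 'A', 'B'))
```

    An LP formulation replaces this enumeration with continuous (or integer) assignment variables and lets a solver handle realistically sized numbers of packages and sites, which is why the model is built on an LP package rather than search.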

  15. An Australian Perspective On The Challenges For Computer And Network Security For Novice End-Users

    Directory of Open Access Journals (Sweden)

    Patryk Szewczyk

    2012-12-01

    It is common for end-users to have difficulty using computer or network security appropriately, and they have often been ridiculed for misinterpreting instructions or procedures. This discussion paper details the outcomes of research undertaken over the past six years on why security is overly complex for end-users. The results indicate that multiple issues may render end-users vulnerable to security threats and that there is no single solution to address these problems. Studies on a small group of senior citizens have shown that educational seminars can be beneficial in ensuring that simple security aspects are understood and used appropriately.

  16. Assessing mouse alternatives to access to computer: a case study of a user with cerebral palsy.

    Science.gov (United States)

    Pousada, Thais; Pareira, Javier; Groba, Betania; Nieto, Laura; Pazos, Alejandro

    2014-01-01

    The purpose of this study is to describe the process of assessing three assistive devices to meet the needs of a woman with cerebral palsy (CP) in order to provide her with computer access and use. The user has quadriplegic CP with anarthria and uses a syllabic keyboard. Devices were evaluated through a three-step approach: (a) use of a questionnaire to preselect potential assistive technologies, (b) use of the eTAO tool to determine the effectiveness of each device, and (c) a semi-structured interview to obtain qualitative data. A touch screen, a joystick, and a trackball were the preselected devices. The device that best met the user's needs and priorities was the joystick. This finding was corroborated by both the eTAO tool and the semi-structured interview. Computers are a basic means of social participation. It is important to consider the special needs and priorities of users and to try different devices when undertaking a device-selection process. Environmental and personal factors have to be considered as well. This leads to a need to evaluate new tools in order to provide the appropriate support. The eTAO could be a suitable instrument for this purpose. Additional research is also needed to understand how to better match devices with different user populations and how to comprehensively evaluate emerging technologies relative to users with disabilities.

  17. RELAP5-3D User Problems

    Energy Technology Data Exchange (ETDEWEB)

    Riemke, Richard Allan

    2002-09-01

    The Reactor Excursion and Leak Analysis Program with 3D capability (RELAP5-3D) is a reactor system analysis code that has been developed at the Idaho National Engineering and Environmental Laboratory (INEEL) for the U. S. Department of Energy (DOE). The 3D capability in RELAP5-3D includes 3D hydrodynamics and 3D neutron kinetics. Assessment, verification, and validation of the 3D capability in RELAP5-3D is discussed in the literature. Additional assessment, verification, and validation of the 3D capability of RELAP5-3D will be presented in other papers in this users seminar. As with any software, user problems occur. User problems usually fall into the categories of input processing failure, code execution failure, restart/renodalization failure, unphysical result, and installation. This presentation will discuss some of the more generic user problems that have been reported on RELAP5-3D as well as their resolution.

  18. RELAP5-3D User Problems

    International Nuclear Information System (INIS)

    Riemke, Richard Allan

    2001-01-01

    The Reactor Excursion and Leak Analysis Program with 3D capability (RELAP5-3D) is a reactor system analysis code that has been developed at the Idaho National Engineering and Environmental Laboratory (INEEL) for the U. S. Department of Energy (DOE). The 3D capability in RELAP5-3D includes 3D hydrodynamics and 3D neutron kinetics. Assessment, verification, and validation of the 3D capability in RELAP5-3D is discussed in the literature. Additional assessment, verification, and validation of the 3D capability of RELAP5-3D will be presented in other papers in this users seminar. As with any software, user problems occur. User problems usually fall into the categories of input processing failure, code execution failure, restart/renodalization failure, unphysical result, and installation. This presentation will discuss some of the more generic user problems that have been reported on RELAP5-3D as well as their resolution

  19. Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.

    Science.gov (United States)

    Ahmadi, Maryam; Aslani, Nasim

    2018-01-01

    With regard to the high cost of the Electronic Health Record (EHR), the use of new technologies, in particular cloud computing, has increased in recent years. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combined keywords. Of the 431 articles initially retrieved, 27 were selected for review after applying the inclusion and exclusion criteria. Data were gathered with a self-made checklist and analyzed by content analysis. The findings of this study showed that cloud computing is a very widespread technology. It spans domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability, and it can be effective for the electronic health record. According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHRs in a variety of contexts, and it provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHRs, the use of this technology is recommended.

  20. Evolution of User Analysis on the Grid in ATLAS

    CERN Document Server

    Legger, Federica; The ATLAS collaboration

    2016-01-01

    More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and the capability of the system to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have been proven to provide a solid distributed analysis system for ATLAS users. Based on the experience from the first run of the LHC, substantial improvements to the ATLAS computing system have been made to optimize both production and analysis workflows. These include the re-design of the production and data management systems, a new analysis data format and event model, and the development of common reduction and analysis frameworks. The impact of such changes on the distributed analysis system is evaluated. More than 100 mill...

  1. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  2. Assessing the Role of User Computer Self-Efficacy, Cybersecurity Countermeasures Awareness, and Cybersecurity Skills toward Computer Misuse Intention at Government Agencies

    Science.gov (United States)

    Choi, Min Suk

    2013-01-01

    Cybersecurity threats and vulnerabilities are causing substantial financial losses for governments and organizations all over the world. Cybersecurity criminals are stealing more than one billion dollars from banks every year by exploiting vulnerabilities caused by bank users' computer misuse. Cybersecurity breaches are threatening the common…

  3. Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.

    Science.gov (United States)

    Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas

    2012-01-01

    The NEURON simulation environment is a commonly used tool for performing electrical simulations of neurons and neuronal networks. The NEURON User Interface, based on the now-discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, the lack of a standard means of saving the results of simulations, and no way to store the model geometry with the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and the HDF standard format to store them. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structures, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.

  4. Off-Gas Adsorption Model Capabilities and Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Lyon, Kevin L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Welty, Amy K. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Law, Jack [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ladshaw, Austin [Georgia Inst. of Technology, Atlanta, GA (United States); Yiacoumi, Sotira [Georgia Inst. of Technology, Atlanta, GA (United States); Tsouris, Costas [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-03-01

    Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from data gaps in single species, such as Kr and Xe, isotherms. Since isotherm data for each gas is currently available at a single temperature, the model is unable to predict adsorption at temperatures outside of the set of data currently

  5. Controlling user access to electronic resources without password

    Science.gov (United States)

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
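
    The core comparison in this patent abstract, granting access only when user-proximal environmental readings sufficiently match the resource-proximal ones, can be sketched as a set-similarity check. The choice of visible Wi-Fi network names as the "environmental information", the threshold, and all names below are hypothetical illustrations, not taken from the patent.

```python
# A hedged sketch of the idea above: compare environmental readings observed
# near the user with those known to surround the resource (here, sets of
# visible Wi-Fi network names -- a hypothetical choice), and grant access
# only when the overlap is high enough.

def jaccard(a: set, b: set) -> float:
    """Set similarity in [0, 1]: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def access_granted(user_env: set, resource_env: set, threshold: float = 0.6) -> bool:
    """Grant access when user-proximal readings sufficiently match
    the resource-proximal readings."""
    return jaccard(user_env, resource_env) >= threshold

resource_env = {"lab-ap-1", "lab-ap-2", "guest-net"}
print(access_granted({"lab-ap-1", "lab-ap-2", "guest-net"}, resource_env))  # True
print(access_granted({"cafe-wifi", "guest-net"}, resource_env))             # False
```

    In the patent's fuller scheme this environmental check is combined with a biometric comparison; the sketch covers only the environmental-similarity gate.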

  6. User's Self-Prediction of Performance in Motor Imagery Brain-Computer Interface.

    Science.gov (United States)

    Ahn, Minkyu; Cho, Hohyun; Ahn, Sangtae; Jun, Sung C

    2018-01-01

    Performance variation is a critical issue in motor imagery brain-computer interface (MI-BCI), and various neurophysiological, psychological, and anatomical correlates have been reported in the literature. Although the main aim of such studies is to predict MI-BCI performance for the prescreening of poor performers, studies that focus on the user's own sense of the motor imagery process and directly estimate MI-BCI performance through the user's self-prediction are lacking. In this study, we first test this self-prediction idea on motor imagery experimental datasets. Fifty-two subjects participated in a classical two-class motor imagery experiment and were asked to rate how easy they found motor imagery and to predict their own MI-BCI performance. During the motor imagery experiment, an electroencephalogram (EEG) was recorded; however, no feedback on motor imagery was given to subjects. From the EEG recordings, the offline classification accuracy was estimated and compared with several questionnaire scores, as well as with each subject's self-prediction of MI-BCI performance. The subjects' performance predictions during the motor imagery task showed a high positive correlation (r = 0.64) with offline classification accuracy, suggesting that subjects can estimate their own MI-BCI performance even without feedback information. This implies that the human brain is an active learning system and, by self-experiencing the endogenous motor imagery process, it can sense and assess the quality of the process. Thus, it is believed that users may be able to predict MI-BCI performance, and these results may contribute to a better understanding of low performance and to advancing BCI.
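
    The reported statistic is a Pearson correlation between each subject's self-predicted performance and the offline classification accuracy. The sketch below is not the study's analysis code, and the data points are invented; it only shows how such an r value is computed.

```python
# Pearson correlation between self-predicted and measured MI-BCI performance.
# Data below are invented for illustration.
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

self_prediction = [0.55, 0.60, 0.70, 0.80, 0.90]   # subjects' own estimates
offline_accuracy = [0.52, 0.65, 0.68, 0.85, 0.88]  # measured from EEG
print(round(pearson_r(self_prediction, offline_accuracy), 2))
```

    A value near 1 indicates that subjects who rated themselves higher also classified better, which is the pattern the study reports at r = 0.64 across 52 subjects.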

  7. Brain-computer interface controlled gaming: evaluation of usability by severely motor restricted end-users.

    Science.gov (United States)

    Holz, Elisa Mira; Höhne, Johannes; Staiger-Sälzer, Pit; Tangermann, Michael; Kübler, Andrea

    2013-10-01

    Connect-Four, a new sensorimotor rhythm (SMR) based brain-computer interface (BCI) gaming application, was evaluated by four severely motor-restricted end-users; two were in the locked-in state and had unreliable eye-movement. Following the user-centred approach, usability of the BCI prototype was evaluated in terms of effectiveness (accuracy), efficiency (information transfer rate (ITR) and subjective workload) and users' satisfaction. Online performance varied strongly across users and sessions (median accuracy of end-users: A = .65; B = .60; C = .47; D = .77). Our results thus yielded low to medium effectiveness in three end-users and high effectiveness in one end-user. Consequently, ITR was low (0.05-1.44 bits/min). Only two end-users were able to play the game in free mode. Total workload was moderate but varied strongly across sessions. The main sources of workload were mental and temporal demand. Furthermore, frustration contributed to the subjective workload of two end-users. Nevertheless, most end-users accepted the BCI application well and rated satisfaction medium to high. Sources of dissatisfaction were (1) electrode gel and cap, (2) low effectiveness, (3) time-consuming adjustment and (4) BCI equipment that was not easy to use. All four end-users indicated ease of use as one of the most important aspects of a BCI. Effectiveness and efficiency are lower compared with applications using the event-related potential as the input channel. Nevertheless, the SMR-BCI application was satisfactorily accepted by the end-users, and two of four could imagine using the BCI application in their daily life. Thus, despite moderate effectiveness and efficiency, BCIs might be an option for controlling an application for entertainment.
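
    ITR figures like the 0.05-1.44 bits/min quoted above are commonly computed with the Wolpaw formula, which combines the number of selectable classes, the selection accuracy, and the selection rate. This sketch shows the formula itself; the 8-second selection timing is an assumed example, not taken from the study.

```python
# Hedged sketch: bits/min figures for BCIs are commonly computed with the
# Wolpaw information-transfer-rate formula; the timing value here is
# illustrative, not reconstructed from the study.
from math import log2

def wolpaw_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Bits conveyed per selection for an N-class BCI at a given accuracy."""
    p = accuracy
    if p <= 1.0 / n_classes:
        return 0.0  # at or below chance, no information is transferred
    bits = log2(n_classes) + p * log2(p)
    if p < 1.0:
        bits += (1 - p) * log2((1 - p) / (n_classes - 1))
    return bits

def itr_bits_per_min(n_classes, accuracy, seconds_per_selection):
    """Scale bits per selection by the number of selections per minute."""
    return wolpaw_bits_per_selection(n_classes, accuracy) * 60 / seconds_per_selection

# Two-class game at 77% accuracy, one selection every 8 s (assumed timing):
print(round(itr_bits_per_min(2, 0.77, 8.0), 2))
```

    The formula makes the low reported ITRs intuitive: a two-class selection at modest accuracy carries well under one bit, so even frequent selections yield only a few bits per minute.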

  8. A high performance scientific cloud computing environment for materials simulations

    Science.gov (United States)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.

  9. User Behavior Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Moore, Juston Shane [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-28

    User Behaviour Analytics is the tracking, collecting and assessing of user data and activities. The goal is to detect misuse of user credentials by developing models of the normal behaviour of user credentials within a computer network and detecting outliers with respect to that baseline.
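
    The baseline-and-outlier idea described in this abstract can be sketched in a few lines. This is an illustrative toy (per-user daily event counts and a z-score threshold are assumptions for the example), not the LANL models:

```python
# Hypothetical sketch of baseline-and-outlier detection for a user credential:
# learn the credential's own historical distribution, then flag observations
# that deviate strongly from it.
import statistics

def fit_baseline(daily_counts):
    """Per-credential baseline: mean and stdev of historical daily event counts."""
    return statistics.fmean(daily_counts), statistics.pstdev(daily_counts)

def is_outlier(count, mean, stdev, threshold=3.0):
    """Flag an observation more than `threshold` deviations from the baseline."""
    if stdev == 0:
        return count != mean
    return abs(count - mean) / stdev > threshold

history = [12, 15, 11, 14, 13, 12, 16]   # typical daily logins for one credential
mean, stdev = fit_baseline(history)
print(is_outlier(14, mean, stdev))       # within the credential's baseline -> False
print(is_outlier(90, mean, stdev))       # far outside the baseline -> True
```

    Real deployments would model many more features (hosts contacted, times of day, process activity) rather than a single count.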

  10. Living with Computers. Young Danes' Uses of and Thoughts on the Uses of Computers

    DEFF Research Database (Denmark)

    Stald, Gitte Bang

    1998-01-01

    Young Danes, computers, users, super users, non-users, computer access.

  11. Identifying online user reputation of user-object bipartite networks

    Science.gov (United States)

    Liu, Xiao-Lu; Liu, Jian-Guo; Yang, Kai; Guo, Qiang; Han, Jing-Ti

    2017-02-01

    Identifying online user reputation based on the rating information of the user-object bipartite networks is important for understanding online user collective behaviors. Based on the Bayesian analysis, we present a parameter-free algorithm for ranking online user reputation, where the user reputation is calculated based on the probability that their ratings are consistent with the main part of all user opinions. The experimental results show that the AUC values of the presented algorithm could reach 0.8929 and 0.8483 for the MovieLens and Netflix data sets, respectively, which is better than the results generated by the CR and IARR methods. Furthermore, the experimental results for different user groups indicate that the presented algorithm outperforms the iterative ranking methods in both ranking accuracy and computation complexity. Moreover, the results for the synthetic networks show that the computation complexity of the presented algorithm is a linear function of the network size, which suggests that the presented algorithm is very effective and efficient for the large scale dynamic online systems.
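
    The intuition behind such reputation ranking can be illustrated with a deliberately simplified sketch (not the paper's Bayesian formula): score each user by the fraction of their ratings that agree with the majority rating of the rated object. All names and data below are invented:

```python
# Toy reputation score on a user-object bipartite rating network:
# reputation = fraction of a user's ratings that match each object's majority rating.
from collections import Counter

ratings = {  # user -> {object: rating}
    "u1": {"film_a": 5, "film_b": 2, "film_c": 4},
    "u2": {"film_a": 5, "film_b": 2},
    "u3": {"film_a": 1, "film_b": 5, "film_c": 4},
}

def majority_rating(obj):
    """Most common rating given to an object across all users."""
    votes = Counter(r[obj] for r in ratings.values() if obj in r)
    return votes.most_common(1)[0][0]

def reputation(user):
    """Fraction of the user's ratings that agree with the majority opinion."""
    user_ratings = ratings[user]
    agree = sum(1 for obj, r in user_ratings.items() if r == majority_rating(obj))
    return agree / len(user_ratings)

scores = {user: reputation(user) for user in ratings}
print(scores)   # u3 rates against the majority on two objects, so scores lowest
```

    The published algorithm replaces the simple agreement fraction with a Bayesian probability, but the bipartite structure and "consistency with the main part of opinions" idea are the same.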

  12. Evacuation emergency response model coupling atmospheric release advisory capability output

    International Nuclear Information System (INIS)

    Rosen, L.C.; Lawver, B.S.; Buckley, D.W.; Finn, S.P.; Swenson, J.B.

    1983-01-01

    A Federal Emergency Management Agency (FEMA) sponsored project to develop a coupled set of models between those of the Lawrence Livermore National Laboratory (LLNL) Atmospheric Release Advisory Capability (ARAC) system and candidate evacuation models is discussed herein. This report describes the ARAC system and discusses the rapid computer code developed and the coupling with ARAC output. The computer code is adapted to the use of color graphics as a means to display and convey the dynamics of an emergency evacuation. The model is applied to a specific case of an emergency evacuation of individuals surrounding the Rancho Seco Nuclear Power Plant, located approximately 25 miles southeast of Sacramento, California. The graphics available to the model user for the Rancho Seco example are displayed and noted in detail. Suggestions for future, potential improvements to the emergency evacuation model are presented

  13. Graphical user interfaces and visually disabled users

    NARCIS (Netherlands)

    Poll, L.H.D.; Waterham, R.P.

    1995-01-01

    From February 1992 until the end of 1993, the authors ((IPO) Institute for Perception Research) participated in a European ((TIDE) Technology Initiative for Disabled and Elderly) project which addressed the problem arising for visually disabled computer-users from the growing use of Graphical User Interfaces.

  14. Tactile and bone-conduction auditory brain computer interface for vision and hearing impaired users.

    Science.gov (United States)

    Rutkowski, Tomasz M; Mori, Hiromu

    2015-04-15

    The paper presents a report on a recently developed BCI alternative for users suffering from impaired vision (lack of focus or eye-movements) or from the so-called "ear-blocking-syndrome" (limited hearing). We report on our recent studies of the extent to which vibrotactile stimuli delivered to the head of a user can serve as a platform for a brain computer interface (BCI) paradigm. In the proposed tactile and bone-conduction auditory BCI, stimuli at multiple novel head positions are used to evoke combined somatosensory and auditory (via the bone conduction effect) P300 brain responses, in order to define a multimodal tactile and bone-conduction auditory brain computer interface (tbcaBCI). To further remove EEG interferences and to improve P300 response classification, the synchrosqueezing transform (SST) is applied. SST outperforms the classical time-frequency analysis methods for non-linear and non-stationary signals such as EEG, and is also computationally more efficient than empirical mode decomposition. The SST filtering allows for online EEG preprocessing, which is essential in the case of BCI. Experimental results with healthy BCI-naive users performing online tbcaBCI validate the paradigm, while the feasibility of the concept is illuminated through information transfer rate case studies. We present a comparison of the proposed SST-based preprocessing method, combined with a logistic regression (LR) classifier, with classical preprocessing and LDA-based classification BCI techniques. The proposed tbcaBCI paradigm, together with data-driven preprocessing methods, is a step forward in robust BCI applications research. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Steerability Analysis of Tracked Vehicles: Theory and User’s Guide for Computer Program TVSTEER

    Science.gov (United States)

    1986-08-01

    Personal authors: Baladi, George Y.; Barnes, Donald E.; Berger, Rebecca P. Structures Laboratory, Department of the Army, Waterways Experiment Station, Corps of Engineers, PO Box... The mathematical model was formulated by Drs. George Y. Baladi and Behzad Rohani. The logic and computer programming were accomplished by Dr. Baladi and...

  16. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    Science.gov (United States)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    The capability was developed of rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP) by employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  17. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings.
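
    The "a posteriori" Bayesian step these tools perform can be illustrated with a toy sketch. Everything here is an assumption for the example (a one-compartment steady-state infusion model, Css = dose_rate / CL, a Gaussian population prior on clearance and a crude grid search); real TDM software uses full population pharmacokinetic models and proper optimizers:

```python
# Toy MAP (maximum a posteriori) estimate of a patient's drug clearance from one
# measured steady-state concentration, balancing the population prior against
# the individual measurement, then a dose proposal for a target concentration.
import math

def map_clearance(measured_css, dose_rate, prior_mean, prior_sd, assay_sd):
    """Grid search for the clearance (L/h) maximizing log prior + log likelihood."""
    best_cl, best_logpost = None, -math.inf
    for i in range(1, 400):
        cl = i * 0.05                                   # grid over clearance values
        predicted = dose_rate / cl                      # model-predicted Css (mg/L)
        log_prior = -((cl - prior_mean) ** 2) / (2 * prior_sd ** 2)
        log_lik = -((measured_css - predicted) ** 2) / (2 * assay_sd ** 2)
        if log_prior + log_lik > best_logpost:
            best_cl, best_logpost = cl, log_prior + log_lik
    return best_cl

# Measured 12 mg/L on a 60 mg/h infusion; population prior CL ~ N(4, 1) L/h.
cl = map_clearance(measured_css=12.0, dose_rate=60.0,
                   prior_mean=4.0, prior_sd=1.0, assay_sd=1.5)
new_rate = 10.0 * cl   # infusion rate (mg/h) targeting a Css of 10 mg/L
print(cl, new_rate)
```

    The estimate lands between the purely individual value (60/12 = 5 L/h) and the population mean (4 L/h), which is exactly the shrinkage behaviour Bayesian dose adaptation relies on.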

  18. A memory efficient user interface for CLIPS micro-computer applications

    Science.gov (United States)

    Sterle, Mark E.; Mayer, Richard J.; Jordan, Janice A.; Brodale, Howard N.; Lin, Min-Jin

    1990-01-01

    The goal of the Integrated Southern Pine Beetle Expert System (ISPBEX) is to provide expert level knowledge concerning treatment advice that is convenient and easy to use for Forest Service personnel. ISPBEX was developed in CLIPS and delivered on an IBM PC AT class micro-computer, operating with an MS/DOS operating system. This restricted the size of the run time system to 640K. In order to provide a robust expert system, with on-line explanation, help, and alternative actions menus, as well as features that allow the user to back up or execute 'what if' scenarios, a memory efficient menuing system was developed to interface with the CLIPS programs. By robust, we mean an expert system that (1) is user friendly, (2) provides reasonable solutions for a wide variety of domain specific problems, (3) explains why some solutions were suggested but others were not, and (4) provides technical information relating to the problem solution. Several advantages were gained by using this type of user interface (UI). First, by storing the menus on the hard disk (instead of main memory) during program execution, a more robust system could be implemented. Second, since the menus were built rapidly, development time was reduced. Third, the user may try a new scenario by backing up to any of the input screens and revising segments of the original input without having to retype all the information. And fourth, asserting facts from the menus provided for a dynamic and flexible fact base. This UI technology has been applied successfully in expert systems applications in forest management, agriculture, and manufacturing. This paper discusses the architecture of the UI system, human factors considerations, and the menu syntax design.

  19. Molecular Science Computing: 2010 Greenbook

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

    2010-04-02

    This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

  20. Fault Injection and Monitoring Capability for a Fault-Tolerant Distributed Computation System

    Science.gov (United States)

    Torres-Pomales, Wilfredo; Yates, Amy M.; Malekpour, Mahyar R.

    2010-01-01

    The Configurable Fault-Injection and Monitoring System (CFIMS) is intended for the experimental characterization of effects caused by a variety of adverse conditions on a distributed computation system running flight control applications. A product of research collaboration between NASA Langley Research Center and Old Dominion University, the CFIMS is the main research tool for generating actual fault response data with which to develop and validate analytical performance models and design methodologies for the mitigation of fault effects in distributed flight control systems. Rather than a fixed design solution, the CFIMS is a flexible system that enables the systematic exploration of the problem space and can be adapted to meet the evolving needs of the research. The CFIMS has the capabilities of system-under-test (SUT) functional stimulus generation, fault injection and state monitoring, all of which are supported by a configuration capability for setting up the system as desired for a particular experiment. This report summarizes the work accomplished so far in the development of the CFIMS concept and documents the first design realization.

  1. Revised user's guide to the 'DISPOSALS' model

    International Nuclear Information System (INIS)

    Laundy, R.S.; James, A.R.; Groom, M.S.; LeJeune, S.R.

    1985-04-01

    This report provides a User's Guide to the 'DISPOSALS' computer model and includes instructions on how to set up and run a specific problem together with details of the scope, theoretical basis, data requirements and capabilities of the model. The function of the 'DISPOSALS' model is to make assignments of nuclear waste material in an optimum manner to a number of disposal sites each subject to a number of constraints such as limits on the volume and activity. The user is able to vary the number of disposal sites, the range and limits of the constraints to be applied to each disposal site and the objective function for optimisation. The model is based on the Linear Programming technique and uses CAP Scientific's LAMPS and MAGIC packages. Currently the model has been implemented on CAP Scientific's VAX 11/750 minicomputer. (author)
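
    The assignment problem DISPOSALS solves can be illustrated with a minimal sketch. The real model uses linear programming (the LAMPS/MAGIC packages); the greedy fill below, with invented streams, sites and capacity numbers, only illustrates the volume and activity constraints:

```python
# Toy assignment of waste streams to disposal sites subject to per-site
# volume and activity limits (greedy first-fit, purely illustrative).
waste_streams = [              # (name, volume m3, activity GBq)
    ("stream_a", 40.0, 5.0),
    ("stream_b", 25.0, 9.0),
    ("stream_c", 20.0, 2.0),
]
sites = {                      # site -> remaining [volume, activity] capacity
    "site_1": [60.0, 10.0],
    "site_2": [50.0, 12.0],
}

assignment = {}
for name, vol, act in waste_streams:
    for site, (cap_vol, cap_act) in sites.items():
        if vol <= cap_vol and act <= cap_act:   # both constraints satisfied
            assignment[name] = site
            sites[site][0] -= vol               # consume volume capacity
            sites[site][1] -= act               # consume activity capacity
            break

print(assignment)
```

    An LP formulation would instead minimize (or maximize) an objective over all feasible assignments at once, which is what lets the user vary the objective function as the report describes.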

  2. Presenting collocates in a dictionary of computing and the Internet according to user needs

    DEFF Research Database (Denmark)

    Leroyer, Patrick; L'Homme, Marie-Claude; Jousse, Anne-Laure

    2011-01-01

    This paper presents a novel method for organizing and presenting collocations in a specialized dictionary of computing and the Internet. This work is undertaken in order to meet a specific user need, i.e. that of searching for a collocate (or a short list of collocates) that expresses a specific...

  3. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources.
Users can interactively discover the needed data and perform on-demand data analysis and modeling.

  4. UTOPIA—User-Friendly Tools for Operating Informatics Applications

    Science.gov (United States)

    Sinnott, J. R.; Attwood, T. K.

    2004-01-01

    Bioinformaticians routinely analyse vast amounts of information held both in large remote databases and in flat data files hosted on local machines. The contemporary toolkit available for this purpose consists of an ad hoc collection of data manipulation tools, scripting languages and visualization systems; these must often be combined in complex and bespoke ways, the result frequently being an unwieldy artefact capable of one specific task, which cannot easily be exploited or extended by other practitioners. Owing to the sizes of current databases and the scale of the analyses necessary, routine bioinformatics tasks are often automated, but many still require the unique experience and intuition of human researchers: this requires tools that support real-time interaction with complex datasets. Many existing tools have poor user interfaces and limited real-time performance when applied to realistically large datasets; much of the user's cognitive capacity is therefore focused on controlling the tool rather than on performing the research. The UTOPIA project is addressing some of these issues by building reusable software components that can be combined to make useful applications in the field of bioinformatics. Expertise in the fields of human computer interaction, high-performance rendering, and distributed systems is being guided by bioinformaticians and end-user biologists to create a toolkit that is both architecturally sound from a computing point of view, and directly addresses end-user and application-developer requirements. PMID:18629035

  5. Design and test of a hybrid foot force sensing and GPS system for richer user mobility activity recognition.

    Science.gov (United States)

    Zhang, Zelun; Poslad, Stefan

    2013-11-01

    Wearable and accompanying sensors and devices are increasingly being used for user activity recognition. However, typical GPS-based and accelerometer-based (ACC) methods face three main challenges: low recognition accuracy; coarse recognition capability, i.e., they cannot recognise both human posture (during travelling) and transportation mode simultaneously; and relatively high computational complexity. Here, a new GPS and Foot-Force (GPS + FF) sensor method is proposed to overcome these challenges that leverages a set of wearable FF sensors in combination with GPS, e.g., in a mobile phone. User mobility activities that can be recognised include both daily user postures and common transportation modes: sitting, standing, walking, cycling, bus passenger, car passenger (including private cars and taxis) and car driver. The novelty of this work is that our approach provides a more comprehensive recognition capability in terms of reliably recognising both human posture and transportation mode simultaneously during travel. In addition, when compared with an ACC method (62% accuracy) and a GPS + ACC based method (70% accuracy) as baselines, the new GPS + FF method obtains higher accuracy (95%) with less computational complexity, when tested on a dataset obtained from ten individuals.
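
    The idea of fusing GPS speed with foot-force readings can be sketched as a simple rule-based classifier. The thresholds and feature names below are invented for illustration; the paper's method is learned from data rather than hand-coded:

```python
# Hypothetical rule-based fusion of GPS speed and normalized foot pressures:
# foot force disambiguates posture (sitting vs standing) while speed
# disambiguates transportation mode.
def classify(speed_kmh, left_ff, right_ff):
    """speed from GPS; left_ff/right_ff are normalized foot pressures in [0, 1]."""
    seated = left_ff < 0.2 and right_ff < 0.2   # little weight on the feet
    if speed_kmh < 1:
        return "sitting" if seated else "standing"
    if speed_kmh < 7:
        return "walking"
    if speed_kmh < 25 and not seated:
        return "cycling"
    return "vehicle passenger" if seated else "vehicle (standing)"

print(classify(0.2, 0.05, 0.06))    # stationary, unloaded feet -> sitting
print(classify(5.0, 0.9, 0.1))      # moderate speed, loaded feet -> walking
print(classify(40.0, 0.1, 0.1))     # high speed, unloaded feet -> vehicle passenger
```

    Note how speed alone cannot separate a seated bus passenger from a cyclist; the foot-force channel supplies exactly the posture information GPS lacks, which is the paper's central point.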

  6. Computer code structure for evaluation of fire protection measures and fighting capability at nuclear plants

    International Nuclear Information System (INIS)

    Anton, V.

    1997-01-01

    In this work a computer code structure for evaluating Fire Protection Measures (FPM) and Fire Fighting Capability (FFC) at Nuclear Power Plants (NPP) is presented. It allows the category to which a given NPP belongs (satisfactory (s), needs further evaluation (n), unsatisfactory (u)) to be evaluated as a self-check in advance of an IAEA inspection. This possibility of self-assessment follows from IAEA documents. Our approach is based on international experience gained in this field and stated in IAEA recommendations. As an illustration, we use FORTRAN statements to make the structure of the computer code for this problem clear. The program can be designed so that text messages in English and Romanian are displayed alongside the percentage assessments. (author)
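
    The s/n/u categorisation described above can be sketched in a few lines (the percentage thresholds and the bilingual message strings are invented for the example; the original code is in FORTRAN):

```python
# Toy mapping of a percentage compliance score to the three categories the
# abstract names, with illustrative English/Romanian display messages.
MESSAGES = {
    "s": ("satisfactory", "satisfacator"),
    "n": ("needs further evaluation", "necesita evaluare suplimentara"),
    "u": ("unsatisfactory", "nesatisfacator"),
}

def categorise(percent):
    """Assumed thresholds: >=80% satisfactory, >=50% needs evaluation, else unsatisfactory."""
    if percent >= 80:
        return "s"
    if percent >= 50:
        return "n"
    return "u"

for score in (92, 65, 30):
    cat = categorise(score)
    print(f"{score}% -> {MESSAGES[cat][0]} / {MESSAGES[cat][1]}")
```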

  7. 78 FR 18353 - Guidance for Industry: Blood Establishment Computer System Validation in the User's Facility...

    Science.gov (United States)

    2013-03-26

    ... SUPPLEMENTARY INFORMATION section for electronic access to the guidance document. Submit electronic comments on... document entitled ``Guidance for Industry: Blood Establishment Computer System Validation in the User's... document to http://www.regulations.gov or written comments to the Division of Dockets Management (see...

  8. Secure and Resilient Cloud Computing for the Department of Defense

    Science.gov (United States)

    2015-07-21

    scalability of resource usage. Lincoln Laboratory is developing technology that will strengthen the security and resilience of cloud computing so that the...capabilities are outsourced to a provider that delivers services to a cloud user (also called a tenant). The DoD is looking to the cloud computing model...hardware. Today’s cloud providers and the technology that underpins them are focused on the availability and scalability of services and not on DoD

  9. Engineering computer graphics in gas turbine engine design, analysis and manufacture

    Science.gov (United States)

    Lopatka, R. S.

    1975-01-01

    A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are also described.

  10. Computer systems for annotation of single molecule fragments

    Science.gov (United States)

    Schwartz, David Charles; Severin, Jessica

    2016-07-19

    There are provided computer systems for visualizing and annotating single molecule images. Annotation systems in accordance with this disclosure allow a user to mark and annotate single molecules of interest and their restriction enzyme cut sites thereby determining the restriction fragments of single nucleic acid molecules. The markings and annotations may be automatically generated by the system in certain embodiments and they may be overlaid translucently onto the single molecule images. An image caching system may be implemented in the computer annotation systems to reduce image processing time. The annotation systems include one or more connectors connecting to one or more databases capable of storing single molecule data as well as other biomedical data. Such diverse array of data can be retrieved and used to validate the markings and annotations. The annotation systems may be implemented and deployed over a computer network. They may be ergonomically optimized to facilitate user interactions.
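
    The image-caching idea mentioned in this record (keeping recently viewed tiles in memory to reduce processing time) can be sketched with a least-recently-used cache. The class, capacity and loader below are invented for illustration:

```python
# Toy LRU cache for single-molecule image tiles: repeated views hit memory
# instead of re-reading from disk; the least recently used tile is evicted
# when the cache is full.
from collections import OrderedDict

class ImageCache:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self._tiles = OrderedDict()   # insertion/access order tracks recency
        self.misses = 0

    def get(self, tile_id, loader):
        if tile_id in self._tiles:
            self._tiles.move_to_end(tile_id)        # mark as recently used
        else:
            self.misses += 1
            self._tiles[tile_id] = loader(tile_id)  # simulate a disk read
            if len(self._tiles) > self.capacity:
                self._tiles.popitem(last=False)     # evict least recently used
        return self._tiles[tile_id]

def load(tile_id):
    return f"pixels-of-{tile_id}"   # stand-in for reading image data from disk

cache = ImageCache(capacity=2)
cache.get("t1", load)
cache.get("t2", load)
cache.get("t1", load)        # hit: t1 becomes most recently used
cache.get("t3", load)        # miss: evicts t2, the least recently used tile
```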

  11. Computing facility at SSC for detectors

    International Nuclear Information System (INIS)

    Leibold, P.; Scipiono, B.

    1990-01-01

    A description of the RISC-based distributed computing facility for detector simulaiton being developed at the SSC Laboratory is discussed. The first phase of this facility is scheduled for completion in early 1991. Included is the status of the project, overview of the concepts used to model and define system architecture, networking capabilities for user access, plans for support of physics codes and related topics concerning the implementation of this facility

  12. Evaluation of Rankine cycle air conditioning system hardware by computer simulation

    Science.gov (United States)

    Healey, H. M.; Clark, D.

    1978-01-01

    A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation and air conditioning components.
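
    A performance map built from tabulated manufacturer data is, at its simplest, an interpolation table. The sketch below uses invented chiller numbers; the actual SOLRAD subroutine's maps and variables are not shown in this record:

```python
# Toy component performance map: tabulated manufacturer data points evaluated
# at off-design conditions by piecewise-linear interpolation.
def interp(x, xs, ys):
    """Linear lookup in a sorted table; clamps outside the tabulated range."""
    if x <= xs[0]:
        return ys[0]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]

# Chiller COP versus condenser temperature, from a fictitious data sheet.
temps = [25.0, 30.0, 35.0, 40.0]   # degC
cops = [4.0, 3.6, 3.1, 2.5]

print(interp(32.5, temps, cops))   # off-design point between tabulated entries
```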

  13. Neural Correlates of User-initiated Motor Success and Failure - A Brain-Computer Interface Perspective.

    Science.gov (United States)

    Yazmir, Boris; Reiner, Miriam

    2018-05-15

    Any motor action is, by nature, potentially accompanied by human errors. In order to facilitate the development of error-tailored Brain-Computer Interface (BCI) correction systems, we focused on internal, human-initiated errors and investigated EEG correlates of user outcome successes and errors during a continuous 3D virtual tennis game against a computer player. We used a multisensory, 3D, highly immersive environment. Missing and repelling the tennis ball were considered as 'error' (miss) and 'success' (repel), respectively. Unlike most previous studies, where the environment "encouraged" the participant to make a mistake, here errors happened naturally, resulting from motor-perceptual-cognitive processes of incorrect estimation of the ball kinematics, and can be regarded as user-internal, self-initiated errors. Results show distinct and well-defined Event-Related Potentials (ERPs), embedded in the ongoing EEG, that differ across conditions in waveform, scalp signal distribution maps, source estimation results (sLORETA) and time-frequency patterns, establishing a series of typical features that allow valid discrimination between user-internal outcome success and error. The significant delay in latency between the positive peaks of error- and success-related ERPs suggests a cross-talk between top-down and bottom-up processing, represented by an outcome recognition process in the context of the game world. Success-related ERPs had a central scalp distribution, while error-related ERPs were centro-parietal. The unique characteristics and sharp differences between EEG correlates of error and success provide the crucial components for an improved BCI system: the features of the EEG waveform can be used to detect user action outcome, to be fed into the BCI correction system. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.
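
    The core ERP analysis described (averaging epochs per condition and comparing positive-peak latencies) can be illustrated on synthetic data. The sampling rate, ERP shapes and latencies below are invented; real analyses work on recorded EEG:

```python
# Toy ERP latency comparison: average many noisy single-trial epochs per
# condition, then locate the positive peak of each grand-average waveform.
import numpy as np

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 0.8, 1 / fs)               # 0-800 ms post-event epoch

def epochs(peak_s, n=40, seed=0):
    """Synthetic single-trial epochs: a Gaussian ERP bump plus noise."""
    rng = np.random.default_rng(seed)
    erp = np.exp(-((t - peak_s) ** 2) / (2 * 0.05 ** 2))
    return erp + 0.5 * rng.standard_normal((n, t.size))

success = epochs(peak_s=0.30, seed=1)       # earlier positive peak
error = epochs(peak_s=0.40, seed=2)         # delayed positive peak

def peak_latency_ms(trials):
    """Latency of the positive peak of the trial-averaged waveform."""
    return 1000 * t[np.argmax(trials.mean(axis=0))]

print(peak_latency_ms(success), peak_latency_ms(error))
```

    Averaging suppresses the trial-to-trial noise so the condition-specific peak, and hence the latency difference between error and success, becomes measurable.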

  14. Touch in Computer-Mediated Environments: An Analysis of Online Shoppers' Touch-Interface User Experiences

    Science.gov (United States)

    Chung, Sorim

    2016-01-01

    Over the past few years, one of the most fundamental changes in current computer-mediated environments has been input devices, moving from mouse devices to touch interfaces. However, most studies of online retailing have not considered device environments as retail cues that could influence users' shopping behavior. In this research, I examine the…

  15. A tool for computing diversity and consideration on differences between diversity indices

    OpenAIRE

    Palaghianu, Ciprian

    2016-01-01

    Diversity represents a key concept in ecology, and there are various methods of assessing it. The multitude of diversity indices is quite puzzling, and they are sometimes difficult to compute for a large volume of data. This paper promotes a computational tool used to assess the diversity of different entities. The BIODIV software is a user-friendly tool, developed using Microsoft Visual Basic. It is capable of computing several diversity indices such as: Shannon, Simpson, Pielou, Brillouin, Berger-Parker...
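
    The indices named here are all short formulas over species abundance counts. A sketch of four of them (standard textbook definitions; not the BIODIV implementation, which is in Visual Basic):

```python
# Diversity indices from a list of per-species abundance counts.
import math

def shannon(counts):
    """Shannon index H' = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c)

def simpson(counts):
    """Gini-Simpson index 1 - sum(p_i^2)."""
    n = sum(counts)
    return 1 - sum((c / n) ** 2 for c in counts)

def pielou(counts):
    """Pielou evenness J = H' / ln(S), where S is the number of species present."""
    s = sum(1 for c in counts if c)
    return shannon(counts) / math.log(s)

def berger_parker(counts):
    """Berger-Parker dominance: relative abundance of the most abundant species."""
    return max(counts) / sum(counts)

counts = [50, 30, 20]
print(shannon(counts), simpson(counts), pielou(counts), berger_parker(counts))
```

    A perfectly even community gives Pielou evenness 1, and Shannon equals ln(S), which makes these two indices easy to sanity-check.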

  16. Application of Selective Algorithm for Effective Resource Provisioning in Cloud Computing Environment

    OpenAIRE

    Katyal, Mayanka; Mishra, Atul

    2014-01-01

    Modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of cloud computing. The cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selective algorithm for effective resource provisioning in the cloud computing environment.

  17. Evaluating the influence of perceived organizational learning capability on user acceptance of information technology among operating room nurse staff.

    Science.gov (United States)

    Lee, Chien-Ching; Lin, Shih-Pin; Yang, Shu-Ling; Tsou, Mei-Yung; Chang, Kuang-Yi

    2013-03-01

    Medical institutions are eager to introduce new information technology to improve patient safety and clinical efficiency. However, the acceptance of new information technology by medical personnel plays a key role in its adoption and application. This study aims to investigate whether perceived organizational learning capability (OLC) is associated with user acceptance of information technology among operating room nurse staff. Nurse anesthetists and operating room nurses were recruited in this questionnaire survey. A pilot study was performed to ensure the reliability and validity of the translated questionnaire, which consisted of 14 items from the four dimensions of OLC, and 16 items from the four constructs of user acceptance of information technology, including performance expectancy, effort expectancy, social influence, and behavioral intention. Confirmatory factor analysis was applied in the main survey to evaluate the construct validity of the questionnaire. Structural equation modeling was used to test the hypothetical relationships between the four dimensions of user acceptance of information technology and the second-ordered OLC. Goodness of fit of the hypothetic model was also assessed. Performance expectancy, effort expectancy, and social influence positively influenced behavioral intention of users of the clinical information system (all p < 0.001) and accounted for 75% of its variation. The second-ordered OLC was positively associated with performance expectancy, effort expectancy, and social influence (all p < 0.001). However, the hypothetic relationship between perceived OLC and behavioral intention was not significant (p = 0.87). The fit statistical analysis indicated reasonable model fit to data (root mean square error of approximation = 0.07 and comparative fit index = 0.91). Perceived OLC indirectly affects user behavioral intention through the mediation of performance expectancy, effort expectancy, and social influence in the operating room

  18. Theoretical background and user's manual for the computer code on groundwater flow and radionuclide transport calculation in porous rock

    International Nuclear Information System (INIS)

    Shirakawa, Toshihiko; Hatanaka, Koichiro

    2001-11-01

    In order to provide a basic manual covering the input data, output data, and execution of a computer code for groundwater flow and radionuclide transport calculation in heterogeneous porous rock, we investigated the theoretical background of the geostatistical computer codes and prepared a user's manual for the groundwater flow and radionuclide transport code, which calculates three-dimensional groundwater flow, the paths of migrating radionuclides, and one-dimensional radionuclide migration. Based on this investigation, this report describes the geostatistical background for simulating a heterogeneous permeability field, along with the file structure, input and output data, and an example calculation for the programs that simulate the heterogeneous permeability field and calculate groundwater flow and radionuclide transport. The information in this report can therefore be used to model heterogeneous porous rock and to analyze groundwater flow and radionuclide transport. (author)
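As a rough illustration of the geostatistical step described above (the report's own algorithm may differ), a one-dimensional spatially correlated log-normal permeability field can be generated by Cholesky factorization of an exponential covariance model:

```python
import numpy as np

# 1-D illustration: a spatially correlated log-permeability field from the
# Cholesky factor of an exponential covariance model, a common geostatistical
# construction. All parameter values below are illustrative.
rng = np.random.default_rng(42)
n, dx, corr_len, sigma = 100, 1.0, 10.0, 1.0

x = np.arange(n) * dx
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n))  # jitter for numerical stability
log_k = L @ rng.normal(size=n)                 # correlated Gaussian field
k = 1e-12 * np.exp(log_k)                      # log-normal permeability, m^2

# Steady 1-D Darcy flow through the heterogeneous column: the effective
# permeability of layers in series is the harmonic mean.
k_eff = n / np.sum(1.0 / k)
print(f"harmonic-mean effective permeability: {k_eff:.3e} m^2")
```

The harmonic mean is always at or below the arithmetic mean, which is why low-permeability layers dominate flow across a heterogeneous column.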

  19. MINTEQ user's manual

    International Nuclear Information System (INIS)

    Peterson, S.R.; Hostetler, C.J.; Deutsch, W.J.; Cowan, C.E.

    1987-02-01

    This manual will aid the user in applying the MINTEQ geochemical computer code to model aqueous solutions and the interactions of aqueous solutions with hypothesized assemblages of solid phases. The manual will provide a basic understanding of how the MINTEQ computer code operates and the important principles that are incorporated into the code and instruct a user of the MINTEQ code on how to create input files to simulate a variety of geochemical problems. Chapters 2 through 8 are for the user who has some experience with or wishes to review the principles important to geochemical computer codes. These chapters include information on the methodology MINTEQ uses to incorporate these principles into the code. Chapters 9 through 11 are for the user who wants to know how to create input data files to model various types of problems. 35 refs., 2 figs., 5 tabs
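To give a flavor of the equilibrium chemistry such codes handle, the sketch below computes carbonate speciation as a function of pH in closed form from the two mass-action constants (approximate 25 °C values; MINTEQ itself solves such systems numerically for arbitrary component assemblages):

```python
import numpy as np

# Carbonate speciation vs. pH from the two dissociation equilibria.
# K1, K2 are approximate 25 C values for carbonic acid.
K1, K2 = 10**-6.35, 10**-10.33
C_T = 1e-3  # total dissolved carbonate, mol/L (illustrative)

def carbonate_fractions(pH):
    """Fractions of [H2CO3*, HCO3-, CO3--] at the given pH."""
    h = 10.0**-pH
    denom = h * h + K1 * h + K1 * K2
    return np.array([h * h, K1 * h, K1 * K2]) / denom

f = carbonate_fractions(7.0)
print("fractions at pH 7 (H2CO3*, HCO3-, CO3--):", f.round(3))
print("HCO3- concentration, mol/L:", C_T * f[1])
```

At pH 7 the bicarbonate ion dominates, as expected between the two pK values; MINTEQ generalizes exactly this mass-action/mass-balance structure to many components and solid phases.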

  20. Users manual for CAFE-3D : a computational fluid dynamics fire code

    International Nuclear Information System (INIS)

    Khalil, Imane; Lopez, Carlos; Suo-Anttila, Ahti Jorma

    2005-01-01

    The Container Analysis Fire Environment (CAFE) computer code has been developed to model all relevant fire physics for predicting the thermal response of massive objects engulfed in large fires. It provides realistic fire thermal boundary conditions for use in design of radioactive material packages and in risk-based transportation studies. The CAFE code can be coupled to commercial finite-element codes such as MSC PATRAN/THERMAL and ANSYS. This coupled system of codes can be used to determine the internal thermal response of finite element models of packages to a range of fire environments. This document is a user manual describing how to use the three-dimensional version of CAFE, as well as a description of CAFE input and output parameters. Since this is a user manual, only a brief theoretical description of the equations and physical models is included

  1. Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide. Final Report

    International Nuclear Information System (INIS)

    Pelaccio, D.G.; Scheil, C.M.; Petrosky, L.

    1993-03-01

    This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in Dec. 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.

  2. A content validity approach to creating an end-user computer skill assessment tool

    Directory of Open Access Journals (Sweden)

    Shirley Gibbs

    Full Text Available Practical assessment instruments are commonly used in the workplace and educational environments to assess a person's level of digital literacy and end-user computer skill. However, it is often difficult to find statistical evidence of the actual validity of instruments being used. To ensure that the correct factors are being assessed for a particular purpose it is necessary to undertake some type of psychometric testing, and the first step is to study the content relevance of the measure. The purpose of this paper is to report on the rigorous judgment-quantification process using panels of experts in order to establish inter-rater reliability and agreement in the development of end-user instruments developed to measure workplace skills using spreadsheet and word-processing applications.

  3. An Approach to Providing a User Interface for Military Computer-Aided- Instruction in 1980

    Science.gov (United States)

    1975-11-01

    commercial terminals is the use of a microprocessor unit (MPU) LSI chip controller. This technology is flexible and economical and can be expected to...various segments. By using an MPU and developing a software capability, the vendor can quickly and economically satisfy a large spectrum of user...the basis for an effective and economical user interface to military CAI systems.

  4. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    International Nuclear Information System (INIS)

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced ''best estimate'' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  5. User interface support

    Science.gov (United States)

    Lewis, Clayton; Wilde, Nick

    1989-01-01

    Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.
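A minimal sketch of the spreadsheet model of computation the tool builds on (all names are illustrative): cells hold values or formulas, and formula cells recompute whenever an input changes, which is what makes dynamic interface modification possible without conventional programming:

```python
# Tiny spreadsheet-style recomputation engine: setting a value re-evaluates
# every formula cell, so dependents always reflect current inputs.
class Sheet:
    def __init__(self):
        self.formulas = {}  # cell name -> callable(values dict)
        self.values = {}    # cell name -> current value

    def set_value(self, cell, value):
        self.values[cell] = value
        self.recalculate()

    def set_formula(self, cell, fn):
        self.formulas[cell] = fn
        self.recalculate()

    def recalculate(self):
        # Naive fixed-point recalculation; real systems track a dependency graph.
        for _ in range(len(self.formulas) + 1):
            for cell, fn in self.formulas.items():
                self.values[cell] = fn(self.values)

sheet = Sheet()
sheet.set_value("width", 4)
sheet.set_value("height", 3)
sheet.set_formula("area", lambda v: v["width"] * v["height"])
sheet.set_formula("label", lambda v: f"area = {v['area']}")
print(sheet.values["label"])  # prints "area = 12"
sheet.set_value("width", 10)  # dependents update automatically
print(sheet.values["label"])  # prints "area = 30"
```

A UI built this way can bind display elements to formula cells, so interface behavior changes as soon as a cell is edited, with no compile step.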

  6. An overview of the SAFSIM computer program

    International Nuclear Information System (INIS)

    Dobranich, D.

    1993-01-01

    SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program that provides engineering simulations of user-specified flow networks at the system level. It includes fluid mechanics, heat transfer, and reactor dynamics capabilities. SAFSIM provides sufficient versatility to allow the simulation of almost any flow system, from a backyard sprinkler system to a clustered nuclear reactor propulsion system. In addition to versatility, speed and robustness are primary goals of SAFSIM development. The current capabilities of SAFSIM are summarized and some sample applications are presented. It is applied here to a nuclear thermal propulsion system and nuclear rocket engine test facility
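A toy calculation in the same spirit (illustrative only; not SAFSIM's algorithm or input format): two pipes in series with quadratic pressure loss, where a Newton iteration finds the intermediate node pressure that balances mass flow.

```python
import numpy as np

# Two pipes in series between fixed-pressure boundaries, with quadratic
# pressure loss dP = R * Q * |Q|. Newton iteration on the node mass balance.
p_in, p_out = 200e3, 100e3  # boundary pressures, Pa
R1, R2 = 1e7, 3e7           # loss coefficients (illustrative)

def q(dp, R):
    """Flow through a pipe for a given pressure drop (sign-preserving)."""
    return np.sign(dp) * np.sqrt(abs(dp) / R)

p = 0.5 * (p_in + p_out)  # initial guess for the node pressure
for _ in range(50):
    residual = q(p_in - p, R1) - q(p - p_out, R2)  # net inflow at the node
    eps = 1.0                                       # finite-difference step, Pa
    dres = (q(p_in - (p + eps), R1) - q((p + eps) - p_out, R2) - residual) / eps
    p -= residual / dres

Q = q(p_in - p, R1)
print(f"node pressure: {p/1e3:.1f} kPa, flow: {Q:.4f} (arb. units)")
```

The analytic answer here is p = 175 kPa (the 100 kPa total drop splits in the ratio R1:R2), which the iteration recovers; a network code generalizes the same node-balance idea to many nodes and branches.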

  7. Neuronvisio: a Graphical User Interface with 3D capabilities for NEURON

    Directory of Open Access Journals (Sweden)

    Michele eMattioni

    2012-06-01

    Full Text Available The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, no standard means to save the results of simulations, or to store the model geometry within the results. Neuronvisio (http://mattions.github.com/neuronvisio/) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability of saving numerical results allows users to perform additional analysis on their previous simulations.

  8. The Los Alamos universe: Using multimedia to promote laboratory capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Kindel, J.

    2000-03-01

    This project consists of a multimedia presentation that explains the technological capabilities of Los Alamos National Laboratory. It takes the form of a human-computer interface built around the metaphor of the universe. The project is intended to promote Laboratory capabilities to a wide audience. Multimedia is simply a means of communicating information through a diverse set of tools--be they text, sound, animation, video, etc. Likewise, Los Alamos National Laboratory is a collection of diverse technologies, projects, and people. Given the ample material available at the Laboratory, there are tangible benefits to be gained by communicating across media. This paper consists of three parts. The first section provides some basic information about the Laboratory, its mission, and its needs. The second section introduces this multimedia presentation and the metaphor it is based on along with some basic concepts of color and user interaction used in the building of this project. The final section covers construction of the project, pitfalls, and future improvements.

  9. A user's guide to the SASSYS-1 control system modeling capability

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1987-06-01

    This report describes a control system modeling capability that has been developed for the analysis of control schemes for advanced liquid metal reactors. The general class of control equations that can be represented using the modeling capability is identified, and the numerical algorithms used to solve these equations are described. The modeling capability has been implemented in the SASSYS-1 systems analysis code. A description of the card input, a sample input deck and some guidelines for running the code are given
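The class of control equations involved can be illustrated with a minimal example (gains, time constant, and the explicit Euler scheme below are all illustrative; SASSYS-1's actual equation forms and numerical algorithms are described in the report): a PI controller driving a first-order plant toward a setpoint.

```python
# PI controller + first-order plant, advanced with explicit Euler.
kp, ki = 2.0, 0.5          # proportional and integral gains (illustrative)
tau = 5.0                  # plant time constant, s
setpoint, y, integ = 1.0, 0.0, 0.0
dt, t_end = 0.01, 60.0

t = 0.0
while t < t_end:
    err = setpoint - y
    integ += err * dt              # integral state
    u = kp * err + ki * integ      # PI control action
    y += dt * (u - y) / tau        # first-order plant response
    t += dt

print(f"plant output after {t_end:.0f} s: {y:.3f} (setpoint {setpoint})")
```

The integral term drives the steady-state error to zero, so the output settles at the setpoint; a systems code embeds blocks like this inside the coupled thermal-hydraulic solution.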

  10. Xyce parallel electronic simulator : users' guide.

    Energy Technology Data Exchange (ETDEWEB)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.; Santarelli, Keith R.; Fixel, Deborah A.; Coffey, Todd Stirling; Russo, Thomas V.; Schiek, Richard Louis; Warrender, Christina E.; Keiter, Eric Richard; Pawlowski, Roger Patrick

    2011-05-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers; (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is
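As a concrete illustration of the SPICE-style input such a simulator consumes, the sketch below is a generic netlist for the transient step response of an RC low-pass filter (generic SPICE syntax; Xyce is SPICE-compatible, but consult the Xyce users' guide for its exact commands and options):

```spice
* Step response of an RC low-pass filter (generic SPICE netlist)
V1 in 0 PULSE(0 5 0 1n 1n 1m 2m)
R1 in out 1k
C1 out 0 1u
.TRAN 10u 5m
.PRINT TRAN V(in) V(out)
.END
```

The same netlist-level description scales to the very large circuits the parallel solver targets; only the analysis and output directives change.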

  11. Implementation of internet training on posture reform of computer users in iran.

    Science.gov (United States)

    Keykhaie, Zohreh; Zareban, Iraj; Shahrakipoor, Mahnaz; Hormozi, Maryam; Sharifi-Rad, Javad; Masoudi, Gholamreza; Rahimi, Fatemeh

    2014-12-01

    Musculoskeletal disorders are of common problems among computer (PC) users. Training of posture reform plays a significant role in the prevention of the emergence, progression and complications of these diseases. The present research was performed to study the effect of the Internet training on the posture reform of the Internet users working in two Iranian universities including Sistan and Baluchestan University and Islamic Azad University of Zahedan in 2014. This study was a quasi-experimental intervention with control group and conducted in two Iranian universities including Sistan and Baluchestan University and Islamic Azad University of Zahedan. The study was done on 160 PC users in the two groups of intervention (80 people) and control (80 people). Training PowerPoint was sent to the intervention group through the Internet and a post test was given to them after 45 days. Statistical software of SPSS 19 and statistical tests of Kolmogrov, t-test, Fisher Exact test, and correlation coefficient were used for data analysis. After the training, the mean scores of knowledge, attitude, performance and self-efficacy in the intervention group were 24.21 ± 1.34, 38.36 ± 2.89, 7.59 ± 1.16, and 45.06 ± 4.11, respectively (P Internet had a significant impact on the posture reform of the PC users. According to the findings observed, there was a significant relationship between the scores of self-efficacy-performance after training. Therefore, based on the findings of the study, it is suggested that Internet training to increase self-efficacy approach in the successive periods can be effective to reform the postures of PC users.

  12. Relationship Between Ocular Surface Disease Index, Dry Eye Tests, and Demographic Properties in Computer Users

    Directory of Open Access Journals (Sweden)

    Hüseyin Simavlı

    2014-03-01

    Full Text Available Objectives: The aim of the present study is to evaluate the ocular surface disease index (OSDI) in computer users and to investigate the correlations of this index with dry eye tests and demographic properties. Materials and Methods: In this prospective study, 178 subjects with an age range of 20-40 years who spent most of their daily life in front of computers were included. All participants underwent a complete ophthalmologic examination including basal secretion test, tear break-up time test, and ocular surface staining. In addition, all patients completed the OSDI questionnaire. Results: A total of 178 volunteers (101 female, 77 male) with a mean age of 28.8±4.5 years were included in the study. Mean time of computer use was 7.7±1.9 (5-14) hours/day, and mean computer use period was 71.1±39.7 (4-204) months. Mean OSDI score was 44.1±24.7 (0-100). There was a significant negative correlation between the OSDI score and tear break-up time test in the right (p=0.005, r=-0.21) and the left eyes (p=0.003, r=-0.22). There was a significant positive correlation between the OSDI score and gender (p=0.014, r=0.18) and daily computer usage time (p=0.008, r=0.2). In addition to this, there was a significant positive correlation between the OSDI score and ocular surface staining pattern in the right (p=0.03, r=0.16) and the left eyes (p=0.03, r=0.17). Age, smoking, type of computer, use of glasses, presence of symptoms, and basal secretion test were not found to be correlated with OSDI score. Conclusions: Long-term computer use causes ocular surface problems. The OSDI score was found to be correlated with tear break-up time test, gender, daily computer usage time, and ocular surface staining pattern in computer users. (Turk J Ophthalmol 2014; 44: 115-8)
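The kind of correlation analysis reported above is straightforward to reproduce on synthetic data (the numbers below are simulated for illustration and are not the study's data):

```python
import numpy as np

# Pearson correlation between simulated daily computer-use hours and an
# OSDI-like score (higher = worse symptoms), mirroring the study's design.
rng = np.random.default_rng(1)
n = 178
hours = rng.uniform(5, 14, size=n)                     # daily use, h
osdi = 20 + 3.0 * hours + rng.normal(scale=15, size=n)  # noisy linear trend
osdi = np.clip(osdi, 0, 100)                            # OSDI is bounded 0-100

r = np.corrcoef(hours, osdi)[0, 1]
print(f"Pearson r between hours/day and OSDI score: {r:.2f}")
```

With this signal-to-noise ratio the correlation comes out moderate and positive, i.e. more daily screen time accompanies higher symptom scores, the direction the study reports.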

  13. Computational physics and applied mathematics capability review June 8-10, 2010 (Advance materials to committee members)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Stephen R [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory will review its Computational Physics and Applied Mathematics (CPAM) capabilities in 2010. The goals of capability reviews are to assess the quality of science, technology, and engineering (STE) performed by the capability, evaluate the integration of this capability across the Laboratory and within the scientific community, examine the relevance of this capability to the Laboratory's programs, and provide advice on the current and future directions of this capability. This is the first such review for CPAM, which has a long and unique history at the laboratory, starting from the inception of the Laboratory in 1943. The CPAM capability covers an extremely broad technical area at Los Alamos, encompassing a wide array of disciplines, research topics, and organizations. A vast array of technical disciplines and activities are included in this capability, from general numerical modeling, to coupled multi-physics simulations, to detailed domain science activities in mathematics, methods, and algorithms. The CPAM capability involves over 12 different technical divisions and a majority of our programmatic and scientific activities. To make this large scope tractable, the CPAM capability is broken into the following six technical 'themes.' These themes represent technical slices through the CPAM capability and collect critical core competencies of the Laboratory, each of which contributes to the capability (and each of which is divided into multiple additional elements in the detailed descriptions of the themes in subsequent sections): (1) Computational Fluid Dynamics - This theme speaks to the vast array of scientific capabilities for the simulation of fluids under shocks, low-speed flow, and turbulent conditions - which are key, historical, and fundamental strengths of the laboratory; (2) Partial Differential Equations - The technical scope of this theme is the applied mathematics and numerical solution of partial

  14. User-centered virtual environment design for virtual rehabilitation

    Directory of Open Access Journals (Sweden)

    Rizzo Albert A

    2010-02-01

    Full Text Available Abstract Background As physical and cognitive rehabilitation protocols utilizing virtual environments transition from single applications to comprehensive rehabilitation programs there is a need for a new design cycle methodology. Current human-computer interaction designs focus on usability without benchmarking technology within a user-in-the-loop design cycle. The field of virtual rehabilitation is unique in that determining the efficacy of this genre of computer-aided therapies requires prior knowledge of technology issues that may confound patient outcome measures. Benchmarking the technology (e.g., displays or data gloves) using healthy controls may provide a means of characterizing the "normal" performance range of the virtual rehabilitation system. This standard not only allows therapists to select appropriate technology for use with their patient populations, it also allows them to account for technology limitations when assessing treatment efficacy. Methods An overview of the proposed user-centered design cycle is given. Comparisons of two optical see-through head-worn displays provide an example of benchmarking techniques. Benchmarks were obtained using a novel vision test capable of measuring a user's stereoacuity while wearing different types of head-worn displays. Results from healthy participants who performed both virtual and real-world versions of the stereoacuity test are discussed with respect to virtual rehabilitation design. Results The user-centered design cycle argues for benchmarking to precede virtual environment construction, especially for therapeutic applications. Results from real-world testing illustrate the general limitations in stereoacuity attained when viewing content using a head-worn display. Further, the stereoacuity vision benchmark test highlights differences in user performance when utilizing a similar style of head-worn display.
These results support the need for including benchmarks as a means of better understanding user outcomes, especially for patient populations.

  15. User-centered virtual environment design for virtual rehabilitation.

    Science.gov (United States)

    Fidopiastis, Cali M; Rizzo, Albert A; Rolland, Jannick P

    2010-02-19

    As physical and cognitive rehabilitation protocols utilizing virtual environments transition from single applications to comprehensive rehabilitation programs there is a need for a new design cycle methodology. Current human-computer interaction designs focus on usability without benchmarking technology within a user-in-the-loop design cycle. The field of virtual rehabilitation is unique in that determining the efficacy of this genre of computer-aided therapies requires prior knowledge of technology issues that may confound patient outcome measures. Benchmarking the technology (e.g., displays or data gloves) using healthy controls may provide a means of characterizing the "normal" performance range of the virtual rehabilitation system. This standard not only allows therapists to select appropriate technology for use with their patient populations, it also allows them to account for technology limitations when assessing treatment efficacy. An overview of the proposed user-centered design cycle is given. Comparisons of two optical see-through head-worn displays provide an example of benchmarking techniques. Benchmarks were obtained using a novel vision test capable of measuring a user's stereoacuity while wearing different types of head-worn displays. Results from healthy participants who performed both virtual and real-world versions of the stereoacuity test are discussed with respect to virtual rehabilitation design. The user-centered design cycle argues for benchmarking to precede virtual environment construction, especially for therapeutic applications. Results from real-world testing illustrate the general limitations in stereoacuity attained when viewing content using a head-worn display. Further, the stereoacuity vision benchmark test highlights differences in user performance when utilizing a similar style of head-worn display. These results support the need for including benchmarks as a means of better understanding user outcomes, especially for patient

  16. Science gateways for distributed computing infrastructures development framework and exploitation by scientific user communities

    CERN Document Server

    Kacsuk, Péter

    2014-01-01

    The book describes the science gateway building technology developed in the SCI-BUS European project and its adoption and customization method, by which user communities, such as biologists, chemists, and astrophysicists, can build customized, domain-specific science gateways. Many aspects of the core technology are explained in detail, including its workflow capability, job submission mechanism to various grids and clouds, and its data transfer mechanisms among several distributed infrastructures. The book will be useful for scientific researchers and IT professionals engaged in the develop

  17. Individually adapted imagery improves brain-computer interface performance in end-users with disability.

    Science.gov (United States)

    Scherer, Reinhold; Faller, Josef; Friedrich, Elisabeth V C; Opisso, Eloy; Costa, Ursula; Kübler, Andrea; Müller-Putz, Gernot R

    2015-01-01

    Brain-computer interfaces (BCIs) translate oscillatory electroencephalogram (EEG) patterns into action. Different mental activities modulate spontaneous EEG rhythms in various ways. Non-stationarity and inherent variability of EEG signals, however, make reliable recognition of modulated EEG patterns challenging. Able-bodied individuals who use a BCI for the first time achieve - on average - binary classification performance of about 75%. Performance in users with central nervous system (CNS) tissue damage is typically lower. User training generally enhances reliability of EEG pattern generation and thus also robustness of pattern recognition. In this study, we investigated the impact of mental tasks on binary classification performance in BCI users with central nervous system (CNS) tissue damage such as persons with stroke or spinal cord injury (SCI). Motor imagery (MI), that is the kinesthetic imagination of movement (e.g. squeezing a rubber ball with the right hand), is the "gold standard" and mainly used to modulate EEG patterns. Based on our recent results in able-bodied users, we hypothesized that pair-wise combination of "brain-teaser" (e.g. mental subtraction and mental word association) and "dynamic imagery" (e.g. hand and feet MI) tasks significantly increases classification performance of induced EEG patterns in the selected end-user group. Within-day (How stable is the classification within a day?) and between-day (How well does a model trained on day one perform on unseen data of day two?) analysis of variability of mental task pair classification in nine individuals confirmed the hypothesis. We found that the use of the classical MI task pair hand vs. feet leads to significantly lower classification accuracy - on average up to 15% less - in most users with stroke or SCI. User-specific selection of task pairs was again essential to enhance performance. 
We expect that the gained evidence will significantly contribute to making imagery-based BCI technology
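The within-day vs. between-day comparison can be sketched on synthetic data (the features, drift offset, and nearest-class-mean classifier below are illustrative; real BCIs typically use methods such as CSP with LDA): a classifier trained on day-1 band-power features loses accuracy on day-2 features whose baseline has drifted.

```python
import numpy as np

rng = np.random.default_rng(7)

def make_day(offset, n=100):
    """Two-class synthetic band-power features; `offset` models session drift."""
    a = rng.normal([1.0, 0.0], 0.5, size=(n, 2)) + offset  # mental task A
    b = rng.normal([0.0, 1.0], 0.5, size=(n, 2)) + offset  # mental task B
    return np.vstack([a, b]), np.array([0] * n + [1] * n)

X_train, y_train = make_day(offset=np.zeros(2))         # day 1, training session
X_same, y_same = make_day(offset=np.zeros(2))           # day 1, test session
X_next, y_next = make_day(offset=np.array([0.8, 0.0]))  # day 2, drifted baseline

# Nearest-class-mean classifier fitted on day-1 data only.
means = np.array([X_train[y_train == c].mean(axis=0) for c in (0, 1)])

def accuracy(X, y):
    pred = np.argmin(((X[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
    return (pred == y).mean()

acc_within = accuracy(X_same, y_same)
acc_between = accuracy(X_next, y_next)
print(f"within-day accuracy:  {acc_within:.2f}")
print(f"between-day accuracy: {acc_between:.2f}")
```

The between-day score drops because the fixed class means no longer match the shifted feature distribution, the same non-stationarity problem the abstract describes.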

  18. Individually adapted imagery improves brain-computer interface performance in end-users with disability.

    Directory of Open Access Journals (Sweden)

    Reinhold Scherer

    Full Text Available Brain-computer interfaces (BCIs) translate oscillatory electroencephalogram (EEG) patterns into action. Different mental activities modulate spontaneous EEG rhythms in various ways. Non-stationarity and inherent variability of EEG signals, however, make reliable recognition of modulated EEG patterns challenging. Able-bodied individuals who use a BCI for the first time achieve - on average - binary classification performance of about 75%. Performance in users with central nervous system (CNS) tissue damage is typically lower. User training generally enhances reliability of EEG pattern generation and thus also robustness of pattern recognition. In this study, we investigated the impact of mental tasks on binary classification performance in BCI users with central nervous system (CNS) tissue damage such as persons with stroke or spinal cord injury (SCI). Motor imagery (MI), that is the kinesthetic imagination of movement (e.g. squeezing a rubber ball with the right hand), is the "gold standard" and mainly used to modulate EEG patterns. Based on our recent results in able-bodied users, we hypothesized that pair-wise combination of "brain-teaser" (e.g. mental subtraction and mental word association) and "dynamic imagery" (e.g. hand and feet MI) tasks significantly increases classification performance of induced EEG patterns in the selected end-user group. Within-day (How stable is the classification within a day?) and between-day (How well does a model trained on day one perform on unseen data of day two?) analysis of variability of mental task pair classification in nine individuals confirmed the hypothesis. We found that the use of the classical MI task pair hand vs. feet leads to significantly lower classification accuracy - on average up to 15% less - in most users with stroke or SCI. User-specific selection of task pairs was again essential to enhance performance. We expect that the gained evidence will significantly contribute to making imagery-based BCI

  19. Conference Report: 18th Conference on Computer-Assisted Qualitative Data Analysis (CAQD) 2016: MAXQDA User Conference

    OpenAIRE

    Galan-Diaz, Carlos

    2017-01-01

    During the first week of March 2016, 120 researchers from 12 different countries, including Syria, Japan, the USA and Turkey, met in Berlin (Germany) to learn more about their computer-assisted qualitative data analysis skills. The 18th Conference on Computer-Assisted Qualitative Data Analysis (CAQD) offered several workshops, a research methods poster session, and the opportunity to share and discuss best practice between attendees, trainers and speakers (informally and through the user forum).

  20. Sensor Alerting Capability

    Science.gov (United States)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to the users through standardized services, language and data model is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows it is becoming difficult for data providers, planners and managers to ensure reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which is used to collect and monitor sensor data. The alerts can be configured at the service level and at the sensor data level. For example it can alert for irregular data delivery events or a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability decision makers can monitor their assets and data streams, correct failures or be alerted about a coming phenomena.
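    The service-level and data-level checks described above can be sketched as a small monitor. All names and thresholds below are illustrative assumptions; the actual IAI capability is built on OGC Sensor Observation Services, not on this toy loop.

```python
# Hypothetical sketch of the alerting logic described above: scan a
# stream of (timestamp, value) observations and raise alerts for
# irregular data delivery (service level) and for a sensor statistic
# crossing a preset threshold (data level).
def check_observations(observations, value_threshold, max_gap_seconds):
    alerts = []
    last_ts = None
    for ts, value in observations:
        # Service-level alert: data arriving later than expected
        if last_ts is not None and ts - last_ts > max_gap_seconds:
            alerts.append(("irregular_delivery", ts))
        # Data-level alert: observed value crosses the preset threshold
        if value > value_threshold:
            alerts.append(("threshold_crossed", ts))
        last_ts = ts
    return alerts

obs = [(0, 1.2), (60, 1.3), (300, 4.8)]  # 240 s gap, then a spike
print(check_observations(obs, value_threshold=4.0, max_gap_seconds=120))
# -> [('irregular_delivery', 300), ('threshold_crossed', 300)]
```

    In a real deployment the alert tuples would be fanned out through the delivery mechanisms the abstract mentions (email, RSS, etc.).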

  1. MERBoard User's Guide

    Science.gov (United States)

    Trimble, Jay; Shab, Ted; Vera, Alonso; Gaswiller, Rich; Clancy, Daniel (Technical Monitor)

    2002-01-01

    An important goal of MERBoard is to allow users to quickly and easily share information. The front-end interface is physically a large plasma computer display with a touch screen, allowing multiple people to interact shoulder-to-shoulder or in a small meeting area. The software system allows people to interactively create digital whiteboards, browse the web, give presentations and connect to personal computers (for example, to run applications not on the MERBoard computer itself). There are four major integrated applications: a browser; a remote connection to another computer (VNC); a digital whiteboard; and a digital space (MERSpace), which is a digital repository for each individual user.

  2. Computer Agent's Role in Modeling an Online Math Help User

    Directory of Open Access Journals (Sweden)

    Dragana Martinovic

    2007-06-01

    Full Text Available This paper investigates the prospects of deploying an open learner model on mathematics online help sites. It proposes enhancing regular human-to-human interaction with the involvement of a computer agent suitable for tracking users, checking their input, and making useful suggestions. Such a design would provide the most support for the interlocutors while keeping the nature of the existing environment intact. Special consideration is given to peer-to-peer and expert-to-student mathematics online help that is free of charge and asynchronous. Examples from other collaborative, Web-based environments are also discussed. Suggestions for improving the existing architectures are given, based on the results of a number of studies of online learning systems.

  3. NetMOD Version 2.0 User's Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic, and infrasonic networks. Specifically, NetMOD simulates the detection capabilities of monitoring networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probability of detection can be computed given a detection threshold. This manual describes how to configure and operate NetMOD to perform detection simulations. In addition, NetMOD is distributed with simulation datasets for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) International Monitoring System (IMS) seismic, hydroacoustic, and infrasonic networks for the purpose of demonstrating NetMOD's capabilities and providing user training. The tutorial sections of this manual use this dataset when describing how to perform the steps involved in running a simulation. ACKNOWLEDGEMENTS: We would like to thank the reviewers of this document for their contributions.
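    The final step of the simulation chain described above, turning a station's SNR into a probability of detection given a threshold, can be illustrated with a minimal sketch. The Gaussian model of log10(SNR) and the sigma value are assumptions for illustration, not NetMOD's actual geophysical models.

```python
import math

# Illustrative detection-probability model: assume the observed
# log10(SNR) is Gaussian about its predicted value, so the probability
# of exceeding the detection threshold is a normal CDF.
def detection_probability(log10_snr, log10_threshold, sigma=0.3):
    z = (log10_snr - log10_threshold) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A station whose predicted SNR sits exactly at the threshold detects
# half the time; well above the threshold, detection is near-certain.
print(round(detection_probability(1.0, 1.0), 2))  # 0.5
```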

  4. Xyce Parallel Electronic Simulator : users' guide, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Hoekstra, Robert John; Waters, Lon J.; Rankin, Eric Lamont; Fixel, Deborah A.; Russo, Thomas V.; Keiter, Eric Richard; Hutchinson, Scott Alan; Pawlowski, Roger Patrick; Wix, Steven D.

    2004-06-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator capable of simulating electrical circuits at a variety of abstraction levels. Primarily, Xyce has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; (2) improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques; (3) device models which are specifically tailored to meet Sandia's needs, including many radiation-aware devices; (4) a client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI); and (5) object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message-passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory, and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. One feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce

  5. An airborne dispersion/dose assessment computer program. Phase 1

    International Nuclear Information System (INIS)

    Scott, C.K.; Kennedy, E.R.; Hughs, R.

    1991-05-01

    The Atomic Energy Control Board (AECB) staff need an airborne dispersion/dose assessment computer program for a microcomputer. The program must be capable of analyzing the dispersion of both radioactive and non-radioactive materials. A further requirement is that it be implemented on the AECB complex of microcomputers and that it have an advanced graphical user interface. A survey of computer programs was conducted to determine which, if any, could meet the AECB's requirements in whole or in part. Ten programs were selected for detailed review, including programs for nuclear and non-radiological emergencies. None of the available programs for radiation dose assessment meets all the requirements, for reasons of user interaction, method of source term estimation, or site specificity. It is concluded that the best option for meeting the AECB requirements is to adopt the CAMEO program (specifically the ALOHA portion), which has a superior graphical user interface, and add the necessary models for radiation dose assessment.

  6. COYOTE: a finite element computer program for nonlinear heat conduction problems

    International Nuclear Information System (INIS)

    Gartling, D.K.

    1978-06-01

    COYOTE is a finite element computer program designed for the solution of two-dimensional, nonlinear heat conduction problems. The theoretical and mathematical basis used to develop the code is described. Program capabilities and complete user instructions are presented. Several example problems are described in detail to demonstrate the use of the program

  7. Development and evaluation of a head-controlled human-computer interface with mouse-like functions for physically disabled users

    Directory of Open Access Journals (Sweden)

    César Augusto Martins Pereira

    2009-01-01

    Full Text Available OBJECTIVES: The objectives of this study were to develop a pointing device controlled by head movement that had the same functions as a conventional mouse and to evaluate the performance of the proposed device when operated by quadriplegic users. METHODS: Ten individuals with cervical spinal cord injury participated in functional evaluations of the developed pointing device. The device consisted of a video camera, computer software, and a target attached to the front part of a cap, which was placed on the user's head. The software captured images of the target coming from the video camera and processed them with the aim of determining the displacement from the center of the target and correlating this with the movement of the computer cursor. Evaluation of the interaction between each user and the proposed device was carried out using 24 multidirectional tests with two degrees of difficulty. RESULTS: According to the parameters of mean throughput and movement time, no statistically significant differences were observed between the repetitions of the tests for either of the studied levels of difficulty. CONCLUSIONS: The developed pointing device adequately emulates the movement functions of the computer cursor. It is easy to use and can be learned quickly when operated by quadriplegic individuals.
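    The core of such a device, mapping the target's displacement from the image center to cursor motion, can be sketched as follows. The gain and dead-zone values are illustrative assumptions, not the parameters of the published device.

```python
# Hedged sketch of a head-controlled pointer: the software locates the
# cap-mounted target in each camera frame and converts its displacement
# from the image center into a cursor movement.
def cursor_delta(target_x, target_y, center_x, center_y,
                 gain=2.0, dead_zone=3):
    dx = target_x - center_x
    dy = target_y - center_y
    # Ignore small head tremor inside the dead zone
    if abs(dx) <= dead_zone:
        dx = 0
    if abs(dy) <= dead_zone:
        dy = 0
    # Gain scales head displacement (pixels) to cursor motion
    return gain * dx, gain * dy

print(cursor_delta(330, 238, 320, 240))  # (20.0, 0.0): dy is inside the dead zone
```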

  8. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  10. A Survey on Mobile Edge Networks: Convergence of Computing, Caching and Communications

    OpenAIRE

    Wang, Shuo; Zhang, Xing; Zhang, Yan; Wang, Lin; Yang, Juwo; Wang, Wenbo

    2017-01-01

    As the explosive growth of smart devices and the advent of many new applications, traffic volume has been growing exponentially. The traditional centralized network architecture cannot accommodate such user demands due to heavy burden on the backhaul links and long latency. Therefore, new architectures which bring network functions and contents to the network edge are proposed, i.e., mobile edge computing and caching. Mobile edge networks provide cloud computing and caching capabilities at th...

  11. Generic communications index: User's manual

    International Nuclear Information System (INIS)

    Dean, R.S.; Steinbrecher, D.H.; Hennick, A.

    1987-12-01

    This report is a manual providing the information required to use a special computer program developed by the NRC for indexing generic communications. The program is written in a user-friendly, menu-driven form using the dBASE III programming language. It facilitates use of the required dBASE III search and sort capabilities to access records in a database called the Generic Communications Index. This index is made up of one record each for all bulletins, circulars, and information notices, including revisions and supplements, from 1971, when such documentation started, through 1986 (or to the latest update). The program is designed for use by anyone modestly acquainted with the general use of IBM-compatible personal computers. The manual contains both a brief overview and a detailed description of the program, as well as detailed instructions for getting started using the program on a personal computer with either a two-floppy-disk or a hard-disk system. Included at the end are a brief description of how to handle problems which might occur, and notes on the makeup of the program and database files to help in adding records of communications for future years.

  12. User's manual for seismic analysis code 'SONATINA-2V'

    International Nuclear Information System (INIS)

    Hanawa, Satoshi; Iyoku, Tatsuo

    2001-08-01

    The seismic analysis code SONATINA-2V has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks, and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins, which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing core vibration behavior under simultaneous excitation in both the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the input data to SONATINA-2V, and a post-processor for data processing and for producing graphics from the analytical results. Although the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that computer system was retired as computing technology advanced. Therefore, the analysis code was improved to operate on JAERI's UNIX-based SR8000 computer system. The user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors, is given in the present report. (author)

  13. Wireless-Uplinks-Based Energy-Efficient Scheduling in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xing Liu

    2015-01-01

    Full Text Available Mobile cloud computing (MCC) combines cloud computing and the mobile internet to improve the computational capabilities of resource-constrained mobile devices (MDs). In MCC, mobile users can not only improve the computational capability of MDs but also reduce energy consumption by offloading mobile applications to the cloud. However, MCC faces the problem of energy efficiency because of time-varying channels when the offloading is being executed. In this paper, we address the issue of energy-efficient scheduling for the wireless uplink in MCC. By introducing Lyapunov optimization, we first propose a scheduling algorithm that can dynamically choose a channel to transmit data based on queue backlog and channel statistics. Then, we show that the proposed scheduling algorithm can make a tradeoff between queue backlog and energy consumption in a channel-aware MCC system. Simulation results show that the proposed scheduling algorithm can reduce the time-averaged energy consumption for offloading compared to the existing algorithm.
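    The per-slot decision behind such Lyapunov-based scheduling can be sketched as a drift-plus-penalty tradeoff: each slot, the scheduler weighs queue backlog against transmission energy. The channel model and the parameter V below are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal drift-plus-penalty sketch: pick the channel maximizing
# (backlog * rate - V * energy); a larger backlog favors throughput,
# a larger V favors energy saving.
def choose_channel(queue_backlog, channels, V):
    """channels: list of (rate_bits, energy_joules) for this slot."""
    best, best_score = None, 0.0
    for idx, (rate, energy) in enumerate(channels):
        score = queue_backlog * rate - V * energy
        if score > best_score:
            best, best_score = idx, score
    return best  # None means: stay idle this slot

# Deep queue: transmit on the better-scoring channel; empty queue: idle.
print(choose_channel(50, [(2.0, 1.0), (1.0, 0.2)], V=10.0))  # 0
print(choose_channel(0, [(2.0, 1.0), (1.0, 0.2)], V=10.0))   # None
```

    Raising V shifts the tradeoff toward energy saving at the cost of a longer queue, which is the backlog/energy tradeoff the abstract describes.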

  14. A distributed, graphical user interface based, computer control system for atomic physics experiments.

    Science.gov (United States)

    Keshet, Aviv; Ketterle, Wolfgang

    2013-01-01

    Atomic physics experiments often require a complex sequence of precisely timed computer controlled events. This paper describes a distributed graphical user interface-based control system designed with such experiments in mind, which makes use of off-the-shelf output hardware from National Instruments. The software makes use of a client-server separation between a user interface for sequence design and a set of output hardware servers. Output hardware servers are designed to use standard National Instruments output cards, but the client-server nature should allow this to be extended to other output hardware. Output sequences running on multiple servers and output cards can be synchronized using a shared clock. By using a field programmable gate array-generated variable frequency clock, redundant buffers can be dramatically shortened, and a time resolution of 100 ns achieved over effectively arbitrary sequence lengths.

  16. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    Science.gov (United States)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. This code combines the ease of use of Python with the speed of parallel shared libraries written in C. We include the capability to compute auto- and cross-correlation statistics, and allow the user to calculate both the three-dimensional and angular correlation functions. Additionally, the code automatically divides the user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to other clustering codes, and accuracy consistent with known analytic results.
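    The pair counting at the heart of any two-point correlation code can be sketched in a few lines. This toy version uses a brute-force O(N^2) loop and the natural estimator DD/RR - 1, whereas the code described above relies on parallel C kernels and more careful estimators; it is illustrative only.

```python
import itertools, math, random

# Count pairs of points per separation bin (brute force).
def pair_counts(points, bins):
    counts = [0] * (len(bins) - 1)
    for (x1, y1), (x2, y2) in itertools.combinations(points, 2):
        r = math.hypot(x2 - x1, y2 - y1)
        for i in range(len(bins) - 1):
            if bins[i] <= r < bins[i + 1]:
                counts[i] += 1
                break
    return counts

random.seed(1)
data = [(random.random(), random.random()) for _ in range(200)]
rand = [(random.random(), random.random()) for _ in range(200)]
bins = [0.0, 0.1, 0.2, 0.4]

# Natural estimator: data-data pairs over random-random pairs, minus 1.
dd, rr = pair_counts(data, bins), pair_counts(rand, bins)
xi = [d / r - 1 if r else float("nan") for d, r in zip(dd, rr)]
print([round(x, 2) for x in xi])  # ~0 for uncorrelated uniform points
```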

  17. Error correcting code with chip kill capability and power saving enhancement

    Energy Technology Data Exchange (ETDEWEB)

    Gara, Alan G [Mount Kisco, NY; Chen, Dong [Croton On Husdon, NY; Coteus, Paul W [Yorktown Heights, NY; Flynn, William T [Rochester, MN; Marcella, James A [Rochester, MN; Takken, Todd [Brewster, NY; Trager, Barry M [Yorktown Heights, NY; Winograd, Shmuel [Scarsdale, NY

    2011-08-30

    A method and system are disclosed for detecting memory chip failure in a computer memory system. The method comprises the steps of accessing user data from a set of user data chips, and testing the user data for errors using data from a set of system data chips. This testing is done by generating a sequence of check symbols from the user data, grouping the user data into a sequence of data symbols, and computing a specified sequence of syndromes. If all the syndromes are zero, the user data has no errors. If one of the syndromes is non-zero, then a set of discriminator expressions are computed, and used to determine whether a single or double symbol error has occurred. In the preferred embodiment, less than two full system data chips are used for testing and correcting the user data.
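    The syndrome idea can be illustrated with a miniature analogue that uses arithmetic modulo a prime instead of the Galois-field symbol codes of the actual scheme. With two check symbols, a single corrupted data symbol produces nonzero syndromes that reveal both its position and its magnitude; all-zero syndromes mean the data is error-free.

```python
# Toy syndrome code over the prime field mod 257 (illustrative only).
P = 257  # prime large enough to hold byte-valued symbols

def make_checks(data):
    # Two check symbols stored alongside the data
    c0 = sum(data) % P
    c1 = sum(i * d for i, d in enumerate(data)) % P
    return c0, c1

def check_and_correct(data, c0, c1):
    s0 = (sum(data) - c0) % P                        # error magnitude
    s1 = (sum(i * d for i, d in enumerate(data)) - c1) % P
    if s0 == 0 and s1 == 0:
        return data, None                            # syndromes zero: no error
    pos = (s1 * pow(s0, P - 2, P)) % P               # s1/s0 mod P locates it
    fixed = list(data)
    fixed[pos] = (fixed[pos] - s0) % P
    return fixed, pos

data = [10, 20, 30, 40]
c0, c1 = make_checks(data)
corrupted = [10, 20, 99, 40]                         # single-symbol error
print(check_and_correct(corrupted, c0, c1))  # ([10, 20, 30, 40], 2)
```

    The real scheme adds discriminator expressions to distinguish single from double symbol errors and to detect a whole failed chip; this sketch handles only the single-error case.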

  18. Semiconductor research capabilities at the Lawrence Berkeley Laboratory

    International Nuclear Information System (INIS)

    1987-02-01

    This document discusses semiconductor research capabilities (advanced materials, processing, packaging) and national user facilities (electron microscopy, heavy-ion accelerators, advanced light source).

  19. Storing files in a parallel computing system based on user-specified parser function

    Science.gov (United States)

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
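    A hypothetical sketch of this flow, with all names illustrative: the storage layer obtains a parser callable from the application, keeps only the files that satisfy the parser's semantic requirements, and records extracted metadata for later search.

```python
import os, tempfile

# Illustrative parser-driven storage: files for which the parser
# returns None are skipped; the rest are written out, and the parser's
# extracted metadata is kept in a searchable index.
def store_files(files, parser, storage_dir):
    os.makedirs(storage_dir, exist_ok=True)
    index = {}
    for name, payload in files.items():
        result = parser(name, payload)
        if result is None:
            continue                      # fails the semantic requirement
        with open(os.path.join(storage_dir, name), "w") as f:
            f.write(payload)
        index[name] = result              # extracted metadata
    return index

# Example parser: keep only checkpoint files, record their step number
def checkpoint_parser(name, payload):
    if not name.startswith("ckpt_"):
        return None
    return {"step": int(name.split("_")[1])}

files = {"ckpt_100": "state...", "scratch.tmp": "junk"}
print(store_files(files, checkpoint_parser, tempfile.mkdtemp()))
# -> {'ckpt_100': {'step': 100}}
```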

  20. End User Development and Infrastructuring - Sustaining Organizational Innovation Capabilities

    DEFF Research Database (Denmark)

    Dittrich, Yvonne; Bolmsten, Johan; Eriksson, Jeanette

    2017-01-01

    of an IT infrastructure based on flexible technologies. The chapter further discusses how such practices are supported by (participatory) organizational IT management structures and processes. Finally, it discusses how EUD in this way contributes to the innovation capability of the organization. The conclusion points...

  1. Designing a Mobile Game for Home Computer Users to Protect Against Phishing Attacks

    OpenAIRE

    Arachchilage, Nalin Asanka Gamagedara; Cole, Melissa

    2016-01-01

    This research aims to design an educational mobile game that teaches home computer users to protect themselves against phishing attacks. Phishing is an online identity theft which aims to steal sensitive information such as usernames, passwords and online banking details from victims. To prevent this, phishing education needs to be considered. Mobile games could help embed learning in a natural environment. The paper introduces a mobile game design based on a story which is simplifying and exaggerating real ...

  2. Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions

    OpenAIRE

    Hardaker, Claire

    2010-01-01

    Whilst computer-mediated communication (CMC) can benefit users by providing quick and easy communication between those separated by time and space, it can also provide varying degrees of anonymity that may encourage a sense of impunity and freedom from being held accountable for inappropriate online behaviour. As such, CMC is a fertile ground for studying impoliteness, whether it occurs in response to perceived threat (flaming), or as an end in its own right (trolling). Currently, first an...

  3. Graphical User Interface Development for Representing Air Flow Patterns

    Science.gov (United States)

    Chaudhary, Nilika

    2004-01-01

    In the Turbine Branch, scientists carry out experimental and computational work to advance the efficiency and diminish the noise production of jet engine turbines. One way to do this is by decreasing the heat that the turbine blades receive. Most of the experimental work is carried out by taking a single turbine blade and analyzing the air flow patterns around it, because these data indicate the sections of the turbine blade that are getting too hot. Since the cost of doing turbine blade air flow experiments is very high, researchers try to do computational work that fits the experimental data. The goal of computational fluid dynamics is for scientists to find a numerical way to predict the complex flow patterns around different turbine blades without physically having to perform tests or costly experiments. When visualizing flow patterns, scientists need a way to represent the flow conditions around a turbine blade. A researcher will assign specific zones that surround the turbine blade. In a two-dimensional view, the zones are usually quadrilaterals. The next step is to assign boundary conditions, which define how the flow enters or exits one side of a zone. What researchers need is a way of setting up computational zones and grids, visualizing flow patterns, and storing all the flow conditions in a file on the computer for future computation. Such a program is necessary because the only method for creating flow pattern graphs is by hand, which is tedious and time-consuming. By using a computer program to create the zones and grids, the graph would be faster to make and easier to edit. Basically, the user would run a program that is an editable graph. The user could click and drag with the mouse to form various zones and grids, then edit the locations of these grids, add flow and boundary conditions, and finally save the graph for future use and analysis. My goal this summer is to create a graphical user interface (GUI) that incorporates all of these elements. I am writing the program in

  4. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    Science.gov (United States)

    Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos

    2015-01-01

    Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches as cognitive models in order to provide users with experiences fitting to their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing ICT products accessibility for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied at cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces' design supported by increased tasks' complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data to be used for more reliable interfaces' evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282

  5. Percept User Manual.

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, Brian [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kennon, Stephen Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    This document is the main user guide for the Sierra/Percept capabilities including the mesh_adapt and mesh_transfer tools. Basic capabilities for uniform mesh refinement (UMR) and mesh transfers are discussed. Examples are used to provide illustration. Future versions of this manual will include more advanced features such as geometry and mesh smoothing. Additionally, all the options for the mesh_adapt code will be described in detail. Capabilities for local adaptivity in the context of offline adaptivity will also be included.

  6. User interface for a partially incompatible system software environment with non-ADP users

    Energy Technology Data Exchange (ETDEWEB)

    Loffman, R.S.

    1987-08-01

    Good user interfaces to computer systems and software applications are the result of combining an analysis of user needs with knowledge of interface-design principles and techniques. This thesis reports on an interface for an environment (a) whose users are not computer science or data processing professionals, and (b) which is bound to predetermined software and hardware. An interface was designed that combined these constraints with user-interface design principles. Current literature was surveyed to establish a baseline of knowledge about user-interface design. Many techniques can be used to implement a user interface, but all should share the same basic goal: to assist the user in performing a task. This can be accomplished by providing the user with consistent, well-structured interfaces that are also flexible enough to adapt to differences among users. The interface produced used menu-selection and command-language techniques to make two different operating-system environments appear similar. Additional features helped address the needs of different users. The original goal was also to make the transition between the two systems transparent; this was not fully accomplished due to software and hardware limitations.

  7. Visual illusions on the Internet: 15 years of change in technology and user behavior.

    Science.gov (United States)

    Bach, Michael

    2014-01-01

    Looking back over 15 years of demonstrating visual phenomena and optical illusions on the Internet, I discuss two major topics. The first concerns the methodology used to present interactive visual experiments on the web, with respect to (a) wide accessibility, i.e., independence of browser and platform; (b) a capable and elegant graphical user interface; (c) sufficient computational power; (d) ease of development; and, finally, (e) future-proofing in an ever-changing online environment. The second major topic addresses some aspects of user behaviour, mainly temporal patterns (e.g., changes over weeks, years, and the long term), which reveal that there are more visitors during office hours.

  8. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoy widespread popularity, a common criticism is their general lack of analytical functionality. This concern, however, is rapidly being addressed; standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes--though not centralized into a single implementation or software package. The innovation is mostly originating from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated through joining virtual globe geometry definitions--like KML--to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser eruption spatial decision support system. Virtual globe applications will continue to evolve, adding further analytical capabilities, richer temporal data handling, and support for scales from the nano to the intergalactic. This progression opens education and research avenues in all scientific disciplines.
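    The database-to-KML join described in this record can be sketched as follows; the row layout (name, longitude, latitude) and the flow-mapping example are illustrative assumptions, not the record's actual schema:

```python
# Sketch: turn relational rows (e.g. sampled flow points in a karst
# watershed) into KML Placemarks for display in a virtual globe.
# The (name, lon, lat) row layout is a hypothetical example.
from xml.sax.saxutils import escape

def rows_to_kml(rows):
    """Build a minimal KML document from (name, lon, lat) tuples."""
    placemarks = []
    for name, lon, lat in rows:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(name)}</name>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>"
    )

doc = rows_to_kml([("Spring A", -86.1, 37.18), ("Sink B", -86.13, 37.2)])
print(doc)
```

    In practice such a script would be re-run (or served dynamically) so that the virtual globe refreshes the layer as the underlying database changes.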

  9. The Atmospheric Release Advisory Capability Site Workstation System

    International Nuclear Information System (INIS)

    Foster, K.T.; Sumikawa, D.A.; Foster, C.S.; Baskett, R.L.

    1993-01-01

    The Atmospheric Release Advisory Capability (ARAC) is a centralized emergency response service that assesses the consequences that may result from an atmospheric release of toxic material. ARAC was developed by the Lawrence Livermore National Laboratory (LLNL) for the Departments of Energy (DOE) and Defense (DOD) and responds principally to radiological accidents. ARAC provides radiological health and safety guidance to decision makers in the form of computer-generated estimates of the effects of an actual, or potential, release of radioactive material into the atmosphere. Upon receipt of the release scenario, the ARAC assessment staff extracts meteorological, topographic, and geographic data from resident world-wide databases for use in complex, three-dimensional transport and diffusion models. These dispersion models generate air concentration (or dose) and ground deposition contour plots showing estimates of the contamination patterns produced as the toxic material is carried by the prevailing winds. To facilitate the ARAC response to a release from specific DOE and DOD sites and to provide these sites with a local emergency response tool, a remote Site Workstation System (SWS) is being placed at various ARAC-supported facilities across the country. This SWS replaces the existing antiquated ARAC Site System now installed at many of these sites. The new system gives users access to complex atmospheric dispersion models that may be run either by the ARAC staff at LLNL, or (in a later phase of the system) by site personnel using the computational resources of the SWS. Supporting this primary function are a variety of SWS-resident supplemental capabilities that include meteorological data acquisition, manipulation of release-specific databases, computer-based communications, and the use of a simpler Gaussian trajectory puff model based on the Environmental Protection Agency's INPUFF code.
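    The simpler Gaussian trajectory puff model mentioned above rests on the standard single-puff concentration equation, sketched below; the dispersion spreads and source mass are illustrative values, not ARAC or INPUFF parameters:

```python
import math

def puff_concentration(q, dx, dy, dz, sx, sy, sz):
    """Concentration at offset (dx, dy, dz) from the centre of a single
    Gaussian puff of mass q with spreads sx, sy, sz (illustrative values):
        C = q / ((2*pi)**1.5 * sx*sy*sz)
            * exp(-dx**2/(2*sx**2) - dy**2/(2*sy**2) - dz**2/(2*sz**2))
    """
    norm = q / ((2 * math.pi) ** 1.5 * sx * sy * sz)
    expo = -(dx**2 / (2 * sx**2) + dy**2 / (2 * sy**2) + dz**2 / (2 * sz**2))
    return norm * math.exp(expo)

# Concentration peaks at the puff centre and decays with distance:
c0 = puff_concentration(1.0, 0, 0, 0, 50.0, 50.0, 20.0)
c1 = puff_concentration(1.0, 100, 0, 0, 50.0, 50.0, 20.0)
```

    A trajectory puff model such as INPUFF releases a sequence of such puffs, advects each with the local wind, grows the spreads over travel time, and sums the contributions at each receptor.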

  10. The FOSS GIS Workbench on the GFZ Load Sharing Facility compute cluster

    Science.gov (United States)

    Löwe, P.; Klump, J.; Thaler, J.

    2012-04-01

    Compute clusters can be used as GIS workbenches; their wealth of resources allows us to take on geocomputation tasks which exceed the limitations of smaller systems. Harnessing these capabilities requires a Geographic Information System (GIS) able to utilize the available cluster configuration/architecture, and a sufficient degree of user-friendliness to allow for wide application. In this paper we report on the first successful porting of GRASS GIS, the oldest and largest Free and Open Source Software (FOSS) GIS project, onto a compute cluster using Platform Computing's Load Sharing Facility (LSF). In 2008, GRASS 6.3 was installed on the GFZ compute cluster, which at that time comprised 32 nodes. Interaction with the GIS was limited to the command line interface, which required further development to encapsulate the GRASS GIS business layer and facilitate its use by users not familiar with GRASS GIS. During the summer of 2011, multiple versions of GRASS GIS (v 6.4, 6.5 and 7.0) were installed on the upgraded GFZ compute cluster, now consisting of 234 nodes with 480 CPUs providing 3084 cores. The GFZ compute cluster currently offers 19 different processing queues with varying hardware capabilities and priorities, allowing for fine-grained scheduling and load balancing. After successful testing of core GIS functionalities, including the graphical user interface, mechanisms were developed to deploy scripted geocomputation tasks onto dedicated processing queues. The mechanisms are based on earlier work by NETELER et al. (2008). A first application of the new GIS functionality was the generation of maps of simulated tsunamis in the Mediterranean Sea for the Tsunami Atlas of the FP-7 TRIDEC Project (www.tridec-online.eu). For this, up to 500 processing nodes were used in parallel. Further trials included the processing of geometrically complex problems, requiring significant amounts of processing time. The GIS cluster successfully completed all these tasks, with processing times
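    Deploying a scripted geocomputation task onto a dedicated LSF processing queue, as described above, might be wrapped along these lines; the queue name, paths, and script are hypothetical, while `bsub` and the `GRASS_BATCH_JOB` variable are the standard LSF and GRASS 6 mechanisms:

```python
# Sketch: build an LSF submission command for a scripted GRASS GIS task.
# Queue name and paths are hypothetical examples; GRASS_BATCH_JOB is the
# standard GRASS 6 way to run a script non-interactively.
import shlex

def lsf_grass_command(queue, script, location, mapset, slots=1):
    """Return a bsub command line that runs a GRASS batch script."""
    env = f"GRASS_BATCH_JOB={shlex.quote(script)}"
    grass = f"grass64 -text /data/grassdata/{location}/{mapset}"
    return f'bsub -q {queue} -n {slots} "{env} {grass}"'

cmd = lsf_grass_command("gis_long", "/home/user/tsunami_map.sh",
                        "mediterranean", "run01", slots=4)
print(cmd)
```

    A driver script could loop over scenarios, emitting one such submission per scenario to fan work out across the queue's nodes.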

  11. Xyce Parallel Electronic Simulator - User's Guide, Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    HUTCHINSON, SCOTT A; KEITER, ERIC R.; HOEKSTRA, ROBERT J.; WATERS, LON J.; RUSSO, THOMAS V.; RANKIN, ERIC LAMONT; WIX, STEVEN D.

    2002-11-01

    This manual describes the use of the Xyce Parallel Electronic Simulator code for simulating electrical circuits at a variety of abstraction levels. The Xyce Parallel Electronic Simulator has been written to support, in a rigorous manner, the simulation needs of the Sandia National Laboratories electrical designers. As such, the development has focused on improving the capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers. (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) A client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI). (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. The code is a parallel code in the most general sense of the phrase--a message passing parallel implementation--which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Furthermore, careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved even as the number of processors grows. Another feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs. These input formats include standard analytical models, behavioral models

  12. Integrated Fuel-Coolant Interaction (IFCI 7.0) Code User's Manual

    International Nuclear Information System (INIS)

    Young, Michael F.

    1999-01-01

    The integrated fuel-coolant interaction (IFCI) computer code is being developed at Sandia National Laboratories to investigate the fuel-coolant interaction (FCI) problem at large scale using a two-dimensional, three-field hydrodynamic framework and physically based models. IFCI will be capable of treating all major FCI processes in an integrated manner. This document is a description of IFCI 7.0. The user's manual describes the hydrodynamic method and physical models used in IFCI 7.0. Appendix A is an input manual provided for the creation of working decks

  13. Development of a user-friendly, low-cost home energy monitoring and recording system

    International Nuclear Information System (INIS)

    Fletcher, James; Malalasekera, Weeratunge

    2016-01-01

    This paper reports research undertaken to develop a user-friendly home energy monitoring system which is capable of collecting, processing and displaying detailed usage data. The system allows users to monitor power usage and switch their electronic appliances remotely, using any web enabled device, including computers, phones and tablets. The system aims to raise awareness of consumer energy use by gathering data about usage habits, and displaying this information to support consumers when selecting energy tariffs or new appliances. To achieve these aims, bespoke electrical hardware, or ‘nodes’, have been designed and built to monitor power usage, switch devices on and off, and communicate via a Wi-Fi connection with bespoke software, the ‘server’. The server hosts a webpage which allows users to see a real-time overview of how power is being used in the home, as well as allowing scheduled tasks and triggered tasks (which respond to events) to be programmed. The system takes advantage of well standardised networking specifications, such as Wi-Fi and TCP, allowing access from within the home, or remotely through the internet. The server runs under Debian Linux on a Raspberry Pi computer and is written in Python, HTML and JavaScript. The server includes advanced functionality, such as device recognition, which allows users to individually monitor several devices that share a single node. The openPicus Flyport is used to provide Wi-Fi connectivity and programmable logic control to nodes. The Flyport firmware is compiled from C. - Highlights: • The system is capable of collecting, processing and displaying detailed usage data. • The system is built using commonly available components and software. • Nodes in this system can communicate via a Wi-Fi connection with a server. • The data saved in the server can be used in smart grid applications.
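    One step of the processing described above, turning a node's power samples into displayed energy usage, can be sketched with trapezoidal integration; the one-second sampling interval is an assumption, not a documented property of the system:

```python
def energy_kwh(samples, interval_s):
    """Integrate power samples (watts, one reading every interval_s
    seconds) into energy in kWh using the trapezoidal rule."""
    if len(samples) < 2:
        return 0.0
    joules = sum((a + b) / 2 * interval_s
                 for a, b in zip(samples, samples[1:]))
    return joules / 3.6e6  # 1 kWh = 3.6 MJ

# A constant 1000 W load sampled every second for one hour is 1 kWh:
readings = [1000.0] * 3601
print(energy_kwh(readings, 1.0))
```

    On the real system the server would accumulate such totals per node (and per recognised device) to drive the tariff and appliance comparisons the paper describes.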

  14. National Synchrotron Light Source user's manual: Guide to the VUV and x-ray beamlines

    International Nuclear Information System (INIS)

    Gmuer, N.F.

    1993-04-01

    The success of the National Synchrotron Light Source is based, in large part, on the size of the user community and the diversity of the scientific and technical disciplines represented by these users. As evidence of this success, the VUV Ring has just celebrated its 10th anniversary and the X-ray Ring will do the same in 1995. In order to enhance this success, the NSLS User's Manual: Guide to the VUV and X-Ray Beamlines - Fifth Edition, is being published. This Manual presents to the scientific community-at-large the current and projected architecture, capabilities and research programs of the various VUV and X-ray beamlines. Also detailed is the research and computer equipment a General User can expect to find and use at each beamline when working at the NSLS. The Manual is updated periodically in order to keep pace with the constant changes on these beamlines

  15. Advanced Simulation Capability for Environmental Management (ASCEM) Phase II Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hubbard, S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flach, G. [Savannah River National Lab. (SRNL), Aiken, SC (United States); Freedman, V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Agarwal, D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Andre, B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bott, Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chen, X. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Davis, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Faybishenko, B. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gorton, I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Murray, C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moulton, D. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Meyer, J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rockhold, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Shoshani, A. [LBNL; Steefel, C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wainwright, H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Waichler, S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2012-09-28

    In 2009, the National Academies of Science (NAS) reviewed and validated the U.S. Department of Energy Office of Environmental Management (EM) Technology Program in its publication, Advice on the Department of Energy’s Cleanup Technology Roadmap: Gaps and Bridges. The NAS report outlined prioritization needs for the Groundwater and Soil Remediation Roadmap, concluded that contaminant behavior in the subsurface is poorly understood, and recommended further research in this area as a high priority. To address this NAS concern, the EM Office of Site Restoration began supporting the development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific approach that uses an integration of toolsets for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM modeling toolset is modular and open source. It is divided into three thrust areas: Multi-Process High Performance Computing (HPC), Platform and Integrated Toolsets, and Site Applications. The ASCEM toolsets will facilitate integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. During fiscal year 2012, the ASCEM project continued to make significant progress in capabilities development. Capability development occurred in both the Platform and Integrated Toolsets and Multi-Process HPC Simulator areas. The new Platform and Integrated Toolsets capabilities provide the user an interface and the tools necessary for end-to-end model development that includes conceptual model definition, data management for model input, model calibration and uncertainty analysis, and model output processing including visualization. The new HPC Simulator capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and model confidence testing and verification for

  16. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  17. Adding Pluggable and Personalized Natural Control Capabilities to Existing Applications

    Science.gov (United States)

    Lamberti, Fabrizio; Sanna, Andrea; Carlevaris, Gilles; Demartini, Claudio

    2015-01-01

    Advancements in input device and sensor technologies led to the evolution of the traditional human-machine interaction paradigm based on the mouse and keyboard. Touch-, gesture- and voice-based interfaces are integrated today in a variety of applications running on consumer devices (e.g., gaming consoles and smartphones). However, to allow existing applications running on desktop computers to utilize natural interaction, significant re-design and re-coding efforts may be required. In this paper, a framework designed to transparently add multi-modal interaction capabilities to applications to which users are accustomed is presented. Experimental observations confirmed the effectiveness of the proposed framework and led to a classification of those applications that could benefit more from the availability of natural interaction modalities. PMID:25635410

  18. Techniques for animation of CFD results. [computational fluid dynamics

    Science.gov (United States)

    Horowitz, Jay; Hanson, Jeffery C.

    1992-01-01

    Video animation is becoming increasingly vital to the computational fluid dynamics researcher, not just for presentation, but for recording and comparing dynamic visualizations that are beyond the current capabilities of even the most powerful graphic workstation. To meet these needs, Lewis Research Center has recently established a facility to provide users with easy access to advanced video animation capabilities. However, producing animation that is both visually effective and scientifically accurate involves various technological and aesthetic considerations that must be understood both by the researcher and those supporting the visualization process. These considerations include: scan conversion, color conversion, and spatial ambiguities.

  19. A Distributed User Information System

    Science.gov (United States)

    1990-03-01

    A Distributed User Information System. Steven D. Miller, Scott Carson, and Leo Mark, Institute for Advanced Computer Studies and Department of Computer Science, University of Maryland, College Park, MD 20742. Abstract: Current user information database technology ... Cited references include ACM Transactions on Computer Systems, May 1988, and [Sol89] K. Sollins, "A plan for internet directory services," Technical report, DDN Network Information Center.

  20. User perspectives on computer applications

    International Nuclear Information System (INIS)

    Trammell, H.E.

    1979-04-01

    Experiences of a technical group that uses the services of computer centers are recounted. An orientation on the ORNL Engineering Technology Division and its missions is given to provide background on the diversified efforts undertaken by the Division and its opportunities to benefit from computer technology. Specific ways in which computers are used within the Division are described; these include facility control, data acquisition, data analysis, theory applications, code development, information processing, cost control, management of purchase requisitions, maintenance of personnel information, and control of technical publications. Problem areas found to need improvement are the overloading of computers during normal working hours, lack of code transportability, delay in obtaining routine programming, delay in key punching services, bewilderment in the use of large computer centers, complexity of job control language, and uncertain quality of software. 20 figures

  1. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs, in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, or imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely
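    The classification step named in this record (Fisher's linear discriminant analysis on spatially filtered features) reduces, in its simplest one-dimensional, equal-variance form, to a midpoint threshold between class means; the band-power values below are invented toy data, not the study's recordings:

```python
# Minimal one-dimensional Fisher discriminant: separate band-power
# features of two mental-task classes with a threshold. A real BCI
# applies this after common spatial pattern (CSP) filtering, in more
# dimensions; the feature values here are toy numbers.
def fisher_threshold(class_a, class_b):
    """Midpoint threshold between class means: the 1-D special case of
    Fisher's discriminant with equal class variances and priors."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(class_a) + mean(class_b)) / 2

def classify(x, thr):
    """Label 'A' for features below the threshold, else 'B'."""
    return "A" if x < thr else "B"

# Toy log band-power features for two imagery tasks:
task_a = [1.0, 1.2, 0.9, 1.1]   # e.g. left-hand motor imagery
task_b = [2.0, 2.2, 1.9, 2.1]   # e.g. mental subtraction
thr = fisher_threshold(task_a, task_b)
print(thr, classify(1.05, thr), classify(2.05, thr))
```

    The 4-class setting in the study combines several such pairwise or multi-class discriminants over the individually selected tasks and frequency band.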

  2. On the control of brain-computer interfaces by users with cerebral palsy.

    Science.gov (United States)

    Daly, Ian; Billinger, Martin; Laparra-Hernández, José; Aloise, Fabio; García, Mariano Lloria; Faller, Josef; Scherer, Reinhold; Müller-Putz, Gernot

    2013-09-01

    Brain-computer interfaces (BCIs) have been proposed as a potential assistive device for individuals with cerebral palsy (CP) to assist with their communication needs. However, it is unclear how well-suited BCIs are to individuals with CP. Therefore, this study aims to investigate to what extent these users are able to gain control of BCIs. This study is conducted with 14 individuals with CP attempting to control two standard online BCIs (1) based upon sensorimotor rhythm modulations, and (2) based upon steady state visual evoked potentials. Of the 14 users, 8 are able to use one or other of the BCIs, online, with a statistically significant level of accuracy, without prior training. Classification results are driven by neurophysiological activity and not seen to correlate with occurrences of artifacts. However, many of these users' accuracies, while statistically significant, would require either more training or more advanced methods before practical BCI control would be possible. The results indicate that BCIs may be controlled by individuals with CP but that many issues need to be overcome before practical application use may be achieved. This is the first study to assess the ability of a large group of different individuals with CP to gain control of an online BCI system. The results indicate that six users could control a sensorimotor rhythm BCI and three a steady state visual evoked potential BCI at statistically significant levels of accuracy (SMR accuracies, mean ± STD: 0.821 ± 0.116; SSVEP accuracies: 0.422 ± 0.069). Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
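    The "statistically significant level of accuracy" criterion used in such studies is commonly derived from the binomial distribution of chance classification; the sketch below computes a chance-level accuracy threshold (the trial count and alpha are illustrative, not this study's actual values):

```python
import math

def significant_accuracy(n_trials, n_classes=2, alpha=0.05):
    """Smallest accuracy that beats chance at level alpha, assuming
    n_trials independent trials with chance rate 1/n_classes."""
    p = 1.0 / n_classes
    for k in range(n_trials + 1):
        # Upper tail P(X >= k) under the binomial chance distribution
        tail = sum(math.comb(n_trials, j) * p**j * (1 - p)**(n_trials - j)
                   for j in range(k, n_trials + 1))
        if tail <= alpha:
            return k / n_trials
    return 1.0

# With 100 trials of a 2-class BCI, about 59% accuracy is needed:
print(significant_accuracy(100, 2))
```

    Note that for a 2-class SMR BCI the chance rate is 0.5, while for a multi-target SSVEP speller it is lower, which is why an SSVEP accuracy of 0.422 can still be significantly above chance.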

  3. Evolution of the Darlington NGS fuel handling computer systems

    International Nuclear Information System (INIS)

    Leung, V.; Crouse, B.

    1996-01-01

    The ability to improve the capabilities and reliability of digital control systems in nuclear power stations to meet changing plant and personnel requirements is a formidable challenge. Many of these systems have high quality assurance standards that must be met to ensure adequate nuclear safety. Also many of these systems contain obsolete hardware along with software that is not easily transported to newer technology computer equipment. Combining modern technology upgrades into a system of obsolete hardware components is not an easy task. Lastly, as users become more accustomed to using modern technology computer systems in other areas of the station (e.g. information systems), their expectations of the capabilities of the plant systems increase. This paper will present three areas of the Darlington NGS fuel handling computer system that have been or are in the process of being upgraded to current technology components within the framework of an existing fuel handling control system. (author). 3 figs

  4. Evolution of the Darlington NGS fuel handling computer systems

    Energy Technology Data Exchange (ETDEWEB)

    Leung, V; Crouse, B [Ontario Hydro, Bowmanville (Canada). Darlington Nuclear Generating Station

    1997-12-31

    The ability to improve the capabilities and reliability of digital control systems in nuclear power stations to meet changing plant and personnel requirements is a formidable challenge. Many of these systems have high quality assurance standards that must be met to ensure adequate nuclear safety. Also many of these systems contain obsolete hardware along with software that is not easily transported to newer technology computer equipment. Combining modern technology upgrades into a system of obsolete hardware components is not an easy task. Lastly, as users become more accustomed to using modern technology computer systems in other areas of the station (e.g. information systems), their expectations of the capabilities of the plant systems increase. This paper will present three areas of the Darlington NGS fuel handling computer system that have been or are in the process of being upgraded to current technology components within the framework of an existing fuel handling control system. (author). 3 figs.

  5. The Effort to Reduce a Muscle Fatigue Through Gymnastics Relaxation and Ergonomic Approach for Computer Users in Central Building State University of Medan

    Science.gov (United States)

    Gultom, Syamsul; Darma Sitepu, Indra; Hasibuan, Nurman

    2018-03-01

    Fatigue due to long and continuous computer use can lead to decreased performance and reduced work motivation. Specific targets in the first phase of this research have been achieved, namely: (1) identifying fatigue complaints among workers who use computers, using the Bourdon Wiersma test kit; and (2) producing a suitable relaxation and work-posture design as a solution to reduce muscle fatigue in computer-based workers. The study uses a research and development method, which aims to produce new products or refine existing ones. The final product is a prototype back-holder and monitor filter, together with a relaxation exercise routine and a manual explaining how to perform it in front of the computer, to lower fatigue levels for computer users in Unimed's Administration Center. In the first phase, observations and interviews were conducted and the level of fatigue among employees who use computers at Unimed's Administration Center was identified using the Bourdon Wiersma test, with the following results: (1) the average velocity time of respondents in BAUK, BAAK and BAPSI after working, with a speed-interpretation value of 8.4 (WS 13), was in a good enough category; (2) the average accuracy of respondents in BAUK, BAAK and BAPSI after working, with an accuracy-interpretation value of 5.5 (WS 8), was in the doubtful category, showing that computer users at the Unimed Administration Center experienced significant tiredness; (3) the consistency of the fatigue measurements for computer users in the Unimed Administration Center after working, with a consistency-interpretation value of 5.5 (WS 8), was also in the doubtful category, meaning that these computer users suffered extreme fatigue. In phase II, based on the results of the first phase of this research, the researcher offers

  6. Understanding users

    DEFF Research Database (Denmark)

    Johannsen, Carl Gustav Viggo

    2014-01-01

    Segmentation of users can help libraries in the process of understanding user similarities and differences. Segmentation can also form the basis for selecting segments of target users and for developing tailored services for specific target segments. Several approaches and techniques have been tested in library contexts, and the aim of this article is to identify the main approaches and to discuss their perspectives, including their strengths and weaknesses in, especially, public library contexts. The purpose is also to present and discuss the results of a recent (2014) Danish library user segmentation project using computer-generated clusters. Compared to traditional marketing texts, this article also tries to identify user segments, images, or metaphors developed by the library profession itself.

  7. Wind-US Users Guide Version 3.0

    Science.gov (United States)

    Yoder, Dennis A.

    2016-01-01

    Wind-US is a computational platform which may be used to numerically solve various sets of equations governing physical phenomena. Currently, the code supports the solution of the Euler and Navier-Stokes equations of fluid mechanics, along with supporting equation sets governing turbulent and chemically reacting flows. Wind-US is a product of the NPARC Alliance, a partnership between the NASA Glenn Research Center (GRC) and the Arnold Engineering Development Complex (AEDC) dedicated to the establishment of a national, applications-oriented flow simulation capability. The Boeing Company has also been closely associated with the Alliance since its inception, and represents the interests of the NPARC User's Association. The "Wind-US User's Guide" describes the operation and use of Wind-US, including: a basic tutorial; the physical and numerical models that are used; the boundary conditions; monitoring convergence; the files that are read and/or written; parallel execution; and a complete list of input keywords and test options. For current information about Wind-US and the NPARC Alliance, please see the Wind-US home page at http://www.grc.nasa.gov/WWW/winddocs/ and the NPARC Alliance home page at http://www.grc.nasa.gov/WWW/wind/. Wind-US represents a merger of the capabilities of four CFD codes: NASTD (a structured grid flow solver developed at McDonnell Douglas, now part of Boeing), NPARC (the original NPARC Alliance structured grid flow solver), NXAIR (an AEDC structured grid code used primarily for store separation analysis), and ICAT (an unstructured grid flow solver developed at the Rockwell Science Center and Boeing).

  8. User Frustrations as Opportunities

    Directory of Open Access Journals (Sweden)

    Michael Weiss

    2012-04-01

    Full Text Available User frustrations are an excellent source of new product ideas. Starting with this observation, this article describes an approach that entrepreneurs can use to discover business opportunities. Opportunity discovery starts with a problem that the user has, but may not be able to articulate. User-centered design techniques can help elicit those latent needs. The entrepreneur should then try to understand how users are solving their problem today, before proposing a solution that draws on the unique skills and technical capabilities available to the entrepreneur. Finally, an in-depth understanding of the user allows the entrepreneur to hone in on the points of difference and resonance that are the foundation of a strong customer value proposition.

  9. Brain Computer Interface on Track to Home

    OpenAIRE

    Miralles, Felip; Vargiu, Eloisa; Dauwalder, Stefan; Solà, Marc; Müller-Putz, Gernot; Wriessnegger, Selina C.; Pinegger, Andreas; Kübler, Andrea; Halder, Sebastian; Käthner, Ivo; Martin, Suzanne; Daly, Jean; Armstrong, Elaine; Guger, Christoph; Hintermüller, Christoph

    2015-01-01

    The novel BackHome system offers individuals with disabilities a range of useful services available via brain-computer interfaces (BCIs), to help restore their independence. Such technology is now ready to be deployed in the real world, that is, at the target end users' home. This has been achieved by the development of practical electrodes, easy-to-use software, and telemonitoring and home-support capabilities which have been conceived, implemented, and tested within ...

  10. IntelliTable: Inclusively-Designed Furniture with Robotic Capabilities.

    Science.gov (United States)

    Prescott, Tony J; Conran, Sebastian; Mitchinson, Ben; Cudd, Peter

    2017-01-01

    IntelliTable is a new proof-of-principle assistive technology system with robotic capabilities in the form of an elegant universal cantilever table able to move around by itself, or under user control. We describe the design and current capabilities of the table and the human-centered design methodology used in its development and initial evaluation. The IntelliTable study has delivered a robotic platform, programmed via a smartphone, that can navigate around a typical home or care environment, avoiding obstacles, and positioning itself at the user's command. It can also be configured to navigate itself to pre-ordained positions within an environment using ceiling tracking, responsive optical guidance, and object-based sonar navigation.

  11. Computable general equilibrium model fiscal year 2014 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Laboratory; Boero, Riccardo [Los Alamos National Laboratory

    2016-05-11

    This report provides an overview of the development of the NISAC CGE economic modeling capability since 2012. This capability enhances NISAC's economic modeling and analysis capabilities to answer a broader set of questions than was possible with the previous economic analysis capability. In particular, CGE modeling captures how the different sectors of the economy (for example, households, businesses, and government) interact to allocate resources, and this approach retains these interactions when it is used to estimate the economic impacts of the kinds of events NISAC often analyzes.

  12. SAFSIM: A computer program for engineering simulations of space reactor system performance

    International Nuclear Information System (INIS)

    Dobranich, D.

    1992-01-01

    SAFSIM (System Analysis Flow SIMulator) is a FORTRAN computer program that provides engineering simulations of user-specified flow networks at the system level. It includes fluid mechanics, heat transfer, and reactor dynamics capabilities. SAFSIM provides sufficient versatility to allow the simulation of almost any flow system, from a backyard sprinkler system to a clustered nuclear reactor propulsion system. In addition to versatility, speed and robustness are primary goals of SAFSIM. The current capabilities of SAFSIM are summarized, and some illustrative example results are presented

  13. Evaluating user experience with respect to user expectations in brain-computer interface games

    NARCIS (Netherlands)

    Gürkök, Hayrettin; Hakvoort, G.; Poel, Mannes; Müller-Putz, G.R.; Scherer, R.; Billinger, M.; Kreilinger, A.; Kaiser, V.; Neuper, C.

    Evaluating user experience (UX) with respect to previous experiences can provide insight into whether a product can positively affect a user's opinion about a technology. If it can, then we can say that the product provides a positive UX. In this paper we propose a method to assess the UX in BCI

  14. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is that of Information and Communication Technology (ICT), and much research is ongoing in Cloud Computing and Mobile Cloud Computing on topics such as security issues, data management, load balancing, and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  15. RADTRAN 4: User guide

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Kanipe, F.L.

    1992-01-01

    RADTRAN 4 is used to evaluate radiological consequences of incident-free transportation, as well as the radiological risks from vehicular accidents occurring during transportation. This User Guide is Volume 3 in a series of four volumes documenting the RADTRAN 4 computer code for transportation risk analysis. The other three volumes are Volume 1, the Executive Summary; Volume 2, the Technical Manual; and Volume 4, the Programmer's Manual. The theoretical and calculational basis for the operations performed by RADTRAN 4 is discussed in Volume 2. Throughout this User Guide the reader will be referred to Volume 2 for detailed discussions of certain RADTRAN features. This User Guide supersedes the document ''RADTRAN III'' by Madsen et al. (1983). This RADTRAN 4 User Guide specifies and describes the required data, control inputs, input sequences, user options, program limitations, and other activities necessary for execution of the RADTRAN 4 computer code

  16. Transferring brain-computer interfaces beyond the laboratory: successful application control for motor-disabled users.

    Science.gov (United States)

    Leeb, Robert; Perdikis, Serafeim; Tonin, Luca; Biasiucci, Andrea; Tavella, Michele; Creatura, Marco; Molina, Alberto; Al-Khodairy, Abdul; Carlson, Tom; Millán, José D R

    2013-10-01

    Brain-computer interfaces (BCIs) are no longer only used by healthy participants under controlled conditions in laboratory environments, but also by patients and end-users, controlling applications in their homes or clinics, without the BCI experts around. But are the technology and the field mature enough for this? Especially the successful operation of applications - like text entry systems or assistive mobility devices such as tele-presence robots - requires a good level of BCI control. How much training is needed to achieve such a level? Is it possible to train naïve end-users in 10 days to successfully control such applications? In this work, we report our experiences of training 24 motor-disabled participants at rehabilitation clinics or at the end-users' homes, without BCI experts present. We also share the lessons that we have learned through transferring BCI technologies from the lab to the user's home or clinics. The most important outcome is that 50% of the participants achieved good BCI performance and could successfully control the applications (tele-presence robot and text-entry system). In the case of the tele-presence robot the participants achieved an average performance ratio of 0.87 (max. 0.97) and for the text entry application a mean of 0.93 (max. 1.0). The lessons learned and the gathered user feedback range from pure BCI problems (technical and handling), to common communication issues among the different people involved, and issues encountered while controlling the applications. The points raised in this paper are very widely applicable and we anticipate that they might be faced similarly by other groups, if they move on to bringing the BCI technology to the end-user, to home environments and towards application prototype control. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Industrial application of a graphics computer-based training system

    International Nuclear Information System (INIS)

    Klemm, R.W.

    1985-01-01

    Graphics Computer Based Training (GCBT) roles include drilling, tutoring, simulation, and problem solving. Of these, Commonwealth Edison uses mainly tutoring, simulation, and problem solving. These roles are not separate in any particular program; they are integrated to provide tutoring and part-task simulation, part-task simulation and problem solving, or problem-solving tutoring. Commonwealth's Graphics Computer Based Training program was the result of over a year's worth of research and planning. The keys to the program are its flexibility and control. Flexibility is maintained through stand-alone units capable of program authoring and modification for plant/site-specific users. Yet the system has the capability to support up to 31 terminals with a 40 MB hard disk drive. Control of the GCBT program is accomplished through establishment of development priorities and a central development facility (Commonwealth Edison's Production Training Center)

  18. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist.

    Directory of Open Access Journals (Sweden)

    Brian Drawert

    2016-12-01

    Full Text Available We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy-to-use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.
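    Discrete stochastic engines of the kind StochSS provides are typically built on Gillespie's stochastic simulation algorithm (SSA). The following is a minimal sketch of the SSA direct method for a hypothetical birth-death process; the model and rate constants are illustrative, not taken from StochSS:

```python
import math
import random

def gillespie_birth_death(k_birth, k_death, x0, t_end, seed=1):
    """Gillespie direct method for the birth-death system
    0 -> X (rate k_birth), X -> 0 (rate k_death * x)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a1 = k_birth                   # birth propensity
        a2 = k_death * x               # death propensity
        a0 = a1 + a2
        if a0 == 0.0:
            break                      # no reaction can fire
        # exponential waiting time to the next reaction
        t += -math.log(1.0 - rng.random()) / a0
        # choose which reaction fires, proportional to its propensity
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

traj = gillespie_birth_death(k_birth=10.0, k_death=1.0, x0=0, t_end=50.0)
```

    With these rates the population fluctuates around k_birth / k_death = 10 once the initial transient has passed; production engines add spatial coupling, tau-leaping, and other accelerations on top of this core loop.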

  19. OVIS 3.2 user's guide.

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Jackson R.; Gentile, Ann C.; Brandt, James M.; Houf, Catherine A.; Thompson, David C.; Roe, Diana C.; Wong, Matthew H.; Pebay, Philippe Pierre

    2010-10-01

    This document describes how to obtain, install, use, and enjoy a better life with OVIS version 3.2. The OVIS project targets scalable, real-time analysis of very large data sets. We characterize the behaviors of elements and aggregations of elements (e.g., across space and time) in data sets in order to detect meaningful conditions and anomalous behaviors. We are particularly interested in determining anomalous behaviors that can be used as advance indicators of significant events of which notification can be made or upon which action can be taken or invoked. The OVIS open source tool (BSD license) is available for download at ovis.ca.sandia.gov. While we intend for it to support a variety of application domains, the OVIS tool was initially developed for, and continues to be primarily tuned for, the investigation of High Performance Compute (HPC) cluster system health. In this application it is intended to be both a system administrator tool for monitoring and a system engineer tool for exploring the system state in depth. OVIS 3.2 provides a variety of statistical tools for examining the behavior of elements in a cluster (e.g., nodes, racks) and associated resources (e.g., storage appliances and network switches). It provides an interactive 3-D physical view in which the cluster elements can be colored by raw or derived element values (e.g., temperatures, memory errors). The visual display allows the user to easily determine abnormal or outlier behaviors. Additionally, it provides search capabilities for certain scheduler logs. The OVIS capabilities were designed to be highly interactive - for example, the job search may drive an analysis which in turn may drive the user generation of a derived value which would then be examined on the physical display. The OVIS project envisions the capabilities of its tools applied to compute cluster monitoring. 
In the future, integration with the scheduler or resource manager will be included in a release to enable intelligent

  20. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    Science.gov (United States)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
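    The hot/cold storage tiering described above can be sketched in a few lines: rank data tiles by access count and place the most active fraction on fast storage. The tile names, counts, and the fixed 20% hot fraction below are hypothetical:

```python
def tier_tiles(access_counts, hot_fraction=0.2):
    """Assign dataset tiles to an 'ssd' (hot) or 'disk' (cold) tier by
    ranking them on access count; the top hot_fraction go to SSD."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    n_hot = max(1, int(len(ranked) * hot_fraction))
    return {tile: ('ssd' if i < n_hot else 'disk')
            for i, tile in enumerate(ranked)}

tiers = tier_tiles({'t1': 900, 't2': 5, 't3': 120, 't4': 2, 't5': 430})
```

    A real facility would recompute the ranking periodically from access logs and migrate tiles whose tier assignment changed.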

  1. User interface issues in supporting human-computer integrated scheduling

    Science.gov (United States)

    Cooper, Lynne P.; Biefeld, Eric W.

    1991-01-01

    The topics are presented in view graph form and include the following: characteristics of Operations Mission Planner (OMP) schedule domain; OMP architecture; definition of a schedule; user interface dimensions; functional distribution; types of users; interpreting user interaction; dynamic overlays; reactive scheduling; and transitioning the interface.

  2. MACSIS User's Manual and Code Description

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Woan; Lee, Byoung Oon; Kim, Kyung Gun; Kim, Young Jin [Korea Atomic Energy Research Institute, Taejeon (Korea); Lee, Dong Uk [Hanyang Univ., Seoul (Korea)

    2000-03-01

    MACSIS is a computer program for simulating the behavior of metal fuel elements under normal operating conditions of a Liquid Metal Cooled Reactor. It computes the one-dimensional temperature distribution and the thermo-mechanical characteristics of fuel rod under the steady state operation condition, including the swelling and rod deformation. The amount of fission gas released during the irradiation of the fuel is also computed. The thermal expansion and the gas pressure inside the fuel element are then used to compute the stresses and strains in the cladding. This document is mainly intended as a user's manual for the MACSIS code. A short description of the capabilities of the code and detailed input instructions are supplied for this purpose. MACSIS is constructed of a series of modules with a single set of dimensional units used throughout to provide flexibility in model usage and ease of upgrading as models developed from future tests are finalized. Radial steady state heat transfer can be computed for 21 axial segments. The code computes all major quantities which affect in-reactor performances of fuel rod, such as, fission gas generation and retention, fission gas release, swelling, and deformation, etc. 37 refs., 24 figs., 3 tabs. (Author)

  3. User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework

    Science.gov (United States)

    Markstrom, Steven L.; Koczot, Kathryn M.

    2008-01-01

    The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.

  4. Computer utility for interactive instrument control

    International Nuclear Information System (INIS)

    Day, P.

    1975-08-01

    A careful study of the ANL laboratory automation needs in 1967 led to the conclusion that a central computer could support all of the real-time needs of a diverse collection of research instruments. A suitable hardware configuration would require an operating system to provide effective protection, fast real-time response and efficient data transfer. An SDS Sigma 5 satisfied all hardware criteria, however it was necessary to write an original operating system; services include program generation, experiment control real-time analysis, interactive graphics and final analysis. The system is providing real-time support for 21 concurrently running experiments, including an automated neutron diffractometer, a pulsed NMR spectrometer and multi-particle detection systems. It guarantees the protection of each user's interests and dynamically assigns core memory, disk space and 9-track magnetic tape usage. Multiplexor hardware capability allows the transfer of data between a user's device and assigned core area at rates of 100,000 bytes/sec. Real-time histogram generation for a user can proceed at rates of 50,000 points/sec. The facility has been self-running (no computer operator) for five years with a mean time between failures of 10 days and an uptime of 157 hours/week. (auth)

  5. Assistive technology use is associated with reduced capability poverty: a cross-sectional study in Bangladesh.

    Science.gov (United States)

    Borg, Johan; Ostergren, Per-Olof; Larsson, Stig; Rahman, Asm Atiqur; Bari, Nazmul; Khan, Ahm Noman

    2012-03-01

    About half of all people with disabilities in developing countries live in extreme poverty. Focusing on the ends rather than the economic means of human development, the capability approach offers an alternative view of poverty. The purpose of this study was to explore the relation between assistive technology use and capability poverty in a low-income country. Self-reported data on food intake, health care, education, politics, self-determination, self-respect, family relationships and friendships were collected in Bangladesh through interviews of people with hearing impairments using and not using hearing aids, and people with ambulatory impairments using and not using manual wheelchairs (N = 583). Differences in outcomes between users and non-users of assistive technology were analyzed using logistic regression. Assistive technology users were more likely than non-users to report enhanced capabilities, hearing aid users to a larger extent than wheelchair users. Synergistic effects between assistive technology use and education were found. The use of assistive technology is predictive of reduced capability poverty in Bangladesh. Lack of wheelchair accessibility and the nature of the selected outcomes may explain the limited association in the ambulatory group. Enhancing the effects of the other, there is support for providing education in combination with hearing aids. [Box: see text].
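    The user/non-user comparisons reported above are commonly summarized as odds ratios. A minimal sketch of that computation with a Woolf confidence interval, using entirely hypothetical counts rather than the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio with a Woolf 95% confidence interval for a 2x2 table:
    a/b = outcome present/absent among users, c/d among non-users."""
    est = (a * d) / (b * c)
    se_log = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(est) - 1.96 * se_log)
    hi = math.exp(math.log(est) + 1.96 * se_log)
    return est, (lo, hi)

# Hypothetical counts: 120 of 200 users vs 70 of 180 non-users report a capability.
est, ci = odds_ratio(120, 80, 70, 110)
```

    Logistic regression, as used in the study, extends this idea by adjusting the odds ratio for covariates such as education.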

  6. Task 7: ADPAC User's Manual

    Science.gov (United States)

    Hall, E. J.; Topp, D. A.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields. The current version of the computer code resulting from this study is referred to as ADPAC (Advanced Ducted Propfan Analysis Codes-Version 7). This report is intended to serve as a computer program user's manual for the ADPAC code developed under Tasks 6 and 7 of the NASA Contract. The ADPAC program is based on a flexible multiple-block grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. An iterative implicit algorithm is available for rapid time-dependent flow calculations, and an advanced two-equation turbulence model is incorporated to predict complex turbulent flows. The consolidated code generated during this study is capable of executing in either a serial or parallel computing mode from a single source code. Numerous examples are given in the form of test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations.
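    The four-stage Runge-Kutta time marching mentioned above can be illustrated on a scalar stand-in for the semi-discrete residual. The low-storage form and the Jameson-style stage coefficients below are a common CFD choice and an assumption here, not necessarily ADPAC's exact scheme:

```python
import math

def rk4_stage_march(residual, q0, dt, n_steps):
    """Four-stage Runge-Kutta time marching of dq/dt = residual(q) in the
    low-storage form q_k = q_n + alpha_k * dt * R(q_{k-1})."""
    alphas = (0.25, 1.0 / 3.0, 0.5, 1.0)   # Jameson-style stage coefficients
    q = q0
    for _ in range(n_steps):
        q_n = q                            # state at the start of the step
        for a in alphas:
            q = q_n + a * dt * residual(q)
    return q

# Scalar stand-in for the flux residual: dq/dt = -q, exact solution exp(-t).
q = rk4_stage_march(lambda q: -q, q0=1.0, dt=0.01, n_steps=100)
```

    For a linear residual this scheme reproduces the fourth-order Taylor expansion of the exact solution; in a flow solver the scalar q becomes the vector of conserved variables per cell and R the discrete flux balance.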

  7. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  8. PREDICTING ATTENUATION OF VIRUSES DURING PERCOLATION IN SOILS: 2. USER'S GUIDE TO THE VIRULO 1.0 COMPUTER MODEL

    Science.gov (United States)

    In the EPA document Predicting Attenuation of Viruses During Percolation in Soils 1. Probabilistic Model the conceptual, theoretical, and mathematical foundations for a predictive screening model were presented. In this current volume we present a User's Guide for the computer mo...

  9. Architecture and User-Context Models of CoCare: A Context-Aware Mobile Recommender System for Health Promotion.

    Science.gov (United States)

    Cerón-Rios, Gineth; López, Diego M; Blobel, Bernd

    2017-01-01

    Recommender systems (RS) are useful tools for filtering and sorting items and information for users. There is a wide diversity of approaches that help creating personalized recommendations. Context-aware recommender systems (CARS) are a kind of RS which provide adaptation capabilities to the user's environment, e.g., by sensing data through wearable devices or other biomedical sensors. In healthcare and wellbeing, CARS can support health promotion and health education, considering that each individual requires tailored intervention programs. Our research aims at proposing a context-aware mobile recommender system for the promotion of healthy habits. The system is adapted to the user's needs, his/her health information, interests, time, location and lifestyles. In this paper, the CARS computational architecture and the user and context models of health promotion are presented, which were used to implement and test a prototype recommender system.

  10. ABrox-A user-friendly Python module for approximate Bayesian computation with a focus on model comparison.

    Science.gov (United States)

    Mertens, Ulf Kai; Voss, Andreas; Radev, Stefan

    2018-01-01

    We give an overview of the basic principles of approximate Bayesian computation (ABC), a class of stochastic methods that enable flexible and likelihood-free model comparison and parameter estimation. Our new open-source software called ABrox is used to illustrate ABC for model comparison on two prominent statistical tests, the two-sample t-test and the Levene test. We further highlight the flexibility of ABC compared to classical Bayesian hypothesis testing by computing an approximate Bayes factor for two multinomial processing tree models. Last but not least, throughout the paper, we introduce ABrox using the accompanying graphical user interface.
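    The core of ABC rejection sampling, which tools like ABrox generalize, fits in a few lines: draw parameters from the prior, simulate data, and keep draws whose simulated summary lies close to the observed one. The toy model below (inferring a normal mean from a sample mean) is illustrative and not taken from the paper:

```python
import random
import statistics

def abc_rejection(observed, prior_sample, simulate, distance, eps, n_draws, seed=0):
    """ABC rejection sampling: accept a prior draw theta whenever the
    distance between its simulated summary and the observed one is <= eps."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) <= eps:
            accepted.append(theta)
    return accepted

# Toy model: infer the mean of a normal with known sd = 1 from a sample mean.
n_obs = 50
prior_sample = lambda rng: rng.uniform(-5.0, 5.0)
simulate = lambda theta, rng: statistics.fmean(rng.gauss(theta, 1.0)
                                               for _ in range(n_obs))
distance = lambda sim, obs: abs(sim - obs)

posterior = abc_rejection(observed=1.3, prior_sample=prior_sample,
                          simulate=simulate, distance=distance,
                          eps=0.2, n_draws=5000)
```

    The accepted draws approximate the posterior; shrinking eps sharpens the approximation at the cost of more rejections, which is why practical ABC software adds summary-statistic selection and smarter samplers.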

  11. A Summary of Actinide Enrichment Technologies and Capability Gaps

    Energy Technology Data Exchange (ETDEWEB)

    Patton, Bradley D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Robinson, Sharon M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    The evaluation performed in this study indicates that a new program is needed to efficiently provide a national actinide radioisotope enrichment capability to produce milligram-to-gram quantities of unique materials for user communities. This program should leverage past actinide enrichment experience, the recent advances in stable isotope enrichment, and assessments of future requirements to cost-effectively develop this capability while establishing an experience base for a new generation of researchers in this vital area. Preliminary evaluations indicate that an electromagnetic isotope separation (EMIS) device would have the capability to meet the future needs of the user community for enriched actinides. The EMIS technology could potentially be coupled with other enrichment technologies, such as irradiation, as pre-enrichment and/or post-enrichment systems to increase the throughput, reduce losses of material, and/or reduce operational costs of the base EMIS system. Past actinide enrichment experience and advances in the EMIS technology applied in stable isotope separations should be leveraged with this new evaluation information to assist in the establishment of a domestic actinide radioisotope enrichment capability.

  12. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    Science.gov (United States)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  13. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)
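Fission-product inventory codes of this kind integrate coupled radioactive-decay equations; for a simple parent-daughter chain the Bateman solution can be written in closed form. The sketch below is a minimal Python illustration of that calculation — the nuclide values are hypothetical, and this is not RIBD-II's algorithm or data library.

```python
import math

def bateman_two_member(n1_0, lam1, lam2, t):
    """Analytic Bateman solution for a two-member decay chain
    parent -> daughter -> (stable), starting from n1_0 parent atoms
    and no daughter atoms. lam1, lam2 are decay constants in 1/s."""
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical nuclide pair, one hour after shutdown.
parent, daughter = bateman_two_member(n1_0=1e20, lam1=1e-5, lam2=1e-6, t=3600.0)
activity_parent = 1e-5 * parent   # activity in decays/s is lambda * N
```

A production inventory code chains this kind of solution (or a matrix-exponential equivalent) over hundreds of coupled isotopes with yield and cross-section data.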

  14. Single user systems

    International Nuclear Information System (INIS)

    Willers, I.

    1985-01-01

    The first part of the talk is devoted to establishing concepts and trends in interactive computing technology and is intended to provide a framework. I then discuss personal computing architectures of today and planned architectures of the 1990s. I present current personal computer environments for the programmer and for the user. Scenarios for future computing environments are developed. Finally, I open a discussion on the social implications of personal computers. (orig./HSI)

  15. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00291854; The ATLAS collaboration; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production throughout LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computin...

  16. Designing for User Engagement: Aesthetic and Attractive User Interfaces

    CERN Document Server

    Sutcliffe, Alistair

    2009-01-01

    This book explores the design process for user experience and engagement, which expands the traditional concept of usability and utility in design to include aesthetics, fun and excitement. User experience has evolved as a new area of Human Computer Interaction research, motivated by non-work oriented applications such as games, education and emerging interactive Web 2.0. The chapter starts by examining the phenomena of user engagement and experience and setting them in the perspective of cognitive psychology, in particular motivation, emotion and mood. The perspective of aesthetics is expande

  17. Building a computer-aided design capability using a standard time share operating system

    Science.gov (United States)

    Sobieszczanski, J.

    1975-01-01

    The paper describes how an integrated system of engineering computer programs can be built using a standard commercially available operating system. The discussion opens with an outline of the auxiliary functions that an operating system can perform for a team of engineers involved in a large and complex task. An example of a specific integrated system is provided to explain how the standard operating system features can be used to organize the programs into a simple and inexpensive but effective system. Applications to an aircraft structural design study are discussed to illustrate the use of an integrated system as a flexible and efficient engineering tool. The discussion concludes with an engineer's assessment of an operating system's capabilities and desirable improvements.

  18. Shell stability analysis in a computer aided engineering (CAE) environment

    Science.gov (United States)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open-ended, hierarchical, interactive computer code the user can successively access programs of increasing complexity from his workstation. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from the workstation with one of the current generation of two-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.
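The starting point that codes like DISDECO refine with stiffening, anisotropy, and imperfection effects is the classical critical buckling stress of an axially compressed thin cylinder, which can be computed directly. A minimal sketch for an isotropic shell with illustrative material values (the knockdown factor shown is purely illustrative, not a design rule):

```python
import math

def classical_axial_buckling_stress(E, t, R, nu=0.3):
    """Classical critical axial buckling stress of a thin isotropic
    cylindrical shell: sigma_cl = E * t / (R * sqrt(3 * (1 - nu**2)))."""
    return E * t / (R * math.sqrt(3.0 * (1.0 - nu ** 2)))

# Illustrative aluminium shell: E = 70 GPa, t = 1 mm, R = 0.5 m.
sigma_cl = classical_axial_buckling_stress(E=70e9, t=1e-3, R=0.5)

# Imperfection-sensitive shells buckle well below sigma_cl, so a
# knockdown factor is applied (0.4 here is illustrative only).
sigma_design = 0.4 * sigma_cl
```

Quantifying how far below `sigma_cl` a real, imperfect shell buckles is exactly the kind of question the code's imperfection-sensitivity modules address.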

  19. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    Science.gov (United States)

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has lowered the morale of many promising undergraduates, postgraduates and researchers who might otherwise aspire to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.
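The keyterm-lookup behaviour such a tool provides can be sketched as a prefix search over a glossary. The terms and definitions below are illustrative examples, not MACBenAbim's actual data:

```python
# Tiny in-memory glossary illustrating the keyterm-lookup pattern.
GLOSSARY = {
    "alignment": "Arranging sequences to identify regions of similarity.",
    "allele": "One of the alternative forms of a gene at a given locus.",
    "annotation": "Attaching biological information to sequence data.",
    "blast": "A heuristic tool for searching sequence databases.",
}

def search_keyterms(query):
    """Return (term, definition) pairs whose term starts with the query."""
    q = query.lower().strip()
    return [(term, defn) for term, defn in sorted(GLOSSARY.items())
            if term.startswith(q)]

hits = search_keyterms("al")   # matches "alignment" and "allele"
```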

  20. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.
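The essence of computational steering — modifying application-defined control parameters while the simulation runs — can be illustrated with a toy relay. This is only a schematic of the pattern, not the RealityGrid API or the relay server described above:

```python
import threading

class SteeringRelay:
    """Toy steering relay: steering clients publish parameter updates,
    and a running simulation polls for them between time steps.
    Schematic only -- not the RealityGrid API."""

    def __init__(self):
        self._lock = threading.Lock()
        self._params = {}

    def steer(self, name, value):
        """Called by a steering client to change a control parameter."""
        with self._lock:
            self._params[name] = value

    def poll(self):
        """Called by the steered application; returns a snapshot."""
        with self._lock:
            return dict(self._params)

relay = SteeringRelay()
relay.steer("timestep", 0.01)   # a client re-directs the run

state = 0.0
for _ in range(3):              # simulation main loop
    dt = relay.poll().get("timestep", 0.1)   # pick up steered value
    state += dt
```

A real relay server adds networking, authentication, and many-to-many routing on top of this publish/poll core.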

  1. Atmospheric release advisory capability

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1981-01-01

    The ARAC system (Atmospheric Release Advisory Capability) is described. The system is a collection of people, computers, computer models, topographic data and meteorological input data that together permit a quasi-predictive calculation of where effluent from an accident will migrate through the atmosphere, where it will be deposited on the ground, and what instantaneous and integrated dose an exposed individual would receive.
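Advisory systems of this kind build on atmospheric dispersion models; the simplest textbook example is the steady-state Gaussian plume. A minimal sketch with illustrative parameter values (not ARAC's actual models, which incorporate topography and real meteorological input):

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration at a receptor y m off
    the plume centerline and z m above ground, for a source of strength
    Q (g/s), wind speed u (m/s), and effective release height H (m).
    sigma_y and sigma_z are dispersion parameters that in practice
    depend on downwind distance and atmospheric stability."""
    lateral = math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
                + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2)))  # ground reflection
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration with illustrative sigma values.
c = gaussian_plume(Q=100.0, u=5.0, y=0.0, z=0.0, H=50.0,
                   sigma_y=80.0, sigma_z=40.0)
```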

  2. Advanced Query and Data Mining Capabilities for MaROS

    Science.gov (United States)

    Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.

    2013-01-01

    The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. The service spans several levels: a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture. This new feature is an advanced querying capability, available through either the Web-based user interface or a back-end REST interface, to access all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment and expand the range of available data. The current REST interface provides specific data that is used by the MaROS Web application to display and visualize the information; however, the returned information from the REST interface has typically been pre-processed to return only a subset of the entire information within the repository, particularly only the information that is of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in a CSV (Comma Separated Values) format for easy exporting to third party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource.
Because MaROS contains the data for a variety of missions from the Mars network, which span both NASA and ESA, the software also establishes an access control list (ACL) on each data record
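The restricted-SQL-to-CSV pattern described above can be sketched with a column whitelist over an in-memory database. The table name, columns, and data here are hypothetical, not the actual MaROS schema or API:

```python
import csv
import io
import sqlite3

# Column whitelist: the "restricted subset of SQL" idea in miniature.
ALLOWED_COLUMNS = {"mission", "pass_id", "data_volume"}

def run_restricted_query(conn, columns, table="relay_passes"):
    """Run a SELECT limited to whitelisted columns and return CSV text."""
    bad = set(columns) - ALLOWED_COLUMNS
    if bad:
        raise ValueError(f"columns not permitted: {sorted(bad)}")
    cur = conn.execute(f"SELECT {', '.join(columns)} FROM {table}")
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(columns)          # header row
    writer.writerows(cur.fetchall())  # data rows
    return out.getvalue()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE relay_passes (mission TEXT, pass_id INTEGER, data_volume REAL)")
conn.execute("INSERT INTO relay_passes VALUES ('MRO', 1, 123.4)")
csv_text = run_restricted_query(conn, ["mission", "data_volume"])
```

Validating every identifier against a whitelist before string-building the query is what makes a "freeform but restricted" SQL surface safe to expose.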

  3. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Full Text Available Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when the task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents’ execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agents’ execution could be explored. The combination of Intelligent Agents and the HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (NET-Computer), executing the tasks. A growing segment of the Internet is E-Commerce for online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of difficulty for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  4. Reliability of a computer and Internet survey (Computer User Profile) used by adults with and without traumatic brain injury (TBI).

    Science.gov (United States)

    Kilov, Andrea M; Togher, Leanne; Power, Emma

    2015-01-01

    To determine the test-retest reliability of the 'Computer User Profile' (CUP) in people with and without TBI. The CUP was administered on two occasions to people with and without TBI. The CUP investigated the nature and frequency of participants' computer and Internet use. Intra-class correlation coefficients and kappa coefficients were calculated to measure the reliability of individual CUP items. Descriptive statistics were used to summarize the content of responses. Sixteen adults with TBI and 40 adults without TBI were included in the study. All participants were reliable in reporting demographic information, frequency of social communication and leisure activities and computer/Internet habits and usage. Adults with TBI were reliable in 77% of their responses to survey items. Adults without TBI were reliable in 88% of their responses to survey items. The CUP was practical and valuable in capturing information about social, leisure, communication and computer/Internet habits of people with and without TBI. Adults without TBI scored more items with satisfactory reliability overall in their surveys. Future studies may include larger samples and could also include an exploration of how people with/without TBI use other digital communication technologies. This may provide further information on determining technology readiness for people with TBI in therapy programmes.
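A kappa coefficient of the kind used in such reliability studies measures chance-corrected agreement between two administrations of a categorical survey item. A minimal sketch of Cohen's kappa with made-up responses (not the study's data):

```python
def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa: chance-corrected agreement between two
    administrations of the same categorical survey item."""
    n = len(ratings1)
    categories = set(ratings1) | set(ratings2)
    p_observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    p_chance = sum((ratings1.count(c) / n) * (ratings2.count(c) / n)
                   for c in categories)
    return (p_observed - p_chance) / (1.0 - p_chance)

# Hypothetical responses to one item from two administrations.
time1 = ["daily", "daily", "weekly", "never", "daily", "weekly"]
time2 = ["daily", "daily", "weekly", "never", "weekly", "weekly"]
kappa = cohens_kappa(time1, time2)
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; intra-class correlation plays the analogous role for continuous items.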

  5. Xyce parallel electronic simulator : users' guide. Version 5.1.

    Energy Technology Data Exchange (ETDEWEB)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.; Santarelli, Keith R.; Fixel, Deborah A.; Coffey, Todd Stirling; Russo, Thomas V.; Schiek, Richard Louis; Keiter, Eric Richard; Pawlowski, Roger Patrick

    2009-11-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers. (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a

  6. Xyce Parallel Electronic Simulator : users' guide, version 4.1.

    Energy Technology Data Exchange (ETDEWEB)

    Mei, Ting; Rankin, Eric Lamont; Thornquist, Heidi K.; Santarelli, Keith R.; Fixel, Deborah A.; Coffey, Todd Stirling; Russo, Thomas V.; Schiek, Richard Louis; Keiter, Eric Richard; Pawlowski, Roger Patrick

    2009-02-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers. (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) Device models which are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. Xyce is a parallel code in the most general sense of the phrase - a message passing parallel implementation - which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The development of Xyce provides a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia has an 'in-house' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods, parallel solver algorithms) research and development can be performed. As a result, Xyce is a

  7. Ensuring US National Aeronautics Test Capabilities

    Science.gov (United States)

    Marshall, Timothy J.

    2010-01-01

    U.S. leadership in aeronautics depends on ready access to technologically advanced, efficient, and affordable aeronautics test capabilities. These systems include major wind tunnels and propulsion test facilities and flight test capabilities. The federal government owns the majority of the major aeronautics test capabilities in the United States, primarily through the National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD). However, changes in the Aerospace landscape, primarily the decrease in demand for testing over the last 20 years required an overarching strategy for management of these national assets. Therefore, NASA established the Aeronautics Test Program (ATP) as a two-pronged strategic initiative to: (1) retain and invest in NASA aeronautics test capabilities considered strategically important to the agency and the nation, and (2) establish a strong, high level partnership with the DoD. Test facility utilization is a critical factor for ATP because it relies on user occupancy fees to recover a substantial part of the operations costs for its facilities. Decreasing utilization is an indicator of excess capacity and in some cases low-risk redundancy (i.e., several facilities with basically the same capability and overall low utilization). However, low utilization does not necessarily translate to lack of strategic importance. Some facilities with relatively low utilization are nonetheless vitally important because of the unique nature of the capability and the foreseeable aeronautics testing needs. Unfortunately, since its inception, the customer base for ATP has continued to shrink. Utilization of ATP wind tunnels has declined by more than 50% from the FY 2006 levels. This significant decrease in customer usage is attributable to several factors, including the overall decline in new programs and projects in the aerospace sector; the impact of computational fluid dynamics (CFD) on the design, development, and research

  8. Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia

    Directory of Open Access Journals (Sweden)

    Sofia Segkouli

    2015-01-01

    Full Text Available Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI) such as usability and utility through a large number of analytic, usability-oriented approaches such as cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules embodied in cognitive architecture that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort for enhancing the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users’ cognition, elaborated their ability to perform multitasks, and monitored the performance of infotainment-related tasks to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interfaces’ design, supported by increased task complexity to capture a more detailed profile of users’ capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users.

  9. Computation system for nuclear reactor core analysis

    International Nuclear Information System (INIS)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.
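The diffusion theory approximation that VENTURE applies in up to three dimensions reduces, in its simplest one-group, fixed-source, one-dimensional form, to a boundary value problem that a short finite-difference sketch can illustrate. This is a textbook simplification with illustrative data, not VENTURE's method:

```python
def solve_slab_diffusion(D, sigma_a, S, width, n=50, sweeps=5000):
    """One-group, fixed-source neutron diffusion in a slab,
    -D * phi'' + sigma_a * phi = S, with zero-flux boundaries,
    discretized by central differences and solved by Gauss-Seidel."""
    h = width / (n + 1)
    phi = [0.0] * (n + 2)                 # boundary nodes stay at zero
    for _ in range(sweeps):
        for i in range(1, n + 1):
            phi[i] = (S * h * h + D * (phi[i - 1] + phi[i + 1])) \
                     / (2.0 * D + sigma_a * h * h)
    return phi

# Illustrative data: 100 cm slab, D = 1 cm, sigma_a = 0.01 /cm, S = 1.
flux = solve_slab_diffusion(D=1.0, sigma_a=0.01, S=1.0, width=100.0)
peak = max(flux)   # near the analytic S/sigma_a * (1 - 1/cosh(width/(2*L)))
```

Production codes replace this scalar sweep with multigroup, multidimensional solvers plus eigenvalue (criticality) iterations, but the discretize-and-iterate structure is the same.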

  10. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  11. An Empirical Study of User Experience on Touch Mice

    Science.gov (United States)

    Chou, Jyh Rong

    2016-01-01

    The touch mouse is a new type of computer mouse that provides users with a new way of touch-based environment to interact with computers. For more than a decade, user experience (UX) has grown into a core concept of human-computer interaction (HCI), describing a user's perceptions and responses that result from the use of a product in a particular…

  12. DIII-D tokamak control and neutral beam computer system upgrades

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B.; Piglowski, D.A.; Pham, D.; Phillips, J.C.

    2004-01-01

    This paper covers recent computer system upgrades made to the DIII-D tokamak control and neutral beam computer systems. The systems responsible for monitoring and controlling the DIII-D tokamak and injecting neutral beam power have recently come online with new computing hardware and software. The new hardware and software have provided a number of significant improvements over the previous Modcomp AEG VME and accessware based systems. These improvements include the incorporation of faster, less expensive, and more readily available computing hardware, which has provided performance increases of up to a factor of 20 over the prior systems. A more modern graphical user interface with advanced plotting capabilities has improved feedback to users on the operating status of the tokamak and neutral beam systems. The elimination of aging and unsupportable hardware and software has increased overall maintainability. The distinguishing characteristics of the new system include: (1) a PC based computer platform running the Red Hat version of the Linux operating system; (2) a custom PCI CAMAC software driver developed by General Atomics for the Kinetic Systems 2115 serial highway card; and (3) a custom developed supervisory control and data acquisition (SCADA) software package based on Kylix, an inexpensive interactive development environment (IDE) tool from Borland Corporation. This paper provides specific details of the upgraded computer systems.

  13. Problems and accommodation strategies reported by computer users with rheumatoid arthritis or fibromyalgia.

    Science.gov (United States)

    Baker, Nancy A; Rubinstein, Elaine N; Rogers, Joan C

    2012-09-01

    Little is known about the problems experienced by and the accommodation strategies used by computer users with rheumatoid arthritis (RA) or fibromyalgia (FM). This study (1) describes specific problems and accommodation strategies used by people with RA and FM during computer use; and (2) examines if there were significant differences in the problems and accommodation strategies between the different equipment items for each diagnosis. Subjects were recruited from the Arthritis Network Disease Registry. Respondents completed a self-report survey, the Computer Problems Survey. Data were analyzed descriptively (percentages; 95% confidence intervals). Differences in the number of problems and accommodation strategies were calculated using nonparametric tests (Friedman's test and Wilcoxon Signed Rank Test). Eighty-four percent of respondents reported at least one problem with at least one equipment item (RA = 81.5%; FM = 88.9%), with most respondents reporting problems with their chair. Respondents most commonly used timing accommodation strategies to cope with mouse and keyboard problems, personal accommodation strategies to cope with chair problems and environmental accommodation strategies to cope with monitor problems. The number of problems during computer use was substantial in our sample, and our respondents with RA and FM may not implement the most effective strategies to deal with their chair, keyboard, or mouse problems. This study suggests that workers with RA and FM might potentially benefit from education and interventions to assist with the development of accommodation strategies to reduce problems related to computer use.

  14. Evaluation of the discrete vortex wake cross flow model using vector computers. Part 2: User's manual for DIVORCE

    Science.gov (United States)

    Deffenbaugh, F. D.; Vitz, J. F.

    1979-01-01

    The user's manual for the Discrete Vortex Cross flow Evaluator (DIVORCE) computer program is presented. DIVORCE was developed in FORTRAN IV for the CDC 6600 and CDC 7600 machines. Optional calls to a NASA vector subroutine package are provided for use with the CDC 7600.

  15. Factors affecting adoption behavior for Tablet device among computer users in Pakistan

    Directory of Open Access Journals (Sweden)

    Rafique Ahmed

    2016-12-01

    Full Text Available Mobile computing represents a need of this decade, and it is made possible by the tablet device, for which there is no clear-cut definition, partly because mobile computing is still an emerging field. The tablet industry is still in its infancy, and standards have yet to be defined. Given these limitations, however, a tablet device can be defined as a computing device smaller and slower than a laptop but larger and faster than a palm-type device. In this research work, factors affecting adoption behavior for the tablet device among computer users were studied. An integral part of the study was to compare the effect of income level on adoption behavior; in this regard, two samples of private and public university students were studied. A modified technology acceptance model (TAM) was used, with two variables added to the TAM model based on Pakistan’s demographics. A questionnaire was used to collect data: 1000 questionnaires were distributed, of which 972 were returned; twenty-two questionnaires had major missing values and were excluded from analysis, and twenty-five respondents were identified as outliers during data screening, leaving a final sample of 925. Results were analyzed using linear regression, which showed that only perceived ease of use and perceived usefulness affected the attitude toward adopting a tablet device. These results were consistent across both private and public universities. Facilitating conditions and price perception played an insignificant role. The results confirmed that perceived usefulness and ease of use are the only important factors affecting adoption behavior for the tablet device.
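The regression step in a TAM study of this kind amounts to ordinary least squares on survey-scale scores. A minimal univariate sketch with hypothetical 7-point Likert data (not the study's data, which involved multiple predictors):

```python
def ols(x, y):
    """Ordinary least squares fit y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical Likert scores: perceived usefulness vs. attitude to adopt.
pu       = [2, 3, 3, 4, 5, 5, 6, 7]
attitude = [2, 2, 3, 4, 4, 6, 6, 7]
a, b, r2 = ols(pu, attitude)
```

A significant positive slope with a high r² is what "perceived usefulness affected attitude" cashes out to; insignificant predictors like price perception would show slopes indistinguishable from zero.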

  16. LAURA Users Manual: 5.2-43231

    Science.gov (United States)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  17. Laura Users Manual: 5.1-41601

    Science.gov (United States)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  18. LAURA Users Manual: 5.3-48528

    Science.gov (United States)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  19. LAURA Users Manual: 5.5-64987

    Science.gov (United States)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.

    2013-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  20. LAURA Users Manual: 5.4-54166

    Science.gov (United States)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2011-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  1. Integrated Fuel-Coolant Interaction (IFCI 7.0) Code User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Young, Michael F.

    1999-05-01

    The integrated fuel-coolant interaction (IFCI) computer code is being developed at Sandia National Laboratories to investigate the fuel-coolant interaction (FCI) problem at large scale using a two-dimensional, three-field hydrodynamic framework and physically based models. IFCI will be capable of treating all major FCI processes in an integrated manner. This document is a description of IFCI 7.0. The user's manual describes the hydrodynamic method and physical models used in IFCI 7.0. Appendix A is an input manual provided for the creation of working decks.

  2. Game Analytics for Game User Research, Part 1

    DEFF Research Database (Denmark)

    Seif El-Nasr, Magy; Desurvire, Heather; Aghabeigi, Bardia

    2013-01-01

    The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design...

  3. Perspectives on the Massachusetts Community Health Information Profile (MassCHIP): developing an online data query system to target a variety of user needs and capabilities.

    Science.gov (United States)

    Cohen, Bruce B; Franklin, Saul; West, James K

    2006-01-01

    The Massachusetts Community Health Information Profile (MassCHIP) has many distinctive features. These features evolved to maximize the usefulness of this query system for a broad group of users with varied needs, differing levels of knowledge about public health, and diverse experience using public health data. Three major features of MassCHIP help target our large user population. These features are as follows: (1) multiple avenues of entry to initiate queries ranging from an alphabetical list of simple topics to detailed International Classification of Disease codes; (2) the inclusion of data sets from other state agencies in addition to those of the Massachusetts Department of Public Health to reflect a broad view of public health; and (3) the capacity to retrieve data for multiple levels of geography, from the neighborhood through the state, including planning districts and hospitals. In this article, we discuss the history and design of MassCHIP, and focus on the features of MassCHIP that target a great variety of user needs and capabilities, and which are distinctive among Web-based data query systems.

  4. Users guide to REGIONAL-1: a regional assessment model

    International Nuclear Information System (INIS)

    Davis, W.E.; Eadie, W.J.; Powell, D.C.

    1979-09-01

    A guide was prepared to allow a user to run the PNL long-range transport model, REGIONAL 1. REGIONAL 1 is a computer model set up to run atmospheric assessments on a regional basis. The model can be run in three modes for a single time period: (1) no deposition, (2) dry deposition, and (3) wet and dry deposition. The guide provides the physical and mathematical basis used in the model for calculating transport, diffusion, and deposition in all three modes. The guide also includes a program listing with an explanation of the listings and an example in the form of a short-term assessment for 48 hours. The purpose of the example is to allow a person with past experience in programming and meteorology to operate the assessment model and compare their results with the guide results. This comparison will assure the user that the program is operating in a proper fashion.
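    The three run modes listed in the abstract can be illustrated as first-order depletion of plume mass. This is a toy sketch, not REGIONAL 1's formulation; the rate constants below are invented for illustration, and only the 48-hour duration comes from the guide's example.

```python
import math

# Toy illustration of the three run modes as first-order plume depletion:
# Q(t) = Q0 * exp(-(k_dry + k_wet) * t). Rate constants are hypothetical.
def remaining_fraction(hours, k_dry=0.0, k_wet=0.0):
    return math.exp(-(k_dry + k_wet) * hours)

t = 48.0  # the guide's 48-hour example assessment
no_dep = remaining_fraction(t)                            # mode 1: no deposition
dry = remaining_fraction(t, k_dry=0.01)                   # mode 2: dry deposition
wet_dry = remaining_fraction(t, k_dry=0.01, k_wet=0.02)   # mode 3: wet + dry
print(no_dep, round(dry, 3), round(wet_dry, 3))
```

    Each added removal process depletes more of the plume, which is why the three modes bracket the deposition estimates in an assessment.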

  5. User's manual for the G.T.M.-1 computer code

    International Nuclear Information System (INIS)

    Prado-Herrero, P.

    1992-01-01

    This document describes the GTM-1 (Geosphere Transport Model, release 1) computer code and is intended to provide the reader with enough detailed information to use the code. GTM-1 was developed for the assessment of radionuclide migration by groundwater through geologic deposits whose properties can change along the pathway. GTM-1 solves the transport equation by the finite-difference method (Crank-Nicolson scheme). It was developed for specific use within Probabilistic System Assessment (PSA) Monte Carlo codes; in this context the first application of GTM-1 was within the LISA (Long Term Isolation System Assessment) code. GTM-1 is also available as an independent model, which includes various submodels simulating a multi-barrier disposal system. The code has been tested with the PSACOIN (Probabilistic System Assessment Codes Intercomparison) benchmark exercises from the PSAC User Group (OECD/NEA). 10 refs., 6 Annex., 2 tabs
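    The Crank-Nicolson finite-difference scheme named in the abstract can be sketched for the simplest case, 1-D diffusion of a concentration pulse. This is an illustration of the scheme only, not GTM-1 itself; grid size, time step, and diffusivity are arbitrary.

```python
import numpy as np

# Crank-Nicolson step for 1-D diffusion: (I - r*L) c_new = (I + r*L) c_old,
# where L is the discrete Laplacian and the boundaries are held at zero.
nx, dt, dx, D = 51, 0.5, 1.0, 1.0
r = D * dt / (2.0 * dx**2)

A = (np.diag(np.full(nx, 1 + 2 * r))
     + np.diag(np.full(nx - 1, -r), 1)
     + np.diag(np.full(nx - 1, -r), -1))
B = (np.diag(np.full(nx, 1 - 2 * r))
     + np.diag(np.full(nx - 1, r), 1)
     + np.diag(np.full(nx - 1, r), -1))

c = np.zeros(nx)
c[nx // 2] = 1.0                   # unit pulse of concentration at the centre
for _ in range(100):
    c = np.linalg.solve(A, B @ c)  # one implicit Crank-Nicolson step

print(round(c.sum(), 3))           # mass nearly conserved; small boundary loss
```

    The scheme is unconditionally stable and second-order accurate in time, which is why it is a common choice for transport codes of this kind.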

  6. URSULA2 computer program. Volume 3. User's manual. Final report

    International Nuclear Information System (INIS)

    Singhal, A.K.

    1980-01-01

    This report is intended to provide documentation for users of the URSULA2 code so that they can appreciate its important features, such as code structure, flow chart, grid notations, coding style, usage of secondary storage, and its interconnection with the input preparation program (Reference H3201/4). Subroutines and subprograms have been divided into four functional groups. The functions of all subroutines have been explained, with particular emphasis on the control subroutine (MAIN) and the data input subroutine (BLOCK DATA). Computations for flow situations similar to the reference case can be performed simply by making alterations in BLOCK DATA. Separate guides for the preparation of input data and for the interpretation of program output have been provided. Furthermore, two appendices, one containing the URSULA2 listing and the other a glossary of FORTRAN variables, are included to make this report self-sufficient.

  7. Brain Computer Interface on Track to Home.

    Science.gov (United States)

    Miralles, Felip; Vargiu, Eloisa; Dauwalder, Stefan; Solà, Marc; Müller-Putz, Gernot; Wriessnegger, Selina C; Pinegger, Andreas; Kübler, Andrea; Halder, Sebastian; Käthner, Ivo; Martin, Suzanne; Daly, Jean; Armstrong, Elaine; Guger, Christoph; Hintermüller, Christoph; Lowish, Hannah

    2015-01-01

    The novel BackHome system offers individuals with disabilities a range of useful services available via brain-computer interfaces (BCIs), to help restore their independence. This is the first time such technology is ready to be deployed in the real world, that is, at the target end users' home. This has been achieved by the development of practical electrodes, easy-to-use software, and telemonitoring and home support capabilities which have been conceived, implemented, and tested within a user-centred design approach. The final BackHome system is the result of a 3-year long process involving extensive user engagement to maximize effectiveness, reliability, robustness, and ease of use of a home based BCI system. The system is comprised of ergonomic and hassle-free BCI equipment; one-click software services for Smart Home control, cognitive stimulation, and web browsing; and remote telemonitoring and home support tools to enable independent home use for nonexpert caregivers and users. BackHome aims to successfully bring BCIs to the home of people with limited mobility to restore their independence and ultimately improve their quality of life.

  8. Graphical user interface to optimize image contrast parameters used in object segmentation - biomed 2009.

    Science.gov (United States)

    Anderson, Jeffrey R; Barrett, Steven F

    2009-01-01

    Image segmentation is the process of isolating distinct objects within an image. Computer algorithms have been developed to aid in the process of object segmentation, but a completely autonomous segmentation algorithm has yet to be developed [1]. This is because computers do not have the capability to understand images and recognize complex objects within the image. However, computer segmentation methods [2] requiring user input have been developed to quickly segment objects in serially sectioned images, such as magnetic resonance images (MRI) and confocal laser scanning microscope (CLSM) images. In these cases, the segmentation process becomes a powerful tool for visualizing the 3D nature of an object. User input is an important part of improving the performance of many segmentation methods. A double-threshold segmentation method has been investigated [3] to separate objects in gray-scale images where the gray level of the object lies among the gray levels of the background. In order to best determine the threshold values for this segmentation method, the image must be manipulated for optimal contrast. The same is true of other segmentation and edge-detection methods as well. Typically, the better the image contrast, the better the segmentation results. This paper describes a graphical user interface (GUI) that allows the user to easily change image contrast parameters that will optimize the performance of subsequent object segmentation. This approach makes use of the fact that the human brain is extremely effective at object recognition and understanding. The GUI provides the user with the ability to define the gray-scale range of the object of interest. The lower and upper bounds of this range are used in a histogram-stretching process to improve image contrast. Also, the user can interactively modify the gamma correction factor, which provides a non-linear distribution of gray-scale values, while observing the corresponding changes to the image.
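    The two contrast operations the abstract describes, histogram stretching between user-chosen gray-level bounds followed by gamma correction, can be sketched as below. The bounds and gamma value are illustrative user choices, not values from the paper.

```python
import numpy as np

def stretch(img, low, high):
    # Linearly map the user-selected range [low, high] to [0, 1],
    # clipping gray levels outside the bounds.
    out = (img.astype(float) - low) / (high - low)
    return np.clip(out, 0.0, 1.0)

def gamma_correct(img01, gamma):
    # Non-linear redistribution of gray levels; img01 must lie in [0, 1].
    return img01 ** gamma

img = np.array([[10, 60, 110], [160, 210, 255]], dtype=np.uint8)
stretched = stretch(img, low=60, high=210)      # user-defined object range
corrected = gamma_correct(stretched, gamma=0.5) # gamma < 1 brightens midtones
print(corrected.round(3))
```

    Stretching spreads the object's gray levels over the full display range, and the interactive gamma lets the user bias contrast toward dark or bright regions before thresholding.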

  9. Microgravity computing codes. User's guide

    Science.gov (United States)

    1982-01-01

    Codes used in microgravity experiments to compute fluid parameters and to obtain data graphically are introduced. The computer programs are stored on two diskettes compatible with the floppy disk drives of the Apple II. Two versions of both disks are available (DOS-2 and DOS-3). The codes are written in BASIC and are structured as interactive programs. Interaction takes place through the keyboard of any Apple II 48K standard system with a single floppy disk drive. The programs are protected against wrong commands given by the operator. The programs are described step by step in the same order as the instructions displayed on the monitor. Most of these instructions are shown, with samples of computation and of graphics.

  10. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    Science.gov (United States)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern Web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support the relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, preprocessing of modeling results, and visualization are also provided. All functions of the platform are accessible to a user through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, selection of a geographical region of interest (pan and zoom), data-layer manipulation (order, enable/disable, feature extraction), and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets

  11. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  12. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
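    The dose-accumulation idea in the two REMS abstracts above (timing and distance along a worker's path) can be sketched with a single point source and inverse-square falloff. REMS itself draws exposure data from transport codes or measurements; the source strength, positions, and units below are hypothetical.

```python
# Illustrative sketch (not REMS) of accumulating dose along a timed path:
# a point source at the origin with inverse-square falloff, sampled per step.
def accumulated_dose(path, dose_rate_at_1m, dt):
    """path: (x, y) worker positions, one per time step of dt hours."""
    total = 0.0
    for x, y in path:
        r2 = x * x + y * y                   # squared distance to the source
        total += dose_rate_at_1m / r2 * dt   # dose rate falls off as 1/r^2
    return total

# Worker spends 4 steps at 2 m from the source, then 2 steps at 1 m.
path = [(2.0, 0.0)] * 4 + [(1.0, 0.0)] * 2
dose = accumulated_dose(path, dose_rate_at_1m=10.0, dt=0.25)
print(dose)  # 4*(10/4)*0.25 + 2*(10/1)*0.25 = 7.5 (hypothetical units)
```

    Shielding and source geometry, which REMS also models, would enter as additional attenuation factors on each step's rate.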

  13. A user-friendly SSVEP-based brain-computer interface using a time-domain classifier.

    Science.gov (United States)

    Luo, An; Sullivan, Thomas J

    2010-04-01

    We introduce a user-friendly steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) system. Single-channel EEG is recorded using a low-noise dry electrode. Compared to traditional gel-based multi-sensor EEG systems, a dry sensor proves to be more convenient, comfortable and cost effective. A hardware system was built that displays four LED light panels flashing at different frequencies and synchronizes with EEG acquisition. The visual stimuli have been carefully designed such that potential risk to photosensitive people is minimized. We describe a novel stimulus-locked inter-trace correlation (SLIC) method for SSVEP classification using EEG time-locked to stimulus onsets. We studied how the performance of the algorithm is affected by different selection of parameters. Using the SLIC method, the average light detection rate is 75.8% with very low error rates (an 8.4% false positive rate and a 1.3% misclassification rate). Compared to a traditional frequency-domain-based method, the SLIC method is more robust (resulting in less annoyance to the users) and is also suitable for irregular stimulus patterns.
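    The stimulus-locked inter-trace correlation (SLIC) idea described above can be sketched as: epoch the EEG at each stimulus onset, then average the pairwise correlations between epochs. A phase-locked (attended) response should score higher than unlocked background activity. All signal parameters below are illustrative, not those of the paper.

```python
import numpy as np

def slic_score(eeg, onsets, epoch_len):
    # Cut one epoch per stimulus onset and average the off-diagonal
    # entries of the inter-trace correlation matrix.
    epochs = np.array([eeg[o:o + epoch_len] for o in onsets])
    corr = np.corrcoef(epochs)
    iu = np.triu_indices(len(onsets), 1)   # each epoch pair counted once
    return corr[iu].mean()

rng = np.random.default_rng(1)
fs, epoch_len = 250, 50                    # 250 Hz sampling, 200 ms epochs
t = np.arange(fs * 10) / fs                # 10 s of synthetic data
ssvep = np.sin(2 * np.pi * 10 * t)         # 10 Hz steady-state response
noise = rng.normal(0.0, 1.0, t.size)

onsets = np.arange(0, 2000, 50)            # every 200 ms: phase-locked to 10 Hz
locked = slic_score(ssvep + 0.5 * noise, onsets, epoch_len)
unlocked = slic_score(noise, onsets, epoch_len)
print(locked > unlocked)
```

    Working in the time domain on stimulus-locked traces is what makes the approach usable with irregular stimulus patterns, where a fixed-frequency spectral peak would be smeared.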

  14. Muddy Learning: Evaluating Learning in Multi-User Computer-Based Environments

    National Research Council Canada - National Science Library

    McArthur, David

    1998-01-01

    ... (Multiple User Synthetic Environments), and MOOs (Multi-User Object Oriented), enables users to create new "rooms" in virtual worlds, define their own personnaes, and engage visitors in rich dialogues...

  15. Towards User-Friendly Spelling with an Auditory Brain-Computer Interface: The CharStreamer Paradigm

    Science.gov (United States)

    Höhne, Johannes; Tangermann, Michael

    2014-01-01

    Realizing the decoding of brain signals into control commands, brain-computer interfaces (BCI) aim to establish an alternative communication pathway for locked-in patients. In contrast to most visual BCI approaches, which use event-related potentials (ERP) of the electroencephalogram, auditory BCI systems are challenged with ERP responses, which are less class-discriminant between attended and unattended stimuli. Furthermore, these auditory approaches have more complex interfaces, which impose a substantial workload on their users. Aiming for a maximally user-friendly spelling interface, this study introduces a novel auditory paradigm: “CharStreamer”. The speller can be used with an instruction as simple as “please attend to what you want to spell”. The stimuli of CharStreamer comprise 30 spoken sounds of letters and actions. As each of them is represented by the sound of itself and not by an artificial substitute, it can be selected in a one-step procedure. The mental mapping effort (sound stimuli to actions) is thus minimized. Usability is further accounted for by an alphabetical stimulus presentation: contrary to random presentation orders, the user can foresee the presentation time of the target letter sound. Healthy, normal-hearing users (n = 10) of the CharStreamer paradigm displayed ERP responses that systematically differed between target and non-target sounds. Class-discriminant features, however, varied individually from the typical N1-P2 complex and P3 ERP components found in control conditions with random sequences. To fully exploit the sequential presentation structure of CharStreamer, novel data analysis approaches and classification methods were introduced. The results of online spelling tests showed that a competitive spelling speed can be achieved with CharStreamer. With respect to user rating, it clearly outperforms a control setup with random presentation sequences. PMID:24886978

  16. Transient analysis capabilities at ABB-CE

    International Nuclear Information System (INIS)

    Kling, C.L.

    1992-01-01

    The transient analysis capabilities at ABB-Combustion Engineering (ABB-CE) Nuclear Power are a function of the computer hardware and related network used, the computer software that has evolved over the years, and the commercial technical exchange agreements with other related organizations and customers. ABB-CE is changing from a mainframe/personal computer network to a distributed workstation/personal computer local area network. The paper discusses computer hardware, mainframe computing, personal computers, mainframe/personal computer networks, workstations, transient analysis computer software, design/operation transient analysis codes, safety (licensed) analysis codes, cooperation with ABB-Atom, and customer support.

  17. Advanced simulation capability for environmental management - current status and future applications

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, Mark; Scheibe, Timothy [Pacific Northwest National Laboratory, Richland, Washington (United States); Robinson, Bruce; Moulton, J. David; Dixon, Paul [Los Alamos National Laboratory, Los Alamos, New Mexico (United States); Marble, Justin; Gerdes, Kurt [U.S. Department of Energy, Office of Environmental Management, Washington DC (United States); Stockton, Tom [Neptune and Company, Inc, Los Alamos, New Mexico (United States); Seitz, Roger [Savannah River National Laboratory, Aiken, South Carolina (United States); Black, Paul [Neptune and Company, Inc, Lakewood, Colorado (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater (EM-12), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach that is currently aimed at understanding and predicting contaminant fate and transport in natural and engineered systems. ASCEM is a modular and open-source high-performance computing tool. It will be used to facilitate integrated approaches to modeling and site characterization, and provide robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of capabilities, with current emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and the High-Performance Computing (HPC) multi-process simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with the Platform, and verification and model confidence testing. The integration of the Platform and HPC capabilities was tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities in 2012. The current maturity of the ASCEM computational and analysis capabilities has afforded the opportunity for collaborative efforts to develop decision analysis tools to support and optimize radioactive waste disposal. Recent advances in computerized decision analysis frameworks provide the perfect opportunity to bring this capability into ASCEM. This will allow radioactive waste

  18. User Interface Design for Dynamic Geometry Software

    Science.gov (United States)

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  19. EDTGRAF, DISSPLA User Interface Program

    International Nuclear Information System (INIS)

    Bloom, I.

    1989-01-01

    1 - Description of program or function: EDTGRAF is a graphics package that allows the user to access the high-quality graphics available in Computer Associates' (CA) DISSPLA without any knowledge of programming. EDTGRAF reduces the complex syntax of DISSPLA to simple menus. The use of menus decreases computer time spent preprocessing command input. High-quality graphics can be produced quickly in two and three dimensions and in color. EDTGRAF has capabilities for storing and later retrieving information needed to produce graphics. EDTGRAF will screen most input for entries that do not make sense in the current graph and will produce a meaningful error message. Two orientations are available for plots - COMIC, which produces a plot with the y-axis perpendicular to the long axis of the paper, and MOVIE, which produces a plot with the y-axis parallel to the long axis of the paper. 2 - Restrictions on the complexity of the problem - Maxima of: 1000 ordered pairs (or triplets) per dataset, 18 datasets per graph. Errors generated by DISSPLA are not trapped by EDTGRAF. EDTGRAF assumes the paper is 8.5 x 11 inches

  20. Scale-4 and related modular systems for the evaluation of nuclear facilities and package design featuring criticality, shielding and transfer capabilities

    International Nuclear Information System (INIS)

    1994-01-01

    Nuclear industry, licensing, and regulatory authorities need to be able to rely on good performance of the computer codes and nuclear data used in calculations for the design and operation of nuclear energy facilities. Given the international impact of a major nuclear accident, and the current crisis in public confidence, it is equally important that the methods, programs, and data issued should be internationally accepted. The SCALE modular system has been developed and its capabilities extended during the last 15 years. The driving idea behind its development is that it should contain well-established computer codes and data libraries, have a user-friendly input format, combine and automate analyses requiring multiple computer codes or calculations into standard analytic sequences, and be well documented and publicly available. The fifth version, called SCALE-4, has now been released through the Radiation Shielding Information Center (RSIC) to the OECD/NEA Data Bank. SCALE is now used worldwide; the NEA Data Bank alone has distributed more than one hundred copies of the different versions. The OECD/NEA Data Bank has been asked by its international management committee to hold a seminar with the purpose of exchanging information on the latest developments and experiences among code authors and users, ensuring that users have a correct understanding of how SCALE should be used to model different problems, and issuing recommendations for further development and benchmarking.

  1. Geochemical computer codes. A review

    International Nuclear Information System (INIS)

    Andersson, K.

    1987-01-01

    In this report a review of available codes is performed and some code intercomparisons are discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base of the thermodynamic data required for the calculations. Some codes treat coupled geochemical and transport modeling; of these, some solve the equilibrium and transport equations simultaneously while others solve the equations separately. The coupled codes require a large computer capacity and have thus had limited use so far. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain a reliable result. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)
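At its core, a geochemical equilibrium code couples a thermodynamic data base with a numerical solver for a set of mass-action and balance equations. As a hedged illustration (not any of the reviewed codes), the sketch below solves the speciation of a single weak acid in pure water by bisection on the hydrogen-ion concentration; the two constants play the role of the code's data base, with illustrative acetic-acid values.

```python
# Minimal sketch of a geochemical-equilibrium calculation: speciation of a
# weak acid HA <-> H+ + A- in pure water, solved by bisection on [H+].
# KA and KW stand in for the thermodynamic data base (values illustrative).
import math

KA = 1.75e-5   # acid dissociation constant (acetic acid, illustrative)
KW = 1.0e-14   # ion product of water
C_TOTAL = 0.01 # total acid concentration, mol/L

def charge_balance(h):
    """[H+] - [OH-] - [A-]; zero at equilibrium."""
    a_minus = C_TOTAL * KA / (KA + h)   # mass action combined with mass balance
    return h - KW / h - a_minus

def solve_h(lo=1e-14, hi=1.0, tol=1e-18):
    # bisection works because charge_balance is monotone increasing in h
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if charge_balance(mid) > 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

pH = -math.log10(solve_h())  # ≈ 3.39 for these constants
```

A production code solves the same kind of system for dozens of coupled species, which is why the quality of the data base dominates the reliability of the result.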

  2. New ROOT Graphical User Interfaces for fitting

    International Nuclear Information System (INIS)

    Maline, D Gonzalez; Moneta, L; Antcheva, I

    2010-01-01

    ROOT, as a scientific data analysis framework, provides extensive capabilities via Graphical User Interfaces (GUI) for performing interactive analysis and visualizing data objects like histograms and graphs. A new interface for fitting has been developed for performing, exploring, and comparing fits on data point sets such as histograms, multi-dimensional graphs, or trees. With this new interface, users can interactively build the fit model function, set parameter values and constraints, and select fit and minimization methods with their options. Functionality for visualizing the fit results is also provided, with the possibility of drawing residuals or confidence intervals. Furthermore, the new fit panel acts as a standalone application and does not prevent users from interacting with other windows. We describe in detail the functionality of this user interface, covering as well the new capabilities provided by the fitting and minimization tools introduced recently in the ROOT framework.
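The fit-and-residuals workflow such a panel exposes can be sketched outside ROOT. The following is a plain least-squares line fit on a point set, with the residuals (data minus model) that a fit panel would draw; it is a generic illustration, not ROOT's API.

```python
# Generic sketch of the fit/residual workflow a fit panel exposes:
# ordinary least squares for y = a + b*x on a set of (x, y) points.
def fit_line(points):
    """Least-squares fit y = a + b*x; returns (a, b)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def residuals(points, a, b):
    """Data minus fitted model -- the quantity a fit panel can draw."""
    return [y - (a + b * x) for x, y in points]

pts = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.0)]
a, b = fit_line(pts)  # a ≈ 1.03, b ≈ 1.98
```

ROOT's interface adds on top of this core the choice of minimizer, parameter constraints, and confidence-interval drawing described in the abstract.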

  3. Presto 4.14 users guide.

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, Benjamin Whiting

    2009-10-01

    Presto is a three-dimensional transient dynamics code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. Contact capabilities are parallel and scalable. The Presto 4.14 User's Guide provides information about the functionality in Presto and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Presto is similar to that of the code Adagio [3]. Adagio is a three-dimensional quasi-static code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. Adagio, like Presto, is built on the SIERRA Framework [1]. Contact capabilities for Adagio are also parallel and scalable. A significant feature of Adagio is that it offers a multilevel, nonlinear iterative solver. Because of the similarities in input and usage between Presto and Adagio, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Adagio may be found in the Presto user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example

  4. Presto 4.16 users guide.

    Energy Technology Data Exchange (ETDEWEB)

    2010-05-01

    Presto is a three-dimensional transient dynamics code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. Contact capabilities are parallel and scalable. The Presto 4.16 User's Guide provides information about the functionality in Presto and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Presto is similar to that of the code Adagio [3]. Adagio is a three-dimensional quasi-static code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. Adagio, like Presto, is built on the SIERRA Framework [1]. Contact capabilities for Adagio are also parallel and scalable. A significant feature of Adagio is that it offers a multilevel, nonlinear iterative solver. Because of the similarities in input and usage between Presto and Adagio, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Adagio may be found in the Presto user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example

  5. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    OpenAIRE

    P. O. Umenne; M. O. Odhiambo

    2012-01-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and when the task terminates, the Agents send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents’ ex...

  6. User involvement competence for radical innovation

    DEFF Research Database (Denmark)

    Lettl, Christopher

    2007-01-01

    One important market related capability for firms which seek to develop radical innovations is the competence to involve the 'right' users at the 'right' time in the 'right' form. While former studies have identified a rather passive role of users in the radical innovation process, this paper ... -assisted navigation systems. The case study analysis reveals that firms who closely interact with specific users benefit significantly for their radical innovation work. These users have a high motivation toward new solutions, are open to new technologies, possess diverse competencies, and are embedded into a very ...

  7. User involvement competence for radical innovation

    DEFF Research Database (Denmark)

    Lettl, Christopher

    2007-01-01

    One important market related capability for firms which seek to develop radical innovations is the competence to involve the 'right' users at the 'right' time in the 'right' form. While former studies have identified a rather passive role of users in the radical innovation process, this paper......-assisted navigation systems. The case study analysis reveals that firms who closely interact with specific users benefit significantly for their radical innovation work. These users have a high motivation toward new solutions, are open to new technologies, possess diverse competencies, and are embedded into a very...

  8. Evaluation of secure capability-based access control in the M2M local cloud platform

    DEFF Research Database (Denmark)

    Anggorojati, Bayu; Prasad, Neeli R.; Prasad, Ramjee

    2016-01-01

    Managing access to and protecting resources is one of the important aspects of managing security, especially in a distributed computing system such as Machine-to-Machine (M2M). One such platform, known as the M2M local cloud platform and referring to the BETaaS architecture [1], conceptually consists of multiple distributed M2M gateways, creating new challenges in access control. Some existing access control systems lack the scalability and flexibility to manage access from users or entities that belong to different authorization domains, or fail to provide fine-grained and flexible access right delegation. Recently, capability based access control has been considered as a method to manage access in the Internet of Things (IoT) or M2M domain. In this paper, the implementation and evaluation of a proposed secure capability based access control in the M2M local cloud platform is presented.
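The core idea of capability based access control — rights carried in an unforgeable token rather than looked up in a central access control list — can be sketched generically. This is an illustration only, not the BETaaS platform's implementation, and all names and the signing scheme are hypothetical.

```python
# Minimal sketch of capability-based access control: the access right travels
# with a signed, unforgeable token. Generic illustration; not BETaaS code.
import hashlib, hmac

SECRET = b"gateway-signing-key"  # hypothetical per-gateway signing key

def issue_capability(subject, resource, rights):
    """Issue a signed capability granting `rights` on `resource` to `subject`."""
    payload = f"{subject}|{resource}|{','.join(sorted(rights))}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def check_access(cap, resource, right):
    """Verify the token's signature, then check the requested right."""
    expected = hmac.new(SECRET, cap["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, cap["sig"]):
        return False  # forged or tampered token
    _, res, rights = cap["payload"].split("|")
    return res == resource and right in rights.split(",")

cap = issue_capability("sensor-17", "temperature-feed", {"read"})
```

Because the token itself carries and authenticates the right, a gateway can honor capabilities issued in another authorization domain without consulting a central list — the scalability and delegation property the abstract is concerned with.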

  9. Web-based computational chemistry education with CHARMMing I: Lessons and tutorial.

    Science.gov (United States)

    Miller, Benjamin T; Singh, Rishi P; Schalk, Vinushka; Pevzner, Yuri; Sun, Jingjun; Miller, Carrie S; Boresch, Stefan; Ichiye, Toshiko; Brooks, Bernard R; Woodcock, H Lee

    2014-07-01

    This article describes the development, implementation, and use of web-based "lessons" to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that "point and click" simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance.

  10. Web-based computational chemistry education with CHARMMing I: Lessons and tutorial.

    Directory of Open Access Journals (Sweden)

    Benjamin T Miller

    2014-07-01

    Full Text Available This article describes the development, implementation, and use of web-based "lessons" to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that "point and click" simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance.

  11. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Ann E [ORNL; Bland, Arthur S Buddy [ORNL; Hack, James J [ORNL; Barker, Ashley D [ORNL; Boudwin, Kathlyn J. [ORNL; Kendall, Ricky A [ORNL; Messer, Bronson [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL; Wells, Jack C [ORNL; White, Julia C [ORNL

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and

  12. The user facility FELIX: Past, present and future

    International Nuclear Information System (INIS)

    Meer, A.F.G. van der; Amersfoort, P.W. van

    1995-01-01

    The performance over the past year and the current user-relevant characteristics of the User Facility FELIX will be discussed. Also the existing plans for improving and extending the capabilities and provisions will be presented

  13. The Identification, Implementation, and Evaluation of Critical User Interface Design Features of Computer-Assisted Instruction Programs in Mathematics for Students with Learning Disabilities

    Science.gov (United States)

    Seo, You-Jin; Woo, Honguk

    2010-01-01

    Critical user interface design features of computer-assisted instruction programs in mathematics for students with learning disabilities and corresponding implementation guidelines were identified in this study. Based on the identified features and guidelines, a multimedia computer-assisted instruction program, "Math Explorer", which delivers…

  14. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. 
The ASCEM team engaged a variety of interested parties early in the development

  15. CAL--ERDA users manual. [Building Design Language; LOADS, SYSTEMS, PLANT, ECONOMICS, REPORT, EXECUTIVE, CAL-ERDA

    Energy Technology Data Exchange (ETDEWEB)

    Graven, R. M.; Hirsch, P. R.

    1977-10-30

    A new set of computer programs capable of rapid and detailed analysis of energy consumption in buildings is described. The Building Design Language (BDL) has been written to allow simplified manipulation of the many variables used to describe a building and its operation. Programs presented in this manual include: (1) a Building Design Language program to analyze the input instructions, execute computer system control commands, perform data assignments and data retrieval, and control the operation of the LOADS, SYSTEMS, PLANT, ECONOMICS, and REPORT programs; (2) a LOADS analysis program which calculates peak (design) loads and hourly space loads due to ambient weather conditions and the internal occupancy, lighting, and equipment within the building, as well as variations in the size, location, orientation, construction, walls, roofs, floors, fenestrations, attachments (awnings, balconies), and shape of a building; (3) a Heating, Ventilating, and Air-Conditioning (HVAC) SYSTEMS program capable of modeling the operation of HVAC components, including fans, coils, economizers, and humidifiers; (4) a PLANT equipment program which models the operation of boilers, chillers, electrical-generation equipment (e.g., diesel engines or turbines), heat-storage apparatus (e.g., chilled or heated water) and solar heating and/or cooling systems; (5) an ECONOMICS analysis program which calculates life-cycle costs; (6) a REPORT program which produces tables of user-selected variables and arranges them according to user-selected formats; and (7) an EXECUTIVE processor to create computer-system control commands. Libraries of weather data, typical schedule data, and data on the properties of walls, roofs, and floors are available.
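The kind of hourly space-load calculation the LOADS program performs can be caricatured in a few lines: envelope conduction plus internal gains, evaluated hour by hour. All coefficients and numbers below are illustrative placeholders; the real program models geometry, weather, schedules, and much more.

```python
# Toy sketch of a LOADS-style hourly space-load calculation:
# envelope conduction plus internal gains. Coefficients are illustrative.
def hourly_space_load(ua, t_indoor, t_outdoor, internal_gains):
    """Sensible load in W: conduction through the envelope plus internal gains.
    Positive = heat to be removed (cooling); negative = heating needed."""
    return ua * (t_outdoor - t_indoor) + internal_gains

weather = [28.0, 30.0, 32.0, 31.0]               # outdoor temperature, deg C
occupancy_gains = [500.0, 800.0, 800.0, 500.0]   # people + lights + equipment, W
loads = [hourly_space_load(250.0, 24.0, t, g)    # UA = 250 W/K (illustrative)
         for t, g in zip(weather, occupancy_gains)]
peak = max(loads)  # the "peak (design) load" the abstract refers to
```

The peak over all hours sizes the equipment; the hourly series feeds the SYSTEMS and PLANT simulations downstream.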

  16. Tingling/numbness in the hands of computer users: neurophysiological findings from the NUDATA study

    DEFF Research Database (Denmark)

    Overgaard, E.; Brandt, L. P.; Ellemann, K.

    2004-01-01

    OBJECTIVES: To investigate whether tingling/numbness of the hands and fingers among computer users is associated with an elevated vibration threshold as a sign of early nerve compression. METHODS: Within the Danish NUDATA study, vibratory sensory testing with monitoring of the digital vibration ... once a week or daily within the last 3 months. Participants with more than slight muscular pain or disorders of the neck and upper extremities, excessive alcohol consumption, previous injuries of the upper extremities, or concurrent medical diseases were excluded. The two groups had a similar amount of work with mouse, keyboard, and computer. RESULTS: Seven of the 20 cases (35%) had elevated vibration thresholds, compared with 3 of the 20 controls (15%); this difference was not statistically significant (chi2=2.13, P=0.14). Compared with controls, cases had an increased perception threshold for all ...
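The reported test statistic can be reproduced from the 2x2 table given in the record (7 of 20 cases versus 3 of 20 controls with elevated thresholds):

```python
# Reproducing the reported chi-square from the 2x2 table above:
# 7 of 20 cases and 3 of 20 controls had elevated vibration thresholds.
def chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 table [[a, b], [c, d]],
    without continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

chi2 = chi2_2x2(7, 13, 3, 17)  # ≈ 2.13, matching the value in the record
```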

  17. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  18. COMMIX-1A: a three-dimensional transient single-phase computer program for thermal hydraulic analysis of single and multicomponent systems. Volume I: users manual

    International Nuclear Information System (INIS)

    Domanus, H.M.; Schmitt, R.C.; Sha, W.T.; Shah, V.L.

    1983-12-01

    The COMMIX-1A computer program is an updated and improved version of COMMIX-1 designed to analyze steady-state/transient, single-phase, three-dimensional fluid flow with heat transfer in reactor components and multicomponent systems. A new porous-media formulation via local volume averaging has been derived and employed in the COMMIX code. The concepts of volume porosity, directional surface permeability, distributed resistance, and distributed heat source or sink are used in the new porous-media formulation to model a flow domain with stationary structures. The concept of directional surface permeability is new and greatly facilitates modeling of velocity and temperature fields in anisotropic media. The new porous-media formulation represents the first unified approach to thermal-hydraulic analysis. It is now possible to perform a multidimensional thermal-hydraulic simulation of either a single component, such as a rod bundle, reactor plenum, piping system, or heat exchanger, or a multicomponent system that is a combination of these components. The conservation equations of mass, momentum, and energy based on the new porous-media formulation are solved as a boundary-value problem in space and an initial-value problem in time. Two other unique features provided in the COMMIX-1A code are (1) two solution procedures, available as a user's option - a semi-implicit procedure modified from ICE and a fully-implicit procedure, named SIMPLEST-ANL, similar to the SIMPLE/SIMPLER algorithms - and (2) a geometrical package capable of approximating many geometries. This report (Volume I) describes in detail the basic equations, formulations, solution procedures, flow charts, the rebalancing scheme for faster convergence, options available to users, models describing the auxiliary phenomena, input instructions, and two sample problems. Volume II assembles and summarizes the results of many simulations that have been performed with the COMMIX-1A computer program
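The solution strategy described above — an initial-value problem in time whose spatial boundary-value problem is solved at every step — can be illustrated with a one-dimensional fully-implicit heat-conduction sketch using a tridiagonal (Thomas) solve. This is a generic illustration, not COMMIX's discretization or solution procedure.

```python
# Schematic of the "boundary-value in space, initial-value in time" strategy:
# one fully-implicit time step of dT/dt = alpha * d2T/dx2 with fixed-T ends,
# solved with the Thomas algorithm. Generic illustration, not COMMIX code.
def step_implicit(T, alpha, dx, dt, T_left, T_right):
    n = len(T)
    r = alpha * dt / dx**2
    # tridiagonal system: -r*T[i-1] + (1+2r)*T[i] - r*T[i+1] = T_old[i]
    a = [-r] * n          # sub-diagonal
    b = [1 + 2 * r] * n   # diagonal
    c = [-r] * n          # super-diagonal
    d = list(T)
    d[0] += r * T_left    # boundary values fold into the right-hand side
    d[-1] += r * T_right
    for i in range(1, n):               # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    T_new = [0.0] * n
    T_new[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):      # back substitution
        T_new[i] = (d[i] - c[i] * T_new[i + 1]) / b[i]
    return T_new

T = [0.0] * 9  # nine interior nodes on a unit slab
for _ in range(200):
    T = step_implicit(T, alpha=1.0, dx=0.1, dt=0.01, T_left=1.0, T_right=0.0)
# T relaxes toward the linear steady profile between the two boundary values
```

A fully-implicit scheme of this kind stays stable at large time steps, which is one motivation for offering it alongside a semi-implicit procedure as a user option.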

  19. Development of point Kernel radiation shielding analysis computer program implementing recent nuclear data and graphic user interfaces

    International Nuclear Information System (INIS)

    Kang, S.; Lee, S.; Chung, C.

    2002-01-01

    There is an increasing demand for the safe and efficient use of radiation, and for the shielding analysis that accompanies radiation work, as the number of nuclear and conventional facilities using radiation or radioisotopes rises. Most Korean industries and research institutes, including Korea Power Engineering Company (KOPEC), have been using foreign computer programs for radiation shielding analysis. Korean nuclear regulations have introduced new laws regarding the dose limits and radiological guides prescribed in ICRP 60, so radiation facilities must be designed and operated to comply with these new regulations. In addition, the previous point kernel shielding computer code utilizes antiquated nuclear data (mass attenuation coefficients, buildup factors, etc.) that were developed in the 1950s and 1960s, while these nuclear data have been updated during the past few decades. KOPEC's strategic directive is to become a self-sufficient and independent nuclear design technology company, so KOPEC decided to develop a new radiation shielding computer program that incorporates the latest regulatory requirements and updated nuclear data. The new code was designed by KOPEC in cooperation with the Department of Nuclear Engineering at Hanyang University. VisualShield is designed with a graphical user interface to allow even users unfamiliar with radiation shielding theory to proficiently prepare input data sets and analyze output results
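At the heart of such a code, the point kernel method attenuates the uncollided point-source flux through the shield and corrects it with a buildup factor: phi = S * B(mu*r) * exp(-mu*r) / (4*pi*r^2). A minimal sketch follows; the attenuation coefficient and the linear buildup form are illustrative placeholders, not the evaluated data the abstract discusses.

```python
# Core of a point kernel calculation: uncollided flux attenuated through a
# shield and corrected by a buildup factor. mu and the buildup form below
# are illustrative placeholders, not evaluated nuclear data.
import math

def point_kernel_flux(S, mu, r, buildup):
    """Flux at distance r (cm) from a point source of strength S (photons/s),
    through a medium with attenuation coefficient mu (1/cm)."""
    mfp = mu * r              # shield thickness in mean free paths
    B = buildup(mfp)          # buildup factor B(mu*r) >= 1
    return S * B * math.exp(-mfp) / (4 * math.pi * r**2)

# hypothetical linear buildup form B = 1 + 0.8*mu*r, standing in for the
# tabulated buildup-factor data that the abstract says has been updated
flux = point_kernel_flux(S=1e9, mu=0.06, r=100.0, buildup=lambda x: 1 + 0.8 * x)
```

Updating the mass attenuation coefficients and buildup-factor tables changes only the data fed to this kernel, which is why refreshing the nuclear data was central to the new code.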

  20. Comparison of capability between two versions of reactor transient diagnosis expert system 'DISKET' programmed in different languages

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Yoshida, Kazuo

    1991-01-01

    An expert system, DISKET, has been developed at JAERI to apply knowledge engineering techniques to the transient diagnosis of nuclear power plants. The first version of DISKET, programmed in UTILISP, was developed on the main-frame computer FACOM M-780 at JAERI. The LISP language is not suitable for on-line diagnostic systems because it is highly dependent on the computer used and requires a large amount of memory. A large mainframe computer is also unsuitable because, as a multi-user system, it imposes various restrictions. The second version of DISKET, intended for practical use, was developed in FORTRAN to realize on-line real-time diagnosis with limited computer resources. The two versions of DISKET, running the same knowledge base, have been compared for runtime performance, and it has been found that the LISP version of DISKET needs more than twice the memory and CPU time of the FORTRAN version. From this result, it is shown that this approach is a practical one for developing expert systems for on-line real-time diagnosis of transients with limited computer resources. (author)

  1. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER is used to provide post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed

  2. Decision-making in stimulant and opiate addicts in protracted abstinence: evidence from computational modeling with pure users.

    Science.gov (United States)

    Ahn, Woo-Young; Vasilev, Georgi; Lee, Sung-Ha; Busemeyer, Jerome R; Kruschke, John K; Bechara, Antoine; Vassileva, Jasmin

    2014-01-01

    Substance dependent individuals (SDI) often exhibit decision-making deficits; however, it remains unclear whether the nature of the underlying decision-making processes is the same in users of different classes of drugs and whether these deficits persist after discontinuation of drug use. We used computational modeling to address these questions in a unique sample of relatively "pure" amphetamine-dependent (N = 38) and heroin-dependent individuals (N = 43) who were currently in protracted abstinence, and in 48 healthy controls (HC). A Bayesian model comparison technique, a simulation method, and parameter recovery tests were used to compare three cognitive models: (1) Prospect Valence Learning with decay reinforcement learning rule (PVL-DecayRI), (2) PVL with delta learning rule (PVL-Delta), and (3) Value-Plus-Perseverance (VPP) model based on Win-Stay-Lose-Switch (WSLS) strategy. The model comparison results indicated that the VPP model, a hybrid model of reinforcement learning (RL) and a heuristic strategy of perseverance had the best post-hoc model fit, but the two PVL models showed better simulation and parameter recovery performance. Computational modeling results suggested that overall all three groups relied more on RL than on a WSLS strategy. Heroin users displayed reduced loss aversion relative to HC across all three models, which suggests that their decision-making deficits are longstanding (or pre-existing) and may be driven by reduced sensitivity to loss. In contrast, amphetamine users showed comparable cognitive functions to HC with the VPP model, whereas the second best-fitting model with relatively good simulation performance (PVL-DecayRI) revealed increased reward sensitivity relative to HC. These results suggest that some decision-making deficits persist in protracted abstinence and may be mediated by different mechanisms in opiate and stimulant users.
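The PVL-Delta model named above combines a prospect-theory utility function with a delta learning rule over deck expectancies. A minimal sketch of those core equations, as commonly specified in this modeling literature, is given below; the parameter values are illustrative, not the paper's fitted estimates.

```python
# Sketch of the PVL-Delta core: prospect-utility valuation of each payoff,
# plus a delta rule that moves the chosen option's expectancy toward it.
# Parameter values are illustrative, not fitted estimates from the study.
import math

def utility(x, alpha=0.5, lam=1.5):
    """Prospect utility: diminishing sensitivity; losses weighted by lam."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def pvl_delta_update(ev, chosen, payoff, lr=0.2):
    """Delta rule: nudge the chosen option's expectancy toward the outcome."""
    ev = list(ev)
    ev[chosen] += lr * (utility(payoff) - ev[chosen])
    return ev

def choice_probs(ev, consistency=1.0):
    """Softmax choice rule over the current expectancies."""
    z = [math.exp(consistency * v) for v in ev]
    s = sum(z)
    return [v / s for v in z]

ev = pvl_delta_update([0.0, 0.0], chosen=0, payoff=100)   # win on option 0
ev = pvl_delta_update(ev, chosen=1, payoff=-100)          # loss on option 1
```

The loss-aversion parameter `lam` here is the quantity the abstract reports as reduced in heroin users relative to healthy controls.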

  3. Decision-making in stimulant and opiate addicts in protracted abstinence: evidence from computational modeling with pure users

    Directory of Open Access Journals (Sweden)

    Woo-Young eAhn

    2014-08-01

    Full Text Available Substance dependent individuals (SDI) often exhibit decision-making deficits; however, it remains unclear whether the nature of the underlying decision-making processes is the same in users of different classes of drugs and whether these deficits persist after discontinuation of drug use. We used computational modeling to address these questions in a unique sample of relatively pure amphetamine-dependent (N=38) and heroin-dependent individuals (N=43) who were currently in protracted abstinence, and in 48 healthy controls. A Bayesian model comparison technique, a simulation method, and parameter recovery tests were used to compare three cognitive models: (1) Prospect Valence Learning with decay reinforcement learning rule (PVL-DecayRI), (2) PVL with delta learning rule (PVL-Delta), and (3) the Value-Plus-Perseverance (VPP) model based on a Win-Stay-Lose-Switch (WSLS) strategy. The model comparison results indicated that the VPP model, a hybrid model of reinforcement learning (RL) and a heuristic strategy of perseverance, had the best post hoc model fit, but the two PVL models showed better simulation performance. Computational modeling results suggested that overall all three groups relied more on RL than on a WSLS strategy. Heroin users displayed reduced loss aversion relative to healthy controls across all three models, which suggests that their decision-making deficits are longstanding (or pre-existing) and may be driven by reduced sensitivity to loss. In contrast, amphetamine users showed comparable cognitive functions to healthy controls with the VPP model, whereas the second best-fitting model with relatively good simulation performance (PVL-DecayRI) revealed increased reward sensitivity relative to healthy controls. These results suggest that some decision-making deficits persist in protracted abstinence and may be mediated by different mechanisms in opiate and stimulant users.

  4. Wearable computer for mobile augmented-reality-based controlling of an intelligent robot

    Science.gov (United States)

    Turunen, Tuukka; Roening, Juha; Ahola, Sami; Pyssysalo, Tino

    2000-10-01

    An intelligent robot can be utilized to perform tasks that are either hazardous or unpleasant for humans. Such tasks include working in disaster areas or conditions that are, for example, too hot. An intelligent robot can work on its own to some extent, but in some cases the aid of humans will be needed. This requires means for controlling the robot from somewhere else, i.e. teleoperation. Mobile augmented reality can be utilized as a user interface to the environment, as it enhances the user's perception of the situation compared to other interfacing methods and allows the user to perform other tasks while controlling the intelligent robot. Augmented reality is a method that combines virtual objects into the user's perception of the real world. As computer technology evolves, it is possible to build very small devices that have sufficient capabilities for augmented reality applications. We have evaluated the existing wearable computers and mobile augmented reality systems to build a prototype of a future mobile terminal, the CyPhone. A wearable computer with sufficient system resources for applications, wireless communication media with sufficient throughput, and enough interfaces for peripherals has been built at the University of Oulu. It is self-sustained in energy, with enough operating time for the applications to be useful, and uses accurate positioning systems.

  5. PARFUME User's Guide

    International Nuclear Information System (INIS)

    Hamman, Kurt

    2010-01-01

    PARFUME, a fuel performance analysis and modeling code, is being developed at the Idaho National Laboratory for evaluating gas reactor coated particle fuel assemblies for prismatic, pebble bed, and plate type fuel geometries. The code is an integrated mechanistic analysis tool that evaluates the thermal, mechanical, and physico-chemical behavior of coated fuel particles (TRISO) and the probability for fuel failure given the particle-to-particle statistical variations in physical dimensions and material properties that arise during the fuel fabrication process. Using a robust finite difference numerical scheme, PARFUME is capable of performing steady state and transient heat transfer and fission product diffusion analyses for the fuel. Written in FORTRAN 90, PARFUME is easy to read, maintain, and modify. Currently, PARFUME is supported only on MS Windows platforms. This document represents the initial version of the PARFUME User Guide, a supplement to the PARFUME Theory and Model Basis Report which describes the theoretical aspects of the code. User information is provided including: (1) code development, (2) capabilities and limitations, (3) installation and execution, (4) user input and output, (5) sample problems, and (6) error messages. In the near future, the INL plans to release a fully benchmarked and validated beta version of PARFUME.

  6. Dynamic provisioning of local and remote compute resources with OpenStack

    Science.gov (United States)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
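
    The single point-of-entry idea described above can be illustrated with a toy scheduler that fills local (virtualized desktop) slots before spilling over to a remote HPC share. Pool names and the first-fit policy are illustrative assumptions, not the EKP implementation:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ResourcePool:
        """One provider of compute slots, e.g. desktop-cloud VMs or a remote HPC share."""
        name: str
        free_slots: int
        jobs: list = field(default_factory=list)

    def submit(job, pools):
        """Single point-of-entry: place the job on the first pool with a free slot.

        Pools are tried in order, so listing local resources first prefers
        them over remote shares.
        """
        for pool in pools:
            if pool.free_slots > 0:
                pool.free_slots -= 1
                pool.jobs.append(job)
                return pool.name
        raise RuntimeError("virtual cluster is full")
    ```

    For example, with `pools = [ResourcePool("desktop-openstack", 2), ResourcePool("remote-hpc", 100)]`, the first two submitted jobs land on the desktop cloud and later jobs overflow to the HPC share, while the user only ever calls `submit`.
    
    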

  7. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-09-01

    This portion of the RELAP4/MOD5 User's Manual presents the details of setting up and entering the reactor model to be evaluated. The input card format and arrangement is presented in depth, including not only cards for data but also those for editing and restarting. Problem initialization, including pressure distribution and energy balance, is discussed. A section entitled "User Guidelines" is included to provide modeling recommendations, analysis and verification techniques, and computational difficulty resolution. The section is concluded with a discussion of the computer output form and format.

  8. Optical correction of refractive error for preventing and treating eye symptoms in computer users.

    Science.gov (United States)

    Heus, Pauline; Verbeek, Jos H; Tikka, Christina

    2018-04-10

    Computer users frequently complain about problems with seeing and functioning of the eyes. Asthenopia is a term generally used to describe symptoms related to (prolonged) use of the eyes, like ocular fatigue, headache, pain or aching around the eyes, and burning and itchiness of the eyelids. The prevalence of asthenopia during or after work on a computer ranges from 46.3% to 68.5%. Uncorrected or under-corrected refractive error can contribute to the development of asthenopia. A refractive error is an error in the focusing of light by the eye and can lead to reduced visual acuity. There are various possibilities for optical correction of refractive errors, including eyeglasses, contact lenses and refractive surgery. To examine the evidence on the effectiveness, safety and applicability of optical correction of refractive error for reducing and preventing eye symptoms in computer users. We searched the Cochrane Central Register of Controlled Trials (CENTRAL); PubMed; Embase; Web of Science; and OSH Update, all to 20 December 2017. Additionally, we searched trial registries and checked references of included studies. We included randomised controlled trials (RCTs) and quasi-randomised trials of interventions evaluating optical correction for computer workers with refractive error for preventing or treating asthenopia and their effect on health-related quality of life. Two authors independently assessed study eligibility and risk of bias, and extracted data. Where appropriate, we combined studies in a meta-analysis. We included eight studies with 381 participants. Three were parallel group RCTs, three were cross-over RCTs and two were quasi-randomised cross-over trials. All studies evaluated eyeglasses; there were no studies that evaluated contact lenses or surgery. Seven studies evaluated computer glasses with at least one focal area for the distance of the computer screen, with or without additional focal areas, in presbyopic persons. Six studies compared computer

  9. A personal computer code for seismic evaluations of nuclear power plant facilities

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.

    1990-01-01

    A wide range of computer programs and modeling approaches are often used to justify the safety of nuclear power plants. It is often difficult to assess the validity and accuracy of the results submitted by various utilities without developing comparable computer solutions. Taking this into consideration, CARES was designed as an integrated computational system that can perform rapid evaluations of structural behavior and examine the capability of nuclear power plant facilities; thus CARES may be used by the NRC to determine the validity and accuracy of analysis methodologies employed for structural safety evaluations of nuclear power plants. CARES has been designed to operate on a PC, have a user-friendly input/output interface, and have quick turnaround. The CARES program is structured in a modular format. Each module performs a specific type of analysis. The basic modules of the system are associated with capabilities for static, seismic and nonlinear analyses. This paper describes the various features which have been implemented into the Seismic Module of CARES version 1.0. In Section 2 a description of the Seismic Module is provided. The methodologies and computational procedures thus far implemented into the Seismic Module are described in Section 3. Finally, a complete demonstration of the computational capability of CARES in a typical soil-structure interaction analysis is given in Section 4 and conclusions are presented in Section 5. 5 refs., 4 figs

  10. End-User Recommendations on LOGOMON - a Computer Based Speech Therapy System for Romanian Language

    Directory of Open Access Journals (Sweden)

    SCHIPOR, O. A.

    2010-11-01

    Full Text Available In this paper we highlight the relations between LOGOMON - a Computer Based Speech Therapy System - and the training steps for dyslalia. Dyslalia is a speech disorder that affects the pronunciation of one or many sounds. This presentation of the system is complemented by research regarding end-user (i.e. teachers and parents) attitudes about speech-assisted therapy in general and about the LOGOMON system in particular. The results of this research allow the improvement of our CBST system, because the obtained information can be a source of adaptability to different expectations of the beneficiaries.

  11. "Head up and eyes out" advances in head mounted displays capabilities

    Science.gov (United States)

    Cameron, Alex

    2013-06-01

    There are a host of helmet and head mounted displays flooding the marketplace, displays which provide what is essentially a mobile computer display. What sets aviators' HMDs apart is that they provide the user with accurate conformal information embedded in the pilot's real-world view (see-through display), where the information presented is intuitive and easy to use because it overlays the real world (a mix of sensor imagery, symbolic information and synthetic imagery) and enables them to stay head up, eyes out - improving their effectiveness, reducing workload and improving safety. Such systems are an enabling technology in the provision of enhanced Situation Awareness (SA) and reduced user workload in high intensity situations. Safety is key, so the addition of these HMD functions cannot detract from the aircrew protection functions of conventional aircrew helmets, which also include life support and audio communications. These capabilities are finding much wider application in new types of compact man-mounted audio/visual products enabled by the emergence of new families of micro displays, novel optical concepts and ultra-compact low power processing solutions. This paper attempts to capture the key drivers and needs for future head mounted systems for aviation applications.

  12. User's Guide for Mixed-Size Sediment Transport Model for Networks of One-Dimensional Open Channels

    Science.gov (United States)

    Bennett, James P.

    2001-01-01

    This user's guide describes a mathematical model for predicting the transport of mixed sizes of sediment by flow in networks of one-dimensional open channels. The simulation package is useful for general sediment routing problems, prediction of erosion and deposition following dam removal, and scour in channels at road embankment crossings or other artificial structures. The model treats input hydrographs as stepwise steady-state, and the flow computation algorithm automatically switches between sub- and supercritical flow as dictated by channel geometry and discharge. A variety of boundary conditions including weirs and rating curves may be applied both external and internal to the flow network. The model may be used to compute flow around islands and through multiple openings in embankments, but the network must be 'simple' in the sense that the flow directions in all channels can be specified before simulation commences. The location and shape of channel banks are user specified, and all bed-elevation changes take place between these banks and above a user-specified bedrock elevation. Computation of sediment transport emphasizes the sand-size range (0.0625-2.0 millimeter), but the user may select any desired range of particle diameters, including silt and finer. The user may set the original bed-sediment composition of any number of layers of known thickness. The model computes the time evolution of total transport and the size composition of bed- and suspended-load sand through any cross section of interest. It also tracks bed-surface elevation and size composition. The model is written in the FORTRAN programming language for implementation on personal computers using the WINDOWS operating system and, along with certain graphical output display capability, is accessed from a graphical user interface (GUI). The GUI provides a framework for selecting input files and parameters of a number of components of the sediment-transport process. There are no restrictions in the
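
    The automatic switching between sub- and supercritical flow mentioned above hinges on the Froude number. A minimal classifier for a rectangular section, shown as an illustration rather than the model's actual algorithm:

    ```python
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def flow_regime(discharge, width, depth):
        """Classify open-channel flow by the Froude number Fr = v / sqrt(g*h).

        discharge : volumetric flow rate (m^3/s)
        width     : channel width (m), rectangular cross-section assumed
        depth     : flow depth (m)
        """
        velocity = discharge / (width * depth)      # mean velocity from continuity
        froude = velocity / math.sqrt(G * depth)
        if froude < 1.0:
            return "subcritical"
        if froude > 1.0:
            return "supercritical"
        return "critical"
    ```

    A deep, slow reach (e.g. 1 m³/s in a 10 m wide, 2 m deep channel) classifies as subcritical, while a shallow, fast reach classifies as supercritical; the solver switches its computation accordingly.
    
    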

  13. Computation system for nuclear reactor core analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.; Petrie, L.M.

    1977-04-01

    This report documents a system which contains computer codes as modules developed to evaluate nuclear reactor core performance. The diffusion theory approximation to neutron transport may be applied with the VENTURE code, treating up to three dimensions. The effect of exposure may be determined with the BURNER code, allowing depletion calculations to be made. The features and requirements of the system are discussed, as are aspects common to the computational modules; the modules themselves are documented elsewhere. User input data requirements, data file management, control, and the modules which perform general functions are described. Continuing development and implementation effort is enhancing the analysis capability available locally and to other installations from remote terminals.

  14. Motivating Contributions for Home Computer Security

    Science.gov (United States)

    Wash, Richard L.

    2009-01-01

    Recently, malicious computer users have been compromising computers en masse and combining them to form coordinated botnets. The rise of botnets has brought the problem of home computers to the forefront of security. Home computer users commonly have insecure systems; these users do not have the knowledge, experience, and skills necessary to…

  15. Effects of two eye drop products on computer users with subjective ocular discomfort.

    Science.gov (United States)

    Skilling, Francis C; Weaver, Tony A; Kato, Kenneth P; Ford, Jerry G; Dussia, Elyse M

    2005-01-01

    An increasing number of people seek medical attention for symptoms of visual discomfort due to computer vision syndrome (CVS). We compared the efficacy and adverse event rates of a new eye lubricant, OptiZen (InnoZen, Inc., polysorbate 80 0.5%) and Visine Original (Pfizer Consumer Healthcare, tetrahydrozoline HCl 0.05%). In this double-blind parallel arm trial, 50 healthy men and women, ages 18 to 65 years, with symptoms of CVS who use a video display terminal for a minimum of 4 hours per day were randomized to OptiZen (n = 25) or Visine Original (n= 25), 1 to 2 drops b.i.d. for 5 days. The primary end-points were ocular discomfort and adverse events. OptiZen and Visine Original had similar efficacy in alleviating symptoms of ocular discomfort (odds ratio of 1.23 [95% confidence interval, 0.63 to 2.42], P= 0.55). OptiZen and Visine Original were very similar with respect to odds ratios and 95% confidence interval (CI) for each of the measurement times (P= 0.72). Visine Original users reported a significantly higher incidence of temporary ocular stinging/burning immediately after drug instillation (28%, 7/25) than did OptiZen users (4%, 1/24) (P= 0.05). Patients using OptiZen were 89% less likely to have stinging/burning effects than those patients using Visine Original (95% CI: 0.01 to 0.95). OptiZen and Visine Original are effective at alleviating ocular discomfort associated with prolonged computer use. Adverse event findings suggest that OptiZen causes less ocular discomfort on instillation, potentially attributable to its milder ingredient profile.
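
    The reported effect sizes can be reproduced from the raw counts. A sketch of the standard odds-ratio calculation with a Wald-type 95% confidence interval on the log-odds scale (the paper's exact method may differ):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio for group 1 vs group 2 from a 2x2 table, with a
        Wald-type confidence interval computed on the log-odds scale.

            group 1:  a events, b non-events
            group 2:  c events, d non-events
        """
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se)
        upper = math.exp(math.log(or_) + z * se)
        return or_, lower, upper

    # Stinging/burning on instillation: OptiZen 1/24 vs Visine Original 7/25
    or_, lower, upper = odds_ratio_ci(1, 23, 7, 18)
    ```

    With these counts the odds ratio is about 0.11, i.e. OptiZen users had roughly 89% lower odds of reporting stinging, consistent with the abstract.
    
    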

  16. XTV users guide

    International Nuclear Information System (INIS)

    Dearing, J.F.; Johns, R.C.

    1996-09-01

    XTV is an X-Windows based Graphical User Interface for viewing results of Transient Reactor Analysis Code (TRAC) calculations. It provides static and animated color mapped visualizations of both thermal-hydraulic and heat conduction components in a TRAC model of a nuclear power plant, as well as both on-screen and hard copy two-dimensional plot capabilities. XTV is the successor to TRAP, the former TRAC postprocessor using the proprietary DISSPLA graphics library. This manual describes Version 2.0, which requires TRAC version 5.4.20 or later for full visualization capabilities

  17. IPTV Service Framework Based on Secure Authentication and Lightweight Content Encryption for Screen-Migration in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Aymen Abdullah Alsaffar

    2015-01-01

    Full Text Available These days, the advancing capabilities of smart devices (e.g. smart phones, tablets, PCs, etc.) and the increase of internet bandwidth enable IPTV service providers to extend their services to smart mobile devices. Users can receive their IPTV service on any smart device by accessing the internet via a wireless network from anywhere, anytime in the world, which is convenient for users. However, wireless network communication has well-known critical security threats and vulnerabilities for user smart devices and IPTV services, such as user identity theft, replay attacks, man-in-the-middle (MIM) attacks, and so forth. A secure authentication mechanism for user devices and a multimedia protection mechanism are necessary to protect both user devices and IPTV services. As a result, we propose a framework for IPTV service based on a secure authentication mechanism and a lightweight content encryption method for screen-migration in Cloud computing. We use a cryptographic nonce combined with user ID and password to authenticate the user device at any mobile terminal they pass by. In addition, we use lightweight content encryption to protect content and reduce the decoding load at mobile terminals. Our proposed authentication mechanism reduces the computational processing by 30% compared to other authentication mechanisms, and our lightweight content encryption reduces encryption delay to 0.259 seconds.
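
    A challenge-response exchange of the kind described (a one-time nonce combined with user ID and password) can be sketched as follows. The helper names and the use of HMAC-SHA-256 over a credential hash are illustrative assumptions, not the paper's exact protocol:

    ```python
    import hashlib
    import hmac
    import secrets

    def stored_verifier(user_id, password):
        """What the server keeps instead of the plaintext password."""
        return hashlib.sha256((user_id + ":" + password).encode()).digest()

    def client_response(nonce, user_id, password):
        """Client proves knowledge of the credentials without sending them:
        an HMAC over the server's one-time nonce, keyed by the credential hash."""
        key = stored_verifier(user_id, password)
        return hmac.new(key, nonce, hashlib.sha256).hexdigest()

    def server_verify(nonce, response, verifier):
        """Server recomputes the expected response and compares in constant time."""
        expected = hmac.new(verifier, nonce, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, response)
    ```

    Because the server issues a fresh `nonce = secrets.token_bytes(16)` per login, a captured response cannot be replayed against a later challenge, addressing the replay threat named in the abstract.
    
    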

  18. Pengukuran End-User Computing Satisfaction Atas Penggunaan Sistem Informasi Akademik

    Directory of Open Access Journals (Sweden)

    Boy Suzanto

    2015-04-01

    Full Text Available An academic information system depends heavily on its components to generate information that fits users' needs. A gap between the academic information system and the behavior of its users affects the users' intention to use the system. Hence, the goal of this study is an in-depth measurement of a running academic information system and of users' attitudes toward it, in order to improve the behavior of information system users. An exploratory research method was used with 124 student respondents. Data were analyzed using Structural Equation Modelling (SEM) with component-based Partial Least Squares (PLS). The results showed that the influence of the academic information system on the attitudes of users is 0.57 (57%), and the influence of users' attitudes on the intention to use the academic information system is 0.50 (50%).
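
    Under the usual product-of-coefficients reading of such a path model (an interpretation added here, not stated in the abstract), the indirect effect of the system on intention to use is the product of the two reported standardized paths:

    ```python
    def indirect_effect(path_a, path_b):
        """Product-of-coefficients indirect effect for a two-step path:
        system -> attitude (path_a) and attitude -> intention (path_b)."""
        return path_a * path_b

    # Reported paths: 0.57 (system -> attitude), 0.50 (attitude -> intention)
    effect = indirect_effect(0.57, 0.50)   # 0.285
    ```

    That is, a one-standard-deviation improvement in the system would be associated with roughly a 0.29 standard-deviation increase in intention to use, mediated through attitude.
    
    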

  19. HECTR Version 1.5 user's manual

    International Nuclear Information System (INIS)

    Dingman, S.E.; Camp, A.L.; Wong, C.C.; King, D.B.; Gasser, R.D.

    1986-04-01

    This report describes the use and features of HECTR Version 1.5. HECTR is a relatively fast-running, lumped-volume containment analysis computer program that is most useful for performing parametric studies. The main purpose of HECTR is to analyze nuclear reactor accidents involving the transport and combustion of hydrogen, but HECTR can also function as an experiment analysis tool and can solve a limited set of other types of containment problems. New models added to HECTR Version 1.5 include fan coolers, containment leakage, continuous burning, and the capability to treat carbon monoxide and carbon dioxide. Models for the ice condenser, sumps, and Mark III suppression pool were upgraded. HECTR is designed for flexibility and provides for user control of many important parameters, particularly those related to hydrogen combustion. Built-in correlations and default values of key parameters are also provided

  20. Computing for magnetic fusion energy research: An updated vision

    International Nuclear Information System (INIS)

    Henline, P.; Giarrusso, J.; Davis, S.; Casper, T.

    1993-01-01

    This Fusion Computing Council perspective is written to present the primary concerns of the fusion computing community at the time of publication of the report, not necessarily as a summary of the information contained in the individual sections. These concerns reflect FCC discussions during final review of contributions from the various working groups and portray our latest information. This report itself should be considered dynamic, requiring periodic updating in an attempt to track the rapid evolution of the computer industry relevant to requirements for magnetic fusion research. The most significant common concern among the Fusion Computing Council working groups is networking capability. All groups see an increasing need for network services due to the use of workstations, distributed computing environments, increased use of graphic services, X-window usage, remote experimental collaborations, remote data access for specific projects, and other collaborations. Other areas of concern include support for workstations, enhanced infrastructure to support collaborations, the User Service Centers, NERSC and future massively parallel computers, and FCC-sponsored workshops

  1. PREREM: an interactive data preprocessing code for INREM II. Part I: user's manual. Part II: code structure

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, M.T.; Fields, D.E.

    1981-05-01

    PREREM is an interactive computer code developed as a data preprocessor for the INREM-II (Killough, Dunning, and Pleasant, 1978a) internal dose program. PREREM is intended to provide easy access to current and self-consistent nuclear decay and radionuclide-specific metabolic data sets. Provision is made for revision of metabolic data, and the code is intended for both production and research applications. Documentation for the code is in two parts. Part I is a user's manual which emphasizes interpretation of program prompts and choice of user input. Part II stresses internal structure and flow of program control and is intended to assist the researcher who wishes to revise or modify the code or add to its capabilities. PREREM is written for execution on a Digital Equipment Corporation PDP-10 System and much of the code will require revision before it can be run on other machines. The source program length is 950 lines (116 blocks) and computer core required for execution is 212 K bytes. The user must also have sufficient file space for metabolic and S-factor data sets. Further, 64 100 K byte blocks of computer storage space are required for the nuclear decay data file. Computer storage space must also be available for any output files produced during the PREREM execution. 9 refs., 8 tabs.

  2. Report to users of ATLAS, January 1998

    International Nuclear Information System (INIS)

    Ahmad, I.; Hofman, D.

    1998-01-01

    This report is aimed at informing users about the operating schedule, user policies, and recent changes in research capabilities. It covers the following subjects: (1) status of the Argonne Tandem-Linac Accelerator System (ATLAS) accelerator; (2) the move of Gammasphere from LBNL to ANL; (3) commissioning of the CPT mass spectrometer at ATLAS; (4) highlights of recent research at ATLAS; (5) Program Advisory Committee; and (6) ATLAS User Group Executive Committee

  3. UIMX: A User Interface Management System For Scientific Computing With X Windows

    Science.gov (United States)

    Foody, Michael

    1989-09-01

    Applications with iconic user interfaces (for example, interfaces with pulldown menus, radio buttons, and scroll bars), such as those found on Apple's Macintosh computer and the IBM PC under Microsoft's Presentation Manager, have become very popular, and for good reason. They are much easier to use than applications with traditional keyboard-oriented interfaces, so training costs are much lower and just about anyone can use them. They are standardized between applications, so once you learn one application you are well along the way to learning another. Using one application reinforces the interface elements common between applications and, as a result, you remember how to use them longer. Finally, for the developer, support costs can be much lower because of this ease of use.

  4. Xyce parallel electronic simulator users guide, version 6.0.

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.; Schiek, Richard Louis; Thornquist, Heidi K.; Verley, Jason C.; Fixel, Deborah A.; Coffey, Todd S; Pawlowski, Roger P; Warrender, Christina E.; Baur, David Gregory.

    2013-08-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors); this includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms; this allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.
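
    The DAE view can be illustrated on the simplest possible circuit: an RC low-pass whose residual F(v, v') = C*v' + (v - Vs)/R = 0 is advanced with backward Euler. This is a sketch of the separation between device equations and time integrator, not Xyce's actual code:

    ```python
    def simulate_rc(R, C, v_source, t_end, h):
        """Backward-Euler integration of the DAE residual
           F(v, v') = C*v' + (v - Vs)/R = 0
        for an RC low-pass driven by a constant source Vs. In a DAE-structured
        simulator, the device package supplies the residual and its Jacobian
        while the integration algorithm stays generic; here the resulting
        linear step is solved in closed form.
        """
        v, t = 0.0, 0.0
        while t < t_end:
            # C*(v_new - v)/h + (v_new - v_source)/R = 0, solved for v_new
            v = (C / h * v + v_source / R) / (C / h + 1 / R)
            t += h
        return v
    ```

    With R = 1 kΩ and C = 1 µF (time constant 1 ms), the capacitor voltage reaches about 99% of the source after 5 ms, matching the analytic charging curve.
    
    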

  5. Xyce parallel electronic simulator users' guide, Version 6.0.1.

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.; Schiek, Richard Louis; Thornquist, Heidi K.; Verley, Jason C.; Fixel, Deborah A.; Coffey, Todd S; Pawlowski, Roger P; Warrender, Christina E.; Baur, David Gregory.

    2014-01-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors); this includes support for most popular parallel and serial computers. A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms; this allows one to develop new types of analysis without requiring the implementation of analysis-specific device models. Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only). Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase: a message-passing parallel implementation, which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  6. Xyce parallel electronic simulator users guide, version 6.1

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.; Schiek, Richard Louis; Sholander, Peter E.; Thornquist, Heidi K.; Verley, Jason C.; Baur, David Gregory

    2014-03-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, and has been written to support the simulation needs of the Sandia National Laboratories electrical designers. This development has focused on improving capability over the current state-of-the-art in the following areas: Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; A differential-algebraic-equation (DAE) formulation, which better isolates the device model package from solver algorithms, allowing one to develop new types of analysis without requiring the implementation of analysis-specific device models; Device models that are specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and Object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase - a message-passing parallel implementation - which allows it to run efficiently on a wide range of computing platforms. These include serial, shared-memory and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows.

  7. Analysis of User Interaction with a Brain-Computer Interface Based on Steady-State Visually Evoked Potentials: Case Study of a Game.

    Science.gov (United States)

    Leite, Harlei Miguel de Arruda; de Carvalho, Sarah Negreiros; Costa, Thiago Bulhões da Silva; Attux, Romis; Hornung, Heiko Horst; Arantes, Dalton Soares

    2018-01-01

    This paper presents a systematic analysis of a game controlled by a Brain-Computer Interface (BCI) based on Steady-State Visually Evoked Potentials (SSVEP). The objective is to understand BCI systems from the Human-Computer Interface (HCI) point of view, by observing how the users interact with the game and evaluating how the interface elements influence the system performance. The interactions of 30 volunteers with our computer game, named "Get Coins," through a BCI based on SSVEP, have generated a database of brain signals and the corresponding responses to a questionnaire about various perceptual parameters, such as visual stimulation, acoustic feedback, background music, visual contrast, and visual fatigue. Each one of the volunteers played one match using the keyboard and four matches using the BCI, for comparison. In all matches using the BCI, the volunteers achieved the goals of the game. Eight of them achieved a perfect score in at least one of the four matches, showing the feasibility of the direct communication between the brain and the computer. Despite this successful experiment, adaptations and improvements should be implemented to make this innovative technology accessible to the end user.
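
    In an SSVEP paradigm like the one in this game, each on-screen target flickers at a distinct frequency, and the BCI decides which target the user attends by finding the stimulus frequency with the most power in the EEG. A minimal, illustrative sketch of that detection step (not the authors' pipeline; frequencies and the signal are synthetic) using the Goertzel algorithm:

```python
import math

# Illustrative SSVEP-style frequency detection with the Goertzel
# recurrence: compute the signal power at each candidate stimulus
# frequency and pick the strongest one.

def goertzel_power(signal, fs, freq):
    """Power of `signal` (sample rate fs) at the DFT bin nearest `freq`."""
    n = len(signal)
    k = round(n * freq / fs)
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in signal:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

fs, n = 256, 256
stimuli = [10.0, 12.0, 15.0]     # hypothetical flicker rates (Hz)
# Synthetic "EEG": a pure response at the attended 12 Hz stimulus.
signal = [math.sin(2 * math.pi * 12.0 * t / fs) for t in range(n)]

detected = max(stimuli, key=lambda f: goertzel_power(signal, fs, f))
print(detected)  # → 12.0
```

    Real EEG adds noise and inter-subject variability, which is why the paper evaluates perceptual parameters such as visual contrast and fatigue rather than detection alone.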

  8. Controlling user access to electronic resources without password

    Science.gov (United States)

    Smith, Fred Hewitt

    2017-08-22

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes obtaining an image from a communication device of a user. An individual and a landmark are identified within the image. Determinations are made that the individual is the user and that the landmark is a predetermined landmark. Access to a restricted computing resource is granted based on the determining that the individual is the user and that the landmark is the predetermined landmark. Other embodiments are disclosed.
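
    The access decision described above reduces to a conjunction of two independent checks. A minimal illustration (function and names are hypothetical, not from the patent):

```python
# Sketch of the decision logic: access is granted only when BOTH the
# individual identified in the image matches the enrolled user AND the
# identified landmark matches the predetermined landmark.

def grant_access(identified_person, enrolled_user,
                 identified_landmark, expected_landmark):
    """Either check failing denies access to the restricted resource."""
    is_user = identified_person == enrolled_user
    is_landmark = identified_landmark == expected_landmark
    return is_user and is_landmark

print(grant_access("alice", "alice", "office-door", "office-door"))  # True
print(grant_access("alice", "alice", "cafe", "office-door"))         # False
```

    The landmark check is what replaces the password: possession of the user's face alone is not sufficient without presence at the expected location.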

  9. Personal computer wallpaper user segmentation based on Sasang typology

    Directory of Open Access Journals (Sweden)

    Joung-Youn Lee

    2015-03-01

    Conclusion: By proposing the Sasang typology as a factor in influencing an HCI usage pattern in this study, it can be used to predict the user's HCI experience, or suggest a native design methodology that can actively cope with the user's psychological environment.

  10. User's manual for seismic analysis code 'SONATINA-2V'

    Energy Technology Data Exchange (ETDEWEB)

    Hanawa, Satoshi; Iyoku, Tatsuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2001-08-01

    The seismic analysis code SONATINA-2V has been developed to analyze the behavior of the HTTR core graphite components under seismic excitation. The SONATINA-2V code is a two-dimensional computer program capable of analyzing the vertical arrangement of the HTTR graphite components, such as fuel blocks, replaceable reflector blocks and permanent reflector blocks, as well as their restraint structures. In the analytical model, each block is treated as a rigid body and is restrained by dowel pins, which restrict relative horizontal movement but allow vertical and rocking motions between upper and lower blocks. Moreover, the SONATINA-2V code is capable of analyzing the core vibration behavior under simultaneous excitation in the vertical and horizontal directions. The SONATINA-2V code is composed of the main program, a pre-processor for preparing the input data for SONATINA-2V, and a post-processor for processing the analytical results and generating graphics from them. Although the SONATINA-2V code was developed to run on the MSP computer system of the Japan Atomic Energy Research Institute (JAERI), that computer system was retired as computer technology progressed. The analysis code was therefore modified to run on JAERI's UNIX machine, the SR8000 computer system. The present report gives the user's manual for the seismic analysis code SONATINA-2V, including the pre- and post-processors. (author)

  11. Development of Parallel Computing Framework to Enhance Radiation Transport Code Capabilities for Rare Isotope Beam Facility Design

    Energy Technology Data Exchange (ETDEWEB)

    Kostin, Mikhail [Michigan State Univ., East Lansing, MI (United States); Mokhov, Nikolai [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Niita, Koji [Research Organization for Information Science and Technology, Ibaraki-ken (Japan)

    2013-09-25

    A parallel computing framework has been developed to use with general-purpose radiation transport codes. The framework was implemented as a C++ module that uses MPI for message passing. It is intended to be used with older radiation transport codes implemented in Fortran77, Fortran 90 or C. The module is significantly independent of radiation transport codes it can be used with, and is connected to the codes by means of a number of interface functions. The framework was developed and tested in conjunction with the MARS15 code. It is possible to use it with other codes such as PHITS, FLUKA and MCNP after certain adjustments. Besides the parallel computing functionality, the framework offers a checkpoint facility that allows restarting calculations with a saved checkpoint file. The checkpoint facility can be used in single process calculations as well as in the parallel regime. The framework corrects some of the known problems with the scheduling and load balancing found in the original implementations of the parallel computing functionality in MARS15 and PHITS. The framework can be used efficiently on homogeneous systems and networks of workstations, where the interference from the other users is possible.
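
    The checkpoint facility described above can be illustrated with a small stand-in (Python, not the actual C++/MPI framework): the accumulated tally and the random-number-generator state are saved together, so a restarted calculation reproduces the uninterrupted one exactly.

```python
import os
import pickle
import random
import tempfile

# Illustrative checkpoint/restart sketch for a Monte Carlo style tally.
# Saving the RNG state alongside the result is what makes the resumed
# run bitwise-identical to an uninterrupted run.

def run_histories(n, tally, rng, ckpt_path, ckpt_every=250):
    for i in range(n):
        tally += rng.random()            # stand-in for one particle history
        if (i + 1) % ckpt_every == 0:
            with open(ckpt_path, "wb") as f:
                pickle.dump((i + 1, tally, rng.getstate()), f)
    return tally

ckpt = os.path.join(tempfile.mkdtemp(), "run.ckpt")

# Uninterrupted reference run of 1000 histories.
full = run_histories(1000, 0.0, random.Random(42), ckpt_path=os.devnull)

# "Crashed" run: only 500 histories complete, leaving a checkpoint file.
run_histories(500, 0.0, random.Random(42), ckpt_path=ckpt)

# Restart: restore the count, tally and RNG state, then finish the run.
with open(ckpt, "rb") as f:
    done, tally, state = pickle.load(f)
rng = random.Random()
rng.setstate(state)
resumed = run_histories(1000 - done, tally, rng, ckpt_path=os.devnull)

print(abs(full - resumed) < 1e-12)  # resumed run matches the reference
```

    The framework's checkpointing additionally has to gather per-rank state across MPI processes, which this single-process sketch omits.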

  12. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed for a network connecting a parallel computing server and a client terminal. Using the system, a user at a client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation on the parallel computer while the computation is still running on the server. Through a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and the visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and then compresses the data. The amount of data sent from the parallel computer to the client is therefore much smaller than without compression, so images appear swiftly and the user can work comfortably. Parallelization of the image data generation is based on the owner-computes rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on any client PC on which a Web browser is installed. (author)
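
    The bandwidth argument behind this design (render pixels on the server, compress before transmission) can be illustrated with a rough sketch; the frame below is a synthetic stand-in for a CFD colour map, which typically has large smooth or uniform regions that compress well:

```python
import zlib

# Back-of-the-envelope sketch: a synthetic 640x480 8-bit frame with a
# uniform band and a repeated gradient, compressed with zlib, to show
# why sending compressed pixel data beats sending raw frames.

width, height = 640, 480
frame = bytearray()
for y in range(height):
    if y < height // 2:
        frame.extend([30] * width)                       # uniform region
    else:
        frame.extend(x * 255 // width for x in range(width))  # gradient rows

compressed = zlib.compress(bytes(frame), level=6)
ratio = len(frame) / len(compressed)
print(len(frame), len(compressed), ratio > 10)
```

    Real rendered frames compress less dramatically than this synthetic one, but the asymmetry (cheap server-side compression versus scarce network bandwidth) is the same trade the abstract describes.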

  13. On the Computational Capabilities of Physical Systems. Part 1; The Impossibility of Infallible Computation

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out every computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, a 'physical computation' analogue of algorithmic information complexity.

  14. Wind-US Users Guide Version 4.0

    Science.gov (United States)

    Yoder, Dennis A.

    2016-01-01

    Wind-US is a computational platform which may be used to numerically solve various sets of equations governing physical phenomena. Currently, the code supports the solution of the Euler and Navier-Stokes equations of fluid mechanics, along with supporting equation sets governing turbulent and chemically reacting flows. Wind-US is a product of the NPARC Alliance, a partnership between the NASA Glenn Research Center (GRC) and the Arnold Engineering Development Complex (AEDC) dedicated to the establishment of a national, applications-oriented flow simulation capability. The Boeing Company has also been closely associated with the Alliance since its inception, and represents the interests of the NPARC User's Association. The "Wind-US User's Guide" describes the operation and use of Wind-US, including: a basic tutorial; the physical and numerical models that are used; the boundary conditions; monitoring convergence; the files that are read and/or written; parallel execution; and a complete list of input keywords and test options. For current information about Wind-US and the NPARC Alliance, please see the Wind-US home page at http://www.grc.nasa.gov/WWW/winddocs/ and the NPARC Alliance home page at http://www.grc.nasa.gov/WWW/wind/.

  15. SNAP Operating System (SOS) user's guide

    International Nuclear Information System (INIS)

    Sabuda, J.D.; Polito, J.; Walker, J.L.; Grant, F.H. III.

    1982-03-01

    The SNAP Operating System (SOS) is a FORTRAN 77 program which provides assistance to the safeguards analyst who uses the Safeguards Automated Facility Evaluation (SAFE) and the Safeguards Network Analysis Procedure (SNAP) techniques. Features offered by SOS are a data base system for storing a library of SNAP applications, computer graphics representation of SNAP models, a computer graphics editor to develop and modify SNAP models, a SAFE-to-SNAP interface, automatic generation of SNAP input data, and a computer graphics postprocessor for SNAP. The SOS User's Guide is designed to provide the user with the information necessary to use SOS effectively. Examples are used throughout to illustrate the concepts. The format of the user's guide follows the same sequence as would be used in executing an actual application.

  16. Attitudinal Study of User and Non-User Teachers' towards ICT in Relation to Their School Teaching Subjects

    Science.gov (United States)

    Lal, Chhavi

    2014-01-01

    The present study aimed to know the attitude towards ICT of user and non-user teachers of ICT. The data were collected from 40 (20 male and 20 Female) user and non-user of ICT secondary school teachers from Agra city. Attitude towards ICT was measured by Computer Attitude Scale (CAS), originally developed by Loyd and Gressard (1984). Data were…

  17. A review of brain-computer interface games and an opinion survey from researchers, developers and users.

    Science.gov (United States)

    Ahn, Minkyu; Lee, Mijin; Choi, Jinyoung; Jun, Sung Chan

    2014-08-11

    In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to "the easiness of playing" and the "development platform" as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  18. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users

    Directory of Open Access Journals (Sweden)

    Minkyu Ahn

    2014-08-01

    Full Text Available In recent years, research on Brain-Computer Interface (BCI) technology for healthy users has attracted considerable interest, and BCI games are especially popular. This study reviews the current status of, and describes future directions, in the field of BCI games. To this end, we conducted a literature search and found that BCI control paradigms using electroencephalographic signals (motor imagery, P300, steady state visual evoked potential and passive approach reading mental state) have been the primary focus of research. We also conducted a survey of nearly three hundred participants that included researchers, game developers and users around the world. From this survey, we found that all three groups (researchers, developers and users) agreed on the significant influence and applicability of BCI and BCI games, and they all selected prostheses, rehabilitation and games as the most promising BCI applications. User and developer groups tended to give low priority to passive BCI and the whole head sensor array. Developers gave higher priorities to “the easiness of playing” and the “development platform” as important elements for BCI games and the market. Based on our assessment, we discuss the critical point at which BCI games will be able to progress from their current stage to widespread marketing to consumers. In conclusion, we propose three critical elements important for expansion of the BCI game market: standards, gameplay and appropriate integration.

  19. System administration of ATLAS TDAQ computing environment

    Science.gov (United States)

    Adeel-Ur-Rehman, A.; Bujor, F.; Benes, J.; Caramarcu, C.; Dobson, M.; Dumitrescu, A.; Dumitru, I.; Leahu, M.; Valsan, L.; Oreshkin, A.; Popov, D.; Unel, G.; Zaytsev, A.

    2010-04-01

    This contribution gives a thorough overview of the activities of the ATLAS TDAQ SysAdmin group, which deals with administration of the TDAQ computing environment supporting the High Level Trigger, Event Filter and other subsystems of the ATLAS detector operating on the LHC collider at CERN. The current installation consists of approximately 1500 netbooted nodes managed by more than 60 dedicated servers, about 40 multi-screen user interface machines installed in the control rooms, and various hardware and service monitoring machines as well. In the final configuration, the online computer farm will be capable of hosting tens of thousands of applications running simultaneously. The software distribution requirements are met by a two-level NFS-based solution. The hardware and network monitoring systems of ATLAS TDAQ are based on NAGIOS, with a MySQL cluster behind it for accounting and storing the collected monitoring data, IPMI tools, CERN LANDB and dedicated tools developed by the group, e.g. ConfdbUI. The user management schema deployed in the TDAQ environment is founded on an authentication and role management system based on LDAP. External access to the ATLAS online computing facilities is provided by means of gateways supplied with an accounting system as well. Current activities of the group include deployment of a centralized storage system; testing and validating hardware solutions for future use within the ATLAS TDAQ environment, including new multi-core blade servers; developing GUI tools for user authentication and role management; testing and validating 64-bit operating systems; and upgrading the existing TDAQ hardware components, authentication servers and gateways.

  20. The Graphical User Interface: Crisis, Danger, and Opportunity.

    Science.gov (United States)

    Boyd, L. H.; And Others

    1990-01-01

    This article describes differences between the graphical user interface and traditional character-based interface systems, identifies potential problems posed by graphic computing environments for blind computer users, and describes some programs and strategies that are being developed to provide access to those environments. (Author/JDD)

  1. Designing end-user interfaces

    CERN Document Server

    Heaton, N

    1988-01-01

    Designing End-User Interfaces: State of the Art Report focuses on the field of human/computer interaction (HCI) and reviews the design of end-user interfaces. This compilation is divided into two parts. Part I examines specific aspects of the problem in HCI, ranging from basic definitions of the problem, through evaluation of how to look at the problem domain, to fundamental work aimed at introducing human factors into all aspects of the design cycle. Part II consists of six main topics: definition of the problem, psychological and social factors, principles of interface design, computer intelligence…

  2. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology.

    Science.gov (United States)

    Zao, John K; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time prediction of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system.

  3. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology

    Science.gov (United States)

    Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

    EEG-based Brain-computer interfaces (BCI) are facing basic challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems that are capable of making reliable real-time prediction of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies have offered promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implement a pilot system by employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March and then running a multi-player on-line EEG-BCI game in September, 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring and the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capability to our system. PMID:24917804

  4. Customization of user interfaces to reduce errors and enhance user acceptance.

    Science.gov (United States)

    Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram

    2014-03-01

    Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance of the RG compared to the CG while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. On The Computational Capabilities of Physical Systems. Part 2; Relationship With Conventional Computer Science

    Science.gov (United States)

    Wolpert, David H.; Koga, Dennis (Technical Monitor)

    2000-01-01

    In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task.

  6. The computer integrated documentation project: A merge of hypermedia and AI techniques

    Science.gov (United States)

    Mathe, Nathalie; Boy, Guy

    1993-01-01

    To generate intelligent indexing that allows context-sensitive information retrieval, a system must be able to acquire knowledge directly through interaction with users. In this paper, we present the architecture for CID (Computer Integrated Documentation). CID is a system that enables integration of various technical documents in a hypertext framework and includes an intelligent browsing system that incorporates indexing in context. CID's knowledge-based indexing mechanism allows case-based knowledge acquisition by experimentation. It utilizes on-line user information requirements and suggestions either to reinforce current indexing in case of success or to generate new knowledge in case of failure. This allows CID's intelligent interface system to provide helpful responses, based on previous experience (user feedback). We describe CID's current capabilities and provide an overview of our plans for extending the system.
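
    The reinforce-on-success, learn-on-failure loop described above can be sketched as a weighted index keyed by query and context (all identifiers here are hypothetical, not CID's actual design):

```python
from collections import defaultdict

# Illustrative context-sensitive index with user feedback: each
# (query, context, document) triple carries a weight that is
# reinforced when the user reports success and weakened on failure,
# so later retrievals in the same context rank useful documents first.

class ContextIndex:
    def __init__(self):
        self.weights = defaultdict(float)   # (query, context, doc) -> weight

    def retrieve(self, query, context, docs):
        """Return the candidate document with the highest learned weight."""
        return max(docs, key=lambda d: self.weights[(query, context, d)])

    def feedback(self, query, context, doc, success):
        """Reinforce or weaken an index link based on user feedback."""
        self.weights[(query, context, doc)] += 1.0 if success else -1.0

idx = ContextIndex()
docs = ["ops-manual", "design-spec"]

# A user searching "telemetry" in a "maintenance" context reports that
# "ops-manual" was the helpful document; the link is reinforced.
idx.feedback("telemetry", "maintenance", "ops-manual", success=True)
print(idx.retrieve("telemetry", "maintenance", docs))  # → ops-manual
```

    Keying the weights on context as well as query is what makes the same query return different documents to, say, a maintainer and a designer.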

  7. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capability for the United States is shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid-to-late 1990s: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded-diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  8. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    Science.gov (United States)

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.
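
    The double-counting problem the manual describes can be seen in one dimension. The sketch below is a simplified illustration (not the MIXSUR/OVERINT algorithms): two surface grids overlap on one interval, and naive integration counts that interval twice unless the coincident cells are down-weighted when forming the hybrid surface.

```python
# Simplified 1-D overset-surface integration: integrating a constant
# pressure over two overlapping grids double-counts the overlap region
# unless overlapped cells receive reduced integration weights.

def integrate(cells):
    """cells: list of (x_lo, x_hi, pressure, weight) panels."""
    return sum((hi - lo) * p * w for lo, hi, p, w in cells)

p = 2.0  # constant surface pressure over [0, 3]
grid_a = [(0.0, 1.0, p), (1.0, 2.0, p)]   # covers [0, 2]
grid_b = [(1.0, 2.0, p), (2.0, 3.0, p)]   # covers [1, 3]; overlap is [1, 2]

# Naive integration: every panel on both grids gets full weight.
naive = integrate([(lo, hi, q, 1.0) for lo, hi, q in grid_a + grid_b])

# Hybrid-surface step: the pair of coincident panels on [1, 2] is
# down-weighted to 0.5 each so together they contribute exactly once.
def weight(lo, hi):
    return 0.5 if (lo, hi) == (1.0, 2.0) else 1.0

hybrid = integrate([(lo, hi, q, weight(lo, hi)) for lo, hi, q in grid_a + grid_b])

print(naive, hybrid)  # naive = 8.0 (overlap counted twice), exact force = 6.0
```

    On real configurations the hybrid surface is built geometrically, with zipper grids stitching the overlap, rather than by a fixed 0.5 weight; the bookkeeping goal is the same.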

  9. Position of document holder and work related risk factors for neck pain among computer users: a narrative review.

    Science.gov (United States)

    Ambusam, S; Baharudin, O; Roslizawati, N; Leonard, J

    2015-01-01

    A document holder is used as a remedy for occupational neck pain among computer users. An understanding of the effects of the document holder, along with other work-related risk factors present at computer workstations, requires attention. This article reviews comprehensive knowledge of the optimal location of the document holder in computer use and the associated work-related factors that may contribute to neck pain. A literature search was conducted on articles published from January 1990 to January 2014 in both the Science Direct and PubMed databases. The Medical Subject Headings (MeSH) keywords used for the search were 'neck muscle' OR 'head posture' OR 'muscle tension' OR 'muscle activity' OR 'work related disorders' OR 'neck pain' AND/OR 'document location' OR 'document holder' OR 'source document' OR 'copy screen holder'. A document holder placed lateral to the screen was most preferred for reducing neck discomfort among occupational typists; a document without a holder, placed flat on the work surface, was least preferred. Head posture deviation and muscle activity increase when the document is placed flat on the surface compared with when it is placed on a document holder. Work-related factors such as static posture, repetitive movements, prolonged sitting and awkward positions were risk factors for chronic neck pain. This review highlights the optimal location of the document holder for computer users to reduce neck pain, and emphasizes the importance of work-related risk factors for neck pain in occupational typists for clinical management.

  10. Xyce™ Parallel Electronic Simulator Users' Guide, Version 6.5.

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Aadithya, Karthik V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Mei, Ting [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Russo, Thomas V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Schiek, Richard L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Sholander, Peter E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Thornquist, Heidi K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation; Verley, Jason C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Electrical Models and Simulation

    2016-06-01

    This manual describes the use of the Xyce Parallel Electronic Simulator. Xyce has been designed as a SPICE-compatible, high-performance analog circuit simulator, written to support the simulation needs of Sandia National Laboratories' electrical designers. Development has focused on improving capability over the current state of the art in the following areas: the capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors), including support for most popular parallel and serial computers; a differential-algebraic-equation (DAE) formulation, which better isolates the device model package from the solver algorithms and allows new types of analysis to be developed without requiring analysis-specific device models; device models specifically tailored to meet Sandia's needs, including some radiation-aware devices (for Sandia users only); and object-oriented code design and implementation using modern coding practices. Xyce is a parallel code in the most general sense of the phrase -- a message-passing parallel implementation -- which allows it to run efficiently on a wide range of computing platforms, including serial, shared-memory, and distributed-memory parallel platforms. Attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved as the number of processors grows. The information herein is subject to change without notice. Copyright © 2002-2016 Sandia Corporation. All rights reserved.
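    The DAE formulation mentioned above writes the circuit equations roughly as d/dt q(x) + f(x) - b(t) = 0, where the charge/flux term q and the resistive term f come from device models and the time integrator stays generic. As a hedged toy illustration (the values and the closed-form update are made up for a linear parallel RC circuit driven by a current source, not taken from Xyce):

    ```python
    # Toy DAE view of circuit simulation: d/dt q(v) + f(v) - b(t) = 0.
    # Device models supply q and f; the integrator never sees their internals.
    R, C, I = 2.0, 1.0, 1.0          # ohms, farads, amps (illustrative values)
    q = lambda v: C * v              # capacitor charge term
    f = lambda v: v / R              # resistor current term
    b = lambda t: I                  # source term

    # Backward Euler: (q(v_new) - q(v_old))/dt + f(v_new) - b(t) = 0.
    # For this linear circuit the implicit step solves in closed form:
    # v_new = (C*v_old + dt*I) / (C + dt/R).
    dt, v, t = 0.01, 0.0, 0.0
    for _ in range(2000):            # integrate to t = 20 s (ten time constants)
        t += dt
        v = (q(v) + dt * b(t)) / (C + dt / R)
    print(v)                         # approaches the steady state I * R = 2.0
    ```

    The point of the separation is that swapping in a nonlinear capacitor only changes q; the integration scheme is untouched.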

  11. User Defined Data in the New Analysis Model of the BaBar Experiment

    Energy Technology Data Exchange (ETDEWEB)

    De Nardo, G.

    2005-04-06

    The BaBar experiment has recently revised its Analysis Model. One of the key ingredients of the new Analysis Model is support for adding user-defined data to the Event Store; such data can be the output of complex computations performed at an advanced stage of a physics analysis and are associated with analysis objects. To provide flexibility and extensibility with respect to object types, generic programming with templates has been adopted. In this way the model is non-intrusive with respect to the reconstruction and analysis objects it manages, requiring no changes to their interfaces or implementations. Technological details are hidden from the user as much as possible behind a simple interface. In this paper we present some of the limitations of the old model and how they are addressed by the new Analysis Model.

  12. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent doses recommended in ICRP Publication 60. VisualShield provides graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, leading users to error-free processing with visual feedback. Code validation and data analysis were performed by comparing the results of various calculations to the outputs of previous programs such as MCNP 4B, ISOSHLD-II, and QAD-CGGP.
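    The point kernel method underlying such codes computes the uncollided flux at a detector point from an isotropic point source, attenuated exponentially through the shield, corrected by a buildup factor for scattered photons, and scaled by the inverse-square distance. A minimal sketch follows; all numerical values are illustrative placeholders, not data from any standard:

    ```python
    import math

    # Point-kernel dose sketch for one point source and one shield slab:
    #   flux = S * B * exp(-mu * t) / (4 * pi * r^2)
    S  = 1.0e9    # source strength, photons/s (illustrative)
    mu = 0.06     # linear attenuation coefficient of the shield, 1/cm
    t  = 20.0     # shield thickness along the source-detector ray, cm
    r  = 100.0    # source-to-detector distance, cm
    B  = 3.5      # buildup factor for mu*t = 1.2 mean free paths (illustrative)
    k  = 1.0e-6   # flux-to-dose conversion factor (illustrative units)

    flux = S * B * math.exp(-mu * t) / (4.0 * math.pi * r * r)
    dose_rate = k * flux
    print(flux, dose_rate)
    ```

    A production code sums this kernel over many source points and tracks mu*t through every shield region the ray crosses; the buildup factor is interpolated from tabulated standards rather than fixed.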

  13. High resolution, monochromatic x-ray topography capability at CHESS

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, K. D., E-mail: kdf1@cornell.edu; Pauling, A.; Brown, Z. [CHESS, Cornell University, Ithaca, NY (United States); Jones, R. [Department of Physics, University of Connecticut, Storrs, CT (United States); Tarun, A.; Misra, D. S. [IIa Technologies (Singapore); Jupitz, S. [St. Mary’s College of Maryland, St. Mary’s City, MD (United States); Sagan, D. C. [CLASSE, Cornell University, Ithaca, NY (United States)

    2016-07-27

    CHESS has a monochromatic x-ray topography capability serving continually expanding user interest. The setup consists of a beam-expanding monochromator, a 6-circle diffractometer, and a CHESS-designed CMOS camera with real-time sample-alignment capability. This provides rocking curve mapping with angular resolution as small as 2 µrad, spatial resolution down to 3 µm, and a field of view up to 7 mm. Thus far the capability has been applied to: improving CVD-diamond growth, evaluating the perfection of ultra-thin diamond membranes, correlating the performance of diamond-based electronics with crystal defect structure, and defect analysis of single-crystal silicon carbide. This paper describes our topography system, explains its capabilities, and presents experimental results from several applications.

  14. High resolution, monochromatic x-ray topography capability at CHESS

    International Nuclear Information System (INIS)

    Finkelstein, K. D.; Pauling, A.; Brown, Z.; Jones, R.; Tarun, A.; Misra, D. S.; Jupitz, S.; Sagan, D. C.

    2016-01-01

    CHESS has a monochromatic x-ray topography capability serving continually expanding user interest. The setup consists of a beam-expanding monochromator, a 6-circle diffractometer, and a CHESS-designed CMOS camera with real-time sample-alignment capability. This provides rocking curve mapping with angular resolution as small as 2 µrad, spatial resolution down to 3 µm, and a field of view up to 7 mm. Thus far the capability has been applied to: improving CVD-diamond growth, evaluating the perfection of ultra-thin diamond membranes, correlating the performance of diamond-based electronics with crystal defect structure, and defect analysis of single-crystal silicon carbide. This paper describes our topography system, explains its capabilities, and presents experimental results from several applications.

  15. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    Energy Technology Data Exchange (ETDEWEB)

    Arndt, S.A.

    1997-07-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current-generation safety analysis codes are being modified to replace the simplified codes that were specifically designed to meet the competing requirements of real-time applications. The next generation of thermo-hydraulic codes will need to include in their specifications the explicit requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability, and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers, and emergency planners who currently use real-time simulation, and for PRA practitioners who will increasingly use real-time simulation to evaluate PRA success criteria in near real time, validating PRA results for specific configurations and plant system unavailabilities.

  16. Capabilities needed for the next generation of thermo-hydraulic codes for use in real time applications

    International Nuclear Information System (INIS)

    Arndt, S.A.

    1997-01-01

    The real-time reactor simulation field is currently at a crossroads in terms of the capability to perform real-time analysis using the most sophisticated computer codes. Current-generation safety analysis codes are being modified to replace the simplified codes that were specifically designed to meet the competing requirements of real-time applications. The next generation of thermo-hydraulic codes will need to include in their specifications the explicit requirement for use in a real-time environment. Use of the codes in real-time applications imposes much stricter requirements on robustness, reliability, and repeatability than do design and analysis applications. In addition, the need for code use by a variety of users is a critical issue for real-time users, trainers, and emergency planners who currently use real-time simulation, and for PRA practitioners who will increasingly use real-time simulation to evaluate PRA success criteria in near real time, validating PRA results for specific configurations and plant system unavailabilities.

  17. Social Capital, IT Capability, and the Success of Knowledge Management Systems

    Directory of Open Access Journals (Sweden)

    Irene Y.L. Chen

    2009-03-01

    Many organizations have implemented knowledge management systems (KMS) to support knowledge management, yet many such systems have failed due to a lack of relationship networks and IT capability within organizations. Motivated by these concerns, this paper examines the factors that may facilitate the success of knowledge management systems. Ten constructs derived from social capital theory, the resource-based view, and the IS success model are integrated into the research model, and twenty-one hypotheses derived from it are empirically validated using a field survey of KMS users. The results suggest that social capital and organizational IT capability are important preconditions for the success of knowledge management systems. Among the posited relationships, trust, social interaction ties, and IT capability do not significantly impact service quality, system quality, and IT capability, respectively. Contrary to prior expectations, service quality and knowledge quality do not significantly influence perceived KMS benefits and user satisfaction, respectively. A discussion of the results and conclusions is provided, along with insights for future research avenues.

  18. PENGARUH COMPUTER SELF-EFFICACY, COMPUTER ANXIETY DAN ATTITUDE PADA SYSTEM USE DAN DAMPAKNYA TERHADAP USER SATISFACTION DAN INDIVIDUAL IMPACT (Studi pada Mahasiswa Program Sarjana Angkatan 2011-2013 sebagai Pengguna Sistem Informasi Akademik Mahasiswa (SIAM di Universitas Brawijaya

    Directory of Open Access Journals (Sweden)

    Wempi Naviera

    2017-06-01

    ABSTRACT: This study aims to determine the extent of student satisfaction with the implementation of the information systems at Universitas Brawijaya, in particular the Student Academic Information System (SIAM), and to determine the individual impact resulting from that system. The model proposed in this study combines several constructs into a single model and is tested using theories from previous researchers. The variables in this study are Computer Self-Efficacy, Computer Anxiety, Attitude Toward System Use, User Satisfaction, and Individual Impact. The study was conducted in several faculties of Universitas Brawijaya, with a sample of 345 students from the 2011-2013 cohorts. PASW Statistics (SPSS version 16) and Partial Least Squares (PLS version 3.2.1) software were used to test the proposed model and hypotheses. Keywords: Computer Self-Efficacy, Computer Anxiety, Attitude Toward System Use, User Satisfaction, Individual Impact.

  19. Systems Analysis, Machineable Circulation Data and Library Users and Non-Users.

    Science.gov (United States)

    Lubans, John, Jr.

    A study of the use and non-use of a large academic library, to be made with computer-based circulation data, is discussed. A search of the literature reveals that computer-based circulation systems can be, but have not been, utilized to provide data bases for systematic analyses of library users and resources. The data gathered in the circulation…

  20. Public Auditing with Privacy Protection in a Multi-User Model of Cloud-Assisted Body Sensor Networks.

    Science.gov (United States)

    Li, Song; Cui, Jie; Zhong, Hong; Liu, Lu

    2017-05-05

    Wireless Body Sensor Networks (WBSNs) are gaining importance in the era of the Internet of Things (IoT). The modern medical system is a particular area where WBSN techniques are being increasingly adopted for fundamental operations. Despite such growing deployments, the small size, limited capabilities, and constrained data-processing capacity of the sensor devices restrain their adoption in resource-demanding applications. Although computing and storage supplements from cloud servers can potentially enrich the capabilities of WBSN devices, data security is one of the prevailing issues affecting the reliability of cloud-assisted services. Sensitive applications such as modern medical systems demand assurance of the privacy of users' medical records stored in distant cloud servers. Since it is economically impractical to set up private cloud servers for every client, auditing the security of data managed on remote servers has become an integral requirement of WBSN applications relying on public cloud servers. To this end, this paper proposes a novel certificateless public auditing scheme with integrated privacy protection. The multi-user model in our scheme supports groups of users storing and sharing data, and thus shows potential for WBSN deployments within community environments. Furthermore, our scheme enriches the user experience by offering public verifiability, forward security mechanisms, and revocation of illegal group members. Experimental evaluations under the Random Oracle Model (ROM) demonstrate the security effectiveness of our proposed scheme, which outperforms existing cloud-assisted WBSN models.
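    The challenge-response idea behind public auditing can be illustrated with a greatly simplified spot-check sketch. This is not the paper's certificateless scheme (which relies on pairing-based signatures so the auditor never needs the data itself); here the auditor keeps per-block digests, challenges random block indices, and verifies what the server returns:

    ```python
    import hashlib
    import secrets

    # Outsourced data and the digests the auditor retains before upload.
    blocks = [b"record-%d" % i for i in range(8)]
    digests = {i: hashlib.sha256(b).hexdigest() for i, b in enumerate(blocks)}

    def server_respond(stored, indices):
        # The server returns the challenged blocks. A real auditing scheme
        # returns a compact homomorphic proof instead of the raw data.
        return {i: stored[i] for i in indices}

    def audit(stored, rng=secrets.SystemRandom()):
        # Auditor challenges a random subset of block indices and checks
        # each returned block against its retained digest.
        indices = rng.sample(range(len(blocks)), 3)
        response = server_respond(stored, indices)
        return all(hashlib.sha256(response[i]).hexdigest() == digests[i]
                   for i in indices)

    print(audit(blocks))                 # honest server passes
    tampered = list(blocks)
    tampered[0] = b"corrupted"
    # audit(tampered) fails whenever index 0 lands in the random challenge
    ```

    Randomizing the challenged indices is what lets a small number of checks detect large-scale corruption with high probability; the cryptographic machinery in schemes like the one above additionally keeps the audited data private and the proofs compact.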